WorldWideScience

Sample records for analysis benchmarks phase

  1. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, for the development of radiation transport codes, and for building accurate test cases that serve as miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling the more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper, benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous-energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC).

  2. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    Energy Technology Data Exchange (ETDEWEB)

    Ivanova, T.; Laville, C. [Institut de Radioprotection et de Surete Nucleaire IRSN, BP 17, 92262 Fontenay aux Roses (France); Dyrda, J. [Atomic Weapons Establishment AWE, Aldermaston, Reading, RG7 4PR (United Kingdom); Mennerdahl, D. [E Mennerdahl Systems EMS, Starvaegen 12, 18357 Taeby (Sweden); Golovko, Y.; Raskach, K.; Tsiboulia, A. [Inst. for Physics and Power Engineering IPPE, 1, Bondarenko sq., 249033 Obninsk (Russian Federation); Lee, G. S.; Woo, S. W. [Korea Inst. of Nuclear Safety KINS, 62 Gwahak-ro, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Bidaud, A.; Sabouri, P. [Laboratoire de Physique Subatomique et de Cosmologie LPSC, CNRS-IN2P3/UJF/INPG, Grenoble (France); Patel, A. [U.S. Nuclear Regulatory Commission (NRC), Washington, DC 20555-0001 (United States); Bledsoe, K.; Rearden, B. [Oak Ridge National Laboratory ORNL, M.S. 6170, P.O. Box 2008, Oak Ridge, TN 37831 (United States); Gulliford, J.; Michel-Sendis, F. [OECD/NEA, 12, Bd des Iles, 92130 Issy-les-Moulineaux (France)

    2012-07-01

The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and newly developed sensitivity analysis methods. (authors)
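For readers unfamiliar with the quantity being benchmarked, the relative sensitivity coefficient of keff to a cross section sigma is conventionally defined as follows (a standard definition, assumed here rather than quoted from the paper); the 'implicit' contribution mentioned above adds the dependence of the self-shielded multigroup cross sections on the perturbed data:

\[ S_{k,\sigma} \;=\; \frac{\sigma}{k_{\mathrm{eff}}}\,\frac{\partial k_{\mathrm{eff}}}{\partial \sigma}, \qquad S^{\mathrm{total}}_{k,\sigma} \;=\; S^{\mathrm{explicit}}_{k,\sigma} \;+\; S^{\mathrm{implicit}}_{k,\sigma} \]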

  3. Benchmarks for Uncertainty Analysis in Modelling (UAM) for the Design, Operation and Safety Analysis of LWRs - Volume I: Specification and Support Data for Neutronics Cases (Phase I)

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Kamerow, S.; Kodeli, I.; Sartori, E.; Ivanov, E.; Cabellos, O.

    2013-01-01

This report presents benchmark specifications for Phase I (Neutronics Phase) of the OECD LWR UAM benchmark in a format similar to the previous OECD/NRC benchmark specifications. Phase I consists of the following exercises: - Exercise 1 (I-1): 'Cell Physics', focused on the derivation of the multi-group microscopic cross-section libraries and their uncertainties. - Exercise 2 (I-2): 'Lattice Physics', focused on the derivation of the few-group macroscopic cross-section libraries and their uncertainties. - Exercise 3 (I-3): 'Core Physics', focused on the core steady-state stand-alone neutronics calculations and their uncertainties. These exercises follow the routine calculation scheme established in industry and regulation for LWR design and safety analysis. This phase is focused on understanding uncertainties in the prediction of key reactor core parameters associated with LWR stand-alone neutronics core simulation. Such uncertainties occur due to input data uncertainties, modelling errors, and numerical approximations. The approach chosen in Phase I is to select and propagate, for each exercise, the most important contributors that can be treated in a practical manner. The cross-section uncertainty information is considered the most important source of input uncertainty for Phase I. The cross-section related uncertainties are propagated through the three exercises of Phase I. In Exercise I-1 these are the variance and covariance data associated with continuous-energy cross-sections in evaluated nuclear data files. In Exercise I-2 these are the variance and covariance data associated with multi-group cross-sections used as input in the lattice physics codes. In Exercise I-3 these are the variance and covariance data associated with few-group cross-sections used as input in the core simulators. Depending on the availability of different methods in the computer code of choice for a given exercise, the related methodological uncertainties can play a smaller or larger role.
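For orientation, the propagation of the variance and covariance data described above is, to first order, the 'sandwich rule' (a standard result, not quoted from the benchmark specification):

\[ \operatorname{var}(k_{\mathrm{eff}}) \;\approx\; \mathbf{S}^{\mathsf{T}}\,\mathbf{C}\,\mathbf{S} \]

where \(\mathbf{S}\) is the vector of relative sensitivities of keff to the cross sections of the given exercise (continuous-energy, multi-group, or few-group) and \(\mathbf{C}\) is their relative covariance matrix.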

  4. Joint European contribution to phase 5 of the BN600 hybrid reactor benchmark core analysis (European ERANOS formulaire for fast reactor core analysis)

    International Nuclear Information System (INIS)

    Rimpault, G.

    2004-01-01

The hybrid UOX/MOX-fuelled core of the BN-600 reactor was endorsed as an international benchmark. The BFS-2 critical facility was designed for full-size simulation of the core and shielding of large fast reactors (up to 3000 MWe). A wide experimental programme was established, including measurements of criticality, fission rates, rod worths, and the sodium void reactivity effect (SVRE). Four BFS-62 critical assemblies have been designed to study the changes in BN-600 reactor physics when moving to a hybrid MOX core. The BFS-62-3A assembly is a full-scale model of the BN-600 hybrid core. It consists of three regions of UO2 fuel, axial and radial fertile blankets, MOX fuel added in a ring between the MC and OC zones, and a 120 deg sector of stainless steel reflector included within the radial blanket. The joint European contribution to the Phase 5 benchmark analysis was performed by Serco Assurance Winfrith (UK) and CEA Cadarache (France). The analysis was carried out using Version 1.2 of ERANOS, the European code and data system for advanced and fast reactor core applications. Nuclear data are based on the JEF2.2 nuclear data evaluation (including sodium). Results for Phase 5 of the BN-600 benchmark have been determined for criticality and SVRE in both diffusion and transport theory. Full details of the results are presented in a paper posted on the IAEA Business Collaborator website, and a brief summary is provided in this paper.

  5. Analysis of the VVER-1000 coolant transient benchmark phase 1 with the code system RELAP5/PARCS

    International Nuclear Information System (INIS)

    Victor Hugo Sanchez Espinoza

    2005-01-01

Full text of publication follows: As part of the reactor dynamics activities of FZK/IRS, the qualification of best-estimate coupled code systems for reactor safety evaluations is a key step toward improving their prediction capability and acceptability. The VVER-1000 Coolant Transient Benchmark Phase 1 represents an excellent opportunity to validate the simulation capability of the coupled code system RELAP5/PARCS regarding both the thermal hydraulic plant response (RELAP5), using measured data obtained during commissioning tests at the Kozloduy nuclear power plant unit 6, and the neutron kinetics models of PARCS for hexagonal geometries. Phase 1 is devoted to the analysis of the switching on of one main coolant pump while the other three pumps are in operation. It includes the following exercises: (a) investigation of the integral plant response using a best-estimate thermal hydraulic system code with a point kinetics model, (b) analysis of the core response for given initial and transient thermal hydraulic boundary conditions using a coupled code system with a 3D neutron kinetics model, and (c) investigation of the integral plant response using a best-estimate coupled code system with 3D neutron kinetics. Already before the test, complex flow conditions exist within the RPV, e.g. coolant mixing in the upper plenum caused by the reverse flow through loop 3 with the stopped pump. The test is initiated by switching on the main coolant pump of loop 3, which leads to a reversal of the flow through the respective piping. After about 13 s the mass flow rate through this loop reaches values comparable with those of the other loops. During this time period, the increased primary coolant flow causes a reduction of the core averaged coolant temperature and thus an increase of the core power. Later on, the power stabilizes at a level higher than the initial power. In this analysis, special attention is paid to the prediction of the spatially asymmetrical core cooling during the transient.
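The power rise described above is the expected first-order feedback response: with a negative coolant/moderator temperature coefficient, a drop in core coolant temperature inserts positive reactivity. As a generic relation (not taken from the benchmark specification):

\[ \Delta\rho \;\approx\; \alpha_{\mathrm{cool}}\,\Delta T_{\mathrm{cool}}, \qquad \alpha_{\mathrm{cool}} < 0,\ \Delta T_{\mathrm{cool}} < 0 \;\Rightarrow\; \Delta\rho > 0 \]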

  6. OECD/NEA BENCHMARK FOR UNCERTAINTY ANALYSIS IN MODELING (UAM) FOR LWRs – SUMMARY AND DISCUSSION OF NEUTRONICS CASES (PHASE I)

    Directory of Open Access Journals (Sweden)

    RYAN N. BRATTON

    2014-06-01

A Nuclear Energy Agency (NEA) / Organization for Economic Co-operation and Development (OECD) benchmark for Uncertainty Analysis in Modeling (UAM) is defined in order to facilitate the development and validation of available uncertainty analysis and sensitivity analysis methods for best-estimate Light Water Reactor (LWR) design and safety calculations. The benchmark has been named the OECD/NEA UAM-LWR benchmark, and has been divided into three phases, each of which focuses on a different portion of the uncertainty propagation in LWR multi-physics and multi-scale analysis. Several different reactor cases are modeled at various phases of a reactor calculation. This paper discusses Phase I, known as the “Neutronics Phase”, which is devoted mostly to the propagation of nuclear data (cross-section) uncertainty throughout steady-state stand-alone neutronics core calculations. Three reactor systems (for which design, operation and measured data are available) are rigorously studied in this benchmark: Peach Bottom Unit 2 BWR, Three Mile Island Unit 1 PWR, and VVER-1000 Kozloduy-6/Kalinin-3. Additional measured data are analyzed, such as the KRITZ LEU criticality experiments and the SNEAK-7A and 7B experiments of the Karlsruhe Fast Critical Facility. Analyzed results include the top five neutron-nuclide reactions that contribute the most to the prediction uncertainty in keff, as well as the uncertainty in key parameters of neutronics analysis such as microscopic and macroscopic cross-sections, six-group decay constants, assembly discontinuity factors, and axial and radial core power distributions. Conclusions are drawn regarding where further studies should be done to reduce uncertainties in key nuclide reactions (i.e., 238U radiative capture and inelastic scattering (n, n')), as well as the average number of neutrons released per fission event of 239Pu.

  7. The IAEA Coordinated Research Program on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis: Description of the Benchmark Test Cases and Phases

    Energy Technology Data Exchange (ETDEWEB)

    Frederik Reitsma; Gerhard Strydom; Bismark Tyobeka; Kostadin Ivanov

    2012-10-01

The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of design and safety features with reliable, high-fidelity physics models and robust, efficient, and accurate codes. The uncertainties in the HTGR analysis tools are today typically assessed with sensitivity analysis, whereby a few important input uncertainties (typically based on a PIRT process) are varied in the analysis to find a spread in the parameter of importance. However, one wishes to apply a more fundamental approach to determine the predictive capability and accuracy of the coupled neutronics/thermal-hydraulics and depletion simulations used for reactor design and safety assessment. Today there is a broader acceptance of the use of uncertainty analysis even in safety studies, and it has been accepted by regulators in some cases to replace the traditional conservative analysis. Finally, there is also a renewed focus on supplying reliable covariance data (nuclear data uncertainties) that can then be used in uncertainty methods. Uncertainty and sensitivity studies are therefore becoming an essential component of any significant effort in data and simulation improvement. In order to address uncertainty in analysis and methods in the HTGR community, the IAEA launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling early in 2012. The project is built on the experience of the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity, but focuses specifically on the peculiarities of HTGR designs and their simulation requirements. Two benchmark problems were defined: the prismatic type is represented by the MHTGR-350 design from General Atomics (GA), while a 250 MW modular pebble bed design, similar to the INET (China) and indirect-cycle PBMR (South Africa) designs, is also included. The paper gives more detail on the benchmark cases, the different specific phases and tasks, and the latest status of the CRP.

  8. HPC Benchmark Suite NMx, Phase I

    Data.gov (United States)

National Aeronautics and Space Administration — Intelligent Automation Inc. (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...

  9. Joint European contribution to phases 1 and 2 of the BN600 hybrid reactor benchmark core analysis

    International Nuclear Information System (INIS)

    Rimpault, Gerald; Newton, Tim; Smith, Peter

    2000-01-01

This paper describes the ERANOS code developed within the European cooperation on fast reactors. The reference calculation scheme and ERANOS code validation are included. The method for BN-600 reactor core analysis and the results of phases 1 and 2 are presented. They include effective multiplication factors, fuel Doppler constants, steel Doppler constants, sodium density coefficient, steel density coefficients, fuel density coefficient, absorber density coefficient, axial and radial expansion coefficients, dynamic parameters, power distribution, beta and neutron lifetime, and reaction rate distribution.

  10. Benchmarking

    OpenAIRE

    Meylianti S., Brigita

    1999-01-01

Benchmarking has different meanings to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable for a specific application. This paper will discuss those five types of benchmarking in detail, includ...

  11. Benchmark problems for numerical implementations of phase field models

    International Nuclear Information System (INIS)

    Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; Warren, J.; Heinonen, O. G.

    2016-01-01

    Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
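To make the flavour of such a benchmark concrete, the sketch below evolves a two-dimensional Cahn-Hilliard spinodal decomposition on a periodic grid with a semi-implicit Fourier-spectral scheme. The double-well free energy, parameter values, and grid are illustrative assumptions, not the CHiMaD/NIST problem specification.

    # Minimal spinodal-decomposition (Cahn-Hilliard) sketch; parameters are
    # illustrative assumptions, not the CHiMaD/NIST benchmark specification.
    import numpy as np

    N, dx, dt = 128, 1.0, 0.1      # grid size, spacing, time step (assumed)
    kappa, M = 2.0, 1.0            # gradient-energy coefficient, mobility
    rng = np.random.default_rng(0)
    c = 0.5 + 0.05 * rng.standard_normal((N, N))  # near-critical composition

    # Angular wavenumbers of the periodic domain
    k = 2.0 * np.pi * np.fft.fftfreq(N, d=dx)
    k2 = k[:, None] ** 2 + k[None, :] ** 2

    def dfdc(c):
        # Derivative of the assumed double-well bulk free energy f = c^2 (1 - c)^2
        return 2.0 * c * (1.0 - c) * (1.0 - 2.0 * c)

    for step in range(5000):
        # Semi-implicit update: the stiff kappa*k^4 term is treated implicitly
        c_hat = np.fft.fft2(c)
        mu_hat = np.fft.fft2(dfdc(c))
        c_hat = (c_hat - dt * M * k2 * mu_hat) / (1.0 + dt * M * kappa * k2 ** 2)
        c = np.fft.ifft2(c_hat).real

    print("composition range after coarsening:", float(c.min()), float(c.max()))

A run of this kind is then scored on quantities such as the free-energy decay or the composition field at fixed times, which is where the comparison of adaptive time-stepping techniques mentioned above enters.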

  12. Analysis of Benchmark 2 results

    International Nuclear Information System (INIS)

    Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.

    1994-01-01

    The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performances through our results, especially in the thick target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks allow possible improvements of physical models used in our codes. Thereafter, a scheme of radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab

  13. KAERI results for BN600 full MOX benchmark (Phase 4)

    International Nuclear Information System (INIS)

    Lee, Kibog Lee

    2003-01-01

The purpose of this document is to report the results of KAERI's calculations for Phase 4 of the BN-600 full-MOX-fuelled core benchmark analyses, according to the RCM report of the IAEA CRP Action on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects'. The BN-600 full MOX core model is based on the specification in the document 'Full MOX Model (Phase4.doc)'. This document addresses the calculational methods employed in the benchmark analyses and the benchmark results obtained by KAERI.

  14. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

The demand for best-estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best-estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in the Phase I calculations, the specifications for Phase II, and the incoming challenges in defining the Phase III exercises. The main challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled-physics models, are the large computational burden and the use of non-linear models (expected due to the physics coupling). (authors)

  15. Benchmark calculation of subchannel analysis codes

    International Nuclear Information System (INIS)

    1996-02-01

In order to evaluate the analysis capabilities of various subchannel codes used in the thermal-hydraulic design of light water reactors, benchmark calculations were performed. The selected benchmark problems and major findings obtained by the calculations were as follows: (1) For single-phase flow mixing experiments between two channels, the calculated water temperature distributions along the flow direction agreed with the experimental results when the turbulent mixing coefficients were tuned properly. However, the effect of gap width observed in the experiments could not be predicted by the subchannel codes. (2) For two-phase flow mixing experiments between two channels, in high water flow rate cases the calculated distributions of air and water flows in each channel agreed well with the experimental results. In low water flow cases, on the other hand, the air mixing rates were underestimated. (3) For two-phase flow mixing experiments among multiple channels, the calculated mass velocities at channel exit under steady-state conditions agreed with the experimental values to within about 10%. However, the predictive errors of the exit qualities were as high as 30%. (4) For critical heat flux (CHF) experiments, two different results were obtained. One code indicated that the CHFs calculated using the KfK or EPRI correlations agreed well with the experimental results, while another code suggested that the CHFs were well predicted by using the WSC-2 correlation or the Weisman-Pei mechanistic model. (5) For droplet entrainment and deposition experiments, it was indicated that the predictive capability was significantly increased by improving the correlations. On the other hand, a remarkable discrepancy between the codes was observed: one code underestimated the droplet flow rate and overestimated the liquid film flow rate in high quality cases, while another code overestimated the droplet flow rate and underestimated the liquid film flow rate in low quality cases. (J.P.N.)
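For context, the single-phase turbulent mixing term whose coefficient was tuned above is commonly written in subchannel codes as (a generic formulation, assumed here rather than quoted from the report):

\[ w'_{ij} = \beta\, s_{ij}\, \bar{G}, \qquad q'_{ij} = w'_{ij}\,(h_i - h_j) \]

where \(\beta\) is the turbulent mixing coefficient, \(s_{ij}\) the gap width between subchannels i and j, \(\bar{G}\) the average mass flux, and \(h_i\), \(h_j\) the subchannel enthalpies. The linear dependence on gap width in this form is precisely what the two-channel experiments above call into question.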

  16. HPC Benchmark Suite NMx, Phase II

    Data.gov (United States)

National Aeronautics and Space Administration — In the Phase II effort, Intelligent Automation Inc. (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for...

  17. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  18. Analysis of an OECD/NEA high-temperature reactor benchmark

    International Nuclear Information System (INIS)

    Hosking, J. G.; Newton, T. D.; Koeberl, O.; Morris, P.; Goluoglu, S.; Tombakoglu, T.; Colak, U.; Sartori, E.

    2006-01-01

    This paper describes analyses of the OECD/NEA HTR benchmark organized by the 'Working Party on the Scientific Issues of Reactor Systems (WPRS)', formerly the 'Working Party on the Physics of Plutonium Fuels and Innovative Fuel Cycles'. The benchmark was specifically designed to provide inter-comparisons for plutonium and thorium fuels when used in HTR systems. Calculations considering uranium fuel have also been included in the benchmark, in order to identify any increased uncertainties when using plutonium or thorium fuels. The benchmark consists of five phases, which include cell and whole-core calculations. Analysis of the benchmark has been performed by a number of international participants, who have used a range of deterministic and Monte Carlo code schemes. For each of the benchmark phases, neutronics parameters have been evaluated. Comparisons are made between the results of the benchmark participants, as well as comparisons between the predictions of the deterministic calculations and those from detailed Monte Carlo calculations. (authors)

  19. Space Weather Action Plan Solar Radio Burst Phase 1 Benchmarks and the Steps to Phase 2

    Science.gov (United States)

    Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Love, J. J.; Pierson, J.

    2017-12-01

Solar radio bursts, when at the right frequency and when strong enough, can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years and also a theoretical maximum intensity benchmark. The benchmark team has developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and Microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived from previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima requires additional work, where doing so is even possible, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks, the basis used to derive them, and the limitations of that work. We will also discuss the work that needs to be done to complete the phase 2 benchmarks.
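Operationally, a 1-in-100-year intensity benchmark is the intensity whose empirical annual exceedance rate equals 0.01 per year. The sketch below illustrates the idea on invented numbers, with a crude log-log extrapolation standing in for a proper tail fit; it is not the benchmark team's method or data.

    # Hypothetical illustration of reading a 1-in-100-year intensity off an
    # empirical exceedance-rate curve; values and fit are placeholders.
    import numpy as np

    years_observed = 40.0
    # Invented peak burst intensities in solar flux units (sfu)
    peaks_sfu = np.array([1e3, 2e3, 5e3, 8e3, 2e4, 3e4, 1e5, 4e5, 6e5, 1e6])

    # Annual exceedance rate of each observed intensity (largest exceeded once)
    sorted_peaks = np.sort(peaks_sfu)[::-1]
    rate_per_year = np.arange(1, sorted_peaks.size + 1) / years_observed

    # Extrapolate in log-log space down to a 0.01/yr exceedance rate
    slope_intercept = np.polyfit(np.log10(rate_per_year), np.log10(sorted_peaks), 1)
    benchmark = 10.0 ** np.polyval(slope_intercept, np.log10(0.01))
    print(f"crude 1-in-100-year intensity estimate: {benchmark:.3g} sfu")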

  20. BN-600 MOX Core Benchmark Analysis. Results from Phases 4 and 6 of a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects

    International Nuclear Information System (INIS)

    2013-12-01

For those Member States that have or have had significant fast reactor development programmes, it is of utmost importance that they have validated, up-to-date codes and methods for fast reactor physics analysis in support of R and D and core design activities in the area of actinide utilization and incineration. In particular, some Member States have recently focused on fast reactor systems for minor actinide transmutation and on cores optimized for consuming rather than breeding plutonium, the physics of the breeder reactor cycle having already been widely investigated. Plutonium burning systems may have an important role in managing plutonium stocks until the time when major programmes of self-sufficient fast breeder reactors are established. For assessing the safety of these systems, it is important to determine the prediction accuracy of transient simulations and their associated reactivity coefficients. In response to Member States' expressed interest, the IAEA sponsored a coordinated research project (CRP) on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. The CRP started in November 1999 and, at the first meeting, the members of the CRP endorsed a benchmark on the BN-600 hybrid core for consideration in its first studies. Benchmark analyses of the BN-600 hybrid core were performed during the first three phases of the CRP, investigating different nuclear data and levels of approximation in the calculation of safety related reactivity effects and their influence on uncertainties in transient analysis prediction. In an additional phase of the benchmark studies, experimental data were used for the verification and validation of nuclear data libraries and methods in support of the previous three phases. The results of phases 1, 2, 3 and 5 of the CRP are reported in IAEA-TECDOC-1623, BN-600 Hybrid Core Benchmark Analyses: Results from a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects.

  1. JNC results of BN-600 benchmark calculation (phase 3)

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2002-01-01

The present work gives the results for phase 3 of the BN-600 core benchmark problem, which addresses burnup and heterogeneity effects. The analytical method applied consisted of: the JENDL-3.2 nuclear data library; group constants (70 groups, ABBN-type self-shielding transport factors); a heterogeneous cell model for the fuel and control rods; a basic diffusion calculation (CITATION code); and transport theory and mesh-size corrections (NSHEX code, based on the SN transport nodal method developed by JNC). Burnup and heterogeneity results obtained with both the diffusion and transport approaches are presented for the beginning and end of cycle.

  2. Uncertainty and sensitivity analysis in reactivity-initiated accident fuel modeling: synthesis of organisation for economic co-operation and development (OECD/nuclear energy agency (NEA benchmark on reactivity-initiated accident codes phase-II

    Directory of Open Access Journals (Sweden)

    Olivier Marchand

    2018-03-01

In the framework of the OECD/NEA Working Group on Fuel Safety, a RIA fuel-rod-code benchmark (Phase I) was organized in 2010–2013. It consisted of four experiments on highly irradiated fuel rodlets tested under different experimental conditions. This benchmark revealed the need to better understand the basic models incorporated in each code for realistic simulation of the complicated integral RIA tests with high burnup fuel rods. A second phase of the benchmark (Phase II) was thus launched early in 2014, organized in two complementary activities: (1) comparison of the results of different simulations on simplified cases, in order to provide additional bases for understanding the differences in modelling of the concerned phenomena; (2) assessment of the uncertainty of the results. The present paper provides a summary and conclusions of the second activity of the Benchmark Phase II, which is based on the input uncertainty propagation methodology. The main conclusion is that uncertainties cannot fully explain the differences between the code predictions. Finally, based on the RIA benchmark Phase-I and Phase-II conclusions, some recommendations are made. Keywords: RIA, Codes Benchmarking, Fuel Modelling, OECD

  3. Benchmarking

    OpenAIRE

    Beretta Sergio; Dossi Andrea; Grove Hugh

    2000-01-01

Due to their particular nature, benchmarking methodologies tend to exceed the boundaries of management techniques and to enter the territory of managerial culture, a culture that is also destined to break into the accounting area, not only strongly supporting the possibility of fixing targets and of measuring and comparing performance (an aspect that is already innovative and worthy of attention), but also questioning one of the principles (or taboos) of the accounting or...

  4. Analysis of a molten salt reactor benchmark

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.

    2013-01-01

    This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)

  5. JNC results of BN-600 benchmark calculation (phase 4)

    International Nuclear Information System (INIS)

    Ishikawa, Makoto

    2003-01-01

The present work gives the results of JNC, Japan, for Phase 4 of the BN-600 core benchmark problem (Hex-Z fully MOX-fuelled core model) organized by the IAEA. The benchmark specification is based on the RCM report of the IAEA CRP on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of LMFR Reactivity Effects, Action 3.12' (calculations for the BN-600 fully MOX-fuelled core for subsequent transient analyses). The JENDL-3.2 nuclear data library was used for calculating 70-group ABBN-type group constants. Two cell models were applied for the fuel assembly and control rod calculations: homogeneous and heterogeneous (cylindrical supercell). The basic diffusion calculation used a three-dimensional Hex-Z, 18-group model (CITATION code). Transport calculations were 18-group, three-dimensional (NSHEX code), based on the SN transport nodal method developed at JNC. The thermal power generated per fission was based on Sher's data, corrected on the basis of the ENDF/B-IV data library. Calculation results are presented in tables for intercomparison.

  6. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  7. INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom; Javier Ortensi; Sonat Sen; Hans Hammer

    2013-09-01

The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results of all participants.

  8. OECD/NEA burnup credit calculational criticality benchmark Phase I-B results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.; Parks, C.V. [Oak Ridge National Lab., TN (United States); Brady, M.C. [Sandia National Labs., Las Vegas, NV (United States)

    1996-06-01

    In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree within 11% about the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are Sm 149, Sm 151, and Gd 155.
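The percent-deviation statistics quoted above amount to comparing each submission against the all-participant average, nuclide by nuclide. A minimal sketch with invented placeholder numbers:

    # Per-nuclide percent deviation of each participant from the group mean;
    # the concentration values below are invented placeholders.
    import numpy as np

    # rows: participants, columns: nuclides (e.g. U-235, Pu-239, Sm-149)
    conc = np.array([
        [8.1e-3, 5.9e-3, 1.1e-7],
        [8.3e-3, 6.1e-3, 1.3e-7],
        [7.9e-3, 5.8e-3, 0.9e-7],
    ])
    mean = conc.mean(axis=0)
    pct_dev = 100.0 * (conc - mean) / mean
    print("max |deviation| per nuclide [%]:", np.abs(pct_dev).max(axis=0))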

  9. OECD/NEA burnup credit calculational criticality benchmark Phase I-B results

    International Nuclear Information System (INIS)

    DeHart, M.D.; Parks, C.V.; Brady, M.C.

    1996-06-01

    In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree within 11% about the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are Sm 149, Sm 151, and Gd 155

  10. OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1993-01-01

Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods are within 11% agreement about the average for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are 149Sm, 151Sm, and 155Gd.

  11. OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1993-01-01

Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods are within 11% agreement about the average for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are 149Sm, 151Sm, and 155Gd.

  12. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  13. Benchmarking of small-signal dynamics of single-phase PLLs

    DEFF Research Database (Denmark)

    Zhang, Chong; Wang, Xiongfei; Blaabjerg, Frede

    2015-01-01

Phase-locked Loop (PLL) is a critical component for the control and grid synchronization of grid-connected power converters. This paper presents a benchmarking study on the small-signal dynamics of three commonly used PLLs for single-phase converters, including the enhanced PLL, the second-order generalized integrator based PLL, and the inverse-PLL. First, a unified small-signal model of those PLLs is established for comparing their dynamics. Then, a systematic design guideline for parameter tuning of the PLLs is formulated. To confirm the validity of the theoretical analysis, nonlinear time-domain simulations are performed.
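For an SRF-type single-phase PLL with a PI loop filter, the unified small-signal model referred to above typically reduces to the classic second-order form (a standard textbook result, assumed here), which is what makes a systematic tuning guideline possible:

\[ H(s) \;=\; \frac{k_p s + k_i}{s^2 + k_p s + k_i}, \qquad \omega_n = \sqrt{k_i}, \quad \zeta = \frac{k_p}{2\sqrt{k_i}} \]

so the proportional and integral gains \(k_p\) and \(k_i\) map directly onto the loop bandwidth and damping being compared across the three PLLs.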

  14. Benchmarks

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...

  15. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRAs). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  16. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  17. OECD/NEA Sandia Fuel Project phase I: Benchmark of the ignition testing

    Energy Technology Data Exchange (ETDEWEB)

Adorni, Martina, E-mail: martina_adorni@hotmail.it [UNIPI (Italy); Herranz, Luis E. [CIEMAT (Spain); Hollands, Thorsten [GRS (Germany); Ahn, Kwang-Il [KAERI (Korea, Republic of); Bals, Christine [GRS (Germany); D'Auria, Francesco [UNIPI (Italy); Horvath, Gabor L. [NUBIKI (Hungary); Jaeckel, Bernd S. [PSI (Switzerland); Kim, Han-Chul; Lee, Jung-Jae [KINS (Korea, Republic of); Ogino, Masao [JNES (Japan); Techy, Zsolt [NUBIKI (Hungary); Velazquez-Lozada, Alexander; Zigh, Abdelghani [USNRC (United States); Rehacek, Radomir [OECD/NEA (France)]

    2016-10-15

    Highlights: • A unique PWR spent fuel pool experimental project is analytically investigated. • Predictability of fuel clad ignition in case of a complete loss of coolant in SFPs is assessed. • Computer codes reasonably estimate peak cladding temperature and time of ignition. - Abstract: The OECD/NEA Sandia Fuel Project provided unique thermal-hydraulic experimental data associated with Spent Fuel Pool (SFP) complete drain down. The study conducted at Sandia National Laboratories (SNL) was successfully completed (July 2009 to February 2013). The accident conditions of interest for the SFP were simulated in a full scale prototypic fashion (electrically heated, prototypic assemblies in a prototypic SFP rack) so that the experimental results closely represent actual fuel assembly responses. A major impetus for this work was to facilitate severe accident code validation and to reduce modeling uncertainties within the codes. Phase I focused on axial heating and burn propagation in a single PWR 17 × 17 assembly (i.e. “hot neighbors” configuration). Phase II addressed axial and radial heating and zirconium fire propagation including effects of fuel rod ballooning in a 1 × 4 assembly configuration (i.e. single, hot center assembly and four, “cooler neighbors”). This paper summarizes the comparative analysis regarding the final destructive ignition test of the phase I of the project. The objective of the benchmark is to evaluate and compare the predictive capabilities of computer codes concerning the ignition testing of PWR fuel assemblies. Nine institutions from eight different countries were involved in the benchmark calculations. The time to ignition and the maximum temperature are adequately captured by the calculations. It is believed that the benchmark constitutes an enlargement of the validation range for the codes to the conditions tested, thus enhancing the code applicability to other fuel assembly designs and configurations. The comparison of

  18. Impact testing and analysis for structural code benchmarking

    International Nuclear Information System (INIS)

    Glass, R.E.

    1989-01-01

Sandia National Laboratories, in cooperation with industry and other national laboratories, has been benchmarking computer codes (''Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Cask,'' R.E. Glass, Sandia National Laboratories, 1985; ''Sample Problem Manual for Benchmarking of Cask Analysis Codes,'' R.E. Glass, Sandia National Laboratories, 1988; ''Standard Thermal Problem Set for the Evaluation of Heat Transfer Codes Used in the Assessment of Transportation Packages,'' R.E. Glass, et al., Sandia National Laboratories, 1988) used to predict the structural, thermal, criticality, and shielding behavior of radioactive materials packages. The first step in the benchmarking of the codes was to develop standard problem sets and to compare the results from several codes and users. This step for structural analysis codes has been completed as described in ''Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Casks,'' R.E. Glass, Sandia National Laboratories, 1985. The problem set is shown in Fig. 1. This problem set exercised the ability of the codes to predict the response to end (axisymmetric) and side (plane strain) impacts with both elastic and elastic/plastic materials. The results from these problems showed that there is good agreement in predicting elastic response. Significant differences occurred in predicting strains for the elastic/plastic models. An example of the variation in predicting plastic behavior is given, which shows the hoop strain as a function of time at the impacting end of Model B. These differences in predicting plastic strains demonstrated a need for benchmark data for a cask-like problem. 6 refs., 5 figs.

  19. Benchmark enclosure fire suppression experiments - phase 1 test report.

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa, Victor G.; Nichols, Robert Thomas; Blanchat, Thomas K.

    2007-06-01

A series of fire benchmark water suppression tests were performed that may provide guidance for dispersal systems for the protection of high value assets. The test results provide boundary and temporal data necessary for water spray suppression model development and validation. A review of fire suppression is presented for both gaseous suppression and water mist fire suppression. The experimental setup and procedure for gathering water suppression performance data are shown. Characteristics of the nozzles used in the testing are presented. Results of the experiments are discussed.

  20. A biosegmentation benchmark for evaluation of bioimage analysis methods

    Directory of Open Access Journals (Sweden)

    Kvilekval Kristian

    2009-11-01

Background: We present a biosegmentation benchmark that includes infrastructure, datasets with associated ground truth, and validation methods for biological image analysis. The primary motivation for creating this resource comes from the fact that it is very difficult, if not impossible, for an end-user to choose from the wide range of segmentation methods available in the literature for a particular bioimaging problem. No single algorithm is likely to be equally effective on a diverse set of images, and each method has its own strengths and limitations. We hope that our benchmark resource will be of considerable help both to bioimaging researchers looking for novel image processing methods and to image processing researchers exploring the application of their methods to biology. Results: Our benchmark consists of different classes of images and ground truth data, ranging in scale from the subcellular and cellular to the tissue level, each of which poses its own set of challenges to image analysis. The associated ground truth data can be used to evaluate the effectiveness of different methods, to improve methods, and to compare results. Standard evaluation methods and some analysis tools are integrated into a database framework that is available online at http://bioimage.ucsb.edu/biosegmentation/. Conclusion: This online benchmark will facilitate the integration and comparison of image analysis methods for bioimages. While the primary focus is on biological images, we believe that the dataset and infrastructure will be of interest to researchers and developers working with biological image analysis, image segmentation and object tracking in general.
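As an example of the kind of scoring such ground truth enables, the sketch below compares a binary segmentation against a reference mask with the Dice overlap coefficient; the benchmark's own integrated evaluation tools may use different or additional metrics.

    # Dice overlap between a segmentation and its ground-truth mask;
    # the masks here are synthetic examples.
    import numpy as np

    def dice(seg: np.ndarray, truth: np.ndarray) -> float:
        """Dice coefficient between two binary masks (True = foreground)."""
        seg, truth = seg.astype(bool), truth.astype(bool)
        inter = np.logical_and(seg, truth).sum()
        denom = seg.sum() + truth.sum()
        return 2.0 * inter / denom if denom else 1.0

    truth = np.zeros((64, 64), dtype=bool)
    truth[16:48, 16:48] = True
    seg = np.zeros_like(truth)
    seg[20:52, 20:52] = True
    print(f"Dice = {dice(seg, truth):.3f}")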

  1. Developing a benchmark for emotional analysis of music.

    Science.gov (United States)

    Aljanaki, Anna; Yang, Yi-Hsuan; Soleymani, Mohammad

    2017-01-01

    The music emotion recognition (MER) field has expanded rapidly in the last decade. Many new methods and new audio features have been developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of new methods because of the diversity of data representations and the scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, the MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons, with 2 Hz time resolution). Using DEAM, we organized the 'Emotion in Music' task at the MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature sets. We also describe the design of the benchmark, the evaluation procedures and the data cleaning and transformations that we suggest. The results from the benchmark suggest that recurrent-neural-network-based approaches combined with large feature sets work best for dynamic MER.
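    For dynamic MER of this kind, predictions are time series of valence/arousal values that are typically scored per song and then averaged. A minimal sketch of such scoring with RMSE and Pearson correlation follows; the synthetic data and function names are illustrative, not part of the DEAM distribution.

        import numpy as np

        def evaluate_dynamic_mer(pred, truth):
            """Per-song RMSE and Pearson r for dynamic (e.g. 2 Hz) annotations.

            pred, truth: arrays of shape (n_songs, n_frames) holding valence
            or arousal time series."""
            rmse = np.sqrt(np.mean((pred - truth) ** 2, axis=1))
            r = np.array([np.corrcoef(p, t)[0, 1] for p, t in zip(pred, truth)])
            return rmse.mean(), r.mean()

        rng = np.random.default_rng(0)
        truth = rng.normal(size=(5, 60))            # 5 songs, 30 s at 2 Hz
        pred = truth + rng.normal(scale=0.3, size=truth.shape)
        print(evaluate_dynamic_mer(pred, truth))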

  2. Policy Analysis of the English Graduation Benchmark in Taiwan

    Science.gov (United States)

    Shih, Chih-Min

    2012-01-01

    To nudge students to study English and to improve their English proficiency, many universities in Taiwan have imposed an English graduation benchmark on their students. This article reviews this policy, using the theoretic framework for education policy analysis proposed by Haddad and Demsky (1995). The author presents relevant research findings,…

  3. Policy analysis of the English graduation benchmark in Taiwan ...

    African Journals Online (AJOL)

    To nudge students to study English and to improve their English proficiency, many universities in Taiwan have imposed an English graduation benchmark on their students. This article reviews this policy, using the theoretic framework for education policy analysis proposed by Haddad and Demsky (1995). The author ...

  4. Impact testing and analysis for structural code benchmarking

    International Nuclear Information System (INIS)

    Glass, R.E.

    1989-01-01

    Sandia National Laboratories, in cooperation with industry and other national laboratories, has been benchmarking computer codes used to predict the structural, thermal, criticality, and shielding behavior of radioactive materials packages. The first step in the benchmarking of the codes was to develop standard problem sets and to compare the results from several codes and users. This step for structural analysis codes has been completed as described in ''Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Casks,'' R.E. Glass, Sandia National Laboratories, 1985. The problem set is shown in Fig. 1. This problem set exercised the ability of the codes to predict the response to end (axisymmetric) and side (plane strain) impacts with both elastic and elastic/plastic materials. The results from these problems showed that there is good agreement in predicting elastic response. Significant differences occurred in predicting strains for the elastic/plastic models. An example of the variation in predicting plastic behavior is given, which shows the hoop strain as a function of time at the impacting end of Model B. These differences in predicting plastic strains demonstrated a need for benchmark data for a cask-like problem.

  5. The University of Pisa calculations for the Phase I of the OECD/NEA UAM Benchmark

    International Nuclear Information System (INIS)

    Ball, M.; Parisi, C.; D'Auria, F.

    2009-01-01

    In this paper we present the University of Pisa preliminary results for the first exercise of Phase I of the OECD/NEA Benchmark on Uncertainty Analysis in Modeling (UAM). The scope of Exercise 1 is to address the uncertainties due to the basic nuclear data as well as the impact of processing the nuclear and covariance data, the selection of the multi-group structure and the self-shielding treatment. The DRAGON and TSUNAMI codes were employed, using the available covariance data matrix. The execution of the DRAGON calculations required the use of the ANGELO and LAMBDA codes for the extension of the covariance matrix from the original SCALE 44-group structure to the DRAGON 69-group structure. The uncertainties for the main cross sections were evaluated and are presented here. (authors)

  6. Numisheet2005 Benchmark Analysis on Forming of an Automotive Underbody Cross Member: Benchmark 2

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao Jian

    2005-01-01

    This report presents an international cooperative benchmark effort focusing on simulations of a sheet metal stamping process. A forming process of an automotive underbody cross member using steel and aluminum blanks is used as a benchmark. Simulation predictions from each submission are analyzed via comparison with the experimental results. A brief summary of the various models submitted for this benchmark study is discussed. Prediction accuracy for each parameter of interest is discussed through the evaluation of cumulative errors from each submission.

  7. Benchmarking of thermalhydraulic loop models for lead-alloy-cooled advanced nuclear energy systems. Phase I: Isothermal forced convection case

    International Nuclear Information System (INIS)

    2012-06-01

    Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of the Fuel Cycle (WPFC) has been established to co-ordinate scientific activities regarding various existing and advanced nuclear fuel cycles, including advanced reactor systems, associated chemistry and flowsheets, development and performance of fuel and materials and accelerators and spallation targets. The WPFC has different expert groups to cover a wide range of scientific issues in the field of nuclear fuel cycle. The Task Force on Lead-Alloy-Cooled Advanced Nuclear Energy Systems (LACANES) was created in 2006 to study the thermal-hydraulic characteristics of heavy liquid metal coolant loops. The objectives of the task force are to (1) validate thermal-hydraulic loop models for application to LACANES design analysis in participating organisations, by benchmarking with a set of well-characterised lead-alloy coolant loop test data, (2) establish guidelines for quantifying thermal-hydraulic modelling parameters related to friction and heat transfer by lead-alloy coolant and (3) identify specific issues, either in modelling and/or in loop testing, which need to be addressed via possible future work. Nine participants from seven different institutes participated in the first phase of the benchmark. This report provides details of the benchmark specifications, method and code characteristics, and results of the preliminary study (pressure loss coefficients) and of Phase I. A comparison and analysis of the results will be performed together with those of Phase II.

  8. Sensitivity Analysis of OECD Benchmark Tests in BISON

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gamble, Kyle [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schmidt, Rodney C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Williamson, Richard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on sensitivity analysis of a fuels performance benchmark problem. The benchmark problem was defined by the Uncertainty Analysis in Modeling working group of the Nuclear Science Committee, part of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD). The benchmark problem involved steady-state behavior of a fuel pin in a Pressurized Water Reactor (PWR). The problem was created in the BISON Fuels Performance code. Dakota was used to generate and analyze 300 samples of 17 input parameters defining core boundary conditions, manufacturing tolerances, and fuel properties. There were 24 responses of interest, including fuel centerline temperatures at a variety of locations and burnup levels, fission gas released, axial elongation of the fuel pin, etc. Pearson and Spearman correlation coefficients and Sobol' variance-based indices were used to perform the sensitivity analysis. This report summarizes the process and presents results from this study.
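    As a rough illustration of the sampling-based approach described above, the sketch below computes Pearson and Spearman coefficients between sampled inputs and a response; the toy response function stands in for a BISON run and is not from the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        n = 300                                   # samples, as in the study
        X = rng.uniform(size=(n, 3))              # 3 stand-in input parameters
        # Hypothetical response: nonlinear in x0, linear in x1, x2 inert.
        y = X[:, 0] ** 2 + 0.5 * X[:, 1] + rng.normal(scale=0.01, size=n)

        for j in range(X.shape[1]):
            pear = stats.pearsonr(X[:, j], y)[0]
            spear = stats.spearmanr(X[:, j], y)[0]
            print(f"x{j}: Pearson={pear:+.3f}  Spearman={spear:+.3f}")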

  9. Yucca Mountain Project thermal and mechanical codes first benchmark exercise: Part 3, Jointed rock mass analysis

    International Nuclear Information System (INIS)

    Costin, L.S.; Bauer, S.J.

    1991-10-01

    Thermal and mechanical models for intact and jointed rock mass behavior are being developed, verified, and validated at Sandia National Laboratories for the Yucca Mountain Site Characterization Project. Benchmarking is an essential part of this effort and is one of the tools used to demonstrate verification of engineering software used to solve thermomechanical problems. This report presents the results of the third (and final) phase of the first thermomechanical benchmark exercise. In the first phase of this exercise, nonlinear heat conduction codes were used to solve the thermal portion of the benchmark problem. The results from the thermal analysis were then used as input to the second and third phases of the exercise, which consisted of solving the structural portion of the benchmark problem. In the second phase of the exercise, a linear elastic rock mass model was used. In the third phase of the exercise, two different nonlinear jointed rock mass models were used to solve the thermostructural problem. Both models, the Sandia compliant joint model and the RE/SPEC joint empirical model, explicitly incorporate the effect of the joints on the response of the continuum. Three different structural codes, JAC, SANCHO, and SPECTROM-31, were used with the above models in the third phase of the study. Each model was implemented in two different codes so that direct comparisons of results from each model could be made. The results submitted by the participants showed that the finite element solutions using each model were in reasonable agreement. Some consistent differences between the solutions using the two different models were noted but are not considered important to verification of the codes. 9 refs., 18 figs., 8 tabs.

  10. Higgs Pair Production: Choosing Benchmarks With Cluster Analysis

    CERN Document Server

    Carvalho, Alexandra; Dorigo, Tommaso; Goertz, Florian; Gottardo, Carlo A.; Tosi, Mia

    2016-01-01

    New physics theories often depend on a large number of free parameters. The precise values of those parameters in some cases drastically affect the resulting phenomenology of fundamental physics processes, while in others finite variations can leave it basically invariant at the level of detail experimentally accessible. When designing a strategy for the analysis of experimental data in the search for a signal predicted by a new physics model, it appears advantageous to categorize the parameter space describing the model according to the corresponding kinematical features of the final state. A multi-dimensional test statistic can be used to gauge the degree of similarity in the kinematics of different models; a clustering algorithm using that metric may then allow the division of the space into homogeneous regions, each of which can be successfully represented by a benchmark point. Searches targeting those benchmark points are then guaranteed to be sensitive to a large area of the parameter space. In this doc...

  11. Benchmark validation of statistical models: Application to mediation analysis of imagery and memory.

    Science.gov (United States)

    MacKinnon, David P; Valente, Matthew J; Wurpts, Ingrid C

    2018-03-29

    This article describes benchmark validation, an approach to validating a statistical model. According to benchmark validation, a valid model generates estimates and research conclusions consistent with a known substantive effect. Three types of benchmark validation are described and illustrated with examples: (a) benchmark value, (b) benchmark estimate, and (c) benchmark effect. Benchmark validation methods are especially useful for statistical models with assumptions that are untestable or very difficult to test. Benchmark effect validation methods were applied to evaluate statistical mediation analysis in eight studies using the established effect that increasing mental imagery improves recall of words. Statistical mediation analysis led to conclusions about mediation that were consistent with the established theory that increased imagery leads to increased word recall. Benchmark validation based on established substantive theory is discussed as a general way to investigate characteristics of statistical models and as a complement to mathematical proof and statistical simulation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
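    The mediation model evaluated in the imagery-recall studies estimates an indirect effect as the product of two regression slopes (a: treatment to mediator; b: mediator to outcome, controlling for treatment). A minimal sketch under that standard formulation, with simulated data, follows; the effect sizes are invented for illustration.

        import numpy as np

        def mediated_effect(x, m, y):
            """Estimate the indirect effect a*b from two OLS regressions:
            m ~ x (slope a) and y ~ x + m (slope b for the mediator)."""
            Xa = np.column_stack([np.ones_like(x), x])
            a = np.linalg.lstsq(Xa, m, rcond=None)[0][1]
            Xb = np.column_stack([np.ones_like(x), x, m])
            b = np.linalg.lstsq(Xb, y, rcond=None)[0][2]
            return a * b

        rng = np.random.default_rng(1)
        x = rng.integers(0, 2, 200).astype(float)   # imagery instruction (0/1)
        m = 0.8 * x + rng.normal(size=200)          # reported imagery use
        y = 0.6 * m + rng.normal(size=200)          # word recall
        print(mediated_effect(x, m, y))             # should be near 0.48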

  12. A new algorithm for benchmarking in integer data envelopment analysis

    Directory of Open Access Journals (Sweden)

    M. M. Omran

    2012-08-01

    The aim of this study is to investigate the effect of integer data in data envelopment analysis (DEA). The inputs and outputs in the various types of DEA models are assumed to be continuous. In many application-oriented problems, however, some or all of the data are integers, for example when the inputs/outputs represent numbers of cars, people, etc., and the continuity assumption no longer holds. The benchmark unit obtained by projection onto the efficiency frontier is artificial and need not have integer inputs/outputs. By rounding off the projection point, we may lose feasibility or end up with an inefficient DMU. In such cases, it is necessary to provide a benchmark unit such that the considered unit reaches efficiency. In the present short communication, by proposing a novel algorithm, the projection of an inefficient DMU is carried out in such a way that the resulting benchmark takes fully integer input/output values.
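    For reference, the continuous model that the proposed algorithm modifies is the standard input-oriented CCR envelopment problem: minimize theta subject to sum_j lambda_j x_ij <= theta x_io for each input and sum_j lambda_j y_rj >= y_ro for each output. Below is a minimal sketch of that baseline LP (not of the paper's integer algorithm), with invented data.

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of DMU o.

            X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
            n, m = X.shape
            s = Y.shape[1]
            # Decision vector z = [theta, lam_1..lam_n], minimize theta.
            c = np.zeros(1 + n); c[0] = 1.0
            # Input rows: sum_j lam_j x_ij - theta x_io <= 0
            A_in = np.hstack([-X[o].reshape(m, 1), X.T])
            # Output rows: -sum_j lam_j y_rj <= -y_ro
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                          bounds=[(0, None)] * (1 + n))
            return res.x[0]

        X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs
        Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs
        for o in range(len(X)):
            print(f"DMU {o}: theta = {ccr_efficiency(X, Y, o):.3f}")

    Rounding the projected point theta * x_o to integers is exactly where feasibility can be lost, which motivates the paper's modified projection.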

  13. The Data Envelopment Analysis Method in Benchmarking of Technological Incubators

    Directory of Open Access Journals (Sweden)

    Bożena Kaczmarska

    2010-01-01

    This paper presents an original concept for the application of Data Envelopment Analysis (DEA) in benchmarking processes within innovation and entrepreneurship centers, based on the example of technological incubators. Applying the DEA method, it is possible to order the analyzed objects on the basis of an explicitly defined relative efficiency, by compiling a rating list and rating classes. Establishing standards and indicating "clearances" allows the studied objects - innovation and entrepreneurship centers - to select a way of developing effectively, while preserving their individuality and a unique way of acting that accounts for local needs. (original abstract)

  14. Multi-Core Processor Memory Contention Benchmark Analysis Case Study

    Science.gov (United States)

    Simon, Tyler; McGalliard, James

    2009-01-01

    Multi-core processors dominate current mainframe, server, and high performance computing (HPC) systems. This paper provides synthetic kernel and natural benchmark results from an HPC system at the NASA Goddard Space Flight Center that illustrate the performance impacts of multi-core (dual- and quad-core) vs. single-core processor systems. Analysis of processor design, application source code, and synthetic and natural test results all indicate that multi-core processors can suffer from significant memory subsystem contention compared to similar single-core processors.

  15. Space Weather Action Plan Ionizing Radiation Benchmarks: Phase 1 update and plans for Phase 2

    Science.gov (United States)

    Talaat, E. R.; Kozyra, J.; Onsager, T. G.; Posner, A.; Allen, J. E., Jr.; Black, C.; Christian, E. R.; Copeland, K.; Fry, D. J.; Johnston, W. R.; Kanekal, S. G.; Mertens, C. J.; Minow, J. I.; Pierson, J.; Rutledge, R.; Semones, E.; Sibeck, D. G.; St Cyr, O. C.; Xapsos, M.

    2017-12-01

    Changes in the near-Earth radiation environment can affect satellite operations, astronauts in space, commercial space activities, and the radiation environment on aircraft at relevant latitudes or altitudes. Understanding the diverse effects of increased radiation is challenging, but producing ionizing radiation benchmarks will help address these effects. The following areas have been considered in addressing the near-Earth radiation environment: the Earth's trapped radiation belts, the galactic cosmic ray background, and solar energetic-particle events. The radiation benchmarks attempt to account for any change in the near-Earth radiation environment which, under extreme cases, could present a significant risk to critical infrastructure operations or human health. The goal of these ionizing radiation benchmarks and associated confidence levels is to define, at a minimum, the radiation intensity as a function of time, particle type, and energy, both for an occurrence frequency of 1 in 100 years and for an intensity level at the theoretical maximum for the event. In this paper, we present the benchmarks that address radiation levels at all applicable altitudes and latitudes in the near-Earth environment, the assumptions made and the associated uncertainties, and the next steps planned for updating the benchmarks.
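    One generic way to turn an event catalog into a 1-in-100-year intensity is an extreme-value fit to annual maxima. The sketch below uses a Gumbel fit purely as an illustration; it is not the methodology of the Space Weather Action Plan benchmarks, and the catalog is synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Hypothetical catalog: annual maxima of event intensity (arbitrary units).
        annual_max = rng.gumbel(loc=100.0, scale=40.0, size=60)

        # Fit a Gumbel distribution and read off the 100-year return level,
        # i.e. the intensity exceeded with probability 1/100 per year.
        loc, scale = stats.gumbel_r.fit(annual_max)
        return_level_100 = stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc, scale)
        print(f"1-in-100-year intensity estimate: {return_level_100:.1f}")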

  16. Post-BEMUSE Reflood Model Input Uncertainty Methods (PREMIUM) Benchmark Phase II: Identification of Influential Parameters

    International Nuclear Information System (INIS)

    Kovtonyuk, A.; Petruzzi, A.; D'Auria, F.

    2015-01-01

    heat transfer coefficients, a qualitative (but not quantitative) agreement between different codes is observed. - For other parameters, like the interphase friction coefficient and droplet diameter, a contrary behaviour (i.e., at one of the extremes of the IP range the direction of change of the responses differs) between different codes, and even between different selected models within the same code, can be observed. This suggests that the effect of such parameters on the cladding temperatures is quite complex, probably because it involves many physical models (e.g., via the interphase friction and interphase heat transfer coefficients for the droplet diameter). It shall be noted that the analysis of differences between the reflood models of different codes is out of the scope of the PREMIUM benchmark. Nevertheless, it is recommended to take into account the physical models/input parameters found to be influential by the other participants in order to select the influential input parameters for which uncertainties are to be quantified within Phase III of PREMIUM. In particular, input parameters identified as influential by other participants using the same code should be considered.

  17. Numisheet2005 Benchmark Analysis on Forming of an Automotive Deck Lid Inner Panel: Benchmark 1

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao Jian

    2005-01-01

    Numerical simulation of sheet metal forming processes has been a very challenging topic in industry. Many computer codes and modeling techniques exist today. However, there are many unknowns affecting prediction accuracy. Systematic benchmark tests are needed to accelerate future implementations and to serve as a reference. This report presents an international cooperative benchmark effort for an automotive deck lid inner panel. Predictions from simulations are analyzed and discussed against the corresponding experimental results. The prediction accuracy of each parameter of interest is discussed in this report.

  18. Tendances Carbone no. 79 'Free allocations under Phase 3 benchmarks: early evidence of what has changed'

    International Nuclear Information System (INIS)

    Sartor, Oliver

    2013-01-01

    Among the publications of CDC Climat Research, the 'Tendances Carbone' bulletin specifically studies developments in the European market for CO2 allowances. This issue addresses the following points: one of the most controversial changes to the EU ETS in Phase 3 (2013-2020) has been the introduction of emissions-performance benchmarks for determining free allocations to non-electricity producers. Phases 1 and 2 used National Allocation Plans (NAPs). For practical reasons NAPs were drawn up by each Member State, but this led to problems, including over-generous allowance allocation, insufficiently harmonised allocations across countries and distorted incentives to reduce emissions. Benchmarking tries to fix these problems by allocating the equivalent of 100% of the allowances needed if every installation used the best available technology. But this is not universally popular, and industries say that they might lose international competitiveness. A new study by CDC Climat and the Climate Economics Chair therefore examined the data from the preliminary Phase 3 free allocations of 20 EU Member States and asked: how much are free allocations actually going to change with benchmarking?

  19. Benchmark analysis of MCNP™ ENDF/B-VI iron

    International Nuclear Information System (INIS)

    Court, J.D.; Hendricks, J.S.

    1994-12-01

    The MCNP ENDF/B-VI iron cross-section data were subjected to four benchmark studies as part of the Hiroshima/Nagasaki dose re-evaluation for the National Academy of Sciences and the Defense Nuclear Agency. The four benchmark studies were: (1) the iron sphere benchmarks from the Lawrence Livermore Pulsed Spheres; (2) the Oak Ridge National Laboratory Fusion Reactor Shielding Benchmark; (3) a 76-cm-diameter iron sphere benchmark done at the University of Illinois; and (4) the Oak Ridge National Laboratory Benchmark for Neutron Transport through Iron. MCNP4A was used to model each benchmark, and computational results from the ENDF/B-VI iron evaluations were compared to ENDF/B-IV, ENDF/B-V, the MCNP Recommended Data Set (which includes Los Alamos National Laboratory Group T-2 evaluations), and experimental data. The results show that the ENDF/B-VI iron evaluations are as good as, or better than, previous data sets.

  20. Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD

    International Nuclear Information System (INIS)

    Jang, Sang Hoon; Shim, Hyung Jin

    2016-01-01

    The benchmark consists of 4 phases, starting from local standalone modeling (Phase I) and progressing to the safety calculation of the coupled system in a transient situation (Phase IV). As a preliminary study of UAM on HTGRs, this paper covers Exercises 1 and 2 of Phase I, which define the unit cell and lattice geometry of the MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data, as well as to analyze specific features of HTGRs such as double heterogeneity and self-shielding treatment. The uncertainty quantification of the IAEA CRP HTGR UAM benchmarks was conducted using the first-order AWP method in McCARD. The uncertainty of the multiplication factor was estimated only for the microscopic cross-section perturbation. To reduce the computation time and memory shortage, the recently implemented uncertainty analysis module in the MC Wielandt calculation was adapted. The covariance data for the cross sections were generated by the NJOY/ERRORR module with ENDF/B-VII.1. The numerical results were compared with the evaluation results of the DeCART/MUSAD code system developed by KAERI. IAEA CRP HTGR UAM benchmark problems were analyzed using McCARD. The numerical results were compared with Serpent for the eigenvalue calculation and with DeCART/MUSAD for the S/U analysis. In the eigenvalue calculation, inconsistencies were found in the results with the ENDF/B-VII.1 cross-section library and were found to be the effect of the thermal scattering data of graphite. As to the S/U analysis, the McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in 238U capture regarding the implicit uncertainty.
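    The first-order propagation underlying this kind of S/U analysis is the sandwich rule, var(k) = S C S^T, with S the sensitivity profile of k-eff to the cross sections and C their covariance matrix. A toy numerical sketch follows; the matrices are invented stand-ins, not NJOY/ERRORR output.

        import numpy as np

        # First-order ("sandwich") uncertainty propagation: var(k) = S C S^T,
        # where S holds relative sensitivities of k-eff to the cross sections
        # and C is their relative covariance matrix (here a 3x3 toy stand-in).
        S = np.array([[0.12, -0.05, 0.30]])          # dk/k per dx/x, illustrative
        C = np.array([[4.0e-4, 1.0e-4, 0.0],
                      [1.0e-4, 2.5e-4, 0.0],
                      [0.0,    0.0,    9.0e-4]])     # relative covariances
        var_k = (S @ C @ S.T).item()
        print(f"relative k-eff uncertainty = {np.sqrt(var_k):.4%}")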

  1. Power loss benchmark of nine-switch converters in three-phase online-UPS application

    DEFF Research Database (Denmark)

    Qin, Zian; Loh, Poh Chiang; Blaabjerg, Frede

    2014-01-01

    Three-phase online-UPS is an appropriate application for the nine-switch converter, since the high voltage stress on the power devices caused by its reduced switch count can be relieved significantly. Its power loss and loss distribution still have flexibility from the control point of view... as parameters like the modulation index and the phase angle of the load are taken into account. The benchmark of power loss provides guidance for users to make the best use of the advantages, and avoid the disadvantages, of nine-switch converters. The results are finally verified on a 1.5 kW prototype.

  2. Plant improvements through the use of benchmarking analysis

    International Nuclear Information System (INIS)

    Messmer, J.R.

    1993-01-01

    As utilities approach the turn of the century, customer and shareholder satisfaction is threatened by rising costs. Environmental compliance expenditures, coupled with low load growth and aging plant assets, are forcing utilities to operate existing resources in a more efficient and productive manner. PSI Energy set out in the spring of 1992 on a benchmarking mission to compare four major coal-fired plants against others of similar size and makeup, with the goal of finding the best operations in the country. Following extensive analysis of the 'Best in Class' operations, detailed goals and objectives were established for each plant in seven critical areas. Three critical processes requiring rework were identified and required an integrated effort from all plants. The Plant Improvement process has already resulted in higher operating productivity, increased emphasis on planning, and lower costs due to effective material management. While every company seeks improvement, goals are often set in an ambiguous manner. Benchmarking aids in setting realistic goals based on others' actual accomplishments. This paper describes how the utility's short-term goals will move it toward being a lower-cost producer.

  3. Higgs pair production: choosing benchmarks with cluster analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Alexandra; Dall’Osso, Martino; Dorigo, Tommaso [Dipartimento di Fisica e Astronomia and INFN, Sezione di Padova,Via Marzolo 8, I-35131 Padova (Italy); Goertz, Florian [CERN,1211 Geneva 23 (Switzerland); Gottardo, Carlo A. [Physikalisches Institut, Universität Bonn,Nussallee 12, 53115 Bonn (Germany); Tosi, Mia [CERN,1211 Geneva 23 (Switzerland)

    2016-04-20

    New physics theories often depend on a large number of free parameters. The phenomenology they predict for fundamental physics processes is in some cases drastically affected by the precise value of those free parameters, while in other cases is left basically invariant at the level of detail experimentally accessible. When designing a strategy for the analysis of experimental data in the search for a signal predicted by a new physics model, it appears advantageous to categorize the parameter space describing the model according to the corresponding kinematical features of the final state. A multi-dimensional test statistic can be used to gauge the degree of similarity in the kinematics predicted by different models; a clustering algorithm using that metric may allow the division of the space into homogeneous regions, each of which can be successfully represented by a benchmark point. Searches targeting those benchmarks are then guaranteed to be sensitive to a large area of the parameter space. In this document we show a practical implementation of the above strategy for the study of non-resonant production of Higgs boson pairs in the context of extensions of the standard model with anomalous couplings of the Higgs bosons. A non-standard value of those couplings may significantly enhance the Higgs boson pair-production cross section, such that the process could be detectable with the data that the LHC will collect in Run 2.
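    A minimal sketch of the clustering step follows, using ordinary k-means on two invented kinematic summaries per parameter-space point and taking the member nearest each centroid as the benchmark. Note this is a simplification: the paper's own metric is a multi-dimensional test statistic on final-state kinematics, not the Euclidean distance used here.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(3)
        # Stand-in kinematic summaries per parameter-space point, e.g. mean
        # m_HH and mean pT(H) of the generated di-Higgs sample (toy units).
        features = np.vstack([rng.normal([400, 150], 20, (50, 2)),
                              rng.normal([600, 250], 25, (50, 2)),
                              rng.normal([350, 80], 15, (50, 2))])

        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
        for c, centre in enumerate(km.cluster_centers_):
            members = np.where(km.labels_ == c)[0]
            # Benchmark point: the member closest to the cluster centre.
            bench = members[np.argmin(np.linalg.norm(features[members] - centre, axis=1))]
            print(f"cluster {c}: {len(members)} points, benchmark index {bench}")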

  4. Benchmarking of LOFT LRTS-COBRA-FRAP safety analysis model

    International Nuclear Information System (INIS)

    Hanson, G.H.; Atkinson, S.A.; Wadkins, R.P.

    1982-05-01

    The purpose of this work was to check the LOFT LRTS/COBRA-IV/FRAP-T5 safety-analysis models against test data obtained during a LOFT operational transient in which there was a power and fuel-temperature rise. LOFT Experiment L6-3 was an excessive-load-increase anticipated transient test in which the main steam-flow-control valve was driven from its operational position to full-open in seven seconds. The resulting cooldown and reactivity-increase transients provide a good benchmark for the reactivity-and-power-prediction capability of the LRTS calculations, and for the fuel-bundle and fuel-rod temperature-response analysis capability of the LOFT COBRA-IV and FRAP-T5 models.

  5. Sustaining knowledge in the neutron generator community and benchmarking study. Phase II.

    Energy Technology Data Exchange (ETDEWEB)

    Huff, Tameka B.; Stubblefield, William Anthony; Cole, Benjamin Holland, II; Baldonado, Esther

    2010-08-01

    This report documents the second phase of work under the Sustainable Knowledge Management (SKM) project for the Neutron Generator organization at Sandia National Laboratories. Previous work under this project is documented in SAND2008-1777, Sustaining Knowledge in the Neutron Generator Community and Benchmarking Study. Knowledge management (KM) systems are necessary to preserve critical knowledge within organizations. A successful KM program should focus on people and the process for sharing, capturing, and applying knowledge. The Neutron Generator organization is developing KM systems to ensure knowledge is not lost. A benchmarking study involving site visits to outside industry plus additional resource research was conducted during this phase of the SKM project. The findings presented in this report are recommendations for making an SKM program successful. The recommendations are activities that promote sharing, capturing, and applying knowledge. The benchmarking effort, including the site visits to Toyota and Halliburton, provided valuable information on how the SEA KM team could incorporate a KM solution for not just the neutron generators (NG) community but the entire laboratory. The laboratory needs a KM program that allows members of the workforce to access, share, analyze, manage, and apply knowledge. KM activities, such as communities of practice (COP) and sharing best practices, provide a solution towards creating an enabling environment for KM. As more and more people leave organizations through retirement and job transfer, the need to preserve knowledge is essential. Creating an environment for the effective use of knowledge is vital to achieving the laboratory's mission.

  6. Sustaining knowledge in the neutron generator community and benchmarking study. Phase II

    International Nuclear Information System (INIS)

    Huff, Tameka B.; Stubblefield, William Anthony; Cole, Benjamin Holland II; Baldonado, Esther

    2010-01-01

    This report documents the second phase of work under the Sustainable Knowledge Management (SKM) project for the Neutron Generator organization at Sandia National Laboratories. Previous work under this project is documented in SAND2008-1777, Sustaining Knowledge in the Neutron Generator Community and Benchmarking Study. Knowledge management (KM) systems are necessary to preserve critical knowledge within organizations. A successful KM program should focus on people and the process for sharing, capturing, and applying knowledge. The Neutron Generator organization is developing KM systems to ensure knowledge is not lost. A benchmarking study involving site visits to outside industry plus additional resource research was conducted during this phase of the SKM project. The findings presented in this report are recommendations for making an SKM program successful. The recommendations are activities that promote sharing, capturing, and applying knowledge. The benchmarking effort, including the site visits to Toyota and Halliburton, provided valuable information on how the SEA KM team could incorporate a KM solution for not just the neutron generators (NG) community but the entire laboratory. The laboratory needs a KM program that allows members of the workforce to access, share, analyze, manage, and apply knowledge. KM activities, such as communities of practice (COP) and sharing best practices, provide a solution towards creating an enabling environment for KM. As more and more people leave organizations through retirement and job transfer, the need to preserve knowledge is essential. Creating an environment for the effective use of knowledge is vital to achieving the laboratory's mission.

  7. List of benchmarks for simulation tools of steam-water two-phase flows

    Energy Technology Data Exchange (ETDEWEB)

    Mimouni, S. [Electricite de France (EDF), Div. R and D, 78 - Chatou (France); Serre, G. [CEA Grenoble, Dept. de Thermohydraulique et de Physique, DTP, 38 (France)

    2001-07-01

    A physical-numerical benchmark matrix was drawn up in the context of the ECUME co-development action. Its purpose is to test the capabilities required of the numerical methods to be used in future codes, which will benefit from advanced physics simulations. This benchmark matrix is to be applied to each numerical method in order to answer the following questions: What two-phase flow fields can the combination of physics model and numerical scheme handle? What is the accuracy of the scheme for each type of physics situation? What is the numerical efficiency (computing time) of the numerical scheme for each type of physics situation? (author)

  8. List of benchmarks for simulation tools of steam-water two-phase flows

    International Nuclear Information System (INIS)

    Mimouni, S.; Serre, G.

    2001-01-01

    A physical-numerical benchmark matrix was drawn up in the context of the ECUME co-development action. Its purpose is to test the capabilities required of the numerical methods to be used in future codes, which will benefit from advanced physics simulations. This benchmark matrix is to be applied to each numerical method in order to answer the following questions: What two-phase flow fields can the combination of physics model and numerical scheme handle? What is the accuracy of the scheme for each type of physics situation? What is the numerical efficiency (computing time) of the numerical scheme for each type of physics situation? (author)

  9. Benchmarking of Grid Fault Modes in Single-Phase Grid-Connected Photovoltaic Systems

    DEFF Research Database (Denmark)

    Yang, Yongheng; Blaabjerg, Frede; Zou, Zhixiang

    2013-01-01

    Pushed by the booming installations of single-phase photovoltaic (PV) systems, the grid demands regarding the integration of PV systems are expected to be modified. Hence, future PV systems should become more active, with functionalities of Low Voltage Ride-Through (LVRT) and grid support... phase systems under grid faults. The intent of this paper is to present a benchmarking of grid fault modes that might come in future single-phase PV systems. In order to map future challenges, the relevant synchronization and control strategies are discussed. Some faulty modes are studied experimentally... and provided at the end of this paper. It is concluded that there are extensive control possibilities in single-phase PV systems under grid faults. The Second Order General Integral based PLL technique might be the most promising candidate for future single-phase PV systems because of its fast adaptive...

  10. Thermal hydraulics-II. 2. Benchmarking of the TRIO Two-Phase-Flow Module

    International Nuclear Information System (INIS)

    Helton, Donald; Kumbaro, Anela; Hassan, Yassin

    2001-01-01

    The Commissariat a l'Energie Atomique (CEA) is currently developing a two-phase-flow module for the Trio-U CFD computer program. Work in the area of advanced numerical technique application to two-phase flow is being carried out by the SYSCO division at the CEA Saclay center. Recently, this division implemented several advanced numerical solvers, including approximate Riemann solvers and flux vector splitting schemes. As a test of these new advances, several benchmark tests were executed. This paper describes the pertinent results of this study. The first benchmark problem was the Ransom faucet problem. This problem consists of a vertical column of water acting under the gravity force. The appeal of this problem is that it tests the program's handling of the body force term and it has an analytical solution. The Trio results [based on a two-fluid, two-dimensional (2-D) simulation] for this problem were very encouraging. The two-phase-flow module was able to reproduce the analytical velocity and void fraction profiles. A reasonable amount of numerical diffusion was observed, and the numerical solution converged to the analytical solution as the grid size was refined, as shown in Fig. 1. A second series of benchmark problems is concerned with the employment of a drag force term. In a first approach, we test the capability of the code to take account of this source term, using a flux scheme solution technique. For this test, a rectangular duct was utilized. As shown in Fig. 2, mesh refinement results in an approach to the analytical solution. Next, a convergent/divergent nozzle problem is proposed. The nozzle is characterized by a brief contraction section and a long expansion section. A two-phase, 2-D, non-condensing model is used in conjunction with the Riemann solver. Figure 3 shows a comparison of the pressure profile for the experimental case and for the values calculated by the TRIO U two-phase-flow module. Trio was able to handle the drag force term and
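    The Ransom faucet problem referenced above has a closed-form solution, which is what makes it a useful convergence benchmark. Below is a sketch of that analytical solution under the usual statement of the problem (12 m column, 10 m/s inlet velocity, initial void fraction 0.2); these parameter values are the common textbook ones, not taken from this record.

        import numpy as np

        def ransom_faucet(x, t, v0=10.0, alpha0=0.2, g=9.81):
            """Analytical solution of the Ransom faucet problem: liquid
            velocity and gas void fraction along a vertical column at time t."""
            front = v0 * t + 0.5 * g * t ** 2       # position reached by the
            v = np.where(x <= front,                # accelerated liquid front
                         np.sqrt(v0 ** 2 + 2.0 * g * x),
                         v0 + g * t)
            alpha = np.where(x <= front,
                             1.0 - (1.0 - alpha0) * v0 / np.sqrt(v0 ** 2 + 2.0 * g * x),
                             alpha0)
            return v, alpha

        x = np.linspace(0.0, 12.0, 7)               # 12 m column
        v, alpha = ransom_faucet(x, t=0.5)
        for xi, vi, ai in zip(x, v, alpha):
            print(f"x={xi:5.1f} m  v={vi:6.2f} m/s  alpha={ai:.3f}")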

  11. Benchmarking Analysis between CONTEMPT and COPATTA Containment Codes

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Kwi Hyun; Song, Wan Jung [ENERGEO Inc. Sungnam, (Korea, Republic of); Song, Dong Soo; Byun, Choong Sup [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    The containment must ensure that releases of radioactive materials subsequent to an accident do not result in doses in excess of the values specified in 10 CFR 100. The containment must withstand the pressure and temperature of the DBA (Design Basis Accident), including margin, without exceeding the design leakage rate. COPATTA, Bechtel's vendor code, is used for the containment pressure and temperature prediction in the power uprating project for the Kori 3,4 and Yonggwang 1,2 nuclear power plants (NPPs). However, CONTEMPT-LT/028 is used for calculating the containment pressures and temperatures in the equipment qualification project for the same NPPs. During the benchmarking analysis between the two codes, it was found that the two codes have model differences. This paper shows the performance evaluation results arising from the main model differences.

  12. Benchmarking Analysis between CONTEMPT and COPATTA Containment Codes

    International Nuclear Information System (INIS)

    Seo, Kwi Hyun; Song, Wan Jung; Song, Dong Soo; Byun, Choong Sup

    2006-01-01

    The containment must ensure that releases of radioactive materials subsequent to an accident do not result in doses in excess of the values specified in 10 CFR 100. The containment must withstand the pressure and temperature of the DBA (Design Basis Accident), including margin, without exceeding the design leakage rate. COPATTA, Bechtel's vendor code, is used for the containment pressure and temperature prediction in the power uprating project for the Kori 3,4 and Yonggwang 1,2 nuclear power plants (NPPs). However, CONTEMPT-LT/028 is used for calculating the containment pressures and temperatures in the equipment qualification project for the same NPPs. During the benchmarking analysis between the two codes, it was found that the two codes have model differences. This paper shows the performance evaluation results arising from the main model differences.

  13. Evaluation of piping fracture analysis method by benchmark study, 1

    International Nuclear Information System (INIS)

    Takahashi, Yukio; Kashima, Koichi; Kuwabara, Kazuo

    1987-01-01

    The importance of strength evaluation methods for cracked piping is growing with the progress of the rationalization of nuclear piping systems based on the leak-before-break concept. As an analytical tool, the finite element method is principally used. To obtain reliable solutions from finite element programs, it is important to grasp the influence of various factors on the solutions. In this study, a benchmark analysis is carried out for a stainless steel pipe with a circumferential through-wall crack subjected to four-point bending. Eight solutions obtained using five finite element programs are compared with each other. Good agreement is obtained between the solutions for the deformation characteristics as well as the fracture mechanics parameters. It is found through this study that the influence of differences in solution technique is generally small. (author)

  14. Benchmarking the x-ray phase contrast imaging for ICF DT ice characterization using roughened surrogates

    Energy Technology Data Exchange (ETDEWEB)

    Dewald, E; Kozioziemski, B; Moody, J; Koch, J; Mapoles, E; Montesanti, R; Youngblood, K; Letts, S; Nikroo, A; Sater, J; Atherton, J

    2008-06-26

    We use x-ray phase contrast imaging to characterize the inner surface roughness of DT ice layers in capsules planned for future ignition experiments. It is therefore important to quantify how well the x-ray data correlates with the actual ice roughness. We benchmarked the accuracy of our system using surrogates with fabricated roughness characterized with high precision standard techniques. Cylindrical artifacts with azimuthally uniform sinusoidal perturbations with 100 um period and 1 um amplitude demonstrated 0.02 um accuracy limited by the resolution of the imager and the source size of our phase contrast system. Spherical surrogates with random roughness close to that required for the DT ice for a successful ignition experiment were used to correlate the actual surface roughness to that obtained from the x-ray measurements. When comparing average power spectra of individual measurements, the accuracy mode number limits of the x-ray phase contrast system benchmarked against surface characterization performed by Atomic Force Microscopy are 60 and 90 for surrogates smoother and rougher than the required roughness for the ice. These agreement mode number limits are >100 when comparing matching individual measurements. We will discuss the implications for interpreting DT ice roughness data derived from phase-contrast x-ray imaging.
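    Comparing average power spectra mode-by-mode, as done above against the AFM characterization, can be sketched as follows; the traces here are synthetic (a single dominant mode plus noise) and the routine is illustrative, not the authors' analysis code.

        import numpy as np

        def mean_power_spectrum(traces):
            """Average one-sided power spectrum of surface roughness traces.

            traces: (n_traces, n_points) array of surface height samples taken
            around the circumference; returns power per mode number."""
            traces = traces - traces.mean(axis=1, keepdims=True)   # remove DC
            spectra = np.abs(np.fft.rfft(traces, axis=1)) ** 2
            return spectra.mean(axis=0)

        rng = np.random.default_rng(5)
        n = 512
        theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
        # Hypothetical traces: a perturbation at mode 8 plus fine-scale noise.
        traces = np.sin(8 * theta) + 0.1 * rng.normal(size=(10, n))
        ps = mean_power_spectrum(traces)
        print("dominant mode:", np.argmax(ps[1:]) + 1)   # should report mode 8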

  15. International Benchmark on Pressurised Water Reactor Sub-channel and Bundle Tests. Volume II: Benchmark Results of Phase I: Void Distribution

    International Nuclear Information System (INIS)

    Rubin, Adam; Avramova, Maria; Velazquez-Lozada, Alexander

    2016-03-01

    This report summarised the first phase of the Nuclear Energy Agency (NEA) and the US Nuclear Regulatory Commission Benchmark based on NUPEC PWR Sub-channel and Bundle Tests (PSBT), which was intended to provide data for the verification of void distribution models in participants' codes. This phase was composed of four exercises: Exercise 1, a steady-state single sub-channel benchmark; Exercise 2, a steady-state rod bundle benchmark; Exercise 3, a transient rod bundle benchmark; and Exercise 4, a pressure drop benchmark. The experimental data provided to the participants of this benchmark is from a series of void measurement tests using full-size mock-up tests for both Boiling Water Reactors (BWRs) and Pressurised Water Reactors (PWRs). These tests were performed from 1987 to 1995 by the Nuclear Power Engineering Corporation (NUPEC) in Japan and made available by the Japan Nuclear Energy Safety Organisation (JNES) for the purposes of this benchmark, which was organised by Pennsylvania State University. Twenty-one institutions from nine countries participated in this benchmark. Seventeen different computer codes were used in Exercises 1, 2, 3 and 4. Among the computer codes were porous media, sub-channel, systems thermal-hydraulic and Computational Fluid Dynamics (CFD) codes. It was observed that the codes tended to overpredict the thermal equilibrium quality at lower elevations and underpredict it at higher elevations. There was also a tendency to overpredict void fraction at lower elevations and underpredict it at high elevations for the bundle test cases. The overprediction of void fraction at low elevations is likely caused by the x-ray densitometer measurement method used. Under sub-cooled boiling conditions, the voids accumulate at heated surfaces (and are therefore not seen in the centre of the sub-channel, where the measurements are being taken), so the experimentally-determined void fractions will be lower than the actual void fraction. Some of the best

  16. Cross-section sensitivity and uncertainty analysis of the FNG copper benchmark experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodeli, I., E-mail: ivan.kodeli@ijs.si [Jožef Stefan Institute, Jamova 39, SI-1000 Ljubljana (Slovenia); Kondo, K. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany); Japan Atomic Energy Agency, Rokkasho-mura (Japan); Perel, R.L. [Racah Institute of Physics, Hebrew University of Jerusalem, IL-91904 Jerusalem (Israel); Fischer, U. [Karlsruhe Institute of Technology, Postfach 3640, D-76021 Karlsruhe (Germany)

    2016-11-01

    A neutronics benchmark experiment on a copper assembly was performed from the end of 2014 to the beginning of 2015 at the 14-MeV Frascati neutron generator (FNG) of ENEA Frascati, with the objective of providing the experimental database required for the validation of the copper nuclear data relevant for ITER design calculations, including the related uncertainties. The paper presents the pre- and post-analysis of the experiment performed using cross-section sensitivity and uncertainty codes, both deterministic (SUSD3D) and Monte Carlo (MCSEN5). Cumulative reaction rates and neutron flux spectra, their sensitivity to the cross sections, as well as the corresponding uncertainties were estimated for different selected detector positions up to ∼58 cm in the copper assembly. This made it possible, in the pre-analysis phase, to optimize the geometry, the detector positions and the choice of activation reactions, and, in the post-analysis phase, to interpret the results of the measurements and the calculations, to conclude on the quality of the relevant nuclear cross-section data, and to estimate the uncertainties in the calculated nuclear responses and fluxes. Large uncertainties in the calculated reaction rates and neutron spectra of up to 50%, rarely observed at this level in benchmark analyses using today's nuclear data, were predicted, particularly high for fast reactions. The observed C/E (dis)agreements, with values as low as 0.5, partly confirm these predictions. The benchmark results are therefore expected to contribute to the improvement of both cross-section and covariance data evaluations.

  17. Benchmarking analysis of three multimedia models: RESRAD, MMSOILS, and MEPAS

    International Nuclear Information System (INIS)

    Cheng, J.J.; Faillace, E.R.; Gnanapragasam, E.K.

    1995-11-01

    Multimedia modelers from the United States Environmental Protection Agency (EPA) and the United States Department of Energy (DOE) collaborated to conduct a comprehensive and quantitative benchmarking analysis of three multimedia models. The three models, RESRAD (DOE), MMSOILS (EPA), and MEPAS (DOE), represent analytically based tools that are used by the respective agencies for performing human exposure and health risk assessments. The study is performed by individuals who participate directly in the ongoing design, development, and application of the models. A list of physical/chemical/biological processes related to multimedia-based exposure and risk assessment is first presented as a basis for comparing the overall capabilities of RESRAD, MMSOILS, and MEPAS. Model design, formulation, and function are then examined by applying the models to a series of hypothetical problems. Major components of the models (e.g., atmospheric, surface water, groundwater) are evaluated separately and then studied as part of an integrated system for the assessment of a multimedia release scenario to determine effects due to linking components of the models. Seven modeling scenarios are used in the conduct of this benchmarking study: (1) direct biosphere exposure, (2) direct release to the air, (3) direct release to the vadose zone, (4) direct release to the saturated zone, (5) direct release to surface water, (6) surface water hydrology, and (7) multimedia release. Study results show that the models differ with respect to (1) environmental processes included (i.e., model features) and (2) the mathematical formulation and assumptions related to the implementation of solutions (i.e., parameterization).

  18. Benchmarking Foot Trajectory Estimation Methods for Mobile Gait Analysis

    Directory of Open Access Journals (Sweden)

    Julius Hannink

    2017-08-01

    Mobile gait analysis systems based on inertial sensing on the shoe are applied in a wide range of applications. Especially for medical applications, they can give new insights into motor impairment in, e.g., neurodegenerative disease and help objectify patient assessment. One key component in these systems is the reconstruction of the foot trajectories from inertial data. In the literature, various methods for this task have been proposed. However, performance is evaluated on a variety of datasets due to the lack of large, generally accepted benchmark datasets. This hinders a fair comparison of methods. In this work, we implement three orientation estimation and three double integration schemes for use in a foot trajectory estimation pipeline. All methods are drawn from the literature and evaluated against a marker-based motion capture reference. We provide a fair comparison on the same dataset consisting of 735 strides from 16 healthy subjects. As a result, the implemented methods are ranked and we identify the most suitable processing pipeline for foot trajectory estimation in the context of mobile gait analysis.
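    A typical double integration scheme of the kind benchmarked here removes gravity in the world frame, integrates twice, and uses zero-velocity updates at stance to correct drift. The sketch below shows that pattern on a synthetic stride; the sampling rate, signal, and linear drift model are all assumptions, and the orientation estimation step is omitted.

        import numpy as np

        def integrate_stride(acc_world, fs):
            """Double-integrate world-frame acceleration over one stride,
            assuming zero velocity at both stride boundaries (ZUPT) and
            removing the residual velocity drift linearly."""
            dt = 1.0 / fs
            g = np.array([0.0, 0.0, 9.81])
            a = acc_world - g                       # remove gravity
            v = np.cumsum(a, axis=0) * dt           # first integration
            # Linear drift correction so that v ends at exactly zero.
            v -= np.outer(np.linspace(0.0, 1.0, len(v)), v[-1])
            return np.cumsum(v, axis=0) * dt        # second integration

        fs = 200.0                                  # Hz, typical foot-worn IMU
        t = np.arange(0, 1.0, 1.0 / fs)
        acc = np.zeros((len(t), 3))
        acc[:, 2] = 9.81                            # gravity on z only
        acc[:, 0] = 8.0 * np.sin(2 * np.pi * t)     # forward push, then braking
        p = integrate_stride(acc, fs)
        print("estimated stride length: %.2f m" % p[-1, 0])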

  19. Benchmarking criticality analysis of TRIGA fuel storage racks.

    Science.gov (United States)

    Robinson, Matthew Loren; DeBey, Timothy M; Higginbotham, Jack F

    2017-01-01

    A criticality analysis was benchmarked to sub-criticality measurements of the hexagonal fuel storage racks at the United States Geological Survey TRIGA MARK I reactor in Denver. These racks, which hold up to 19 fuel elements each, are arranged at 0.61 m (2 ft) spacings around the outer edge of the reactor. A 3-dimensional model of the racks was created using MCNP5, and the model was verified experimentally by comparison to measured subcritical multiplication data collected in an approach-to-critical loading of two of the racks. The validated model was then used to show that in the extreme condition where the entire circumference of the pool is lined with racks loaded with used fuel, the storage array is subcritical with a k value of about 0.71, well below the regulatory limit of 0.8. A model was also constructed of the rectangular 2×10 fuel storage array used in many other TRIGA reactors to validate the technique against the original TRIGA licensing sub-critical analysis performed in 1966. The fuel used in this study was standard 20% enriched (LEU) aluminum- or stainless-steel-clad TRIGA fuel. Copyright © 2016. Published by Elsevier Ltd.
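    The approach-to-critical data mentioned above are normally reduced with the inverse-multiplication (1/M) method: 1/M falls roughly linearly as fuel is added, and the extrapolated zero crossing estimates the critical loading. A sketch with invented count rates follows.

        import numpy as np

        # Inverse-multiplication (1/M) extrapolation from an approach-to-critical
        # loading: M is the ratio of the detector count rate to the source-only
        # rate; criticality is predicted where a linear fit of 1/M crosses zero.
        elements = np.array([4, 8, 12, 16, 19])                  # fuel elements loaded
        count_rate = np.array([120., 150., 205., 330., 610.])    # hypothetical counts/s
        source_rate = 100.0                                      # source-only counts/s

        inv_m = source_rate / count_rate
        slope, intercept = np.polyfit(elements, inv_m, 1)
        print(f"predicted critical loading: {-intercept / slope:.1f} elements")

    For a storage rack meant to stay subcritical, the extrapolated loading exceeding the rack capacity is the reassuring outcome.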

  20. MCNP analysis of the nine-cell LWR gadolinium benchmark

    International Nuclear Information System (INIS)

    Arkuszewski, J.J.

    1988-01-01

    The Monte Carlo results for a 9-cell fragment of the light water reactor square lattice with a central gadolinium-loaded pin are presented. The calculations are performed with the code MCNP-3A and the ENDF-B/5 library and compared with the results obtained from the BOXER code system and the JEF-1 library. The objective of this exercise is to study the feasibility of BOXER for the analysis of a Gd-loaded LWR lattice in the broader framework of GAP International Benchmark Analysis. A comparison of results indicates that, apart from unavoidable discrepancies originating from different data evaluations, the BOXER code overestimates the multiplication factor by 1.4 % and underestimates the power release in a Gd cell by 4.66 %. It is hoped that further similar studies with use of the JEF-1 library for both BOXER and MCNP will help to isolate and explain these discrepancies in a cleaner way. (author) 4 refs., 9 figs., 10 tabs

  1. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Science.gov (United States)

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.
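    Confidence intervals for relative differences in wall-clock time of the kind quoted above can be obtained by bootstrapping repeated runs. The sketch below shows one such calculation; the timing numbers are invented, and this is not necessarily the resampling scheme the authors used.

        import numpy as np

        def pct_diff_ci(a, b, n_boot=10000, seed=0):
            """Bootstrap 95% CI for the percent difference in mean wall-clock
            time between two services (b relative to a)."""
            rng = np.random.default_rng(seed)
            diffs = []
            for _ in range(n_boot):
                ra = rng.choice(a, size=len(a), replace=True).mean()
                rb = rng.choice(b, size=len(b), replace=True).mean()
                diffs.append(100.0 * (ra - rb) / ra)
            lo, hi = np.percentile(diffs, [2.5, 97.5])
            return 100.0 * (a.mean() - b.mean()) / a.mean(), (lo, hi)

        emr = np.array([310., 295., 330., 305., 322.])   # hypothetical minutes
        gce = np.array([150., 142., 160., 149., 155.])
        est, (lo, hi) = pct_diff_ci(emr, gce)
        print(f"GCE faster by {est:.1f}% (95% CI: {lo:.1f}-{hi:.1f})")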

  2. Preliminary analysis of the proposed BN-600 benchmark core

    International Nuclear Information System (INIS)

    John, T.M.

    2000-01-01

    The Indira Gandhi Centre for Atomic Research is actively involved in the design of fast power reactors in India. The core physics calculations are performed by computer codes that are developed in-house or by codes obtained from other laboratories and suitably modified to meet the computational requirements. The basic philosophy of the core physics calculations is to use diffusion theory codes with 25-group nuclear cross sections. Parameters that are very sensitive to core leakage, like the power distribution at the core-blanket interface, are calculated using transport theory codes under the DSN approximation. Criticality problems with geometries too irregular to be represented by the conventional codes are solved using Monte Carlo methods. These codes and methods have been validated by the analysis of various critical assemblies and calculational benchmarks. The reactor core design procedure at IGCAR consists of: two- and three-dimensional diffusion theory calculations (codes ALCIALMI and 3DB); auxiliary calculations (neutron balance, power distributions, etc.), done by codes developed in-house; transport theory corrections from two-dimensional transport calculations (DOT); irregular geometries treated by the Monte Carlo method (KENO); and the cross-section data library CV2M (25 groups).

  3. Benchmark analysis of forecasted seasonal temperature over different climatic areas

    Science.gov (United States)

    Giunta, G.; Salerno, R.; Ceppi, A.; Ercolani, G.; Mancini, M.

    2015-12-01

    From a long-term perspective, an improvement of seasonal forecasting, which is often based exclusively on climatology, could provide a new capability for the management of energy resources on a time scale of just a few months. This paper presents a benchmark analysis of long-term temperature forecasts over Italy in the year 2010, comparing the eni-kassandra meteo forecast (e-kmf®) model, the Climate Forecast System-National Centers for Environmental Prediction (CFS-NCEP) model, and the climatological reference (based on 25-year data) with observations. Statistical indexes are used to assess the reliability of the prediction of 2-m monthly air temperatures up to 12 weeks ahead. The results show that the best performance is achieved by the e-kmf® system, which improves the reliability of long-term forecasts compared to climatology and the CFS-NCEP model. By using a reliable high-performance forecast system, it is possible to optimize the natural gas portfolio and management operations, thereby obtaining a competitive advantage in the European energy market.
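
    The abstract does not specify which statistical indexes were used; one common way to express forecast reliability against a climatological reference is a root-mean-square-error (RMSE) skill score, sketched below with invented monthly values.

        import numpy as np

        def rmse(pred, obs):
            return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2)))

        def skill_score(model, climatology, obs):
            # Positive values: the model beats the climatological reference.
            return 1.0 - rmse(model, obs) / rmse(climatology, obs)

        # Hypothetical 2-m monthly air temperatures (deg C) for one location
        obs = [14.2, 18.9, 23.1]
        model = [13.8, 19.5, 22.4]
        climatology = [13.0, 17.5, 21.0]
        print(f"skill vs climatology: {skill_score(model, climatology, obs):+.2f}")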

  4. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    Directory of Open Access Journals (Sweden)

    Seyhan Yazar

    Full Text Available A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (an E. coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E. coli and 53.5% (95% CI: 34.4-72.6) for the human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for the E. coli and human assemblies, respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  5. Benchmark Analysis of Subcritical Noise Measurements on a Nickel-Reflected Plutonium Metal Sphere

    Energy Technology Data Exchange (ETDEWEB)

    John D. Bess; Jesson Hutchinson

    2009-09-01

    Subcritical experiments using californium source-driven noise analysis (CSDNA) and Feynman variance-to-mean methods were performed with an alpha-phase plutonium sphere reflected by nickel shells, up to a maximum thickness of 7.62 cm. Both methods provide means of determining the subcritical multiplication of a system containing nuclear material. A benchmark analysis of the experiments was performed for inclusion in the 2010 edition of the International Handbook of Evaluated Criticality Safety Benchmark Experiments. Benchmark models have been developed that represent these subcritical experiments. An analysis of the computed eigenvalues and the uncertainty in the experiment and methods was performed. The eigenvalues computed using the CSDNA method were very close to those calculated using MCNP5; however, computed eigenvalues are used in the analysis of the CSDNA method. Independent calculations using KENO-VI provided similar eigenvalues to those determined using the CSDNA method and MCNP5. A slight trend with increasing nickel-reflector thickness was seen when comparing MCNP5 and KENO-VI results. For the 1.27-cm-thick configuration the MCNP eigenvalue was approximately 300 pcm greater. The calculated KENO eigenvalue was about 300 pcm greater for the 7.62-cm-thick configuration. The calculated results were approximately the same for a 5-cm-thick shell. The eigenvalues determined using the Feynman method are up to approximately 2.5% lower than those determined using either the CSDNA method or the Monte Carlo codes. The uncertainty in the results from either method was not large enough to account for the bias between the two experimental methods. An ongoing investigation is being performed to assess what potential uncertainties and/or biases exist that have yet to be properly accounted for. The dominant uncertainty in the CSDNA analysis was the uncertainty in selecting a neutron cross-section library for performing the analysis of the data. The uncertainty in the
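
    The spreads quoted above are in pcm (per cent mille); one pcm corresponds to 1E-5 in the multiplication factor, so the conversion from an eigenvalue difference is a simple scaling (the eigenvalues below are invented):

        def diff_pcm(k_a, k_b):
            # 1 pcm = 1e-5 in k; a 0.0030 difference in k is 300 pcm
            return (k_a - k_b) * 1.0e5

        print(f"{diff_pcm(1.0003, 0.9973):.0f} pcm")  # 300 pcm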

  6. NRC-BNL Benchmark Program on Evaluation of Methods for Seismic Analysis of Coupled Systems

    International Nuclear Information System (INIS)

    Chokshi, N.; DeGrassi, G.; Xu, J.

    1999-01-01

    A NRC-BNL benchmark program for evaluation of state-of-the-art analysis methods and computer programs for seismic analysis of coupled structures with non-classical damping is described. The program includes a series of benchmarking problems designed to investigate various aspects of complexities, applications and limitations associated with methods for analysis of non-classically damped structures. Discussions are provided on the benchmarking process, benchmark structural models, and the evaluation approach, as well as benchmarking ground rules. It is expected that the findings and insights, as well as recommendations from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving licensing applications of these alternate methods to coupled systems

  7. Analysis of a multigroup stylized CANDU half-core benchmark

    International Nuclear Information System (INIS)

    Pounders, Justin M.; Rahnema, Farzad; Serghiuta, Dumitru

    2011-01-01

    Highlights: → This paper provides a benchmark that is a stylized model problem in more than two energy groups and is realistic with respect to the underlying physics. → An 8-group cross section library is provided to augment a previously published 2-group 3D stylized half-core CANDU benchmark problem. → Reference eigenvalues and selected pin and bundle fission rates are included. → 2-, 4- and 47-group Monte Carlo solutions are compared to analyze homogenization-free transport approximations that result from energy condensation. - Abstract: An 8-group cross section library is provided to augment a previously published 2-group 3D stylized half-core Canadian deuterium uranium (CANDU) reactor benchmark problem. Reference eigenvalues and selected pin and bundle fission rates are also included. This benchmark is intended to provide computational reactor physicists and methods developers with a stylized model problem in more than two energy groups that is realistic with respect to the underlying physics. In addition to transport theory code verification, the 8-group energy structure provides reactor physicists with an ideal problem for examining cross-section homogenization and collapsing effects in a full-core environment. To this end, additional 2-, 4- and 47-group full-core Monte Carlo benchmark solutions are compared to analyze homogenization-free transport approximations incurred as a result of energy group condensation.
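
    The group condensation studied here is conventionally done by flux weighting: each coarse-group cross section is the fine-group cross sections averaged over the spectrum, sigma_G = sum_g(sigma_g * phi_g) / sum_g(phi_g). A minimal sketch with invented 8-group data collapsed to 2 groups:

        import numpy as np

        def collapse(sigma, phi, coarse_groups):
            """Flux-weighted condensation of a fine-group cross section set."""
            sigma, phi = np.asarray(sigma), np.asarray(phi)
            return [float(np.sum(sigma[lo:hi] * phi[lo:hi]) / np.sum(phi[lo:hi]))
                    for lo, hi in coarse_groups]

        sigma8 = [1.2, 1.5, 2.0, 3.1, 4.5, 6.0, 9.8, 14.2]  # hypothetical barns
        phi8 = [0.9, 1.1, 1.3, 1.0, 0.8, 0.5, 0.7, 0.6]     # hypothetical spectrum
        print(collapse(sigma8, phi8, [(0, 6), (6, 8)]))      # fast, thermal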

  8. Benchmarking Analysis of Institutional University Autonomy in Denmark, Lithuania, Romania, Scotland, and Sweden

    DEFF Research Database (Denmark)

    This book presents a benchmark, comparative analysis of institutional university autonomy in Denmark, Lithuania, Romania, Scotland and Sweden. These countries are partners in an EU TEMPUS-funded project 'Enhancing University Autonomy in Moldova' (EUniAM). This benchmark analysis was conducted...... by the EUniAM Lead Task Force team that collected and analysed secondary and primary data in each of these countries and produced four benchmark reports that are part of this book. For each dimension and interface of institutional university autonomy, the members of the Lead Task Force team identified...... respective evaluation criteria and searched for similarities and differences in approaches to higher education sectors and respective autonomy regimes in these countries. The consolidated report that precedes the benchmark reports summarises the process and key findings from the four benchmark reports...

  9. Benchmarking of Constant Power Generation Strategies for Single-Phase Grid-Connected Photovoltaic Systems

    DEFF Research Database (Denmark)

    Sangwongwanich, Ariya; Yang, Yongheng; Blaabjerg, Frede

    2018-01-01

    strategies based on: 1) a power control method (P-CPG), 2) a current limit method (I-CPG) and 3) the Perturb and Observe algorithm (P&O-CPG). However, the operational mode changes (e.g., from the maximum power point tracking to a CPG operation) will affect the entire system performance. Thus, a benchmarking...... of the presented CPG strategies is also conducted on a 3-kW single-phase grid-connected PV system. Comparisons reveal that either the P-CPG or I-CPG strategies can achieve fast dynamics and satisfactory steady-state performance. In contrast, the P&O-CPG algorithm is the most suitable solution in terms of high...
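
    As a rough illustration of the constant power generation idea (not the paper's converter-level controllers), the active power reference tracks the maximum power point until the available PV power exceeds the limit and is clamped afterwards:

        def cpg_reference(p_available_w, p_limit_w):
            """Clamp the power reference to the CPG limit (illustrative only)."""
            return min(p_available_w, p_limit_w)

        # Hypothetical 3-kW system with a 2.4-kW constant-power limit
        for p_mpp in (1800.0, 2200.0, 2900.0):
            print(f"available {p_mpp:.0f} W -> reference {cpg_reference(p_mpp, 2400.0):.0f} W")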

  10. Analysis of the impact of correlated benchmark experiments on the validation of codes for criticality safety analysis

    International Nuclear Information System (INIS)

    Bock, M.; Stuke, M.; Behler, M.

    2013-01-01

    The validation of a code for criticality safety analysis requires the recalculation of benchmark experiments. The selected benchmark experiments are chosen such that they have properties similar to the application case that has to be assessed. A common source of benchmark experiments is the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' (ICSBEP Handbook) compiled by the 'International Criticality Safety Benchmark Evaluation Project' (ICSBEP). In order to take full advantage of the information provided by the individual benchmark descriptions for the application case, the recommended procedure is to perform an uncertainty analysis, based on the uncertainties of the experimental results included in most of the benchmark descriptions. It can be performed by means of the Monte Carlo sampling technique. The consideration of uncertainties is also being introduced in the supplementary sheet of DIN 25478 'Application of computer codes in the assessment of criticality safety'. However, for a correct treatment of uncertainties, taking into account only the individual uncertainties of the benchmark experiments is insufficient. In addition, correlations between benchmark experiments have to be handled correctly. For example, these correlations can arise when different cases of a benchmark experiment share the same components, such as fuel pins or fissile solutions. Thus, manufacturing tolerances of these components (e.g. the diameter of the fuel pellets) have to be considered in a consistent manner in all cases of the benchmark experiment. At the 2012 meeting of the Expert Group on 'Uncertainty Analysis for Criticality Safety Assessment' (UACSA) of the OECD/NEA, a benchmark proposal was outlined that aimed at determining the impact of benchmark correlations on the estimation of the computational bias of the neutron multiplication factor (k_eff). The analysis presented here is based on this proposal. (orig.)
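
    A minimal sketch of the mechanism described above: when two benchmark cases share the same fuel-pin batch, sampling the shared manufacturing tolerance makes their computed multiplication factors correlated. All sensitivities, tolerances and noise levels below are invented.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000

        # Shared pellet diameter (cm): the same pellets are used in both cases
        diameter = rng.normal(0.80, 0.001, n)

        # Hypothetical linear k_eff responses plus independent Monte Carlo noise
        k1 = 0.998 + 0.5 * (diameter - 0.80) + rng.normal(0.0, 2e-4, n)
        k2 = 1.001 + 0.4 * (diameter - 0.80) + rng.normal(0.0, 2e-4, n)

        print(f"correlation(k1, k2) = {np.corrcoef(k1, k2)[0, 1]:.2f}")  # ~0.8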

  11. VENUS-2 Benchmark Problem Analysis with HELIOS-1.9

    International Nuclear Information System (INIS)

    Jeong, Hyeon-Jun; Choe, Jiwon; Lee, Deokjung

    2014-01-01

    Since reliable benchmark results are available from the OECD/NEA report on the VENUS-2 MOX benchmark problem, users can assess the credibility of a code by comparing against them. In this paper, the solution of the VENUS-2 benchmark problem from HELIOS 1.9 using the ENDF/B-VI library (NJOY91.13) is compared with the result from HELIOS 1.7, with the MCNP-4B result taken as reference data. The comparison covers the results of pin cell, assembly, and core calculations. The resulting eigenvalues are assessed by comparison with the results from other codes. In the case of the UOX and MOX assemblies, the differences from the MCNP-4B results are about 10 pcm. However, there is some inaccuracy in the baffle-reflector condition, and relatively large differences were found in the MOX-reflector assembly and core calculations. Although HELIOS 1.9 utilizes an inflow transport correction, it appears to have a limited effect on the error in the baffle-reflector condition.

  12. SMORN-III benchmark test on reactor noise analysis methods

    International Nuclear Information System (INIS)

    Shinohara, Yoshikuni; Hirota, Jitsuya

    1984-02-01

    A computational benchmark test was performed in conjunction with the Third Specialists Meeting on Reactor Noise (SMORN-III) which was held in Tokyo, Japan in October 1981. This report summarizes the results of the test as well as the works made for preparation of the test. (author)

  13. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...

  14. Aagesta-BR3 Decommissioning Cost. Comparison and Benchmarking Analysis

    International Nuclear Information System (INIS)

    Varley, Geoff

    2002-11-01

    equipment. The BR3 work packages described in this report add up to something like 83,000 labour hours plus about MSEK 13 of investments and consumables costs. At Swedish average team labour rates 83,000 hours would equate to about MSEK 52. Adding the investment cost of MSEK 13 gives a total of about MSEK 65. This of course is quite close to the Aagesta figure but it would be wrong to draw immediate, firm conclusions based on these data. Such a comparison should take into account, inter alia: The number and relative sizes of the equipment decontaminated and dismantled at Aagesta and BR3. The assumed productivity in the Aagesta estimate compared to the actual BR3 figures. The physical scale of the Aagesta reactor is somewhat larger than the BR3 reactor, so all other things being equal, one might expect the Aagesta decommissioning cost estimate to be higher than for BR3. Aagesta has better access overall, which should help to constrain costs. The productivity ratio for workers at BR3 on average was high - generally 80 per cent or more, so this is unlikely to be exceeded at Aagesta and might not be equalled, which would tend to push the Aagesta cost up relative to the BR3 situation. There is an additional question of the possible extra work performed at BR3 due to the R and D nature of the project. The BR3 data analysed has tried to strip away any such 'extra' work but nevertheless there may be some residual effect on the final numbers. Analysis and comparison of individual work packages has raised several conclusions, as follows: The constructed cost for Aagesta using BR3 benchmark data is encouragingly close to the Aagesta estimate value but it is not clear that the way of deriving the Aagesta estimate for decontamination was entirely rigorous. The reliability of the Aagesta estimate on these grounds therefore might reasonably be questioned. A significant discrepancy between the BR3 and Aagesta cases appears to exist in respect of the volumes of waste arising from the

  15. Selected examples on multi physics researches at KFKI AEKI-results for phase I of the OECD/NEA UAM benchmark

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.; Maraczy, C.

    2010-01-01

    Nowadays, there is a tendency to use best-estimate-plus-uncertainty methods in the field of nuclear energy. This implies the application of best-estimate code systems and the determination of the corresponding uncertainties. For the latter, an OECD benchmark was set up. The objective of the OECD/NEA Uncertainty Analysis in Best-Estimate Modeling (UAM) LWR benchmark is to determine the uncertainties of coupled reactor physics/thermal hydraulics LWR calculations at all stages. In this paper the AEKI participation in Phase I is presented. This phase deals with the evaluation of the uncertainties of the neutronic calculations, starting from pin-cell spectral calculations up to stand-alone neutronic core simulations. (Authors)

  16. Analysis of a computational benchmark for a high-temperature reactor using SCALE

    International Nuclear Information System (INIS)

    Goluoglu, S.

    2006-01-01

    Several proposed advanced reactor concepts require methods to address effects of double heterogeneity. In doubly heterogeneous systems, heterogeneous fuel particles in a moderator matrix form the fuel region of the fuel element and thus constitute the first level of heterogeneity. Fuel elements themselves are also heterogeneous with fuel and moderator or reflector regions, forming the second level of heterogeneity. The fuel elements may also form regular or irregular lattices. A five-phase computational benchmark for a high-temperature reactor (HTR) fuelled with uranium or reactor-grade plutonium has been defined by the Organization for Economic Cooperation and Development, Nuclear Energy Agency (OECD NEA), Nuclear Science Committee, Working Party on the Physics of Plutonium Fuels and Innovative Fuel Cycles. This paper summarizes the analysis results using the latest SCALE code system (to be released in CY 2006 as SCALE 5.1). (authors)

  17. Benchmark Analysis of EBR-II Shutdown Heat Removal Tests

    International Nuclear Information System (INIS)

    2017-08-01

    This publication presents the results and main achievements of an IAEA coordinated research project to verify and validate system and safety codes used in the analyses of liquid metal thermal hydraulics and neutronics phenomena in sodium cooled fast reactors. The publication will be of use to researchers and professionals currently working on relevant fast reactor programmes. In addition, it is intended to support the training of the next generation of analysts and designers through international benchmark exercises

  18. The analysis of one-dimensional reactor kinetics benchmark computations

    International Nuclear Information System (INIS)

    Sidell, J.

    1975-11-01

    During March 1973 the European American Committee on Reactor Physics proposed a series of simple one-dimensional reactor kinetics problems, with the intention of comparing the relative efficiencies of the numerical methods employed in various codes, which are currently in use in many national laboratories. This report reviews the contributions submitted to this benchmark exercise and attempts to assess the relative merits and drawbacks of the various theoretical and computer methods. (author)

  19. Providing Nuclear Criticality Safety Analysis Education through Benchmark Experiment Evaluation

    International Nuclear Information System (INIS)

    Bess, John D.; Briggs, J. Blair; Nigg, David W.

    2009-01-01

    One of the challenges that today's new workforce of nuclear criticality safety engineers face is the opportunity to provide assessment of nuclear systems and establish safety guidelines without having received significant experience or hands-on training prior to graduation. Participation in the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and/or the International Reactor Physics Experiment Evaluation Project (IRPhEP) provides students and young professionals the opportunity to gain experience and enhance critical engineering skills.

  20. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing the older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.
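
    The speedup discussed in the paper follows the usual definitions of parallel speedup and efficiency; a tiny sketch with invented timings:

        def speedup(t_serial, t_parallel):
            return t_serial / t_parallel

        def efficiency(t_serial, t_parallel, n_threads):
            return speedup(t_serial, t_parallel) / n_threads

        # Hypothetical wall-clock times for one benchmark on 1 and 8 threads
        t1, t8 = 512.0, 80.0
        print(f"speedup {speedup(t1, t8):.1f}x, efficiency {efficiency(t1, t8, 8):.0%}")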

  1. Analyses and results of the OECD/NEA WPNCS EGUNF benchmark phase II. Technical report; Analysen und Ergebnisse zum OECD/NEA WPNCS EGUNF Benchmark Phase II. Technischer Bericht

    Energy Technology Data Exchange (ETDEWEB)

    Hannstein, Volker; Sommer, Fabian

    2017-05-15

    The report summarizes the studies performed and the results obtained in the frame of the Phase II benchmarks of the Expert Group on Used Nuclear Fuel (EGUNF) of the Working Party on Nuclear Criticality Safety (WPNCS) of the Nuclear Energy Agency (NEA) of the Organisation for Economic Co-operation and Development (OECD). The studies specified within the benchmarks have been realized to the full extent. The scope of the benchmarks was the comparison of calculations for a generic BWR fuel element with gadolinium-containing fuel rods, performed with several computer codes and cross-section libraries by different international working groups and institutions. The computational model used allows the evaluation of the accuracy of fuel rod inventory calculations and their respective influence on BWR burnup credit calculations.

  2. Aagesta-BR3 Decommissioning Cost. Comparison and Benchmarking Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Varley, Geoff [NAC International, Henley on Thames (United Kingdom)

    2002-11-01

    25 is equipment. The BR3 work packages described in this report add up to something like 83,000 labour hours plus about MSEK 13 of investments and consumables costs. At Swedish average team labour rates 83,000 hours would equate to about MSEK 52. Adding the investment cost of MSEK 13 gives a total of about MSEK 65. This of course is quite close to the Aagesta figure but it would be wrong to draw immediate, firm conclusions based on these data. Such a comparison should take into account, inter alia: The number and relative sizes of the equipment decontaminated and dismantled at Aagesta and BR3. The assumed productivity in the Aagesta estimate compared to the actual BR3 figures. The physical scale of the Aagesta reactor is somewhat larger than the BR3 reactor, so all other things being equal, one might expect the Aagesta decommissioning cost estimate to be higher than for BR3. Aagesta has better access overall, which should help to constrain costs. The productivity ratio for workers at BR3 on average was high - generally 80 per cent or more, so this is unlikely to be exceeded at Aagesta and might not be equalled, which would tend to push the Aagesta cost up relative to the BR3 situation. There is an additional question of the possible extra work performed at BR3 due to the R and D nature of the project. The BR3 data analysed has tried to strip away any such 'extra' work but nevertheless there may be some residual effect on the final numbers. Analysis and comparison of individual work packages has raised several conclusions, as follows: The constructed cost for Aagesta using BR3 benchmark data is encouragingly close to the Aagesta estimate value but it is not clear that the way of deriving the Aagesta estimate for decontamination was entirely rigorous. The reliability of the Aagesta estimate on these grounds therefore might reasonably be questioned. A significant discrepancy between the BR3 and Aagesta cases appears to exist in respect of the volumes of waste

  3. Free allocations in EU ETS Phase 3: The impact of emissions performance benchmarking for carbon-intensive industry - Working Paper No. 2013-14

    International Nuclear Information System (INIS)

    Lecourt, S.; Palliere, C.; Sartor, O.

    2013-02-01

    From Phase 3 (2013-20) of the European Union Emissions Trading Scheme, carbon-intensive industrial emitters will receive free allocations based on harmonised, EU-wide benchmarks. This paper analyses the impacts of these new rules on allocations to key energy-intensive sectors across Europe. It explores an original dataset that combines recent data from the National Implementing Measures of 20 EU Member States with the Community Independent Transaction Log and other EU documents. The analysis reveals that free allocations to benchmarked sectors will be reduced significantly compared to Phase 2 (2008-12). This reduction should both increase public revenues from carbon auctions and has the potential to enhance the economic efficiency of the carbon market. The analysis also shows that changes in allocation vary mostly across installations within countries, raising the possibility that the carbon-cost competitiveness impacts may be more intense within rather than across countries. Lastly, the analysis finds evidence that the new benchmarking rules will, as intended, reward installations with better emissions performance and will improve harmonisation of free allocations in the EU ETS by reducing differences in allocation levels across countries with similar carbon intensities of production. (authors)
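
    In stylized form, a Phase 3 free allocation is the product of an EU-wide product benchmark and the installation's historical activity level, with further correction factors in the actual rules; the sketch below uses invented values and omits those corrections.

        def free_allocation(benchmark_t_co2_per_t, historical_activity_t,
                            exposure_factor=1.0):
            """Stylized allocation: benchmark x historical activity level,
            scaled by a carbon-leakage exposure factor (illustrative only)."""
            return benchmark_t_co2_per_t * historical_activity_t * exposure_factor

        # Hypothetical installation producing 1,000,000 t of product per year
        print(f"{free_allocation(0.766, 1_000_000):,.0f} allowances per year")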

  4. RETRAN-3D Analysis Of The OECD/NRC Peach Bottom 2 Turbine Trip Benchmark

    International Nuclear Information System (INIS)

    Barten, W.; Coddington, P.

    2003-01-01

    This paper presents the PSI results for the different phases of the Peach Bottom BWR Turbine Trip Benchmark using the RETRAN-3D code. In the first part of the paper, the analysis of Phase 1 is presented, in which the system pressure is predicted based on a pre-defined core power distribution. These calculations demonstrate the importance of accurate modelling of the non-equilibrium effects within the steam separator region. In the second part, a selection of the RETRAN-3D results for Phase 2 is given, where the power is predicted using a 3-D core with pre-defined core flow and pressure boundary conditions. A comparison of calculations using the different (benchmark-specified) boundary conditions illustrates the sensitivity of the power maximum to the various resultant system parameters. In the third part of the paper, the results of the Phase 3 calculation are presented. This phase, which is a combination of the analytical work of Phases 1 and 2, gives good agreement with the measured data. The coupling of the pressure and flow oscillations in the steam line, the mass balance in the core, the (void) reactivity and the core power are all discussed. It is shown that the reactivity effects resulting from the change in the core void can explain the overall behaviour of the transient prior to the reactor scram. The time-dependent, normalized power for different thermal-hydraulic channels in the core is discussed in some detail. Up to the time of reactor scram, the power change was similar in all channels, with differences of the order of only a few percent. The axial shape of the channel powers at the time of maximum (overall) power increased in the core centre (compared with the shape at time zero). These changes occur as a consequence of the relative change in the channel void, which is largest in the region of the onset of boiling, and of the influence of the complex ring pattern of the control rods on the different fuel assemblies. (author)

  5. RETRAN-3D Analysis Of The OECD/NRC Peach Bottom 2 Turbine Trip Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Barten, W.; Coddington, P

    2003-03-01

    This paper presents the PSI results for the different phases of the Peach Bottom BWR Turbine Trip Benchmark using the RETRAN-3D code. In the first part of the paper, the analysis of Phase 1 is presented, in which the system pressure is predicted based on a pre-defined core power distribution. These calculations demonstrate the importance of accurate modelling of the non-equilibrium effects within the steam separator region. In the second part, a selection of the RETRAN-3D results for Phase 2 is given, where the power is predicted using a 3-D core with pre-defined core flow and pressure boundary conditions. A comparison of calculations using the different (benchmark-specified) boundary conditions illustrates the sensitivity of the power maximum to the various resultant system parameters. In the third part of the paper, the results of the Phase 3 calculation are presented. This phase, which is a combination of the analytical work of Phases 1 and 2, gives good agreement with the measured data. The coupling of the pressure and flow oscillations in the steam line, the mass balance in the core, the (void) reactivity and the core power are all discussed. It is shown that the reactivity effects resulting from the change in the core void can explain the overall behaviour of the transient prior to the reactor scram. The time-dependent, normalized power for different thermal-hydraulic channels in the core is discussed in some detail. Up to the time of reactor scram, the power change was similar in all channels, with differences of the order of only a few percent. The axial shape of the channel powers at the time of maximum (overall) power increased in the core centre (compared with the shape at time zero). These changes occur as a consequence of the relative change in the channel void, which is largest in the region of the onset of boiling, and of the influence of the complex ring pattern of the control rods on the different fuel assemblies. (author)

  6. Classification of criticality calculations with correlation coefficient method and its application to OECD/NEA burnup credit benchmarks phase III-A and II-A

    International Nuclear Information System (INIS)

    Okuno, Hiroshi

    2003-01-01

    A method for classifying benchmark results of criticality calculations according to similarity is proposed in this paper. After formulation of the method utilizing correlation coefficients, it was applied to the burnup credit criticality benchmarks Phase III-A and II-A, which were conducted by the Expert Group on Burnup Credit Criticality Safety under the auspices of the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (OECD/NEA). The Phase III-A benchmark was a series of criticality calculations for irradiated boiling water reactor (BWR) fuel assemblies, whereas the Phase II-A benchmark was a suite of criticality calculations for irradiated pressurized water reactor (PWR) fuel pins. These benchmark problems and their results are summarized. The correlation coefficients were calculated and the sets of benchmark calculation results were classified according to the criterion that the values of the correlation coefficients were no less than 0.15 for the Phase III-A and 0.10 for the Phase II-A benchmarks. When a pair of benchmark calculation results belonged to the same group, one calculation result was found to be predictable from the other. An example is shown for each of the benchmarks. While the evaluated nuclear data seemed to be the main factor behind the classification, further investigation is required to find other factors. (author)
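
    The paper's precise grouping procedure is not reproduced here; the sketch below only illustrates the core idea of thresholding a correlation matrix of benchmark results (the greedy grouping rule and the input deviations are invented):

        import numpy as np

        def classify(results, threshold):
            """Greedily group cases whose pairwise correlation meets the
            threshold (0.15 for Phase III-A, 0.10 for Phase II-A)."""
            corr = np.corrcoef(results)
            groups = []
            for i in range(len(results)):
                for g in groups:
                    if all(corr[i, j] >= threshold for j in g):
                        g.append(i)
                        break
                else:
                    groups.append([i])
            return groups

        # Hypothetical k_eff deviations of four benchmark cases, one row per case
        cases = np.array([[0.1, 0.3, -0.2, 0.4],
                          [0.2, 0.4, -0.1, 0.5],
                          [-0.3, 0.1, 0.4, -0.2],
                          [-0.2, 0.2, 0.5, -0.1]])
        print(classify(cases, 0.15))  # [[0, 1], [2, 3]]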

  7. Criticality Benchmark Analysis of Water-Reflected Uranium Oxyfluoride Slabs

    International Nuclear Information System (INIS)

    Marshall, Margaret A.; Bess, John D.

    2009-01-01

    A series of twelve experiments was conducted in the mid-1950s at the Oak Ridge National Laboratory Critical Experiments Facility to determine the critical conditions of a semi-infinite water-reflected slab of aqueous uranium oxyfluoride (UO2F2). A different slab thickness was used for each experiment. Results from the twelve experiments recorded in the laboratory notebook were published in Reference 1. Seven of the twelve experiments were determined to be acceptable benchmark experiments for inclusion in the International Handbook of Evaluated Criticality Safety Benchmark Experiments. This evaluation will be available to handbook users not only for the validation of computer codes and integral cross-section data, but also for the re-evaluation of experimental data used in the ANSI/ANS-8.1 standard. This evaluation is important as part of the technical basis of the subcritical slab limits in ANSI/ANS-8.1. The original publication of the experimental results was used for the determination of bias and bias uncertainties for subcritical slab limits, as documented in Hugh Clark's paper 'Subcritical Limits for Uranium-235 Systems'.

  8. Specification of phase 3 benchmark (Hex-Z heterogeneous and burnup calculation)

    International Nuclear Information System (INIS)

    Kim, Y.I.

    2002-01-01

    During the second RCM of the IAEA Co-ordinated Research Project 'Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects', the following items were identified as important. Heterogeneity will affect absolute core reactivity. Rod worths could be considerably reduced by heterogeneity effects, depending on their detailed design. Heterogeneity effects will affect the resonance self-shielding in the treatment of fuel Doppler, steel Doppler and sodium density effects. However, it was considered more important to concentrate on the sodium density effect in order to reduce the calculational effort required. It was also recognized that burnup effects will influence fuel Doppler and sodium worths. A benchmark for the assessment of the heterogeneity effect in Phase 3 was defined. It is to be performed for the Hex-Z model of the reactor only. No calculations will be performed for the R-Z model. For comparison with the heterogeneous evaluations, the control rod worth will be calculated at the beginning of the equilibrium cycle, based on the homogeneous model. The definitions of rod raised and rod inserted for SHR are given, using the composition numbers

  9. DRAGON analysis of MOX fueled VVER cell benchmarks

    International Nuclear Information System (INIS)

    Marleau, G.; Foissac, F.

    2002-01-01

    The computational unit-cell benchmark problems for LEU- and MOX-fueled VVER-1000 ('water-water energetic reactor') lattices have been analyzed using the code DRAGON with ENDF/B-V and ENDF/B-VI based WIMS-AECL cross-section libraries. The results obtained were compared with those generated using the SAS2H module of the SCALE-4.3 computational code system and with the code HELIOS. Good agreement between DRAGON and HELIOS was obtained when the ENDF/B-VI based library was considered, while the ENDF/B-V DRAGON results were generally closer to those obtained using SAS2H. This study was useful for the verification of the DRAGON code and confirms that HELIOS and DRAGON behave similarly when compatible cross-section libraries are used. (author)

  10. Benchmark calculations of power distribution within fuel assemblies. Phase 2: comparison of data reduction and power reconstruction methods in production codes

    International Nuclear Information System (INIS)

    2000-01-01

    Systems loaded with plutonium in the form of mixed-oxide (MOX) fuel show somewhat different neutronic characteristics compared with those using conventional uranium fuels. In order to maintain adequate safety standards, it is essential to accurately predict the characteristics of MOX-fuelled systems and to further validate both the nuclear data and the computation methods used. A computational benchmark on power distribution within fuel assemblies, comparing the different techniques used in production codes for fine flux prediction in systems partially loaded with MOX fuel, was carried out at an international level. It first addressed the numerical schemes for pin power reconstruction, then investigated the global performance, including cross-section data reduction methods. This report provides the detailed results of this second phase of the benchmark. The analysis of the results revealed that basic data still need to be improved, primarily for the higher plutonium isotopes and minor actinides. (author)
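
    Pin power reconstruction in production codes is commonly a modulation step: the smooth intra-nodal power shape from the nodal solution is multiplied by a heterogeneous form function pre-computed in the lattice calculation. A sketch of that step with invented 3x3 data (the benchmarked schemes differ in how the smooth part is expanded):

        import numpy as np

        def reconstruct_pin_power(smooth_nodal_shape, form_function):
            """Modulation approach: homogeneous intra-nodal shape times the
            heterogeneous pin-wise form function from the lattice code."""
            return np.asarray(smooth_nodal_shape) * np.asarray(form_function)

        smooth = [[1.02, 1.01, 1.00],   # hypothetical smooth nodal shape
                  [1.01, 1.00, 0.99],
                  [1.00, 0.99, 0.98]]
        form = [[1.10, 0.95, 1.05],     # hypothetical form function
                [0.95, 0.90, 0.97],
                [1.05, 0.97, 1.08]]
        print(reconstruct_pin_power(smooth, form))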

  11. Investigation on method of elasto-plastic analysis for piping system (benchmark analysis)

    International Nuclear Information System (INIS)

    Kabaya, Takuro; Kojima, Nobuyuki; Arai, Masashi

    2015-01-01

    This paper presents a method of elasto-plastic analysis for the practical seismic design of nuclear piping systems. JSME started a task to establish a method of elasto-plastic analysis for nuclear piping systems, and benchmark analyses were performed within this task to investigate candidate methods; our company participated in the benchmark analyses. As a result, we settled on a method that accurately simulates the results of a piping excitation test. The recommended method of elasto-plastic analysis is therefore as follows: 1) The elasto-plastic analysis is composed of a dynamic analysis of the piping system modelled with beam elements and a static analysis of the deformed elbow modelled with shell elements. 2) A bilinear curve is applied as the elasto-plastic property, with the yield point set to 1.2 times the standardized yield point, the second gradient set to 1/100 of Young's modulus, and kinematic hardening used as the hardening rule. 3) The fatigue life is evaluated from the strain ranges obtained by the elasto-plastic analysis, using the rain-flow method and the fatigue curves of previous studies. (author)

  12. Application of the Relap5-3D to phase 1 and 3 of the OECD-CSNI/NSC PWR MSLB benchmark related to TMI-1

    International Nuclear Information System (INIS)

    D'Auria, F.; Galassi, G.; Spadoni, A.; Hassan, Y.

    2001-01-01

    Relap5-3D, the latest in the series of Relap5 codes, is distinguished from previous versions by its fully integrated, multi-dimensional thermal-hydraulic and kinetic modeling capability. It has been applied to Phases I and III of the OECD-CSNI/NSC PWR MSLB benchmark, adopting the same thermal-hydraulic input deck already used with the Relap5/Parcs and Relap5/Quabbox coupled codes during the previous MSLB analysis. The OECD, jointly with the US NRC, proposed the PWR MSLB benchmark in order to gather a common understanding of the coupling between thermal hydraulics and neutronics and to evaluate the behavior of this transient with different coupled codes, giving emphasis to 3-D modeling. This paper deals with the application of the Relap5-3D code to Phases I and III of the PWR MSLB benchmark. Relap5-3D is an internally coupled thermal hydraulics-neutronics code; the thermal hydraulics module is the INEEL version of Relap and the neutronics module is derived from the NESTLE multi-dimensional kinetics code. (author)

  13. WLUP benchmarks

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2002-01-01

    The IAEA-WIMS Library Update Project (WLUP) is in its final stage. The final library will be released in 2002. It is the result of research and development carried out by more than ten investigators over 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the objectives and types of the benchmarks are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for the analysis and plotting of results is described. Some examples are given. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)

  14. Analysis of the OECD main steam line break benchmark using ANC-K/MIDAC code

    International Nuclear Information System (INIS)

    Aoki, Shigeaki; Tahara, Yoshihisa; Suemura, Takayuki; Ogawa, Junto

    2004-01-01

    A three-dimensional (3D) neutronics and thermal-hydraulics (T/H) coupled code, ANC-K/MIDAC, has been developed. It is the combination of the 3D nodal kinetics code ANC-K and the 3D drift-flux thermal-hydraulic code MIDAC. In order to verify the adequacy of this code, we have performed several international benchmark problems. In this paper, we show the calculation results for the 'OECD Main Steam Line Break Benchmark (MSLB benchmark)', which poses a typical local power peaking problem. We calculated the return-to-power scenario of the Phase II problem. The comparison of the results shows very good agreement of the important core parameters between ANC-K/MIDAC and other participant codes. (author)

  15. Benchmark tests and spin adaptation for the particle-particle random phase approximation

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yang; Steinmann, Stephan N.; Peng, Degao [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Aggelen, Helen van, E-mail: Helen.VanAggelen@UGent.be [Department of Chemistry, Duke University, Durham, North Carolina 27708 (United States); Department of Inorganic and Physical Chemistry, Ghent University, 9000 Ghent (Belgium); Yang, Weitao, E-mail: Weitao.Yang@duke.edu [Department of Chemistry and Department of Physics, Duke University, Durham, North Carolina 27708 (United States)

    2013-11-07

    The particle-particle random phase approximation (pp-RPA) provides an approximation to the correlation energy in density functional theory via the adiabatic connection [H. van Aggelen, Y. Yang, and W. Yang, Phys. Rev. A 88, 030501 (2013)]. It has virtually no delocalization error nor static correlation error for single-bond systems. However, with its formal O(N^6) scaling, the pp-RPA is computationally expensive. In this paper, we implement a spin-separated and spin-adapted pp-RPA algorithm, which reduces the computational cost by a substantial factor. We then perform benchmark tests on the G2/97 enthalpies of formation database, DBH24 reaction barrier database, and four test sets for non-bonded interactions (HB6/04, CT7/04, DI6/04, and WI9/04). For the G2/97 database, the pp-RPA gives a significantly smaller mean absolute error (8.3 kcal/mol) than the direct particle-hole RPA (ph-RPA) (22.7 kcal/mol). Furthermore, the error in the pp-RPA is nearly constant with the number of atoms in a molecule, while the error in the ph-RPA increases. For chemical reactions involving typical organic closed-shell molecules, pp- and ph-RPA both give accurate reaction energies. Similarly, both RPAs perform well for reaction barriers and nonbonded interactions. These results suggest that the pp-RPA gives reliable energies in chemical applications. The adiabatic connection formalism based on pairing matrix fluctuation is therefore expected to lead to widely applicable and accurate density functionals.

  16. Error Analysis of Variations on Larsen's Benchmark Problem

    International Nuclear Information System (INIS)

    Azmy, YY

    2001-01-01

    Error norms for three variants of Larsen's benchmark problem are evaluated using three numerical methods for solving the discrete ordinates approximation of the neutron transport equation in multidimensional Cartesian geometry. The three variants of Larsen's test problem differ in the incoming flux boundary conditions: unit incoming flux on the left and bottom edges (Larsen's configuration); unit incoming flux only on the left edge; unit incoming flux only on the bottom edge. The three methods considered are the Diamond Difference (DD) method and the constant-approximation versions of the Arbitrarily High Order Transport method of the Nodal (AHOT-N) and Characteristic (AHOT-C) types. The cell-wise error is computed as the difference between the cell-averaged flux computed by each method and the exact value, then the L1, L2, and L∞ error norms are calculated. The results of this study demonstrate that while the integral error norms, i.e. L1 and L2, converge to zero with mesh refinement, the pointwise L∞ norm does not, due to the solution discontinuity across the singular characteristic. Little difference is observed between the error norm behavior of the three methods in spite of the fact that AHOT-C is locally exact, suggesting numerical diffusion across the singular characteristic as the major source of error on the global scale. However, AHOT-C attains a given accuracy in a larger fraction of computational cells than DD
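
    Using one common normalization, the three norms of the cell-wise error field can be computed as below (the mesh values are invented):

        import numpy as np

        def error_norms(computed, exact):
            """Integral L1 and L2 norms and the pointwise L-infinity norm
            of the cell-wise error (mean-based normalization)."""
            e = np.abs(np.asarray(computed) - np.asarray(exact))
            return float(e.mean()), float(np.sqrt((e ** 2).mean())), float(e.max())

        computed = np.array([[0.98, 1.02], [1.05, 0.97]])  # cell-averaged fluxes
        exact = np.ones((2, 2))
        l1, l2, linf = error_norms(computed, exact)
        print(f"L1={l1:.3f}  L2={l2:.3f}  Linf={linf:.3f}")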

  17. Analysis on First Criticality Benchmark Calculation of HTR-10 Core

    International Nuclear Information System (INIS)

    Zuhair; Ferhat-Aziz; As-Natio-Lasman

    2000-01-01

    HTR-10 is a graphite-moderated, helium-cooled pebble-bed reactor with an average helium outlet temperature of 700 °C and a thermal power of 10 MW. The first criticality benchmark problem of HTR-10 in this paper includes the calculation of the loading number of nuclear fuel, in the form of UO2 balls with a U-235 enrichment of 17%, for first criticality under a helium atmosphere at a core temperature of 20 °C, and the calculation of the effective multiplication factor (k_eff) of the full core (5 m³) under a helium atmosphere at various core temperatures. The group constants of the fuel mixture, moderator and reflector materials were generated with WIMS/D4 using a spherical model and 4 neutron energy groups. The critical core height of 150.1 cm obtained from CITATION in 2-D R-Z reactor geometry lies within the range calculated by INET (China), JAERI (Japan), BATAN (Indonesia) and OKBM (Russia). The k_eff calculation for the full core at various temperatures shows that the HTR-10 has a negative temperature coefficient of reactivity. (author)
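
    The sign of the temperature coefficient follows directly from eigenvalues computed at two temperatures; a minimal sketch with invented k_eff values:

        def temperature_coefficient(k_cold, k_hot, delta_t):
            """Isothermal temperature coefficient (1/degC):
            alpha = (rho_hot - rho_cold) / delta_t, with rho = (k - 1) / k."""
            rho = lambda k: (k - 1.0) / k
            return (rho(k_hot) - rho(k_cold)) / delta_t

        # Hypothetical eigenvalues at 20 degC and 120 degC
        print(f"{temperature_coefficient(1.010, 1.002, 100.0):.1e} /degC")  # negative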

  18. Benchmark analysis and evaluations of materials for shielding

    International Nuclear Information System (INIS)

    Benton, E.R.; Gersey, B.B.; Uchihori, Y.; Yasuda, N.; Kitamura, H.; Shavers, M.R.

    2005-01-01

    The goal of this project is to provide a benchmark set of heavy-ion beam measurements behind 'standard' targets, made using radiation detectors routinely used for astronaut dosimetry, and to test the radiation shielding properties of candidate multifunctional spacecraft materials. These measurements are used in testing and validating space radiation transport codes currently being developed by NASA and in selecting promising materials for further development. The radiation dosimetry instruments being used include CR-39 plastic nuclear track detector (PNTD), a Tissue-Equivalent Proportional Counter (TEPC), the Liulin Mobile Dosimetry Unit (MDU) and thermoluminescent detectors (TLD). Each set of measurements includes LET/y spectra, and dose and dose equivalent as functions of shield thickness. Measurements are being conducted at the NIRS HIMAC, using heavy-ion beams of energies commonly encountered in the galactic cosmic ray (GCR) environment that have been identified as being of particular concern for the radiation protection of space crews. Measurements are being made behind a set of 'standard' targets including Al, Cu, polyethylene (HDPE) and graphite that vary in thickness from 0.5 to > 30 g/cm². (author)

  19. Benchmark Analysis of Institutional University Autonomy Higher Education Sectors in Denmark, Lithuania, Romania, Scotland and Sweden

    DEFF Research Database (Denmark)

    Turcan, Romeo V.; Bugaian, Larisa; Gulieva, Valeria

    2015-01-01

    This chapter consolidates the process and the findings from the four benchmark reports. It presents (i) the methodology and methods employed for data collection and data analysis; (ii) the comparative analysis of HE sectors and respective education systems in these countries; (iii) the executive ...

  20. International benchmark study of advanced thermal hydraulic safety analysis codes against measurements on IEA-R1 research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Hainoun, A., E-mail: pscientific2@aec.org.sy [Atomic Energy Commission of Syria (AECS), Nuclear Engineering Department, P.O. Box 6091, Damascus (Syrian Arab Republic); Doval, A. [Nuclear Engineering Department, Av. Cmdt. Luis Piedrabuena 4950, C.P. 8400 S.C de Bariloche, Rio Negro (Argentina); Umbehaun, P. [Centro de Engenharia Nuclear – CEN, IPEN-CNEN/SP, Av. Lineu Prestes 2242-Cidade Universitaria, CEP-05508-000 São Paulo, SP (Brazil); Chatzidakis, S. [School of Nuclear Engineering, Purdue University, West Lafayette, IN 47907 (United States); Ghazi, N. [Atomic Energy Commission of Syria (AECS), Nuclear Engineering Department, P.O. Box 6091, Damascus (Syrian Arab Republic); Park, S. [Research Reactor Design and Engineering Division, Basic Science Project Operation Dept., Korea Atomic Energy Research Institute (Korea, Republic of); Mladin, M. [Institute for Nuclear Research, Campului Street No. 1, P.O. Box 78, 115400 Mioveni, Arges (Romania); Shokr, A. [Division of Nuclear Installation Safety, Research Reactor Safety Section, International Atomic Energy Agency, A-1400 Vienna (Austria)

    2014-12-15

    analysis codes that comprise CATHARE, RELAP5, MERSAT and PARET. The code RELAP5 was used independently by four of the participating teams, and therefore the user effect and its impact on the code results can be characterized. The benchmark results demonstrate that most of the codes have the capability to correctly predict the SS case. However, for the LOFA case the simulation results show discrepancies with the measurements, although the majority of the applied codes predict a qualitatively correct time evolution of the corresponding transients for the coolant and clad temperatures. It is noted that the peak temperatures and the gradients around them are predicted conservatively. The quantitative assessments of the benchmark results indicate different amounts of discrepancy between predictions and measurements, ranging between 7% and 20% for peak clad temperatures during the LOFA. The comparative prediction capability of the employed codes is addressed by additional code-to-code comparisons based on selected TH parameters that comprise flow rate, pressure drop and heat transfer coefficient during the natural circulation phase.

  1. CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in Battelle model containment. Experimental phases 2, 3 and 4. Results of comparisons

    International Nuclear Information System (INIS)

    Fischer, K.; Schall, M.; Wolf, L.

    1993-01-01

    The present final report comprises the major results of Phase II of the CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in the Battelle model containment, experimental phases 2, 3 and 4, which was organized and sponsored by the Commission of the European Communities for the purpose of furthering the understanding and analysis of long-term thermal-hydraulic phenomena inside containments during and after severe core accidents. This benchmark exercise received high European attention, with eight organizations from six countries participating with eight computer codes during phase 2. Altogether 18 results from computer code runs were supplied by the participants and constitute the basis for comparisons with the experimental data contained in this publication. This reflects both the high technical interest in, as well as the complexity of, this CEC exercise. Major comparison results between computations and data are reported for all important quantities relevant to containment analyses during long-term transients. These comparisons comprise pressure, steam and air content, velocities and their directions, heat transfer coefficients and saturation ratios. Agreements and disagreements are discussed for each participating code/institution, conclusions drawn and recommendations provided. The phase 2 CEC benchmark exercise provided an up-to-date state-of-the-art review of the thermal-hydraulic capabilities of present computer codes for containment analyses. This exercise has shown that all of the participating codes can simulate the important global features of the experiment correctly, such as: temperature stratification, pressure and leakage, heat transfer to structures, relative humidity, collection of sump water. Several weaknesses of individual codes were identified, and this may help to promote their development. As a general conclusion it may be said that while there is still a wide area of necessary extensions and improvements, the

  2. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    Science.gov (United States)

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need for a more predictable process prompted the control of variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the 'Analyze' part of the DMAIC model.
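
    A minimal sketch of the stability check in Steps 2-3, using Shewhart-style three-sigma limits on an individuals chart (real control charts often derive sigma from moving ranges instead; the monthly rates below are invented):

        import numpy as np

        def control_limits(rates):
            """Mean +/- 3 sigma; points outside signal special-cause variation,
            i.e. the indicator is not yet stable enough to benchmark."""
            x = np.asarray(rates, dtype=float)
            mu, sigma = x.mean(), x.std(ddof=1)
            return mu - 3.0 * sigma, mu + 3.0 * sigma

        # Hypothetical monthly HAI rates per 1000 patient-days
        rates = [3.1, 2.8, 3.4, 5.9, 3.0, 2.7, 3.2, 3.3, 2.9, 3.1, 3.0, 2.8]
        lcl, ucl = control_limits(rates)
        flagged = [r for r in rates if not lcl <= r <= ucl]
        print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, special-cause points: {flagged}")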

  3. Investigations of the VVER-1000 coolant transient benchmark phase 1 with the coupled code system RELAP5/PARCS

    International Nuclear Information System (INIS)

    Sanchez-Espinoza, Victor Hugo

    2008-07-01

    As part of the reactor dynamics activities of FZK/IRS, the qualification of best-estimate coupled code systems for reactor safety evaluations is a key step toward improving their prediction capability and acceptability. The VVER-1000 Coolant Transient Benchmark Phase 1 represents an excellent opportunity to validate the simulation capability of the coupled code system RELAP5/PARCS regarding both the thermal hydraulic plant response (RELAP5), using measured data obtained during commissioning tests at the Kozloduy nuclear power plant unit 6, and the neutron kinetics models of PARCS for hexagonal geometries. Phase 1 is devoted to the analysis of the switching on of one main coolant pump while the other three pumps are in operation. It includes the following exercises: (a) investigation of the integral plant response using a best-estimate thermal hydraulic system code with a point kinetics model; (b) analysis of the core response for given initial and transient thermal hydraulic boundary conditions using a coupled code system with a 3D neutron kinetics model; and (c) investigation of the integral plant response using a best-estimate coupled code system with 3D neutron kinetics. Already before the test, complex flow conditions exist within the RPV, e.g. coolant mixing in the upper plenum caused by the reverse flow through loop-3 with the stopped pump. The test is initiated by switching on the main coolant pump of loop-3, which leads to a reversal of the flow through the respective piping. After about 13 s the mass flow rate through this loop reaches values comparable with those of the other loops. During this time period, the increased primary coolant flow causes a reduction of the core averaged coolant temperature and thus an increase of the core power. Later on, the power stabilizes at a level higher than the initial power. In this analysis, special attention is paid to the prediction of the spatially asymmetrical core cooling during the test and its effects on the

  4. Investigations of the VVER-1000 coolant transient benchmark phase 1 with the coupled code system RELAP5/PARCS

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Espinoza, Victor Hugo

    2008-07-15

    As part of the reactor dynamics activities of FZK/IRS, the qualification of best-estimate coupled code systems for reactor safety evaluations is a key step toward improving their prediction capability and acceptability. The VVER-1000 Coolant Transient Benchmark Phase 1 represents an excellent opportunity to validate the simulation capability of the coupled code system RELAP5/PARCS regarding both the thermal hydraulic plant response (RELAP5), using measured data obtained during commissioning tests at the Kozloduy nuclear power plant unit 6, and the neutron kinetics models of PARCS for hexagonal geometries. Phase 1 is devoted to the analysis of the switching on of one main coolant pump while the other three pumps are in operation. It includes the following exercises: (a) investigation of the integral plant response using a best-estimate thermal hydraulic system code with a point kinetics model, (b) analysis of the core response for given initial and transient thermal hydraulic boundary conditions using a coupled code system with a 3D neutron kinetics model, and (c) investigation of the integral plant response using a best-estimate coupled code system with 3D neutron kinetics. Already before the test, complex flow conditions exist within the RPV, e.g. coolant mixing in the upper plenum caused by the reverse flow through loop 3 with the stopped pump. The test is initiated by switching on the main coolant pump of loop 3, which leads to a reversal of the flow through the respective piping. After about 13 s, the mass flow rate through this loop reaches values comparable with those of the other loops. During this time period, the increased primary coolant flow causes a reduction of the core averaged coolant temperature and thus an increase of the core power. Later on, the power stabilizes at a level higher than the initial power. In this analysis, special attention is paid to the prediction of the spatially asymmetrical core cooling during the test and its effects on the

  5. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  6. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  7. JNC results of BFS-62-3A benchmark calculation (CRP: Phase 5)

    International Nuclear Information System (INIS)

    Ishikawa, M.

    2004-01-01

    The present work presents the results of JNC, Japan, for Phase 5 of the IAEA CRP benchmark problem (the BFS-62-3A critical experiment). JNC's analytical method is based on: the nuclear data library JENDL-3.2; the group constant set JFS-3-J3.2R (70-group, ABBN-type self-shielding factor table based on JENDL-3.2); and current-weighted multigroup transport cross-sections as effective cross-sections. The cell models for the BFS as-built tubes and pellets were: (Case 1) a homogeneous model based on the IPPE definition; (Case 2) homogeneous atomic densities equivalent to JNC's heterogeneous calculation, used only to cross-check the adjusted correction factors; and (Case 3) a heterogeneous model based on JNC's evaluation, a one-dimensional plate-stretch model with Tone's background cross-section method (CASUP code). The basic diffusion calculation was done in 18 groups on a three-dimensional Hex-Z model (the CITATION code), with isotropic diffusion coefficients (Cases 1 and 2) and Benoist's anisotropic diffusion coefficients (Case 3). For the sodium void reactivity, the exact perturbation theory was applied to both the basic and correction calculations: an ultra-fine energy group correction (approx. 100,000 group constants below 50 keV, and ABBN-type 175-group constants with shielding factors above 50 keV), and an 18-group transport theory and mesh size correction for the three-dimensional Hex-Z model (the MINIHEX code, based on the S4-P0 transport method and developed by JNC). The effective delayed neutron fraction in the reactivity scale was fixed at 0.00623 according to the IPPE evaluation. The analytical results for the criticality values and sodium void reactivity coefficient obtained by JNC are presented. JNC cross-checked the homogeneous model and the adjusted correction factors submitted by IPPE and confirmed that they are consistent. The JNC standard system showed quite satisfactory analytical results for the criticality and sodium void reactivity of the BFS-62-3A experiment. JNC calculated the cross-section sensitivity coefficients of BFS

  8. OECD/DOE/CEA VVER-1000 coolant transient (V1000CT) benchmark - a consistent approach for assessing coupled codes for RIA analysis

    International Nuclear Information System (INIS)

    Boyan D Ivanov; Kostadin N Ivanov; Eric Royer; Sylvie Aniel; Nikola Kolev; Pavlin Groudev

    2005-01-01

    Full text of publication follows: The Rod Ejection Accident (REA) and Main Steam Line Break (MSLB) are two of the most important Design Basis Accidents (DBA) for the VVER-1000, exhibiting significant localized space-time effects. A consistent approach for assessing coupled three-dimensional (3-D) neutron kinetics/thermal hydraulics codes for these Reactivity Insertion Accidents (RIA) is to first validate the codes using the available plant test (measured) data and then perform cross-code comparative analyses for REA and MSLB scenarios. In the framework of a joint effort between the Nuclear Energy Agency (NEA) of the OECD, the United States Department of Energy (US DOE), and the Commissariat a l'Energie Atomique (CEA), France, a coupled 3-D neutron kinetics/thermal hydraulics benchmark was defined. The benchmark is based on data from Unit 6 of the Bulgarian Kozloduy Nuclear Power Plant (NPP). In performing this work, PSU (USA) and CEA-Saclay (France) have collaborated with Bulgarian organizations, in particular with the KNPP and the INRNE. The benchmark consists of two phases: Phase 1: Main Coolant Pump Switching On; Phase 2: Coolant Mixing Tests and MSLB. In addition to the measured (experiment) scenario, an extreme calculation scenario was defined for better testing of 3-D neutronics/thermal-hydraulics techniques: a rod ejection simulation with the control rod ejected in the core sector cooled by the switched-on MCP. Since previous coupled code benchmarks indicated that further development of the mixing computation models in the integrated codes is necessary, a coolant mixing experiment and MSLB transients were selected for simulation in Phase 2 of the benchmark. The MSLB event is characterized by a large asymmetric cooling of the core, stuck rods and a large primary coolant flow variation. Two scenarios are defined in Phase 2: the first scenario is taken from current licensing practice, and the second is derived from the original one using aggravating

  9. Inelastic finite element analysis of a pipe-elbow assembly (benchmark problem 2)

    Energy Technology Data Exchange (ETDEWEB)

    Knapp, H P [Internationale Atomreaktorbau GmbH (INTERATOM) Bergisch Gladbach (Germany); Prij, J [Netherlands Energy Research Foundation (ECN) Petten (Netherlands)

    1979-06-01

    In the scope of the international benchmark problem effort on piping systems, benchmark problem 2, consisting of a pipe-elbow assembly subjected to a time-dependent in-plane bending moment, was analysed using the finite element program MARC. Numerical results are presented and a comparison with experimental results is made. It is concluded that the main reason for the deviation between the calculated and measured values is that creep-plasticity interaction is not taken into account in the analysis. (author)

  10. HEATING6 analysis of international thermal benchmark problem sets 1 and 2

    International Nuclear Information System (INIS)

    Childs, K.W.; Bryan, C.B.

    1986-10-01

    In order to assess the heat transfer computer codes used in the analysis of nuclear fuel shipping casks, the Nuclear Energy Agency Committee on Reactor Physics has defined seven problems for benchmarking thermal codes. All seven of these problems have been solved using the HEATING6 heat transfer code. This report presents the results of five of the problems. The remaining two problems were used in a previous benchmarking of thermal codes used in the United States, and their solutions have been previously published

  11. Benchmark Analysis Of The High Temperature Gas Cooled Reactors Using Monte Carlo Technique

    International Nuclear Information System (INIS)

    Nguyen Kien Cuong; Huda, M.Q.

    2008-01-01

    Information about several past and present experimental and prototypical facilities based on High Temperature Gas-Cooled Reactor (HTGR) concepts has been examined to assess the potential of these facilities for use in this benchmarking effort. Both reactors and critical facilities applicable to pebble-bed type cores have been considered. Two facilities, HTR-PROTEUS of Switzerland and HTR-10 of China, and one conceptual design from Germany, HTR-PAP20, appear to have the greatest potential for use in benchmarking the codes. This study presents the benchmark analysis of these reactor technologies using the MCNP4C2 and MVP/GMVP codes to support the evaluation and future development of HTGRs. The ultimate objective of this work is to identify and develop new capabilities needed to support the Generation IV initiative. (author)

  12. Synthetic graph generation for data-intensive HPC benchmarking: Scalability, analysis and real-world application

    Energy Technology Data Exchange (ETDEWEB)

    Powers, Sarah S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Lothian, Joshua [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-12-01

    The benchmarking effort within the Extreme Scale Systems Center at Oak Ridge National Laboratory seeks to provide High Performance Computing benchmarks and test suites of interest to the DoD sponsor. The work described in this report is a part of the effort focusing on graph generation. A previously developed benchmark, SystemBurn, allows the emulation of a broad spectrum of application behavior profiles within a single framework. To complement this effort, similar capabilities are desired for graph-centric problems. This report describes an in-depth analysis of the generated synthetic graphs' properties at a variety of scales using different generator implementations and examines their applicability to replicating real-world datasets.
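    The report itself concerns ORNL's generator implementations; as a loose, assumed illustration of the kind of property analysis described, the sketch below generates a synthetic scale-free graph with networkx (a placeholder generator choice, not the report's code) and summarizes the degree distribution one would compare against real-world datasets.

```python
import networkx as nx
import numpy as np

# Barabasi-Albert preferential attachment as a stand-in synthetic generator
G = nx.barabasi_albert_graph(n=10_000, m=4, seed=42)

degrees = np.array([d for _, d in G.degree()])
print(f"nodes={G.number_of_nodes()}, edges={G.number_of_edges()}")
print(f"mean degree={degrees.mean():.2f}, max degree={degrees.max()}")

# A heavy right tail in the degree distribution is one of the properties
# examined when judging applicability to real-world datasets.
print("degree percentiles (50/90/99/99.9):",
      np.percentile(degrees, [50, 90, 99, 99.9]))
```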

  13. Benchmarking lattice physics data and methods for boiling water reactor analysis

    International Nuclear Information System (INIS)

    Cacciapouti, R.J.; Edenius, M.; Harris, D.R.; Hebert, M.J.; Kapitz, D.M.; Pilat, E.E.; VerPlanck, D.M.

    1983-01-01

    The objective of the work reported was to verify the adequacy of lattice physics modeling for the analysis of the Vermont Yankee BWR using a multigroup, two-dimensional transport theory code. The BWR lattice physics methods have been benchmarked against reactor physics experiments, higher order calculations, and actual operating data

  14. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1)

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predict...

  15. Benchmarking study and its application for shielding analysis of large accelerator facilities

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee-Seock; Kim, Dong-hyun; Oranj, Leila Mokhtari; Oh, Joo-Hee; Lee, Arim; Jung, Nam-Suk [POSTECH, Pohang (Korea, Republic of)

    2015-10-15

    Shielding analysis is one of the subjects indispensable to the construction of a large accelerator facility. Several methods, such as Monte Carlo, discrete ordinates, and simplified calculations, have been used for this purpose. The calculation precision can be improved by increasing the number of trials (histories), but accuracy is still a major issue in shielding analysis. To secure accuracy in Monte Carlo calculations, benchmarking studies using experimental data and code comparisons are fundamental. In this paper, benchmarking results for electrons, protons, and heavy ions are presented, and the proper application of the results is discussed. The benchmarking calculations, which are indispensable in shielding analysis, were performed for different particles: protons, heavy ions and electrons. Four different multi-particle Monte Carlo codes, MCNPX, FLUKA, PHITS, and MARS, were examined for the higher energy range relevant to large accelerator facilities. The degree of agreement between the experimental data, including the SINBAD database, and the calculated results was estimated in terms of secondary neutron production and attenuation through concrete and iron shields. The degree of discrepancy and the features of the Monte Carlo codes were investigated, and the application of the benchmarking results is discussed in view of the safety margin and the selection of a code for shielding analysis. In most cases, the tested Monte Carlo codes give credible results, except for a few limitations of each code.

  16. Benchmarking study and its application for shielding analysis of large accelerator facilities

    International Nuclear Information System (INIS)

    Lee, Hee-Seock; Kim, Dong-hyun; Oranj, Leila Mokhtari; Oh, Joo-Hee; Lee, Arim; Jung, Nam-Suk

    2015-01-01

    Shielding analysis is one of the subjects indispensable to the construction of a large accelerator facility. Several methods, such as Monte Carlo, discrete ordinates, and simplified calculations, have been used for this purpose. The calculation precision can be improved by increasing the number of trials (histories), but accuracy is still a major issue in shielding analysis. To secure accuracy in Monte Carlo calculations, benchmarking studies using experimental data and code comparisons are fundamental. In this paper, benchmarking results for electrons, protons, and heavy ions are presented, and the proper application of the results is discussed. The benchmarking calculations, which are indispensable in shielding analysis, were performed for different particles: protons, heavy ions and electrons. Four different multi-particle Monte Carlo codes, MCNPX, FLUKA, PHITS, and MARS, were examined for the higher energy range relevant to large accelerator facilities. The degree of agreement between the experimental data, including the SINBAD database, and the calculated results was estimated in terms of secondary neutron production and attenuation through concrete and iron shields. The degree of discrepancy and the features of the Monte Carlo codes were investigated, and the application of the benchmarking results is discussed in view of the safety margin and the selection of a code for shielding analysis. In most cases, the tested Monte Carlo codes give credible results, except for a few limitations of each code

  17. OECD/NEA burnup credit criticality benchmarks phase IIIA: Criticality calculations of BWR spent fuel assemblies in storage and transport

    Energy Technology Data Exchange (ETDEWEB)

    Okuno, Hiroshi; Naito, Yoshitaka [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Ando, Yoshihira [Toshiba Corp., Kawasaki, Kanagawa (Japan)

    2000-09-01

    The report describes the final results of the Phase IIIA Benchmarks conducted by the Burnup Credit Criticality Calculation Working Group under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development (OECD/NEA). The benchmarks are intended to confirm the predictive capability of current computer code and data library combinations for the neutron multiplication factor (k{sub eff}) of a layer of an irradiated BWR fuel assembly array model. In total, 22 benchmark problems are proposed for calculations of k{sub eff}. The effects of the following parameters are investigated: cooling time, inclusion/exclusion of FP nuclides, axial burnup profile, and inclusion of an axial void fraction profile or constant void fractions during burnup. Axial profiles of fractional fission rates are further requested for five cases out of the 22 problems. Twenty-one sets of results are presented, contributed by 17 institutes from 9 countries. The relative dispersion of the k{sub eff} values calculated by the participants from the mean value is almost within the band of {+-}1%{delta}k/k. The deviations from the averaged calculated fission rate profiles are found to be within {+-}5% for most cases. (author)

  18. Multiphysics field analysis and multiobjective design optimization: a benchmark problem

    Czech Academy of Sciences Publication Activity Database

    di Barba, P.; Doležel, Ivo; Karban, P.; Kůs, P.; Mach, F.; Mognaschi, M. E.; Savini, A.

    2014-01-01

    Vol. 22, No. 7 (2014), p. 1214-1225. ISSN 1741-5977. R&D Projects: GA ČR(CZ) GAP102/11/0498. Institutional support: RVO:61388998. Keywords: coupled-field problems * finite-element analysis * hp-FEM adaptation. Subject RIV: JA - Electronics; Optoelectronics, Electrical Engineering. Impact factor: 0.868, year: 2014

  19. Detail analysis of fusion neutronics benchmark experiment on beryllium

    International Nuclear Information System (INIS)

    Konno, Chikara; Ochiai, Kentaro; Takakura, Kosuke; Ohnishi, Seiki; Kondo, Keitaro; Wada, Masayuki; Sato, Satoshi

    2010-01-01

    Our previous analysis of the integral experiments (in situ and TOF experiments) on beryllium with DT neutrons at JAEA/FNS identified two problems when using MCNP4C and the latest nuclear data libraries: one was an anomalously large neutron peak around 12 MeV appearing in the TOF experiment analysis with JEFF-3.1, and the other was an overestimation of low-energy neutrons in the in situ experiment analyses with all the nuclear data libraries. We investigated the reasons for these problems in detail. It was found that the official ACE file MCJEFF3.1 of JEFF-3.1 had an inconsistency with the original JEFF-3.1, which caused the anomalously large neutron peak around 12 MeV in the TOF experiment analysis. We also found that the calculated thermal neutron peak was probably too large in the in situ experiment. As a trial, we examined the influence of the thermal neutron scattering law data of beryllium metal in ENDF/B-VI. The result indicated that the coherent elastic scattering cross-section data in the thermal neutron scattering law data of beryllium metal were probably too large.

  20. Analysis of Homogeneous BFS-73-1 MA Benchmark Core

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yeong Il; Yoo, Jae Woon; Song, Hoon; Jang, Jin Wook; Kim, Yeong Il

    2007-06-15

    Analysis of the BFS-73-1 critical assembly for MA transmutation has been carried out using the K-CORE system, mainly the DIF3D code. All measured data are compared with the results of the analysis, and the sensitivity to calculation conditions, for example, the number of neutron energy groups, the mesh size used, and the analysis method, is assessed. The effective multiplication factor was in good agreement within experimental uncertainty in both transport and diffusion calculations. The fission rate distributions of U-235 and U-238 also agree fairly well with the experimental results, within a maximum of 5% in the core region. However, a large discrepancy was seen in the blanket region, and it tends to increase as the location approaches the core boundary. The largest error in the relative reaction rate ratios was seen in Am-243 fission and U-238 capture. For Am-243, the error lay within an acceptable range considering its measurement uncertainty of 4.6%. Sample reactivity worths for scattering-dominant isotopes differed greatly from the experimental results, which can be explained in terms of the sample heterogeneity effect, sample self-shielding and, finally, the resonance bilinear correction effect. These effects will be evaluated in a future study. The C/E of the effective delayed neutron fraction is within 4%, which is within the measurement uncertainty.

  1. Analysis of Homogeneous BFS-73-1 MA Benchmark Core

    International Nuclear Information System (INIS)

    Kim, Yeong Il; Yoo, Jae Woon; Song, Hoon; Jang, Jin Wook; Kim, Yeong Il

    2007-06-01

    Analysis of the BFS-73-1 critical assembly for MA transmutation has been carried out using the K-CORE system, mainly the DIF3D code. All measured data are compared with the results of the analysis, and the sensitivity to calculation conditions, for example, the number of neutron energy groups, the mesh size used, and the analysis method, is assessed. The effective multiplication factor was in good agreement within experimental uncertainty in both transport and diffusion calculations. The fission rate distributions of U-235 and U-238 also agree fairly well with the experimental results, within a maximum of 5% in the core region. However, a large discrepancy was seen in the blanket region, and it tends to increase as the location approaches the core boundary. The largest error in the relative reaction rate ratios was seen in Am-243 fission and U-238 capture. For Am-243, the error lay within an acceptable range considering its measurement uncertainty of 4.6%. Sample reactivity worths for scattering-dominant isotopes differed greatly from the experimental results, which can be explained in terms of the sample heterogeneity effect, sample self-shielding and, finally, the resonance bilinear correction effect. These effects will be evaluated in a future study. The C/E of the effective delayed neutron fraction is within 4%, which is within the measurement uncertainty

  2. RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom

    2012-06-01

    The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady-state results have been obtained for Exercise 2 of Phase I of the newly defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady-state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3, using a two-group cross-section library and the RELAP5-3D model developed for Exercise 2.

  3. Analysis result for OECD benchmark on thermal fatigue problem

    International Nuclear Information System (INIS)

    Kamaya, Masayuki; Nakamura, Akira; Fujii, Yuzou

    2005-01-01

    The main objective of this analysis is to understand the crack growth behavior under three-dimensional (3D) thermal fatigue by conducting 3D crack initiation and propagation analyses. The possibility of crack propagation through the wall thickness of the pipe, and the accuracy of the prediction of crack initiation and propagation, are of major interest. In this report, in order to estimate the heat transfer coefficients and evaluate the thermal stress, a conventional finite element analysis (FEA) is conducted. Then, the crack driving force is evaluated using the finite element alternating method (FEAM), which can derive the stress intensity factor (SIF) under 3D mechanical loading based on finite element analysis without generating a mesh for the cracked body. Through these two realistic 3D numerical analyses, the crack initiation and propagation behavior was predicted. The thermal fatigue crack initiation and propagation behavior were numerically analyzed. The conventional FEA was conducted in order to estimate the heat transfer coefficients and evaluate the thermal stress. Then, the FEAM was conducted to evaluate the SIFs of single surface cracks and interacting multiple cracks, and the crack growth was evaluated. The results are summarized as follows: 1. The heat transfer coefficients were estimated as H_air = 40 W/m²K and H_water = 5000 W/m²K. This allows simulation of the change in temperature with time at the crack initiation points obtained in the experiment. 2. The maximum stress occurred along the line of symmetry, and the maximum Mises equivalent stress was 572 MPa. 3. By taking the effect of mean stress into account according to the modified Goodman diagram, the equivalent stress range and the number of cycles to crack initiation were estimated as 1093 MPa and 3.8×10⁴, respectively, with the tensile strength assumed to be 600 MPa. 4. It was shown from the evaluated SIFs that longitudinal cracks can penetrate the wall of the pipe
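    Result 3 rests on the modified Goodman mean-stress correction, which is easy to make concrete. A minimal sketch, using the tensile strength of 600 MPa assumed in the abstract; the amplitude and mean stress inputs are illustrative placeholders, not values from the analysis.

```python
def goodman_equivalent_amplitude(sigma_a, sigma_m, sigma_u):
    """Modified Goodman: map an (amplitude, mean) stress pair to the
    equivalent fully reversed amplitude sigma_ar for entering an S-N curve:
        sigma_a / sigma_ar + sigma_m / sigma_u = 1
    """
    if sigma_m >= sigma_u:
        raise ValueError("mean stress must be below the tensile strength")
    return sigma_a / (1.0 - sigma_m / sigma_u)

sigma_u = 600.0   # tensile strength [MPa], as assumed in the abstract
sigma_a = 286.0   # stress amplitude [MPa] (placeholder)
sigma_m = 60.0    # mean stress [MPa] (placeholder)
sigma_ar = goodman_equivalent_amplitude(sigma_a, sigma_m, sigma_u)
print(f"equivalent fully reversed stress range: {2 * sigma_ar:.0f} MPa")
```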

  4. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia

    2010-01-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to 'translate' the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  5. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  6. Performance evaluation of tile-based Fisher Ratio analysis using a benchmark yeast metabolome dataset.

    Science.gov (United States)

    Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E

    2016-08-12

    Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had previously been analyzed in great detail, albeit with a brute-force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed much more rapidly, in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby the variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology, while consistently excluding all but one of the nineteen benchmarked false-positive metabolites previously identified.
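    A minimal, assumed sketch of the statistic at the core of the method: the Fisher ratio (between-class over within-class variance) for one variable across the repressed and derepressed classes, with a label-permutation null distribution used to set the threshold, as the abstract describes. The data and sample sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fisher_ratio(a, b):
    """F-ratio for one variable: between-class variance over the
    pooled within-class variance of two classes."""
    na, nb = len(a), len(b)
    grand = np.concatenate([a, b]).mean()
    between = na * (a.mean() - grand) ** 2 + nb * (b.mean() - grand) ** 2
    within = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return between / within

# Illustrative signal intensities for one metabolite in each class
repressed = rng.normal(10.0, 1.0, size=6)
derepressed = rng.normal(13.0, 1.0, size=6)
f_obs = fisher_ratio(repressed, derepressed)

# Null distribution: shuffle class labels to find an F-ratio threshold
pooled = np.concatenate([repressed, derepressed])
f_null = []
for _ in range(2000):
    rng.shuffle(pooled)
    f_null.append(fisher_ratio(pooled[:6], pooled[6:]))
threshold = np.percentile(f_null, 95)
print(f"F observed = {f_obs:.1f}, 95th percentile null threshold = {threshold:.1f}")
# Variables with F below the threshold are ignored as not class distinguishing.
```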

  7. Performance Based Clustering for Benchmarking of Container Ports: an Application of Dea and Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Jie Wu

    2010-12-01

    Full Text Available The operational performance of container ports has received more and more attention in both academic and practitioner circles, and the performance evaluation and process improvement of container ports have been the focus of several studies. In this paper, Data Envelopment Analysis (DEA), an effective tool for relative efficiency assessment, is utilized for measuring the performance and benchmarking of 77 world container ports in 2007. The approaches used in the current study consider four inputs (Capacity of Cargo Handling Machines, Number of Berths, Terminal Area and Storage Capacity) and a single output (Container Throughput). The results for the efficiency scores are analyzed, and a unique ordering of the ports based on average cross-efficiency is provided; cluster analysis is then used to select the most appropriate targets for poorly performing ports to use as benchmarks.
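    A minimal sketch of the input-oriented CCR DEA model such a study solves once per port: minimize the input contraction factor theta subject to an envelopment of the port by a nonnegative combination of all ports. The implementation below uses scipy's linear programming routine; the port data are invented placeholders, not the 2007 dataset.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency per DMU.
    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for o in range(n):
        c = np.zeros(1 + n)
        c[0] = 1.0                                  # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):                          # sum_j lam_j x_ij <= theta * x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
        for r in range(s):                          # sum_j lam_j y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores[o] = res.x[0]
    return scores

# Placeholder data: 4 ports, inputs (berths, terminal area), output (throughput)
X = np.array([[5, 60.0], [8, 90.0], [4, 40.0], [10, 120.0]])
Y = np.array([[300.0], [520.0], [260.0], [580.0]])
print(np.round(dea_ccr_input(X, Y), 3))  # a score of 1.0 marks a benchmark port
```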

  8. CIEMAT’s contribution to the phase II of the OECD-NEA RIA benchmark on thermo-mechanical fuel codes performance

    Energy Technology Data Exchange (ETDEWEB)

    Sagrado, I.C.; Vallejo, I.; Herranz, L.E.

    2015-07-01

    As a part of the international efforts devoted to validating and/or updating the current fuel safety criteria, the OECD-NEA has launched a second phase of the RIA benchmark on thermomechanical fuel code performance. CIEMAT contributes by simulating the ten proposed scenarios with FRAPTRAN and SCANAIR. Both codes lead to similar predictions during the heating-up phase; however, during the cooling-down phase significant deviations may appear. These are mainly caused by the estimates of gap closure and re-opening and by the clad-to-water heat exchange approaches. The uncertainty analysis performed for the SCANAIR estimates leads to uncertainty ranges below 15% and 28% for the maximum temperatures and deformations, respectively. The corresponding sensitivity analysis shows that, in addition to the injected energy, special attention should be paid to the fuel thermal expansion and clad yield stress models. (Author)

  9. A Benchmark Study of a Seismic Analysis Program for a Single Column of a HTGR Core

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Ji Ho [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    A seismic analysis program, SAPCOR (Seismic Analysis of Prismatic HTGR Core), was developed at the Korea Atomic Energy Research Institute. The program is used for the evaluation of the deformed shapes of, and forces on, the graphite blocks, which are modeled as point-mass rigid bodies with Kelvin-Voigt impact models. In previous studies, the program was verified using theoretical solutions and benchmark problems. To validate the program for more complicated problems, a free vibration analysis of a single column of an HTGR core was selected, and the calculation results of SAPCOR and a commercial FEM code, Abaqus, were compared in this study.

  10. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operating points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  11. OECD/NEA burnup credit criticality benchmarks phase IIIB: Burnup calculations of BWR fuel assemblies for storage and transport

    International Nuclear Information System (INIS)

    Okuno, Hiroshi; Naito, Yoshitaka; Suyama, Kenya

    2002-02-01

    The report describes the final results of the Phase IIIB Benchmark conducted by the Expert Group on Burnup Credit Criticality Safety under the auspices of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD). The Benchmark was intended to compare the predictability of current computer code and data library combinations for the atomic number densities of an irradiated BWR fuel assembly model. The fuel assembly was irradiated at a specific power of 25.6 MW/tHM up to 40 GWd/tHM and cooled for five years. The void fraction was assumed to be uniform throughout the channel box and constant, at 0, 40 and 70%, during burnup. In total, 16 results were submitted from 13 institutes of 7 countries. The calculated atomic number densities of 12 actinides and 20 fission product nuclides were found to be for the most part within a range of ±10% relative to the average, although some results, especially 155Eu and the gadolinium isotopes, exceeded the band, which will require further investigation. Pin-wise burnup results agreed well among the participants. The results for the infinite neutron multiplication factor k∞ also accorded well with each other for void fractions of 0 and 40%; however, some results deviated noticeably from the averaged value for the void fraction of 70%. (author)

  12. OECD/NEA burnup credit criticality benchmarks phase IIIB. Burnup calculations of BWR fuel assemblies for storage and transport

    Energy Technology Data Exchange (ETDEWEB)

    Okuno, Hiroshi; Naito, Yoshitaka; Suyama, Kenya [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    2002-02-01

    The report describes the final results of the Phase IIIB Benchmark conducted by the Expert Group on Burnup Credit Criticality Safety under the auspices of the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD). The Benchmark was intended to compare the predictability of current computer code and data library combinations for the atomic number densities of an irradiated BWR fuel assembly model. The fuel assembly was irradiated at a specific power of 25.6 MW/tHM up to 40 GWd/tHM and cooled for five years. The void fraction was assumed to be uniform throughout the channel box and constant, at 0, 40 and 70%, during burnup. In total, 16 results were submitted from 13 institutes of 7 countries. The calculated atomic number densities of 12 actinides and 20 fission product nuclides were found to be for the most part within a range of {+-}10% relative to the average, although some results, especially {sup 155}Eu and the gadolinium isotopes, exceeded the band, which will require further investigation. Pin-wise burnup results agreed well among the participants. The results for the infinite neutron multiplication factor k{sub {infinity}} also accorded well with each other for void fractions of 0 and 40%; however, some results deviated noticeably from the averaged value for the void fraction of 70%. (author)

  13. Definition and Analysis of Heavy Water Reactor Benchmarks for Testing New Wims-D Libraries

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2000-01-01

    This work is part of the IAEA WIMS Library Update Project (WLUP). A group of heavy water reactor benchmarks has been selected for testing new WIMS-D libraries, including calculations with the WIMSD5B program and the analysis of results. These benchmarks cover a wide variety of reactors and conditions, from fresh fuels to high burnup, and from natural to enriched uranium. Besides, each benchmark includes variations in lattice pitch and in coolants (normally heavy water and void). Multiplication factors with critical experimental bucklings and other parameters are calculated and compared with experimental reference values. The WIMS libraries used for the calculations were generated with basic data from JEF-2.2 Rev. 3 (JEF) and ENDF/B-VI Release 5 (E6). Results obtained with the WIMS-86 (W86) library, included with the WIMSD5B package, from Winfrith, UK, with adjusted data, are also included to show the improvements obtained with the new, non-adjusted libraries. The calculations with WIMSD5B were made with two methods (input program options): PIJ (two-dimensional collision probability method) and DSN (one-dimensional Sn method, with homogenization of materials by ring). The general conclusions are: the library based on JEF data and the DSN method give the best results, which on average are acceptable

  14. NODAL3 Sensitivity Analysis for NEACRP 3D LWR Core Transient Benchmark (PWR)

    Directory of Open Access Journals (Sweden)

    Surian Pinem

    2016-01-01

    Full Text Available This paper reports the results of a sensitivity analysis of the multidimensional, multigroup neutron diffusion code NODAL3 for the NEACRP 3D LWR core transient benchmark (PWR). The code input parameters covered in the sensitivity analysis are the radial and axial node sizes (the number of radial nodes per fuel assembly and the number of axial layers), the heat conduction node size in the fuel pellet and cladding, and the maximum time step. The output parameters considered in this analysis follow the above-mentioned core transient benchmark, that is, the power peak, the time of the power peak, power, averaged Doppler temperature, maximum fuel centerline temperature, and coolant outlet temperature at the end of the simulation (5 s). The sensitivity analysis results showed that the radial node size and maximum time step have a significant effect on the transient parameters, especially the time of the power peak, for the HZP and HFP conditions. The number of ring divisions for the fuel pellet and cladding has a negligible effect on the transient solutions. For productive PWR transient analysis work, based on the present sensitivity analysis results, we recommend NODAL3 users to use 2×2 radial nodes per assembly, 1×18 axial layers per assembly, a maximum time step of 10 ms, and 9 and 1 ring divisions for the fuel pellet and cladding, respectively.

  15. Statistical Analysis of Reactor Pressure Vessel Fluence Calculation Benchmark Data Using Multiple Regression Techniques

    International Nuclear Information System (INIS)

    Carew, John F.; Finch, Stephen J.; Lois, Lambros

    2003-01-01

    The calculated >1-MeV pressure vessel fluence is used to determine the fracture toughness and integrity of the reactor pressure vessel. It is therefore of the utmost importance to ensure that the fluence prediction is accurate and unbiased. In practice, this assurance is provided by comparing the predictions of the calculational methodology with an extensive set of accurate benchmarks. A benchmarking database is used to provide an estimate of the overall average measurement-to-calculation (M/C) bias in the calculations. This average is used as an ad hoc multiplicative adjustment to correct the calculations for the observed calculational bias. However, this average only provides a well-defined and valid adjustment of the fluence if the M/C data are homogeneous, i.e., the data are statistically independent and there is no correlation between subsets of M/C data. Typically, the identification of correlations between the errors in the database M/C values is difficult because the correlation is of the same magnitude as the random errors in the M/C data and varies substantially over the database. In this paper, an evaluation of a reactor dosimetry benchmark database is performed to determine the statistical validity of the adjustment to the calculated pressure vessel fluence. Physical mechanisms that could potentially introduce a correlation between subsets of M/C ratios are identified and included in a multiple regression analysis of the M/C data. Rigorous statistical criteria are used to evaluate the homogeneity of the M/C data and determine the validity of the adjustment. For the database evaluated, the M/C data are found to be strongly correlated with dosimeter response threshold energy and dosimeter location (e.g., cavity versus in-vessel). It is shown that, because of the inhomogeneity in the M/C data, for this database the benchmark data do not provide a valid basis for adjusting the pressure vessel fluence. The statistical criteria and methods employed in
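    A minimal sketch of the kind of homogeneity check described: regress ln(M/C) on dosimeter response threshold energy and a cavity/in-vessel indicator, then inspect the slope t-statistics; significant slopes flag inhomogeneous M/C data for which a single average bias adjustment is not valid. All data below are synthetic placeholders, not the paper's database.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic M/C database: threshold energy [MeV], location (0=in-vessel, 1=cavity)
e_thr = rng.uniform(0.5, 3.0, size=80)
cavity = rng.integers(0, 2, size=80).astype(float)
log_mc = 0.02 + 0.015 * e_thr + 0.03 * cavity + rng.normal(0.0, 0.03, size=80)

# Multiple regression: ln(M/C) = b0 + b1*E_thr + b2*cavity
A = np.column_stack([np.ones_like(e_thr), e_thr, cavity])
coef, *_ = np.linalg.lstsq(A, log_mc, rcond=None)

# t-statistics of the coefficients
n, p = A.shape
resid = log_mc - A @ coef
s2 = resid @ resid / (n - p)
t = coef / np.sqrt(np.diag(s2 * np.linalg.inv(A.T @ A)))
print("coefficients:", np.round(coef, 4))
print("t-statistics:", np.round(t, 2))   # large |t| on slopes -> inhomogeneity
```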

  16. Preliminary evaluation of factors associated with premature trial closure and feasibility of accrual benchmarks in phase III oncology trials.

    Science.gov (United States)

    Schroen, Anneke T; Petroni, Gina R; Wang, Hongkun; Gray, Robert; Wang, Xiaofei F; Cronin, Walter; Sargent, Daniel J; Benedetti, Jacqueline; Wickerham, Donald L; Djulbegovic, Benjamin; Slingluff, Craig L

    2010-08-01

    A major challenge for randomized phase III oncology trials is the frequently low rate of patient enrollment, resulting in high rates of premature closure due to insufficient accrual. We conducted a pilot study to determine the extent of trial closure due to poor accrual, the feasibility of identifying trial factors associated with sufficient accrual, the impact of redesign strategies on trial accrual, and accrual benchmarks designating high failure risk in the clinical trials cooperative group (CTCG) setting. A subset of phase III trials opened by five CTCGs between August 1991 and March 2004 was evaluated. Design elements, experimental agents, redesign strategies, and pretrial accrual assessments supporting accrual predictions were abstracted from CTCG documents. The percent actual/predicted accrual rate averaged per month was calculated. Trials were categorized as having sufficient or insufficient accrual based on the reason for trial termination. Analyses included univariate and bivariate summaries to identify potential trial factors associated with accrual sufficiency. Among 40 trials from one CTCG, 21 (52.5%) closed due to insufficient accrual. In 82 trials from five CTCGs, therapeutic trials accrued sufficiently more often than nontherapeutic trials (59% vs 27%, p = 0.05). Trials including a pretrial accrual assessment achieved sufficient accrual more often than those without (67% vs 47%, p = 0.08). Fewer exclusion criteria, shorter consent forms, other CTCG participation, and trial design simplicity were not associated with achieving sufficient accrual. Trials accruing at a rate much lower than predicted were consistently closed due to insufficient accrual. This trial subset under-represents certain experimental modalities, and the data sources do not allow accounting for all factors potentially related to accrual success. Trial closure due to insufficient accrual is common. Certain trial design factors appear associated with attaining sufficient accrual. Defining

  17. Analysis of the MZA/MZB benchmarks with modern nuclear data sets

    International Nuclear Information System (INIS)

    Rooijen, W.F.G. van

    2013-01-01

    Highlights: • ERANOS libraries are produced based on four modern nuclear data sets. • The MOZART MZA/MZB benchmarks are analyzed with these libraries. • Results are generally acceptable in an academic context, but for highly accurate applications data adjustment is required. • Some discrepancies between the calculations and the benchmark results remain and cannot be readily explained. • Successful generation of ECCO libraries and covariance data for ERANOS. - Abstract: For fast reactor design and analysis, our laboratory uses, amongst others, the ERANOS code system. Unfortunately, the publicly available version of ERANOS does not include the most recent nuclear data. Therefore, it was decided to implement an integrated processing system to generate cross-section libraries for the ECCO cell code, as well as covariance data. Cross sections are generated from the original ENDF files. For our purposes, it is important to ascertain that the ECCO cross-section libraries are of adequate quality to allow design and analysis of advanced fast reactors in an academic context. In this paper, we present an analysis of the MZA/MZB benchmarks with nuclear data from JENDL-4.0, JEFF-3.1.2 and ENDF/B-VII.1. The results show that reactivity is generally well predicted, with an uncertainty of about 1% due to the covariances of the nuclear data. Reaction rate ratios are satisfactorily calculated, as are the flux spectrum and reaction rate traverses. Some problems remain: the magnitude of the void effect is not satisfactorily calculated, and reaction rate traverses are not always satisfactorily calculated. On the whole, the ECCO libraries are sufficient for design and analysis tasks in an academic context. For high-precision calculations, such as required for licensing tasks and detailed design calculations, data adjustment is still necessary, as the “native” covariance data in the ENDF files is not accurate enough

  18. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be an appropriate tool in the search for a point of reference necessary to assess an institution's competitive position and to learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking applications in HEIs worldwide. The study involves indicating the premises for using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects, relating them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The examples presented were chosen to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the author's experience from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived from the conducted analysis.

  19. Deflection-based method for seismic response analysis of concrete walls: Benchmarking of CAMUS experiment

    International Nuclear Information System (INIS)

    Basu, Prabir C.; Roshan, A.D.

    2007-01-01

    A number of shake table tests were conducted on a scaled-down model of a concrete wall as part of the CAMUS experiment. The experiments were conducted between 1996 and 1998 in the CEA facilities in Saclay, France. Benchmarking of the CAMUS experiments was undertaken as part of the coordinated research program on 'Safety Significance of Near-Field Earthquakes' organised by the International Atomic Energy Agency (IAEA). The deflection-based method was adopted for the benchmarking exercise. The non-linear static procedure of the deflection-based method has two basic steps: pushover analysis, and determination of the target displacement or performance point. Pushover analysis is an analytical procedure to assess the capacity to withstand seismic loading that a structural system can offer, considering redundancies and inelastic deformation. The outcome of a pushover analysis is the force-displacement (base shear versus top/roof displacement) curve of the structure, obtained by a step-by-step non-linear static analysis of the structure with increasing load. The second step is to determine the target displacement, also known as the performance point: the likely maximum displacement of the structure due to a specified seismic input motion. Established procedures, FEMA-273 and ATC-40, are available to determine this maximum deflection. The responses of the CAMUS test specimen were determined by the deflection-based method, and the analytically calculated values compare well with the test results
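    For the second step, FEMA-273 estimates the target displacement with the coefficient method. A minimal sketch of that formula, with all coefficients chosen as rough placeholders rather than values from the CAMUS benchmark:

```python
import math

def target_displacement(C0, C1, C2, C3, Sa_g, Te):
    """FEMA-273 coefficient method: likely maximum roof displacement
    delta_t = C0*C1*C2*C3 * Sa * (Te/(2*pi))**2 * g, with Sa in units
    of g and Te the effective fundamental period [s]."""
    g = 9.81  # m/s^2
    return C0 * C1 * C2 * C3 * Sa_g * g * (Te / (2.0 * math.pi)) ** 2

# Placeholder coefficients for a short-period wall structure
delta_t = target_displacement(C0=1.2, C1=1.1, C2=1.0, C3=1.0, Sa_g=0.8, Te=0.25)
print(f"target displacement: {delta_t * 1000:.1f} mm")
# The performance point is read off the pushover curve at this displacement.
```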

  20. Burn-up Credit Criticality Safety Benchmark-Phase II-E. Impact of Isotopic Inventory Changes due to Control Rod Insertions on Reactivity and the End Effect in PWR UO2 Fuel Assemblies

    International Nuclear Information System (INIS)

    Neuber, Jens Christian; Tippl, Wolfgang; Hemptinne, Gwendoline de; Maes, Philippe; Ranta-aho, Anssu; Peneliau, Yannick; Jutier, Ludyvine; Tardy, Marcel; Reiche, Ingo; Kroeger, Helge; Nakata, Tetsuo; Armishaw, Malcom; Miller, Thomas M.

    2015-01-01

    PWR UO2 spent fuel assemblies was analysed. The results of the Phase II-C benchmark were used to define the two axial burn-up profiles for the Phase II-E benchmark such that the impact of the asymmetry on the reactivity and the end effect is bounded. The two profiles, together with the sets of isotopic number densities related to different control rod insertion depths during depletion, were provided to the participants in the Phase II-E benchmark. To enable the participants to estimate the end effects related to the profiles and the control rod insertion depths, the isotopic number densities applying to uniform distributions of the two average burn-ups of 30 MWd/kg U and 50 MWd/kg U were also supplied. In the Phase II-E benchmark, basically the same conceptual transport cask configuration was employed as was already used in Phase II-C: a finite transport cask made of stainless steel, containing 21 fuel assemblies separated by borated stainless steel plates. The cask was assumed to be fully flooded with pure light water. In total, fourteen solutions were submitted to the Phase II-E benchmark exercise by ten companies/organisations in seven countries. The participants were asked to calculate, using the two axial burn-up profiles and the related uniform burn-up distributions, the neutron multiplication factors k_eff of the cask configuration, employing the sets of isotopic number densities related to preset control rod insertion depths during depletion. In addition, the optional task was suggested to the participants to calculate, for both the axial burn-up profiles and the related uniform burn-up distributions, the axial fission densities for the axial zones that had been used to describe the axial burn-up distributions for the different control rod insertion depths. For this optional task, three solutions were submitted by three companies/organisations in three countries. The analysis of the results obtained for the Phase II-E benchmark exercise begins with

  1. Report on the on-going EUREDATA Benchmark Exercise on data analysis

    International Nuclear Information System (INIS)

    Besi, A.; Colombo, A.G.

    1989-01-01

    In April 1987 the JRC was charged by the Assembly of the EuReDatA members with the organization and coordination of a Benchmark Exercise (BE) on data analysis. The main aim of the BE is a comparison of the methods used by the various organizations to estimate reliability parameters and functions from field data. The reference data set was to be constituted by raw data taken from the Component Event Data Bank (CEDB). The CEDB is a centralized bank which collects data describing the operational behaviour of components of nuclear power plants operating in various European countries. (orig./HSCH)

  2. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    Science.gov (United States)

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilized process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative time and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. There are some measures that can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark.
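    The fourth step, calculating ARL(0), has a simple closed form for a Shewhart chart under the usual assumption of independent, normally distributed observations: ARL(0) = 1/alpha, where alpha is the two-sided false-alarm probability of the control limits. A minimal sketch:

```python
from scipy.stats import norm

def arl0(k_sigma: float) -> float:
    """In-control average run length of a Shewhart chart with k-sigma
    limits, assuming i.i.d. normal data: ARL0 = 1 / P(|Z| > k)."""
    alpha = 2.0 * norm.sf(k_sigma)   # two-sided tail probability
    return 1.0 / alpha

for k in (2.0, 2.5, 3.0):
    print(f"{k}-sigma limits: ARL0 = {arl0(k):.0f}")
# 3-sigma limits give ARL0 of about 370, i.e. roughly one false alarm per
# 370 in-control observations -- the usual basis for phase I stabilization.
```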

  3. Benchmarking of grid fault modes in single-phase grid-connected photovoltaic systems

    DEFF Research Database (Denmark)

    Yang, Yongheng; Blaabjerg, Frede; Zou, Zhixiang

    2012-01-01

    Pushed by the booming installations of single-phase photovoltaic (PV) systems, the grid demands regarding the integration of PV systems are expected to be modified. Hence, the future PV systems should become more active with functionalities of low voltage ride-through (LVRT) and the grid support...

  4. Analysis of CSNI benchmark test on containment using the code CONTRAN

    International Nuclear Information System (INIS)

    Haware, S.K.; Ghosh, A.K.; Raj, V.V.; Kakodkar, A.

    1994-01-01

    A programme of experimental as well as analytical studies on the behaviour of nuclear reactor containment is being actively pursued. A large number of experiments on pressure and temperature transients have been carried out on a one-tenth scale model vapour suppression pool containment experimental facility, simulating the 220 MWe Indian Pressurised Heavy Water Reactors. A programme of development of computer codes is underway to enable prediction of containment behaviour under accident conditions. This includes codes for pressure and temperature transients, hydrogen behaviour, aerosol behaviour, etc. As a part of this ongoing work, the code CONTRAN (CONtainment TRansient ANalysis) has been developed for predicting the thermal hydraulic transients in a multicompartment containment. For the assessment of the hydrogen behaviour, the models for hydrogen transportation in a multicompartment configuration and hydrogen combustion have been incorporated in the code CONTRAN. The code also has models for the heat and mass transfer due to condensation and convection heat transfer. The structural heat transfer is modeled using the one-dimensional transient heat conduction equation. Extensive validation exercises have been carried out with the code CONTRAN. The code CONTRAN has been successfully used for the analysis of the benchmark test devised by the Committee on the Safety of Nuclear Installations (CSNI) of the Organisation for Economic Cooperation and Development (OECD), to test the numerical accuracy and convergence errors in the computation of mass and energy conservation for the fluid and in the computation of heat conduction in structural walls. The salient features of the code CONTRAN, a description of the CSNI benchmark test and a comparison of the CONTRAN predictions with the benchmark test results are presented and discussed in the paper. (author)

  5. Application of the random vibration approach in the seismic analysis of LMFBR structures - Benchmark calculations

    International Nuclear Information System (INIS)

    Preumont, A.; Shilab, S.; Cornaggia, L.; Reale, M.; Labbe, P.; Noe, H.

    1992-01-01

    This benchmark exercise is the continuation of the state-of-the-art review (EUR 11369 EN) which concluded that the random vibration approach could be an effective tool in seismic analysis of nuclear power plants, with potential advantages over time history and response spectrum techniques. As compared to the latter, the random vibration method provides an accurate treatment of multisupport excitations and non-classical damping, as well as the combination of high-frequency modal components. With respect to the former, the random vibration method offers direct information on statistical variability (probability distribution) and cheaper computations. The disadvantages of the random vibration method are that it is based on stationary results and requires a power spectral density input instead of a response spectrum. A benchmark exercise to compare the three methods from the various aspects mentioned above, on one or several simple structures, was made. The following aspects have been covered with the simplest possible models: (i) statistical variability, (ii) multisupport excitation, (iii) non-classical damping. The random vibration method is therefore concluded to be a reliable method of analysis. Its use is recommended, particularly for preliminary design, owing to its computational advantage over multiple time history analyses

  6. A benchmark for statistical microarray data analysis that preserves actual biological and technical variance.

    Science.gov (United States)

    De Hertogh, Benoît; De Meulder, Bertrand; Berger, Fabrice; Pierre, Michael; Bareke, Eric; Gaigneaux, Anthoula; Depiereux, Eric

    2010-01-11

    Recent reanalysis of spike-in datasets underscored the need for new and more accurate benchmark datasets for statistical microarray analysis. We present here a fresh method using biologically-relevant data to evaluate the performance of statistical methods. Our novel method ranks the probesets from a dataset composed of publicly-available biological microarray data and extracts subset matrices with precise information/noise ratios. Our method can be used to determine the capability of different methods to better estimate variance for a given number of replicates. The mean-variance and mean-fold change relationships of the matrices revealed a closer approximation of biological reality. Performance analysis refined the results from benchmarks published previously. We show that the Shrinkage t test (close to Limma) was the best of the methods tested, except when two replicates were examined, where the Regularized t test and the Window t test performed slightly better. The R scripts used for the analysis are available at http://urbm-cluster.urbm.fundp.ac.be/~bdemeulder/.
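
    To make the comparison concrete: shrinkage and regularized t statistics differ from the classical t test mainly by stabilizing the per-probeset variance estimate, which is poorly determined with few replicates. The sketch below shows one common variant, a SAM-style offset added to the pooled standard error; it is an illustrative stand-in under that assumption, not the exact statistics benchmarked in the paper.

    ```python
    import numpy as np

    def regularized_t(group_a, group_b, s0=None):
        """Two-sample t-statistic with a variance offset s0 (SAM-style).

        Adding s0 to the pooled standard error damps the statistic for
        probesets whose variance is estimated from few replicates.
        Inputs have shape (n_probesets, n_replicates).
        """
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        na, nb = a.shape[-1], b.shape[-1]
        va, vb = a.var(axis=-1, ddof=1), b.var(axis=-1, ddof=1)
        pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
        se = np.sqrt(pooled * (1.0 / na + 1.0 / nb))
        if s0 is None:
            s0 = np.median(se)  # a common heuristic choice for the offset
        return (a.mean(axis=-1) - b.mean(axis=-1)) / (se + s0)

    # Toy usage: 100 probesets with 3 replicates per condition
    rng = np.random.default_rng(2)
    a = rng.normal(0.0, 1.0, size=(100, 3))
    b = rng.normal(0.5, 1.0, size=(100, 3))
    print(regularized_t(a, b)[:5])
    ```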

  7. Two benchmark cases for the trio two-phase flow module

    Energy Technology Data Exchange (ETDEWEB)

    Helton, D.; Hassan, Y. [Texas A and M University, Nuclear Engineering Dept., College Station, Texas (United States); Kumbaro, A. [CEA Saclay, 91 - Gif-sur-Yvette (France). Dept. de Mecanique et de Technologie

    2001-07-01

    This report presents a series of problems that were studied in order to assess the new implementations recently made to the two-phase flow module. Each problem is designed to give insight into a particular area of the code refinement. As such, each problem and its corresponding results will be discussed individually, with comparisons made to experimental or analytical results whenever possible. TrioU is a thermal hydraulics program created by CEA. It is currently evolving into a multi-dimensional, multi-fluid, multi-phase program. The purpose of TrioU is to provide a platform for testing of new numerical methods and physical models that are developed by the Nuclear Reactor Division of CEA. TrioU is written in an object-oriented programming language, and maintained in a version-controlled environment, for ease in parallelization and multiple-site development. (author)

  8. Two benchmark cases for the trio two-phase flow module

    International Nuclear Information System (INIS)

    Helton, D.; Hassan, Y.; Kumbaro, A.

    2001-01-01

    This report presents a series of problems that were studied in order to assess the new implementations recently made to the two-phase flow module. Each problem is designed to give insight into a particular area of the code refinement. As such, each problem and its corresponding results will be discussed individually, with comparisons made to experimental or analytical results whenever possible. TrioU is a thermal hydraulics program created by CEA. It is currently evolving into a multi-dimensional, multi-fluid, multi-phase program. The purpose of TrioU is to provide a platform for testing of new numerical methods and physical models that are developed by the Nuclear Reactor Division of CEA. TrioU is written in an object-oriented programming language, and maintained in a version-controlled environment, for ease in parallelization and multiple-site development. (author)

  9. Determining the sensitivity of Data Envelopment Analysis method used in airport benchmarking

    Directory of Open Access Journals (Sweden)

    Mircea BOSCOIANU

    2013-03-01

    In the last decade there were some important changes in the airport industry, caused by the liberalization of the air transportation market. Until recently airports were considered infrastructure elements, and they were evaluated only by traffic values or their maximum capacity. Gradual orientation towards commercial operation led to the need to find other, more efficiency-oriented ways of evaluation. The existing methods for assessing the efficiency of other production units were not suitable for airports due to the specific features and high complexity of airport operations. In recent years some papers have proposed Data Envelopment Analysis as a method for assessing operational efficiency in order to conduct benchmarking. This method offers the possibility of dealing with a large number of variables of different types, which represents its main advantage and also recommends it as a good benchmarking tool for airport management. The goal of this paper is to determine the sensitivity of this method in relation to its inputs and outputs. A Data Envelopment Analysis is conducted for 128 airports worldwide, in both input- and output-oriented measures, and the results are analysed against variations of some inputs and outputs. Possible weaknesses of using DEA for assessing airport performance are revealed and analysed against the method's advantages.
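
    For readers unfamiliar with the mechanics, the input-oriented, constant-returns (CCR) envelopment form of DEA is one small linear program per airport: shrink all of its inputs by a common factor theta while a nonnegative combination of peer airports still matches its outputs. A minimal sketch with scipy follows; the toy data are hypothetical, and the paper's actual variable set is much richer.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_ccr_input(X, Y, k):
        """Input-oriented CCR efficiency of DMU k.

        X: (m, n) inputs, Y: (s, n) outputs for n DMUs. Solves
          min theta  s.t.  X @ lam <= theta * X[:, k],
                           Y @ lam >= Y[:, k],  lam >= 0.
        """
        m, n = X.shape
        s = Y.shape[0]
        c = np.r_[1.0, np.zeros(n)]                 # minimize theta
        A_in = np.c_[-X[:, [k]], X]                 # X lam - theta x_k <= 0
        A_out = np.c_[np.zeros((s, 1)), -Y]         # -Y lam <= -y_k
        A_ub = np.r_[A_in, A_out]
        b_ub = np.r_[np.zeros(m), -Y[:, k]]
        bounds = [(0, None)] * (n + 1)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.fun  # efficiency score in (0, 1]

    # Toy example: 2 inputs, 1 output, 4 airports (hypothetical numbers)
    X = np.array([[2.0, 4.0, 3.0, 5.0], [3.0, 2.0, 4.0, 6.0]])
    Y = np.array([[1.0, 1.0, 1.0, 1.0]])
    print([round(dea_ccr_input(X, Y, k), 3) for k in range(4)])
    ```

    The variable-returns (BCC) variant adds the convexity constraint sum(lam) == 1; the ratio of the two scores is what separates scale efficiency from pure technical efficiency.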

  10. Comparison of typical inelastic analysis predictions with benchmark problem experimental results

    International Nuclear Information System (INIS)

    Clinard, J.A.; Corum, J.M.; Sartory, W.K.

    1975-01-01

    The results of exemplary inelastic analyses are presented for a series of experimental benchmark problems. Consistent analytical procedures and constitutive relations were used in each of the analyses, and published material behavior data were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for Type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)

  11. Comparison of typical inelastic analysis predictions with benchmark problem experimental results

    International Nuclear Information System (INIS)

    Clinard, J.A.; Corum, J.M.; Sartory, W.K.

    1975-01-01

    The results of exemplary inelastic analyses for experimental benchmark problems on reactor components are presented. Consistent analytical procedures and constitutive relations were used in each of the analyses, and the material behavior data presented in the Appendix were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses for the types of problems discussed, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)

  12. The CEC benchmark interclay on rheological models for clays results of pilot phase (January-June 1989) about the boom clay at Mol (B)

    International Nuclear Information System (INIS)

    Come, B.

    1990-01-01

    A pilot phase of a benchmark exercise for rheological models for Boom clay, called INTERCLAY, was launched by the CEC in January 1989. The purpose of the benchmark is to compare predictions of calculations made about well-defined rock-mechanical problems, similar to real cases at the Mol facilities, using existing data from laboratory tests on samples. Basically, two approaches were to be compared: one considering clay as an elasto-visco-plastic medium (rock-mechanics approach), and one isolating the role of pore-pressure dissipation (soil-mechanics approach)

  13. TRACE/PARCS analysis of the OECD/NEA Oskarshamn-2 BWR stability benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Kozlowski, T. [Univ. of Illinois, Urbana-Champaign, IL (United States); Downar, T.; Xu, Y.; Wysocki, A. [Univ. of Michigan, Ann Arbor, MI (United States); Ivanov, K.; Magedanz, J.; Hardgrove, M. [Pennsylvania State Univ., Univ. Park, PA (United States); March-Leuba, J. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hudson, N.; Woodyatt, D. [Nuclear Regulatory Commission, Rockville, MD (United States)

    2012-07-01

    On February 25, 1999, the Oskarshamn-2 NPP experienced a stability event which culminated in diverging power oscillations with a decay ratio of about 1.4. The event was successfully modeled by the TRACE/PARCS coupled code system, and further analysis of the event is described in this paper. The results show very good agreement with the plant data, capturing the entire behavior of the transient including the onset of instability, growth of the oscillations (decay ratio) and oscillation frequency. This provides confidence in the prediction of other parameters which are not available from the plant records. The event provides coupled code validation for a challenging BWR stability event, which involves the accurate simulation of neutron kinetics (NK), thermal-hydraulics (TH), and TH/NK coupling. The success of this work has demonstrated the ability of the 3-D coupled systems code TRACE/PARCS to capture the complex behavior of BWR stability events. The problem was released as an international OECD/NEA benchmark, and it is the first benchmark based on measured plant data for a stability event with a DR greater than one. Interested participants are invited to contact the authors for more information. (authors)

  14. Benchmarking the MCNP code for Monte Carlo modelling of an in vivo neutron activation analysis system.

    Science.gov (United States)

    Natto, S A; Lewis, D G; Ryde, S J

    1998-01-01

    The Monte Carlo computer code MCNP (version 4A) has been used to develop a personal computer-based model of the Swansea in vivo neutron activation analysis (IVNAA) system. The model included specification of the neutron source (252Cf), collimators, reflectors and shielding. The MCNP model was 'benchmarked' against fast neutron and thermal neutron fluence data obtained experimentally from the IVNAA system. The Swansea system allows two irradiation geometries using 'short' and 'long' collimators, which provide alternative dose rates for IVNAA. The data presented here relate to the short collimator, although results of similar accuracy were obtained using the long collimator. The fast neutron fluence was measured in air at a series of depths inside the collimator. The measurements agreed with the MCNP simulation within the statistical uncertainty (5-10%) of the calculations. The thermal neutron fluence was measured and calculated inside the cuboidal water phantom. The depth of maximum thermal fluence was 3.2 cm (measured) and 3.0 cm (calculated). The width of the 50% thermal fluence level across the phantom at its mid-depth was found to be the same by both MCNP and experiment. This benchmarking exercise has given us a high degree of confidence in MCNP as a tool for the design of IVNAA systems.

  15. Constructing Benchmark Databases and Protocols for Medical Image Analysis: Diabetic Retinopathy

    Directory of Open Access Journals (Sweden)

    Tomi Kauppi

    2013-01-01

    We address the performance evaluation practices for developing medical image analysis methods, in particular, how to establish and share databases of medical images with verified ground truth and solid evaluation protocols. Such databases support the development of better algorithms, execution of profound method comparisons, and, consequently, technology transfer from research laboratories to clinical practice. For this purpose, we propose a framework consisting of reusable methods and tools for the laborious task of constructing a benchmark database. We provide a software tool for medical image annotation, helping to collect class labels, spatial spans, and expert confidence on lesions, and a method to appropriately combine the manual segmentations from multiple experts. The tool and all necessary functionality for method evaluation are provided as public software packages. As a case study, we utilized the framework and tools to establish the DiaRetDB1 V2.1 database for benchmarking diabetic retinopathy detection algorithms. The database contains a set of retinal images, ground truth based on information from multiple experts, and a baseline algorithm for the detection of retinopathy lesions.
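
    Combining manual segmentations from multiple experts can be done in many ways, and the paper proposes its own method. As a hedged, minimal stand-in, a per-pixel majority vote over binary lesion masks illustrates the basic idea; the function name and defaults below are illustrative assumptions, not the DiaRetDB1 procedure.

    ```python
    import numpy as np

    def fuse_expert_masks(masks, min_votes=None):
        """Combine binary lesion masks from several experts by majority vote.

        masks: array of shape (n_experts, H, W) with values in {0, 1}.
        A pixel is kept if at least `min_votes` experts marked it
        (default: strict majority). Expert confidences could be used as
        weights instead of equal votes.
        """
        masks = np.asarray(masks)
        if min_votes is None:
            min_votes = masks.shape[0] // 2 + 1
        return (masks.sum(axis=0) >= min_votes).astype(np.uint8)

    # Three 4x4 toy masks; a pixel survives if 2 of 3 experts marked it
    rng = np.random.default_rng(3)
    masks = (rng.random((3, 4, 4)) > 0.5).astype(np.uint8)
    print(fuse_expert_masks(masks))
    ```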

  16. Comparative analysis of exercise 2 results of the OECD WWER-1000 MSLB benchmark

    International Nuclear Information System (INIS)

    Kolev, N.; Petrov, N.; Royer, E.; Ivanov, B.; Ivanov, K.

    2006-01-01

    In the framework of a joint effort between OECD/NEA, US DOE and CEA France, a coupled three-dimensional (3D) thermal-hydraulic/neutron kinetics benchmark for WWER-1000 was defined. Phase 2 of this benchmark is labeled W1000CT-2 and consists of the calculation of a vessel mixing experiment and main steam line break (MSLB) transients. The reference plant is Kozloduy-6 in Bulgaria. Plant data are available for code validation, consisting of one experiment of pump start-up (W1000CT-1) and one experiment of steam generator isolation (W1000CT-2). The validated codes can be used to calculate asymmetric MSLB transients involving similar mixing patterns. This paper summarizes a comparison of the available results for W1000CT-2 Exercise 2, devoted to core-vessel calculation with imposed MSLB vessel boundary conditions. Because of the recent re-calculation of the cross-section libraries, only core physics results from the PARCS and CRONOS codes could be compared. The comparison is code-to-code (including the BIPR7A/TVS-M library) and code vs. plant measured data in a steady state close to the MSLB initial state. The results provide a test of the cross-section libraries and show a good agreement of plant measured and computed data. The comparison of full vessel calculations was made from the point of view of vessel mixing, considering mainly the coarse-mesh features of the flow. The FZR and INRNE results from multi-1D calculations with different mixing models are similar, while the FZK calculations with a coarse-3D vessel model show deviations from the others. These deviations seem to be due to an error in the use of a boundary condition after flow reversal (Authors)

  17. OECD/DOE/CEA VVER-1000 coolant transient (V1000CT) benchmark for assessing coupled neutronics/thermal-hydraulics system codes for VVER-1000 RIA analysis

    International Nuclear Information System (INIS)

    Ivanov, B.; Ivanov, K.; Aniel, S.; Royer, E.; Kolev, N.; Groudev, P.

    2004-01-01

    The present paper describes the two phases of the OECD/DOE/CEA VVER-1000 coolant transient benchmark labeled V1000CT. This benchmark is based on data from the Bulgarian Kozloduy NPP Unit 6. The first phase of the benchmark was designed for the purpose of assessing neutron kinetics and thermal-hydraulic modeling for a VVER-1000 reactor, and specifically for their use in analyzing reactivity transients in a VVER-1000 reactor. Most of the results of Phase 1 will be compared against experimental data and the rest of the results will be used for code-to-code comparison. The second phase of the benchmark is planned for evaluation and improvement of the mixing computational models. Code-to-code and code-to-data comparisons will be done based on data of a mixing experiment conducted at Kozloduy-6. Main steam line break will also be analyzed in the second phase of the V1000CT benchmark. The results from it will be used for code-to-code comparison. The benchmark team has been involved in analyzing different aspects and performing sensitivity studies of the different benchmark exercises. The paper presents a comparison of selected results, obtained with two different system thermal-hydraulics codes, with the plant data for Exercise 1 of Phase 1 of the benchmark, as well as some results for Exercises 2 and 3. Overall, this benchmark has been well accepted internationally, with many organizations representing 11 countries participating in the first phase of the benchmark. (authors)

  18. Energy use pattern and benchmarking of selected greenhouses in Iran using data envelopment analysis

    International Nuclear Information System (INIS)

    Omid, M.; Ghojabeige, F.; Delshad, M.; Ahmadi, H.

    2011-01-01

    This paper studies the degree of technical efficiency (TE) and scale efficiency (SE) of selected greenhouses in Iran and describes the process of benchmarking energy inputs and cucumber yield. Inquiries on 18 greenhouses were conducted through face-to-face interviews during the September-December 2008 period. A non-parametric data envelopment analysis (DEA) technique was applied to investigate the degree of TE and SE of producers, and to evaluate and rank the productivity performance of cucumber producers based on eight energy inputs: human labour, diesel, machinery, fertilizers, chemicals, water for irrigation, seeds and electricity, and the output yield values of cucumber. DEA optimizes the performance measure of each greenhouse or decision making unit (DMU). Specifically, DEA was used to compare the performance of each DMU in regions of increasing, constant or decreasing returns to scale in multiple-input situations. The CRS model helped us to decompose the pure TE into overall TE and SE components, thereby allowing investigation of the scale effects. The results of the analysis showed that DEA is an effective tool for analyzing and benchmarking the productive efficiency of greenhouses. The VRS analysis showed that only 12 out of the 18 DMUs were efficient. The TE of the inefficient DMUs, on average, was calculated as 91.5%. This implies that the same level of output could be produced with 91.5% of the resources if these units were performing on the frontier. Another interpretation of this result is that 8.5% of overall resources could be saved by raising the performance of these DMUs to the highest level.

  19. Energy use pattern and benchmarking of selected greenhouses in Iran using data envelopment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Omid, M.; Ghojabeige, F.; Ahmadi, H. [Department of Agricultural Machinery, College of Agriculture and Natural Resources, University of Tehran, Karaj (Iran, Islamic Republic of); Delshad, M. [Department of Horticultural Sciences, College of Agriculture and Natural Resources, University of Tehran, Karaj (Iran, Islamic Republic of)

    2011-01-15

    This paper studies the degree of technical efficiency (TE) and scale efficiency (SE) of selected greenhouses in Iran and describes the process of benchmarking energy inputs and cucumber yield. Inquiries on 18 greenhouses were conducted through face-to-face interviews during the September-December 2008 period. A non-parametric data envelopment analysis (DEA) technique was applied to investigate the degree of TE and SE of producers, and to evaluate and rank the productivity performance of cucumber producers based on eight energy inputs: human labour, diesel, machinery, fertilizers, chemicals, water for irrigation, seeds and electricity, and the output yield values of cucumber. DEA optimizes the performance measure of each greenhouse or decision making unit (DMU). Specifically, DEA was used to compare the performance of each DMU in regions of increasing, constant or decreasing returns to scale in multiple-input situations. The CRS model helped us to decompose the pure TE into overall TE and SE components, thereby allowing investigation of the scale effects. The results of the analysis showed that DEA is an effective tool for analyzing and benchmarking the productive efficiency of greenhouses. The VRS analysis showed that only 12 out of the 18 DMUs were efficient. The TE of the inefficient DMUs, on average, was calculated as 91.5%. This implies that the same level of output could be produced with 91.5% of the resources if these units were performing on the frontier. Another interpretation of this result is that 8.5% of overall resources could be saved by raising the performance of these DMUs to the highest level. (author)

  20. Development and benchmark analysis of the hybrid evaluated nuclear data library HENDL1.0

    International Nuclear Information System (INIS)

    Xu Dezheng; Wu Yican; Gao Chunjing; Zheng Shanliang; Li Jingjing; Zhu Xiaoxiang; Liu Haibo

    2004-01-01

    To meet the requirements of fusion-fission sub-critical hybrid reactor design and other related studies, the evaluated nuclear data library HENDL1.0/E has been compiled based on several main national evaluated data libraries. The relevant working libraries, including the transport sub-libraries HENDL1.0/MG in groupwise form and HENDL1.0/MC in pointwise form, the burnup sub-library HENDL1.0/BU and the response function sub-library HENDL1.0/RF, are generated using the nuclear data processing codes NJOY97 and TRANSX2. Simulation calculations and comparative analyses were carried out against a series of existing benchmark experiments with popular neutron transport codes, in order to validate the correctness and availability of HENDL1.0. (authors)

  1. An international pooled analysis for obtaining a benchmark dose for environmental lead exposure in children

    DEFF Research Database (Denmark)

    Budtz-Jørgensen, Esben; Bellinger, David; Lanphear, Bruce

    2013-01-01

    Lead is a recognized neurotoxicant, but estimating effects at the lowest measurable levels is difficult. An international pooled analysis of data from seven cohort studies reported an inverse and supra-linear relationship between blood lead concentrations and IQ scores in children. The lack... of a clear threshold presents a challenge to the identification of an acceptable level of exposure. The benchmark dose (BMD) is defined as the dose that leads to a specific known loss. As an alternative to elusive thresholds, the BMD is being used increasingly by regulatory authorities. Using the pooled data... yielding lower confidence limits (BMDLs) of about 0.1-1.0 μg/dL for the dose leading to a loss of one IQ point. We conclude that current allowable blood lead concentrations need to be lowered and further prevention efforts are needed to protect children from lead toxicity...
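
    The BMD/BMDL machinery can be made concrete with a toy calculation. Assuming, purely for illustration, a log-linear dose-response IQ(d) = IQ0 - beta*ln(d + 1) of the kind used in parts of the pooled-analysis literature, the dose producing a one-point IQ loss and its lower confidence limit follow directly; the numbers below are hypothetical, not the study's estimates.

    ```python
    import numpy as np

    def bmd_log_linear(beta, delta=1.0):
        """Benchmark dose for a log-linear decline IQ(d) = IQ0 - beta*ln(d + 1).

        Returns the dose at which the expected IQ loss equals `delta` points:
        BMD = exp(delta / beta) - 1.
        """
        return np.exp(delta / beta) - 1.0

    def bmdl(beta_hat, se_beta, delta=1.0, z=1.645):
        """One-sided 95% lower confidence limit on the BMD: a steeper
        (upper-bound) slope gives a smaller, more protective dose."""
        beta_upper = beta_hat + z * se_beta
        return bmd_log_linear(beta_upper, delta)

    # Hypothetical fit: a 2.7-point IQ drop per unit ln(blood lead + 1)
    print(bmd_log_linear(2.7), bmdl(2.7, 0.8))   # doses in ug/dL
    ```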

  2. Benchmark analysis of three main circulation pump sequential trip event at Ignalina NPP

    International Nuclear Information System (INIS)

    Uspuras, E.; Kaliatka, A.; Urbonas, R.

    2001-01-01

    The Ignalina Nuclear Power Plant is a twin-unit plant with two RBMK-1500 reactors. The primary circuit consists of two symmetrical loops. Eight Main Circulation Pumps (MCPs) at the Ignalina NPP are employed for forced circulation of coolant water through the reactor core. The MCPs are joined in groups of four pumps for each loop (three for normal operation and one on standby). This paper presents the benchmark analysis of a three main circulation pump sequential trip event at RBMK-1500 using the RELAP5 code. During this event all three MCPs in one circulation loop at Unit 2 of the Ignalina NPP were tripped one after another because of inadvertent activation of the fire protection system. The comparison of calculated and measured parameters allowed us to establish realistic thermal-hydraulic characteristics of different main circulation circuit components and to verify the model of the drum separator pressure and water level controllers. (author)

  3. Stress analysis of R2 pressure vessel. Structural reliability benchmark exercise

    International Nuclear Information System (INIS)

    Vestergaard, N.

    1987-05-01

    The Structural Reliability Benchmark Exercise (SRBE) is sponsored by the EEC as part of the Reactor Safety Programme. The objectives of the SRBE are to evaluate and improve 1) inspection procedures, which use non-destructive methods to locate defects in pressure (reactor) vessels, as well as 2) analytical damage accumulation models, which predict the time to failure of vessels containing defects. In order to focus attention, an experimental pressure vessel has been inspected, subjected to fatigue loadings and subsequently analysed by several teams using methods of their choice. The present report contains the first part of the analytical damage accumulation analysis. The stress distributions in the welds of the experimental pressure vessel were determined. These stress distributions will be used to determine the driving forces of the damage accumulation models, which will be addressed in a future report. (author)

  4. The lead cooled fast reactor benchmark Brest-300: analysis with sensitivity method

    International Nuclear Information System (INIS)

    Smirnov, V.; Orlov, V.; Mourogov, A.; Lecarpentier, D.; Ivanova, T.

    2005-01-01

    The lead cooled fast neutron reactor is one of the most interesting candidates for the development of atomic energy. BREST-300 is a 300 MWe lead cooled fast reactor developed by NIKIET (Russia) with a deterministic safety approach which aims to exclude reactivity margins greater than the delayed neutron fraction. The development of innovative reactors (lead coolant, nitride fuel...) and fuel cycles with new constraints such as cycle closure or actinide burning requires new technologies and new nuclear data. In this connection, the tools and neutron data used for the calculational analysis of reactor characteristics require thorough validation. NIKIET developed a reactor benchmark fitting design-type calculational tools (including neutron data). In the frame of technical exchanges between NIKIET and EDF (France), results of this benchmark calculation concerning the principal parameters of fuel evolution and safety parameters have been inter-compared, in order to estimate the uncertainties and validate the codes for calculations of this new kind of reactor. Different codes and cross-section data have been used, and sensitivity studies have been performed to understand and quantify the sources of uncertainty. The comparison of results shows that the difference in k_eff between the ERANOS code with the ERALIB1 library and the reference is of the same order of magnitude as the delayed neutron fraction. On the other hand, the discrepancy is more than twice as large if the JEF2.2 library is used with ERANOS. Analysis of discrepancies in the calculation results reveals that the main effect comes from differences in nuclear data, namely the 238U and 239Pu fission and capture cross sections and the lead inelastic cross sections

  5. Use of Sensitivity and Uncertainty Analysis to Select Benchmark Experiments for the Validation of Computer Codes and Data

    International Nuclear Information System (INIS)

    Elam, K.R.; Rearden, B.T.

    2003-01-01

    Sensitivity and uncertainty analysis methodologies under development at Oak Ridge National Laboratory were applied to determine whether existing benchmark experiments adequately cover the area of applicability for the criticality code and data validation of PuO2 and mixed-oxide (MOX) powder systems. The study examined three PuO2 powder systems and four MOX powder systems that would be useful for establishing mass limits for a MOX fuel fabrication facility. Using traditional methods to choose experiments for criticality analysis validation, 46 benchmark critical experiments were identified as applicable to the PuO2 powder systems. However, only 14 experiments were thought to be within the area of applicability for dry MOX powder systems. The applicability of 318 benchmark critical experiments, including the 60 experiments initially identified, was assessed. Each benchmark and powder system was analyzed using the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) one-dimensional (TSUNAMI-1D) or three-dimensional (TSUNAMI-3D) sensitivity analysis sequences, which will be included in the next release of the SCALE code system. These sensitivity data and cross-section uncertainty data were then processed with TSUNAMI-IP to determine the correlation of each application to each experiment in the benchmarking set. Correlation coefficients are used to assess the similarity between systems and to determine the applicability of one system for the code and data validation of another. The applicability of most of the experiments identified using traditional methods was confirmed by the TSUNAMI analysis. In addition, some PuO2 and MOX powder systems were determined to be within the area of applicability of several other benchmarks that would not have been considered using traditional methods. Therefore, the number of benchmark experiments useful for the validation of these systems exceeds the number previously expected. The TSUNAMI analysis
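
    The correlation coefficient mentioned above, often written c_k, is in the usual TSUNAMI-IP formulation an uncertainty-weighted cosine between two k_eff sensitivity vectors. A minimal sketch of that definition, with the cross-section covariance matrix and the two sensitivity vectors as inputs (the data layout is an assumption of this sketch):

    ```python
    import numpy as np

    def ck_similarity(s_app, s_exp, cov):
        """Correlation coefficient c_k between an application and a benchmark.

        s_app, s_exp: k_eff sensitivity vectors, flattened over
        nuclide-reaction-energy-group entries; cov: the corresponding
        cross-section covariance matrix. c_k near 1 means the benchmark
        exercises the same nuclear data uncertainties as the application.
        """
        num = s_app @ cov @ s_exp
        den = np.sqrt((s_app @ cov @ s_app) * (s_exp @ cov @ s_exp))
        return num / den
    ```

    In practice, c_k values above roughly 0.8-0.9 are commonly taken as a screening threshold for judging an experiment applicable; the exact cut-off is a validation policy choice.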

  6. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    The term benchmarking is encountered in the implementation of total quality management (TQM), termed holistic quality management in Indonesian, because benchmarking is a tool for looking for ideas or learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes, in order to obtain information that can help the organization improve its performance.

  7. Quo Vadis Benchmark Simulation Models? 8th IWA Symposium on Systems Analysis and Integrated Assessment

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J.; Batstone, D.

    2011-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for WWTPs is coming towards an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights...

  8. Analysis of the international criticality benchmark no 19 of a realistic fuel dissolver

    International Nuclear Information System (INIS)

    Smith, H.J.; Santamarina, A.

    1991-01-01

    The dispersion of the order of 12000 pcm in the results of the international criticality fuel dissolver benchmark calculation, exercise OECD/19, showed the necessity of analysing the calculational methods used in this case. The APOLLO/PIC method developed to treat this type of problem permits us to propose international reference values. The problem studied here led us to investigate two supplementary parameters in addition to the double heterogeneity of the fuel: the reactivity variation as a function of moderation and the effects of the size of the fuel pellets during dissolution. The following conclusions were obtained: The fast cross-section sets used by the international SCALE package introduce a bias of -3000 pcm in undermoderated lattices. More generally, the fast and resonance nuclear data in criticality codes are not sufficiently reliable. Geometries with micro-pellets led to an underestimation of reactivity at the end of dissolution of 3000 pcm in certain 1988 Sn calculations; this bias was avoided in the updated 1990 computation because of a correct use of calculation tools. The reactivity introduced by the dissolved fuel is underestimated by 3000 pcm in contributions based on the standard NITAWL module in the SCALE code. More generally, the neutron balance analysis pointed out that the standard ND self-shielding formalism cannot account for 238U resonance mutual self-shielding in the pellet-fissile liquor interaction. The combination of these three types of bias explains the underestimation, in all of the international contributions, of the reactivity of dissolver lattices by -2000 to -6000 pcm. The improved 1990 calculations confirm the need to use rigorous methods in the calculation of systems which involve the fuel double heterogeneity. This study points out the importance of periodic benchmarking exercises for probing the efficacy of criticality codes, data libraries and the users

  9. Calculation of the Thermal Radiation Benchmark Problems for a CANDU Fuel Channel Analysis Using the CFX-10 Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook

    2006-07-15

    To justify the use of a commercial Computational Fluid Dynamics (CFD) code for a CANDU fuel channel analysis, especially for radiation heat transfer dominant conditions, the CFX-10 code is tested against three benchmark problems which were used for the validation of radiation heat transfer in the CANDU analysis code CATHENA. These three benchmark problems are representative of CANDU fuel channel configurations, from a simple geometry to the whole fuel channel geometry. With the assumptions of a non-participating medium completely enclosed by diffuse, gray and opaque surfaces, the solutions of the benchmark problems are obtained by the concept of surface resistance to radiation, accounting for the view factors and the emissivities. The view factors are calculated by the program MATRIX version 1.0, avoiding the difficulty of hand calculation for the complex geometries. For the solutions of the benchmark problems, the temperature or the net radiation heat flux boundary conditions are prescribed for each radiating surface to determine the radiation heat transfer rate or the surface temperature, respectively, by using the network method. The Discrete Transfer Model (DTM) is used for the CFX-10 radiation model and its calculation results are compared with the solutions of the benchmark problems. The CFX-10 results for the three benchmark problems are in close agreement with these solutions, so it is concluded that CFX-10 with a DTM radiation model can be applied to the CANDU fuel channel analysis where surface radiation heat transfer is a dominant mode of the heat transfer.

  10. Calculation of the Thermal Radiation Benchmark Problems for a CANDU Fuel Channel Analysis Using the CFX-10 Code

    International Nuclear Information System (INIS)

    Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook

    2006-07-01

    To justify the use of a commercial Computational Fluid Dynamics (CFD) code for a CANDU fuel channel analysis, especially for radiation heat transfer dominant conditions, the CFX-10 code is tested against three benchmark problems which were used for the validation of radiation heat transfer in the CANDU analysis code CATHENA. These three benchmark problems are representative of CANDU fuel channel configurations, from a simple geometry to the whole fuel channel geometry. With the assumptions of a non-participating medium completely enclosed by diffuse, gray and opaque surfaces, the solutions of the benchmark problems are obtained by the concept of surface resistance to radiation, accounting for the view factors and the emissivities. The view factors are calculated by the program MATRIX version 1.0, avoiding the difficulty of hand calculation for the complex geometries. For the solutions of the benchmark problems, the temperature or the net radiation heat flux boundary conditions are prescribed for each radiating surface to determine the radiation heat transfer rate or the surface temperature, respectively, by using the network method. The Discrete Transfer Model (DTM) is used for the CFX-10 radiation model and its calculation results are compared with the solutions of the benchmark problems. The CFX-10 results for the three benchmark problems are in close agreement with these solutions, so it is concluded that CFX-10 with a DTM radiation model can be applied to the CANDU fuel channel analysis where surface radiation heat transfer is a dominant mode of the heat transfer
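
    The surface-resistance network method referred to in the two records above reduces, for surfaces with prescribed temperatures, to a single linear solve for the radiosities. A self-contained sketch of that generic gray-enclosure algebra follows; it is not the CATHENA or MATRIX implementation, and the two-cylinder numbers are hypothetical.

    ```python
    import numpy as np

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def enclosure_radiation(F, eps, T, A):
        """Net radiative heat rates in a gray, diffuse, opaque enclosure.

        F: (n, n) view-factor matrix (rows must sum to 1), eps: emissivities,
        T: surface temperatures (K), A: surface areas (m^2). Solves the
        radiosity equations  J = eps*Eb + (1 - eps) * F @ J  and converts the
        surface resistance (1 - eps)/(eps*A) into net heat rates (W).
        Assumes all emissivities are strictly below 1.
        """
        eps, T, A = map(np.asarray, (eps, T, A))
        eb = SIGMA * T**4                       # blackbody emissive power
        n = len(T)
        J = np.linalg.solve(np.eye(n) - (1.0 - eps)[:, None] * F, eps * eb)
        return A * eps / (1.0 - eps) * (eb - J)

    # Two long concentric cylinders as a sanity check (hypothetical numbers);
    # the two heat rates should balance to zero for a closed enclosure.
    F = np.array([[0.0, 1.0], [0.5, 0.5]])
    print(enclosure_radiation(F, [0.8, 0.6], [1000.0, 600.0], [1.0, 2.0]))
    ```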

  11. WWER-1000 Burnup Credit Benchmark (CB5)

    International Nuclear Information System (INIS)

    Manolova, M.A.

    2002-01-01

    In the paper the specification of the WWER-1000 Burnup Credit Benchmark first phase (depletion calculations) is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be given after the evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field (Author)

  12. Analysis of the OECD/NRC BWR Turbine Trip Transient Benchmark with the Coupled Thermal-Hydraulics and Neutronics Code TRAC-M/PARCS

    International Nuclear Information System (INIS)

    Lee, Deokjung; Downar, Thomas J.; Ulses, Anthony; Akdeniz, Bedirhan; Ivanov, Kostadin N.

    2004-01-01

    An analysis of the Peach Bottom Unit 2 Turbine Trip 2 (TT2) experiment has been performed using the U.S. Nuclear Regulatory Commission coupled thermal-hydraulics and neutronics code TRAC-M/PARCS. The objective of the analysis was to assess the performance of TRAC-M/PARCS on a BWR transient in which two-phase flow and spatial variations of the neutron flux are significant. TRAC-M/PARCS results are found to be in good agreement with measured plant data for both the steady-state and transient phases of the benchmark. Additional analyses of four fictitious extreme scenarios are performed to provide a basis for code-to-code comparisons and comprehensive testing of the thermal-hydraulics/neutronics coupling. The results of sensitivity studies on the effect of direct moderator heating on the transient simulation indicate the importance of this modeling aspect

  13. Burn-up Credit Criticality Safety Benchmark Phase III-C. Nuclide Composition and Neutron Multiplication Factor of a Boiling Water Reactor Spent Fuel Assembly for Burn-up Credit and Criticality Control of Damaged Nuclear Fuel

    International Nuclear Information System (INIS)

    Suyama, K.; Uchida, Y.; Kashima, T.; Ito, T.; Miyaji, T.

    2016-01-01

    similar to those in the previous Phase III-B benchmark. A constant specific power of 25.3 MW/tHM is assumed for a final burn-up value of 50 GWd/tHM. Three cases of cooling time are requested after the burn-up: 0, 5 and 15 years. A constant void fraction of 0, 40 or 70% during the burn-up is assumed. The present benchmark is a compilation of 35 calculation results from 16 institutes in 9 countries covering different cross-section libraries. The total number of calculation results is twice that of the previous Phase III-B benchmark. Concerning nuclide densities, the 2-sigma (r) of 235U is less than 6% and those of 239,240,241Pu are less than 7%. For minor actinides, 2-sigma (r) becomes larger than 10% because of differences in the cross-section data adopted by each calculation code. For fission product isotopes, 2-sigma (r) is less than 7%, except for some nuclides. Generally, the mutual agreement of nuclide densities has improved from the previous benchmark. For the neutron multiplication factors, 2-sigma (r) is less than 1.1% for lower burn-up; it becomes about 1.6% at 10 GWd/t and gets smaller at 30 and 50 GWd/t. This might be a sufficient agreement considering that the nuclides adopted for the criticality calculation differ among the diverse methodologies used. Comparison of peak k_inf shows that it has approximately 2-sigma (r) of 1%, becoming larger for higher void fraction cases. Comparison of the burn-up distribution results is not the main purpose of this benchmark, but was requested to confirm the credibility of the calculation. A generally good agreement of the burn-up distribution is shown. However, how gadolinium depletion is handled may still pose an issue to solve, and some uncertainty depending on the analysis code used still remains. Using this benchmark, progress of the burn-up calculation capability is confirmed. Introduction of continuous-energy Monte Carlo codes has a clear advantage in treating multi-dimensional burn-up calculation problems, even though

  14. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional... in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.

  15. RUNE benchmarks

    DEFF Research Database (Denmark)

    Peña, Alfredo

    This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...

  16. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added... in order to obtain a unique selection...

  17. On the feasibility of using emergy analysis as a source of benchmarking criteria through data envelopment analysis: A case study for wind energy

    International Nuclear Information System (INIS)

    Iribarren, Diego; Vázquez-Rowe, Ian; Rugani, Benedetto; Benetto, Enrico

    2014-01-01

    The definition of criteria for the benchmarking of similar entities is often a critical issue in analytical studies because of the multiplicity of criteria susceptible to be taken into account. This issue can be aggravated by the need to handle data for multiple facilities. This article presents a methodological framework, named the Em + DEA method, which combines emergy analysis with Data Envelopment Analysis (DEA) for the ecocentric benchmarking of multiple resembling entities (i.e., multiple decision making units or DMUs). Provided that the life-cycle inventories of these DMUs are available, an emergy analysis is performed through the computation of seven different indicators, which refer to the use of fossil, metal, mineral, nuclear, renewable energy, water and land resources. These independent emergy values are then implemented as inputs for DEA computation, thus providing operational emergy-based efficiency scores and, for the inefficient DMUs, target emergy flows (i.e., feasible emergy benchmarks that would turn inefficient DMUs into efficient ones). The use of the Em + DEA method is exemplified through a case study of wind energy farms. The potential use of CED (cumulative energy demand) and CExD (cumulative exergy demand) indicators as alternative benchmarking criteria to emergy is discussed. The combined use of emergy analysis with DEA is proven to be a valid methodological approach to provide benchmarks oriented towards the optimisation of the life-cycle performance of a set of multiple similar facilities, not being limited to the operational traits of the assessed units. - Highlights: • Combined emergy and DEA method to benchmark multiple resembling entities. • Life-cycle inventory, emergy analysis and DEA as key steps of the Em + DEA method. • Valid ecocentric benchmarking approach proven through a case study of wind farms. • Comparison with life-cycle energy-based benchmarking criteria (CED/CExD + DEA). • Analysts and decision and policy

  18. Coupled code analysis of uncertainty and sensitivity of Kalinin-3 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, Ihor; Zwermann, Winfried; Velkov, Kiril [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany); Nikonov, Sergey [VNIIAES, Moscow (Russian Federation)

    2016-09-15

    An uncertainty and sensitivity analysis is performed for the OECD/NEA coolant transient benchmark (K-3) on measured data at the Kalinin-3 Nuclear Power Plant (NPP). A switch-off of one main coolant pump (MCP) at nominal reactor power is calculated using the coupled thermal-hydraulic and neutron-kinetic code ATHLET-PARCS. The objectives are to study the uncertainty of total reactor power and to identify the main sources of reactor power uncertainty. The GRS uncertainty and sensitivity software package XSUSA is applied to propagate uncertainties in nuclear data libraries to the full core coupled transient calculations. A set of the most important thermal-hydraulic parameters of the primary circuit is identified, and a total of 23 thermal-hydraulic parameters are statistically varied using the GRS code SUSA. The ATHLET model also contains a balance-of-plant (BOP) model which is simulated using the ATHLET GCSM module. In particular, the operation of the main steam generator regulators is modelled in detail. A set of 200 varied coupled ATHLET-PARCS calculations is analyzed. The results obtained show a clustering effect in the behavior of global reactor parameters. It is found that the GCSM system together with the varied input parameters strongly influences the overall nuclear power plant behavior and can even lead to a new scenario. Possible reasons for the clustering effect are discussed in the paper. This work is a step forward in establishing a 'best-estimate calculations in combination with performing uncertainty analysis' methodology for coupled full core calculations.

  19. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1 when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty while increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
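
    As a reminder of the mechanics: SRCs come from a linear regression fitted to the Monte Carlo sample, with each coefficient rescaled by the input and output standard deviations, so that SRC_i^2 approximates factor i's share of the output variance when the model is near-linear (high R^2). A minimal, self-contained sketch with hypothetical toy data, not the BSM1 model itself:

    ```python
    import numpy as np

    def src(X, y):
        """Standardized Regression Coefficients from a Monte Carlo sample.

        X: (n_runs, n_params) sampled inputs, y: (n_runs,) scalar model
        output (e.g. the EQI). Fits a linear model and scales each
        coefficient by std(x_i)/std(y).
        """
        Xc = np.c_[np.ones(len(y)), X]              # intercept + inputs
        beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        return beta[1:] * X.std(axis=0, ddof=1) / y.std(ddof=1)

    # Toy check: y depends strongly on the first of three factors
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 3))
    y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.2, size=500)
    print(np.round(src(X, y), 2))   # approximately [0.98, 0.16, 0.00]
    ```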

  20. An analysis of the CSNI/GREST core concrete interaction chemical thermodynamic benchmark exercise using the MPEC2 computer code

    International Nuclear Information System (INIS)

    Muramatsu, Ken; Kondo, Yasuhiko; Uchida, Masaaki; Soda, Kunihisa

    1989-01-01

    Fission product (FP) release during a core concrete interaction (CCI) is an important factor in the uncertainty associated with source term estimation for an LWR severe accident. An analysis was made of the CCI Chemical Thermodynamic Benchmark Exercise organized by the OECD/NEA/CSNI Group of Experts on Source Terms (GREST) for investigating the uncertainty in thermodynamic modeling for CCI. The benchmark exercise was to calculate the equilibrium FP vapor pressure for a given system of temperature, pressure, and debris composition. The benchmark consisted of two parts, A and B. Part A was a simplified problem intended to test the numerical techniques. In part B, the participants were requested to use their own best estimate thermodynamic data base to examine the variability of the results due to the difference in thermodynamic data base. JAERI participated in this benchmark exercise with use of the MPEC2 code. The chemical thermodynamic data base needed for the analysis of Part B was taken from the VENESA code. This report describes the computer code used, inputs to the code, and results from the calculation by JAERI. The present calculation indicates that the FP vapor pressure depends strongly on temperature and oxygen potential in the core debris, and the pattern of dependency may be different for different FP elements. (author)

  1. Single pin BWR benchmark problem for coupled Monte Carlo - Thermal hydraulics analysis

    International Nuclear Information System (INIS)

    Ivanov, A.; Sanchez, V.; Hoogenboom, J. E.

    2012-01-01

    As part of the European NURISP research project, a single pin BWR benchmark problem was defined. The aim of this initiative is to test the coupling strategies between Monte Carlo and subchannel codes developed by different project participants. In this paper the results obtained by the Delft Univ. of Technology and Karlsruhe Inst. of Technology will be presented. The benchmark problem was simulated with the following coupled codes: TRIPOLI-SUBCHANFLOW, MCNP-FLICA, MCNP-SUBCHANFLOW, and KENO-SUBCHANFLOW. (authors)

  2. Single pin BWR benchmark problem for coupled Monte Carlo - Thermal hydraulics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ivanov, A.; Sanchez, V. [Karlsruhe Inst. of Technology, Inst. for Neutron Physics and Reactor Technology, Herman-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Hoogenboom, J. E. [Delft Univ. of Technology, Faculty of Applied Sciences, Mekelweg 15, 2629 JB Delft (Netherlands)

    2012-07-01

    As part of the European NURISP research project, a single pin BWR benchmark problem was defined. The aim of this initiative is to test the coupling strategies between Monte Carlo and subchannel codes developed by different project participants. In this paper the results obtained by the Delft Univ. of Technology and Karlsruhe Inst. of Technology will be presented. The benchmark problem was simulated with the following coupled codes: TRIPOLI-SUBCHANFLOW, MCNP-FLICA, MCNP-SUBCHANFLOW, and KENO-SUBCHANFLOW. (authors)

  3. VVER-1000 coolant transient benchmark. Phase 1 (V1000CT-1). Vol. 3: summary results of exercise 2 on coupled 3-D kinetics/core thermal-hydraulics

    International Nuclear Information System (INIS)

    2007-01-01

    In the field of coupled neutronics/thermal-hydraulics computation there is a need to enhance scientific knowledge in order to develop advanced modelling techniques for new nuclear technologies and concepts, as well as current applications. Recently developed best-estimate computer code systems for modelling 3-D coupled neutronics/thermal-hydraulics transients in nuclear cores and for the coupling of core phenomena and system dynamics need to be compared against each other and validated against results from experiments. International benchmark studies have been set up for this purpose. The present volume is a follow-up to the first two volumes. While the first described the specification of the benchmark, the second presented the results of the first exercise, which identified the key parameters and important issues concerning the thermal-hydraulic system modelling of the simulated transient caused by the switching on of a main coolant pump while the other three were in operation. Volume 3 summarises the results for Exercise 2 of the benchmark, which identifies the key parameters and important issues concerning the 3-D neutron kinetics modelling of the simulated transient. These studies are based on an experiment that was conducted by Bulgarian and Russian engineers during the plant-commissioning phase at the VVER-1000 Kozloduy Unit 6. The final volume will soon be published, completing Phase 1 of this study. (authors)

  4. Two-phase flow characteristics analysis code: MINCS

    International Nuclear Information System (INIS)

    Watanabe, Tadashi; Hirano, Masashi; Akimoto, Masayuki; Tanabe, Fumiya; Kohsaka, Atsuo.

    1992-03-01

    The two-phase flow characteristics analysis code MINCS (Modularized and INtegrated Code System) has been developed to provide a computational tool for analyzing two-phase flow phenomena in one-dimensional ducts. In MINCS, nine types of two-phase flow models, from a basic two-fluid nonequilibrium (2V2T) model to a simple homogeneous equilibrium (1V1T) model, can be used under the same numerical solution method. The numerical technique is based on an implicit finite difference method to enhance numerical stability. The code structure is highly modularized, so that new constitutive relations and correlations can easily be implemented into the code and evaluated. A flow pattern can be fixed regardless of flow conditions, and state equations or steam tables can be selected. It is therefore easy to calculate physical or numerical benchmark problems. (author)
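
    The stability benefit of implicit differencing can be shown with a sketch far simpler than the MINCS models: backward-Euler, first-order upwind advection of a void-fraction-like scalar. The scheme below remains stable even at CFL > 1, where the explicit counterpart diverges; it is an illustration of the numerical idea, not the MINCS scheme itself.

```python
import numpy as np

n, dx, dt, u = 50, 0.02, 0.01, 4.0             # CFL = u*dt/dx = 2.0
alpha = np.where(np.arange(n) < 10, 1.0, 0.0)  # initial step profile
c = u * dt / dx

# Implicit operator (I + c*D_upwind): diagonal 1+c, upwind neighbour -c.
A = np.eye(n) * (1.0 + c)
A[np.arange(1, n), np.arange(0, n - 1)] = -c
for _ in range(20):
    alpha = np.linalg.solve(A, alpha)          # one backward-Euler step
print(alpha.round(3))                          # profile advects without blow-up
```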

  5. Summary Report of Consultants' Meeting on Accuracy of Experimental and Theoretical Nuclear Cross-Section Data for Ion Beam Analysis and Benchmarking

    International Nuclear Information System (INIS)

    Abriola, Daniel; Dimitriou, Paraskevi; Gurbich, Alexander F.

    2013-11-01

    A summary is given of a Consultants' Meeting assembled to assess the accuracy of experimental and theoretical nuclear cross-section data for Ion Beam Analysis and the role of benchmarking experiments. The participants discussed the different approaches to assigning uncertainties to evaluated data, and presented results of benchmark experiments performed in their laboratories. They concluded that priority should be given to the validation of cross-section data by benchmark experiments, and recommended that an experts meeting be held to prepare the guidelines, methodology and work program of a future coordinated project on benchmarking.

  6. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in such studies: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset for which "a priori" references are available. The consensus-modulated standard deviation punc gives the best choice for the model performance evaluation when a conservative approach is adopted.
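
    The similarity-based source identification step can be sketched as a profile-matching problem: compare a factor's chemical profile against candidate source profiles and assign the best match. The profiles below are made-up illustrations, not entries from the public databases DeltaSA uses, and Pearson correlation is only one of several plausible similarity measures.

```python
import numpy as np

# Mass fractions over species [OC, EC, SO4, NO3, Fe, Cu] (illustrative values).
factor = np.array([0.40, 0.30, 0.05, 0.05, 0.15, 0.05])
sources = {
    "traffic":     np.array([0.35, 0.35, 0.02, 0.03, 0.18, 0.07]),
    "industry":    np.array([0.10, 0.05, 0.40, 0.10, 0.30, 0.05]),
    "sec_sulfate": np.array([0.05, 0.02, 0.80, 0.08, 0.03, 0.02]),
}

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

for name, prof in sources.items():
    print(f"{name:12s} r = {pearson(factor, prof):+.3f}")
best = max(sources, key=lambda s: pearson(factor, sources[s]))
print("assigned source:", best)
```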

  7. Preliminary uncertainty analysis of OECD/UAM benchmark for the TMI-1 reactor

    International Nuclear Information System (INIS)

    Cardoso, Fabiano S.; Faria, Rochkhudson B.; Silva, Lucas M.C.; Pereira, Claubia; Fortini, Angela

    2015-01-01

    Nowadays the demand from nuclear research centers for safety, regulation and best-estimate predictions provided with confidence bounds has been increasing. Studies have pointed out that the present uncertainties in nuclear data should be significantly reduced to get the full benefit from advanced modeling and simulation initiatives. The major outcome of the NEA/OECD (UAM) workshop, held in Italy in 2006, was the preparation of a benchmark work program with steps (exercises) needed to define the uncertainty and modeling tasks. In that direction, this work was performed within the framework of UAM Exercise 1 (I-1), 'Cell Physics', to validate the study and to estimate the accuracy of the model. The objectives of this study were to make a preliminary analysis of the criticality values of the TMI-1 PWR and of the bias between the multiplication factors obtained from two different nuclear codes. The range of the bias was obtained using the deterministic codes NEWT (New ESC-based Weighting Transport code), the two-dimensional transport module that uses AMPX-formatted cross-sections processed by other SCALE modules, and WIMSD5 (Winfrith Improved Multi-Group Scheme). The WIMSD5 system consists of a simplified geometric representation of heterogeneous space zones that are coupled with each other and with the boundaries, while the properties of each spacing element are obtained from the Carlson DSN method or the collision probability method. (author)
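
    A code-to-code bias of the kind quoted in such comparisons is usually expressed as a reactivity difference in pcm. A minimal sketch follows; the k values are illustrative placeholders, not the NEWT/WIMSD5 results of this study.

```python
def bias_pcm(k_ref, k_test):
    """Reactivity difference in pcm: (1/k_ref - 1/k_test) * 1e5."""
    return (1.0 / k_ref - 1.0 / k_test) * 1.0e5

print(bias_pcm(1.40250, 1.40105))  # about -74 pcm for these made-up values
```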

  8. Benchmark analysis of SPERT-IV reactor with Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Motalab, M.A.; Mahmood, M.S.; Khan, M.J.H.; Badrun, N.H.; Lyric, Z.I.; Altaf, M.H.

    2014-01-01

    Highlights: • MVP was used for SPERT-IV core modeling. • Neutronics analysis of the SPERT-IV reactor was performed. • Calculations were performed to estimate the critical rod height and excess reactivity. • Neutron flux, time-integrated neutron flux and Cd-ratio were also calculated. • Calculated values agree with experimental data. - Abstract: The benchmark experiment of the SPERT-IV D-12/25 reactor core has been analyzed with the Monte Carlo code MVP using cross-section libraries based on JENDL-3.3. The MVP simulation was performed for the clean and cold core. The estimated values of k_eff at the experimental critical rod height and of the core excess reactivity agreed within 5% with the experimental data. Thermal neutron flux profiles at different vertical and horizontal positions of the core were also estimated, as was the cadmium ratio at different points of the core. All estimated results have been compared with the experimental results, and generally good agreement has been found between the measured and calculated values.

  9. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Sachit Rajbhandari

    2017-11-01

    Full Text Available In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method, providing an alternative approach for image classification in the case study of landslides.

  10. A subchannel and CFD analysis of void distribution for the BWR fuel bundle test benchmark

    International Nuclear Information System (INIS)

    In, Wang-Kee; Hwang, Dae-Hyun; Jeong, Jae Jun

    2013-01-01

    Highlights: ► We analyzed subchannel void distributions using subchannel, system and CFD codes. ► The mean error and standard deviation at steady state were compared. ► The deviation of the CFD simulation was greater than those of the others. ► The large deviation of the CFD prediction is due to interface model uncertainties. -- Abstract: The subchannel-grade and microscopic void distributions in the NUPEC (Nuclear Power Engineering Corporation) BFBT (BWR Full-Size Fine-Mesh Bundle Tests) facility have been evaluated with the subchannel analysis code MATRA, the system code MARS and the CFD code CFX-10. Sixteen test series from five different test bundles were selected for the analysis of the steady-state subchannel void distributions. Four test cases for a high-burnup 8 × 8 fuel bundle with a single water rod were simulated using CFX-10 for the microscopic void distribution benchmark. Two transient cases, a turbine trip without bypass as a typical power transient and a re-circulation pump trip as a flow transient, were also chosen for this analysis. It was found that the steady-state void distributions calculated by both the MATRA and MARS codes coincided well with the measured data in the range of thermodynamic qualities from 5 to 25%. The results of the transient calculations were also similar to each other and very reasonable. The CFD simulation reproduced the overall radial void distribution trend, which produces less vapor in the central part of the bundle and more vapor at the periphery. However, the predicted variation of the void distribution inside the subchannels is small, while the measured one is large, showing a very high concentration in the center of the subchannels. The variations of the void distribution between the center of the subchannels and the subchannel gap are estimated to be about 5–10% for the CFD prediction and more than 20% for the experiment.

  11. IVO participation in IAEA benchmark for VVER-type nuclear power plants seismic analysis and testing

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1997-12-01

    This study is a part of the IAEA coordinated research program 'Benchmark Study for the Seismic Analysis and Testing of VVER Type NPPs'. The study reports the numerical simulation of the blast tests for the Paks and Kozloduy nuclear power plants, beginning from the recorded free-field response and computing the structural response at various points inside the reactor building. The full-scale blast tests of the Paks and Kozloduy NPPs took place in December 1994 and in July 1996. During the tests the plants operated normally. The instrumentation for the tests consisted of 52 recording channels with a 200 Hz sampling rate. The blast tests were carried out by detonating 100 kg charges in 50-meter-deep boreholes at a distance of 2.5 km from the plant. The 3D structural models of both reactor buildings were analyzed in the frequency domain. The number of modes extracted was about 500 in both cases, and the cut-off frequency was 25 Hz. In the response history run the responses of the selected points were evaluated. The input values for the response history run were the three components of the excitation, which were transformed from the time domain to the frequency domain with the aid of the Fourier transform. The analysis was carried out in the frequency domain and the responses were transferred back to the time domain with the inverse Fourier transform. The Paks and Kozloduy blast tests produced a wealth of information on the behavior of nuclear power plant structures excited by blast-type loads, which also contain a low-frequency wave train, albeit with small energy content. The comparison of measured and calculated results gave information about the suitability of the selected analysis approach for the investigated blast-type loading.
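
    The frequency-domain procedure described above (FFT of the excitation, multiplication by a transfer function, inverse FFT back to the time domain) can be sketched for a single mode. The single-degree-of-freedom oscillator, its 8 Hz frequency and 5% damping, and the synthetic blast record below are all assumptions for illustration, not the Paks or Kozloduy models.

```python
import numpy as np

fs, n = 200.0, 2048                      # 200 Hz sampling, as in the tests
t = np.arange(n) / fs
accel = np.exp(-5 * t) * np.sin(2 * np.pi * 12 * t)  # synthetic blast record

f = np.fft.rfftfreq(n, 1 / fs)
omega = 2 * np.pi * f
wn, zeta = 2 * np.pi * 8.0, 0.05         # assumed 8 Hz mode, 5% damping
H = 1.0 / (wn**2 - omega**2 + 2j * zeta * wn * omega)  # modal receptance

resp = np.fft.irfft(np.fft.rfft(accel) * H, n)  # back to the time domain
print("peak modal response:", np.abs(resp).max())
```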

  12. International Benchmark on Numerical Simulations for 1D, Nonlinear Site Response (PRENOLIN) : Verification Phase Based on Canonical Cases

    NARCIS (Netherlands)

    Régnier, Julie; Bonilla, Luis-Fabian; Bard, Pierre-Yves; Bertrand, Etienne; Hollender, Fabrice; Kawase, Hiroshi; Sicilia, Deborah; Arduino, Pedro; Amorosi, Angelo; Asimaki, Dominiki; Pisano, F.

    2016-01-01

    PREdiction of NOn‐LINear soil behavior (PRENOLIN) is an international benchmark aiming to test multiple numerical simulation codes that are capable of predicting nonlinear seismic site response with various constitutive models. One of the objectives of this project is the assessment of the

  13. Results of neutronic benchmark analysis for a high temperature reactor of the GT-MHR type - HTR2008-58107

    International Nuclear Information System (INIS)

    Boyarinov, V. F.; Bryzgalov, V. I.; Davidenko, V. D.; Fomichenko, P. A.; Glushkov, E. S.; Gomin, E. A.; Gurevich, M. I.; Kodochigov, N. G.; Marova, E. V.; Mitenkova, E. F.; Novikov, N. V.; Osipov, S. L.; Sukharev, Y. P.; Tsibulsky, V. F.; Yudkevich, M. S.

    2008-01-01

    The paper presents a description of the benchmark cases, the results achieved, and an analysis of possible reasons for the differences in the calculation results obtained by various neutronic codes. The comparative analysis presented shows the benchmark results obtained with reference and design codes by Russian specialists (WIMS-D, JAR-HTGR, UNK, MCU, MCNP5-MONTEBURNS1.0-ORIGEN2.0), by French specialists (APOLLO2, TRIPOLI4), and by Korean specialists (HELIOS, MASTER, MCNP5). The analysis of possible reasons for the deviations was carried out with the aim of decreasing the uncertainties in the calculated characteristics. This additional investigation was conducted with the use of 2D models of a fuel assembly cell and a reactor plane section. (authors)

  14. Performance analysis of fusion nuclear-data benchmark experiments for light to heavy materials in MeV energy region with a neutron spectrum shifter

    International Nuclear Information System (INIS)

    Murata, Isao; Ohta, Masayuki; Miyamaru, Hiroyuki; Kondo, Keitaro; Yoshida, Shigeo; Iida, Toshiyuki; Ochiai, Kentaro; Konno, Chikara

    2011-01-01

    Nuclear data are indispensable for the development of fusion reactor candidate materials. However, benchmarking of the nuclear data in the MeV energy region is not yet adequate. In the present study, benchmark performance in the MeV energy region was investigated theoretically for experiments using a 14 MeV neutron source. We carried out a systematic analysis for light to heavy materials. As a result, the benchmark performance for the neutron spectrum was confirmed to be acceptable, while for gamma-rays it was not sufficiently accurate, and consequently a spectrum shifter has to be applied. Beryllium had the best performance as a shifter. Moreover, a preliminary examination was made of whether it is really acceptable that only the spectrum before the last collision is considered in the benchmark performance analysis. It was pointed out that not only the last collision but also earlier collisions should be considered equally in the benchmark performance analysis.

  15. Benchmarking and the laboratory

    Science.gov (United States)

    Galloway, M; Nadin, L

    2001-01-01

    This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112

  16. Monitoring Based Commissioning: Benchmarking Analysis of 24 UC/CSU/IOU Projects

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Evan; Mathew, Paul

    2009-04-01

    Buildings rarely perform as intended, resulting in energy use that is higher than anticipated. Building commissioning has emerged as a strategy for remedying this problem in non-residential buildings. Complementing traditional hardware-based energy savings strategies, commissioning is a 'soft' process of verifying performance and design intent and correcting deficiencies. Through an evaluation of a series of field projects, this report explores the efficacy of an emerging refinement of this practice, known as monitoring-based commissioning (MBCx). MBCx can also be thought of as monitoring-enhanced building operation that incorporates three components: (1) permanent energy information systems (EIS) and diagnostic tools at the whole-building and sub-system level; (2) retro-commissioning based on the information from these tools, with savings accounting that emphasizes measurement as opposed to estimation or assumptions; and (3) on-going commissioning to ensure efficient building operations and measurement-based savings accounting. MBCx is thus a measurement-based paradigm which affords improved risk management by identifying problems and opportunities that are missed with periodic commissioning. The analysis presented in this report is based on in-depth benchmarking of a portfolio of MBCx energy savings for 24 buildings located throughout the University of California and California State University systems. In the course of the analysis, we developed a quality-control/quality-assurance process for gathering and evaluating raw data from project sites and then selected a number of metrics to use for project benchmarking and evaluation, including appropriate normalizations for weather and climate, accounting for variations in central plant performance, and consideration of differences in building types. We performed a cost-benefit analysis of the resulting dataset, and provided comparisons to projects from a larger commissioning 'Meta-analysis' database.
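
    Measurement-based savings accounting with weather normalization can be sketched with a simple degree-day regression: fit a baseline model of energy use against heating degree-days, then compare post-commissioning measured use with what the baseline model predicts for the same weather. The linear model and all numbers below are illustrative assumptions, not the report's methodology or data.

```python
import numpy as np

# Baseline year: monthly energy use (kWh) vs. heating degree-days (HDD).
hdd_base = np.array([420, 380, 300, 180, 90, 30, 10, 15, 80, 200, 320, 410])
kwh_base = 5000 + 12.0 * hdd_base + np.random.default_rng(0).normal(0, 150, 12)

# Fit baseline model kWh = a + b*HDD.
b, a = np.polyfit(hdd_base, kwh_base, 1)

# Post-MBCx year: its own weather and (made-up) measured use.
hdd_post = np.array([450, 360, 280, 170, 100, 25, 12, 18, 90, 210, 330, 390])
kwh_post_measured = 4600 + 11.0 * hdd_post

expected = a + b * hdd_post                 # what the baseline would have used
savings = expected - kwh_post_measured
print(f"annual weather-normalized savings: {savings.sum():.0f} kWh")
```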

  17. Funding and financing mechanisms for infrastructure delivery: multi-sector analysis of benchmarking of South Africa against developed countries

    CSIR Research Space (South Africa)

    Matji, MP

    2015-05-01

    Full Text Available AMPEAK Asset Management Conference 2015 Funding and financing mechanisms for infrastructure delivery: multi-sector analysis of benchmarking of South Africa against developed countries Matji, MP and Ruiters, C Abstract: For developing..., the researcher identifies financing opportunities for infrastructure delivery in South Africa and how such opportunities can be explored, taking into account political dynamics and legislative sector-based frameworks. Keywords: Asset Management, Financing...

  18. Analysis and sensitivity studies with CORETRAN and RETRAN-3D of the NEACRP PWR rod ejection benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Ferroukhi, H.; Coddington, P. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    2001-07-01

    The OECD/NEA PWR rod ejection benchmark has been analysed using the 3-D nodal spatial-kinetic codes CORETRAN and RETRAN-3D. The following results were obtained. A) The agreement in 3-D solution between CORETRAN and RETRAN-3D was found to be very good both during steady-state and transient conditions. In particular at HZP (hot zero power), an excellent agreement in the initial steady-state 3-D power distribution and with regard to the core power excursion during the super-prompt critical phase of the transient (i.e. when the negative reactivity feedback is still very weak) was found. This illustrates the consistency in the neutronic solution between both codes. B) At both HZP and FP (full power) conditions, the CORETRAN and RETRAN-3D results lie well within the range of the previous benchmark solutions. In particular at HZP, both codes predict a power excursion and an increase in maximum pellet temperature that are among the closest results to those obtained with the benchmark reference solution. It must here be emphasised that these analyses are by no means a validation of the codes. However, the good agreement of both CORETRAN and RETRAN-3D with other 3-D solutions provides confidence in the ability of these codes to analyse LWR (light water reactor) core transients. In addition, it was found appropriate to perform, for this well-defined international benchmark problem, some sensitivity studies in order to assess the impact of modelling options on the CORETRAN and RETRAN-3D results. (authors)

  19. Analysis and sensitivity studies with CORETRAN and RETRAN-3D of the NEACRP PWR rod ejection benchmark

    International Nuclear Information System (INIS)

    Ferroukhi, H.; Coddington, P.

    2001-01-01

    The OECD/NEA PWR rod ejection benchmark has been analysed using the 3-D nodal spatial-kinetic codes CORETRAN and RETRAN-3D. The following results were obtained. A) The agreement in 3-D solution between CORETRAN and RETRAN-3D was found to be very good both during steady-state and transient conditions. In particular at HZP (hot zero power), an excellent agreement in the initial steady-state 3-D power distribution and with regard to the core power excursion during the super-prompt critical phase of the transient (i.e. when the negative reactivity feedback is still very weak) was found. This illustrates the consistency in the neutronic solution between both codes. B) At both HZP and FP (full power) conditions, the CORETRAN and RETRAN-3D results lie well within the range of the previous benchmark solutions. In particular at HZP, both codes predict a power excursion and an increase in maximum pellet temperature that are among the closest results to those obtained with the benchmark reference solution. It must here be emphasised that these analyses are by no means a validation of the codes. However, the good agreement of both CORETRAN and RETRAN-3D with other 3-D solutions provides confidence in the ability of these codes to analyse LWR (light water reactor) core transients. In addition, it was found appropriate to perform, for this well-defined international benchmark problem, some sensitivity studies in order to assess the impact of modelling options on the CORETRAN and RETRAN-3D results. (authors)

  20. Analysis of neutronics benchmarks for the utilization of mixed oxide fuel in light water reactor using DRAGON code

    International Nuclear Information System (INIS)

    Nithyadevi, Rajan; Thilagam, L.; Karthikeyan, R.; Pal, Usha

    2016-01-01

    Highlights: • Use of the advanced computational code DRAGON-5 with the advanced self-shielding model USS. • Testing the capability of the DRAGON-5 code for the analysis of light water reactor systems. • A wide variety of fuels (LEU, MOX and spent fuel) have been analyzed. • Parameters such as k∞, one-, few- and multi-group macroscopic cross-sections and fluxes were calculated. • The suitability of the deterministic methodology employed in the DRAGON-5 code is demonstrated for LWRs. - Abstract: Advances in reactor physics have led to the development of new computational technologies and upgraded cross-section libraries so as to produce an accurate approximation to the true solution for the problem. It is thus necessary to revisit the benchmark problems with advanced computational code systems and upgraded cross-section libraries to see how far they are in agreement with the earlier reported values. The present study is one such analysis with the DRAGON code, employing advanced self-shielding models like USS and the 172-energy-group 'JEFF3.1' cross-section library in DRAGLIB format. Although the DRAGON code has already demonstrated its capability for heavy water moderated systems, it is now tested for light water reactor (LWR) and fast reactor systems. As a part of the validation of DRAGON for LWRs, a VVER computational benchmark titled "Neutronics Benchmarks for the Utilization of Mixed-Oxide Fuel - Volume 3", submitted by the Russian Federation, has been taken up. Presently, pincell and assembly calculations are carried out considering variations in fuel temperature (both fresh and spent fuel), moderator temperature and boron content in the moderator. Various parameters such as the infinite neutron multiplication factor (k∞), one-group integrated flux, few-group homogenized cross-sections (absorption, nu-fission) and reaction rates (absorption, nu-fission) of individual isotopic nuclides are calculated for different reactor states. Comparisons of results are made with the reported Monte Carlo

  1. An Analysis of Academic Research Libraries Assessment Data: A Look at Professional Models and Benchmarking Data

    Science.gov (United States)

    Lewin, Heather S.; Passonneau, Sarah M.

    2012-01-01

    This research provides the first review of publicly available assessment information found on Association of Research Libraries (ARL) members' websites. After providing an overarching review of benchmarking assessment data, and of professionally recommended assessment models, this paper examines if libraries contextualized their assessment…

  2. A GFR benchmark comparison of transient analysis codes based on the ETDR concept

    International Nuclear Information System (INIS)

    Bubelis, E.; Coddington, P.; Castelliti, D.; Dor, I.; Fouillet, C.; Geus, E. de; Marshall, T.D.; Van Rooijen, W.; Schikorr, M.; Stainsby, R.

    2007-01-01

    A GFR (Gas-cooled Fast Reactor) transient benchmark study was performed to investigate the ability of different code systems to calculate the transition in core heat removal from main-circuit forced flow to natural-circulation cooling using the Decay Heat Removal (DHR) system. The benchmark is based on a main blower failure in the Experimental Technology Demonstration Reactor (ETDR) with reactor scram. The codes taking part in the benchmark are RELAP5, TRAC/AAA, CATHARE, SIM-ADS, MANTA and SPECTRA. For comparison purposes the benchmark was divided into several stages: the initial steady-state solution, the main blower flow run-down, the opening of the DHR loop and the transition to natural circulation, and finally the 'quasi' steady heat removal from the core by the DHR system. The results submitted by the participants showed that all the codes gave consistent results for all four stages of the benchmark. In the steady state the calculations revealed some differences in the clad and fuel temperatures, the core and main loop pressure drops, and the total helium mass inventory. Some disagreements were also observed in the helium and water flow rates in the DHR loop during the final natural-circulation stage. Good agreement was observed for the total main blower flow rate and the helium temperature rise in the core, as well as for the helium inlet temperature into the core. In order to understand the reasons for the differences in the initial 'blind' calculations, a second round of calculations was performed using a more precise set of boundary conditions.

  3. R2/R0-WTR decommissioning cost. Comparison and benchmarking analysis

    International Nuclear Information System (INIS)

    Varley, Geoff; Rusch, Chris

    2001-10-01

    SKI charged NAC International with the task of determining whether or not the decommissioning cost estimates of R2/R0 (hereafter simply referred to as R2) and Aagesta research reactors are reasonable. The associated work was performed in two phases. The objective in Phase I was to make global comparisons of the R2 and Aagesta decommissioning estimates with the estimates/actual costs for the decommissioning of similar research reactors in other countries. This report presents the results of the Phase II investigations. Phase II focused on selected discrete work packages within the decommissioning program of the WTR reactor. To the extent possible a comparison of those tasks with estimates for the R2 reactor has been made, as a basis for providing an opinion on the reasonableness of the R2 estimate. The specific WTR packages include: reactor vessel and internals dismantling; biological shield dismantling; primary coolant piping dismantling; electrical equipment removal; waste packaging; transportation and disposal of radioactive concrete and reactor components; project management, licensing and engineering; and removal of ancillary facilities. The specific tasks were characterised and analysed in terms of fundamental parameters including: task definition; labour hours expended; labour cost; labour productivity; length of work week; working efficiency; working environment and impact on job execution; external costs (contract labour, materials and equipment); total cost; waste volumes; and waste packaging and transport costs. Based on such detailed raw data, normalised unit resources have been derived for selected parts of the decommissioning program, as a first step towards developing benchmarking data for D and D activities at research reactors. Several general conclusions emerged from the WTR decommissioning project. Site characterisation can confirm or negate major assumptions, quantify waste volumes, delineate obstacles to completing work, provide an understanding

  4. Comparison of investigator-delineated gross tumor volumes and quality assurance in pancreatic cancer: Analysis of the pretrial benchmark case for the SCALOP trial.

    Science.gov (United States)

    Fokas, Emmanouil; Clifford, Charlotte; Spezi, Emiliano; Joseph, George; Branagan, Jennifer; Hurt, Chris; Nixon, Lisette; Abrams, Ross; Staffurth, John; Mukherjee, Somnath

    2015-12-01

    To evaluate the variation in investigator-delineated volumes and assess plans from the radiotherapy trial quality assurance (RTTQA) program of SCALOP, a phase II trial in locally advanced pancreatic cancer. Participating investigators (n=25) outlined a pre-trial benchmark case as per the RT protocol, and the accuracy of the investigators' GTV (iGTV) and PTV (iPTV) was evaluated against the trial team-defined gold standard GTV (gsGTV) and PTV (gsPTV), using both qualitative and geometric analyses. The median Jaccard Conformity Index (JCI) and Geographical Miss Index (GMI) were calculated. Participating RT centers also submitted a radiotherapy plan for this benchmark case, which was centrally reviewed against protocol-defined constraints. Twenty-five investigator-defined contours were evaluated. The median JCI and GMI of the iGTVs were 0.57 (IQR: 0.51-0.65) and 0.26 (IQR: 0.15-0.40). For the iPTVs, these were 0.75 (IQR: 0.71-0.79) and 0.14 (IQR: 0.11-0.22), respectively. Qualitative analysis showed the largest variation at the tumor edges and failure to recognize a peri-pancreatic lymph node. There were no major protocol deviations in RT planning, but three minor PTV coverage deviations were identified. SCALOP demonstrated considerable variation in iGTV delineation. RTTQA workshops and real-time central review of delineations are needed in future trials. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
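
    The two geometric metrics named above can be computed on binary volume masks: JCI is intersection over union, and GMI is taken here as the fraction of the gold-standard volume missed by the investigator (one common definition; the trial's exact formula may differ). The toy random masks below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
gold = rng.random((40, 40, 40)) < 0.2          # gold-standard GTV mask (toy)
investigator = rng.random((40, 40, 40)) < 0.2  # investigator GTV mask (toy)

inter = np.logical_and(gold, investigator).sum()
union = np.logical_or(gold, investigator).sum()
jci = inter / union                            # Jaccard Conformity Index
gmi = np.logical_and(gold, ~investigator).sum() / gold.sum()  # missed fraction
print(f"JCI = {jci:.3f}, GMI = {gmi:.3f}")
```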

  5. Benchmarking foreign electronics technologies

    Energy Technology Data Exchange (ETDEWEB)

    Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.

    1994-12-01

    This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  6. Comparative analysis of nine structural codes used in the second WIPP benchmark problem

    International Nuclear Information System (INIS)

    Morgan, H.S.; Krieg, R.D.; Matalucci, R.V.

    1981-11-01

    In the Waste Isolation Pilot Plant (WIPP) Benchmark II study, various computer codes were compared on the basis of their capabilities for calculating the response of hypothetical drift configurations for nuclear waste experiments and storage demonstration. The codes used by participants in the study were ANSALT, DAPROK, JAC, REM, SANCHO, SPECTROM, STEALTH, and two different implementations of MARC. Errors were found in the preliminary results, and several calculations were revised. The revised solutions were in reasonable agreement, except for the REM solution. The Benchmark II study allowed significant advances in understanding the relative behavior of computer codes available for WIPP calculations. The study also pointed out the possible need for performing critical design calculations with more than one code. Lastly, it indicated the magnitude of the code-to-code spread in results that is to be expected even when a model has been explicitly defined.

  7. Benchmark of AC and DC active power decoupling circuits for second-order harmonic mitigation in kW-scale single-phase inverters

    DEFF Research Database (Denmark)

    Qin, Zian; Tang, Yi; Loh, Poh Chiang

    2015-01-01

    This paper presents the benchmark study of ac and dc active power decoupling circuits for second-order harmonic mitigation in kW-scale single-phase inverters. First of all, the best solutions of active power decoupling to achieve high efficiency and power density are identified and comprehensively studied, where the commercially available film capacitors, circuit topologies, and control strategies for active power decoupling are all taken into account. Then, an adaptive decoupling voltage control method is proposed to further improve the performance of dc decoupling in terms of efficiency and reliability. The feasibility and superiority of the identified solution for active power decoupling together with the proposed adaptive decoupling voltage control method are finally verified by the experimental results obtained on a 2 kW single-phase inverter.
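
    The need for power decoupling comes from the instantaneous ac power of a single-phase inverter rippling at twice the line frequency, which a dc-link buffer must absorb. A minimal sizing sketch follows, using the standard small-ripple estimate C = P / (omega * V_dc * dV); the dc-link voltage, allowed ripple and line frequency are assumed values, not parameters of the paper's prototype.

```python
import math

P = 2000.0        # rated power, W (matching the 2 kW prototype scale)
f_line = 50.0     # line frequency, Hz (assumed)
V_dc = 400.0      # dc-link voltage, V (assumed)
dV = 20.0         # allowed peak-to-peak dc-link ripple, V (assumed)

omega = 2 * math.pi * f_line
# The ripple energy the buffer must cycle per 2nd-harmonic half-period is P/omega.
C = P / (omega * V_dc * dV)
print(f"required dc-link capacitance: {C * 1e6:.0f} uF")  # ~800 uF here
```

    A passive electrolytic bank of this size is exactly what active decoupling circuits aim to replace with much smaller film capacitors swung over a wider voltage range.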

  8. Benchmark thermal-hydraulic analysis with the Agathe Hex 37-rod bundle

    International Nuclear Information System (INIS)

    Barroyer, P.; Hudina, M.; Huggenberger, M.

    1981-09-01

    Different computer codes are compared in prediction performance on the basis of the AGATHE HEX 37-rod bundle experimental results. The compilation of all available calculation results allows a critical assessment of the codes. For the time being, conclusions are drawn as to which codes are best suited for gas-cooled fuel element design purposes. Based on the positive aspects of these cooperative benchmark exercises, an attempt is made to define a computer code verification procedure. (Auth.)

  9. Study on LBS for Characterization and Analysis of Big Data Benchmarks

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Chandio

    2014-10-01

    Full Text Available In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users on the Internet anywhere in the world. The rapid growth of urbanization in developed and developing countries leads to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in the concept of Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data representation is widely available, including GPS traces of vehicles. However, their applications are data-processing and storage hungry, due to data increments of large volume, from a few dozen TB (Tera Bytes) to thousands of PB (Peta Bytes) (i.e. Big Data). To increase the development and the assessment of applications such as LBS (Location Based Services), a benchmark of Big Data is urgently needed. This research is a novel study on LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, this paper also describes the current status of Big Data benchmarks and our future direction.
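
    The map-matching pre-processing step mentioned above can be sketched in its simplest form: snap each GPS point to the nearest road segment by point-to-segment distance. The toy road network and trace below are illustrative, not benchmark data, and real map-matchers add topology and probabilistic (e.g. hidden Markov) constraints on top of this geometric core.

```python
import numpy as np

segments = {  # road id -> (endpoint A, endpoint B) in local x/y coordinates
    "main_st":  (np.array([0.0, 0.0]), np.array([100.0, 0.0])),
    "north_av": (np.array([50.0, -5.0]), np.array([50.0, 80.0])),
}

def dist_to_segment(p, a, b):
    """Euclidean distance from point p to the segment a-b."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

trace = [np.array([10.0, 1.2]), np.array([49.0, 30.5]), np.array([80.0, -0.8])]
for p in trace:
    road = min(segments, key=lambda r: dist_to_segment(p, *segments[r]))
    print(p, "->", road)
```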

  10. Study of LBS for characterization and analysis of big data benchmarks

    International Nuclear Information System (INIS)

    Chandio, A.A.; Zhang, F.; Memon, T.D.

    2014-01-01

    In the past few years, most organizations have been gradually diverting their applications and services to the Cloud. This is because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users on the Internet anywhere in the world. The rapid growth of urbanization in developed and developing countries leads to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in the concept of Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data representation is widely available, including GPS traces of vehicles. However, their applications are data-processing and storage hungry, due to data increments of large volume, from a few dozen TB (Tera Bytes) to thousands of PB (Peta Bytes) (i.e. Big Data). To increase the development and the assessment of applications such as LBS (Location Based Services), a benchmark of Big Data is urgently needed. This research is a novel study on LBS to characterize and analyze Big Data benchmarks. We focused on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, this paper also describes the current status of Big Data benchmarks and our future direction. (author)

  11. Potential for reducing global carbon emissions from electricity production-A benchmarking analysis

    International Nuclear Information System (INIS)

    Ang, B.W.; Zhou, P.; Tay, L.P.

    2011-01-01

    We present five performance indicators for electricity generation for 129 countries using 2005 data. These indicators, measured at the national level, are the aggregate CO2 intensity of electricity production, the efficiencies of coal, oil and gas generation, and the share of electricity produced from non-fossil fuels. We conduct a study of the potential for reducing global energy-related CO2 emissions from electricity production through simple benchmarking. This is performed based on the last four performance indicators and the construction of a cumulative curve for each of these indicators. It is found that global CO2 emissions from electricity production would be reduced by 19% if all these indicators were benchmarked at the 50th percentile. Not surprisingly, the emission reduction potential measured in absolute terms is the highest for large countries such as China, India, Russia and the United States. When the potential is expressed as a percentage of a country's own emissions, few of these countries appear in the top-five list. - Research highlights: → We study variations in emissions per kWh of electricity generated among countries. → We analyze emissions from electricity production through benchmarking. → Estimates of reduction in emissions are made based on different assumptions.
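
    The percentile-benchmarking idea can be sketched numerically: raise every fleet performing worse than the 50th-percentile efficiency up to that benchmark and compare total fuel input before and after. The efficiency and generation figures below are made-up illustrations, not the paper's 129-country dataset.

```python
import numpy as np

# Fossil generation efficiency by country (fraction), and its generation (TWh).
eff = np.array([0.30, 0.33, 0.36, 0.38, 0.41, 0.44])
gen_twh = np.array([900.0, 400.0, 250.0, 600.0, 150.0, 300.0])

benchmark = np.percentile(eff, 50)       # 50th-percentile efficiency
eff_bench = np.maximum(eff, benchmark)   # laggards raised to the benchmark

fuel_now = gen_twh / eff                 # fuel input, TWh thermal
fuel_bench = gen_twh / eff_bench
saving = 1.0 - fuel_bench.sum() / fuel_now.sum()
print(f"fuel (and roughly CO2) reduction: {saving:.1%}")
```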

  12. Update of KASHIL-E6 library for shielding analysis and benchmark calculations

    International Nuclear Information System (INIS)

    Kim, D. H.; Kil, C. S.; Jang, J. H.

    2004-01-01

    For various shielding and reactor pressure vessel dosimetry applications, a pseudo-problem-independent neutron-photon coupled MATXS-format library based on the last release of ENDF/B-VI has been generated as a part of the update program for KASHIL-E6, which was based on ENDF/B-VI.5. It has the VITAMIN-B6 neutron and photon energy group structures, i.e., 199 groups for neutrons and 42 groups for photons. The neutron and photon weighting functions and the Legendre order of scattering are the same as for KASHIL-E6. The library has been validated through several benchmarks: the PCA-REPLICA and NESDIP-2 experiments for the LWR pressure vessel facility benchmark, the Winfrith Iron88 experiment for validation of the iron data, and the Winfrith Graphite experiment for validation of the graphite data. These calculations were performed with the TRANSX/DANTSYS code system. In addition, the substitution of JENDL-3.3 and JEFF-3.0 data for Fe, Cr, Cu and Ni, which are very important nuclides for shielding analyses, was investigated to estimate the effects on the benchmark calculation results.

  13. Benchmarking and gap analysis of faculty mentorship priorities and how well they are met.

    Science.gov (United States)

    Bruner, Deborah Watkins; Dunbar, Sandra; Higgins, Melinda; Martyn, Kristy

    2016-01-01

    There is little consensus among faculty mentoring programs as to best practices. While there are recommendations in the literature to base faculty development programs on gap analyses of faculty ratings of actual and preferred performance in teaching, scholarship and service, no gap analysis was found in the literature. Thus, the purpose of this study was to develop a survey tool to benchmark school of nursing (SON) faculty mentorship priorities and conduct a gap analysis of how well they were being addressed. Senior faculty who lead mentorship as part of their roles in the SON (associate and assistant deans and director of mentorship) developed a survey through (a) asking faculty members for priorities at in-person mentorship seminars, (b) a review of current nursing literature, and (c) input from the SON mentorship advisory board. The final survey included 37 items focused on general job duties, structure of the mentoring program, time management, as well as skills needed for research, teaching, practice, writing and team science. Responses (rated from 0-not important to 5-very high priority) were requested in 4 areas: the first area focused on how high a priority the respondent rated a given item and areas 2 to 4 focused on how well the need was met by one of three resources: their SON primary assigned mentor, other SON resources, or other university resources. There were 63 eligible SON faculty to whom the survey was e-mailed with a 60% (n = 38) response rate. Most of the respondents were clinical track (42.1%) followed by tenure track (39.5%) and research track (15.8%). Half were assistant professors. The percentage of respondents giving a rating of 4 to 5 were calculated and then ranked. Almost all the faculty responding, regardless of track or rank, desired formal mentorship. Among all faculty, the top five priorities were guidance on producing timely publications (70.4%), mentorship on work-life balance (68%), mentorship on putting together a promotion

  14. Thought Experiment to Examine Benchmark Performance for Fusion Nuclear Data

    Science.gov (United States)

    Murata, Isao; Ohta, Masayuki; Kusaka, Sachie; Sato, Fuminobu; Miyamaru, Hiroyuki

    2017-09-01

    There are many benchmark experiments carried out so far with DT neutrons, especially aiming at fusion reactor development. These integral experiments seemed vaguely to validate the nuclear data below 14 MeV. However, no precise studies exist now. The author's group thus started to examine how well benchmark experiments with DT neutrons can play a benchmarking role for energies below 14 MeV. Recently, as a next phase, to generalize the above discussion, the energy range was expanded to the entire region. In this study, thought experiments with finer energy bins have thus been conducted to discuss how to generally estimate the performance of benchmark experiments. As a result of thought experiments with a point detector, the sensitivity to a discrepancy appearing in the benchmark analysis is "equally" due not only to the contribution conveyed directly to the detector, but also to the indirect contribution of the neutrons (named (A)) that produce the neutrons conveying that contribution, the indirect contribution of the neutrons (B) that produce the neutrons (A), and so on. From this concept, it would become clear in advance, from a sensitivity analysis, how well and which energy range of nuclear data could be benchmarked with a given benchmark experiment.

  15. Thought Experiment to Examine Benchmark Performance for Fusion Nuclear Data

    Directory of Open Access Journals (Sweden)

    Murata Isao

    2017-01-01

    Full Text Available There are many benchmark experiments carried out so far with DT neutrons, especially aiming at fusion reactor development. These integral experiments seemed vaguely to validate the nuclear data below 14 MeV. However, no precise studies exist now. The author's group thus started to examine how well benchmark experiments with DT neutrons can play a benchmarking role for energies below 14 MeV. Recently, as a next phase, to generalize the above discussion, the energy range was expanded to the entire region. In this study, thought experiments with finer energy bins have thus been conducted to discuss how to generally estimate the performance of benchmark experiments. As a result of thought experiments with a point detector, the sensitivity to a discrepancy appearing in the benchmark analysis is "equally" due not only to the contribution conveyed directly to the detector, but also to the indirect contribution of the neutrons (A) that produce the neutrons conveying that contribution, the indirect contribution of the neutrons (B) that produce the neutrons (A), and so on. From this concept, it would become clear in advance, from a sensitivity analysis, how well and which energy range of nuclear data could be benchmarked with a given benchmark experiment.

  16. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors

    International Nuclear Information System (INIS)

    Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M.; Reyes F, M. del C.; Del Valle G, E.

    2014-10-01

    In this paper a comparison is made between the CASMO-4, MCNP6 and Serpent codes in analyzing a suite of benchmark problems for BWR-type reactors. The benchmark problem consists of two different geometries: a fuel pin cell and a BWR-type fuel assembly. To facilitate the study of reactor physics, the nuclear characteristics of the fuel pin are provided in detail, such as the burnup dependence and the reactivity of selected nuclides. With respect to the fuel assembly, the results presented concern the infinite multiplication factor at different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next generation of BWR fuels with high extended burnup. It is important to note that the purpose of this comparison is to validate the modeling methodologies for different operating conditions, should the case of another BWR assembly arise; the results will lie within a range with some uncertainty, irrespective of the code that is used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, owing to the potential of this code over other commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing these studies with the generation of the cross sections (XS) of a core, so that in a next step a respective nuclear data library can be constructed and used by codes developed as part of the development project of the Mexican Analysis Platform of Nuclear Reactors, AZTLAN. (Author)

  17. RESULTS OF ANALYSIS OF BENCHMARKING METHODS OF INNOVATION SYSTEMS ASSESSMENT IN ACCORDANCE WITH AIMS OF SUSTAINABLE DEVELOPMENT OF SOCIETY

    Directory of Open Access Journals (Sweden)

    A. Vylegzhanina

    2016-01-01

    Full Text Available In this work, we introduce the results of a comparative analysis of international rating indexes of innovation systems for their compliance with the purposes of sustainable development. The purpose of this research is to define requirements for benchmarking methods of assessing national or regional innovation systems, and to compare such methods based on the assumption that an innovation system should be aligned with the sustainable development concept. Analysis of the goal sets and concepts which underlie the observed international composite innovation indexes, and comparison of their metrics and calculation techniques, allowed us to reveal the opportunities and limitations of using these methods within the frame of the sustainable development concept. We formulated targets of innovation development on the basis of the innovation priorities of sustainable socio-economic development. Using a comparative analysis of the indexes against these targets, we revealed two methods of assessing innovation systems that are maximally connected with the goals of sustainable development. Nevertheless, today no benchmarking method meets the need of assessing innovation systems in compliance with the sustainable development concept to a sufficient extent. We suggest practical directions for developing methods that assess innovation systems in compliance with the goals of the sustainable development of society.

  18. Verification and benchmarking of PORFLO: an equivalent porous continuum code for repository scale analysis

    International Nuclear Information System (INIS)

    Eyler, L.L.; Budden, M.J.

    1984-11-01

    The objective of this work was to perform an assessment of prediction capabilities and features of the PORFLO code in relation to its intended use in the Basalt Waste Isolation Project. This objective was to be accomplished through a code verification and benchmarking task. Results were to be documented which either support correctness of prediction capabilities or identify areas of intended application in which the code exhibits weaknesses. A test problem set consisting of 10 problems was developed. Results of PORFLO simulations of these problems were provided for use in this work. The 10 problems were designed to test the three basic computational capabilities or categories of the code. Broken down by physical process, these are heat transfer, fluid flow, and radionuclide transport. Two verification problems were included within each of these categories. They were problems designed to test basic features of PORFLO for which analytical solutions are available for use as a known comparison basis. Hence they are referred to as verification problems. Of the remaining four problems, one repository scale problem representative of intended PORFLO use within BWIP was included in each of the three basic capabilities categories. The remaining problem was a case specifically designed to test features of decay and retardation in radionuclide transport. These four problems are referred to as benchmarking problems, because results computed with an additional computer code were used as a basis for comparison. 38 figures

  19. Analysis of the European results on the HTTR's core physics benchmarks

    International Nuclear Information System (INIS)

    Raepsaet, X.; Damian, F.; Ohlig, U.A.; Brockmann, H.J.; Haas, J.B.M. de; Wallerboss, E.M.

    2002-01-01

    Within the frame of the European contract HTR-N1 calculations are performed on the benchmark problems of the HTTR's start-up core physics experiments initially proposed by the IAEA in a Co-ordinated Research Programme. Three European partners, the FZJ in Germany, NRG and IRI in the Netherlands, and CEA in France, have joined this work package with the aim to validate their calculational methods. Pre-test and post-test calculational results, obtained by the partners, are compared with each other and with the experiment. Parts of the discrepancies between experiment and pre-test predictions are analysed and tackled by different treatments. In the case of the Monte Carlo code TRIPOLI4, used by CEA, the discrepancy between measurement and calculation at the first criticality is reduced to Δk/k∼0.85%, when considering the revised data of the HTTR benchmark. In the case of the diffusion codes, this discrepancy is reduced to: Δk/k∼0.8% (FZJ) and 2.7 or 1.8% (CEA). (author)

  20. Structural code benchmarking for the analysis of impact response of nuclear material shipping casks

    International Nuclear Information System (INIS)

    Glass, R.E.

    1984-01-01

    The Transportation Technology Center at Sandia National Laboratories has initiated a program to benchmark thermal and structural codes that are available to the nuclear material transportation community. The program consists of the following five phases: (1) code inventory and review, (2) development of a cask-like set of problems, (3) multiple independent numerical analyses of the problems, (4) transfer of information, and (5) performance of experiments to obtain data for comparison with the numerical analyses. This paper summarizes the results obtained by the independent numerical analyses. The analyses indicate the variability that can be expected both from differences in user-controlled parameters and from code-to-code differences. The results show that in purely elastic analyses, differences can be attributed to user-controlled parameters. Model problems involving elastic/plastic material behavior and large deformations, however, show greater variability, with significant differences reported between implicit and explicit integration schemes in finite element programs. This variability demonstrates the need to obtain experimental data to properly benchmark codes utilizing elastic/plastic material models and large deformation capability.

  1. Dissipativity analysis of the base isolated benchmark structure with magnetorheological fluid dampers

    International Nuclear Information System (INIS)

    Erkus, Baris; Johnson, Erik A

    2011-01-01

    This paper investigates the dissipativity and performance characteristics of the semiactive control of the base isolated benchmark structure with magnetorheological (MR) fluid dampers. Previously, the authors introduced the concepts of dissipativity and dissipativity indices in the semiactive control of structures with smart dampers and studied the dissipativity characteristics of simple structures with idealized dampers. To investigate the effects of semiactive controller dissipativity characteristics on the overall performance of the base isolated benchmark building, a clipped optimal control strategy with a linear quadratic Gaussian (LQG) controller and a 20 ton MR fluid damper model is used. A cumulative index is proposed for quantifying the overall dissipativity of a control system with multiple control devices. Two control designs with different dissipativity and performance characteristics are considered as the primary controller in clipped optimal control. Numerical simulations reveal that the dissipativity indices can be classified into two groups that exhibit distinct patterns. It is shown that the dissipativity indices identify primary controllers that are more suitable for application with MR dampers and provide useful information in the semiactive design process that complements other performance indices. The computational efficiency of the proposed dissipativity indices is verified by comparing computation times
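
    The dissipativity constraint behind such semiactive studies is that a controllable damper can only extract energy: a commanded force is realizable only when it opposes the relative velocity (f * v < 0), and a clipped-optimal controller clips commands that violate this. A minimal sketch follows; the signals and the simple fraction-of-realizable-commands index are illustrative assumptions, not the paper's dissipativity indices or the 20 ton MR damper model.

```python
import numpy as np

rng = np.random.default_rng(2)
f_desired = rng.normal(0.0, 10.0, 1000)  # primary (LQG-like) force commands, kN
velocity = rng.normal(0.0, 0.1, 1000)    # damper relative velocities, m/s

dissipative = f_desired * velocity < 0.0          # damper can only extract energy
f_applied = np.where(dissipative, f_desired, 0.0) # clipped (realizable) command

index = dissipative.mean()  # one simple scalar: fraction of realizable commands
print(f"fraction of dissipative (realizable) commands: {index:.2f}")
```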

  2. Sensitivity analysis and benchmarking of the BLT low-level waste source term code

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1993-07-01

    To evaluate the source term for low-level waste disposal, a comprehensive model has been developed and incorporated into a computer code called BLT (Breach-Leach-Transport). Since the release of the original version, many new features and improvements have been added to the Leach model of the code. This report consists of two different studies based on the new version of the BLT code: (1) a series of verification/sensitivity tests; and (2) benchmarking of the BLT code using field data. Based on the results of the verification/sensitivity tests, the authors concluded that the new version represents a significant improvement and is capable of providing more realistic simulations of the leaching process. Benchmarking work was carried out to provide a reasonable level of confidence in the model predictions. In this study, the experimentally measured release curves for nitrate, technetium-99 and tritium from the saltstone lysimeters operated by Savannah River Laboratory were used. The model results are observed to be in general agreement with the experimental data, within the acceptable limits of uncertainty

  3. Benchmark of AC and DC Active Power Decoupling Circuits for Second-Order Harmonic Mitigation in Kilowatt-Scale Single-Phase Inverters

    DEFF Research Database (Denmark)

    Qin, Zian; Tang, Yi; Loh, Poh Chiang

    2016-01-01

    This paper presents the benchmark study of ac and dc active power decoupling circuits for second-order harmonic mitigation in kW-scale single-phase inverters. First of all, a brief comparison of recently reported active power decoupling circuits is given, and the best solution that can achieve high efficiency and high power density is identified and comprehensively studied; the commercially available film capacitors, the circuit topologies, and the control strategies adopted for active power decoupling are all taken into account. Then, an adaptive decoupling voltage control method is proposed to further improve the performance of dc decoupling in terms of efficiency and reliability. The feasibility and superiority of the identified solution for active power decoupling, together with the proposed adaptive decoupling voltage control method, are finally verified by both simulation and experimental results.
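
    The record above does not reproduce any sizing relations, so the following minimal Python sketch is added for orientation only. It applies the textbook energy-balance estimate of the dc-link capacitance needed to buffer the double-line-frequency ripple power in a single-phase stage, the quantity that active decoupling circuits are designed to shrink: for unity power factor, p(t) = P(1 - cos 2wt), the stored-energy swing is P/w, and C ~ P/(w*Vdc*dVpp). The function name and the 2 kW example values are illustrative assumptions, not data from the paper.

      import math

      def ripple_capacitance(p_avg_w, f_line_hz, v_dc, dv_pp):
          """Dc-link capacitance needed to absorb the 2*f_line ripple power.

          For a unity-power-factor single-phase stage p(t) = P*(1 - cos(2*w*t)),
          so the stored-energy swing is P/w and C ~ P / (w * V_dc * dV_pp).
          """
          w = 2.0 * math.pi * f_line_hz
          return p_avg_w / (w * v_dc * dv_pp)

      # Example: 2 kW inverter, 400 V dc link, 20 V peak-to-peak ripple, 50 Hz grid.
      c = ripple_capacitance(2e3, 50.0, 400.0, 20.0)
      print("required capacitance ~ %.0f uF" % (c * 1e6))   # ~ 800 uF

    The size of this figure, hundreds of microfarads at kilowatt power levels, is what forces bulky electrolytic capacitors in passive designs and motivates the active decoupling circuits benchmarked in this record.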

  4. Comparison of the PHISICS/RELAP5-3D Ring and Block Model Results for Phase I of the OECD MHTGR-350 Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom

    2014-04-01

    The INL PHISICS code system consists of three modules providing improved core simulation capability: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. Coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been finalized, and as part of the code verification and validation program the exercises defined for Phase I of the OECD/NEA MHTGR 350 MW Benchmark were completed. This paper provides an overview of the MHTGR Benchmark, and presents selected results of the three steady state exercises 1-3 defined for Phase I. For Exercise 1, a stand-alone steady-state neutronics solution for an End of Equilibrium Cycle Modular High Temperature Reactor (MHTGR) was calculated with INSTANT, using the provided geometry, material descriptions, and detailed cross-section libraries. Exercise 2 required the modeling of a stand-alone thermal fluids solution. The RELAP5-3D results of four sub-cases are discussed, consisting of various combinations of coolant bypass flows and material thermophysical properties. Exercise 3 combined the first two exercises in a coupled neutronics and thermal fluids solution, and the coupled code suite PHISICS/RELAP5-3D was used to calculate the results of two sub-cases. The main focus of the paper is a comparison of the traditional RELAP5-3D “ring” model approach vs. a much more detailed model that includes kinetics feedback at the individual block level and thermal feedback on a triangular sub-mesh. The higher fidelity of the block model is illustrated with comparison results on the temperature, power density and flux distributions, and the typical under-predictions produced by the ring model approach are highlighted.

  5. Random geometry capability in RMC code for explicit analysis of polytype particle/pebble and applications to HTR-10 benchmark

    International Nuclear Information System (INIS)

    Liu, Shichang; Li, Zeguang; Wang, Kan; Cheng, Quan; She, Ding

    2018-01-01

    Highlights: •A new random geometry was developed in RMC for mixed and polytype particle/pebble. •This capability was applied to the full core calculations of HTR-10 benchmark. •Reactivity, temperature coefficient and control rod worth of HTR-10 were compared. •This method can explicitly model different packing fraction of different pebbles. •Monte Carlo code with this method can simulate polytype particle/pebble type reactor. -- Abstract: With the increasing demands of high-fidelity neutronics analysis and the development of computer technology, the Monte Carlo method is becoming more and more attractive for accurate simulation of pebble bed High Temperature gas-cooled Reactors (HTRs), owing to its advantages of flexible geometry modeling and the use of continuous-energy nuclear cross sections. Traditional Monte Carlo codes can treat the double-heterogeneous geometry of the pebble bed by explicit geometry description. However, packing methods such as Random Sequential Addition (RSA) can only produce a sphere packing up to 38% volume packing fraction, while the Discrete Element Method (DEM) is troublesome and time consuming. Moreover, it is difficult and inconvenient for traditional Monte Carlo codes to simulate mixed and polytype particles or pebbles. A new random geometry method was developed in the Monte Carlo code RMC to simulate particle transport in polytype particle/pebble double-heterogeneous geometry systems. This method was verified by some test cases, and applied to the full core calculations of the HTR-10 benchmark. The reactivity, temperature coefficient and control rod worth of HTR-10 were compared for the full core and the initial core, in helium and air atmospheres respectively, and the results agree well with the benchmark results and experimental results. This work provides an efficient tool for the innovative design of pebble bed, prism HTRs and molten salt reactors with polytype particles or pebbles using the Monte Carlo method.
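
    As a point of reference for the packing-fraction limitation mentioned in this record, the sketch below implements Random Sequential Addition for equal hard spheres; it saturates well below the roughly 61% packing fraction of a physical pebble bed, which is why a dedicated random-geometry treatment is needed. The function name and all parameter values are illustrative assumptions, not details of the RMC implementation.

      import numpy as np

      def rsa_pack(box, radius, target_fraction, max_attempts=200_000, seed=1):
          """Random Sequential Addition of equal hard spheres in a cubic box.

          Spheres are inserted at uniform random positions and rejected on
          overlap; RSA saturates near ~38% packing fraction, well below the
          ~61% of a physical pebble bed.
          """
          rng = np.random.default_rng(seed)
          v_sphere = 4.0 / 3.0 * np.pi * radius**3
          centers = []
          for _ in range(max_attempts):
              c = rng.uniform(radius, box - radius, size=3)  # sphere fully inside
              if not centers or np.all(
                      np.sum((np.asarray(centers) - c) ** 2, axis=1)
                      >= (2.0 * radius) ** 2):
                  centers.append(c)
                  if len(centers) * v_sphere / box**3 >= target_fraction:
                      break
          return np.asarray(centers)

      centers = rsa_pack(box=10.0, radius=0.5, target_fraction=0.30)
      print(len(centers), "spheres placed")  # ~573 spheres reach 30% here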

  6. Application of FORSS sensitivity and uncertainty methodology to fast reactor benchmark analysis

    Energy Technology Data Exchange (ETDEWEB)

    Weisbin, C.R.; Marable, J.H.; Lucius, J.L.; Oblow, E.M.; Mynatt, F.R.; Peelle, R.W.; Perey, F.G.

    1976-12-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions, and associated uncertainties. This paper presents the theory and code description as well as the first results of applying FORSS to fast reactor benchmarks. Specifically, for various assemblies and reactor performance parameters, the nuclear data sensitivities were computed by nuclide, reaction type, and energy. Comprehensive libraries of energy-dependent coefficients have been developed in a computer retrievable format and released for distribution by RSIC and NNCSC. Uncertainties induced by nuclear data were quantified using preliminary, energy-dependent relative covariance matrices evaluated with ENDF/B-IV expectation values and processed for 238U(n,f), 238U(n,γ), 239Pu(n,f), and 239Pu(ν). Nuclear data accuracy requirements to meet specified performance criteria at minimum experimental cost were determined.

  7. Application of FORSS sensitivity and uncertainty methodology to fast reactor benchmark analysis

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Marable, J.H.; Lucius, J.L.; Oblow, E.M.; Mynatt, F.R.; Peelle, R.W.; Perey, F.G.

    1976-12-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions, and associated uncertainties. This paper presents the theory and code description as well as the first results of applying FORSS to fast reactor benchmarks. Specifically, for various assemblies and reactor performance parameters, the nuclear data sensitivities were computed by nuclide, reaction type, and energy. Comprehensive libraries of energy-dependent coefficients have been developed in a computer retrievable format and released for distribution by RSIC and NNCSC. Uncertainties induced by nuclear data were quantified using preliminary, energy-dependent relative covariance matrices evaluated with ENDF/B-IV expectation values and processed for 238U(n,f), 238U(n,γ), 239Pu(n,f), and 239Pu(ν). Nuclear data accuracy requirements to meet specified performance criteria at minimum experimental cost were determined
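
    Both versions of this record describe folding energy-dependent sensitivities with covariance matrices to obtain data-induced uncertainties. The standard first-order relation behind this is the "sandwich rule", rel. var(R) = S^T C S; the sketch below illustrates it with made-up numbers (the sensitivity vector and covariance matrix are not from the paper).

      import numpy as np

      # First-order "sandwich rule": relative variance of response R is S^T C S,
      # where S holds relative sensitivities (percent change in R per percent
      # change in each group/reaction datum) and C is the relative covariance
      # matrix of the nuclear data. All numbers below are illustrative.
      S = np.array([0.30, -0.12, 0.05])
      C = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.02],
                    [0.00, 0.02, 0.16]])
      rel_var = S @ C @ S
      print("relative std. dev. of response: %.2f%%" % (100 * np.sqrt(rel_var)))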

  8. Benchmark reference data on post irradiation analysis of light water reactor fuel samples

    International Nuclear Information System (INIS)

    Guardini, S.; Guzzi, G.

    1983-01-01

    The structure of the present report is as follows: in section I the benchmark activity (BM) is described in detail; characteristics of the reactors and fuel assemblies examined are given, and the technical aspects of the chemical and analytical processes are discussed. In section II all the techniques used to certify the analytical data are presented, together with a discussion of the evaluated random and systematic uncertainties. A comparison with the calculated values and the interpretation with ICT (Isotopic Correlation Techniques) are also presented in this section. Section III presents the results. In practice, the complete sets of results referring to all JRC measurements are given here for the sake of the completeness and consistency of this final report

  9. Analysis of the NEACRP PWR rod ejection benchmark problems with DIF3D-K

    International Nuclear Information System (INIS)

    Kim, M.H.

    1994-01-01

    Analyses of the NEACRP PWR rod ejection transient benchmark problems with the DIF3D-K nodal kinetics code are presented. The DIF3D-K results are shown to be in generally good agreement with results obtained using other codes, in particular reference results previously generated with the PANTHER code. The sensitivity of the transient results to the DIF3D-K input parameters (such as time step size, radial and axial node sizes, and the mesh structure employed for the fuel pin heat conduction calculation) is evaluated and discussed. In addition, the potential for reducing computational effort by application of the improved quasistatic scheme (IQS) to these rod ejection transients, which involve very significant flux shape changes and thermal-hydraulic feedback, is evaluated

  10. Challenges and policy implications of gas reform in Italy and Ukraine: Evidence from a benchmarking analysis

    International Nuclear Information System (INIS)

    Goncharuk, Anatoliy G.; Storto, Corrado lo

    2017-01-01

    This paper presents a cross-country benchmarking study of natural gas distribution to final consumers and compares two samples of companies in Italy and Ukraine. A two-stage DEA procedure is implemented, calculating the efficiency of gas providers and identifying the critical context factors and policy issues that affect it. Both countries are low performing in terms of operators’ technical and scale efficiency and there is room to design more efficient market configurations. Some issues need attention to develop an effective gas market policy: a) the search for efficiency requires accurate investigation of its main drivers, which depend on context factors; b) while greater efficiency is necessary to reduce cost and increase service quality, at different stages of progress of the reform process other goals may be more important; c) the gas industry reform process should be planned adopting a systemic perspective, as its development does not remain confined to the sector, but implies changes in the whole country economy, particularly when the gas market is of primary relevance to the economy; d) a more comprehensive package of reforms may be necessary to make gas market reform successful; e) even though gas market reform is an economic process, it unavoidably has social and political implications. - Highlights: • Benchmarking of natural gas distribution industry in Italy and Ukraine is performed. • Average industry inefficiency is about 27% and decreasing returns to scale are dominant. • Gas industry reform process should be planned adopting a systemic perspective. • Gas reform needs a more comprehensive package of reforms and/or supporting legislation to be successful. • Gas industry reform has social and political implications.
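
    The record gives no algebra for its two-stage DEA procedure; as a hedged illustration of the first stage only, the sketch below solves the classical input-oriented CCR efficiency model in multiplier form with scipy. The toy data and the function name ccr_efficiency are assumptions for illustration, not the study's data.

      import numpy as np
      from scipy.optimize import linprog

      def ccr_efficiency(X, Y, o):
          """Input-oriented CCR efficiency of decision-making unit `o`.

          Multiplier form: max u.y_o  s.t.  v.x_o = 1,
          u.y_j - v.x_j <= 0 for all j, u >= 0, v >= 0.
          X is (n_dmu, n_inputs), Y is (n_dmu, n_outputs).
          """
          n, m = X.shape
          s = Y.shape[1]
          c = np.concatenate([-Y[o], np.zeros(m)])      # maximize u.y_o
          A_ub = np.hstack([Y, -X])                     # u.y_j - v.x_j <= 0
          A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]
          res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                        bounds=[(0, None)] * (s + m))
          return -res.fun

      # Toy data: 4 providers, inputs = (opex, network length), one output.
      X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])
      Y = np.array([[1.0], [1.0], [1.0], [1.0]])
      for o in range(len(X)):
          print("DMU %d: efficiency = %.3f" % (o, ccr_efficiency(X, Y, o)))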

  11. Benchmark for Neutronic Analysis of Sodium-cooled Fast Reactor Cores with Various Fuel Types and Core Sizes

    International Nuclear Information System (INIS)

    Stauff, N.E.; Kim, T.K.; Taiwo, T.A.; Buiron, L.; Rimpault, G.; Brun, E.; Lee, Y.K.; Pataki, I.; Kereszturi, A.; Tota, A.; Parisi, C.; Fridman, E.; Guilliard, N.; Kugo, T.; Sugino, K.; Uematsu, M.M.; Ponomarev, A.; Messaoudi, N.; Lin Tan, R.; Kozlowski, T.; Bernnat, W.; Blanchet, D.; Brun, E.; Buiron, L.; Fridman, E.; Guilliard, N.; Kereszturi, A.; Kim, T.K.; Kozlowski, T.; Kugo, T.; Lee, Y.K.; Lin Tan, R.; Messaoudi, N.; Parisi, C.; Pataki, I.; Ponomarev, A.; Rimpault, G.; Stauff, N.E.; Sugino, K.; Taiwo, T.A.; Tota, A.; Uematsu, M.M.; Monti, S.; Yamaji, A.; Nakahara, Y.; Gulliford, J.

    2016-01-01

    One of the foremost Generation IV International Forum (GIF) objectives is to design nuclear reactor cores that can passively avoid damage of the reactor when control rods fail to scram in response to postulated accident initiators (e.g. inadvertent reactivity insertion or loss of coolant flow). The analysis of such unprotected transients depends primarily on the physical properties of the fuel and the reactivity feedback coefficients of the core. Within the activities of the Working Party on Scientific Issues of Reactor Systems (WPRS), the Sodium Fast Reactor core Feed-back and Transient response (SFR-FT) Task Force was proposed to evaluate core performance characteristics of several Generation IV Sodium-cooled Fast Reactor (SFR) concepts. A set of four numerical benchmark cases was initially developed with different core sizes and fuel types in order to perform neutronic characterisation, evaluation of the feedback coefficients and transient calculations. Two 'large' SFR core designs were proposed by CEA: these generate 3 600 MW(th) and employ oxide and carbide fuel technologies. Two 'medium' SFR core designs proposed by ANL complete the set. These medium SFR cores generate 1 000 MW(th) and employ oxide and metallic fuel technologies. The present report summarises the results obtained by the WPRS for the neutronic characterisation benchmark exercise proposed. The benchmark definition is detailed in Chapter 2. Eleven institutions contributed to this benchmark: Argonne National Laboratory (ANL), Commissariat a l'energie atomique et aux energies alternatives (CEA of Cadarache), Commissariat a l'energie atomique et aux energies alternatives (CEA of Saclay), Centre for Energy Research (CER-EK), Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), Helmholtz Zentrum Dresden Rossendorf (HZDR), Institute of Nuclear Technology and Energy Systems (IKE), Japan Atomic Energy Agency (JAEA), Karlsruhe Institute of Technology (KIT

  12. Benchmarking the evaluated proton differential cross sections suitable for the EBS analysis of natSi and 16O

    Science.gov (United States)

    Kokkoris, M.; Dede, S.; Kantre, K.; Lagoyannis, A.; Ntemou, E.; Paneta, V.; Preketes-Sigalas, K.; Provatas, G.; Vlastou, R.; Bogdanović-Radović, I.; Siketić, Z.; Obajdin, N.

    2017-08-01

    The evaluated proton differential cross sections suitable for the Elastic Backscattering Spectroscopy (EBS) analysis of natSi and 16O, as obtained from SigmaCalc 2.0, have been benchmarked over a wide energy and angular range at two different accelerator laboratories, namely at N.C.S.R. 'Demokritos', Athens, Greece and at Ruđer Bošković Institute (RBI), Zagreb, Croatia, using a variety of high-purity thick targets of known stoichiometry. The results are presented in graphical and tabular forms, while the observed discrepancies, as well as the limits in accuracy of the benchmarking procedure and target-related effects, are thoroughly discussed and analysed. In the case of oxygen the agreement between simulated and experimental spectra was generally good, while for silicon serious discrepancies were observed above Ep,lab = 2.5 MeV, suggesting that further tuning of the appropriate nuclear model parameters in the evaluated differential cross-section datasets is required.

  13. Benchmarking of the computer code and the thirty foot side drop analysis for the Shippingport (RPV/NST package)

    International Nuclear Information System (INIS)

    Bumpus, S.E.; Gerhard, M.A.; Hovingh, J.; Trummer, D.J.; Witte, M.C.

    1989-01-01

    This paper presents the benchmarking of a finite element computer code and the subsequent results from the code simulating the 30 foot side drop impact of the RPV/NST transport package from the decommissioned Shippingport Nuclear Power Station. The activated reactor pressure vessel (RPV), thermal shield, and other reactor external components were encased in concrete contained by the neutron shield tank (NST) and a lifting skirt. The Shippingport RPV/NST package, a Type B Category II package, weighs approximately 900 tons and has a diameter of 17.5 ft and a length of 40.7 ft. For transport of the activated components from Shippingport to the burial site, the Safety Analysis Report for Packaging (SARP) demonstrated that the package can withstand the hypothetical accidents of DOE Order 5480.3, including 10 CFR 71. Mathematical simulations of these accidents can substitute for actual tests if the simulated results satisfy the acceptance criteria. Any such mathematical simulation, including the modeling of the materials, must be benchmarked against experiments that duplicate the loading conditions of the tests. Additional confidence in the simulations is justified if the test specimens are configured similarly to the package

  14. A NRC-BNL benchmark evaluation of seismic analysis methods for non-classically damped coupled systems

    International Nuclear Information System (INIS)

    Xu, J.; DeGrassi, G.; Chokshi, N.

    2004-01-01

    Under the auspices of the U.S. Nuclear Regulatory Commission (NRC), Brookhaven National Laboratory (BNL) developed a comprehensive program to evaluate state-of-the-art methods and computer programs for seismic analysis of typical coupled nuclear power plant (NPP) systems with non-classical damping. In this program, four benchmark models of coupled building-piping/equipment systems with different damping characteristics were developed and analyzed by BNL for a suite of earthquakes. The BNL analysis was carried out by the Wilson-θ time domain integration method with the system-damping matrix computed using a synthesis formulation as presented in a companion paper [Nucl. Eng. Des. (2002)]. These benchmark problems were subsequently distributed to and analyzed by program participants applying their uniquely developed methods and computer programs. This paper is intended to offer a glimpse at the program, and provide a summary of major findings and principal conclusions with some representative results. The participants' analysis results established using complex modal time history methods showed good comparison with the BNL solutions, while the analyses produced with either complex-mode response spectrum methods or the classical normal-mode response spectrum method, in general, produced more conservative results, when averaged over a suite of earthquakes. However, when coupling due to damping is significant, complex-mode response spectrum methods performed better than the classical normal-mode response spectrum method. Furthermore, as part of the program objectives, a parametric assessment is also presented in this paper, aimed at evaluating the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled NPP systems. It is believed that the findings and insights learned from this program will be useful in developing new acceptance criteria and providing guidance for future regulatory activities involving license
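
    For readers unfamiliar with non-classical damping: the damping matrix then cannot be diagonalized by the undamped mode shapes, so one works with complex modes of the first-order state-space form. The sketch below extracts damped frequencies and modal damping ratios this way; the two-DOF numbers are illustrative stand-ins, not the benchmark models of the record.

      import numpy as np

      def complex_modes(M, C, K):
          """Eigenvalues of M x'' + C x' + K x = 0 via the state-space form.

          With non-classical damping C is not diagonalized by the undamped
          mode shapes, so one solves z' = A z, z = [x, x']; eigenvalues come
          in conjugate pairs s with |s| = w and damping ratio -Re(s)/|s|.
          """
          n = M.shape[0]
          A = np.block([[np.zeros((n, n)), np.eye(n)],
                        [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
          return np.linalg.eig(A)

      # Illustrative 2-DOF system: light, lightly damped equipment on a
      # heavy, heavily damped primary structure (numbers are made up).
      M = np.diag([1000.0, 10.0])
      K = np.array([[5.0e6, -5.0e4], [-5.0e4, 5.0e4]])
      C = np.array([[2.0e4, -100.0], [-100.0, 100.0]])
      eigvals, _ = complex_modes(M, C, K)
      for s in eigvals[np.imag(eigvals) > 0]:
          print("f = %6.2f Hz, damping ratio = %.4f" % (abs(s) / (2 * np.pi),
                                                        -s.real / abs(s)))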

  15. Definition and Analysis of Heavy Water Reactor Benchmarks for Testing New Wims-D Libraries; Definicion y Analisis de Benchmarks de Reactores de Agua Pesada para Pruebas de Nuevas Bibliotecas de Datos Wims-D

    Energy Technology Data Exchange (ETDEWEB)

    Leszczynski, Francisco [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)

    2000-07-01

    This work is part of the IAEA-WIMS Library Update Project (WLUP). A group of heavy water reactor benchmarks has been selected for testing new WIMS-D libraries, including calculations with the WIMSD5B program and the analysis of results. These benchmarks cover a wide variety of reactors and conditions, from fresh fuels to high burnup, and from natural to enriched uranium. Besides, each benchmark includes variations in lattice pitch and in coolants (normally heavy water and void). Multiplication factors with critical experimental bucklings and other parameters are calculated and compared with experimental reference values. The WIMS libraries used for the calculations were generated with basic data from JEF-2.2 Rev.3 (JEF) and ENDF/B-VI Release 5 (E6). Results obtained with the WIMS-86 (W86) library, included with the WIMSD5B package, from Winfrith, UK, with adjusted data, are also included, to show the improvements obtained with the new, non-adjusted, libraries. The calculations with WIMSD5B were made with two methods (input program options): PIJ (two-dimensional collision probability method) and DSN (one-dimensional Sn method, with homogenization of materials by ring). The general conclusions are: the library based on JEF data and the DSN method give the best results, which on average are acceptable.

  16. Benchmarking of 3D space charge codes using direct phase space measurements from photoemission high voltage dc gun

    Directory of Open Access Journals (Sweden)

    Ivan V. Bazarov

    2008-10-01

    We present a comparison between space charge calculations and direct measurements of the transverse phase space of space charge dominated electron bunches from a high voltage dc photoemission gun followed by an emittance compensation solenoid magnet. The measurements were performed using a double-slit emittance measurement system over a range of bunch charge and solenoid current values. The data are compared with detailed simulations using the 3D space charge codes GPT and Parmela3D. The initial particle distributions were generated from measured transverse and temporal laser beam profiles at the photocathode. The beam brightness as a function of beam fraction is calculated for the measured phase space maps and found to approach within a factor of 2 the theoretical maximum set by the thermal energy and the accelerating field at the photocathode.
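
    The record reports phase-space maps measured with a double-slit system; the standard reduction of such data is the rms emittance, eps_rms = sqrt(<x^2><x'^2> - <x x'>^2). The sketch below computes it from weighted samples; the synthetic Gaussian beam is an assumption used only to check the formula, not the measured data.

      import numpy as np

      def rms_emittance(x, xp, weights=None):
          """Rms geometric emittance sqrt(<x^2><x'^2> - <x x'>^2), means removed.

          For a double-slit scan the weights would be the measured intensities
          of each (position, angle) sample.
          """
          w = np.ones_like(x) if weights is None else weights
          x0 = x - np.average(x, weights=w)
          xp0 = xp - np.average(xp, weights=w)
          x2 = np.average(x0 ** 2, weights=w)
          xp2 = np.average(xp0 ** 2, weights=w)
          xxp = np.average(x0 * xp0, weights=w)
          return float(np.sqrt(x2 * xp2 - xxp ** 2))

      # Synthetic check: correlated Gaussian beam with known ~1e-7 m rad emittance.
      rng = np.random.default_rng(0)
      x = rng.normal(0.0, 1e-3, 100_000)                 # 1 mm rms size
      xp = 0.5 * x + rng.normal(0.0, 1e-4, 100_000)      # correlated divergence
      print("emittance = %.3e m rad" % rms_emittance(x, xp))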

  17. Groundwater flow code verification 'benchmarking' activity (COVE-2A): Analysis of participants' work

    International Nuclear Information System (INIS)

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project

  18. Benchmark Report on Key Outage Attributes: An Analysis of Outage Improvement Opportunities and Priorities

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn St. [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Farris, Ronald [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2014-09-01

    The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve the management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied to outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.

  19. Phase analysis in gated blood pool tomography

    International Nuclear Information System (INIS)

    Nakajima, Kenichi; Bunko, Hisashi; Tada, Akira; Taki, Junichi; Nanbu, Ichiro

    1984-01-01

    Phase analysis of the gated blood pool study has been applied to detect the site of the accessory conduction pathway (ACP) in the Wolff-Parkinson-White (WPW) syndrome; however, phase analysis alone is limited in detecting the precise location of the ACP. In this study, we applied phase analysis to gated blood pool tomography using seven-pinhole tomography (7PT) and gated emission computed tomography (GECT) in 21 patients with WPW syndrome and 3 normal subjects. In 17 patients, the sites of the ACPs were confirmed by epicardial mapping and the result of the surgical division of the ACP. In 7PT, the site of the ACP agreed broadly with the abnormal initial phase in the phase image in 5 out of 6 patients with the left cardiac type. In GECT, phase images were generated in short-axial, vertical and horizontal long-axial sections. In 8 out of 9 patients, the site of the ACP was correctly identified by phase images, and in a patient who had two ACPs, the initial phase corresponded to one of the two locations. Phase analysis of gated blood pool tomography has the advantages of avoiding overlap of blood pools and of estimating the three-dimensional propagation of the contraction, and can be a good adjunctive method in patients with WPW syndrome. (author)
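
    Phase analysis, as used in this record, assigns each pixel the phase of the first Fourier harmonic of its gated time-activity curve, so that early-contracting regions (such as those pre-excited through an ACP) show an advanced phase. A minimal sketch, with a synthetic 16-frame curve standing in for real pixel data:

      import numpy as np

      def first_harmonic(counts):
          """Phase (rad) and amplitude of the first Fourier harmonic of a gated
          time-activity curve covering one cardiac cycle.

          Fitting A*cos(2*pi*t/T - phi): rfft bin 1 equals (A*N/2)*exp(-i*phi),
          so phi = -angle(X[1]) and A = 2*|X[1]|/N.
          """
          spectrum = np.fft.rfft(counts)
          phase = -float(np.angle(spectrum[1]))
          amplitude = 2.0 * abs(spectrum[1]) / len(counts)
          return phase, amplitude

      # Synthetic 16-frame curve whose contraction peaks 60 deg into the cycle.
      t = np.arange(16)
      curve = 100.0 + 20.0 * np.cos(2 * np.pi * t / 16 - np.pi / 3)
      phi, amp = first_harmonic(curve)
      print("phase = %.1f deg, amplitude = %.1f" % (np.degrees(phi), amp))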

  20. Cross section and method uncertainties: the application of sensitivity analysis to study their relationship in radiation transport benchmark problems

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Oblow, E.M.; Ching, J.; White, J.E.; Wright, R.Q.; Drischler, J.

    1975-08-01

    Sensitivity analysis is applied to the study of an air transport benchmark calculation to quantify and distinguish between cross-section and method uncertainties. The boundary detector response was converged with respect to spatial and angular mesh size, Pl expansion of the scattering kernel, and the number and location of energy grid boundaries. The uncertainty in the detector response due to uncertainties in nuclear data is 17.0 percent (one standard deviation, not including uncertainties in energy and angular distribution) based upon the ENDF/B-IV 'error files' including correlations in energy and reaction type. Differences of approximately 6 percent can be attributed exclusively to differences in processing multigroup transfer matrices. Formal documentation of the PUFF computer program for the generation of multigroup covariance matrices is presented. (47 figures, 14 tables) (U.S.)

  1. MC21 Monte Carlo analysis of the Hoogenboom-Martin full-core PWR benchmark problem - 301

    International Nuclear Information System (INIS)

    Kelly, D.J.; Sutton, Th.M.; Trumbull, T.H.; Dobreff, P.S.

    2010-01-01

    At the 2009 American Nuclear Society Mathematics and Computation conference, Hoogenboom and Martin proposed a full-core PWR model to monitor the improvement of Monte Carlo codes to compute detailed power density distributions. This paper describes the application of the MC21 Monte Carlo code to the analysis of this benchmark model. With the MC21 code, we obtained detailed power distributions over the entire core. The model consisted of 214 assemblies, each made up of a 17x17 array of pins. Each pin was subdivided into 100 axial nodes, thus resulting in over seven million tally regions. Various cases were run to assess the statistical convergence of the model. This included runs of 10 billion and 40 billion neutron histories, as well as ten independent runs of 4 billion neutron histories each. The 40 billion neutron-history calculation resulted in 43% of all regions having a 95% confidence level of 2% or less, implying a relative standard deviation of 1%. Furthermore, 99.7% of regions having a relative power density of 1.0 or greater have a similar confidence level. We present timing results that assess the MC21 performance relative to the number of tallies requested. Source convergence was monitored by analyzing plots of the Shannon entropy and eigenvalue versus active cycle. We also obtained an estimate of the dominance ratio. Additionally, we performed an analysis of the error in an attempt to ascertain the validity of the confidence intervals predicted by MC21. Finally, we look forward to the prospect of full-core 3-D Monte Carlo depletion by scoping out the required problem size. This study provides an initial data point for the Hoogenboom-Martin benchmark model using a state-of-the-art Monte Carlo code. (authors)
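
    The Shannon entropy mentioned above as the source-convergence diagnostic is H = -sum_i p_i log2 p_i, where p_i is the fraction of fission source sites in spatial bin i. A minimal sketch follows; the 64-bin Poisson data are illustrative, not MC21 output.

      import numpy as np

      def shannon_entropy(source_counts):
          """Shannon entropy H = -sum p_i log2 p_i of a binned fission source.

          Plotted against cycle number, H flattens out once the source has
          converged, signalling that active (tallying) cycles may begin.
          """
          p = source_counts / source_counts.sum()
          p = p[p > 0.0]                    # convention: 0*log(0) = 0
          return float(-(p * np.log2(p)).sum())

      # Toy example: fission sites binned on a coarse 64-bin spatial mesh.
      rng = np.random.default_rng(2)
      for cycle in range(5):
          counts = rng.poisson(lam=1000.0, size=64).astype(float)
          print("cycle %d: H = %.4f" % (cycle, shannon_entropy(counts)))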

  2. Computing sextic centrifugal distortion constants by DFT: A benchmark analysis on halogenated compounds

    Science.gov (United States)

    Pietropolli Charmet, Andrea; Stoppa, Paolo; Tasinato, Nicola; Giorgianni, Santi

    2017-05-01

    This work presents a benchmark study on the calculation of the sextic centrifugal distortion constants employing cubic force fields computed by means of density functional theory (DFT). For a set of semi-rigid halogenated organic compounds several functionals (B2PLYP, B3LYP, B3PW91, M06, M06-2X, O3LYP, X3LYP, ωB97XD, CAM-B3LYP, LC-ωPBE, PBE0, B97-1 and B97-D) were used for computing the sextic centrifugal distortion constants. The effects related to the size of the basis sets and the performance of hybrid approaches, where the harmonic data obtained at a higher level of electronic correlation are coupled with cubic force constants yielded by DFT functionals, are presented and discussed. The predicted values were compared to both the available data published in the literature and those obtained by calculations carried out at increasing levels of electronic correlation: Hartree-Fock Self Consistent Field (HF-SCF), second-order Møller-Plesset perturbation theory (MP2), and the coupled-cluster single and double (CCSD) level of theory. Different hybrid approaches, having the cubic force field computed at the DFT level of theory coupled to harmonic data computed at increasing levels of electronic correlation (up to the CCSD level of theory augmented by a perturbative estimate of the effects of connected triple excitations, CCSD(T)), were considered. The obtained results demonstrate that they can represent reliable and computationally affordable methods to predict sextic centrifugal terms with an accuracy almost comparable to that yielded by the more expensive anharmonic force fields fully computed at the MP2 and CCSD levels of theory. In view of their reduced computational cost, these hybrid approaches pave the way to the study of more complex systems.

  3. Benchmarking the Netherlands. Benchmarking for growth

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity growth. Throughout

  4. Benchmarking the Netherlands. Benchmarking for growth

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity

  5. Multilaboratory particle image velocimetry analysis of the FDA benchmark nozzle model to support validation of computational fluid dynamics simulations.

    Science.gov (United States)

    Hariharan, Prasanna; Giarra, Matthew; Reddy, Varun; Day, Steven W; Manning, Keefe B; Deutsch, Steven; Stewart, Sandy F C; Myers, Matthew R; Berman, Michael R; Burgreen, Greg W; Paterson, Eric G; Malinauskas, Richard A

    2011-04-01

    This study is part of an FDA-sponsored project to evaluate the use and limitations of computational fluid dynamics (CFD) in assessing blood flow parameters related to medical device safety. In an interlaboratory study, fluid velocities and pressures were measured in a nozzle model to provide experimental validation for a companion round-robin CFD study. The simple benchmark nozzle model, which mimicked the flow fields in several medical devices, consisted of a gradual flow constriction, a narrow throat region, and a sudden expansion region where a fluid jet exited the center of the nozzle with recirculation zones near the model walls. Measurements of mean velocity and turbulent flow quantities were made in the benchmark device at three independent laboratories using particle image velocimetry (PIV). Flow measurements were performed over a range of nozzle throat Reynolds numbers (Re(throat)) from 500 to 6500, covering the laminar, transitional, and turbulent flow regimes. A standard operating procedure was developed for performing experiments under controlled temperature and flow conditions and for minimizing systematic errors during PIV image acquisition and processing. For laminar (Re(throat)=500) and turbulent flow conditions (Re(throat)≥3500), the velocities measured by the three laboratories were similar with an interlaboratory uncertainty of ∼10% at most of the locations. However, for the transitional flow case (Re(throat)=2000), the uncertainty in the size and the velocity of the jet at the nozzle exit increased to ∼60% and was very sensitive to the flow conditions. An error analysis showed that by minimizing the variability in the experimental parameters such as flow rate and fluid viscosity to less than 5% and by matching the inlet turbulence level between the laboratories, the uncertainties in the velocities of the transitional flow case could be reduced to ∼15%. The experimental procedure and flow results from this interlaboratory study (available

  6. Social Analysis Systems (SAS2) - Phase III

    International Development Research Centre (IDRC) Digital Library (Canada)

    Scaling Up the International Impact of Action Research : Social Analysis ... up the international impact of action research : SAS phase 3; final technical report ... 000 Canadians abroad to work at the local level on various development issues.

  7. Topological analysis of nuclear pasta phases

    Science.gov (United States)

    Kycia, Radosław A.; Kubis, Sebastian; Wójcik, Włodzimierz

    2017-08-01

    This article presents an analysis of the results of numerical simulations of nuclear pasta phases using methods of algebraic topology. These considerations suggest that some phases can be further split into subphases and should therefore be more refined in numerical simulations. The results presented in this article can also be used to relate the Euler characteristic from numerical simulations to the geometry of the phases. The Betti numbers are used as they provide a finer characterization of the phases. It is also shown that different boundary conditions give different outcomes.
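
    The Euler characteristic referred to above can be computed directly from voxelized simulation output as the alternating sum chi = V - E + F - C over the cells of the cubical complex; Betti numbers need more machinery (e.g. persistent-homology libraries), so the sketch below, with an illustrative solid block rather than simulation data, covers only chi.

      import numpy as np
      from itertools import product

      def euler_characteristic(voxels):
          """Euler characteristic chi = V - E + F - C of the cubical complex
          built from a 3D boolean array (True = matter).

          chi = 1 for a contractible blob, 0 for a torus-like tube, 2 for a
          hollow shell, so it distinguishes basic pasta-like shapes.
          """
          verts, edges, faces = set(), set(), set()
          cubes = 0
          for i, j, k in zip(*np.nonzero(voxels)):
              cubes += 1
              base = (int(i), int(j), int(k))
              corners = list(product((base[0], base[0] + 1),
                                     (base[1], base[1] + 1),
                                     (base[2], base[2] + 1)))
              verts.update(corners)
              for a, b in product(corners, corners):   # unit-length edges
                  if a < b and sum(abs(p - q) for p, q in zip(a, b)) == 1:
                      edges.add((a, b))
              for axis, off in product(range(3), (0, 1)):  # the six faces
                  face = tuple(sorted(c for c in corners
                                      if c[axis] == base[axis] + off))
                  faces.add(face)
          return len(verts) - len(edges) + len(faces) - cubes

      solid = np.ones((3, 3, 3), dtype=bool)
      print(euler_characteristic(solid))   # 1: a solid block is contractible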

  8. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...

  9. OECD/DOE/CEA VVER-1000 Coolant Transient Benchmark. Summary Record of the First Workshop (V1000-CT1)

    International Nuclear Information System (INIS)

    2003-01-01

    The first workshop for the VVER-1000 Coolant Transient (V1000CT) Benchmark was hosted by the Commissariat a l'Energie Atomique, Centre d'Etudes de Saclay, France. The V1000CT benchmark defines standard problems for the validation of coupled three-dimensional (3-D) neutron-kinetics/system thermal-hydraulics codes for application to Soviet-designed VVER-1000 reactors, using actual plant data without any scaling. The overall objective is to assess computer codes used in the safety analysis of VVER power plants, specifically for their use in reactivity transient simulations in a VVER-1000. The V1000CT benchmark consists of two phases: V1000CT-1 - simulation of the switching on of one main coolant pump (MCP) while the other three MCPs are in operation, and V1000CT-2 - calculation of coolant mixing tests and a Main Steam Line Break (MSLB) scenario. Further background information on this benchmark can be found at the OECD/NEA benchmark web site. The purpose of the first workshop was to review the benchmark activities after the Starter Meeting held last year in Dresden, Germany: to discuss the participants' feedback and modifications introduced in the Benchmark Specifications on Phase 1; to present and to discuss modelling issues and preliminary results from the three exercises of Phase 1; to discuss the modelling issues of Exercise 1 of Phase 2; and to define the work plan and schedule in order to complete the two phases

  10. Comparative analysis of results between CASMO, MCNP and Serpent for a suite of Benchmark problems on BWR reactors; Analisis comparativo de resultados entre CASMO, MCNP y SERPENT para una suite de problemas Benchmark en reactores BWR

    Energy Technology Data Exchange (ETDEWEB)

    Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Reyes F, M. del C.; Del Valle G, E., E-mail: vicente.xolocostli@inin.gob.mx [IPN, Escuela Superior de Fisica y Matematicas, UP - Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)

    2014-10-15

    This paper compares the CASMO-4, MCNP6 and Serpent codes on the suite of benchmark problems for BWR reactors. The benchmark consists of two different geometries: a fuel pin cell and a BWR-type fuel assembly. To facilitate the study of reactor physics in the fuel pin, its nuclear characteristics are given in detail, such as the burnup dependence, the reactivity of selected nuclides, etc. For the fuel assembly, the presented results concern the infinite multiplication factor at different burnup steps and different void conditions. The analysis of this set of benchmark problems provides comprehensive test problems for the next fuel generation of BWR reactors with extended burnup. It is important to note that the purpose of this comparison is to validate the modeling methodologies for different operating conditions, so that, for another BWR assembly, the results will lie within a range of some uncertainty that does not depend on the code used. The Escuela Superior de Fisica y Matematicas of the Instituto Politecnico Nacional (IPN, Mexico) has accumulated some experience in using Serpent, due to the potential of this code compared with other commercial codes such as CASMO and MCNP. The results obtained for the infinite multiplication factor are encouraging and motivate continuing these studies with the generation of the cross sections (XS) of a core; in a next step the corresponding nuclear data library will be constructed for use by the codes developed as part of the project for the Mexican nuclear reactor analysis platform AZTLAN. (Author)

  11. CEC thermal-hydraulic benchmark exercise on Fiploc verification experiment F2 in Battelle model containment long-term heat-up phase. Results for phase I

    International Nuclear Information System (INIS)

    Fischer, K.; Schall, M.; Wolf, L.

    1991-01-01

    The major objective of the F2 experiment was to investigate the long-term thermal-hydraulic phenomena, with special emphasis on natural convection phenomena in a loop-type geometry affected by variations of steam and air injections at different locations as well as dry energy supply into various compartments. The open post-test exercise is being performed in two consecutive phases, with Phase I covering the initial long-term heat-up phase. The exercise received widespread international attention, with nine organizations from six European countries participating with seven different computer codes (FUMO, Jericho2, Fiploc, Wavco, Contain, Melcor, Cobra/Fathoms). These codes cover a broad spectrum of presently known European computational tools in severe accident containment analyses. The participants used either the specified mass flow or pressure control boundary conditions. Some exercised their codes for both. In total, 14 different computations were officially provided by the participants, indicating strong interest and cooperative effort by various institutions

  12. 2D Thermal Hydraulic Analysis and Benchmark in Support of HFIR LEU Conversion using COMSOL

    Energy Technology Data Exchange (ETDEWEB)

    Freels, James D [ORNL; Bodey, Isaac T [ORNL; Lowe, Kirk T [ORNL; Arimilli, Rao V [ORNL

    2010-09-01

    The research documented herein was funded by a research contract between the Research Reactors Division (RRD) of Oak Ridge National Laboratory (ORNL) and the University of Tennessee, Knoxville (UTK) Mechanical, Aerospace and Biomedical Engineering Department (MABE). The research was governed by a statement of work (SOW) which clearly defines nine specific tasks. This report is outlined to follow and document the results of each of these nine specific tasks. The primary goal of this phase of the research is to demonstrate, through verification and validation methods, that COMSOL is a viable simulation tool for thermal-hydraulic modeling of the High Flux Isotope Reactor (HFIR) core. A secondary goal of this two-dimensional phase of the research is to establish methodology and data base libraries that are also needed in the full three-dimensional COMSOL simulation to follow. COMSOL version 3.5a was used for all of the models presented throughout this report.

  13. On the importance of adjusting for distorting factors in benchmarking analysis, as illustrated by a cost comparison of the different forms of implementation of the EU Packaging Directive.

    Science.gov (United States)

    Baum, Heinz-Georg; Schuch, Dieter

    2017-12-01

    Benchmarking is a proven and widely used business tool for identifying best practice. To produce robust results, the objects of comparison used in benchmarking analysis need to be structurally comparable and distorting factors need to be eliminated. We focus on a specific example - a benchmark study commissioned by the European Commission's Directorate-General for Environment on the implementation of Extended Producer Responsibility (EPR) for packaging at the national level - to discuss potential distorting factors and take them into account in the calculation. The cost of compliance per inhabitant and year, which is used as the key cost efficiency indicator in the study, is adjusted to take account of seven factors. The results clearly show that differences in performance may play a role, but the (legal) implementation of EPR - which is highly heterogeneous across countries - is the single most important cost determinant and must be taken into account to avoid misinterpretation and false conclusions.

  14. A Global Vision over Benchmarking Process: Benchmarking Based Enterprises

    OpenAIRE

    Sitnikov, Catalina; Giurca Vasilescu, Laura

    2008-01-01

    Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from the analysis of performance and underlining the strengths and weaknesses of the enterprise, it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...

  15. A BENCHMARKING ANALYSIS FOR FIVE RADIONUCLIDE VADOSE ZONE MODELS (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, AND CHAIN 2D) IN SOIL SCREENING LEVEL CALCULATIONS

    Science.gov (United States)

    Five radionuclide vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) rele...
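
    Of the five codes named above, CHAIN and CHAIN 2D are built around decay-chain transport, whose zero-transport core is the Bateman system dN/dt = A N. A minimal sketch solving it with a matrix exponential follows; the three-member chain and rate constants are illustrative, not from the study.

      import numpy as np
      from scipy.linalg import expm

      def decay_chain(n0, lambdas, t):
          """Inventories of a linear decay chain N1 -> N2 -> ... at time t.

          dN/dt = A N with A lower-bidiagonal (losses on the diagonal, gains
          from the parent on the subdiagonal); expm(A t) n0 is the Bateman
          solution and avoids the singularities of the closed form when two
          decay constants are nearly equal.
          """
          A = np.diag(-lambdas) + np.diag(lambdas[:-1], k=-1)
          return expm(A * t) @ n0

      # Illustrative three-member chain; the last nuclide is stable.
      lambdas = np.array([1.0e-2, 5.0e-3, 0.0])     # decay constants, 1/s
      n0 = np.array([1.0e6, 0.0, 0.0])              # all atoms in the parent
      for t in (0.0, 100.0, 1000.0):
          print(t, decay_chain(n0, lambdas, t))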

  16. Benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X

    International Nuclear Information System (INIS)

    Aures, A.; Bostelmann, F.; Hursin, M.; Leray, O.

    2017-01-01

    Highlights: • Application of the uncertainty analysis methods XSUSA and SHARK-X. • Propagation of nuclear data uncertainty through PWR pin cell depletion calculation. • Uncertainty quantification of eigenvalue, nuclide densities and Doppler coefficient. • Top contributors to overall output uncertainty identified by sensitivity analysis. • Comparison with SAMPLER and TSUNAMI of the SCALE code package. - Abstract: This study presents collaborative work performed between GRS and PSI on benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X. Applied to a PWR pin cell depletion calculation, both methods propagate input uncertainty from nuclear data to output uncertainty. The uncertainties of the multiplication factors, nuclide densities, and fuel temperature coefficients derived by both methods are compared at various burnup steps. Comparisons of these quantities are furthermore performed with the SAMPLER module of SCALE 6.2. The perturbation-theory-based TSUNAMI module of both SCALE 6.1 and SCALE 6.2 is additionally applied for comparisons of the reactivity coefficient.
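
    XSUSA and SAMPLER, as described above, are random-sampling methods: correlated perturbations are drawn from the nuclear-data covariances, the calculation is repeated for each sample, and statistics of the outputs are collected. The sketch below mimics that loop with a throwaway analytic 'model' standing in for the transport/depletion calculation; every number and name here is an illustrative assumption.

      import numpy as np

      def sampled_uncertainty(model, mean, cov, n_samples=500, seed=3):
          """Random-sampling uncertainty propagation: draw correlated input
          perturbations from the data covariance, rerun the model for each
          sample, and take statistics of the outputs.
          """
          rng = np.random.default_rng(seed)
          samples = rng.multivariate_normal(mean, cov, size=n_samples)
          outputs = np.array([model(x) for x in samples])
          return outputs.mean(), outputs.std(ddof=1)

      # Toy stand-in for a transport/depletion run: a response depending on
      # two correlated data parameters.
      mean = np.array([1.0, 2.0])
      cov = np.array([[0.02 ** 2, 0.5 * 0.02 * 0.05],
                      [0.5 * 0.02 * 0.05, 0.05 ** 2]])
      m, s = sampled_uncertainty(lambda x: x[0] * np.exp(-x[1] / 10.0), mean, cov)
      print("response = %.4f +/- %.4f" % (m, s))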

  17. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  18. RETRAN-3D analysis of the base case and the four extreme cases of the OECD/NRC Peach Bottom 2 Turbine Trip benchmark

    International Nuclear Information System (INIS)

    Barten, Werner; Coddington, Paul; Ferroukhi, Hakim

    2006-01-01

    This paper presents the results of RETRAN-3D calculations of the base case and the four extreme cases of phase 3 of the Peach Bottom 2 OECD/NRC Turbine Trip benchmark for coupled thermal-hydraulic and neutronic codes. The PSI-RETRAN-3D model gives good agreement with the measured data of the base case. In addition to the base case, the analysis of the extreme cases provides a further understanding of the reactor behaviour, which is the result of the dynamic coupling of the whole system, i.e., the interaction between the steam line and vessel flows, the pressure, the Doppler, void and control reactivity, and the power. For the extreme cases without scram, the bank of safety relief valves is able to mitigate the effects of the turbine trip for short times. The 3-D nature of the core power distribution has been investigated by analysing the power density of the different thermal-hydraulic channels. In all cases, prior to the reactor scram, the course of the power is similar in all the channels, with differences of the order of a few percent, showing that, by and large, the core acts in a coherent manner. At the time of maximum power, the axial power distribution in the different channels is increased at the core centre with respect to the distribution at time zero, by an amount which is different for the different channels

  19. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, following Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  20. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shyu, S. S.

    2010-10-01

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, in accordance with Branch Technical Position 7-14. In this work, 37 safety grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  1. Benchmarking and Performance Measurement.

    Science.gov (United States)

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  2. Benchmarking Data Analysis and Machine Learning Applications on the Intel KNL Many-Core Processor

    OpenAIRE

    Byun, Chansup; Kepner, Jeremy; Arcand, William; Bestor, David; Bergeron, Bill; Gadepally, Vijay; Houle, Michael; Hubbell, Matthew; Jones, Michael; Klein, Anna; Michaleas, Peter; Milechin, Lauren; Mullen, Julie; Prout, Andrew; Rosa, Antonio

    2017-01-01

    Knights Landing (KNL) is the code name for the second-generation Intel Xeon Phi product family. KNL has generated significant interest in the data analysis and machine learning communities because its new many-core architecture targets both of these workloads. The KNL many-core vector processor design enables it to exploit much higher levels of parallelism. At the Lincoln Laboratory Supercomputing Center (LLSC), the majority of users are running data analysis applications such as MATLAB and O...

  3. Efficiency Analysis of European Freight Villages-Three Peers for Benchmarking

    OpenAIRE

    Yang, Congcong; Taudes, Alfred; Dong, Guozhi

    2015-01-01

    Measuring the performance of Freight Villages (FVs) has important implications for logistics companies and other related companies as well as governments. In this paper we apply Data Envelopment Analysis (DEA) to measure the performance of European FVs in a purely data-driven way incorporating the nature of FVs as complex operations that use multiple inputs and produce several outputs. We employ several DEA models and perform a complete sensitivity analysis of the appropriateness of the chose...

  4. Benchmark calculation of SCALE-PC 4.3 CSAS6 module and burnup credit criticality analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Hee Sung; Ro, Seong Gy; Shin, Young Joon; Kim, Ik Soo [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-12-01

    Calculation biases of the SCALE-PC CSAS6 module for PWR spent fuel, metallized spent fuel and solutions of nuclear materials have been determined on the basis of the benchmark to be 0.01100, 0.02650 and 0.00997, respectively. With the aid of the code system, a nuclear criticality safety analysis for the spent fuel storage pool has been carried out to determine the minimum burnup of spent fuel required for safe storage. The criticality safety analysis is performed using three types of isotopic composition of spent fuel: ORIGEN2-calculated isotopic compositions; the conservative inventory obtained from the multiplication of ORIGEN2-calculated isotopic compositions by isotopic correction factors; and the conservative inventory of only U, Pu and {sup 241}Am. The results show that the minimum burnups for the three cases are 990, 6190 and 7270 MWd/tU, respectively, for spent fuel with an initial enrichment of 5.0 wt%. (author). 74 refs., 68 figs., 35 tabs.

  5. Benchmarking the performance of fixed-image receptor digital radiographic systems part 1: a novel method for image quality analysis.

    Science.gov (United States)

    Lee, Kam L; Ireland, Timothy A; Bernardo, Michael

    2016-06-01

    This is the first part of a two-part study benchmarking the performance of fixed digital radiographic general X-ray systems. This paper concentrates on reporting findings related to the quantitative analysis techniques used to establish comparative image quality metrics. A systematic technical comparison of the evaluated systems is presented in part two of this study. A novel quantitative image quality analysis method is presented, with technical considerations addressed for peer review. The novel method was applied to seven general radiographic systems with four different makes of radiographic image receptor (12 image receptors in total). For the System Modulation Transfer Function (sMTF), the use of a grid was found to reduce veiling glare and decrease roll-off. The major contributor to sMTF degradation was found to be focal spot blurring. For the System Normalised Noise Power Spectrum (sNNPS), it was found that all systems examined had similar sNNPS responses. A mathematical model is presented to explain how the use of a stationary grid may cause a difference between horizontal and vertical sNNPS responses.
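
    In its simplest form, the MTF measurement reduces to Fourier-transforming a measured line spread function. A minimal sketch of that step, assuming the LSF has already been extracted from an edge or slit image:

```python
import numpy as np

def mtf_from_lsf(lsf, pixel_pitch_mm):
    """Presampled MTF as the magnitude of the FFT of a measured line
    spread function, normalized to its zero-frequency value."""
    lsf = np.asarray(lsf, dtype=float)
    lsf = lsf / lsf.sum()                      # normalize area to 1
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                              # MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=pixel_pitch_mm)  # cycles/mm
    return freqs, mtf
```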

  6. Phase Image Analysis in Conduction Disturbance Patients

    International Nuclear Information System (INIS)

    Kwark, Byeng Su; Choi, Si Wan; Kang, Seung Sik; Park, Ki Nam; Lee, Kang Wook; Jeon, Eun Seok; Park, Chong Hun

    1994-01-01

    It is known that the normal His-Purkinje system provides for nearly synchronous activation of the right (RV) and left (LV) ventricles. When His-Purkinje conduction is abnormal, the resulting sequence of ventricular contraction must be correspondingly abnormal. These abnormal mechanical consequences have been difficult to demonstrate because of the complexity and rapidity of the events involved. To determine the relationship between phase changes and abnormalities of ventricular conduction, we performed phase image analysis of Tc-RBC gated blood pool scintigrams in patients with intraventricular conduction disturbances (24 complete left bundle branch block (C-LBBB), 15 complete right bundle branch block (C-RBBB), 13 Wolff-Parkinson-White syndrome (WPW), 10 controls). The results were as follows: 1) The ejection fraction (EF), peak ejection rate (PER), and peak filling rate (PFR) of the LV in gated blood pool scintigraphy (GBPS) were significantly lower in patients with C-LBBB than in controls (44.4 ± 13.9% vs 69.9 ± 4.2%, 2.48 ± 0.98 vs 3.51 ± 0.62, 1.76 ± 0.71 vs 3.38 ± 0.92, respectively, p<0.05). 2) In the phase angle analysis of the LV, the standard deviation (SD), the full width at half maximum of the phase angle histogram (FWHM), and the range of phase angles were significantly increased in patients with C-LBBB compared with controls (20.6 ± 18.1 vs 8.6 ± 1.8, 22.5 ± 9.2 vs 16.0 ± 3.9, 95.7 ± 31.7 vs 51.3 ± 5.4, respectively, p<0.05). 3) There was no significant difference in EF, PER, or PFR between patients with the Wolff-Parkinson-White syndrome and controls. 4) The standard deviation and range of phase angles were significantly higher in patients with WPW syndrome than in controls (10.6 ± 2.6 vs 8.6 ± 1.8, p<0.05; 69.8 ± 11.7 vs 51.3 ± 5.4, p<0.001, respectively); however, there was no difference between the two groups in the full width at half maximum. 5) Phase image analysis revealed a relatively uniform phase across both ventricles in patients with normal conduction, but a markedly delayed phase in the left ventricle

  7. Phase Image Analysis in Conduction Disturbance Patients

    Energy Technology Data Exchange (ETDEWEB)

    Kwark, Byeng Su; Choi, Si Wan; Kang, Seung Sik; Park, Ki Nam; Lee, Kang Wook; Jeon, Eun Seok; Park, Chong Hun [Chung Nam University Hospital, Daejeon (Korea, Republic of)

    1994-03-15

    It is known that the normal His-Purkinje system provides for nearly synchronous activation of the right (RV) and left (LV) ventricles. When His-Purkinje conduction is abnormal, the resulting sequence of ventricular contraction must be correspondingly abnormal. These abnormal mechanical consequences have been difficult to demonstrate because of the complexity and rapidity of the events involved. To determine the relationship between phase changes and abnormalities of ventricular conduction, we performed phase image analysis of Tc-RBC gated blood pool scintigrams in patients with intraventricular conduction disturbances (24 complete left bundle branch block (C-LBBB), 15 complete right bundle branch block (C-RBBB), 13 Wolff-Parkinson-White syndrome (WPW), 10 controls). The results were as follows: 1) The ejection fraction (EF), peak ejection rate (PER), and peak filling rate (PFR) of the LV in gated blood pool scintigraphy (GBPS) were significantly lower in patients with C-LBBB than in controls (44.4 ± 13.9% vs 69.9 ± 4.2%, 2.48 ± 0.98 vs 3.51 ± 0.62, 1.76 ± 0.71 vs 3.38 ± 0.92, respectively, p<0.05). 2) In the phase angle analysis of the LV, the standard deviation (SD), the full width at half maximum of the phase angle histogram (FWHM), and the range of phase angles were significantly increased in patients with C-LBBB compared with controls (20.6 ± 18.1 vs 8.6 ± 1.8, 22.5 ± 9.2 vs 16.0 ± 3.9, 95.7 ± 31.7 vs 51.3 ± 5.4, respectively, p<0.05). 3) There was no significant difference in EF, PER, or PFR between patients with the Wolff-Parkinson-White syndrome and controls. 4) The standard deviation and range of phase angles were significantly higher in patients with WPW syndrome than in controls (10.6 ± 2.6 vs 8.6 ± 1.8, p<0.05; 69.8 ± 11.7 vs 51.3 ± 5.4, p<0.001, respectively); however, there was no difference between the two groups in the full width at half maximum. 5) Phase image analysis revealed a relatively uniform phase across both ventricles in patients with normal conduction, but a markedly delayed phase in the left ventricle
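
    The phase image itself comes from fitting the first Fourier harmonic to each pixel's time-activity curve over the cardiac cycle; the SD, FWHM, and range quoted above are then statistics of the left-ventricular phase histogram. A minimal sketch of the per-pixel computation, assuming the gated frames are available as a NumPy array:

```python
import numpy as np

def phase_image(frames):
    """First-harmonic phase analysis of a gated blood pool study.
    `frames` is a (T, H, W) array of counts over one cardiac cycle;
    returns per-pixel phase (degrees) and amplitude of the first
    Fourier harmonic of each pixel's time-activity curve."""
    f = np.fft.fft(frames, axis=0)
    h1 = f[1]                                   # first harmonic
    phase_deg = np.degrees(np.angle(h1)) % 360.0
    amplitude = np.abs(h1) * 2.0 / frames.shape[0]
    return phase_deg, amplitude
```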

  8. Benchmarking in the Netherlands

    International Nuclear Information System (INIS)

    1999-01-01

    In two articles, an overview is given of the activities in the Dutch industry and energy sector with respect to benchmarking. In benchmarking, the operational processes of competing businesses are compared in order to improve one's own performance. Benchmark covenants for energy efficiency between the Dutch government and industrial sectors have contributed to growth in the number of benchmark surveys in the energy intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies

  9. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requires licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle, in accordance with Branch Technical Position (BTP) 7-14. In this work, 37 safety grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety grade network and point-to-point tests were performed. The FMEA showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  10. Results of the event sequence reliability benchmark exercise

    International Nuclear Information System (INIS)

    Silvestri, E.

    1990-01-01

    The Event Sequence Reliability Benchmark Exercise is the fourth in a series of benchmark exercises on reliability and risk assessment, with specific reference to nuclear power plant applications, and is the logical continuation of the previous benchmark exercises on System Analysis, Common Cause Failure and Human Factors. The reference plant is the Grohnde Nuclear Power Plant in the Federal Republic of Germany, a 1300 MW PWR plant of KWU design. The specific objective of the Exercise is to model, quantify and analyze the event sequences, initiated by the occurrence of a loss of offsite power, that involve the steam generator feed. The general aim is to develop a segment of a risk assessment which includes all the specific aspects and models of quantification developed in the previous reliability benchmark exercises, such as common cause failure, human factors and system analysis, with the addition of the specific topics of dependences between homologous components belonging to different systems featuring in a given event sequence, and of uncertainty quantification, so as to end up with an overall assessment of the state of the art in risk assessment and of the relative influence of quantification problems in a general risk assessment framework. The Exercise has been carried out in two phases, both requiring modelling and quantification, with the second phase adopting more restrictive rules and fixing certain common data, as emerged necessary from the first phase. Fourteen teams have participated in the Exercise, mostly from EEC countries, with one from Sweden and one from the USA. (author)
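
    Quantitatively, each event sequence frequency is the initiating event frequency multiplied by the conditional branch probabilities along its path through the event tree. A toy illustration with made-up numbers (not benchmark data):

```python
# Toy event sequence quantification: sequence frequency = initiating
# event frequency times the conditional branch probabilities on the path.
loss_of_offsite_power = 5e-2        # initiating events per reactor-year (assumed)
p_emergency_power_fails = 3e-3      # conditional branch probability (assumed)
p_sg_feed_fails = 2e-2              # conditional branch probability (assumed)

seq_freq = loss_of_offsite_power * p_emergency_power_fails * p_sg_feed_fails
print(f"sequence frequency: {seq_freq:.1e} per reactor-year")  # 3.0e-06
```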

  11. Analysis of mineral phases in coal utilizing factor analysis

    International Nuclear Information System (INIS)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    The mineral phase inclusions of coal are discussed. Their contributions to a coal sample are determined using several techniques. Neutron activation analysis in conjunction with coal washability studies has produced some information on the general trends of elemental variation in the mineral phases. These results have been enhanced by the use of various statistical techniques. Target transformation factor analysis (TTFA) is specifically discussed and shown to be able to produce elemental profiles of the mineral phases in coal. A data set consisting of physically fractionated coal samples was generated. These samples were analyzed by neutron activation analysis and their elemental concentrations were then examined using TTFA. Information concerning the mineral phases in coal can thus be acquired from factor analysis even with limited data. Additional data may permit the resolution of additional mineral phases as well as refinement of those already identified
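
    The factor-analysis step works on a samples-by-elements concentration matrix: abstract factors are extracted first and then rotated onto physically meaningful mineral profiles. A minimal sketch of the extraction step only, using SVD (the target transformation itself is omitted):

```python
import numpy as np

def factor_profiles(X, n_factors):
    """Extract `n_factors` abstract factors from a samples x elements
    concentration matrix via SVD. In TTFA, a subsequent target
    transformation would rotate these abstract factors onto candidate
    mineral-phase elemental profiles."""
    Xc = X - X.mean(axis=0)                       # column-center
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_factors].T * s[:n_factors]   # elements x factors
    scores = U[:, :n_factors]                     # samples x factors
    return scores, loadings
```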

  12. Meta-analysis for quantitative microbiological risk assessments and benchmarking data

    NARCIS (Netherlands)

    Besten, den H.M.W.; Zwietering, M.H.

    2012-01-01

    Meta-analysis studies are increasingly being conducted in the food microbiology area to quantitatively integrate the findings of many individual studies on specific questions or kinetic parameters of interest. Meta-analyses provide global estimates of parameters and quantify their variabilities, and

  13. Review of licensed staff training plans: benchmarking on task analysis and selection of learning environments

    International Nuclear Information System (INIS)

    Iglesias Moran, J.

    2013-01-01

    The purpose of this paper is to present the findings and the possible improvement actions identified after a technical exchange with the U.S. Surry nuclear power plant. The visit focused on the study of the methodology for the analysis and design of training programs according to INPO standards.

  14. FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark

    International Nuclear Information System (INIS)

    Sawan, M.E.

    1994-12-01

    During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)

  15. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    Science.gov (United States)

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important determinant of the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and the 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique had a significant influence.

  16. Implementation of Extended Statistical Entropy Analysis to the Effluent Quality Index of the Benchmarking Simulation Model No. 2

    Directory of Open Access Journals (Sweden)

    Alicja P. Sobańtka

    2014-01-01

    Extended statistical entropy analysis (eSEA) is used to assess the nitrogen (N) removal performance of the wastewater treatment (WWT) simulation software, the Benchmarking Simulation Model No. 2 (BSM No. 2). Six simulations with three different types of wastewater are carried out, which vary in the dissolved oxygen concentration (O2,diss) during the aerobic treatment. N2O emissions generated during denitrification are included in the model. The N-removal performance is expressed as the reduction in statistical entropy, ΔH, compared to the hypothetical reference situation of direct discharge of the wastewater into the river. The parameters chemical and biological oxygen demand (COD, BOD) and suspended solids (SS) are analogously expressed in terms of the reduction of COD, BOD, and SS compared to a direct discharge of the wastewater to the river (ΔEQrest). The cleaning performance is expressed as ΔEQnew, the weighted average of ΔH and ΔEQrest. The results show that ΔEQnew is a more comprehensive indicator of the cleaning performance because, in contrast to the traditional effluent quality index (EQ), it considers the characteristics of the wastewater and includes all N-compounds and their distribution in the effluent, the off-gas, and the sludge. Furthermore, it is demonstrated that realistically expectable N2O emissions have only a moderate impact on ΔEQnew.
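
    At its core, statistical entropy quantifies how a substance is spread over the output flows: full dispersal gives high entropy, complete concentration in one flow gives zero. The sketch below is a schematic Shannon-entropy stand-in, not the full eSEA formalism, which also weights each flow by its dilution; the example split is hypothetical.

```python
import numpy as np

def statistical_entropy(fractions):
    """Shannon entropy (bits) of a substance split over output flows;
    zero when everything ends up in a single flow. The full eSEA also
    accounts for the concentration (dilution) of each flow, omitted here."""
    f = np.asarray(fractions, dtype=float)
    f = f / f.sum()
    f = f[f > 0]
    return float(-(f * np.log2(f)).sum())

# Hypothetical N split after treatment: N2 off-gas, sludge, effluent, N2O.
H_after = statistical_entropy([0.80, 0.10, 0.08, 0.02])   # well concentrated
H_worse = statistical_entropy([0.40, 0.25, 0.25, 0.10])   # more dispersed
assert H_after < H_worse
```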

  17. Measuring and Benchmarking Technical Efficiency of Public Hospitals in Tianjin, China: A Bootstrap-Data Envelopment Analysis Approach.

    Science.gov (United States)

    Li, Hao; Dong, Siping

    2015-01-01

    China has long applied traditional data envelopment analysis (DEA) models to measure the technical efficiency of public hospitals without bias correction of the efficiency scores. In this article, we introduce the Bootstrap-DEA approach from the international literature to analyze the technical efficiency of public hospitals in Tianjin (China) and try to improve the application of this method for benchmarking and inter-organizational learning. It is found that the bias-corrected efficiency scores of Bootstrap-DEA differ significantly from those of the traditional Banker, Charnes, and Cooper (BCC) model, which means that Chinese researchers need to update their DEA models for more scientific calculation of hospital efficiency scores. Our research helps shorten the gap between China and the international community in the relative efficiency measurement and improvement of hospitals. It is suggested that Bootstrap-DEA be widely applied in future research to measure the relative efficiency and productivity of Chinese hospitals, so as to better support efficiency improvement and related decision making. © The Author(s) 2015.
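
    A minimal sketch of the two ingredients: an input-oriented CCR DEA score computed by linear programming, and a naive resampling bootstrap to gauge the bias of that score. The published method uses the smoothed Simar-Wilson bootstrap rather than this naive resampling, and real hospital inputs/outputs (beds, staff; visits, discharges) would populate X and Y.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(x0, y0, X, Y):
    """Input-oriented CCR DEA score of one unit against a reference set
    (X: n x m inputs, Y: n x s outputs); theta = 1 means efficient."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # minimize theta
    A_in = np.hstack([-x0.reshape(m, 1), X.T])     # sum(lam*x) <= theta*x0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # sum(lam*y) >= y0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -y0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun

def bootstrap_scores(X, Y, j, B=500, seed=0):
    """Rescore unit j against B resampled reference sets to estimate
    the bias of its DEA score (naive resampling, for illustration)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    scores = np.empty(B)
    for b in range(B):
        idx = rng.integers(0, n, n)
        scores[b] = ccr_efficiency(X[j], Y[j], X[idx], Y[idx])
    return scores
```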

  18. Extended analysis of benchmark datasets for Agilent two-color microarrays

    Directory of Open Access Journals (Sweden)

    Kerr Kathleen F

    2007-10-01

    Background: As part of its broad and ambitious mission, the MicroArray Quality Control (MAQC) project reported the results of experiments using External RNA Controls (ERCs) on five microarray platforms. For most platforms, several different methods of data processing were considered. However, there was no similar consideration of different methods for processing the data from the Agilent two-color platform. While this omission is understandable given the scale of the project, it can create the false impression that there is consensus about the best way to process Agilent two-color data. It is also important to consider whether ERCs are representative of all the probes on a microarray. Results: A comparison of different methods of processing Agilent two-color data shows substantial differences among methods for low-intensity genes. The sensitivity and specificity for detecting differentially expressed genes vary substantially for different methods. The analysis also reveals that the ERCs in the MAQC data only span the upper half of the intensity range, and therefore cannot be representative of all genes on the microarray. Conclusion: Although ERCs demonstrate good agreement between observed and expected log-ratios on the Agilent two-color platform, such an analysis is incomplete. Simple loess normalization outperformed data processing with Agilent's Feature Extraction software for accurate identification of differentially expressed genes. Results from studies using ERCs should not be over-generalized when ERCs are not representative of all probes on a microarray.
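
    Loess normalization of a two-color array fits a smooth curve to the MA-plot, i.e., the log-ratio M versus the mean log-intensity A, and subtracts it, removing intensity-dependent dye bias. A minimal sketch for a single array (function and argument names are illustrative):

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def loess_normalize(red, green, frac=0.3):
    """MA-plot loess normalization for one two-color array: fit a
    lowess curve of M = log2(R/G) against A = mean log intensity and
    subtract it, removing intensity-dependent bias."""
    A = 0.5 * (np.log2(red) + np.log2(green))
    M = np.log2(red) - np.log2(green)
    fit = lowess(M, A, frac=frac, return_sorted=False)
    return M - fit, A
```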

  19. Energy Benchmarking in Educational Buildings through Cluster Analysis of Energy Retrofitting

    Directory of Open Access Journals (Sweden)

    Paola Marrone

    2018-03-01

    A large part of the stock of Italian educational buildings has undergone energy retrofit interventions, thanks to European funds allocated through complex technical-administrative calls. In these projects, the suggested retrofit strategies are often selected on the basis of common best practices (considering average energy savings) but are not supported by proper energy investigations. In this paper, the Italian school building stock was analyzed by cluster analysis with the aim of providing a methodology able to identify the best energy retrofit interventions from a cost-benefit perspective and to correlate them with the specific characteristics of the educational buildings. This research is based on the analysis of about 80 school buildings located in central Italy and characterized by different features and construction technologies. The refurbished buildings were classified into homogeneous clusters and, for each cluster, the most representative building was identified. Furthermore, for each representative building, a validation procedure based on dynamic simulations and a comparison with actual energy use was performed. The two buildings thus singled out provide a model that could be developed into a useful tool for public administrations to suggest priorities in the planning of new energy retrofits of existing school building stocks.
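
    A minimal sketch of the clustering step: standardize a few per-building features, partition with k-means, and take the building nearest each centroid as the cluster representative for detailed simulation. Features and values here are hypothetical, not the Italian dataset.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-building features: floor area (m2), construction year,
# window-to-wall ratio, pre-retrofit energy use (kWh/m2/yr).
X = np.array([
    [2400, 1965, 0.25, 185],
    [3100, 1978, 0.30, 160],
    [1800, 1955, 0.20, 210],
    [2900, 1990, 0.35, 120],
    [2200, 1962, 0.22, 195],
    [3500, 1985, 0.33, 130],
])
Xs = StandardScaler().fit_transform(X)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(Xs)

# The building nearest each centroid serves as the cluster representative
# for detailed dynamic simulation and validation against actual energy use.
reps = [int(np.argmin(((Xs - c) ** 2).sum(axis=1))) for c in km.cluster_centers_]
```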

  20. SIX SIGMA BENCHMARKING OF PROCESS CAPABILITY ANALYSIS AND MAPPING OF PROCESS PARAMETERS

    Directory of Open Access Journals (Sweden)

    Jagadeesh Rajashekharaiah

    2016-12-01

    Inventory classification aims to ensure that business-driving inventory items are efficiently managed in spite of constrained resources. There are numerous single- and multiple-criteria approaches to it. We compare several approaches using a subset of a large spare parts inventory data set. Our objective is to improve resource allocation, leading to a focus on items that can lead to high equipment availability. This concern is typical of many service industries such as military logistics, airlines, amusement parks and public works. We find that a modified multi-criteria weighted non-linear optimization (WNO) technique is a powerful approach for classifying inventory, far outperforming traditional techniques such as ABC analysis and other methods available in the literature.
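
    For reference, traditional ABC analysis, the baseline against which the weighted non-linear optimization is compared, ranks items by annual usage value and cuts at cumulative-share breakpoints. A minimal sketch with the conventional 80/95 breakpoints:

```python
import numpy as np

def abc_classes(annual_usage_value, a=0.80, b=0.95):
    """Classic ABC inventory classification: rank items by annual usage
    value and split at cumulative-share breakpoints (defaults: A = top
    80% of value, B = next 15%, C = the rest)."""
    v = np.asarray(annual_usage_value, dtype=float)
    order = np.argsort(v)[::-1]                  # descending by value
    share = np.cumsum(v[order]) / v.sum()
    labels = np.empty(len(v), dtype="<U1")
    labels[order] = np.where(share <= a, "A", np.where(share <= b, "B", "C"))
    return labels
```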

  1. Benchmark Simulation for the Development of the Regulatory Audit Subchannel Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G. H.; Song, C.; Woo, S. W. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2012-05-15

    For the safe and reliable operation of a reactor, it is important to predict accurately the flow and temperature distributions in the thermal-hydraulic design of a reactor core. A subchannel approach can give reasonable flow and temperature distributions with a short computing time. The Korea Institute of Nuclear Safety (KINS) is presently reviewing the new subchannel code THALES, which will substitute for both the THINC-IV and TORC codes. To assess the prediction performance of THALES, KINS is developing a subchannel analysis code for independent audit calculations. The code is based on the workstation version of COBRA-IV-I. The main objective of the present study is to assess the performance of the COBRA-IV-I code by comparing its simulation results with experimental ones for the sample problems

  2. Benchmarking the scientific research on wastewater-energy nexus by using bibliometric analysis.

    Science.gov (United States)

    Zheng, Tianlong; Li, Pengyu; Shi, Zhining; Liu, Jianguo

    2017-12-01

    With the exponential increase in urbanization and industrialization, water pollution is an inevitable consequence of relatively lagging wastewater treatment facilities. The conventional activated sludge process for wastewater treatment primarily emphasizes the removal of harmful substances to meet increasingly stringent effluent discharge standards, and is considered an energy-intensive technique. Therefore, innovative and sustainable wastewater treatment should pay more attention to energy and resource recovery in dealing with fossil fuel depletion, global-scale energy security, and climate change. A bibliometric analysis was applied to trace wastewater-energy nexus-related research during the period 1991 to 2015, based on the Science Citation Index EXPANDED (SCI-EXPANDED) database. Journal of Hazardous Materials, ranking 1st in h-index (79), was the most productive journal (431, 4.5%) over this period, followed by International Journal of Hydrogen Energy (422, 4.4%) and Water Research (393, 4.1%), the latter having the highest journal impact factor. China (2154, 22.5%) was the most productive country, while the USA, with the highest h-index (88), was the most favored collaborating country. The Chinese Academy of Sciences, China (241, 2.5%) produced the most publications. A novel method called "word cluster analysis" showed that emerging sustainable processes and novel renewable energy applications are being adopted in response to the desire for a net wastewater-energy nexus system. Across different wastewater types, the emerging energy and resource recovery treatment processes of Anammox, anaerobic digestion, and microbial fuel cells attracted extensive innovation. Evaluation indicators including sustainability, life cycle assessment, and environmental impact were used to assess the feasibility of the novel treatment methods with regard to renewable energy utilization, energy savings, and energy recovery. The transformation of the new
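
    The h-index used throughout this kind of analysis is the largest h such that h of an entity's papers have at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

assert h_index([10, 8, 5, 4, 3]) == 4
```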

  3. Waveguide Phased Array Antenna Analysis and Synthesis

    NARCIS (Netherlands)

    Visser, H.J.; Keizer, W.P.M.N.

    1996-01-01

    Results of two software packages for analysis and synthesis of waveguide phased array antennas are shown. The antennas consist of arrays of open-ended waveguides where irises can be placed in the waveguide apertures and multiple dielectric sheets in front of the apertures in order to accomplish a

  4. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chauhan, Narvendra Singh [Department of Agronomy, Uttar Banga Krishi Viswavidyalaya, P.O. Pundibari, District Cooch Behar (West Bengal) 736 165 (India)]. E-mail: nsc_01@rediffmail.com; Mohapatra, Pratap K.J. [Department of Industrial Engineering and Management, Indian Institute of Technology, Kharagpur (West Bengal) 721 302 (India); Pandey, Keshaw Prasad [Department of Agricultural and Food Engineering, Indian Institute of Technology, Kharagpur (West Bengal) 721 302 (India)

    2006-06-15

    In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone of the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers, and suggest reasonable savings in energy use from different sources. The methods of the cross-efficiency matrix and the distribution of virtual inputs are used to gain insights into the performance of individual farmers, rank efficient farmers, and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers followed the input package recommended by the study. The study also suggests that better use of power tillers and the introduction of improved machinery would improve the efficiency of energy use and thereby improve the energy productivity of the rice production system in the zone.

  5. Benchmarking the energy performance of office buildings: A data envelopment analysis approach

    Directory of Open Access Journals (Sweden)

    Molinos-Senante, María

    2016-12-01

    The achievement of energy efficiency in buildings is an important challenge facing both developed and developing countries. Very few papers have assessed the energy efficiency of office buildings using real data. To overcome this limitation, this paper proposes an energy efficiency index for buildings having a large window-to-wall ratio, and uses this index to identify the main architectural factors affecting energy performance. This paper assesses, for the first time, the energy performance of 34 office buildings in Santiago, Chile, by using data envelopment analysis. Overall energy efficiency is decomposed into two indices: the architectural energy efficiency index and the management energy efficiency index. This decomposition is an essential step in identifying the main drivers of energy inefficiency and designing measures for improvement. The office buildings examined here have significant room for improving their energy efficiency, saving operational costs and reducing greenhouse gas emissions. The methodology and results of this study will be of great interest to building managers and policymakers seeking to increase the sustainability of cities.

  6. Improving energy productivity in paddy production through benchmarking-An application of data envelopment analysis

    International Nuclear Information System (INIS)

    Chauhan, Narvendra Singh; Mohapatra, Pratap K.J.; Pandey, Keshaw Prasad

    2006-01-01

    In this study, a data envelopment analysis approach has been used to determine the efficiencies of farmers with regard to energy use in rice production activities in the alluvial zone of the state of West Bengal in India. The study has helped to segregate efficient farmers from inefficient ones, identify wasteful uses of energy from different sources by inefficient farmers, and suggest reasonable savings in energy use from different sources. The methods of the cross-efficiency matrix and the distribution of virtual inputs are used to gain insights into the performance of individual farmers, rank efficient farmers, and identify the improved operating practices followed by a group of truly efficient farmers. The results reveal that, on average, about 11.6% of the total input energy could be saved if the farmers followed the input package recommended by the study. The study also suggests that better use of power tillers and the introduction of improved machinery would improve the efficiency of energy use and thereby improve the energy productivity of the rice production system in the zone

  7. Hyperspectral image analysis for rapid and accurate discrimination of bacterial infections: A benchmark study.

    Science.gov (United States)

    Arrigoni, Simone; Turra, Giovanni; Signoroni, Alberto

    2017-09-01

    With the rapid diffusion of Full Laboratory Automation systems, Clinical Microbiology is currently experiencing a new digital revolution. The ability to capture and process large amounts of visual data from microbiological specimen processing enables the definition of completely new objectives. These include the direct identification of pathogens growing on culture plates, with expected improvements in the rapid definition of the right treatment for patients affected by bacterial infections. In this framework, the synergies between light spectroscopy and image analysis offered by hyperspectral imaging are of prominent interest. This leads us to assess the feasibility of a reliable and rapid discrimination of pathogens through the classification of their spectral signatures extracted from hyperspectral image acquisitions of bacterial colonies growing on blood agar plates. We designed and implemented the whole data acquisition and processing pipeline and performed a comprehensive comparison among 40 combinations of different data preprocessing and classification techniques. High discrimination performance has been achieved, thanks also to improved colony segmentation and spectral signature extraction. Experimental results reveal the high accuracy and suitability of the proposed approach, driving the selection of the most suitable and scalable classification pipelines and stimulating clinical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
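
    A minimal sketch of the signature-classification idea: preprocess each colony spectrum (standard normal variate, one common choice) and assign it to the class with the nearest mean spectrum. This is one simple baseline of the kind compared among the paper's 40 pipeline combinations, not the selected pipeline.

```python
import numpy as np

def snv(spectra):
    """Standard normal variate: per-spectrum centering and scaling,
    a common hyperspectral preprocessing step."""
    mu = spectra.mean(axis=-1, keepdims=True)
    sd = spectra.std(axis=-1, keepdims=True) + 1e-12
    return (spectra - mu) / sd

def nearest_centroid(train, labels, test):
    """Assign each test signature to the class with the closest
    mean spectrum (Euclidean distance)."""
    labels = np.asarray(labels)
    classes = sorted(set(labels.tolist()))
    centroids = np.stack([train[labels == c].mean(axis=0) for c in classes])
    d = ((test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return [classes[i] for i in d.argmin(axis=1)]
```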

  8. Cannabis publication analysis using density-equalising mapping and research output benchmarking

    Directory of Open Access Journals (Sweden)

    B H Vogelzang

    2010-12-01

    Background. Cannabis has been a topic of political and medical controversy in many countries over the past century. Although many publications on this topic are available, there is currently no comprehensive evaluation of global research activities in the field. Objective. This study was conducted in order to provide a quantitative and qualitative analysis of the worldwide research output on cannabis. Methods. In a quantitative approach, items concerning cannabis published between 1900 and 2008 were retrieved from the ISI Web of Science databases developed by the Thomson Institute for Scientific Information and analysed using scientometric methods. In a second step, research fields of growing interest were identified. Results. We found that publications on this topic increased during the late 1960s, as well as during the period 1990 - 2008. We noted that South Africa was one of the countries with a high research output, having published numerous articles on cannabis. A comparison of cannabis with other drugs (e.g. alcohol, tobacco, cocaine and heroin) showed that, in relation to the proportion of respective drug users, cocaine and heroin are over-represented in terms of research output. When analysing the main subjects of the publications, psychiatry was prominent, especially with regard to research on psychosis. Conclusion. There is increasing interest in research on cannabis. The research only partially reflects the drug's importance with regard to the number of users.

  9. Depression and Suicide Publication Analysis, Using Density Equalizing Mapping and Output Benchmarking

    Science.gov (United States)

    Vogelzang, B. H.; Scutaru, C.; Mache, S.; Vitzthum, K.; Quarcoo, David; Groneberg, D. A.

    2011-01-01

    Background: Depression is a major cause of suicide worldwide. This association is reflected by numerous scientific publications reporting on studies of this theme. There is currently no overall evaluation of the global research activities in this field. Aim: The aim of the current study was to analyze long-term developments and recent research trends in this area. Material and Methods: We searched the Web of Science databases developed by the Thomson Institute for Scientific Information for items concerning depression and suicide published between 1900 and 2007 and analyzed the results using scientometric methods and density-equalizing calculations. Results: We found that publications on this topic increased dramatically in the period 1990 to 2007. A comparison of the different journals showed that the Archives of General Psychiatry had the highest average citation rate (more than twice that of any other journal). When comparing authors, we found that not all the authors with high h-indexes cooperated much with other authors. The analysis of countries that published papers on this topic showed that they published papers in relation to their Gross Domestic Product and Purchasing Power Parity. Among the G8 countries, Russia had the highest male suicide rate in 1999 (more than twice that of any of the other G8 countries), despite having published the fewest papers and having cooperated least with other countries among the G8. Conclusion: We conclude that, although there has been an increase in publications on this topic from 1990 to 2006, this increase is less steep than that for psoriasis and rheumatoid arthritis. PMID:22021955

  10. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, a historical perspective on and recent advances in computational technologies for evaluating the transition phase of core disruptive accidents in liquid-metal fast reactors are reviewed. An analysis of the transition phase requires the treatment of multi-phase, multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort was initiated when the development of the SIMMER series of computer codes began in the late 1970s in the USA. Successful application of the latest code, SIMMER-II, in the USA, western Europe and Japan has proved its effectiveness but, at the same time, has identified several areas that require further research. Based on the experience and lessons learned from SIMMER-II applications through the 1980s, a new project, the development of SIMMER-III, is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described, with emphasis on recent advances in multi-phase, multi-component fluid dynamics technologies and their expected implications for a future reliable transition phase analysis. (author)

  11. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  12. Benchmarking in Czech Higher Education

    Directory of Open Access Journals (Sweden)

    Plaček Michal

    2015-12-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used at this level today, but most actors show some interest in its introduction. The expressed need for it, and the importance of benchmarking as a very suitable performance-management tool in less developed countries, are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which consist in the design of a collaborative benchmarking model for Czech economics and management higher-education programs. Because the fully complex model cannot be implemented immediately (which is also confirmed by structured interviews with academics who have practical experience with benchmarking), the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.

  13. Benchmarking lithium amide versus amine bonding by charge density and energy decomposition analysis arguments.

    Science.gov (United States)

    Engelhardt, Felix; Maaß, Christian; Andrada, Diego M; Herbst-Irmer, Regine; Stalke, Dietmar

    2018-03-28

    Lithium amides are versatile C-H metallation reagents with vast industrial demand because of their high basicity combined with their weak nucleophilicity, and they are applied in kilotons worldwide annually. The nuclearity of lithium amides, however, modifies and steers reactivity, regio- and stereo-selectivity and product diversification in organic syntheses. In this regard, it is vital to understand Li-N bonding, as it causes the aggregation of lithium amides to form cubes or ladders from the polar Li-N covalent metal amide bond along the ring stacking and laddering principle. Deaggregation, however, is governed more by the Li←N donor bond to form amine adducts. The geometry of the solid state structures already suggests that there are σ- and π-contributions to the covalent bond. To quantify the mutual influence, we investigated [{(Me2NCH2)2(C4H2N)}Li]2 (1) by means of experimental charge density calculations based on the quantum theory of atoms in molecules (QTAIM) and DFT calculations using energy decomposition analysis (EDA). This new approach allows for the grading of electrostatic Li⁺N⁻, covalent Li-N and donating Li←N bonding, and provides a way to modify traditional widely-used heuristic concepts such as the -I and +I inductive effects. The electron density ρ(r) and its second derivative, the Laplacian ∇²ρ(r), mirror the various types of bonding. Most remarkably, from the topological descriptors, there is no clear separation of the lithium amide bonds from the lithium amine donor bonds. The computed natural partial charge for lithium is only +0.58, indicating an optimal density supply from the four nitrogen atoms, while the Wiberg bond orders of about 0.14 au suggest very weak bonding. The interaction energy between the two pincer molecules, (C4H2N)2²⁻, and the Li2²⁺ moiety is very strong (ca. −628 kcal mol⁻¹), followed by the bond dissociation energy (−420.9 kcal mol⁻¹). Partitioning the interaction energy

  14. Swiss electricity grid - Benchmarking pilot project

    International Nuclear Information System (INIS)

    2001-01-01

    This article is a short version of ENET report number 210369. This report for the Swiss Federal Office of Energy (SFOE) describes a benchmarking pilot project carried out as the second phase in the development of a formula for the regulation of an open electricity market in Switzerland. It follows on from an initial phase involving the definition of a 'blueprint' and a basic concept. The aims of the pilot project, namely to check the practicability of the concept, are discussed. The collection of anonymised data for the benchmarking model from over 30 electricity utilities operating on all 7 Swiss grid levels, and their integration in the three areas 'Technology', 'Grid Costs' and 'Capital Invested', is discussed in detail. In particular, confidentiality and data protection aspects are examined. The methods used in the analysis of the data are described and the results of an efficiency analysis of various utilities are presented. The report concludes with a list of questions concerning data collection and analysis, as well as operational and capital costs, that are still to be answered

  15. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since XRD was first applied in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
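
    As an illustration of the internal standard method, the most widely used of those listed: a known weight fraction of a standard phase is mixed into the sample, and the analyte-to-standard intensity ratio is linear in the analyte weight fraction once the slope has been calibrated with mixtures of known composition. The numbers below are purely illustrative.

```python
def analyte_fraction(I_a, I_s, x_s, k):
    """Internal standard method for quantitative XRD: the intensity
    ratio of an analyte peak to a standard peak obeys
        I_a / I_s = k * (x_a / x_s),
    with k calibrated from known mixtures, so
        x_a = (I_a / I_s) * x_s / k."""
    return (I_a / I_s) * x_s / k

# e.g. calibration gave k = 1.8; a 10 wt% fluorite spike; measured ratio 0.9
x_quartz = analyte_fraction(I_a=0.9, I_s=1.0, x_s=0.10, k=1.8)  # -> 0.05 (5 wt%)
```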

  16. Criteria of benchmark selection for efficient flexible multibody system formalisms

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2007-10-01

    The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes of benchmark selection are investigated. The result of this analysis is the formulation of criteria for the selection of benchmarks for flexible multibody formalisms. Based on these criteria, an initial set of suitable benchmarks is described. In addition, the evaluation measures are revised and extended.

  17. Nondestructive Damage Assessment of Composite Structures Based on Wavelet Analysis of Modal Curvatures: State-of-the-Art Review and Description of Wavelet-Based Damage Assessment Benchmark

    Directory of Open Access Journals (Sweden)

    Andrzej Katunin

    2015-01-01

    The application of composite structures as elements of machines and vehicles working under various operational conditions causes degradation and the occurrence of damage. Considering that composites are often used for critical elements, for example, parts of aircraft and other vehicles, it is extremely important to maintain them properly and to detect, localize, and identify damage occurring during their operation at the earliest possible stage of its development. Among the great variety of nondestructive testing methods developed to date, the vibration-based methods are among the least expensive and, with appropriate processing of the measurement data, simultaneously effective. Over the last decades, wavelet analysis has gained great popularity in vibration-based structural testing due to its high sensitivity to damage. This paper presents an overview of the results of numerous researchers working in the area of vibration-based damage assessment supported by wavelet analysis, together with a detailed description of the Wavelet-based Structural Damage Assessment (WavStructDamAs) Benchmark, which summarizes the author's 5-year research in this area. The benchmark covers example problems of damage identification in various composite structures with various damage types using numerous wavelet transforms and supporting tools. The benchmark is openly available and allows performing the analysis on the example problems as well as on one's own problems using the available analysis tools.
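
    In its simplest vibration-based form, the approach takes a continuous wavelet transform of a measured mode shape or modal curvature and looks for localized peaks in the coefficient magnitudes. A minimal sketch using PyWavelets (the wavelet choice and scale range are assumptions, not the benchmark's settings):

```python
import numpy as np
import pywt

def damage_indicator(mode_shape, scales=None, wavelet="gaus2"):
    """Continuous wavelet transform of a measured mode shape (or modal
    curvature): a localized peak of the summed coefficient magnitudes
    flags a possible damage location along the structure."""
    x = np.asarray(mode_shape, dtype=float)
    scales = np.arange(1, 33) if scales is None else scales
    coeffs, _ = pywt.cwt(x, scales, wavelet)
    return np.abs(coeffs).sum(axis=0)   # per-measurement-point indicator
```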

  18. Benchmarking of nuclear economics tools

    International Nuclear Information System (INIS)

    Moore, Megan; Korinny, Andriy; Shropshire, David; Sadhankar, Ramesh

    2017-01-01

    Highlights: • INPRO and GIF economic tools exhibited good alignment in total capital cost estimation. • Subtle discrepancies in the cost results stem from differences in financing and fuel cycle assumptions. • A common set of assumptions was found to reduce the discrepancies to 1% or less. • Opportunities for harmonisation of economic tools exist. - Abstract: Benchmarking of the economics methodologies developed by the Generation IV International Forum (GIF) and the International Atomic Energy Agency's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was performed for three Generation IV nuclear energy systems. The Economic Modeling Working Group of GIF developed an Excel-based spreadsheet package, G4ECONS (Generation 4 Excel-based Calculation Of Nuclear Systems), to calculate the total capital investment cost (TCIC) and the levelised unit energy cost (LUEC). G4ECONS is sufficiently generic in the sense that it can accept the types of projected input, performance and cost data that are expected to become available for Generation IV systems through various development phases, and it can model both open and closed fuel cycles. The Nuclear Energy System Assessment (NESA) Economic Support Tool (NEST) was developed to enable an economic analysis using the INPRO methodology to easily calculate outputs including the TCIC, LUEC and other financial figures of merit, including internal rate of return, return on investment and net present value. NEST is also Excel-based and can be used to evaluate nuclear reactor systems using the open fuel cycle, MOX (mixed oxide) fuel recycling and closed cycles. A Super Critical Water-cooled Reactor system with an open fuel cycle and two Fast Reactor systems, one with a break-even fuel cycle and another with a burner fuel cycle, were selected for the benchmarking exercise. Published data on capital and operating costs were used for economics analyses using the G4ECONS and NEST tools. Both G4ECONS and
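
    Both tools ultimately compute a levelised unit energy cost: discounted lifetime costs divided by discounted lifetime generation. A heavily simplified sketch with toy numbers (no construction schedule, refurbishment, or decommissioning fund, all of which G4ECONS and NEST do model):

```python
def luec(capital, annual_om, annual_fuel, annual_mwh, rate, years):
    """Levelised unit energy cost in $/MWh: discounted lifetime costs
    divided by discounted lifetime generation (simplified sketch)."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    costs = capital + sum((annual_om + annual_fuel) * d for d in disc)
    energy = sum(annual_mwh * d for d in disc)
    return costs / energy

# Toy numbers: 4 B$ overnight cost, 80 M$/yr O&M, 40 M$/yr fuel,
# 7.9 TWh/yr generation, 5% discount rate, 60-year life -> ~42 $/MWh.
print(luec(4e9, 8e7, 4e7, 7.9e6, 0.05, 60))
```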

  19. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  20. New Multi-group Transport Neutronics (PHISICS) Capabilities for RELAP5-3D and its Application to Phase I of the OECD/NEA MHTGR-350 MW Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Gerhard Strydom; Cristian Rabiti; Andrea Alfonsi

    2012-10-01

    PHISICS is a neutronics code system currently under development at the Idaho National Laboratory (INL). Its goal is to provide state-of-the-art simulation capability to reactor designers. The PHISICS modules currently under development are a nodal and semi-structured transport core solver (INSTANT), a depletion module (MRTAU) and a cross section interpolation module (MIXER). The INSTANT module is the most developed of the three: its basic functionalities are ready to use, but the code is still under continuous development to extend its capabilities. This paper reports on the effort of coupling the nodal kinetics code package PHISICS (INSTANT/MRTAU/MIXER) to the thermal-hydraulics system code RELAP5-3D, to enable full core and system modeling. This makes it possible to model coupled (thermal-hydraulics and neutronics) problems with more options for 3D neutron kinetics than the existing diffusion theory neutron kinetics module in RELAP5-3D (NESTLE) offers. In the second part of the paper, an overview of the OECD/NEA MHTGR-350 MW benchmark is given. This benchmark has been approved by the OECD and is based on the General Atomics 350 MW Modular High Temperature Gas Reactor (MHTGR) design. The benchmark includes coupled neutronics thermal-hydraulics exercises that require more capabilities than RELAP5-3D with NESTLE offers. Therefore, the MHTGR benchmark makes extensive use of the new PHISICS/RELAP5-3D coupling capabilities. The paper presents the preliminary results of the three steady state exercises specified in Phase I of the benchmark using PHISICS/RELAP5-3D.

  1. Analysis of Three-Phase Rectifier Systems with Controlled DC-Link Current Under Unbalanced Grids

    DEFF Research Database (Denmark)

    Kumar, Dinesh; Davari, Pooya; Zare, Firuz

    2017-01-01

    Voltage unbalance is the most common disturbance in distribution networks, and it has undesirable effects on many grid-connected power electronic systems, including Adjustable Speed Drives (ASDs). Severe voltage unbalance can force three-phase rectifiers into almost single-phase operation, which degrades the grid power quality and also imposes a significant negative impact on the ASD system. This major power quality issue affecting conventional rectifiers can be attenuated by controlling the DC-link current based on an Electronic Inductor (EI) technique. The purpose of this digest is to analyze and compare the performance of an EI-based rectifier with a conventional three-phase rectifier under unbalanced grid conditions. Experimental and simulation results validate the proposed mathematical modelling. Further analysis and benchmarking will be provided in the final paper.

  2. Benchmarking for Higher Education.

    Science.gov (United States)

    Jackson, Norman, Ed.; Lund, Helen, Ed.

    The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…

  3. Proposal and analysis of the benchmark problem suite for reactor physics study of LWR next generation fuels

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-10-01

    In order to investigate the calculation accuracy of the nuclear characteristics of LWR next generation fuels, the Research Committee on Reactor Physics organized by JAERI has established the Working Party on Reactor Physics for LWR Next Generation Fuels. Next generation fuels are those aiming at further extended burn-up, such as 70 GWd/t, beyond the current design. The Working Party has proposed six benchmark problems, which consist of pin-cell, PWR fuel assembly and BWR fuel assembly geometries loaded with uranium and MOX fuels, respectively. The specifications of the benchmark problems set aside some of the current limitations, such as the 5 wt% {sup 235}U enrichment limit, in order to achieve the above-mentioned target. Eleven organizations in the Working Party have carried out analyses of the benchmark problems. As a result, the status of the accuracy achievable with current data and methods, and some problems to be solved in the future, were clarified. In this report, details of the benchmark problems, the results from each organization, and their comparisons are presented. (author)

  4. Analysis of the pool critical assembly benchmark using raptor-M3G, a parallel deterministic radiation transport code - 289

    International Nuclear Information System (INIS)

    Fischer, G.A.

    2010-01-01

    The PCA Benchmark is analyzed using RAPTOR-M3G, a parallel SN radiation transport code. A variety of mesh structures, angular quadrature sets, cross section treatments, and reactor dosimetry cross sections are presented. The results show that RAPTOR-M3G is generally suitable for PWR neutron dosimetry applications. (authors)

  5. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:
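
    Interoperability benchmarking of this kind amounts to round-tripping data through the interchange language and checking that nothing is lost. The sketch below is a minimal illustration of that idea using the Python rdflib library, not the book's benchmark suite; the graph content and the choice of RDF/XML as the interchange syntax are assumptions for the example.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.compare import isomorphic

EX = Namespace("http://example.org/")

# Build a small RDF graph (content is illustrative only).
g = Graph()
g.add((EX.tool, EX.supports, Literal("RDF(S) interchange")))

# Round-trip: serialize to the interchange syntax, re-parse, and
# check that no information was lost or altered.
data = g.serialize(format="xml")  # rdflib >= 6 returns a string here
g2 = Graph()
g2.parse(data=data, format="xml")

assert isomorphic(g, g2), "round-trip changed the graph"
print("round-trip preserved all triples")
```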

  6. Benchmarking energy performance of residential buildings using two-stage multifactor data envelopment analysis with degree-day based simple-normalization approach

    International Nuclear Information System (INIS)

    Wang, Endong; Shen, Zhigang; Alp, Neslihan; Barry, Nate

    2015-01-01

    Highlights: • A two-stage DEA model is developed to benchmark building energy efficiency. • Degree-day based simple normalization is used to neutralize climatic noise. • Results of a real case study validate the benefits of the new model. - Abstract: Being able to identify detailed meta factors of energy performance is essential for creating effective residential energy-retrofitting strategies. Compared to other benchmarking methods, nonparametric multifactor DEA (data envelopment analysis) is capable of discriminating scale factors from management factors to reveal more details that better guide retrofitting practices. A two-stage DEA energy benchmarking method is proposed in this paper. This method includes (1) a first-stage meta DEA which integrates common degree-day metrics to neutralize the noise effects of exogenous climatic variables on energy use; and (2) a second-stage Tobit regression for further detailed efficiency analysis. A case study involving 3-year longitudinal panel data of 189 residential buildings indicated that the proposed method has advantages over existing methods in terms of its efficiency in data processing and results interpretation. The results of the case study also demonstrated high consistency with existing linear regression based DEA.
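
    To make the two-stage idea concrete, the sketch below implements stage one as an input-oriented CCR DEA model solved with scipy's linear programming routine, applied to energy use already normalized by heating degree-days. The data, the choice of inputs and outputs, and the omitted Tobit stage are all illustrative assumptions, not the paper's model or data.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.

    X: (m, n) inputs, Y: (s, n) outputs for n units. Solves
    min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]          # decision vector: [theta, lam]
    A_in = np.c_[-X[:, j0], X]           # X lam - theta x0 <= 0
    A_out = np.c_[np.zeros(s), -Y]       # -Y lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j0]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                       # theta in (0, 1]

# Illustrative data: energy use normalized by heating degree-days
# (the "simple normalization" step), floor area as a second input,
# dwellings served as the single output.
energy_per_hdd = np.array([2.1, 1.7, 2.9, 1.5])  # kWh/m2 per degree-day
floor_area = np.array([1.0, 1.2, 0.9, 1.1])      # normalized
dwellings = np.array([1.0, 1.3, 0.8, 1.2])

X = np.vstack([energy_per_hdd, floor_area])
Y = dwellings[None, :]
scores = [ccr_efficiency(X, Y, j) for j in range(X.shape[1])]
print(np.round(scores, 3))  # a second-stage Tobit regression would follow
```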

  7. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  8. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for limited samples. Neutron diffraction has a strong capability especially for oxides, due to the scattering cross-section of oxygen, and with this quantitative phase analysis technique it can become an even stronger tool for the analysis of industrial materials. By doing this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability to analyse neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
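
    Once a Rietveld refinement has produced per-phase scale factors, weight fractions follow from the standard Hill-Howard relation. The snippet below shows that arithmetic; the scale factors and the three-phase mixture are invented for illustration and are not the round robin samples.

```python
# Weight fractions from Rietveld scale factors via the Hill-Howard
# relation: w_p = S_p (Z M V)_p / sum_i S_i (Z M V)_i, where S is the
# refined scale factor, Z the formula units per cell, M the formula
# mass and V the unit-cell volume. All numbers below are illustrative.
phases = {
    #           S        Z   M (g/mol)  V (A^3)
    "corundum": (1.20e-4, 6, 101.96, 254.8),
    "fluorite": (0.85e-4, 4,  78.07, 163.0),
    "zincite":  (0.60e-4, 2,  81.38,  47.6),
}

szmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
total = sum(szmv.values())
for name, val in szmv.items():
    print(f"{name:9s} {100 * val / total:5.1f} wt%")
```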

  9. Analysis of the ITER computational shielding benchmark with the Monte Carlo TRIPOLI-4{sup ®} neutron gamma coupled calculations

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yi-Kang, E-mail: yi-kang.lee@cea.fr

    2016-11-01

    Highlights: • Verification and validation of TRIPOLI-4 radiation transport calculations for the ITER shielding benchmark. • Evaluation of CEA-V5.1.1 and FENDL-3.0 nuclear data libraries on D–T fusion neutron continuous energy transport calculations. • Advances in nuclear analyses for nuclear heating and radiation damage in iron. • This work also demonstrates that the “safety factors” concept is necessary in the nuclear analyses of ITER. - Abstract: With the growing interest in using the continuous-energy TRIPOLI-4{sup ®} Monte Carlo radiation transport code for ITER applications, a key issue that arises is whether or not the released TRIPOLI-4 code and its associated nuclear data libraries are verified and validated for D–T fusion neutronics calculations. Previously published benchmark results of the TRIPOLI-4 code on ITER related activities have concentrated on the first wall loading, the reactor dosimetry, the nuclear heating, and the tritium breeding ratio. To enhance the TRIPOLI-4 verification and validation on neutron-gamma coupled calculations for fusion device applications, the computational ITER shielding benchmark of M. E. Sawan was performed in this work by using the 2013 released TRIPOLI-4.9S code and the associated CEA-V5.1.1 data library. The first wall, blanket, vacuum vessel and toroidal field magnet of the inboard and outboard components were fully modelled in this 1-D toroidal cylindrical benchmark. The 14.1 MeV source neutrons were sampled from a uniform isotropic distribution in the plasma zone. Nuclear responses including neutron and gamma fluxes, nuclear heating, and a material damage indicator were benchmarked against previously published results. The capabilities of the TRIPOLI-4 code for the evaluation of the above physics parameters were presented. The nuclear data library from the new FENDL-3.0 evaluation was also benchmarked against the CEA-V5.1.1 results for the neutron transport calculations. The results show that both data libraries

  10. Analysis of the ITER computational shielding benchmark with the Monte Carlo TRIPOLI-4® neutron gamma coupled calculations

    International Nuclear Information System (INIS)

    Lee, Yi-Kang

    2016-01-01

    Highlights: • Verification and validation of TRIPOLI-4 radiation transport calculations for the ITER shielding benchmark. • Evaluation of CEA-V5.1.1 and FENDL-3.0 nuclear data libraries on D–T fusion neutron continuous energy transport calculations. • Advances in nuclear analyses for nuclear heating and radiation damage in iron. • This work also demonstrates that the “safety factors” concept is necessary in the nuclear analyses of ITER. - Abstract: With the growing interest in using the continuous-energy TRIPOLI-4® Monte Carlo radiation transport code for ITER applications, a key issue that arises is whether or not the released TRIPOLI-4 code and its associated nuclear data libraries are verified and validated for D–T fusion neutronics calculations. Previously published benchmark results of the TRIPOLI-4 code on ITER related activities have concentrated on the first wall loading, the reactor dosimetry, the nuclear heating, and the tritium breeding ratio. To enhance the TRIPOLI-4 verification and validation on neutron-gamma coupled calculations for fusion device applications, the computational ITER shielding benchmark of M. E. Sawan was performed in this work by using the 2013 released TRIPOLI-4.9S code and the associated CEA-V5.1.1 data library. The first wall, blanket, vacuum vessel and toroidal field magnet of the inboard and outboard components were fully modelled in this 1-D toroidal cylindrical benchmark. The 14.1 MeV source neutrons were sampled from a uniform isotropic distribution in the plasma zone. Nuclear responses including neutron and gamma fluxes, nuclear heating, and a material damage indicator were benchmarked against previously published results. The capabilities of the TRIPOLI-4 code for the evaluation of the above physics parameters were presented. The nuclear data library from the new FENDL-3.0 evaluation was also benchmarked against the CEA-V5.1.1 results for the neutron transport calculations. The results show that both data libraries can be

  11. Strategic behaviour under regulatory benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management

    2004-09-01

    In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulation benchmarking can influence the "regulation game," the subject has received limited attention. This paper discusses how strategic behaviour can result in inefficient behaviour by firms. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine the implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)

  12. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 5B. Experience data. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study would be the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213 nuclear power plants. This volume of Working material contains reports on the effects of the Armenia earthquakes on selected power, industrial and commercial facilities, and on the seismic functional qualification of active mechanical and electrical components tested on a shaking table.

  13. Benchmarking in digital circuit design automation

    NARCIS (Netherlands)

    Jozwiak, L.; Gawlowski, D.M.; Slusarczyk, A.S.

    2008-01-01

    This paper focuses on benchmarking, which is the main experimental approach to design method and EDA-tool analysis, characterization and evaluation. We discuss the importance and difficulties of benchmarking, as well as the recent research effort related to it. To resolve several serious

  14. Benchmarked Library Websites Comparative Study

    KAUST Repository

    Ramli, Rindra M.; Tyhurst, Janis

    2015-01-01

    This presentation provides an analysis of the services provided by the benchmarked library websites. The exploratory study includes a comparison of these websites against a list of criteria and presents a list of services that are most commonly deployed by the selected websites. In addition, the investigators proposed a list of services that could be provided via the KAUST library website.

  15. Compilation of MCNP data library based on JENDL-3T and test through analysis of benchmark experiment

    International Nuclear Information System (INIS)

    Sakurai, K.; Sasamoto, N.; Kosako, K.; Ishikawa, T.; Sato, O.; Oyama, Y.; Narita, H.; Maekawa, H.; Ueki, K.

    1989-01-01

    Based on the evaluated nuclear data library JENDL-3T, a temporary version of JENDL-3, a pointwise neutron cross section library for the MCNP code is compiled which includes 39 nuclides, from H-1 to Am-241, that are important for shielding calculations. Compilation is performed with a code system which consists of the nuclear data processing code NJOY-83 and the library compilation code MACROS. The validity of the code system and the reliability of the library are certified by analysing benchmark experiments. (author)

  16. Neutronics analysis of the International Thermonuclear Experimental Reactor (ITER) MCNP "Benchmark CAD Model" with the ATTILA discrete ordinates code

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Feder, R.; Davis, I.

    2007-01-01

    The ITER IT has adopted the newly developed FEM, 3-D, CAD-based discrete ordinates code ATTILA for neutronics studies, contingent on its success in predicting key neutronics parameters and nuclear fields according to the stringent QA requirements set forth by the Management and Quality Program (MQP). ATTILA has the advantage of providing a full mapping of flux and response functions everywhere in one run, in which components subjected to excessive radiation levels and strong streaming paths can be identified. The ITER neutronics community had agreed to use a standard CAD model of ITER (40 degree sector, denoted the "Benchmark CAD Model") to compare results for several responses selected for calculation benchmarking purposes, to test the efficiency and accuracy of the CAD-MCNP approach developed by each party. Since ATTILA lends itself to use as a powerful design tool with minimal turnaround time, it was decided to benchmark this model with ATTILA as well and to compare the results to those obtained with the CAD MCNP calculations. In this paper we report such a comparison for five responses, namely: (1) neutron wall load on the surface of the 18 shield blanket modules (SBM), (2) neutron flux and nuclear heating rate in the divertor cassette, (3) nuclear heating rate in the winding pack of the inner leg of the TF coil, (4) radial flux profile across the dummy port plug and shield plug placed in the equatorial port, and (5) flux at seven point locations situated behind the equatorial port plug. (orig.)

  17. Analysis of NEA-NSC PWR Uncontrolled Control Rod Withdrawal at Zero Power Benchmark Cases with NODAL3 Code

    Directory of Open Access Journals (Sweden)

    Tagor Malem Sembiring

    2017-01-01

    The in-house coupled neutronic and thermal-hydraulic (N/T-H) code of BATAN (National Nuclear Energy Agency of Indonesia), NODAL3, based on the few-group neutron diffusion equation in 3-dimensional geometry using the polynomial nodal method, has been verified with static and transient PWR benchmark cases. This paper reports the verification of the NODAL3 code on the NEA-NSC PWR uncontrolled control rod withdrawal at zero power benchmark. The objective of this paper is to determine the accuracy of the NODAL3 code in solving continuous slow and fast reactivity insertions due to the withdrawal of a single control rod bank or a group of banks, while the power and temperature increase is limited by the Doppler coefficient. The benchmark was chosen since many organizations participated using various methods and approximations, so the calculation results of NODAL3 can be compared to other codes’ results. The calculations are performed for the steady-state, transient core-averaged, and transient hot-pellet parameters. The influence of the number of radial and axial nodes was investigated for all cases. The results of the NODAL3 code are in very good agreement with the reference solutions when the radial and axial node numbers are 2 × 2 and 2 × 18 (total axial layers), respectively.
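
    NODAL3 itself solves the 3-D few-group nodal diffusion equations, which is beyond a short example, but the physics this benchmark probes (a reactivity ramp terminated by Doppler feedback) can be illustrated with a zero-dimensional point-kinetics analogue. The sketch below uses one delayed-neutron group and a crude adiabatic fuel heat-up model; every parameter value is illustrative, not a benchmark value.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Zero-dimensional point-kinetics analogue of a slow rod-withdrawal
# transient limited by Doppler feedback (one delayed-neutron group).
beta, lam, Lambda = 0.0065, 0.08, 2e-5   # delayed fraction, 1/s, s
ramp = 0.10 * beta                        # reactivity insertion rate (1/s)
alpha_D = -3.0e-5                         # Doppler coefficient (dk/k per K)
H = 0.05                                  # K per unit power per second

def rhs(t, y):
    n, c, T = y                           # power, precursors, fuel temp rise
    rho = min(ramp * t, 0.8 * beta) + alpha_D * T
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    dT = H * n                            # crude adiabatic heat-up
    return [dn, dc, dT]

y0 = [1.0, beta / (lam * Lambda), 0.0]    # steady state at unit power
sol = solve_ivp(rhs, (0.0, 60.0), y0, method="LSODA", max_step=0.1)
print(f"peak power = {sol.y[0].max():.1f} x nominal")
```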

  18. Validation of CENDL and JEFF evaluated nuclear data files for TRIGA calculations through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors

    International Nuclear Information System (INIS)

    Uddin, M.N.; Sarker, M.M.; Khan, M.J.H.; Islam, S.M.A.

    2009-01-01

    The aim of this paper is to present the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of the TRX and BAPL benchmark lattices of thermal reactors, for the neutronics analysis of the TRIGA Mark-II research reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated using the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 served as standard benchmarks for testing nuclear data files and were selected for this analysis. The integral parameters of the said lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared to the measured values as well as to the results of the Monte Carlo code MCNP. It was found that in most cases the values of the integral parameters show good agreement with the experimental and MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 from the two data files were compared using the WIMS library utility code WILLIE, and they were found to be nearly identical, with insignificant differences. This analysis therefore validates the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking of the integral parameters of the TRX and BAPL lattices, and provides a basis for further neutronics analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.

  19. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 5C. Experience data. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, with Kozloduy NPP Units 5/6 and Paks NPP as the respective prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using a blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free field record of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, there were many other interesting problems related to the seismic safety of WWER type NPPs which were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis

  20. Prismatic Core Coupled Transient Benchmark

    International Nuclear Information System (INIS)

    Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.

    2011-01-01

    The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated in the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art compared to LWR technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluations of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, to compare methods and tools in core simulation and thermal hydraulics analysis, with a specific focus on transient events. The benchmark working group is currently seeking OECD/NEA sponsorship. This benchmark builds heavily on the success of the PBMR-400 exercise.

  1. Shielding benchmark test

    International Nuclear Information System (INIS)

    Kawai, Masayoshi

    1984-01-01

    Iron data in JENDL-2 have been tested by analyzing shielding benchmark experiments on neutron transmission through iron blocks, performed at KFK using a Cf-252 neutron source and at ORNL using a collimated neutron beam from a reactor. The analyses were made with the shielding analysis code system RADHEAT-V4 developed at JAERI, and the calculated results are compared with the measured data. For the KFK experiments, the C/E values are about 1.1. For the ORNL experiments, the calculated values agree with the measured data within an accuracy of 33% for the off-center geometry. The d-T neutron transmission measurements through a carbon sphere made at LLNL are also analyzed in a preliminary way using the revised JENDL data for fusion neutronics calculations. (author)

  2. MCNP neutron benchmarks

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.

    1991-01-01

    Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems

  3. OECD/NEA benchmark for time-dependent neutron transport calculations without spatial homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Jason, E-mail: jason.hou@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Ivanov, Kostadin N. [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Boyarinov, Victor F.; Fomichenko, Peter A. [National Research Centre “Kurchatov Institute”, Kurchatov Sq. 1, Moscow (Russian Federation)

    2017-06-15

    Highlights: • A time-dependent homogenization-free neutron transport benchmark was created. • The first phase, known as the kinetics phase, was described in this work. • Preliminary results for selected 2-D transient exercises were presented. - Abstract: A Nuclear Energy Agency (NEA), Organization for Economic Co-operation and Development (OECD) benchmark for time-dependent neutron transport calculations without spatial homogenization has been established in order to facilitate the development and assessment of numerical methods for solving the space-time neutron kinetics equations. The benchmark has been named the OECD/NEA C5G7-TD benchmark, and was later extended with three consecutive phases, each corresponding to one modelling stage of the multi-physics transient analysis of the nuclear reactor core. This paper provides a detailed introduction to the benchmark specification of Phase I, known as the “kinetics phase”, including the geometry description, supporting neutron transport data, transient scenarios in both two-dimensional (2-D) and three-dimensional (3-D) configurations, as well as the expected output parameters from the participants. Also presented are the preliminary results for the initial state 2-D core and selected transient exercises that have been obtained using the Monte Carlo method and the Surface Harmonic Method (SHM), respectively.

  4. Comparative Neutronics Analysis of DIMPLE S06 Criticality Benchmark with Contemporary Reactor Core Analysis Computer Code Systems

    Directory of Open Access Journals (Sweden)

    Wonkyeong Kim

    2015-01-01

    A high-leakage core has been known to be a challenging problem not only for a two-step homogenization approach but also for a direct heterogeneous approach. In this paper the DIMPLE S06 core, which is a small high-leakage core, has been analyzed by a direct heterogeneous modeling approach and by a two-step homogenization modeling approach, using contemporary code systems developed for reactor core analysis. The focus of this work is a comprehensive comparative analysis of the conventional approaches and codes for a small core design, the DIMPLE S06 critical experiment. The calculation procedure for the two approaches is explicitly presented in this paper. The comparative analysis is performed on neutronics parameters: the multiplication factor and the assembly power distribution. Comparison of two-group homogenized cross sections from each lattice physics code shows that the generated transport cross sections differ significantly depending on the transport approximation used to treat the anisotropic scattering effect. The necessity of assembly discontinuity factors (ADFs) to correct the discontinuity at assembly interfaces is clearly demonstrated by the flux distributions and the results of the two-step approach. Finally, the two approaches show consistent results for all codes, while the comparison with the reference generated by MCNP shows significant errors except for another Monte Carlo code, SERPENT2.

  5. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3A. Kozloduy NPP units 5/6: Analysis/testing. Working material

    International Nuclear Information System (INIS)

    1995-01-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study would be the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. This volume of Working material contains reports on analyses and testing of Kozloduy nuclear power plant, Units 5 and 6.

  6. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3B. Kozloduy NPP units 5/6: Analysis/testing. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study would be the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. This volume of Working material contains reports on analyses and testing of Kozloduy nuclear power plant, Units 5 and 6.

  7. Validation of the Continuous-Energy Monte Carlo Criticality-Safety Analysis System MVP and JENDL-3.2 Using the Internationally Evaluated Criticality Benchmarks

    International Nuclear Information System (INIS)

    Mitake, Susumu

    2003-01-01

    Validation of the continuous-energy Monte Carlo criticality-safety analysis system, comprising the MVP code and neutron cross sections based on JENDL-3.2, was examined using benchmarks evaluated in the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Eight experiments (116 configurations) for plutonium solution and plutonium-uranium mixture systems performed at Valduc, Battelle Pacific Northwest Laboratories, and other facilities were selected and used in the studies. The averaged multiplication factors calculated with MVP and MCNP-4B using the same neutron cross-section libraries based on JENDL-3.2 were in good agreement. Based on methods provided in the Japanese nuclear criticality-safety handbook, the estimated criticality lower-limit multiplication factors to be used as a subcriticality criterion for the criticality-safety evaluation of nuclear facilities were obtained. The analysis proved the applicability of the MVP code to the criticality-safety analysis of nuclear fuel facilities, particularly to the analysis of systems fueled with plutonium in homogeneous and thermal-energy conditions.

  8. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task; SWR Stabilitaetsanalyse: Methodik der Stabilitaetsanalyse und PSI-Ergebnisse zur NEA/NCR Benchmarkaufgabe

    Energy Technology Data Exchange (ETDEWEB)

    Hennig, D.; Nechvatal, L. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. Within the framework of this work, the stability properties of some operating points of the Ringhals 1 NPP have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs.

  9. 5D Task Analysis Visualization Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  10. Benchmarking af kommunernes sagsbehandling [Benchmarking of municipal case processing]

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, Ankestyrelsen (the Danish National Social Appeals Board) is to carry out benchmarking of the quality of the municipalities' case processing. The purpose of the benchmarking is to develop the design of the practice investigations with a view to better follow-up and to improve the municipalities' case processing. This working paper discusses methods for benchmarking...

  11. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet-based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and thereby to explore...

  12. The Drill Down Benchmark

    NARCIS (Netherlands)

    P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel

    1998-01-01

    Data Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It

  13. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  14. Hartsville data and analysis book: Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Kerley, C.R.; Siegrist, C.

    1978-09-01

    A preconstruction data base is recorded for the impact area surrounding the Hartsville nuclear construction project. The objective is to document baseline information for socioeconomic characteristics that may be either temporarily or permanently altered by the project. The analysis suggests that the five counties surrounding the site make up a primary impact area, but some impacts may occur outside the area. The work force for the construction phase of the project is segregated into four components: (1) former residents of the site county, (2) former residents of other counties in the impact area, (3) in-movers to the site county, and (4) in-movers to other counties in the impact area. A theoretical model is developed to illustrate the contribution of each component to the spatial pattern of economic benefits and social costs in the impact area. A shift-share analysis of agricultural characteristics in the impact area shows that employment and farm numbers in the area have declined at a slightly faster rate than in the nation but at a slower rate than in the South. A population and construction project threshold analysis suggests that, given the project size and population base at Hartsville, significant social and economic constraints may be encountered in the public and private economic infrastructure. These include amenities such as housing, school space, medical and police protection.

  15. Hartsville data and analysis book: Phase I

    International Nuclear Information System (INIS)

    Kerley, C.R.; Siegrist, C.

    1978-09-01

    A preconstruction data base is recorded for the impact area surrounding the Hartsville nuclear construction project. The objective is to document baseline information for socioeconomic characteristics that may be either temporarily or permanently altered by the project. The analysis suggests that the five counties surrounding the site make up a primary impact area, but some impacts may occur outside the area. The work force for the construction phase of the project is segregated into four components: (1) former residents of the site county, (2) former residents of other counties in the impact area, (3) in-movers to the site county, and (4) in-movers to other counties in the impact area. A theoretical model is developed to illustrate the contribution of each component to the spatial pattern of economic benefits and social costs in the impact area. A shift-share analysis of agricultural characteristics in the impact area shows that employment and farm numbers in the area have declined at a slightly faster rate than in the nation but at a slower rate than in the South. A population and construction project threshold analysis suggests that, given the project size and population base at Hartsville, significant social and economic constraints may be encountered in the public and private economic infrastructure. These include amenities such as housing, school space, medical and police protection

  16. Development of a Model Protein Interaction Pair as a Benchmarking Tool for the Quantitative Analysis of 2-Site Protein-Protein Interactions.

    Science.gov (United States)

    Yamniuk, Aaron P; Newitt, John A; Doyle, Michael L; Arisaka, Fumio; Giannetti, Anthony M; Hensley, Preston; Myszka, David G; Schwarz, Fred P; Thomson, James A; Eisenstein, Edward

    2015-12-01

    A significant challenge in the molecular interaction field is to accurately determine the stoichiometry and stepwise binding affinity constants for macromolecules having >1 binding site. The mission of the Molecular Interactions Research Group (MIRG) of the Association of Biomolecular Resource Facilities (ABRF) is to show how biophysical technologies are used to quantitatively characterize molecular interactions, and to educate the ABRF members and the scientific community on the utility and limitations of core technologies such as biosensors, microcalorimetry, and analytical ultracentrifugation (AUC). In the present work, the MIRG has developed a robust model protein interaction pair consisting of a bivalent variant of the Bacillus amyloliquefaciens extracellular RNase barnase and a variant of its natural monovalent intracellular inhibitor protein barstar. It is demonstrated that this system can serve as a benchmarking tool for the quantitative analysis of 2-site protein-protein interactions. The protein interaction pair enables determination of precise binding constants for the barstar protein binding to 2 distinct sites on the bivalent barnase binding partner (termed binase), where the 2 binding sites were engineered to possess affinities that differ by 2 orders of magnitude. Multiple MIRG laboratories characterized the interaction using isothermal titration calorimetry (ITC), AUC, and surface plasmon resonance (SPR) methods to evaluate the feasibility of the system as a benchmarking model. Although general agreement was seen for the binding constants measured using the solution-based ITC and AUC approaches, weaker affinity was seen with the surface-based SPR method, with protein immobilization likely affecting the measured affinity. An analysis of the results from multiple MIRG laboratories suggests that the bivalent barnase-barstar system is a suitable model for benchmarking new approaches for the quantitative characterization of complex biomolecular interactions.
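
    For two independent sites, the expected occupancies follow directly from the two dissociation constants, which is the quantity the ITC, AUC, and SPR analyses ultimately report. A minimal sketch, assuming independent (non-cooperative) sites with affinities two orders of magnitude apart as in the engineered pair; the Kd values are illustrative, not the published constants.

```python
import numpy as np

# Fractional occupancy of two independent binding sites on a bivalent
# receptor as a function of free ligand concentration.
kd1, kd2 = 1e-9, 1e-7             # molar Kd values (illustrative)
free_L = np.logspace(-11, -5, 7)  # free "barstar" concentration (M)

for L in free_L:
    f1 = L / (kd1 + L)            # high-affinity site occupancy
    f2 = L / (kd2 + L)            # low-affinity site occupancy
    print(f"[L] = {L:8.1e} M   site1 = {f1:5.3f}   site2 = {f2:5.3f}")
```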

  17. UNIFORM ATMOSPHERIC RETRIEVAL ANALYSIS OF ULTRACOOL DWARFS. I. CHARACTERIZING BENCHMARKS, Gl 570D AND HD 3651B

    Energy Technology Data Exchange (ETDEWEB)

    Line, Michael R.; Fortney, Jonathan J. [Department of Astronomy and Astrophysics, University of California, Santa Cruz, CA 95064 (United States); Teske, Johanna [Carnegie DTM, 5241 Broad Branch Road, NW, Washington, DC 20015 (United States); Burningham, Ben; Marley, Mark S., E-mail: mrline@ucsc.edu [NASA Ames Research Center, Mail Stop 245-3, Moffett Field, CA 94035 (United States)

    2015-07-10

    Interpreting the spectra of brown dwarfs is key to determining the fundamental physical and chemical processes occurring in their atmospheres. Powerful Bayesian atmospheric retrieval tools have recently been applied to both exoplanet and brown dwarf spectra to tease out the thermal structures and molecular abundances to understand those processes. In this manuscript we develop a significantly upgraded retrieval method and apply it to the SpeX spectral library data of two benchmark late T dwarfs, Gl 570D and HD 3651B, to establish the validity of our upgraded forward model parameterization and Bayesian estimator. Our retrieved metallicities, gravities, and effective temperatures are consistent with the metallicity and presumed ages of the systems. We add the carbon-to-oxygen ratio as a new dimension to benchmark systems and find good agreement between carbon-to-oxygen ratios derived in the brown dwarfs and the host stars. Furthermore, we have for the first time unambiguously determined the presence of ammonia in the low-resolution spectra of these two late T dwarfs. We also show that the retrieved results are not significantly impacted by the possible presence of clouds, though some quantities are significantly impacted by uncertainties in photometry. This investigation represents a watershed study in establishing the utility of atmospheric retrieval approaches on brown dwarf spectra.
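
    At its core, such a retrieval is Bayesian parameter estimation against a forward spectral model. The sketch below shows the machinery on a toy problem with a stand-in forward model and a hand-rolled random-walk Metropolis sampler; it is not the authors' upgraded retrieval code, and the model, priors, and noise level are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Bayesian retrieval: recover two "atmospheric" parameters from a
# noisy synthetic spectrum. The forward model is a placeholder, not a
# brown dwarf emission model.
x = np.linspace(1.0, 2.5, 50)                 # wavelength grid (micron)

def forward(theta):
    amp, slope = theta
    return amp * np.exp(-slope * x)

truth = np.array([1.0, 0.8])
sigma = 0.02
data = forward(truth) + rng.normal(0, sigma, x.size)

def log_post(theta):
    if np.any(theta <= 0) or np.any(theta > 10):
        return -np.inf                        # flat prior on (0, 10]
    r = data - forward(theta)
    return -0.5 * np.sum((r / sigma) ** 2)

theta = np.array([0.5, 0.5])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.02, 2)     # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis acceptance
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])                # discard burn-in
print("posterior mean:", chain.mean(axis=0))
print("posterior std: ", chain.std(axis=0))
```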

  18. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views ... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained.

  19. EGS4 benchmark program

    International Nuclear Information System (INIS)

    Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.

    1995-01-01

    This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. It also evaluates optimization methods of recent RISC/UNIX systems, such as those of IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler options and math libraries were included in the evaluation process, the systems performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of some so-called mainframes of IBM, Hitachi or Fujitsu. The computer performance of the EGS4 Code System on an HP9000/735 (99 MHz) was defined to be one EGS4 Unit. The EGS4 Benchmark Suite was also run on various PCs, such as Pentium, i486 and DEC Alpha machines. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have also been evaluated for correlation with industry benchmark programs, namely SPECmark. (author)

  20. SP2Bench: A SPARQL Performance Benchmark

    Science.gov (United States)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  1. Benchmarking of refinery emissions performance: Executive summary

    International Nuclear Information System (INIS)

    2003-07-01

    This study was undertaken to collect emissions performance data for Canadian and comparable American refineries. The objective was to examine parameters that affect refinery air emissions performance and develop methods or correlations to normalize emissions performance. Another objective was to correlate and compare the performance of Canadian refineries to comparable American refineries. For the purpose of this study, benchmarking involved the determination of levels of emission performance that are being achieved for generic groups of facilities. A total of 20 facilities were included in the benchmarking analysis, and 74 American refinery emission correlations were developed. The recommended benchmarks, and the application of those correlations for comparison between Canadian and American refinery performance, were discussed. The benchmarks were: sulfur oxides, nitrogen oxides, carbon monoxide, particulate, volatile organic compounds, ammonia and benzene. For each refinery in Canada, benchmark emissions were developed. Several factors can explain differences in Canadian and American refinery emission performance. 4 tabs., 7 figs

  2. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
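
    A concrete instance of the manufactured-solution practice recommended here: pick an exact solution, derive the forcing it implies, solve on two grid resolutions, and compare the observed order of accuracy with the formal order of the scheme. The sketch below does this for a 1-D Poisson problem with second-order central differences; the chosen solution and grids are arbitrary.

```python
import numpy as np

# Code verification with a manufactured solution: discretize u'' = f on
# refined grids and check the observed order of accuracy against the
# formal order (2 for central differences).
def solve(n):
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    u_exact = np.sin(np.pi * x)               # manufactured solution
    f = -np.pi**2 * np.sin(np.pi * x)         # forcing it implies
    # Standard tridiagonal central-difference system for the interior.
    A = (np.diag(-2.0 * np.ones(n - 1)) +
         np.diag(np.ones(n - 2), 1) +
         np.diag(np.ones(n - 2), -1)) / h**2
    u = np.zeros(n + 1)                       # homogeneous Dirichlet BCs
    u[1:-1] = np.linalg.solve(A, f[1:-1])
    return np.max(np.abs(u - u_exact))        # discretization error

e_coarse, e_fine = solve(32), solve(64)
p_observed = np.log(e_coarse / e_fine) / np.log(2.0)
print(f"observed order of accuracy: {p_observed:.2f}")  # ~2.0
```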

  3. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  4. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  5. Benchmarking the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William; Hui, Y.V.; Lam, Y. Miu

    2006-01-01

    Benchmarking energy-efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed in a simple benchmark table (percentile table) of energy use, which is normalized with floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method
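
    A minimal version of the normalization step reads as follows: regress EUI on the significant explanatory factors, subtract their estimated effect, and read a building's benchmark off the empirical percentile of the normalized EUI. The sketch below does this with synthetic data and a single explanatory factor (weekly operating hours); the data and the single-factor model are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic sample: EUI rises with weekly operating hours plus noise.
n = 200
hours = rng.uniform(60, 168, n)                # weekly operating hours
eui = 50 + 0.8 * hours + rng.normal(0, 15, n)  # kWh/m2/yr, synthetic

# Fit EUI = b0 + b1 * hours by least squares.
X = np.column_stack([np.ones(n), hours])
beta, *_ = np.linalg.lstsq(X, eui, rcond=None)

# Normalized EUI: remove the deviance explained by operating hours,
# re-centering every building at the sample-mean operating profile.
eui_norm = eui - beta[1] * (hours - hours.mean())

# Benchmark an observed building: its percentile in the normalized
# empirical distribution (a lower percentile means more efficient).
target = eui_norm[0]
percentile = 100.0 * np.mean(eui_norm <= target)
print(f"building 0 sits at the {percentile:.0f}th percentile")
```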

  6. Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners

    Directory of Open Access Journals (Sweden)

    Luštický Martin

    2012-03-01

    Tourism development has an irreplaceable role in the regional policy of almost all countries. This is due to its undeniable benefits for the local population with regard to the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and subsequently get into a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and of the final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field where benchmarking methods are widely used, such approaches can be successfully applied. The paper focuses on a key phase of the benchmarking process, which lies in the search for suitable benchmarking partners. The partners are then selected to meet general requirements that ensure the quality of the strategies. Following from this, some specific characteristics are developed according to the SMART approach. The paper tests this procedure with an expert evaluation of eight selected regional tourism strategies of regions in the Czech Republic, Slovakia and Great Britain. In this way it validates the selected criteria in an international environment. Hence, it makes it possible to find strengths and weaknesses of the selected strategies and at the same time facilitates the discovery of suitable benchmarking partners.

  7. THE APPLICATION OF DATA ENVELOPMENT ANALYSIS METHODOLOGY TO IMPROVE THE BENCHMARKING PROCESS IN THE EFQM BUSINESS MODEL (CASE STUDY: AUTOMOTIVE INDUSTRY OF IRAN)

    Directory of Open Access Journals (Sweden)

    K. Shahroudi

    2009-10-01

    This paper reports survey and case study research outcomes on the application of Data Envelopment Analysis (DEA) to the ranking method of the European Foundation for Quality Management (EFQM) Business Excellence Model in Iran's automotive industry, and on improving the benchmarking process after assessment. Following the global trend, the Iranian industry leaders have introduced the EFQM practice to their supply chain during the last four years in order to improve the supply base competitiveness. A question which arises is whether the EFQM model can be combined with a mathematical model such as DEA in order to generate a new ranking method and develop or facilitate the benchmarking process. The model developed in this paper is simple. However, it provides some new and interesting insights. The paper assesses the usefulness and capability of the DEA technique to derive a new scoring system in order to compare the classical ranking method with the EFQM business model. We used this method to identify meaningful exemplar companies for each criterion of the EFQM model; then we designed a road map based on realistic targets for each criterion that have already been achieved by the exemplar companies. The research indicates that the DEA approach is a reliable tool to analyze the latent knowledge of scores generated by conducting self-assessments. The Wilcoxon Rank Sum Test is used to compare the two sets of scores, and the test of hypothesis reveals a meaningful relation between the EFQM and DEA ranking methods. Finally, we drew a road map based on the benchmarking concept using the research results.
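
    The statistical comparison the paper describes can be reproduced in outline with scipy: score the same set of companies under both methods and test whether the two score distributions differ. The numbers below are synthetic placeholders, not the study's assessment data.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(2)

# Two scoring methods applied to the same supplier set: classical EFQM
# points versus a DEA-derived score rescaled to the same range.
efqm_scores = rng.normal(450, 60, 25)   # classical EFQM points (synthetic)
dea_scores = rng.normal(470, 55, 25)    # rescaled DEA points (synthetic)

stat, p_value = ranksums(efqm_scores, dea_scores)
print(f"rank-sum statistic = {stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("distributions differ: the two ranking methods disagree")
else:
    print("no significant difference between the two ranking methods")
```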

  8. Dynamic Monte Carlo transient analysis for the Organization for Economic Co-operation and Development Nuclear Energy Agency (OECD/NEA) C5G7-TD benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Shaukat, Nadeem; Ryu, Min; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)

    2017-08-15

    With ever-advancing computer technology, the Monte Carlo (MC) neutron transport calculation is expanding its application area to nuclear reactor transient analysis. Dynamic MC (DMC) neutron tracking for transient analysis requires efficient algorithms for delayed neutron generation, neutron population control, and initial condition modeling. In this paper, a new MC steady-state simulation method based on time-dependent MC neutron tracking is proposed for steady-state initial condition modeling; during this process, prompt neutron sources and delayed neutron precursors for the DMC transient simulation can easily be sampled. The DMC method, including the proposed time-dependent DMC steady-state simulation method, has been implemented in McCARD and applied for two-dimensional core kinetics problems in the time-dependent neutron transport benchmark C5G7-TD. The McCARD DMC calculation results show good agreement with results of a deterministic transport analysis code, nTRACER.

  9. Analysis of OKTAVIAN Shielding Benchmark Experiments by ENDF/B-VII, JEFF-3.1, and JENDL-3.3

    International Nuclear Information System (INIS)

    Kim, Do-Heon; Gil, Choong-Sup; Lee, Young-Ouk

    2007-01-01

    International collaborations for the ITER Project and other fusion-related development projects have been conducted to create a reference fusion nuclear data library such as FENDL, a collection of the best cross-section data from national nuclear data libraries. The recent release of newly evaluated nuclear data libraries requires extensive and intensive benchmarking of the updated transport libraries for them to become candidates for the future collection. In this study, the pulsed sphere experiments for leakage neutron and gamma-ray spectra at OKTAVIAN, the D-T neutron source facility of Osaka University, were employed to test the ENDF/B-VII beta 1, JEFF-3.1, and JENDL-3.3 libraries. The continuous-energy Monte Carlo transport code MCNPX-2.5 was used along with the ACE-format libraries processed by a modified version of the NJOY99.90 code.

  10. Shielding benchmark problems, (2)

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.

    1980-02-01

    Shielding benchmark problems prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design in the Atomic Energy Society of Japan were compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Fourteen new shielding benchmark problems are presented in addition to the twenty-one problems proposed previously, for evaluating the calculational algorithms and accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method, and for evaluating the nuclear data used in those codes. The present benchmark problems principally address the backscattering and streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)

  11. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed in which concentrations of contaminants in the environment are compared to no-observed-adverse-effects-level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) presumed to be nonhazardous to the biota. While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest-observed-adverse-effects-level (LOAEL)-based toxicological benchmarks for assessment of the effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red-tailed hawk).

  12. Derivation of the critical effect size/benchmark response for the dose-response analysis of the uptake of radioactive iodine in the human thyroid.

    Science.gov (United States)

    Weterings, Peter J J M; Loftus, Christine; Lewandowski, Thomas A

    2016-08-22

    Potential adverse effects of chemical substances on thyroid function are usually examined by measuring serum levels of thyroid-related hormones. Instead, recent risk assessments for thyroid-active chemicals have focussed on iodine uptake inhibition, an upstream event that by itself is not necessarily adverse. Establishing the extent of uptake inhibition that can be considered de minimis, the chosen benchmark response (BMR), is therefore critical. The BMR values selected by two international advisory bodies were 5% and 50%, a difference that had correspondingly large impacts on the estimated risks and the health-based guidance values that were established. Potential treatment-related inhibition of thyroidal iodine uptake is usually determined by comparing thyroidal uptake of radioactive iodine (RAIU) during treatment with a single pre-treatment RAIU value. In the present study it is demonstrated that the physiological intra-individual variation in iodine uptake is much larger than 5%. Consequently, in-treatment RAIU values, expressed as a percentage of the pre-treatment value, have an inherent variation that needs to be considered when conducting dose-response analyses. Based on statistical and biological considerations, a BMR of 20% is proposed for benchmark dose analysis of human thyroidal iodine uptake data, to take the inherent variation in relative RAIU data into account. Implications for the tolerated daily intakes for perchlorate and chlorate, recently established by the European Food Safety Authority (EFSA), are discussed. Copyright © 2016 The Author(s). Published by Elsevier Ireland Ltd. All rights reserved.
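
    The statistical point, that a single pre-treatment RAIU measurement makes small inhibition levels indistinguishable from noise, can be illustrated numerically. In this minimal simulation the coefficient of variation and the mean uptake are assumed values chosen only to show the mechanism, not figures from the paper.

        # Scatter of relative RAIU values with zero true inhibition.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        cv = 0.15           # assumed intra-individual coefficient of variation
        mean_uptake = 25.0  # assumed thyroidal uptake, percent of dose

        pre = rng.normal(mean_uptake, cv * mean_uptake, n)   # pre-treatment RAIU
        post = rng.normal(mean_uptake, cv * mean_uptake, n)  # in-treatment RAIU

        relative = 100.0 * post / pre  # in-treatment value as % of pre-treatment
        print(f"mean relative RAIU = {relative.mean():.1f}%")
        print(f"std of relative RAIU = {relative.std():.1f} percentage points")
        # Even without any true inhibition, the relative value scatters by
        # roughly sqrt(2)*cv, so a 5% BMR lies well inside the noise band.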

  13. Benchmarking local healthcare-associated infections: Available benchmarks and interpretation challenges

    Directory of Open Access Journals (Sweden)

    Aiman El-Saed

    2013-10-01

    Full Text Available Summary: Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for data from the Gulf Cooperation Council (GCC) states is challenging: the chosen benchmark should have similar data collection and presentation methods, and differences in surveillance environments, including regulations, should also be taken into consideration. The GCC center for infection control has taken some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons. Keywords: Benchmarking, Comparison, Surveillance, Healthcare-associated infections
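
    Of the risk-adjustment methods named above, indirect standardization is the simplest to sketch: local infection counts are compared with the count expected if benchmark rates applied to the local case mix. All numbers below are illustrative assumptions.

        # Standardized infection ratio (SIR) by indirect standardization.
        device_days = {"ICU": 4000, "ward": 9000}        # assumed local exposure
        observed = {"ICU": 10, "ward": 6}                # assumed local HAI counts
        benchmark_per_1000 = {"ICU": 2.1, "ward": 0.6}   # assumed benchmark rates

        expected = sum(device_days[s] * benchmark_per_1000[s] / 1000.0
                       for s in device_days)
        sir = sum(observed.values()) / expected
        print(f"expected = {expected:.1f}, observed = {sum(observed.values())}, "
              f"SIR = {sir:.2f}")
        # SIR > 1: more infections than the benchmark predicts for this case
        # mix; SIR < 1: fewer. Confidence intervals would be needed in practice.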

  14. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  15. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project, initiated by the Swiss Association of Electricity Enterprises, that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented; these are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and the cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article

  16. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...

  17. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings...

  18. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    ... The design of this study included two parts: (1) eleven expert panelists involved in a Delphi technique to identify and rate the importance of foodservice performance measures and of benchmarking activities, and (2) ...

  19. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) development and implementation of programs to simulate MFTF usage of the data base

  20. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  1. Shielding benchmark problems

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.

    1978-09-01

    Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithms and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method and for evaluating the nuclear data used in the codes. (author)

  2. Visual Analysis of Inclusion Dynamics in Two-Phase Flow.

    Science.gov (United States)

    Karch, Grzegorz Karol; Beck, Fabian; Ertl, Moritz; Meister, Christian; Schulte, Kathrin; Weigand, Bernhard; Ertl, Thomas; Sadlo, Filip

    2018-05-01

    In single-phase flow visualization, research focuses on the analysis of vector field properties. In two-phase flow, in contrast, analysis of the phase components is typically of major interest. So far, visualization research of two-phase flow concentrated on proper interface reconstruction and the analysis thereof. In this paper, we present a novel visualization technique that enables the investigation of complex two-phase flow phenomena with respect to the physics of breakup and coalescence of inclusions. On the one hand, we adapt dimensionless quantities for a localized analysis of phase instability and breakup, and provide detailed inspection of breakup dynamics with emphasis on oscillation and its interplay with rotational motion. On the other hand, we present a parametric tightly linked space-time visualization approach for an effective interactive representation of the overall dynamics. We demonstrate the utility of our approach using several two-phase CFD datasets.

  3. Benchmarking electricity distribution

    Energy Technology Data Exchange (ETDEWEB)

    Watts, K. [Department of Justice and Attorney-General, QLD (Australia)

    1995-12-31

    Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.

  4. Processing and benchmarking of the Evaluated Nuclear Data File ENDF/B-VIII.0β4 cross-section library by analysis of a series of critical experimental benchmarks using the Monte Carlo code MCNP(X) and NJOY2016

    Directory of Open Access Journals (Sweden)

    Kabach Ouadie

    2017-12-01

    Full Text Available To validate the new Evaluated Nuclear Data File (ENDF/B-VIII.0β4) library, 31 different critical cores were selected and used for a benchmark test of the important parameter keff. The four libraries used are processed with the nuclear data processing code NJOY2016. The results obtained with the ENDF/B-VIII.0β4 library were compared against those calculated with the ENDF/B-VI.8, ENDF/B-VII.0, and ENDF/B-VII.1 libraries using the Monte Carlo N-Particle code MCNP(X). All the MCNP(X) calculations of keff values with these four libraries were compared with the experimentally measured results, which are available in the International Criticality Safety Benchmark Evaluation Project. The obtained results are discussed and analyzed in this paper.

  5. Analysis of the TRIGA Mark-II benchmark IEU-COMP-THERM-003 with Monte Carlo code MVP

    International Nuclear Information System (INIS)

    Mahmood, Mohammad Sayem; Nagaya, Yasunobu; Mori, Takamasa

    2004-03-01

    The benchmark experiments of the TRIGA Mark-II reactor in the ICSBEP handbook have been analyzed with the Monte Carlo code MVP using the cross-section libraries based on JENDL-3.3, JENDL-3.2 and ENDF/B-VI.8. The MCNP calculations have also been performed with the ENDF/B-VI.6 library for comparison between the MVP and MCNP results. For both cores labeled 132 and 133, which have different core configurations, the ratio of the calculated to the experimental results (C/E) for keff obtained by the MVP code is 0.999 for JENDL-3.3, 1.003 for JENDL-3.2, and 0.998 for ENDF/B-VI.8. For the MCNP code, the C/E values are 0.998 for both Cores 132 and 133. All the calculated results agree with the reference values within the experimental uncertainties. The results obtained by MVP with ENDF/B-VI.8 and MCNP with ENDF/B-VI.6 differ only by 0.02% for Core 132, and by 0.01% for Core 133. (author)

  6. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  7. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  8. Phase transition phenomenon: A compound measure analysis

    Science.gov (United States)

    Kang, Bo Soo; Park, Chanhi; Ryu, Doojin; Song, Wonho

    2015-06-01

    This study investigates the well-documented phenomenon of phase transition in financial markets using combined information from both return and volume changes within short time intervals. We suggest a new measure for the phase transition behaviour of markets, calculated as a return distribution conditional on local variance in volume imbalance, and show that this measure successfully captures phase transition behaviour under various conditions. We analyse the intraday trade and quote dataset from the KOSPI 200 index futures, which includes detailed information on the original order size and the type of each initiating investor. We find that among these two competing factors, the submitted order size yields more explanatory power on the phenomenon of market phase transition than the investor type.
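
    The proposed measure, a return distribution conditional on the local variance of volume imbalance, can be sketched as follows. Synthetic data stand in for the KOSPI 200 futures records, and the window length and quantile cut-offs are assumptions.

        # Conditioning short-interval returns on local volume-imbalance variance.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(1)
        n = 50_000
        df = pd.DataFrame({
            "ret": rng.standard_t(4, n) * 1e-4,  # synthetic short-interval returns
            "imb": rng.normal(0.0, 1.0, n),      # synthetic volume imbalance
        })

        # Local variance of volume imbalance over a rolling window.
        df["local_var"] = df["imb"].rolling(50).var()

        # Compare the return distribution in calm vs. turbulent regimes.
        low, high = df["local_var"].quantile([0.2, 0.8])
        calm = df.loc[df["local_var"] <= low, "ret"]
        tense = df.loc[df["local_var"] >= high, "ret"]
        print(f"return std | low variance : {calm.std():.2e}")
        print(f"return std | high variance: {tense.std():.2e}")
        # A two-peaked conditional distribution in the high-variance regime is
        # the phase-transition signature the authors describe.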

  9. Silver Biocide Analysis & Control Device, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Rapid, accurate measurement and process control of silver ion biocide concentrations in future space missions is needed. The purpose of the Phase II program is to...

  10. Gamma model and its analysis for phase measuring profilometry.

    Science.gov (United States)

    Liu, Kai; Wang, Yongchang; Lau, Daniel L; Hao, Qi; Hassebrook, Laurence G

    2010-03-01

    Phase measuring profilometry is a method of structured light illumination whose three-dimensional reconstructions are susceptible to error from nonunitary gamma in the associated optical devices. While the effects of this distortion diminish with an increasing number of employed phase-shifted patterns, gamma distortion may be unavoidable in real-time systems where the number of projected patterns is limited by the presence of target motion. A mathematical model is developed for predicting the effects of nonunitary gamma on phase measuring profilometry, while also introducing an accurate gamma calibration method and two strategies for minimizing gamma's effect on phase determination. These phase correction strategies include phase corrections with and without gamma calibration. With the reduction in noise, for three-step phase measuring profilometry, analysis of the root mean squared error of the corrected phase will show a 60x reduction in phase error when the proposed gamma calibration is performed versus 33x reduction without calibration.
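
    The effect of nonunitary gamma on three-step phase recovery is easy to reproduce numerically. In this sketch the sinusoidal pattern model and the gamma value of 2.2 are assumptions; the recovery formula is the standard three-step arctangent, which is exact only when gamma equals one.

        # Phase error introduced by gamma distortion in three-step PMP.
        import numpy as np

        phi = np.linspace(-np.pi, np.pi, 2048, endpoint=False)  # true phase
        gamma = 2.2                                             # assumed gamma

        # Three patterns shifted by 2*pi/3, each distorted by a gamma power law.
        I1, I2, I3 = ((0.5 + 0.5 * np.cos(phi + 2.0 * np.pi * n / 3.0)) ** gamma
                      for n in (-1, 0, 1))

        # Standard three-step phase recovery.
        phi_hat = np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

        # Wrapped phase error and its root mean square.
        err = np.angle(np.exp(1j * (phi_hat - phi)))
        print(f"RMS phase error at gamma = {gamma}: "
              f"{np.sqrt(np.mean(err**2)):.4f} rad")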

  11. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3I. Kozloduy NPP units 5/6: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on the 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using a blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free field locations (both downhole and surface), the foundation mat, various elevations of the structures, as well as some tanks and the stack. Then the benchmark participants were provided with structural drawings, soil data and the free field record of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, there were many other interesting problems related to the seismic safety of WWER type NPPs which were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis

  12. Phase analysis and focusing of synchrotron radiation

    CERN Document Server

    Chubar, O; Snigirev, A

    1999-01-01

    High accuracy calculations of synchrotron radiation (SR) emitted by a relativistic electron show that the phase of the frequency domain electric field of SR differs from the phase of radiation of a virtual point source. These differences may result in the reduction of focusing efficiency of diffraction-limited SR, if the focusing is performed by conventional optical components optimised for point sources. We show that by applying a phase correction locally, one may transform the phase of SR electric field at a desired polarisation to that of a point source. Such corrections are computed for undulator radiation (planar and helical) and bending magnet radiation (central part and edges). The focusing of the corrected SR wavefront can result in the increase of peak intensity in the focused spot up to several times compared to the focusing without correction. For non-diffraction-limited radiation, the effect of the phase corrections is reduced. Due to this reason, the use of the proposed phase corrections in exist...

  13. Evaluation of High Temperature Gas Cooled Reactor Performance: Benchmark Analysis Related to the PBMR-400, PBMM, GT-MHR, HTR-10 and the ASTRA Critical Facility

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-04-15

    The IAEA has facilitated an extensive programme that addresses the technical development of advanced gas cooled reactor technology. Included in this programme is the coordinated research project (CRP) on Evaluation of High Temperature Gas Cooled Reactor (HTGR) Performance, which is the focus of this TECDOC. This CRP was established to foster the sharing of research and associated technical information among participating Member States in the ongoing development of the HTGR as a future source of nuclear energy. Within it, computer codes and models were verified through actual test results from operating reactor facilities. The work carried out in the CRP involved both computational and experimental analysis at various facilities in IAEA Member States with a view to verifying computer codes and methods in particular, and to evaluating the performance of HTGRs in general. The IAEA is grateful to China, the Russian Federation and South Africa for providing their facilities and benchmark programmes in support of this CRP.

  14. Evaluation of High Temperature Gas Cooled Reactor Performance: Benchmark Analysis Related to the PBMR-400, PBMM, GT-MHR, HTR-10 and the ASTRA Critical Facility

    International Nuclear Information System (INIS)

    2013-04-01

    The IAEA has facilitated an extensive programme that addresses the technical development of advanced gas cooled reactor technology. Included in this programme is the coordinated research project (CRP) on Evaluation of High Temperature Gas Cooled Reactor (HTGR) Performance, which is the focus of this TECDOC. This CRP was established to foster the sharing of research and associated technical information among participating Member States in the ongoing development of the HTGR as a future source of nuclear energy. Within it, computer codes and models were verified through actual test results from operating reactor facilities. The work carried out in the CRP involved both computational and experimental analysis at various facilities in IAEA Member States with a view to verifying computer codes and methods in particular, and to evaluating the performance of HTGRs in general. The IAEA is grateful to China, the Russian Federation and South Africa for providing their facilities and benchmark programmes in support of this CRP.

  15. Pool critical assembly pressure vessel facility benchmark

    International Nuclear Information System (INIS)

    Remec, I.; Kam, F.B.K.

    1997-07-01

    This pool critical assembly (PCA) pressure vessel wall facility benchmark (PCA benchmark) is described and analyzed in this report. Analysis of the PCA benchmark can be used for partial fulfillment of the requirements for the qualification of the methodology for pressure vessel neutron fluence calculations, as required by the US Nuclear Regulatory Commission regulatory guide DG-1053. Section 1 of this report describes the PCA benchmark and provides all data necessary for the benchmark analysis. The measured quantities, to be compared with the calculated values, are the equivalent fission fluxes. In Section 2 the analysis of the PCA benchmark is described. Calculations with the computer code DORT, based on the discrete-ordinates method, were performed for three ENDF/B-VI-based multigroup libraries: BUGLE-93, SAILOR-95, and BUGLE-96. An excellent agreement of the calculated (C) and measured (M) equivalent fission fluxes was obtained. The arithmetic average C/M for all the dosimeters (total of 31) was 0.93 ± 0.03 and 0.92 ± 0.03 for the SAILOR-95 and BUGLE-96 libraries, respectively. The average C/M ratio, obtained with the BUGLE-93 library, for the 28 measurements was 0.93 ± 0.03 (the neptunium measurements in the water and air regions were overpredicted and excluded from the average). No systematic decrease in the C/M ratios with increasing distance from the core was observed for any of the libraries used

  16. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3E. Kozloduy NPP units 5/6: Analysis/testing. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated at the request of representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working material contains reports on data related to floor response spectra of Kozloduy NPP; calculational and experimental examination and assurance of the seismic resistance of equipment and pipelines at WWER-type NPPs under commissioning and in operation; analysis of design floor response spectra and testing of the electrical systems; experimental investigations and seismic analysis of Kozloduy NPP; testing of components on shaking table facilities and a contribution to the full scale dynamic testing of Kozloduy NPP; and seismic evaluation of the main steam line, piping systems, containment pre-stressing and steel ventilation chimney of Kozloduy NPP

  17. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4C. Paks NPP: Analysis and testing. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated at the request of representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working material involves a comparative analysis of the seismic analysis results for the reactor building under soft soil conditions, the derivation of design response spectra for components and systems, and upper range design response spectra for soft soil site conditions at Paks NPP.

  18. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4C. Paks NPP: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated at the request of representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working material involves a comparative analysis of the seismic analysis results for the reactor building under soft soil conditions, the derivation of design response spectra for components and systems, and upper range design response spectra for soft soil site conditions at Paks NPP

  19. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4D. Paks NPP: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    The Co-ordinated research programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated at the request of representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working material contains reports on seismic margin assessment and earthquake-experience-based methods for WWER-440/213 type NPPs; structural analysis and site inspection for site requalification; structural response of the Paks NPP reactor building; analysis and testing of a model worm-type tank on a shaking table; a vibration test of a worm tank model; and evaluation of the potential hazard for operating WWER control rods under seismic excitation

  20. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Full Text Available Benchmarking is considered one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  1. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  2. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In KINS (Korea Institute of Nuclear Safety), to prepare audit calculations for the PGSFR licensing review, a project has been started to develop regulatory technology for the SFR system, including the fuel area. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculation. In this study, to verify the new code system, a benchmark analysis is performed using data from the X447 EBR-II experiment, together with a sensitivity analysis with respect to changes in the coolant mass flux. For LWR fuel performance modeling, various advanced models have been proposed and validated on the basis of ample in-reactor test results; for SFRs, by contrast, the limited operating experience means that the current understanding of fuel behavior is also limited. The fuel composition of the X447 assembly is U-10Zr, which PGSFR also uses in its initial phase, and this motivated the selection of the X447 EBR-II experiment for the benchmark. Since regulatory audit technologies for SFRs must be secured in preparation for PGSFR licensing, the benchmark serves to verify the new audit fuel performance analysis code. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable.

  3. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    International Nuclear Information System (INIS)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk

    2016-01-01

    In KINS (Korea Institute of Nuclear Safety), to prepare audit calculations for the PGSFR licensing review, a project has been started to develop regulatory technology for the SFR system, including the fuel area. To evaluate fuel integrity and safety during irradiation, a fuel performance code must be used for the audit calculation. In this study, to verify the new code system, a benchmark analysis is performed using data from the X447 EBR-II experiment, together with a sensitivity analysis with respect to changes in the coolant mass flux. For LWR fuel performance modeling, various advanced models have been proposed and validated on the basis of ample in-reactor test results; for SFRs, by contrast, the limited operating experience means that the current understanding of fuel behavior is also limited. The fuel composition of the X447 assembly is U-10Zr, which PGSFR also uses in its initial phase, and this motivated the selection of the X447 EBR-II experiment for the benchmark. Since regulatory audit technologies for SFRs must be secured in preparation for PGSFR licensing, the benchmark serves to verify the new audit fuel performance analysis code. In terms of verification, the results of the benchmark and sensitivity analyses are considered reasonable

  4. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4F. Paks NPP: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on the 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using a blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free field locations (both downhole and surface), the foundation mat, various elevations of the structures, as well as some tanks and the stack. Then the benchmark participants were provided with structural drawings, soil data and the free field record of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, there were many other interesting problems related to the seismic safety of WWER type NPPs which were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis

  5. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    International Nuclear Information System (INIS)

    Pescarini, M.; Orsi, R.; Martinelli, T.

    2003-01-01

    In many practical radiation transport applications today the cost of solving refined, large-size and complex multi-dimensional problems lies not so much in computing as in the cumbersome effort required by an expert to prepare a detailed geometrical model and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general-purpose radiation transport codes, which require a very large number of meshes and high-performance computers. The need has clearly emerged for tools that make the task easier for the physicist or engineer, by reducing the time required, by facilitating verification of correctness through effective graphical display and, finally, by helping the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed with an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and now released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system of the DORT (2D) and TORT (3D) SN codes, permits flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system also demonstrated its validity in core criticality analyses, as for example in the Lewis MOX fuel benchmark, permitting to easily

  6. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    This paper studies three related questions: To what extent do otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm..., founders' human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be close to irreversible...

  7. Analysis of benchmark lattices with endf/b-vi, jef-2.2 and jendl-3 data

    International Nuclear Information System (INIS)

    Saglam, M.

    1995-01-01

    The NJOY Nuclear Data Processing System has been used to process the ENDF/B-VI, JEF-2.2 and JENDL-3 nuclear cross-section data bases into multigroup form. A brief description of the data bases is given and the assumptions made in processing the data from the evaluated nuclear data file format to the multigroup format are presented. The differences and similarities of the evaluated nuclear data files have been investigated by producing four-group cross sections with the GROUPIE code and by calculating thermal, fission-spectrum-averaged and 2200 m/s cross sections and resonance integrals with the INTER code. It is shown that the evaluated data for U238 in JEF and ENDF/B-VI are essentially the same, while for U235 the same is true of JENDL and ENDF/B-VI. The evaluations for U233 and Th232 differ among all three files. Several utility codes have been written to convert the multigroup library into a WIMS-D4 compatible binary library. The performance and suitability of the generated libraries have been tested with metal fueled TRX lattices, uranium oxide fueled BAPL lattices and Th232-U233 fueled BNL lattices. The use of a new thermal scattering matrix for hydrogen from ENDF/B-VI increased keff by 0.5%, while the use of ENDF/B-VI U238 decreased it by 2.5%. Although the original WIMS library performed well for the effective multiplication factor of the lattices, the prediction of the epithermal-to-thermal capture rate of U238 in the TRX and BAPL lattices improves with the new data. The effect of the fission spectrum is investigated for the BNL lattices, and it is shown that using the U233 fission spectrum instead of the original U235 spectrum gives a keff in better agreement with the experimental value. The results obtained with the new multigroup data are generally acceptable and within the experimental error range; in particular, they improve the prediction of the reaction-rate-dependent benchmark parameters

  8. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated; the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and in the dissolution of certain chemicals. In this paper we derive the critical point analytically and then compare it with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
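
    The numerical side of that comparison can be sketched with a Newton-Raphson iteration on the Peng-Robinson equation of state written as a cubic in the compressibility factor Z. The paper works with hydrocarbon mixtures in Matlab; the sketch below uses Python and assumed critical constants for pure methane, so it illustrates the iteration only.

        # Newton-Raphson solution of the Peng-Robinson cubic for Z.
        R = 8.314                               # J/(mol K)
        Tc, Pc, omega = 190.6, 4.599e6, 0.011   # assumed methane critical data

        def pr_Z(T, P, Z=1.0, tol=1e-10, max_iter=50):
            kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
            alpha = (1.0 + kappa * (1.0 - (T / Tc) ** 0.5)) ** 2
            a = 0.45724 * R**2 * Tc**2 / Pc * alpha
            b = 0.07780 * R * Tc / Pc
            A = a * P / (R * T) ** 2
            B = b * P / (R * T)
            # f(Z) = Z^3 - (1-B)Z^2 + (A - 3B^2 - 2B)Z - (AB - B^2 - B^3)
            f = lambda z: z**3 - (1 - B) * z**2 + (A - 3*B**2 - 2*B) * z \
                          - (A*B - B**2 - B**3)
            df = lambda z: 3 * z**2 - 2 * (1 - B) * z + (A - 3*B**2 - 2*B)
            for _ in range(max_iter):          # start near the vapour root Z = 1
                step = f(Z) / df(Z)
                Z -= step
                if abs(step) < tol:
                    break
            return Z

        print(f"Z(300 K,  1 bar) = {pr_Z(300.0, 1.0e5):.5f}")   # nearly ideal
        print(f"Z(300 K, 50 bar) = {pr_Z(300.0, 50.0e5):.5f}")  # real-gas departure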

  9. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    Phase diagram or phase envelope is a relation between temperature and pressure that shows the condition of equilibria between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoir engineering. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated; the dew point is the first point at which liquid is formed when a gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and in the dissolution of certain chemicals. In this paper we derive the critical point analytically and then compare it with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab

  10. HPCG Benchmark Technical Specification

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)

    2013-10-01

    The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
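
    The kernel that HPCG times can be condensed into a few lines: a conjugate gradient loop preconditioned by a symmetric Gauss-Seidel sweep. The sketch below runs on a small dense SPD system; the benchmark itself uses a sparse 27-point 3D problem with additive Schwarz domain decomposition, both of which are omitted here.

        # Conjugate gradient with a symmetric Gauss-Seidel preconditioner.
        import numpy as np
        from scipy.linalg import solve_triangular

        def sym_gs(A, r):
            """Apply M^-1 r for M = (D+L) D^-1 (D+U)."""
            D = np.diag(np.diag(A))
            L = np.tril(A, -1)
            U = np.triu(A, 1)
            w = solve_triangular(D + L, r, lower=True)           # forward sweep
            return solve_triangular(D + U, D @ w, lower=False)   # backward sweep

        def pcg(A, b, tol=1e-10, max_iter=500):
            x = np.zeros_like(b)
            r = b - A @ x
            z = sym_gs(A, r)
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                z = sym_gs(A, r)
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # A 1D Laplacian stands in for HPCG's 3D grid operator.
        n = 100
        A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        b = np.ones(n)
        x = pcg(A, b)
        print(f"residual norm: {np.linalg.norm(b - A @ x):.2e}")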

  11. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM, since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l

  12. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless... not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortium partners. The project has been funded by the Danish Agency for Trade and Industry...

  13. [Do you mean benchmarking?].

    Science.gov (United States)

    Bonnet, F; Solignac, S; Marty, J

    2008-03-01

    The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmark business cases performed inside healthcare facilities on items such as nosocomial infections or the organization of surgery facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced scorecard figures and mappings, so that the comparison between different anesthesia and intensive care services willing to start an improvement program is easy and relevant. This ready-made application is all the more accurate when detailed activity tariffs are implemented.

  14. RB reactor benchmark cores

    International Nuclear Information System (INIS)

    Pesic, M.

    1998-01-01

    A selected set of the RB reactor benchmark cores is presented in this paper. The first results of validation of the well-known Monte Carlo MCNP™ code and the adjoining neutron cross-section libraries are given. They confirm the idea behind the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA project, the International Handbook of Evaluated Criticality Safety Experiments, in the near future. (author)

  15. Comparison between RELAP5 versions for a two-phase natural circulation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Braz Filho, Francisco A.; Ribeiro, Guilherme B.; Sabundjian, Gaianê; Caldeira, Alexandre D., E-mail: fbraz@ieav.cta.br, E-mail: gbribeiro@ieav.cta.br, E-mail: alexdc@ieav.cta.br, E-mail: gdjian@ipen.br [Instituto de Estudos Avançados (IEAv), São José dos Campos, SP (Brazil). Div. de Energia Nuclear; Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2017-11-01

    RELAP5 is one of the most widely used numerical tools to predict thermal-hydraulic and neutronic phenomena in nuclear reactors. RELAP5-3D is the latest version of this software family, but RELAP5-mod3 is still widely used in Brazilian research institutes and also serves as a benchmark for several nuclear applications. Among these applications, the use of passive heat transfer mechanisms, such as natural circulation, has drawn the attention of several studies, especially after the Fukushima-Daiichi accident. In view of this, the present study compares the RELAP5-3D and RELAP5-mod3 versions, focusing on a two-phase natural circulation loop. For comparison purposes, an experimental data set is included in the analysis. Results showed that during the single-phase regime the temperature difference between versions was negligible. However, when the two-phase flow regime took place, different wavelengths and amplitudes of the flow instabilities were obtained for each version. When compared with the experimental data set, the RELAP5-3D version provided the best predictions. (author)

  16. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4E. Paks NPP: Analysis and testing. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field records of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis …

  17. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3H. Kozloduy NPP units 5/6: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field records of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis …

  18. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4E. Paks NPP: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1995-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field records of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis …

  19. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means revealed performance (how well the firm performs in the actual market environment), given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality or work organization; other factors can be a cause even if they are not directly observed by the researcher. The critical need for managers to continuously improve their firm's efficiency and effectiveness, and their need to know the success factors and competitiveness determinants, determine which performance measures are most critical to their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm-level performance are critical, interdependent activities. Firm-level variables used to infer performance are often interdependent for operational reasons; hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark it.
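
    The econometric models themselves are not given in this record; as a hedged sketch of the forecasting-and-benchmarking step it describes, the snippet below fits an OLS model of a profitability ratio on firm-level covariates and treats the fitted value as the performance benchmark. All data and variable names are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    size = rng.uniform(1, 100, 50)       # firm size (hypothetical units)
    power = rng.uniform(0, 1, 50)        # market-power proxy (hypothetical)
    roa = 0.02 + 0.001 * size + 0.05 * power + rng.normal(0, 0.01, 50)

    X = np.column_stack([np.ones_like(size), size, power])
    beta, *_ = np.linalg.lstsq(X, roa, rcond=None)  # OLS fit
    benchmark = X @ beta                            # expected ROA given covariates
    residual = roa - benchmark                      # over-/under-performance
    print("strongest over-performer: firm", residual.argmax())
    ```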

  20. Surveys and Benchmarks

    Science.gov (United States)

    Bers, Trudy

    2012-01-01

    Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…

  1. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure the performance of models against a set of defined standards. This paper proposes a benchmarking framework for the evaluation of land model performance and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on the development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties …
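
    The framework's metrics and scoring system are described only in general terms; a minimal sketch, assuming a normalized-RMSE mismatch measure mapped onto a 0-1 skill score with an a priori acceptance threshold, is shown below. Data and threshold are illustrative.

    ```python
    import numpy as np

    def nrmse(model, obs):
        """Root-mean-square mismatch, normalized by benchmark variability."""
        return np.sqrt(np.mean((model - obs) ** 2)) / np.std(obs)

    def skill(model, obs, threshold=1.0):
        """Map mismatch onto [0, 1]: 1 is perfect, 0 at the acceptance threshold."""
        return max(0.0, 1.0 - nrmse(model, obs) / threshold)

    obs = np.sin(np.linspace(0.0, 6.0, 50))                    # benchmark series
    model = obs + np.random.default_rng(1).normal(0, 0.2, 50)  # model output
    print(round(skill(model, obs), 2))
    ```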

  2. Application of benchmark dose modeling to protein expression data in the development and analysis of mode of action/adverse outcome pathways for testicular toxicity.

    Science.gov (United States)

    Chepelev, Nikolai L; Meek, M E Bette; Yauk, Carole Lyn

    2014-11-01

    Reliable quantification of gene and protein expression has potential to contribute significantly to the characterization of hypothesized modes of action (MOA) or adverse outcome pathways for critical effects of toxicants. Quantitative analysis of gene expression by benchmark dose (BMD) modeling has been facilitated by the development of effective software tools. In contrast, protein expression is still generally quantified by a less robust effect level (no or lowest [adverse] effect levels) approach, which limits its potential utility in the consideration of dose-response and temporal concordance for key events in hypothesized MOAs. BMD modeling is applied here to toxicological data on testicular toxicity to investigate its potential utility in analyzing protein expression relevant to the proposed MOA, to inform human health risk assessment. The results illustrate how BMD analysis of protein expression in animal tissues in response to toxicant exposure: (1) complements other toxicity data, and (2) contributes to consideration of the empirical concordance of dose-response relationships, as part of the weight of evidence for hypothesized MOAs, to facilitate consideration and application in regulatory risk assessment. Lack of BMD analysis in proteomics has likely limited its use for these purposes. This paper illustrates the added value of BMD modeling to support and strengthen hypothesized MOAs as a basis to facilitate the translation and uptake of the results of proteomic research into risk assessment. Copyright © 2014 Her Majesty the Queen in Right of Canada. Journal of Applied Toxicology © 2014 John Wiley & Sons, Ltd.
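
    The record does not specify which dose-response models were fitted; under that caveat, the sketch below fits a simple exponential model to hypothetical protein-expression data with SciPy and derives the benchmark dose for a 10% benchmark response (BMR), a common convention in BMD software. All numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def expo(dose, a, b):
        """Exponential dose-response: background level a, potency b."""
        return a * np.exp(b * dose)

    dose = np.array([0, 5, 10, 25, 50, 100.0])            # hypothetical doses
    expr = np.array([1.0, 1.04, 1.11, 1.30, 1.68, 2.85])  # hypothetical fold change

    (a, b), _ = curve_fit(expo, dose, expr, p0=(1.0, 0.01))

    bmr = 0.10                      # 10% change over background
    bmd = np.log(1 + bmr) / b       # solve a e^{bd} = a (1 + bmr) for d
    print(f"BMD10 = {bmd:.1f} (dose units)")
    ```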

  3. Inclusion and Human Rights in Health Policies: Comparative and Benchmarking Analysis of 51 Policies from Malawi, Sudan, South Africa and Namibia

    Science.gov (United States)

    MacLachlan, Malcolm; Amin, Mutamad; Mannan, Hasheem; El Tayeb, Shahla; Bedri, Nafisa; Swartz, Leslie; Munthali, Alister; Van Rooy, Gert; McVeigh, Joanne

    2012-01-01

    While many health services strive to be equitable, accessible and inclusive, people's right to health often goes unrealized, particularly among vulnerable groups. The extent to which health policies explicitly seek to achieve such goals sets the policy context in which services are delivered and evaluated. An analytical framework was developed – EquiFrame – to evaluate (1) the extent to which 21 Core Concepts of human rights were addressed in policy documents, and (2) coverage of 12 Vulnerable Groups who might benefit from such policies. Using this framework, analysis of 51 policies across Malawi, Namibia, South Africa and Sudan confirmed the relevance of all Core Concepts and Vulnerable Groups. Further, our analysis highlighted some very strong policies, serious shortcomings in others, as well as country-specific patterns. If social inclusion and human rights do not underpin policy formation, it is unlikely they will be inculcated in service delivery. EquiFrame facilitates policy analysis and benchmarking, and provides a means for evaluating policy revision and development. PMID:22649488

  4. Inclusion and human rights in health policies: comparative and benchmarking analysis of 51 policies from Malawi, Sudan, South Africa and Namibia.

    Directory of Open Access Journals (Sweden)

    Malcolm MacLachlan

    Full Text Available While many health services strive to be equitable, accessible and inclusive, people's right to health often goes unrealized, particularly among vulnerable groups. The extent to which health policies explicitly seek to achieve such goals sets the policy context in which services are delivered and evaluated. An analytical framework was developed – EquiFrame – to evaluate (1) the extent to which 21 Core Concepts of human rights were addressed in policy documents, and (2) coverage of 12 Vulnerable Groups who might benefit from such policies. Using this framework, analysis of 51 policies across Malawi, Namibia, South Africa and Sudan confirmed the relevance of all Core Concepts and Vulnerable Groups. Further, our analysis highlighted some very strong policies, serious shortcomings in others, as well as country-specific patterns. If social inclusion and human rights do not underpin policy formation, it is unlikely they will be inculcated in service delivery. EquiFrame facilitates policy analysis and benchmarking, and provides a means for evaluating policy revision and development.

  5. IWGFR benchmark test on signal processing for boiling noise detection, stage 2: Analysis of data from BOR-60

    International Nuclear Information System (INIS)

    Rowley, R.; Waites, C.; Macleod, I.D.

    1989-01-01

    Data from boiling experiments in the BOR-60 reactor in the USSR have been supplied by the IAEA to enable analysis techniques to be compared. The signals have been analysed at RNL using two basic techniques, high-frequency RMS analysis and pulse counting analysis, and two more sophisticated methods, pattern recognition and pulse timing analysis. All methods indicated boiling successfully; pulse counting proved more sensitive than RMS for detecting the onset of boiling. Pattern recognition shows promise as a very reliable detector, provided the background can be defined. Data from an ionisation chamber were also supplied, and there was good correlation between the neutronic and acoustic signals. (author). 25 figs, 4 tabs
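
    The record names the two basic techniques without their implementations; the sketch below is an assumption-level illustration of both on a sampled acoustic signal: a crude FFT high-pass followed by windowed RMS tracking, and a threshold-crossing pulse counter. Sampling rate, cut-off, window and threshold values are placeholders, not values from the study.

    ```python
    import numpy as np

    def hf_rms(x, fs, fcut=50e3, win=1024):
        """High-pass the signal above fcut, then track RMS over fixed windows."""
        X = np.fft.rfft(x)
        X[np.fft.rfftfreq(x.size, 1 / fs) < fcut] = 0   # crude high-pass filter
        y = np.fft.irfft(X, n=x.size)
        nwin = x.size // win
        return np.sqrt(np.mean(y[: nwin * win].reshape(nwin, win) ** 2, axis=1))

    def pulse_count(x, threshold):
        """Count upward threshold crossings (simple acoustic pulse counting)."""
        above = x > threshold
        return int(np.sum(~above[:-1] & above[1:]))

    fs = 1_000_000                              # 1 MHz sampling (assumed)
    t = np.arange(fs // 10) / fs
    sig = np.random.default_rng(4).normal(0, 1, t.size)
    sig[t > 0.05] *= 3                          # crude stand-in for boiling onset
    print(hf_rms(sig, fs)[:3], pulse_count(sig, 4.0))
    ```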

  6. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3F. Kozloduy NPP units 5/6: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field records of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis …

  7. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 3G. Kozloduy NPP units 5/6: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented respectively by Kozloduy NPP Units 5/6 and Paks NPP as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed both for Paks NPP and for Kozloduy NPP Unit 5. Firstly, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, as well as on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field records of the blast experiment. Their task was to make a blind prediction of the response at preselected locations. The analytical results from these participants were then compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis …

  8. Benchmarking, benchmarks, or best practices? Applying quality improvement principles to decrease surgical turnaround time.

    Science.gov (United States)

    Mitchell, L

    1996-01-01

    The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
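
    As a quick arithmetic check, the 18% figure follows directly from the two reported turnaround times (values from the abstract; the rounding convention is assumed):

    ```python
    before, after = 19.9, 16.3              # average turnaround times [minutes]
    improvement = (before - after) / before
    print(f"{improvement:.0%}")             # -> 18%, matching the reported value
    ```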

  9. Confidential benchmarking based on multiparty computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt

    We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks' and the consultancy house's data stay confidential; the banks, as clients, learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping …
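
    The record states that the benchmarking model is based on linear programming but does not give the formulation; efficiency benchmarking of this kind is commonly posed as data envelopment analysis (DEA). Under that assumption, the sketch below computes an input-oriented CCR efficiency score in the clear with SciPy; the MPC layer (SPDZ) is entirely omitted, and the farm data are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dea_efficiency(X, Y, k):
        """Input-oriented CCR efficiency of unit k (1.0 = on the frontier).

        X: (n_units, n_inputs), Y: (n_units, n_outputs).
        Variables: theta, then lambda_1..lambda_n (envelopment weights).
        """
        n, m = X.shape
        s = Y.shape[1]
        c = np.zeros(n + 1)
        c[0] = 1.0                                   # minimize theta
        A_in = np.hstack([-X[[k]].T, X.T])           # sum λ_j x_j <= theta x_k
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # sum λ_j y_j >= y_k
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                      bounds=[(0, None)] * (n + 1))
        return res.x[0]

    # Hypothetical farms: inputs = (debt, costs), output = (production,)
    X = np.array([[4.0, 3.0], [2.0, 4.0], [5.0, 5.0]])
    Y = np.array([[2.0], [2.0], [2.5]])
    print([round(dea_efficiency(X, Y, k), 2) for k in range(3)])
    ```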

  10. Finite element analysis of a 1:4 scale PCCV model - Korea Atomic Energy Research Institute, Phase 2

    International Nuclear Information System (INIS)

    Lee, Hong-pyo; Choun, Young-sun

    2005-01-01

    This report covers Phase 2 of the International Standard Problem 48 (ISP48) benchmark on containment integrity. It describes the finite element (FE) analysis of a 1:4 scale model of a pre-stressed concrete containment vessel (PCCV). The objective of the present FE analysis is to evaluate the ultimate internal pressure capacity of the PCCV as well as its failure mechanism when the model is subjected to a monotonically increasing internal pressure beyond its design pressure. The FE analysis used two concrete failure criteria with the commercial code ABAQUS: an axisymmetric model with the modified Drucker-Prager failure criterion, and a 3-dimensional model with a damaged plasticity model. The FE results for the ultimate pressure and failure modes are in good agreement with the experimental data.

  11. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4A. Paks NPP: Analysis/testing. Working material

    International Nuclear Information System (INIS)

    1995-01-01

    The Co-ordinated Research Programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working Material contains reports on data related to the seismic analyses of the structures of the Paks and Kozloduy reactor buildings and of WWER-440/213 primary coolant loops with different antiseismic devices

  12. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4B. Paks NPP: Analysis/testing. Working material

    International Nuclear Information System (INIS)

    1995-01-01

    The Co-ordinated Research Programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working Material contains reports on a dynamic study of the main building of the Paks NPP, a shake-table investigation at Paks NPP, and the final report of the Co-ordinated Research Programme

  13. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4A. Paks NPP: Analysis/testing. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Co-ordinated Research Programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working Material contains reports on data related to the seismic analyses of the structures of the Paks and Kozloduy reactor buildings and of WWER-440/213 primary coolant loops with different antiseismic devices.

  14. Phase Plane Analysis Method of Nonlinear Traffic Phenomena

    Directory of Open Access Journals (Sweden)

    Wenhuan Ai

    2015-01-01

    Full Text Available A new phase plane analysis method for analyzing complex nonlinear traffic phenomena is presented in this paper. The method uses variable substitution to transform a traditional traffic flow model into a new model that is suitable for analysis in the phase plane. Using the new model, various traffic phenomena, such as the well-known shock waves, rarefaction waves, and stop-and-go waves, are analyzed in the phase plane. From the phase plane diagrams, we can see the relationship between traffic jams and system instability, so the problem of traffic flow can be converted into one of system stability. The results show that the traffic phenomena described by the new method are consistent with those described by traditional methods. Moreover, the phase plane analysis highlights the unstable traffic phenomena of chief concern and describes the variation of density or velocity with time or road section more clearly.
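
    The transformed traffic model is not reproduced in this record; as a generic, hedged illustration of phase plane analysis, the sketch below integrates a two-state system from several initial conditions and reads stability off the trajectories. The system is a stand-in, not the paper's model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, u, a=0.5):
        """Illustrative 2-D system u' = f(u), standing in for the transformed model."""
        x, y = u
        return [y, -a * y - np.sin(x)]

    # Trajectories trace the phase portrait: spiraling into the fixed point
    # (0, 0) indicates stable (free) flow; divergence or limit cycles would
    # signal the instabilities associated with jam formation.
    for x0 in np.linspace(-2.0, 2.0, 5):
        sol = solve_ivp(rhs, (0.0, 30.0), [x0, 0.0], max_step=0.05)
        print(f"x0={x0:+.1f} -> final state "
              f"({sol.y[0, -1]:+.3f}, {sol.y[1, -1]:+.3f})")
    ```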

  15. Wavelet analysis of the nuclear phase space

    Energy Technology Data Exchange (ETDEWEB)

    Jouault, B.; Sebille, F.; Mota, V. de la

    1997-12-31

    The description of transport phenomena in nuclear matter is addressed in a new approach based on the mathematical theory of wavelets and the projection methods of statistical physics. The advantage of this framework is that it offers the opportunity to use information concepts common to both the formulation of physical properties and the mathematical description. This paper focuses on two features: the extraction of relevant information using the geometrical properties of the underlying phase space, and the optimization of the theoretical and numerical treatments based on convenient choices of the representation spaces. (author). 34 refs.

  16. Wavelet analysis of the nuclear phase space

    International Nuclear Information System (INIS)

    Jouault, B.; Sebille, F.; Mota, V. de la.

    1997-01-01

    The description of transport phenomena in nuclear matter is addressed in a new approach based on the mathematical theory of wavelets and the projection methods of statistical physics. The advantage of this framework is that it offers the opportunity to use information concepts common to both the formulation of physical properties and the mathematical description. This paper focuses on two features: the extraction of relevant information using the geometrical properties of the underlying phase space, and the optimization of the theoretical and numerical treatments based on convenient choices of the representation spaces. (author)

  17. An investigation of subchannel analysis models for single-phase and two-phase flow

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Dae Hyun

    1996-01-01

    The governing equations and lateral transport models of subchannel analysis codes, the most widely used tools for analysing thermal-hydraulic fields in reactor cores, have been thoroughly investigated in this study. The procedure for deriving the subchannel integral balance equations from the local instantaneous phase equations was investigated in stages. The characteristics of the governing equations according to the treatment of phase velocity were studied, and equations based on the drift-flux equilibrium formulation have been derived. Turbulent mixing and void drift models, which considerably affect the accuracy of subchannel analysis codes, have been reviewed. In addition, some representative single-phase and two-phase turbulent mixing models have been introduced. (author). 5 tabs., 4 figs., 16 refs.
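
    The drift-flux formulation mentioned above ties the void fraction to the phase superficial velocities; the standard Zuber-Findlay form is α = j_g / (C0 j + V_gj). A minimal sketch, with parameter values that are merely typical placeholders rather than values from the report:

    ```python
    def void_fraction(jg, jf, C0=1.13, Vgj=0.25):
        """Zuber-Findlay drift-flux void fraction.

        jg, jf: superficial gas/liquid velocities [m/s]
        C0: distribution parameter; Vgj: drift velocity [m/s]
        (default values typical of churn-turbulent flow, assumed here)
        """
        j = jg + jf                  # total superficial velocity
        return jg / (C0 * j + Vgj)

    print(round(void_fraction(0.5, 1.0), 3))
    ```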

  18. Quick Spacecraft Thermal Analysis Tool, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — For spacecraft design and development teams concerned with cost and schedule, the Quick Spacecraft Thermal Analysis Tool (QuickSTAT) is an innovative software suite...

  19. Lesson learned from the SARNET wall condensation benchmarks

    International Nuclear Information System (INIS)

    Ambrosini, W.; Forgione, N.; Merli, F.; Oriolo, F.; Paci, S.; Kljenak, I.; Kostka, P.; Vyskocil, L.; Travis, J.R.; Lehmkuhl, J.; Kelm, S.; Chin, Y.-S.; Bucci, M.

    2014-01-01

    Highlights: • The results of the benchmarking activity on wall condensation are reported. • The work was performed in the frame of SARNET. • General modelling techniques for condensation are discussed. • Results of the University of Pisa and of other benchmark participants are discussed. • The lessons learned are drawn. - Abstract: The prediction of condensation in the presence of noncondensable gases has received continuing attention in the frame of the Severe Accident Research Network of Excellence, both in the first (2004–2008) and in the second (2009–2013) EC integrated projects. One reason this basic phenomenon, although addressed by classical treatments dating from the first decades of the last century, remains so relevant is the interest in developing updated CFD models for reactor containment analysis, which requires validating the available modelling techniques at a new level. In the frame of SARNET, benchmarking activities were undertaken, taking advantage of the work performed at different institutions in setting up and developing models for steam condensation in conditions of interest for nuclear reactor containment. The activity proceeded in four steps, involving: (1) an idealized problem freely inspired by the actual conditions occurring in an experimental facility, CONAN, installed at the University of Pisa; (2) a first comparison with experimental data purposely collected in the CONAN facility; (3) a second comparison with data available from experimental campaigns performed in the same apparatus before the inclusion of the activities in SARNET; (4) a third exercise involving data obtained at lower mixture velocity than in previous campaigns, aimed at providing conditions closer to those addressed in reactor containment analyses. The last step of the benchmarking activity required changing the configuration of the experimental apparatus to achieve the lower flow rates involved in the new test specifications. …
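
    The benchmark models themselves are not detailed in this record; a common engineering baseline for wall condensation with noncondensables is a diffusion-layer estimate built on the heat-and-mass-transfer analogy. The sketch below is that kind of baseline, using a Dittus-Boelter-type Sherwood correlation; it is an assumption-level illustration, not one of the SARNET participants' models.

    ```python
    def condensation_flux(rho_v_bulk, rho_v_iface, Re, Sc, D_ab, L):
        """Diffusion-layer estimate of wall condensation mass flux [kg/m2/s].

        rho_v_*: vapour partial densities in the bulk and at the interface
        Re, Sc:  Reynolds and Schmidt numbers of the gas mixture
        D_ab:    vapour-gas diffusivity [m2/s]; L: characteristic length [m]
        """
        Sh = 0.023 * Re**0.8 * Sc**(1.0 / 3.0)  # Dittus-Boelter-type correlation
        h_m = Sh * D_ab / L                     # mass-transfer coefficient [m/s]
        return h_m * (rho_v_bulk - rho_v_iface)

    # Illustrative numbers only, loosely air-steam at containment-like conditions
    print(condensation_flux(0.10, 0.05, Re=2.0e4, Sc=0.6, D_ab=3.0e-5, L=0.1))
    ```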

  20. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    Science.gov (United States)

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  1. Evaluation of PWR and BWR pin cell benchmark results

    International Nuclear Information System (INIS)

    Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J.; Hoogenboom, J.E.; Leege, P.F.A. de; Voet, J. van der; Verhagen, F.C.M.

    1991-12-01

    Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmarks as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: part of the results is satisfactory, while some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs

  2. Evaluation of PWR and BWR pin cell benchmark results

    Energy Technology Data Exchange (ETDEWEB)

    Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))

    1991-12-01

    Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmarks as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: part of the results is satisfactory, while some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs.

  3. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, on the basis of four different applications of benchmarking. The regulation of utility companies is treated, after which …

  4. Integral benchmarks with reference to thorium fuel cycle

    International Nuclear Information System (INIS)

    Ganesan, S.

    2003-01-01

    This is a PowerPoint presentation about the Indian participation in the CRP 'Evaluated Data for the Thorium-Uranium Fuel Cycle'. The plans and scope of the Indian participation are to provide selected integral experimental benchmarks for nuclear data validation, including Indian thorium burn-up benchmarks and post-irradiation examination studies, and to compare basic evaluated data files and analyse selected benchmarks for the Th-U fuel cycle

  5. Establishing benchmarks for the management of elevated liver enzymes and/or dilated biliary trees in an urban safety net hospital: analysis of 915 subjects.

    Science.gov (United States)

    Liu, Laindy; Cripps, Michael W; Riggle, Andrew J; Wolf, Steven E; Nakonezny, Paul A; Phelan, Herb A

    2015-12-01

    The push for public reporting of outcomes necessitates relevant benchmarks for disease states across different settings. This study establishes benchmarks for choledocholithiasis management in a safety net hospital setting. We reviewed all patients admitted to our acute care surgery service with biochemical evidence of choledocholithiasis who underwent same-admission cholecystectomy (CCY) between July 2012 and December 2013. During this 18-month period, 915 patients were admitted with biochemical evidence of choledocholithiasis. Descriptive statistics for the cohort are provided, which include a 51% rate of obesity and 95% rate of pathologic cholecystitis. Conversion rates of 4% and complication rates of 6% were found. The majority had a CCY without biliary imaging (n = 630, 68.9%). Relevant benchmarks are characterized, and results of a practice pattern of omitting pre- or intraoperative biliary tree imaging are described. These findings serve as a first benchmark of choledocholithiasis management for urban safety net hospitals. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. The application of isogeometric analysis to the neutron diffusion equation for a pincell problem with an analytic benchmark

    International Nuclear Information System (INIS)

    Hall, S.K.; Eaton, M.D.; Williams, M.M.R.

    2012-01-01

    Highlights: ► Isogeometric analysis used to obtain solutions to the neutron diffusion equation. ► Exact geometry captured for a circular fuel pin within a square moderator. ► Comparisons are made between the finite element method and isogeometric analysis. ► Error and observed order of convergence found using an analytic solution. -- Abstract: In this paper the neutron diffusion equation is solved using Isogeometric Analysis (IGA), which is an attempt to generalise Finite Element Analysis (FEA) to include exact geometries. In contrast to FEA, the basis functions are rational functions instead of polynomials. These rational functions, called non-uniform rational B-splines (NURBS), are used both to capture the geometry and to approximate the solution. The method of manufactured solutions is used to verify a MATLAB implementation of IGA, which is then applied to a pincell problem: a circular uranium fuel pin within a square block of graphite moderator. A new method is used to compute an analytic solution to a simplified version of this problem, which is then used to observe the order of convergence of the numerical scheme. Comparisons are made against quadratic finite elements for the pincell problem, and it is found that the disadvantage factor computed using IGA is less accurate, due to a cancellation of errors in the FEA solution. A modified pincell problem with vacuum boundary conditions is then considered, for which IGA is shown to outperform FEA.
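
    The paper's MATLAB implementation is not reproduced here; as a hedged illustration of the key ingredient, rational B-spline bases, the Python sketch below evaluates B-splines by the Cox-de Boor recursion and forms a point on a NURBS curve. With degree 2, the knot vector and weights shown represent a quarter circle exactly, which is what lets IGA capture a circular fuel pin without geometric error.

    ```python
    import numpy as np

    def bspline(i, p, knots, t):
        """Cox-de Boor recursion: i-th B-spline of degree p evaluated at t."""
        if p == 0:
            return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + p] > knots[i]:
            left = ((t - knots[i]) / (knots[i + p] - knots[i])
                    * bspline(i, p - 1, knots, t))
        if knots[i + p + 1] > knots[i + 1]:
            right = ((knots[i + p + 1] - t) / (knots[i + p + 1] - knots[i + 1])
                     * bspline(i + 1, p - 1, knots, t))
        return left + right

    def nurbs_point(ctrl, w, p, knots, t):
        """Point on a NURBS curve: weighted B-spline combination, normalized."""
        N = np.array([bspline(i, p, knots, t) for i in range(len(w))])
        return (N * w) @ ctrl / (N * w).sum()

    # Quarter circle of radius 1, degree 2: an exact NURBS representation
    knots = [0, 0, 0, 1, 1, 1]
    ctrl = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
    w = np.array([1.0, np.sqrt(2) / 2, 1.0])
    pt = nurbs_point(ctrl, w, 2, knots, 0.5)
    print(np.linalg.norm(pt))  # -> 1.0: the point lies exactly on the circle
    ```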

  7. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 2. Generic material: Codes, standards, criteria. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The Co-ordinated Research Programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working Material contains reports related to generic material, namely codes, standards and criteria for the benchmark analysis.

  8. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 2. Generic material: Codes, standards, criteria. Working material

    International Nuclear Information System (INIS)

    1995-01-01

    The Co-ordinated Research Programme on the benchmark study for the seismic analysis and testing of WWER-type nuclear power plants was initiated subsequent to a request from representatives of Member States. The conclusions adopted at the Technical Committee Meeting on Seismic Issues related to existing nuclear power plants, held in Tokyo in 1991, called for the harmonization of methods and criteria used in Member States in issues related to seismic safety. The Consultants' Meeting which followed resulted in a working document for the CRP. It was decided that a benchmark study was the most effective way to achieve the principal objective. Two types of WWER reactors (WWER-440/213 and WWER-1000) were selected as prototypes for the benchmark exercise, to be tested on a full scale using explosions and/or vibration generators. The two prototypes are Kozloduy Units 5/6 for WWER-1000 and Paks for WWER-440/213. This volume of Working Material contains reports related to generic material, namely codes, standards and criteria for the benchmark analysis

  9. Cloud benchmarking for performance

    OpenAIRE

    Varghese, Blesson; Akgun, Ozgur; Miguel, Ian; Thai, Long; Barker, Adam

    2014-01-01

    Date of Acceptance: 20/09/2014. How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The question is addressed by proposing a six-step benchmarking methodology in which a user provides a set of four weights that indicate how important each of the following groups is: memory, processor, computa...

  10. Benchmark Analysis for Condition Monitoring Test Techniques of Aged Low Voltage Cables in Nuclear Power Plants. Final Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2017-10-01

    … used for in situ testing of installed cables while a nuclear power plant is operating. The results of these benchmark tests were then compared to identify the best condition monitoring methods and to establish recommendations for improvements. The conclusions of the data analysis provided insight into which condition monitoring techniques yield usable or traceable results

  11. Extensive regularization of the coupled cluster methods based on the generating functional formalism: Application to gas-phase benchmarks and to the SN2 reaction of CHCl3 and OH- in water

    International Nuclear Information System (INIS)

    Kowalski, Karol; Valiev, Marat

    2009-01-01

    The recently introduced energy expansion based on the use of the generating functional (GF) [K. Kowalski and P. D. Fan, J. Chem. Phys. 130, 084112 (2009)] provides a way of constructing size-consistent noniterative coupled cluster (CC) corrections in terms of moments of the CC equations. To take advantage of this expansion in a strongly interacting regime, regularization of the cluster amplitudes is required in order to counteract the effect of excessive growth of the norm of the CC wave function. Although proven to be efficient, the previously discussed form of the regularization does not lead to rigorously size-consistent corrections. In this paper we address the issue of size-consistent regularization of the GF expansion by redefining the equations for the cluster amplitudes. The performance and basic features of the proposed methodology are illustrated on several gas-phase benchmark systems. Moreover, the regularized GF approaches are combined with a quantum mechanics/molecular mechanics module and applied to describe the SN2 reaction of CHCl3 and OH- in aqueous solution.

  12. Wavelet analysis of the nuclear phase space

    International Nuclear Information System (INIS)

    Jouault, B.; Sebille, F.; De La Mota, V.

    1997-01-01

    The description of complex systems requires selecting and compacting the relevant information. The wavelet theory constitutes an appropriate framework for defining adapted representation bases obtained from a controlled hierarchy of approximations. The optimization of the wavelet analysis depends mainly on the chosen analysis method and wavelet family. Here the analysis of the harmonic oscillator wave function was carried out using a bi-orthogonal spline wavelet basis, which satisfies the symmetry requirements and can be approximated by simple analytical functions. The goal of this study was to determine a selection criterion that minimizes the number of elements needed for an optimal description of the analysed functions. An essential point consists in exploiting the complementarity of the wavelets and the scaling functions in order to reproduce the oscillating and peripheral parts of the wave functions. The wavelet-basis representation allows a sequence of approximations of the density matrix to be defined. Thus, this wavelet representation of the density matrix offers an optimal basis for describing both the static nuclear configurations and their time evolution. This information compacting procedure is performed in a controlled manner and preserves the structure of the system wave functions and consequently some of their quantum properties
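
    The authors' selection criterion is not reproduced in this record; the sketch below only illustrates the information-compaction idea with PyWavelets (assumed installed): decompose a harmonic-oscillator-like wave function on a bi-orthogonal spline wavelet, hard-threshold the small coefficients, and reconstruct. Wavelet name, level and threshold are illustrative choices.

    ```python
    import numpy as np
    import pywt  # PyWavelets, assumed available

    x = np.linspace(-6.0, 6.0, 512)
    psi = np.exp(-x**2 / 2) * (4 * x**2 - 2)   # n = 2 oscillator-like state

    coeffs = pywt.wavedec(psi, 'bior3.5', level=4)   # bi-orthogonal spline wavelet
    thresh = 0.05 * max(abs(c).max() for c in coeffs)
    kept = [pywt.threshold(c, thresh, mode='hard') for c in coeffs]
    psi_hat = pywt.waverec(kept, 'bior3.5')

    n_kept = sum(int((abs(c) > 0).sum()) for c in kept)
    err = np.abs(psi_hat[:psi.size] - psi).max()
    print(f"coefficients kept: {n_kept}, max reconstruction error: {err:.2e}")
    ```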

  13. Benchmark study for the seismic analysis and testing of WWER type NPPs. Final report of a co-ordinated research project

    International Nuclear Information System (INIS)

    2000-10-01

    This report presents the final results of a five-year IAEA Coordinated Research Project (CRP) launched in 1992. The main goal was the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing nuclear power plants. To this end, most of the activities focused on a benchmarking exercise involving mixed numerical and experimental dynamic analyses carried out on two reference units of WWER reactors (WWER-1000 and WWER-440/213): Kozloduy NPP Units 5/6 and Paks NPP. Twenty-four institutions from 13 countries participated in the CRP, and two other institutions from Japan contributed informally. The objective of this TECDOC is to provide a consistent and comprehensive summary of the results of the work performed in the CRP through the preparation of a 'self-standing' report with the general conclusions of the programme: a great deal of information from the Background Documents has been included in this report, together with a set of recommendations for future work in this field

  14. Analysis of radially heterogeneous ZPPR-13A benchmark for investigating the spatial dependence of the calculated-to-experiment ratio for control rod worths

    International Nuclear Information System (INIS)

    Mahalakshmi, B.; Mohanakrishnan, P.

    1993-01-01

    Investigations were performed on the ZPPR-13A critical assembly to determine the cause of the radial variation of the calculated-to-experimental (C/E) ratio for control rod worth in large heterogeneous cores. The effects of errors in cross sections, mesh size, group condensation, transport, and modeling were studied by using two- and three-dimensional diffusion calculations and three-dimensional transport calculations. In that process, the cross-section set and the calculation scheme being used for fast reactor design in India were revalidated; the cross-section set was found to yield satisfactory results. Three-dimensional calculations with adjusted and unadjusted cross sections confirmed that the error in cross sections was largely responsible for the radial dependence of the C/E ratios. The contributions from group condensation and mesh size errors were < 2%, and from modeling errors and the transport correction < 1%; the effect of these errors is insignificant compared with the effect of the cross-section error. The analysis also showed that, even without the adjustment in the diffusion coefficient suggested in earlier studies, a satisfactory prediction is obtained, at least for this benchmark. The diffusion-to-transport correction for the control rod worth was found to be -7%

  15. Benchmark data for a large reprocessing plant for evaluation of advanced data analysis algorithms and safeguards system design

    International Nuclear Information System (INIS)

    Burr, T.L.; Coulter, C.A.; Wangen, L.E.

    1998-02-01

    This report describes the simulation and analysis of solution level and density (L,D) in all key main process tanks of a large reprocessing plant. In addition, initial provisions were made to include temperature (T) data in the analysis at a later time. FacSim, a simulation program developed at Los Alamos, was used to generate simulated process operating data for the Rokkasho Reprocessing Plant (RRP) now under construction in Japan. Both normal facility operation and more than thirty abrupt diversion scenarios were modeled over 25-day periods of simulated operation beginning with a clean startup of the facility. The simulation tracked uranium, plutonium (both +3 and +4 oxidation states), HNO3 diluent, and tributyl phosphate from the input accountability vessel to the plutonium output accountability vessel, with the status of each process vessel and many pipes recorded at intervals of approximately four minutes. These data were used to determine solution volume and density values in each process vessel as a function of time
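
    The advanced algorithms the data set is meant to exercise are not described in this record; a standard change-detection baseline for such solution-monitoring residuals is a one-sided CUSUM, sketched below on simulated standardized residuals. Parameters and the injected mean shift are illustrative, not taken from the report.

    ```python
    import numpy as np

    def cusum(residuals, k=0.5, h=5.0):
        """One-sided CUSUM on standardized residuals; returns first alarm index."""
        s = 0.0
        for i, r in enumerate(residuals):
            s = max(0.0, s + r - k)   # k: drift allowance, h: decision limit
            if s > h:
                return i
        return None

    rng = np.random.default_rng(2)
    resid = rng.normal(0, 1, 400)     # in-control level/density residuals
    resid[250:] += 1.0                # simulated slow-diversion mean shift
    print("alarm at sample", cusum(resid))
    ```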

  16. Microfabricated Gas Phase Chemical Analysis Systems

    International Nuclear Information System (INIS)

    FRYE-MASON, GREGORY CHARLES; HELLER, EDWIN J.; HIETALA, VINCENT M.; KOTTENSTETTE, RICHARD; LEWIS, PATRICK R.; MANGINELL, RONALD P.; MATZKE, CAROLYN M.; WONG, CHUNGNIN C.

    1999-01-01

    A portable, autonomous, hand-held chemical laboratory (µChemLab™) is being developed for trace detection (ppb) of chemical warfare (CW) agents and explosives in real-world environments containing high concentrations of interfering compounds. Microfabrication is utilized to provide miniature, low-power components characterized by rapid, sensitive and selective response. Sensitivity and selectivity are enhanced using two parallel analysis channels, each containing the sequential connection of a front-end sample collector/concentrator, a gas chromatographic (GC) separator, and a surface acoustic wave (SAW) detector. Component design and fabrication and system performance are described

  17. Benchmarking reference services: an introduction.

    Science.gov (United States)

    Marshall, J G; Buchanan, H S

    1995-01-01

    Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.

  18. International benchmark for coupled codes and uncertainty analysis in modelling: switching-Off of one of the four operating main circulation pumps at nominal reactor power at NPP Kalinin unit 3

    International Nuclear Information System (INIS)

    Tereshonok, V. A.; Nikonov, S. P.; Lizorkin, M. P.; Velkov, K; Pautz, A.; Ivanov, V.

    2008-01-01

    The paper briefly describes the specification of an international NEA/OECD benchmark based on measured plant data. During the commissioning tests for nominal power at NPP Kalinin Unit 3, many measurements of neutronic and thermal-hydraulic parameters were carried out in the reactor pressure vessel and in the primary and secondary circuits. One of the measured data sets, for the transient 'Switching-off of one Main Circulation Pump (MCP) at nominal power', has been chosen for the validation of coupled thermal-hydraulic and neutron-kinetic system codes and, additionally, for performing uncertainty analyses as part of the NEA/OECD Uncertainty Analysis in Modeling Benchmark. The benchmark is open to all countries and institutions. The experimental data and the final specification with the cross-section libraries will be provided to participants by the NEA/OECD only after an official declaration of participation in the benchmark and delivery of the simulated transient results for comparison. (Author)

  19. An algorithm for reliability analysis of phased-mission systems

    International Nuclear Information System (INIS)

    Ma, Y.; Trivedi, K.S.

    1999-01-01

    The purpose of this paper is to describe an efficient Boolean algebraic algorithm that provides an exact solution for the unreliability of a multi-phase mission system whose configurations are described through fault trees. The algorithm extends and improves the Boolean method originally proposed by Somani and Trivedi. By using the Boolean algebraic method, we provide an efficient modelling approach which avoids the state-space explosion and the mapping problems that are encountered by the Markov chain approach. To calculate the exact solution for a phased-mission system with deterministic phase durations, we introduce the sum of disjoint phase products (SDPP) formula, a phased extension of the sum of disjoint products (SDP) formula. Computationally, the algorithm is quite efficient because it calls an SDP generation algorithm in the early stage of the SDPP computation. In this way, the phase products generated in the early stage of the SDPP formula are guaranteed to be disjoint, and the number of intermediate phase products is greatly reduced. In this paper, we also consider the transient analysis of the phased-mission system. Special care is needed to account for possible latent failures at the mission phase change times: if there are more stringent success criteria just after a phase change, an unreliability jump occurs at that time. Finally, the algorithm has been implemented in the software package SHARPE, which makes the complexities of the phased-mission system transparent to potential users. The user can conveniently specify a phased-mission model at a high level (through fault trees) and analyze the system quantitatively.
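    Although the abstract describes the SDPP algorithm only at a high level, the quantity it computes is easy to illustrate. The Monte Carlo sketch below estimates the unreliability of a hypothetical two-phase mission (phase durations, failure rates and success criteria are all invented for illustration); it is not the Somani-Trivedi Boolean method, which would return this probability exactly rather than statistically.

    ```python
    import random

    # Hypothetical two-phase mission: phase 1 (0-10 h) needs components A AND B,
    # phase 2 (10-30 h) needs A OR B. Rates and durations are invented.
    LAMBDA = {"A": 1e-3, "B": 2e-3}    # assumed constant failure rates (per hour)
    PHASE_END = [10.0, 30.0]           # deterministic phase change times

    def phase_ok(alive, phase):
        # per-phase success criterion (the negation of that phase's fault tree)
        return (alive["A"] and alive["B"]) if phase == 0 else (alive["A"] or alive["B"])

    def mission_fails(rng):
        # one exponential lifetime per component; the criteria are monotone within
        # a phase, so checking at each phase end (and hence at phase changes,
        # where latent failures surface) is sufficient
        life = {c: rng.expovariate(lam) for c, lam in LAMBDA.items()}
        for phase, t_end in enumerate(PHASE_END):
            alive = {c: life[c] > t_end for c in life}
            if not phase_ok(alive, phase):
                return True
        return False

    rng = random.Random(42)
    n = 200_000
    unrel = sum(mission_fails(rng) for _ in range(n)) / n
    print(f"estimated mission unreliability: {unrel:.4f}")
    ```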

  20. Thermal-hydraulic instability analysis of a two-phase natural circulation loop

    International Nuclear Information System (INIS)

    Sesini, Paula Aida

    1998-01-01

    This work presents an analysis of a loop operating in the natural circulation regime. Experiments were performed in a rectangular closed circuit under one- and two-phase flow conditions. Numerical analyses were performed initially with the CIRNAT code and afterwards with RELAP5/MOD2. The limitations of CIRNAT were studied and new developments for the code are proposed. (author)

  1. Benchmarking LWR codes capability to model radionuclide deposition within SFR containments: An analysis of the Na ABCOVE tests

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, Luis E., E-mail: luisen.herranz@ciemat.es [CIEMAT, Unit of Nuclear Safety Research, Av. Complutense, 40, 28040 Madrid (Spain); Garcia, Monica, E-mail: monica.gmartin@ciemat.es [CIEMAT, Unit of Nuclear Safety Research, Av. Complutense, 40, 28040 Madrid (Spain); Morandi, Sonia, E-mail: sonia.morandi@rse-web.it [Nuclear and Industrial Plant Safety Team, Power Generation System Department, RSE, via Rubattino 54, 20134 Milano (Italy)

    2013-12-15

    Highlights: • Assessment of LWR codes capability to model aerosol deposition within SFR containments. • Original hypotheses proposed to partially accommodate drawbacks from Na oxidation reactions. • A defined methodology to derive a more accurate characterization of Na-based particles. • Key missing models in LWR codes for SFR applications are identified. - Abstract: Postulated BDBAs in SFRs might result in contaminated-coolant discharge at high temperature into the containment. A full-scope safety analysis of this reactor type requires computation tools properly validated in all the related fields. Radionuclide transport, particularly within the containment, is one of those fields. This sets two major challenges: to have reliable codes available and to build up a sound database. Development of SFR source term codes was abandoned in the 1980s and few data are available at present. The ABCOVE experimental programme conducted in the 1980s is still a reference in the field. The present paper is aimed at assessing the current capability of LWR codes to model aerosol deposition within an SFR containment under BDBA conditions. Through a systematic application of the ASTEC, ECART and MELCOR codes to relevant ABCOVE tests, insights have been gained into the drawbacks and capabilities of these computation tools. Hypotheses and approximations have been adopted so that…

  2. Benchmarking LWR codes capability to model radionuclide deposition within SFR containments: An analysis of the Na ABCOVE tests

    International Nuclear Information System (INIS)

    Herranz, Luis E.; Garcia, Monica; Morandi, Sonia

    2013-01-01

    Highlights: • Assessment of LWR codes capability to model aerosol deposition within SFR containments. • Original hypotheses proposed to partially accommodate drawbacks from Na oxidation reactions. • A defined methodology to derive a more accurate characterization of Na-based particles. • Key missing models in LWR codes for SFR applications are identified. - Abstract: Postulated BDBAs in SFRs might result in contaminated-coolant discharge at high temperature into the containment. A full-scope safety analysis of this reactor type requires computation tools properly validated in all the related fields. Radionuclide transport, particularly within the containment, is one of those fields. This sets two major challenges: to have reliable codes available and to build up a sound database. Development of SFR source term codes was abandoned in the 1980s and few data are available at present. The ABCOVE experimental programme conducted in the 1980s is still a reference in the field. The present paper is aimed at assessing the current capability of LWR codes to model aerosol deposition within an SFR containment under BDBA conditions. Through a systematic application of the ASTEC, ECART and MELCOR codes to relevant ABCOVE tests, insights have been gained into the drawbacks and capabilities of these computation tools. Hypotheses and approximations have been adopted so that…

  3. Analysis of the Numerical Diffusion in Anisotropic Mediums: Benchmarks for Magnetic Field Aligned Meshes in Space Propulsion Simulations

    Directory of Open Access Journals (Sweden)

    Daniel Pérez-Grande

    2016-11-01

    This manuscript explores numerical errors in highly anisotropic diffusion problems. First, the paper addresses the use of regular structured meshes in numerical solutions versus meshes aligned with the preferential directions of the problem. Numerical diffusion in structured meshes is quantified by solving the classical anisotropic diffusion problem; the analysis is exemplified by application to a numerical model of conducting fluids under magnetic confinement, where the rates of transport in directions parallel and perpendicular to a magnetic field are quite different. The numerical diffusion errors in this problem motivate the use of magnetic field aligned meshes (MFAM). The generation of this type of mesh presents some challenges; several meshing strategies are implemented and analyzed in order to provide insight into achieving acceptable mesh regularity. Second, gradient reconstruction methods for magnetically aligned meshes are addressed and numerical errors are compared for the structured and magnetically aligned meshes. It is concluded that the latter provides a more correct and straightforward approach to solving problems where anisotropy is present, especially if the degree of anisotropy is high or difficult to quantify. The conclusions of the study may be extrapolated to the study of anisotropic flows other than conducting fluids.
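    As a rough illustration of the effect the manuscript quantifies, the sketch below (all parameters arbitrary) advances the classical anisotropic diffusion equation on a regular structured grid, with the preferential direction rotated 45 degrees off the grid axes and the perpendicular diffusivity set to zero; any perpendicular spreading of the pulse is therefore purely numerical diffusion from the non-aligned mesh.

    ```python
    import numpy as np

    N, dx, dt, steps = 128, 1.0, 0.05, 400
    D_par, D_perp, th = 1.0, 0.0, np.pi / 4        # diffusivities and field angle (arbitrary)
    c, s = np.cos(th), np.sin(th)
    Dxx = D_par * c * c + D_perp * s * s           # diffusion tensor rotated to grid axes
    Dyy = D_par * s * s + D_perp * c * c
    Dxy = (D_par - D_perp) * c * s

    y, x = np.mgrid[0:N, 0:N] * dx
    T = np.exp(-((x - N / 2) ** 2 + (y - N / 2) ** 2) / 20.0)   # initial pulse, variance 10

    def step(T):
        # explicit central differences for div(D grad T) with a constant tensor
        Txx = (np.roll(T, -1, 1) - 2 * T + np.roll(T, 1, 1)) / dx**2
        Tyy = (np.roll(T, -1, 0) - 2 * T + np.roll(T, 1, 0)) / dx**2
        Txy = (np.roll(np.roll(T, -1, 0), -1, 1) - np.roll(np.roll(T, -1, 0), 1, 1)
               - np.roll(np.roll(T, 1, 0), -1, 1) + np.roll(np.roll(T, 1, 0), 1, 1)) / (4 * dx**2)
        return T + dt * (Dxx * Txx + Dyy * Tyy + 2 * Dxy * Txy)

    for _ in range(steps):
        T = step(T)

    # variance perpendicular to the field: analytically constant (D_perp = 0),
    # so any growth above the initial 10.0 quantifies numerical diffusion
    p = -(x - N / 2) * s + (y - N / 2) * c
    print(f"perpendicular variance: {(T * p**2).sum() / T.sum():.2f} (initial 10.00)")
    ```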

  4. Phased mission analysis of maintained systems

    International Nuclear Information System (INIS)

    Terpstra, K.

    1984-09-01

    The present study is devoted to system reliability and is mainly directed at the quantitative evaluation of accident sequences. Event-tree methodology and fault-tree analysis are applied as the basic techniques. The study introduces a new methodology for calculating the probability of occurrence of an accident sequence, one that correctly takes into account shared-equipment dependencies between the different systems present in an accident sequence. Since large and/or complex systems may contain a large number of minimal cut sets (sometimes millions), it is generally not possible to obtain an exact analytical solution. Therefore, upper and lower bounds for the probability of occurrence of an accident sequence are presented. Calculation results show that this probability is underestimated if system dependencies are not fully taken into account. The new methodology also offers the possibility of gaining insight, based on quantitative calculations, into the degree of dependency between systems. To make the methodology manageable for complex systems, it is implemented in the reliability computer program PHAMISS, written in FORTRAN-IV for the CDC-Cyber 175. PHAMISS is user-friendly and has proven to be a fast and efficient program. (Auth.)
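    PHAMISS itself is not reproduced here, but the upper and lower bounds it reports for an accident-sequence probability are of the standard Bonferroni type for a union of minimal cut sets. The sketch below, with invented cut sets and basic-event probabilities and independence assumed, evaluates the first two bounds.

    ```python
    from itertools import combinations
    from math import prod

    # Invented minimal cut sets over independent basic events (not from PHAMISS).
    P = {"a": 1e-2, "b": 2e-2, "c": 5e-3, "d": 1e-2}
    cut_sets = [{"a", "b"}, {"b", "c"}, {"c", "d"}]

    def p_and(events):
        # probability that every basic event in 'events' occurs (independence assumed)
        return prod(P[e] for e in events)

    s1 = sum(p_and(cs) for cs in cut_sets)                           # first Bonferroni sum
    s2 = sum(p_and(ci | cj) for ci, cj in combinations(cut_sets, 2)) # pairwise intersections

    upper = min(1.0, s1)      # rare-event upper bound
    lower = s1 - s2           # second-order Bonferroni lower bound
    print(f"{lower:.3e} <= P(sequence) <= {upper:.3e}")
    ```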

  5. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE, which was organized around two study cases: (1) analysis of routine functional test and maintenance (T and M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise.

  6. Code assessment and modelling for Design Basis Accident Analysis of the European sodium fast reactor design. Part I: System description, modelling and benchmarking

    International Nuclear Information System (INIS)

    Lázaro, A.; Ammirabile, L.; Bandini, G.; Darmet, G.; Massara, S.; Dufour, Ph.; Tosello, A.; Gallego, E.; Jimenez, G.; Mikityuk, K.; Schikorr, M.; Bubelis, E.; Ponomarev, A.; Kruessmann, R.; Stempniewicz, M.

    2014-01-01

    Highlights: • Ten system-code models of the ESFR were developed in the frame of the CP-ESFR project. • Eight different thermal-hydraulic system codes were adapted to sodium fast reactor technology. • A benchmarking exercise was set up to check the consistency of the calculations. • The upgraded system codes are able to simulate the reactivity feedback and key safety parameters. -- Abstract: The new reactor concepts proposed in the Generation IV International Forum (GIF) are conceived to improve the use of natural resources, reduce the amount of high-level radioactive waste and excel in reliability and safe operation. Among these novel designs, sodium fast reactors (SFRs) stand out due to their technological feasibility, as demonstrated in several countries during the last decades. As part of the EURATOM contribution to GIF, the CP-ESFR is a collaborative project whose objectives include extensive analysis of safety issues in renewed SFR demonstrator designs. The verification, by code-to-code comparison, of computational tools able to simulate plant behaviour under postulated accidental conditions was identified as a key point for ensuring reactor safety. In this line, several organizations employed coupled neutronic and thermal-hydraulic system codes able to simulate the complex and specific phenomena, involving multi-physics studies, of this particular fast reactor technology. The introduction of this paper discusses the framework of the study; the second section describes the envisaged plant design and the commonly agreed modelling guidelines. The third section presents a comparative analysis of the calculations performed by each organisation applying their models and codes to a commonly agreed transient, with the objective of harmonizing the models as well as validating the implementation of all relevant physical phenomena in the different system codes.

  7. Code assessment and modelling for Design Basis Accident Analysis of the European sodium fast reactor design. Part I: System description, modelling and benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Lázaro, A., E-mail: aurelio.lazaro-chueca@ec.europa.eu [JRC-IET European Commission—Westerduinweg 3, PO Box-2, 1755 ZG Petten (Netherlands); UPV—Universidad Politecnica de Valencia, Cami de vera s/n-46002, Valencia (Spain); Ammirabile, L. [JRC-IET European Commission—Westerduinweg 3, PO Box-2, 1755 ZG Petten (Netherlands); Bandini, G. [ENEA, Via Martiri di Monte Sole 4, 40129 Bologna (Italy); Darmet, G.; Massara, S. [EDF, 1 avenue du Général de Gaulle, 92141 Clamart (France); Dufour, Ph.; Tosello, A. [CEA, St Paul lez Durance, 13108 Cadarache (France); Gallego, E.; Jimenez, G. [UPM, José Gutiérrez Abascal, 2-28006 Madrid (Spain); Mikityuk, K. [PSI—Paul Scherrer Institut, 5232 Villigen Switzerland (Switzerland); Schikorr, M.; Bubelis, E.; Ponomarev, A.; Kruessmann, R. [KIT—Institute for Neutron Physics and Reactor Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen Germany (Germany); Stempniewicz, M. [NRG, Utrechtseweg 310, PO Box 9034 6800 ES, Arnhem (Netherlands)

    2014-01-15

    Highlights: • Ten system-code models of the ESFR were developed in the frame of the CP-ESFR project. • Eight different thermal-hydraulic system codes were adapted to sodium fast reactor technology. • A benchmarking exercise was set up to check the consistency of the calculations. • The upgraded system codes are able to simulate the reactivity feedback and key safety parameters. -- Abstract: The new reactor concepts proposed in the Generation IV International Forum (GIF) are conceived to improve the use of natural resources, reduce the amount of high-level radioactive waste and excel in reliability and safe operation. Among these novel designs, sodium fast reactors (SFRs) stand out due to their technological feasibility, as demonstrated in several countries during the last decades. As part of the EURATOM contribution to GIF, the CP-ESFR is a collaborative project whose objectives include extensive analysis of safety issues in renewed SFR demonstrator designs. The verification, by code-to-code comparison, of computational tools able to simulate plant behaviour under postulated accidental conditions was identified as a key point for ensuring reactor safety. In this line, several organizations employed coupled neutronic and thermal-hydraulic system codes able to simulate the complex and specific phenomena, involving multi-physics studies, of this particular fast reactor technology. The introduction of this paper discusses the framework of the study; the second section describes the envisaged plant design and the commonly agreed modelling guidelines. The third section presents a comparative analysis of the calculations performed by each organisation applying their models and codes to a commonly agreed transient, with the objective of harmonizing the models as well as validating the implementation of all relevant physical phenomena in the different system codes.

  8. Three-dimensional space-time kinetic analysis with CORETRAN and RETRAN-3D of the NEACRP PWR rod ejection benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Ferroukhi, H.; Coddington, P

    2001-03-01

    One of the activities within the STARS project, in the Laboratory for Reactor Physics and System Behaviour, is the development of a coupling methodology between the three-dimensional, space-time kinetics codes CORETRAN and RETRAN-3D in order to perform core and plant transient analyses of the Swiss LWRs. The CORETRAN code is a 3-D full-core simulator intended for core-related analyses, while RETRAN-3D is the three-dimensional kinetics version of the plant system code RETRAN and can therefore be used for best-estimate analyses of a wide range of transients in both PWRs and BWRs. Because the neutronics solver in both codes is based on the same kinetics model, one important advantage is that the codes can be coupled so that the initial conditions for a RETRAN-3D plant analysis are generated by a detailed-core, steady-state calculation using CORETRAN. As a first step towards using CORETRAN and RETRAN-3D for kinetics applications, the NEACRP PWR rod ejection benchmark has been analyzed with both codes and is presented in this paper. The first objective is to verify the consistency between the static and kinetic solutions of the two codes, and so gain confidence in the coupling methodology. The second objective is to assess the CORETRAN and RETRAN-3D solutions for a well-defined RIA transient by comparison with previously published results. In parallel, several sensitivity studies have been performed in an attempt to identify models and calculational options important for a correct analysis of an RIA event in an LWR using these two codes. (author)

  9. Benchmark calculations of power distribution within assemblies

    International Nuclear Information System (INIS)

    Cavarec, C.; Perron, J.F.; Verwaerde, D.; West, J.P.

    1994-09-01

    The main objective of this benchmark is to compare different techniques for fine flux prediction based upon coarse-mesh diffusion or transport calculations. We proposed five 'core' configurations including different assembly types (17 x 17 pins; 'uranium', 'absorber' or 'MOX' assemblies) with different boundary conditions. The specification required results in terms of reactivity, pin-by-pin fluxes and production rate distributions. The proposal for these benchmark calculations was made by J.C. Lefebvre, J. Mondot and J.P. West, and the specification (with nuclear data, assembly types, core configurations for 2D geometry and results presentation) was distributed to correspondents of the OECD Nuclear Energy Agency. Eleven countries and 19 companies responded to the proposed exercise. Both heterogeneous and homogeneous calculations were made, and various methods were used to produce the results: diffusion (finite differences, nodal, ...) and transport (Pij, Sn, Monte Carlo). This report presents an analysis and intercomparison of all the results received.

  10. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks providing highly accurate standards against which to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP), where a beam source lies beneath the surface of a half-space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis for determining cloud absorption and scattering properties. (authors)

  11. Benchmarking HIV health care

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Reekie, Joanne; Mocroft, Amanda

    2012-01-01

    ABSTRACT: BACKGROUND: State-of-the-art care involving the utilisation of multiple health care interventions is the basis for an optimal long-term clinical prognosis for HIV-patients. We evaluated health care for HIV-patients based on four key indicators. METHODS: Four indicators of health care were assessed […] The results document pronounced regional differences in adherence to guidelines and can help to identify gaps and direct targeted interventions. It may serve as a tool for assessment and benchmarking of the clinical management of HIV-patients in any setting worldwide.

  12. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to identify…

  13. The COST Benchmark

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius

    2006-01-01

    An infrastructure is emerging that enables the positioning of populations of on-line, mobile service users. In step with this, research in the management of moving objects has attracted substantial attention. In particular, quite a few proposals now exist for the indexing of moving objects. […] The benchmark takes into account that the available positions of the moving objects are inaccurate, an aspect largely ignored in previous indexing research. The concepts of data and query enlargement are introduced for addressing inaccuracy. As proof of concepts of the benchmark, the paper covers the application…

  14. The benchmark testing of 9Be of CENDL-3

    International Nuclear Information System (INIS)

    Liu Ping

    2002-01-01

    CENDL-3, the latest version of the China Evaluated Nuclear Data Library, has been completed. The data for 9Be were updated and recently distributed for benchmark analysis. The calculated results are presented and compared with the experimental data and with results based on other evaluated nuclear data libraries. The comparison shows that CENDL-3 performs better than the other libraries for most benchmarks.

  15. Benchmarking optimization solvers for structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    […] solvers in IPOPT and FMINCON, and the sequential quadratic programming method in SNOPT, are benchmarked on the library using performance profiles. Whenever possible the methods are applied to both the nested and the Simultaneous Analysis and Design (SAND) formulations of the problem. The performance…

  16. Benchmarking 2011: Trends in Education Philanthropy

    Science.gov (United States)

    Grantmakers for Education, 2011

    2011-01-01

    The analysis in "Benchmarking 2011" is based on data from an unduplicated sample of 184 education grantmaking organizations--approximately two-thirds of Grantmakers for Education's (GFE's) network of grantmakers--who responded to an online survey consisting of fixed-choice and open-ended questions. Because a different subset of funders elects to…

  17. Spectrum integrated (n,He) cross section comparison and least squares analysis for 6Li and 10B in benchmark fields

    International Nuclear Information System (INIS)

    Schenter, R.E.; Oliver, B.M.; Farrar, H. IV

    1987-01-01

    Spectrum-integrated cross sections for 6Li and 10B from five benchmark fast reactor neutron fields are compared with calculated values obtained using the ENDF/B-V cross-section files. The benchmark fields include the Coupled Fast Reactivity Measurements Facility (CFRMF) at the Idaho National Engineering Laboratory, the 10% enriched U-235 critical assembly (BIG-10) at Los Alamos National Laboratory, the Sigma Sigma and Fission Cavity fields of the BR-1 reactor at CEN/SCK, and the Intermediate-Energy Standard Neutron Field (ISNF) at the National Bureau of Standards. Results from least-squares analyses using the FERRET computer code to obtain adjusted cross-section values and their uncertainties are presented. Input to these calculations includes the above five benchmark data sets. These analyses indicate a need for revision of the ENDF/B-V files for the 10B cross section at energies above 50 keV.
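    The FERRET adjustment is a generalized least-squares procedure, and the underlying algebra can be sketched compactly. All numbers below are synthetic (they are not the ENDF/B-V data or the benchmark measurements), and the two-group sensitivity matrix is invented purely to show the update.

    ```python
    import numpy as np

    # Generic GLS adjustment: prior cross sections x0 with covariance M,
    # integral measurements y with covariance V, sensitivities S (y ~ S x).
    x0 = np.array([1.00, 0.50])              # prior group cross sections (arbitrary units)
    M  = np.diag([0.05**2, 0.08**2])         # prior covariance
    S  = np.array([[0.7, 0.3],
                   [0.2, 0.8]])              # spectrum-weighted sensitivities (invented)
    y  = np.array([0.95, 0.62])              # benchmark-field measurements (synthetic)
    V  = np.diag([0.02**2, 0.03**2])         # measurement covariance

    K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)   # gain matrix
    x_adj = x0 + K @ (y - S @ x0)                  # adjusted cross sections
    M_adj = (np.eye(len(x0)) - K @ S) @ M          # reduced covariance
    print(x_adj, np.sqrt(np.diag(M_adj)))
    ```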

  18. X-ray qualitative analysis of low concentration phases

    International Nuclear Information System (INIS)

    Brusilovskii, B.A.; Khaet, L.G.

    1993-01-01

    The identification of low-concentration phases (LCPs) situated at the detection limit of x-ray analysis has not been examined sufficiently. The authors have developed a method of qualitative x-ray analysis of LCPs of the first order (carbides and certain intermetallic compounds) in a multiphase system. X-ray examination of the LCPs consists of analysis of a priori information and formation of the information data on the basis of the profile of the diffraction line, with subsequent processing using the F-criterion (Fisher's criterion). If the phase is not detected, consecutive analysis with build-up of information is pursued. Results are presented for a successive qualitative phase analysis of the quenched layer of a roll at a depth of 10 mm from the surface. Carbide phases were identified in the quenched layer of the roll to a depth of 15 mm and in the transition zone at 15-35 mm. The proposed method was justified physically and statistically, recommendations for practice were given, and the method was tested; it is regarded as promising for detecting weak lines in qualitative diffractometric analysis. 12 refs., 3 tabs

  19. Benchmarking multimedia performance

    Science.gov (United States)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmark Committee as part of the Standard Performance Evaluation Corporation (SPEC) to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured, and the system is classified accordingly. At the next step the performance of the system is measured. Many multimedia applications, such as DVD playback, need to run at a specific rate; in this case the measurement of the excess processing power makes all the difference. All this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem are presented and analyzed.

  20. Core Benchmarks Descriptions

    International Nuclear Information System (INIS)

    Pavlovichev, A.M.

    2001-01-01

    Current regulations require that the design of new fuel cycles for nuclear power installations be supported by a calculational justification performed with certified computer codes. This guarantees that the calculational results will lie within the limits of the declared uncertainties indicated in the certificate issued by Gosatomnadzor of the Russian Federation (GAN) for the corresponding computer code. A formal justification of the declared uncertainties is the comparison of calculational results obtained with a commercial code against the results of experiments, or against calculational tests computed, with a defined uncertainty, by certified precision codes such as MCU. The present level of international cooperation has enlarged the bank of experimental and calculational benchmarks acceptable for the certification of commercial codes used for the design of fuel loadings with MOX fuel. In particular, work is practically finished on forming the list of calculational benchmarks for certification of the TVS-M code as applied to MOX fuel assembly calculations. The results of these activities are presented.

  1. A benchmarking study

    Directory of Open Access Journals (Sweden)

    H. Groessing

    2015-02-01

    A benchmark study for permeability measurement is presented. Past studies by other research groups, which focused on the reproducibility of 1D permeability measurements, showed high standard deviations of the obtained permeability values (25%), even though a defined test rig with required specifications was used. Within this study, the reproducibility of capacitive in-plane permeability testing system measurements was benchmarked by comparing results from two research sites using this technology. The reproducibility was compared using a glass fibre woven textile and a carbon fibre non-crimp fabric (NCF). These two material types were chosen because of the different electrical properties of glass and carbon with respect to the dielectric capacitive sensors of the permeability measurement systems. In order to determine the unsaturated permeability characteristics as a function of fibre volume content, the measurements were executed at three different fibre volume contents with five repetitions each. It was found that the stability and reproducibility of the presented in-plane permeability measurement system is very good in the case of the glass fibre woven textiles. This holds for the comparison of the repeated measurements as well as for the comparison between the two different permeameters. These positive results were confirmed by comparison with permeability values for the same textile obtained with an older-generation permeameter applying the same measurement technology. It was also shown that a correct determination of the grammage and the material density is crucial for a correct correlation of measured permeability values and fibre volume contents.

  2. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

    The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
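    The 25 XMarq queries themselves are not listed in the abstract, but the flavour of a basic-operations benchmark is easy to sketch with the standard-library sqlite3 module. The schema, row count and queries below are invented stand-ins, not the actual XMarq workload.

    ```python
    import sqlite3, time

    # Toy harness timing a scan, an aggregation, an index lookup and a join.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders(id INTEGER PRIMARY KEY, cust INTEGER, total REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(i, i % 1000, float(i % 97)) for i in range(200_000)])
    con.execute("CREATE INDEX idx_cust ON orders(cust)")

    QUERIES = {
        "scan":        "SELECT COUNT(*) FROM orders WHERE total > 50",
        "aggregation": "SELECT cust, SUM(total) FROM orders GROUP BY cust",
        "index":       "SELECT * FROM orders WHERE cust = 42",
        "join":        "SELECT COUNT(*) FROM orders a JOIN orders b "
                       "ON a.cust = b.cust AND b.id = 7",
    }
    for name, sql in QUERIES.items():
        t0 = time.perf_counter()
        con.execute(sql).fetchall()
        print(f"{name:12s} {time.perf_counter() - t0:.4f} s")
    ```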

  3. Co-ordinated research programme on benchmark study for the seismic analysis and testing of WWER-type nuclear power plants. V. 4G. Paks NPP: Analysis and testing. Working material

    International Nuclear Information System (INIS)

    1999-01-01

    In August 1991, following the SMiRT-11 Conference in Tokyo, a Technical Committee Meeting was held on 'Seismic safety issues relating to existing NPPs'. The proceedings of this TCM were subsequently compiled in an IAEA Working Material. One of the main recommendations of this TCM called for the harmonization of criteria and methods used in Member States in the seismic reassessment and upgrading of existing NPPs. Twenty-four institutions from thirteen countries participated in the CRP named 'Benchmark study for the seismic analysis and testing of WWER type NPPs'. Two types of WWER reactors (WWER-1000 and WWER-440/213) were selected for benchmarking, represented by Kozloduy NPP Units 5/6 and Paks NPP, respectively, as prototypes. Consistent with the recommendations of the TCM and the working paper prepared by the subsequent Consultants' Meeting, the focal activity of the CRP was the benchmarking exercises. A similar methodology was followed for both Paks NPP and Kozloduy NPP Unit 5. First, the NPP (mainly the reactor building) was tested using blast loading generated by a series of explosions from buried TNT charges. Records from this test were obtained at several free-field locations (both downhole and surface), on the foundation mat, at various elevations of the structures, and on some tanks and the stack. The benchmark participants were then provided with structural drawings, soil data and the free-field record of the blast experiment. Their task was to make a blind prediction of the response at preselected locations, and their analytical results were compared with the results from the test. Although the benchmarking exercises constituted the focus of the CRP, many other interesting problems related to the seismic safety of WWER type NPPs were addressed by the participants. These involved generic studies, i.e. codes and standards used in original WWER designs and their comparison with current international practice; seismic analysis…

  4. BN-600 Phase III benchmark calculations

    International Nuclear Information System (INIS)

    Hill, R.N.; Grimm, K.N.

    2002-01-01

    Calculations for a hexagonal-Z model of the BN-600 reactor with a partial mixed-oxide loading, based on a joint IPPE/OKBM loading configuration containing three uranium enrichment zones and one plutonium enrichment zone in the core, have been performed at ANL. Control-rod worths and reactivity feedback coefficients were calculated using both homogeneous and heterogeneous models. These values were calculated with first-order perturbation theory methods (triangle-Z geometry), nodal eigenvalue differences (hexagonal-Z geometry), or Monte Carlo eigenvalue differences. Both spatially dependent and region-integrated values are shown.

  5. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in future efforts towards sustainable transport, drawing on contributions to the discussions within the EU-sponsored BEST Thematic Network (Benchmarking European Sustainable Transport), which ran from 2000 to 2003.

  6. Benchmarking in Czech Higher Education

    OpenAIRE

    Plaček Michal; Ochrana František; Půček Milan

    2015-01-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used at this level today, but most actors show some interest in its introduction. The expressed need for it, and the importance of benchmarking as a very suitable performance-management tool in less developed countries, are the impetus for the second part of our article. Based on…

  7. Power reactor pressure vessel benchmarks

    International Nuclear Information System (INIS)

    Rahn, F.J.

    1978-01-01

    A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)

  8. HANFORD DOUBLE-SHELL TANK THERMAL and SEISMIC PROJECT-ANSYS BENCHMARK ANALYSIS OF SEISMICALLY INDUCED FLUID-STRUCTURE INTERACTION IN A HANFORD DOUBLE-SHELL PRIMARY TANK

    International Nuclear Information System (INIS)

    MACKEY, T.C.

    2006-01-01

    M and D Professional Services, Inc. (M and D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled 'Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses'. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs; the thermal and operating loads analysis is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS, using a model that includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis must address the fluid-structure interaction behaviour and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but its capabilities and limitations for fluid-structure interaction are less well understood. The purpose of this study is to demonstrate the capabilities and investigate the limitations of ANSYS for performing a fluid-structure interaction analysis of the primary tank and contained waste. To this end, the ANSYS solutions are benchmarked against theoretical solutions appearing in BNL 1995, when such theoretical solutions exist; when they do not, comparisons are made with theoretical solutions of similar problems and with results from Dytran simulations. The capabilities and limitations of the finite element code Dytran for the same analysis were explored in a parallel investigation (Abatt 2006). In conjunction with the results of the global ANSYS…

  9. Cryptographic analysis on the key space of optical phase encryption algorithm based on the design of discrete random phase mask

    Science.gov (United States)

    Lin, Chao; Shen, Xueju; Li, Zengyan

    2013-07-01

    The key space of a phase encryption algorithm using a discrete random phase mask is investigated by numerical simulation in this paper. A random phase mask with finite, discrete phase levels is the core component in most practical optical encryption architectures. The key space analysis is based on the design criteria of the discrete random phase mask. The roles of the random amplitude mask and the random phase mask in an optical encryption system are identified from the perspective of confusion and diffusion. The properties of the discrete random phase mask in a practical double random phase encoding scheme working in both amplitude-encoding (AE) and phase-encoding (PE) modes are comparatively analyzed. The key space of the random phase encryption algorithm is evaluated considering both the encryption quality and the resistance to brute-force attack. A method for enlarging the key space of the phase encryption algorithm is also proposed to enhance the security of optical phase encryption techniques.
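    The paper's key-space observation is straightforward to reproduce numerically: a mask of N×N pixels with L discrete phase levels has L^(N·N) distinct keys. The sketch below (image content, mask size and level count all assumed) builds such masks and applies them in a textbook double random phase encoding in amplitude-encoding mode.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, L = 64, 16                        # mask size and number of phase levels (assumed)

    def discrete_phase_mask(n, levels):
        # phases restricted to the discrete set {0, 2*pi/levels, ..., 2*pi*(levels-1)/levels}
        return np.exp(2j * np.pi * rng.integers(0, levels, (n, n)) / levels)

    img = rng.random((N, N))             # stand-in for the plaintext image
    m1, m2 = discrete_phase_mask(N, L), discrete_phase_mask(N, L)

    # double random phase encoding, amplitude-encoding mode:
    # one mask in the input plane, the second in the Fourier plane
    cipher = np.fft.ifft2(np.fft.fft2(img * m1) * m2)

    # brute-force key space of a single mask
    print(f"key space ~ {L}^{N*N} = 2^{N*N*np.log2(L):.0f}")
    ```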

  10. OECD/NRC Benchmark Based on NUPEC PWR Sub-channel and Bundle Test (PSBT). Volume I: Experimental Database and Final Problem Specifications

    International Nuclear Information System (INIS)

    Rubin, A.; Schoedel, A.; Avramova, M.; Utsuno, H.; Bajorek, S.; Velazquez-Lozada, A.

    2012-01-01

    The need to refine models for best-estimate calculations, based on good-quality experimental data, has been expressed in many recent meetings in the field of nuclear applications. The needs arising in this respect should not be limited to the currently available macroscopic methods but should be extended to next-generation analysis techniques that focus on more microscopic processes. One of the most valuable databases identified for thermal-hydraulics modelling was developed by the Nuclear Power Engineering Corporation (NUPEC), Japan; it includes sub-channel void fraction and departure from nucleate boiling (DNB) measurements in a representative Pressurised Water Reactor (PWR) fuel assembly. Part of this database has been made available for the international benchmark activity entitled 'NUPEC PWR Sub-channel and Bundle Tests (PSBT) benchmark'. This international project has been officially approved by the Japanese Ministry of Economy, Trade, and Industry (METI) and the US Nuclear Regulatory Commission (NRC), and endorsed by the OECD/NEA. The benchmark team has been organised based on the collaboration between Japan and the USA, and a large number of international experts have agreed to participate in this programme. The fine-mesh, high-quality sub-channel void fraction and departure from nucleate boiling data encourage advancement in understanding and modelling complex flow behaviour in real bundles. Considering that the present theoretical approach is relatively immature, the benchmark specification is designed so that it will systematically assess and compare the participants' analytical models on the prediction of detailed void distributions and DNB. The development of truly mechanistic models for DNB prediction is currently underway. The benchmark problem includes both macroscopic and microscopic measurement data; in this context, the sub-channel grade void fraction data are regarded as the macroscopic data and the digitised computer graphic images are the…

  11. ZZ ECN-BUBEBO, ECN-Petten Burnup Benchmark Book, Inventories, Afterheat

    International Nuclear Information System (INIS)

    Kloosterman, Jan Leen

    1999-01-01

    Description of program or function: Contains experimental benchmarks which can be used for the validation of burnup code systems and accompanying data libraries. Although the benchmarks presented here are thoroughly described in the literature, it is in many cases not straightforward to retrieve unambiguously the correct input data and corresponding results from the benchmark descriptions. Furthermore, results which can easily be measured are sometimes difficult to calculate because of the conversions to be made. Therefore, emphasis has been put on clarifying the input of the benchmarks and on presenting the benchmark results in such a way that they can easily be calculated and compared. For more thorough descriptions of the benchmarks themselves, the literature referred to here should be consulted. This benchmark book is divided into 11 chapters/files containing the following in text and tabular form: chapter 1: Introduction; chapter 2: Burnup Credit Criticality Benchmark Phase 1-B; chapter 3: Yankee-Rowe Core V Fuel Inventory Study; chapter 4: H.B. Robinson Unit 2 Fuel Inventory Study; chapter 5: Turkey Point Unit 3 Fuel Inventory Study; chapter 6: Turkey Point Unit 3 Afterheat Power Study; chapter 7: Dickens Benchmark on Fission Product Energy Release of U-235; chapter 8: Dickens Benchmark on Fission Product Energy Release of Pu-239; chapter 9: Yarnell Benchmark on Decay Heat Measurements of U-233; chapter 10: Yarnell Benchmark on Decay Heat Measurements of U-235; chapter 11: Yarnell Benchmark on Decay Heat Measurements of Pu-239.

  12. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

    The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to 'internal' demands, the roles of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of the water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and including 'external' gardening demands is investigated. This covers the impacts (in isolation and in combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are drawn throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn on the robustness of the proposed system.
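    The band-rating idea translates directly into code. The sketch below is a hypothetical rating function: the litres-per-person-per-day thresholds are illustrative only, loosely inspired by Code for Sustainable Homes targets, and are not taken from the paper.

    ```python
    # Hypothetical band thresholds (L/person/day); invented for illustration.
    BANDS = [(80, "A"), (95, "B"), (105, "C"), (120, "D"), (150, "E")]

    def water_band(litres_per_person_per_day: float) -> str:
        for limit, band in BANDS:
            if litres_per_person_per_day <= limit:
                return band
        return "F"

    # e.g. a 3-person household using 310 L/day:
    print(water_band(310 / 3))   # -> "C"
    ```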

  13. Results of a benchmark study for the seismic analysis and testing of WWER type NPPs: Overview and general comparison for Paks NPP

    International Nuclear Information System (INIS)

    Guerpinar, A.; Zola, M.

    2001-01-01

    Within the framework of the IAEA coordinated 'Benchmark study for the seismic analysis and testing of WWER-type NPPs', in-situ dynamic structural testing activities have been performed at the Paks Nuclear Power Plant in Hungary. The specific objective of the investigation was to obtain experimental data on the actual dynamic structural behaviour of the plant's major structures and equipment under normal operating conditions, to enable a valid seismic safety review. This paper reports on the comparison of the results obtained from the experimental activities performed by ISMES with those from the analytical studies performed for the Coordinated Research Programme (CRP) by Siemens (Germany), EQE (Bulgaria), Central Laboratory (Bulgaria), M. David Consulting (Czech Republic) and IVO (Finland). The paper gives a synthetic description of the experiments and presents some results, regarding in particular the free-field excitations produced during the earthquake-simulation experiments and an experiment on the global dynamic soil-structure interaction effects at the base of the reactor containment structure. The full-scale dynamic structural testing activities were performed in December 1994. The Paks NPP site was subjected to low-level earthquake-like ground shaking, through appropriately devised underground explosions, and the dynamic response of the important structures of the plant's first reactor unit was measured and digitally recorded, with the whole nuclear power plant under normal operating conditions. In-situ free-field response was measured concurrently and, moreover, site-specific geophysical and seismological data were recorded simultaneously.

  14. Artefacts in geometric phase analysis of compound materials.

    Science.gov (United States)

    Peters, Jonathan J P; Beanland, Richard; Alexe, Marin; Cockburn, John W; Revin, Dmitry G; Zhang, Shiyong Y; Sanchez, Ana M

    2015-10-01

    The geometric phase analysis (GPA) algorithm is known as a robust and straightforward technique that can be used to measure lattice strains in high resolution transmission electron microscope (TEM) images. It is also attractive for analysis of aberration-corrected scanning TEM (ac-STEM) images that resolve every atom column, since it uses Fourier transforms and does not require real-space peak detection and assignment to appropriate sublattices. Here it is demonstrated that, in ac-STEM images of compound materials with compositionally distinct atom columns, an additional geometric phase is present in the Fourier transform. If the structure changes from one area to another in the image (e.g. across an interface), the change in this additional phase will appear as a strain in conventional GPA, even if there is no lattice strain. Strategies to avoid this pitfall are outlined.
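    The core of the GPA algorithm is compact enough to sketch: mask one reflection in the Fourier transform, invert, and read the phase relative to the reference lattice wave. The synthetic example below (lattice period and displacement step fabricated) recovers a pure phase step; in a compound material, a compositional change can introduce just such an extra phase and masquerade as strain.

    ```python
    import numpy as np

    N, a = 256, 8                                  # image size, lattice period (pixels)
    x = np.arange(N)
    u = 0.5 * (x >= N // 2)                        # rigid half-pixel shift in the right half
    img = np.cos(2 * np.pi * (x + u) / a)[None, :].repeat(N, axis=0)

    F = np.fft.fft2(img)
    ky, kx = np.meshgrid(np.fft.fftfreq(N), np.fft.fftfreq(N), indexing="ij")
    aperture = np.hypot(kx - 1 / a, ky) < 0.02     # small mask around the g = 1/a spot
    Hg = np.fft.ifft2(F * aperture)

    # geometric phase relative to the reference wave exp(2*pi*i*x/a);
    # u(x) = phase * a / (2*pi), and a constant offset (zero gradient) means no strain
    phase = np.angle(Hg * np.exp(-2j * np.pi * x[None, :] / a))
    print(phase[0, 64], phase[0, 192])             # ~0 and ~2*pi*0.5/8 ≈ 0.39
    ```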

  15. MOx Depletion Calculation Benchmark

    International Nuclear Information System (INIS)

    San Felice, Laurence; Eschbach, Romain; Dewi Syarifah, Ratna; Maryam, Seif-Eddine; Hesketh, Kevin

    2016-01-01

    Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of Reactor Systems (WPRS) has been established to study the reactor physics, fuel performance, radiation transport and shielding, and the uncertainties associated with modelling of these phenomena in present and future nuclear power systems. The WPRS has different expert groups to cover a wide range of scientific issues in these fields. The Expert Group on Reactor Physics and Advanced Nuclear Systems (EGRPANS) was created in 2011 to perform specific tasks associated with reactor physics aspects of present and future nuclear power systems. EGRPANS provides expert advice to the WPRS and the nuclear community on the development needs (data and methods, validation experiments, scenario studies) for different reactor systems and also provides specific technical information regarding: core reactivity characteristics, including fuel depletion effects; core power/flux distributions; core dynamics and reactivity control. In 2013 EGRPANS published a report that investigated fuel depletion effects in a Pressurised Water Reactor (PWR), entitled 'International Comparison of a Depletion Calculation Benchmark on Fuel Cycle Issues', NEA/NSC/DOC(2013), which documented a benchmark exercise for UO2 fuel rods. This report documents a complementary benchmark exercise that focused on PuO2/UO2 mixed oxide (MOX) fuel rods. The results are especially relevant to the back-end of the fuel cycle, including irradiated fuel transport, reprocessing, interim storage and waste repository. Saint-Laurent B1 (SLB1) was the first French reactor to use MOX assemblies. SLB1 is a 900 MWe PWR with 30% MOX fuel loading. The standard MOX assemblies used in the Saint-Laurent B1 reactor include three zones with different plutonium enrichments: high Pu content (5.64%) in the centre zone, medium Pu content (4.42%) in the intermediate zone and low Pu content (2.91%) in the peripheral zone.

  16. Benchmarking Academic Anatomic Pathologists

    Directory of Open Access Journals (Sweden)

    Barbara S. Ducatman MD

    2016-10-01

    The most common benchmarks for faculty productivity are derived from the Medical Group Management Association (MGMA) or Vizient-AAMC Faculty Practice Solutions Center® (FPSC) databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathologists/surgical pathologists, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization's methods, the median results for the anatomic pathologists/surgical pathologists general practice model compared to MGMA and FPSC results for anatomic and/or surgical pathology were quite close. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical 'full-time faculty' (0.60 clinical full-time equivalent and above). The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs-reported median work relative…

  17. Phase Transitions in Planning Problems: Design and Analysis of Parameterized Families of Hard Planning Problems

    Science.gov (United States)

    Hen, Itay; Rieffel, Eleanor G.; Do, Minh; Venturelli, Davide

    2014-01-01

    There are two common ways to evaluate algorithms: performance on benchmark problems derived from real applications, and analysis of performance on parametrized families of problems. The two approaches complement each other, each having its advantages and disadvantages. The planning community has concentrated on the first approach, with few ways of generating parametrized families of hard problems known prior to this work. Our group's main interest is in comparing approaches to solving planning problems using a novel type of computational device - a quantum annealer - to existing state-of-the-art planning algorithms. Because only small-scale quantum annealers are available, we must compare on small problem sizes. Small problems are primarily useful for comparison only if they are instances of parametrized families of problems for which scaling analysis can be done. In this technical report, we discuss our approach to the generation of hard planning problems from classes of well-studied NP-complete problems that map naturally to planning problems, or to aspects of planning problems that many practical planning problems share. These problem classes exhibit a phase transition between easy-to-solve and easy-to-show-unsolvable planning problems, and the parametrized families of hard planning problems lie at the phase transition. The exponential scaling of hardness with problem size is apparent in these families even at very small problem sizes, thus enabling us to characterize even very small problems as hard. The families we developed will prove generally useful to the planning community in analyzing the performance of planning algorithms, providing a complementary approach to existing evaluation methods. We illustrate the hardness of these problems and their scaling with results on four state-of-the-art planners, observing significant differences between these planners on these problem families. Finally, we describe two general, and quite different, mappings of planning…
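    The best-known parametrized family of this kind is random 3-SAT at the solubility phase transition, where the clause-to-variable ratio is near 4.27 and instances are empirically hardest for complete solvers. A generator is sketched below; the mapping of such instances into planning problems, which the report describes, is not shown here.

    ```python
    import random

    def random_3sat(n_vars: int, ratio: float = 4.27, seed: int = 0):
        """Random 3-SAT at the solubility phase transition (ratio ~ 4.27).

        Returns a list of clauses, each a tuple of three non-zero ints,
        negative meaning a negated variable (DIMACS-style).
        """
        rng = random.Random(seed)
        clauses = []
        for _ in range(round(ratio * n_vars)):
            vars_ = rng.sample(range(1, n_vars + 1), 3)
            clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vars_))
        return clauses

    # Hardness scales exponentially with n_vars even at small sizes, which is
    # what lets small instances be characterized as hard.
    print(random_3sat(20)[:3])
    ```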

  18. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
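    As a minimal illustration of the metric-and-benchmark workflow the guide outlines, the sketch below computes a whole-building energy use intensity (EUI) and compares it with benchmark percentiles. The percentile values are invented for illustration and are not taken from the Labs21 database.

    ```python
    # Hypothetical whole-building metric: annual energy use intensity (EUI).
    def eui_kwh_per_m2(annual_kwh: float, floor_area_m2: float) -> float:
        return annual_kwh / floor_area_m2

    # Invented benchmark percentiles (kWh/m2/yr), standing in for Labs21 data.
    BENCHMARK_PERCENTILES = {"25th (good)": 800, "50th (typical)": 1200, "75th": 1800}

    eui = eui_kwh_per_m2(annual_kwh=2.4e6, floor_area_m2=1500)
    for label, value in BENCHMARK_PERCENTILES.items():
        flag = "below" if eui < value else "above"
        print(f"EUI {eui:.0f} kWh/m2/yr is {flag} the {label} benchmark of {value}")
    ```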

  19. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.