Benchmarking computer platforms for lattice QCD applications
International Nuclear Information System (INIS)
Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.
2004-01-01
We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E; Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC
Benchmarking computer platforms for lattice QCD applications
International Nuclear Information System (INIS)
Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.
2003-09-01
We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)
HELIOS calculations for UO2 lattice benchmarks
International Nuclear Information System (INIS)
Mosteller, R.D.
1998-01-01
Calculations for the ANS UO 2 lattice benchmark have been performed with the HELIOS lattice-physics code and six of its cross-section libraries. The results obtained from the different libraries permit conclusions to be drawn regarding the adequacy of the energy group structures and of the ENDF/B-VI evaluation for 238 U. Scandpower A/S, the developer of HELIOS, provided Los Alamos National Laboratory with six different cross section libraries. Three of the libraries were derived directly from Release 3 of ENDF/B-VI (ENDF/B-VI.3) and differ only in the number of groups (34, 89 or 190). The other three libraries are identical to the first three except for a modification to the cross sections for 238 U in the resonance range
Benchmarking lattice physics data and methods for boiling water reactor analysis
International Nuclear Information System (INIS)
Cacciapouti, R.J.; Edenius, M.; Harris, D.R.; Hebert, M.J.; Kapitz, D.M.; Pilat, E.E.; VerPlanck, D.M.
1983-01-01
The objective of the work reported was to verify the adequacy of lattice physics modeling for the analysis of the Vermont Yankee BWR using a multigroup, two-dimensional transport theory code. The BWR lattice physics methods have been benchmarked against reactor physics experiments, higher order calculations, and actual operating data
Review of international solutions to NEACRP benchmark BWR lattice cell problems
International Nuclear Information System (INIS)
Halsall, M.J.
1977-12-01
This paper summarises international solutions to a set of BWR benchmark problems. The problems, posed as an activity sponsored by the Nuclear Energy Agency Committee on Reactor Physics, were as follows: 9-pin supercell with central burnable poison pin, mini-BWR with 4 pin-cells and water gaps and control rod cruciform, full 7 x 7 pin BWR lattice cell with differential U 235 enrichment, and full 8 x 8 pin BWR lattice cell with water-hole, Pu-loading, burnable poison, and homogenised cruciform control rod. Solutions have been contributed by Denmark, Japan, Sweden, Switzerland and the UK. (author)
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
Energy Technology Data Exchange (ETDEWEB)
Gerhard Strydom; Javier Ortensi; Sonat Sen; Hans Hammer
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suits. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 are reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suit. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging simulation set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III
International Nuclear Information System (INIS)
Freudenreich, W.E.; Aaldijk, J.K.
1994-08-01
The Working Party on Plutonium Recycling of the Nuclear Science Committee of the OECD Nuclear Energy Agency has initiated a benchmark study on the calculation of the void reactivity effect in MOX lattices. The results presented here were obtained with the continuous energy, generalized geometry Monte Carlo transport code MCNP. The cross-section libraries used were processed from the JEF-2.2 evaluation taking into account selfshielding in the unresolved resonance ranges (selfshielding in the resolved resonance ranges is treated by MCNP). For an infinite lattice of unit cells a positive void reactivity effect was found only for the MOX fuel with the largest Pu content. For an infinite lattice of macro cells (voidable inner zone with different fuel mixtures surrounded by an outer zone of UO 2 fuel with moderator) a positive void reactivity effect was obtained for the three MOX fuel types considered. The results are not representative for MOX-loaded power reactor lattices, but serve only to intercompare reactor physics codes and libraries. (orig.)
Benchmarking of epithermal methods in the lattice-physics code EPRI-CELL
International Nuclear Information System (INIS)
Williams, M.L.; Wright, R.Q.; Barhen, J.; Rothenstein, W.; Toney, B.
1982-01-01
The epithermal cross section shielding methods used in the lattice physics code EPRI-CELL (E-C) have been extensively studied to determine its major approximations and to examine the sensitivity of computed results to these approximations. The study has resulted in several improvements in the original methodology. These include: treatment of the external moderator source with intermediate resonance (IR) theory, development of a new Dancoff factor expression to account for clad interactions, development of a new method for treating resonance interference, and application of a generalized least squares method to compute best-estimate values for the Bell factor and group-dependent IR parameters. The modified E-C code with its new ENDF/B-V cross section library is tested for several numerical benchmark problems. Integral parameters computed by EC are compared with those obtained with point-cross section Monte Carlo calculations, and E-C fine group cross sections are benchmarked against point-cross section descrete ordinates calculations. It is found that the code modifications improve agreement between E-C and the more sophisticated methods. E-C shows excellent agreement on the integral parameters and usually agrees within a few percent on fine-group, shielded cross sections
International Nuclear Information System (INIS)
Uddin, M.N.; Sarker, M.M.; Khan, M.J.H.; Islam, S.M.A.
2009-01-01
The aim of this paper is to present the validation of evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of TRX and BAPL benchmark lattices of thermal reactors for neutronics analysis of TRIGA Mark-II Research Reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for lattice code WIMS was generated using the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO 2 -1, BAPL-UO 2 -2 and BAPL-UO 2 -3 served as standard benchmarks for testing nuclear data files and have also been selected for this analysis. The integral parameters of the said lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared to the measured values as well as the results of Monte Carlo Code MCNP. It was found that in most cases, the values of integral parameters show a good agreement with the experiment and MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 between two data files have been compared using WIMS library utility code WILLIE and it was found that the group constants are identical with very insignificant difference. Therefore, this analysis reflects the validation of evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking the integral parameters of TRX and BAPL lattices and can also be essential to implement further neutronic analysis of TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.
Thermal and fast reactor benchmark testing of ENDF/B-6.4
International Nuclear Information System (INIS)
Liu Guisheng
1999-01-01
The benchmark testing for B-6.4 was done with the same benchmark experiments and calculating method as for B-6.2. The effective multiplication factors k eff , central reaction rate ratios of fast assemblies and lattice cell reaction rate ratios of thermal lattice cell assemblies were calculated and compared with testing results of B-6.2 and CENDL-2. It is obvious that 238 U data files are most important for the calculations of large fast reactors and lattice thermal reactors. However, 238 U data in the new version of ENDF/B-6 have not been renewed. Only data of 235 U, 27 Al, 14 N and 2 D have been renewed in ENDF/B-6.4. Therefor, it will be shown that the thermal reactor benchmark testing results are remarkably improved and the fast reactor benchmark testing results are not improved
Analysis of a molten salt reactor benchmark
International Nuclear Information System (INIS)
Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.
2013-01-01
This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)
Two-dimensional benchmark calculations for PNL-30 through PNL-35
International Nuclear Information System (INIS)
Mosteller, R.D.
1997-01-01
Interest in critical experiments with lattices of mixed-oxide (MOX) fuel pins has been revived by the possibility that light water reactors will be used for disposition of weapons-grade plutonium. A series of six experiments with MOX lattices, designated PNL-30 through PNL-35, was performed at Pacific Northwest Laboratories in 1975 and 1976, and a set of benchmark specifications for these experiments subsequently was adopted by the Cross Section Evaluation Working Group (CSEWG). Although there appear to be some problems with these experiments, they remain the only CSEWG benchmarks for MOX lattices. The number of fuel pins in these experiments is relatively low, corresponding to fewer than 4 typical pressurized-water-reactor fuel assemblies. Accordingly, they are more appropriate as benchmarks for lattice-physics codes than for reactor-core simulator codes. Unfortunately, the CSEWG specifications retain the full three-dimensional (3D) detail of the experiments, while lattice-physics codes almost universally are limited to two dimensions (2D). This paper proposes an extension of the benchmark specifications to include a 2D model, and it justifies that extension by comparing results from the MCNP Monte Carlo code for the 2D and 3D specifications
Benchmark calculation of nuclear design code for HCLWR
International Nuclear Information System (INIS)
Suzuki, Katsuo; Saji, Etsuro; Gakuhari, Kazuhiko; Akie, Hiroshi; Takano, Hideki; Ishiguro, Yukio.
1986-01-01
In the calculation of the lattice cell for High Conversion Light Water Reactors, big differences of nuclear design parameters appear between the results obtained by various methods and nuclear data libraries. The validity of the calculation can be verified by the critical experiment. The benchmark calculation is also efficient for the estimation of the validity in wide range of lattice parameters and burnup. As we do not have many measured data. The benchmark calculations were done by JAERI and MAPI, using SRAC and WIMS-E respectively. The problem covered the wide range of lattice parameters, i.e., from tight lattice to the current PWR lattice. The comparison was made on the effective multiplication factor, conversion ratio, and reaction rate of each nuclide, including burnup and void effects. The difference of the result is largest at the tightest lattice. But even at that lattice, the difference of the effective multiplication factor is only 1.4 %. The main cause of the difference is the neutron absorption rate U-238 in resonance energy region. The difference of other nuclear design parameters and their cause were also grasped. (author)
On the thermal scattering law data for reactor lattice calculations
International Nuclear Information System (INIS)
Trkov, A.; Mattes, M.
2004-01-01
Thermal scattering law data for hydrogen bound in water, hydrogen bound in zirconium hydride and deuterium bound in heavy water have been re-evaluated. The influence of the thermal scattering law data on critical lattices has been studied with detailed Monte Carlo calculations and a summary of results is presented for a numerical benchmark and for the TRIGA reactor benchmark. Systematics for a large sequence of benchmarks analysed with the WIMS-D lattice code are also presented. (author)
Analysis of benchmark lattices with endf/b-vi, jef-2.2 and jendl-3 data
International Nuclear Information System (INIS)
Saglam, M.
1995-01-01
The NJOY Nuclear Data Processing System has been used to process the ENDF/B-VI , JEF-2.2 and JENDL-3 Nuclear Cross Section Data Bases into multigroup form. A brief description of the data bases is given and the assumptions made in processing the data from evaluated nuclear data file format to multigroup format are presented. The differences and similarities of the Evaluated Nuclear Data Files have been investigated by producing four group cross sections by using the GROUPIE code and calculating thermal, fission spectrum averaged and 2200 m/s cross sections and resonance integrals using the INTER cale. It has been shown that the evaluated data for U238 in JEF and ENDF/B-VI are principally the same while in case of U235 the same is true for JENDL and ENDF/B-VI. The evaluations for U233 and Th232 are different for all three ENDF files. Several utility codes have been written to convert the multigroup library into a WIMS-D4 compatible binary library. The performance and suitability of the generated libraries have been tested with the use of metal tueled TRX lattices, uranium oxide fueled BAPL lattices and Th232-U233 fueled BNL lattices. The use ot a new thermal scattering matrix for Hydrogen from ENDF/B-VI increased keff for 0.5 o/ while the use of ENDF/B-VI U238 decreased it for 2.5 %. Although the original WIMS library performed well for Ihe effective multiplication factor of the lattices there is an improvement for the epithermal to thermal capture rate of U238 while using new data in the TRX and BAPL lattices. The effect of the fission spectrum is investigated for the BNL lattices and it is shown that using U233 fission spectrum instead of the original U235 spectrum gives a keff which agrees better with the experimental value. The results obtained by using new multigroup data are generally acceptable and in the experimental error range. They especially improve the prediction of the reaction rate dependent benchmark parameters
Polarization response of RHIC electron lens lattices
Directory of Open Access Journals (Sweden)
V. H. Ranjbar
2016-10-01
Full Text Available Depolarization response for a system of two orthogonal snakes at irrational tunes is studied in depth using lattice independent spin integration. In particular we consider the effect of overlapping spin resonances in this system, to understand the impact of phase, tune, relative location and threshold strengths of the spin resonances. These results are benchmarked and compared to two dimensional direct tracking results for the RHIC e-lens lattice and the standard lattice. Finally we consider the effect of longitudinal motion via chromatic scans using direct six dimensional lattice tracking.
Polarization response of RHIC electron lens lattices
International Nuclear Information System (INIS)
Ranjbar, V. H.; Méot, F.; Bai, M.; Abell, D. T.; Meiser, D.
2016-01-01
Depolarization response for a system of two orthogonal snakes at irrational tunes is studied in depth using lattice independent spin integration. Particularly, we consider the effect of overlapping spin resonances in this system, to understand the impact of phase, tune, relative location and threshold strengths of the spin resonances. Furthermore, these results are benchmarked and compared to two dimensional direct tracking results for the RHIC e-lens lattice and the standard lattice. We then consider the effect of longitudinal motion via chromatic scans using direct six dimensional lattice tracking.
Analysis of Benchmark 2 results
International Nuclear Information System (INIS)
Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.
1994-01-01
The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performances through our results, especially in the thick target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks allow possible improvements of physical models used in our codes. Thereafter, a scheme of radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab
International Nuclear Information System (INIS)
Khan, M.J.H.; Sarker, M.M.; Islam, S.M.A.
2013-01-01
Highlights: ► To validate the SRAC2006 code system for TRIGA neutronics calculations. ► TRX and BAPL lattices are treated as standard benchmarks for this purpose. ► To compare the calculated results with experiment as well as MCNP values in this study. ► The study demonstrates a good agreement with the experiment and the MCNP results. ► Thus, this analysis reflects the validation study of the SRAC2006 code system. - Abstract: The goal of this study is to present the validation study of the SRAC2006 code system based on evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3 for neutronics analysis of TRIGA Mark-II Research Reactor at AERE, Bangladesh. This study is achieved through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors. In integral measurements, the thermal reactor lattices TRX-1, TRX-2, BAPL-UO 2 -1, BAPL-UO 2 -2 and BAPL-UO 2 -3 are treated as standard benchmarks for validating/testing the SRAC2006 code system as well as nuclear data libraries. The integral parameters of the said lattices are calculated using the collision probability transport code PIJ of the SRAC2006 code system at room temperature 20 °C based on the above libraries. The calculated integral parameters are compared to the measured values as well as the MCNP values based on the Chinese evaluated nuclear data library CENDL-3.0. It was found that in most cases, the values of integral parameters demonstrate a good agreement with the experiment and the MCNP results. In addition, the group constants in SRAC format for TRX and BAPL lattices in fast and thermal energy range respectively are compared between the above libraries and it was found that the group constants are identical with very insignificant difference. Therefore, this analysis reflects the validation study of the SRAC2006 code system based on evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0 and can also be essential to implement further neutronics calculations
Energy Technology Data Exchange (ETDEWEB)
Sharpe, J.; Salaun, F.; Hummel, D.; Moghrabi, A., E-mail: sharpejr@mcmaster.ca [McMaster University, Hamilton, ON (Canada); Nowak, M. [McMaster University, Hamilton, ON (Canada); Institut National Polytechnique de Grenoble, Phelma, Grenoble (France); Pencer, J. [McMaster University, Hamilton, ON (Canada); Canadian Nuclear Laboratories, Chalk River, ON, (Canada); Novog, D.; Buijs, A. [McMaster University, Hamilton, ON (Canada)
2015-07-01
Discrepancies in key lattice physics parameters have been observed between various deterministic (e.g. DRAGON and WIMS-AECL) and stochastic (MCNP, KENO) neutron transport codes in modeling previous versions of the Canadian SCWR lattice cell. Further, inconsistencies in these parameters have also been observed when using different nuclear data libraries. In this work, the predictions of k∞, various reactivity coefficients, and relative ring-averaged pin powers have been re-evaluated using these codes and libraries with the most recent 64-element fuel assembly geometry. A benchmark problem has been defined to quantify the dissimilarities between code results for a number of responses along the fuel channel under prescribed hot full power (HFP), hot zero power (HZP) and cold zero power (CZP) conditions and at several fuel burnups (0, 25 and 50 MW·d·kg{sup -1} [HM]). Results from deterministic (TRITON, DRAGON) and stochastic codes (MCNP6, KENO V.a and KENO-VI) are presented. (author)
International Nuclear Information System (INIS)
Sharpe, J.; Salaun, F.; Hummel, D.; Moghrabi, A.; Nowak, M.; Pencer, J.; Novog, D.; Buijs, A.
2015-01-01
Discrepancies in key lattice physics parameters have been observed between various deterministic (e.g. DRAGON and WIMS-AECL) and stochastic (MCNP, KENO) neutron transport codes in modeling previous versions of the Canadian SCWR lattice cell. Further, inconsistencies in these parameters have also been observed when using different nuclear data libraries. In this work, the predictions of k∞, various reactivity coefficients, and relative ring-averaged pin powers have been re-evaluated using these codes and libraries with the most recent 64-element fuel assembly geometry. A benchmark problem has been defined to quantify the dissimilarities between code results for a number of responses along the fuel channel under prescribed hot full power (HFP), hot zero power (HZP) and cold zero power (CZP) conditions and at several fuel burnups (0, 25 and 50 MW·d·kg"-"1 [HM]). Results from deterministic (TRITON, DRAGON) and stochastic codes (MCNP6, KENO V.a and KENO-VI) are presented. (author)
ZZ WPPR, Pu Recycling Benchmark Results
International Nuclear Information System (INIS)
Lutz, D.; Mattes, M.; Delpech, Marc; Juanola, Marc
2002-01-01
Description of program or function: The NEA NSC Working Party on Physics of Plutonium Recycling has commissioned a series of benchmarks covering: - Plutonium recycling in pressurized-water reactors; - Void reactivity effect in pressurized-water reactors; - Fast Plutonium-burner reactors: beginning of life; - Plutonium recycling in fast reactors; - Multiple recycling in advanced pressurized-water reactors. The results have been published (see references). ZZ-WPPR-1-A/B contains graphs and tables relative to the PWR Mox pin cell benchmark, representing typical fuel for plutonium recycling, one corresponding to a first cycle, the second for a fifth cycle. These computer readable files contain the complete set of results, while the printed report contains only a subset. ZZ-WPPR-2-CYC1 are the results from cycle 1 of the multiple recycling benchmarks
A 3D stylized half-core CANDU benchmark problem
International Nuclear Information System (INIS)
Pounders, Justin M.; Rahnema, Farzad; Serghiuta, Dumitru; Tholammakkil, John
2011-01-01
A 3D stylized half-core Canadian deuterium uranium (CANDU) reactor benchmark problem is presented. The benchmark problem is comprised of a heterogeneous lattice of 37-element natural uranium fuel bundles, heavy water moderated, heavy water cooled, with adjuster rods included as reactivity control devices. Furthermore, a 2-group macroscopic cross section library has been developed for the problem to increase the utility of this benchmark for full-core deterministic transport methods development. Monte Carlo results are presented for the benchmark problem in cooled, checkerboard void, and full coolant void configurations.
International Nuclear Information System (INIS)
McCoy, D.R.
1981-01-01
S/sub N/ computational benchmark solutions are generated for a onegroup and multigroup fuel-void slab lattice cell which is a rough model of a gas-cooled fast reactor (GCFR) lattice cell. The reactivity induced by the extrusion of the fuel material into the voided region is determined for a series of partially extruded lattice cell configurations. A special modified Gauss S/sub N/ ordinate array design is developed in order to obtain eigenvalues with errors less than 0.03% in all of the configurations that are considered. The modified Gauss S/sub N/ ordinate array design has a substantially improved eigenvalue angular convergence behavior when compared to existing S/sub N/ ordinate array designs used in neutron streaming applications. The angular refinement computations are performed in some cases by using a perturbation theory method which enables one to obtain high order S/sub N/ eigenvalue estimates for greatly reduced computational costs
Benchmark calculation of APOLLO-2 and SLAROM-UF in a fast reactor lattice
International Nuclear Information System (INIS)
Hazama, T.
2009-07-01
A lattice cell benchmark calculation is carried out for APOLLO2 and SLAROM-UF on the infinite lattice of a simple pin cell featuring a fast reactor. The accuracy in k-infinity and reaction rates is investigated in their reference and standard level calculations. In the 1. reference level calculation, APOLLO2 and SLAROM-UF agree with the reference value of k-infinity obtained by a continuous energy Monte Carlo calculation within 50 pcm. However, larger errors are observed in a particular reaction rate and energy range. The major problem common to both codes is in the cross section library of 239 Pu in the unresolved energy range. In the 2. reference level calculation, which is based on the ECCO 1968 group structure, both results of k-infinity agree with the reference value within 100 pcm. The resonance overlap effect is observed by several percents in cross sections of heavy nuclides. In the standard level calculation based on the APOLLO2 library creation methodology, a discrepancy appears by more than 300 pcm. A restriction is revealed in APOLLO2. Its standard cross section library does not have a sufficiently small background cross section to evaluate the self shielding effect on 56 Fe cross sections. The restriction can be removed by introducing the mixture self-shielding treatment recently introduced to APOLLO2. SLAROM-UF original standard level calculation based on the JFS-3 library creation methodology is the best among the standard level calculations. Improvement from the SLAROM-UF standard level calculation is achieved mainly by use of a proper weight function for light or intermediate nuclides. (author)
Benchmarking of the 99-group ANSL-V library
International Nuclear Information System (INIS)
Wright, R.Q.; Ford, W.E. III; Greene, N.M.; Petrie, L.M.; Primm, R.T. III; Westfall, R.M.
1987-01-01
The purpose of this paper is to present thermal benchmark data testing results for the BAPL-1, TRX-1, and SEEP-1 lattices, using selected processed cross-sections from the ANSL-V 99-group library. 7 refs., 1 tab
Benchmark test of CP-PACS for lattice QCD
International Nuclear Information System (INIS)
Yoshie, Tomoteru
1996-01-01
The CP-PACS is a massively parallel computer dedicated for calculations in computational physics and will be in operation in the spring of 1996 at Center for Computational Physics, University of Tsukuba. In this paper, we describe the architecture of the CP-PACS and report the results of the estimate of the performance of the CP-PACS for typical lattice QCD calculations. (author)
HELIOS2: Benchmarking against experiments for hexagonal and square lattices
International Nuclear Information System (INIS)
Simeonov, T.
2009-01-01
HELIOS2, is a 2D transport theory program for fuel burnup and gamma-flux calculation. It solves the neutron and gamma transport equations in a general, two-dimensional geometry bounded by a polygon of straight lines. The applied transport solver may be chosen between: The Method of Collision Probabilities (CP) and The Method of Characteristics(MoC). The former is well known for its successful application for preparation of cross section data banks for 3D simulators for all types lattices for WWERs, PWRs, BWRs, AGRs, RBMK and CANDU reactors. The later, MoC, helps in the areas where the requirements of CP for computational power become too large of practical application. The application of HELIOS2 and The Method of Characteristics for some large from calculation point of view benchmarks is presented in this paper. The analysis combines comparisons to measured data from the Hungarian ZR-6 reactor and JAERI facility of Tank type Critical Assembly (TCA) to verify and validate HELIOS2 and MOC for WWER assembly imitators; configurations with different absorber types- ZrB 2 , B 4 C, Eu 2 O 3 and Gd 2 O 3 ; and critical configurations with stainless steel in the reflector. Core eigenvalues and reaction rates are compared. With the account for the uncertainties the results are generally excellent. Special place in this paper is given to the effect of Iron-made radial reflector. Comparisons to measurements from TIC and TCA for stainless steel and Iron reflected cores are presented. The calculated by HELIOS-2 reactivity effect is in very good agreement with the measurements. (author)
HELIOS2: Benchmarking Against Experiments for Hexagonal and Square Lattices
International Nuclear Information System (INIS)
Simeonov, T.
2009-01-01
HELIOS2, is a 2D transport theory program for fuel burnup and gamma-flux calculation. It solves the neutron and gamma transport equations in a general, two-dimensional geometry bounded by a polygon of straight lines. The applied transport solver may be chosen between: The Method of Collision Probabilities and The Method of Characteristics. The former is well known for its successful application for preparation of cross section data banks for 3D simulators for all types lattices for WWER's, PWR's, BWR's, AGR's, RBMK and CANDU reactors. The later, method of characteristics, helps in the areas where the requirements of collision probability for computational power become too large of practical application. The application of HELIOS2 and The method of characteristics for some large from calculation point of view benchmarks is presented in this paper. The analysis combines comparisons to measured data from the Hungarian ZR-6 reactor and JAERI's facility of tanktype critical assembly to verify and validate HELIOS2 and method of characteristics for WWER assembly imitators; configurations with different absorber types-ZrB2, B4C, Eu2O3 and Gd2O3; and critical configurations with stainless steel in the reflector. Core eigenvalues and reaction rates are compared. With the account for the uncertainties the results are generally excellent. Special place in this paper is given to the effect of Iron-made radial reflector. Comparisons to measurements from The Temporary International Collective and tanktype critical assembly for stainless steel and Iron reflected cores are presented. The calculated by HELIOS-2 reactivity effect is in very good agreement with the measurements. (Authors)
Monte Carlo benchmark calculations for 400MWTH PBMR core
International Nuclear Information System (INIS)
Kim, H. C.; Kim, J. K.; Kim, S. Y.; Noh, J. M.
2007-01-01
A large interest in high-temperature gas-cooled reactors (HTGR) has been initiated in connection with hydrogen production in recent years. In this study, as a part of work for establishing Monte Carlo computation system for HTGR core analysis, some benchmark calculations for pebble-type HTGR were carried out using MCNP5 code. The core of the 400MW t h Pebble-bed Modular Reactor (PBMR) was selected as a benchmark model. Recently, the IAEA CRP5 neutronics and thermal-hydraulics benchmark problem was proposed for the testing of existing methods for HTGRs to analyze the neutronics and thermal-hydraulic behavior for the design and safety evaluations of the PBMR. This study deals with the neutronic benchmark problems, for fresh fuel and cold conditions (Case F-1), and first core loading with given number densities (Case F-2), proposed for PBMR. After the detailed MCNP modeling of the whole facility, benchmark calculations were performed. Spherical fuel region of a fuel pebble is divided into cubic lattice element in order to model a fuel pebble which contains, on average, 15000 CFPs (Coated Fuel Particles). Each element contains one CFP at its center. In this study, the side length of each cubic lattice element to have the same amount of fuel was calculated to be 0.1635 cm. The remaining volume of each lattice element was filled with graphite. All of different 5 concentric shells of CFP were modeled. The PBMR annular core consists of approximately 452000 pebbles in the benchmark problems. In Case F-1 where the core was filled with only fresh fuel pebble, a BCC(body-centered-cubic) lattice model was employed in order to achieve the random packing core with the packing fraction of 0.61. The BCC lattice was also employed with the size of the moderator pebble increased in a manner that reproduces the specified F/M ratio of 1:2 while preserving the packing fraction of 0.61 in Case F-2. The calculations were pursued with ENDF/B-VI cross-section library and used sab2002 S(α,
Thermal reactor benchmark tests on JENDL-2
International Nuclear Information System (INIS)
Takano, Hideki; Tsuchihashi, Keichiro; Yamane, Tsuyoshi; Akino, Fujiyoshi; Ishiguro, Yukio; Ido, Masaru.
1983-11-01
A group constant library for the thermal reactor standard nuclear design code system SRAC was produced by using the evaluated nuclear data JENDL-2. Furthermore, the group constants for 235 U were calculated also from ENDF/B-V. Thermal reactor benchmark calculations were performed using the produced group constant library. The selected benchmark cores are two water-moderated lattices (TRX-1 and 2), two heavy water-moderated cores (DCA and ETA-1), two graphite-moderated cores (SHE-8 and 13) and eight critical experiments for critical safety. The effective multiplication factors and lattice cell parameters were calculated and compared with the experimental values. The results are summarized as follows. (1) Effective multiplication factors: The results by JENDL-2 are considerably improved in comparison with ones by ENDF/B-IV. The best agreement is obtained by using JENDL-2 and ENDF/B-V (only 235 U) data. (2) Lattice cell parameters: For the rho 28 (the ratio of epithermal to thermal 238 U captures) and C* (the ratio of 238 U captures to 235 U fissions), the values calculated by JENDL-2 are in good agreement with the experimental values. The rho 28 (the ratio of 238 U to 235 U fissions) are overestimated as found also for the fast reactor benchmarks. The rho 02 (the ratio of epithermal to thermal 232 Th captures) calculated by JENDL-2 or ENDF/B-IV are considerably underestimated. The functions of the SRAC system have been continued to be extended according to the needs of its users. A brief description will be given, in Appendix B, to the extended parts of the SRAC system together with the input specification. (author)
Energy Technology Data Exchange (ETDEWEB)
Gruppelaar, H. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Klippel, H.T. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Kloosterman, J.L. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Hoogenboom, J.E. [Technische Univ. Delft (Netherlands). Interfacultair Reactor Instituut; Leege, P.F.A. de [Technische Univ. Delft (Netherlands). Interfacultair Reactor Instituut; Verhagen, F.C.M. [Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands); Bruggink, J.C. [Gemeenschappelijke Kernenergiecentrale Nederland N.V., Dodewaard (Netherlands)
1993-11-01
Benchmark results of the Dutch PINK working group on calculational benchmarks on single pin cell and multipin assemblies as defined by EPRI are presented and evaluated. First a short update of methods used by the various institutes involved is given as well as an update of the status with respect to previous performed pin-cell calculations. Problems detected in previous pin-cell calculations are inspected more closely. Detailed discussion of results of multipin assembly calculations is given. The assembly consists of 9 pins in a multicell square lattice in which the central pin is filled differently, i.e. a Gd pin for the BWR assembly and a control rod/guide tube for the PWR assembly. The results for pin cells showed a rather good overall agreement between the four participants although BWR pins with high void fraction turned out to be difficult to calculate. With respect to burnup calculations good overall agreement for the reactivity swing was obtained, provided that a fine time grid is used. (orig.)
International Nuclear Information System (INIS)
Gruppelaar, H.; Klippel, H.T.; Kloosterman, J.L.; Hoogenboom, J.E.; Bruggink, J.C.
1993-11-01
Benchmark results of the Dutch PINK working group on calculational benchmarks on single pin cell and multipin assemblies as defined by EPRI are presented and evaluated. First a short update of methods used by the various institutes involved is given as well as an update of the status with respect to previous performed pin-cell calculations. Problems detected in previous pin-cell calculations are inspected more closely. Detailed discussion of results of multipin assembly calculations is given. The assembly consists of 9 pins in a multicell square lattice in which the central pin is filled differently, i.e. a Gd pin for the BWR assembly and a control rod/guide tube for the PWR assembly. The results for pin cells showed a rather good overall agreement between the four participants although BWR pins with high void fraction turned out to be difficult to calculate. With respect to burnup calculations good overall agreement for the reactivity swing was obtained, provided that a fine time grid is used. (orig.)
Assessment of neutron transport codes for application to CANDU fuel lattices analysis
International Nuclear Information System (INIS)
Roh, Gyu Hong; Choi, Hang Bok
1999-08-01
In order to assess the applicability of WIMS-AECL and HELIOS code to the CANDU fuel lattice analysis, the physics calculations has been carried out for the standard CANDU fuel and DUPIC fuel lattices, and the results were compared with those of Monte Carlo code MCNP-4B. In this study, in order to consider the full isotopic composition and the temperature effect, new MCNP libraries have been generated from ENDF/B-VI release 3 and validated for typical benchmark problems. The TRX-1,2,BAPL-1,2,3 pin -cell lattices and KENO criticality safety benchmark calculations have been performed for the new MCNP libraries, and the results have shown that the new MCNP library has sufficient accuracy to be used for physics calculation. Then, the lattice codes have been benchmarked by the MCNP code for the major physics parameters such as the burnup reactivity, void reactivity, relative pin power and Doppler coefficient, etc. for the standard CANDU fuel and DUPIC fuel lattices. For the standard CANDU fuel lattice, it was found that the results of WIMS-AECL calculations are consistent with those of MCNP. For the DUPIC fuel lattice, however, the results of WIMS-AECL calculations with ENDF/B-V library have shown that the discrepancy from the results of MCNP calculations increases when the fuel burnup is relatively high. The burnup reactivities of WIMS-ACEL calculations with ENDF/B-VI library have shown excellent agreements with those of MCNP calculation for both the standard CANDU and DUPIC fuel lattices. However, the Doppler coefficient have relatively large discrepancies compared with MCNP calculations, and the difference increases as the fuel burns. On the other hand, the results of HELIOS calculation are consistent with those of MCNP even though the discrepancy is slightly larger compared with the case of the standard CANDU fuel lattice. this study has shown that the WIMS-AECL products reliable results for the natural uranium fuel. However, it is recommended that the WIMS
Criticality Benchmark Results Using Various MCNP Data Libraries
International Nuclear Information System (INIS)
Frankle, Stephanie C.
1999-01-01
A suite of 86 criticality benchmarks has been recently implemented in MCNPtrademark as part of the nuclear data validation effort. These benchmarks have been run using two sets of MCNP continuous-energy neutron data: ENDF/B-VI based data through Release 2 (ENDF60) and the ENDF/B-V based data. New evaluations were completed for ENDF/B-VI for a number of the important nuclides such as the isotopes of H, Be, C, N, O, Fe, Ni, 235,238 U, 237 Np, and 239,240 Pu. When examining the results of these calculations for the five manor categories of 233 U, intermediate-enriched 235 U (IEU), highly enriched 235 U (HEU), 239 Pu, and mixed metal assembles, we find the following: (1) The new evaluations for 9 Be, 12 C, and 14 N show no net effect on k eff ; (2) There is a consistent decrease in k eff for all of the solution assemblies for ENDF/B-VI due to 1 H and 16 O, moving k eff further from the benchmark value for uranium solutions and closer to the benchmark value for plutonium solutions; (3) k eff decreased for the ENDF/B-VI Fe isotopic data, moving the calculated k eff further from the benchmark value; (4) k eff decreased for the ENDF/B-VI Ni isotopic data, moving the calculated k eff closer to the benchmark value; (5) The W data remained unchanged and tended to calculate slightly higher than the benchmark values; (6) For metal uranium systems, the ENDF/B-VI data for 235 U tends to decrease k eff while the 238 U data tends to increase k eff . The net result depends on the energy spectrum and material specifications for the particular assembly; (7) For more intermediate-energy systems, the changes in the 235,238 U evaluations tend to increase k eff . For the mixed graphite and normal uranium-reflected assembly, a large increase in k eff due to changes in the 238 U evaluation moved the calculated k eff much closer to the benchmark value. (8) There is little change in k eff for the uranium solutions due to the new 235,238 U evaluations; and (9) There is little change in k eff
Evaluation of PWR and BWR pin cell benchmark results
International Nuclear Information System (INIS)
Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J.; Hoogenboom, J.E.; Leege, P.F.A. de; Voet, J. van der; Verhagen, F.C.M.
1991-12-01
Benchmark results of the Dutch PINK working group on PWR and BWR pin cell calculational benchmark as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: a part of the results is satisfactory, some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs
Evaluation of PWR and BWR pin cell benchmark results
Energy Technology Data Exchange (ETDEWEB)
Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))
1991-12-01
Benchmark results of the Dutch PINK working group on PWR and BWR pin cell calculational benchmark as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: a part of the results is satisfactory, some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs.
Evaluation of PWR and BWR pin cell benchmark results
Energy Technology Data Exchange (ETDEWEB)
Pilgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))
1991-12-01
Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmark as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: a part of the results is satisfactory, some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs.; 9 figs.; 30 tabs.
Benchmarking of EPRI-cell epithermal methods with the point-energy discrete-ordinates code (OZMA)
International Nuclear Information System (INIS)
Williams, M.L.; Wright, R.Q.; Barhen, J.; Rothenstein, W.
1982-01-01
The purpose of the present study is to benchmark E-C resonance-shielding and cell-averaging methods against a rigorous deterministic solution on a fine-group level (approx. 30 groups between 1 eV and 5.5 keV). The benchmark code used is OZMA, which solves the space-dependent slowing-down equations using continuous-energy discrete ordinates or integral transport theory to produce fine-group cross sections. Results are given for three water-moderated lattices - a mixed oxide, a uranium method, and a tight-pitch high-conversion uranium oxide configuration. The latter two lattices were chosen because of the strong self shielding of the 238 U resonances
Lattice results for heavy light matrix elements
International Nuclear Information System (INIS)
Soni, A.
1994-09-01
Lattice results for heavy light matrix elements are reviewed and some of their implications are very briefly discussed. Despite the fact that in most cases the lattice results for weak matrix elements at the moment have only a modest accuracy of about 20--30% they already have important phenomenological repercussions; e.g. for V td /V ts , x s /x d and B → K*γ
KAERI results for BN600 full MOX benchmark (Phase 4)
International Nuclear Information System (INIS)
Lee, Kibog Lee
2003-01-01
The purpose of this document is to report the results of KAERI's calculation for the Phase-4 of BN-600 full MOX fueled core benchmark analyses according to the RCM report of IAEA CRP Action on U pdated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. T he BN-600 full MOX core model is based on the specification in the document, F ull MOX Model (Phase4. doc ) . This document addresses the calculational methods employed in the benchmark analyses and benchmark results carried out by KAERI
Status on benchmark testing of CENDL-3
Liu Ping
2002-01-01
CENDL-3, the newest version of China Evaluated Nuclear Data Library has been finished, and distributed for some benchmarks analysis recently. The processing was carried out using the NJOY nuclear data processing code system. The calculations and analysis of benchmarks were done with Monte Carlo code MCNP and reactor lattice code WIMSD5A. The calculated results were compared with the experimental results based on ENDF/B6. In most thermal and fast uranium criticality benchmarks, the calculated k sub e sub f sub f values with CENDL-3 were in good agreements with experimental results. In the plutonium fast cores, the k sub e sub f sub f values were improved significantly with CENDL-3. This is duo to reevaluation of the fission spectrum and elastic angular distributions of sup 2 sup 3 sup 9 Pu and sup 2 sup 4 sup 0 Pu. CENDL-3 underestimated the k sub e sub f sub f values compared with other evaluated data libraries for most spherical or cylindrical assemblies of plutonium or uranium with beryllium
Quantum lattice model solver HΦ
Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki
2017-08-01
HΦ [aitch-phi ] is a program package based on the Lanczos-type eigenvalue solution applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model and the Kondo-lattice model. While it works well on PCs and PC-clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of the system size. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain theoretical background and user-interface of HΦ. We also show the benchmark results of HΦ on supercomputers such as the K computer at RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for the Solid State Physics (ISSP).
Studies of thermal-reactor benchmark-data interpretation: experimental corrections
International Nuclear Information System (INIS)
Sher, R.; Fiarman, S.
1976-10-01
Experimental values of integral parameters of the lattices studied in this report, i.e., the MIT(D 2 O) and TRX benchmark lattices have been re-examined and revised. The revisions correct several systematic errors that have been previously ignored or considered insignificant. These systematic errors are discussed in detail. The final corrected values are presented
Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B
International Nuclear Information System (INIS)
Maucec, M.; Glumac, B.
1996-01-01
The development of unit cell and full reactor core models of DIMPLE S01A and TRX-1 and TRX-2 benchmark experiments, using Monte Carlo computer code MCNP4B is presented. Nuclear data from ENDF/B-V and VI version of cross-section library were used in the calculations. In addition, a comparison to results obtained with the similar models and cross-section data from the EJ2-MCNPlib library (which is based upon the JEF-2.2 evaluation) developed in IRC Petten, Netherlands is presented. The results of the criticality calculation with ENDF/B-VI data library, and a comparison to results obtained using JEF-2.2 evaluation, confirm the MCNP4B full core model of a DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality calculations results obtained using the TRX full core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)
Benchmark of the CASMO-3G/MICROBURN-B codes for Commonwealth Edison boiling water reactors
International Nuclear Information System (INIS)
Wheeler, J.K.; Pallotta, A.S.
1992-01-01
The Commonwealth Edison Company has performed an extensive benchmark against measured data from three boiling water reactors using the Studsvik lattice physics code CASMO-3G and the Siemens Nuclear Power three-dimensional simulator code MICROBURN-B. The measured data of interest for this benchmark are the hot and cold reactivity, and the core power distributions as measured by the traversing incore probe system and gamma scan data for fuel pins and assemblies. A total of nineteen unit-cycles were evaluated. The database included fuel product lines manufactured by General Electric and Siemens Nuclear Power, wit assemblies containing 7 x 7 to 9 x 9 pin configurations, several water rod designs, various enrichments and gadolina loadings, and axially varying lattice designs throughout the enriched portion of the bundle. The results of the benchmark present evidence that the CASMO-3G/MICROBURN-B code package can adequately model the range of fuel and core types in the benchmark, and the codes are acceptable for performing neutronic analyses of Commonwealth Edison's boiling water reactors
Comparing the results of lattice and off-lattice simulations for the melt of nonconcatenated rings
International Nuclear Information System (INIS)
Halverson, Jonathan D; Kremer, Kurt; Grosberg, Alexander Y
2013-01-01
To study the conformational properties of unknotted and nonconcatenated ring polymers in the melt, we present a detailed qualitative and quantitative comparison of simulation data obtained by molecular dynamics simulation using an off-lattice bead-spring model and by Monte Carlo simulation using a lattice model. We observe excellent, and sometimes even unexpectedly good, agreement between the off-lattice and lattice results for many quantities measured including the gyration radii of the ring polymers, gyration radii of their subchains, contact probabilities, surface characteristics, number of contacts between subchains, and the static structure factors of the rings and their subchains. These results are, in part, put in contrast to Moore curves, and the open, linear polymer counterparts. While our analysis is extensive, our understanding of the ring melt conformations is still rather preliminary. (paper)
Compilation of benchmark results for fusion related Nuclear Data
International Nuclear Information System (INIS)
Maekawa, Fujio; Wada, Masayuki; Oyama, Yukio; Ichihara, Chihiro; Makita, Yo; Takahashi, Akito
1998-11-01
This report compiles results of benchmark tests for validation of evaluated nuclear data to be used in nuclear designs of fusion reactors. Parts of results were obtained under activities of the Fusion Neutronics Integral Test Working Group organized by the members of both Japan Nuclear Data Committee and the Reactor Physics Committee. The following three benchmark experiments were employed used for the tests: (i) the leakage neutron spectrum measurement experiments from slab assemblies at the D-T neutron source at FNS/JAERI, (ii) in-situ neutron and gamma-ray measurement experiments (so-called clean benchmark experiments) also at FNS, and (iii) the pulsed sphere experiments for leakage neutron and gamma-ray spectra at the D-T neutron source facility of Osaka University, OKTAVIAN. Evaluated nuclear data tested were JENDL-3.2, JENDL Fusion File, FENDL/E-1.0 and newly selected data for FENDL/E-2.0. Comparisons of benchmark calculations with the experiments for twenty-one elements, i.e., Li, Be, C, N, O, F, Al, Si, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zr, Nb, Mo, W and Pb, are summarized. (author). 65 refs
International Nuclear Information System (INIS)
Marck, Steven C. van der
2006-01-01
benchmarks deviates only 0.017% from the measured benchmark value. Moreover, no clear trends (with e.g. enrichment, lattice pitch, or spectrum) have been observed. Also for fast spectrum benchmarks, both for intermediately or highly enriched uranium and for plutonium, clear improvements are apparent from the calculations. The results for bare assemblies have improved, as well as those with a depleted or natural uranium reflector. On the other hand, the results for plutonium solutions (PU-SOL-THERM) are still high, on average (over 120 benchmarks) roughly 0.6%. Furthermore there still is a bias for a range of benchmarks based on cores in the Zero Power Reactor (ANL) with sizable amounts of tungsten in them. The results for the fusion shielding benchmarks have not changed significantly, compared to ENDF/B-VI.8, for most materials. The delayed neutron testing shows that the values for both thermal and fast spectrum cases are now well predicted, which is an improvement when compared with ENDF/B-VI.8
Lattice QCD calculations on commodity clusters at DESY
International Nuclear Information System (INIS)
Gellrich, A.; Pop, D.; Wegner, P.; Wittig, H.; Hasenbusch, M.; Jansen, K.
2003-06-01
Lattice Gauge Theory is an integral part of particle physics that requires high performance computing in the multi-Tflops regime. These requirements are motivated by the rich research program and the physics milestones to be reached by the lattice community. Over the last years the enormous gains in processor performance, memory bandwidth, and external I/O bandwidth for parallel applications have made commodity clusters exploiting PCs or workstations also suitable for large Lattice Gauge Theory applications. For more than one year two clusters have been operated at the two DESY sites in Hamburg and Zeuthen, consisting of 32 resp. 16 dual-CPU PCs, equipped with Intel Pentium 4 Xeon processors. Interconnection of the nodes is done by way of Myrinet. Linux was chosen as the operating system. In the course of the projects benchmark programs for architectural studies were developed. The performance of the Wilson-Dirac Operator (also in an even-odd preconditioned version) as the inner loop of the Lattice QCD (LQCD) algorithms plays the most important role in classifying the hardware basis to be used. Using the SIMD streaming extensions (SSE/SSE2) on Intel's Pentium 4 Xeon CPUs give promising results for both the single CPU and the parallel version. The parallel performance, in addition to the CPU power and the memory throughput, is nevertheless strongly influenced by the behavior of hardware components like the PC chip-set and the communication interfaces. The paper starts by giving a short explanation about the physics background and the motivation for using PC clusters for Lattice QCD. Subsequently, the concept, implementation, and operating experiences of the two clusters are discussed. Finally, the paper presents benchmark results and discusses comparisons to systems with different hardware components including Myrinet-, GigaBit-Ethernet-, and Infiniband-based interconnects. (orig.)
DRAGON solutions to the 3D transport benchmark over a range in parameter space
International Nuclear Information System (INIS)
Martin, Nicolas; Hebert, Alain; Marleau, Guy
2010-01-01
DRAGON solutions to the 'NEA suite of benchmarks for 3D transport methods and codes over a range in parameter space' are discussed in this paper. A description of the benchmark is first provided, followed by a detailed review of the different computational models used in the lattice code DRAGON. Two numerical methods were selected for generating the required quantities for the 729 configurations of this benchmark. First, S N calculations were performed using fully symmetric angular quadratures and high-order diamond differencing for spatial discretization. To compare S N results with those of another deterministic method, the method of characteristics (MoC) was also considered for this benchmark. Comparisons between reference solutions, S N and MoC results illustrate the advantages and drawbacks of each methods for this 3-D transport problem.
EPRI depletion benchmark calculations using PARAGON
International Nuclear Information System (INIS)
Kucukboyaci, Vefa N.
2015-01-01
Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII based data reduces the excess conservatism and brings the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for 10, 20, 30, 40, 50, and 60 GWd/MTU and 3 cooling times 100 h, 5 years, and 15 years. These benchmark cases are analyzed with PARAGON and the SCALE package and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty
Repeated Results Analysis for Middleware Regression Benchmarking
Czech Academy of Sciences Publication Activity Database
Bulej, Lubomír; Kalibera, T.; Tůma, P.
2005-01-01
Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005
Efficient LBM visual simulation on face-centered cubic lattices.
Petkov, Kaloian; Qiu, Feng; Fan, Zhe; Kaufman, Arie E; Mueller, Klaus
2009-01-01
The Lattice Boltzmann method (LBM) for visual simulation of fluid flow generally employs cubic Cartesian (CC) lattices such as the D3Q13 and D3Q19 lattices for the particle transport. However, the CC lattices lead to suboptimal representation of the simulation space. We introduce the face-centered cubic (FCC) lattice, fD3Q13, for LBM simulations. Compared to the CC lattices, the fD3Q13 lattice creates a more isotropic sampling of the simulation domain and its single lattice speed (i.e., link length) simplifies the computations and data storage. Furthermore, the fD3Q13 lattice can be decomposed into two independent interleaved lattices, one of which can be discarded, which doubles the simulation speed. The resulting LBM simulation can be efficiently mapped to the GPU, further increasing the computational performance. We show the numerical advantages of the FCC lattice on channeled flow in 2D and the flow-past-a-sphere benchmark in 3D. In both cases, the comparison is against the corresponding CC lattices using the analytical solutions for the systems as well as velocity field visualizations. We also demonstrate the performance advantages of the fD3Q13 lattice for interactive simulation and rendering of hot smoke in an urban environment using thermal LBM.
Directory of Open Access Journals (Sweden)
2015-12-01
Full Text Available Numerical results for ground-state and excited-state properties (energies, double occupancies, and Matsubara-axis self-energies of the single-orbital Hubbard model on a two-dimensional square lattice are presented, in order to provide an assessment of our ability to compute accurate results in the thermodynamic limit. Many methods are employed, including auxiliary-field quantum Monte Carlo, bare and bold-line diagrammatic Monte Carlo, method of dual fermions, density matrix embedding theory, density matrix renormalization group, dynamical cluster approximation, diffusion Monte Carlo within a fixed-node approximation, unrestricted coupled cluster theory, and multireference projected Hartree-Fock methods. Comparison of results obtained by different methods allows for the identification of uncertainties and systematic errors. The importance of extrapolation to converged thermodynamic-limit values is emphasized. Cases where agreement between different methods is obtained establish benchmark results that may be useful in the validation of new approaches and the improvement of existing methods.
The Benchmark Test Results of QNX RTOS
Energy Technology Data Exchange (ETDEWEB)
Kim, Jang Yeol; Lee, Young Jun; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2010-10-15
A Real-Time Operating System(RTOS) is an Operating System(OS) intended for real-time applications. Benchmarking is a point of reference by which something can be measured. The QNX is a Real Time Operating System(RTOS) developed by QSSL(QNX Software Systems Ltd.) in Canada. The ELMSYS is the brand name of commercially available Personal Computer(PC) for applications such as Cabinet Operator Module(COM) of Digital Plant Protection System(DPPS) and COM of Digital Engineered Safety Features Actuation System(DESFAS). The ELMSYS PC Hardware is being qualified by KTL(Korea Testing Lab.) for use as a Cabinet Operator Module(COM). The QNX RTOS is being dedicated by Korea Atomic Energy Research Institute (KAERI). This paper describes the outline and benchmarking test results on Context Switching, Message Passing, Synchronization and Deadline Violation of QNX RTOS under the ELMSYS PC platform
The Benchmark Test Results of QNX RTOS
International Nuclear Information System (INIS)
Kim, Jang Yeol; Lee, Young Jun; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon
2010-01-01
A Real-Time Operating System(RTOS) is an Operating System(OS) intended for real-time applications. Benchmarking is a point of reference by which something can be measured. The QNX is a Real Time Operating System(RTOS) developed by QSSL(QNX Software Systems Ltd.) in Canada. The ELMSYS is the brand name of commercially available Personal Computer(PC) for applications such as Cabinet Operator Module(COM) of Digital Plant Protection System(DPPS) and COM of Digital Engineered Safety Features Actuation System(DESFAS). The ELMSYS PC Hardware is being qualified by KTL(Korea Testing Lab.) for use as a Cabinet Operator Module(COM). The QNX RTOS is being dedicated by Korea Atomic Energy Research Institute (KAERI). This paper describes the outline and benchmarking test results on Context Switching, Message Passing, Synchronization and Deadline Violation of QNX RTOS under the ELMSYS PC platform
Dark matter, constrained minimal supersymmetric standard model, and lattice QCD.
Giedt, Joel; Thomas, Anthony W; Young, Ross D
2009-11-13
Recent lattice measurements have given accurate estimates of the quark condensates in the proton. We use these results to significantly improve the dark matter predictions in benchmark models within the constrained minimal supersymmetric standard model. The predicted spin-independent cross sections are at least an order of magnitude smaller than previously suggested and our results have significant consequences for dark matter searches.
Benchmarking NNWSI flow and transport codes: COVE 1 results
International Nuclear Information System (INIS)
Hayden, N.K.
1985-06-01
The code verification (COVE) activity of the Nevada Nuclear Waste Storage Investigations (NNWSI) Project is the first step in certification of flow and transport codes used for NNWSI performance assessments of a geologic repository for disposing of high-level radioactive wastes. The goals of the COVE activity are (1) to demonstrate and compare the numerical accuracy and sensitivity of certain codes, (2) to identify and resolve problems in running typical NNWSI performance assessment calculations, and (3) to evaluate computer requirements for running the codes. This report describes the work done for COVE 1, the first step in benchmarking some of the codes. Isothermal calculations for the COVE 1 benchmarking have been completed using the hydrologic flow codes SAGUARO, TRUST, and GWVIP; the radionuclide transport codes FEMTRAN and TRUMP; and the coupled flow and transport code TRACR3D. This report presents the results of three cases of the benchmarking problem solved for COVE 1, a comparison of the results, questions raised regarding sensitivities to modeling techniques, and conclusions drawn regarding the status and numerical sensitivities of the codes. 30 refs
Uranium-fuel thermal reactor benchmark testing of CENDL-3
International Nuclear Information System (INIS)
Liu Ping
2001-01-01
CENDL-3, the new version of China Evaluated Nuclear Data Library are being processed, and distributed for thermal reactor benchmark analysis recently. The processing was carried out using the NJOY nuclear data processing system. The calculations and analyses of uranium-fuel thermal assemblies TRX-1,2, BAPL-1,2,3, ZEEP-1,2,3 were done with lattice code WIMSD5A. The results were compared with the experimental results, the results of the '1986'WIMS library and the results based on ENDF/B-VI. (author)
Chiral and continuum extrapolation of partially quenched lattice results
Energy Technology Data Exchange (ETDEWEB)
C.R. Allton; W. Armour; D.B. Leinweber; A.W. Thomas; R.D. Young
2005-04-01
The vector meson mass is extracted from a large sample of partially quenched, two-flavor lattice QCD simulations. For the first time, discretization, finite-volume and partial quenching artifacts are treated in a unified chiral effective field theory analysis of the lattice simulation results.
Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results
Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)
2013-01-01
Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.
Low-energy scattering on the lattice
International Nuclear Information System (INIS)
Bour Bour, Shahin
2014-01-01
In this thesis we present precision benchmark calculations for two-component fermions in the unitarity limit using an ab initio method, namely Hamiltonian lattice formalism. We calculate the ground state energy for unpolarized four particles (Fermi gas) in a periodic cube as a fraction of the ground state energy of the non-interacting system for two independent representations of the lattice Hamiltonians. We obtain the values 0.211(2) and 0.210(2). These results are in full agreement with the Euclidean lattice and fixed-node diffusion Monte Carlo calculations. We also give an expression for the energy corrections to the binding energy of a bound state in a moving frame. These corrections contain information about the mass and number of the constituents and are topological in origin and will have a broad applications to the lattice calculations of nucleons, nuclei, hadronic molecules and cold atoms. As one of its applications we use this expression and determine the low-energy parameters for the fermion dimer elastic scattering in shallow binding limit. For our lattice calculations we use Luescher's finite volume method. From the lattice calculations we find κa fd =1.174(9) and κr fd =-0.029(13), where κ represents the binding momentum of dimer and a fd (r fd ) denotes the scattering length (effective-range). These results are confirmed by the continuum calculations using the Skorniakov-Ter-Martirosian integral equation which gives 1.17907(1) and -0.0383(3) for the scattering length and effective range, respectively.
Towards a physical interpretation of the entropic lattice Boltzmann method
Malaspinas, Orestis; Deville, Michel; Chopard, Bastien
2008-12-01
The entropic lattice Boltzmann method (ELBM) is one among several different versions of the lattice Boltzmann method for the simulation of hydrodynamics. The collision term of the ELBM is characterized by a nonincreasing H function, guaranteed by a variable relaxation time. We propose here an analysis of the ELBM using the Chapman-Enskog expansion. We show that it can be interpreted as some kind of subgrid model, where viscosity correction scales like the strain rate tensor. We confirm our analytical results by the numerical computations of the relaxation time modifications on the two-dimensional dipole-wall interaction benchmark.
Pericles and Attila results for the C5G7 MOX benchmark problems
International Nuclear Information System (INIS)
Wareing, T.A.; McGhee, J.M.
2002-01-01
Recently the Nuclear Energy Agency has published a new benchmark entitled, 'C5G7 MOX Benchmark.' This benchmark is to test the ability of current transport codes to treat reactor core problems without spatial homogenization. The benchmark includes both a two- and three-dimensional problem. We have calculated results for these benchmark problems with our Pericles and Attila codes. Pericles is a one-,two-, and three-dimensional unstructured grid discrete-ordinates code and was used for the twodimensional benchmark problem. Attila is a three-dimensional unstructured tetrahedral mesh discrete-ordinate code and was used for the three-dimensional problem. Both codes use discontinuous finite element spatial differencing. Both codes use diffusion synthetic acceleration (DSA) for accelerating the inner iterations.
Analyzing the BBOB results by means of benchmarking concepts.
Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C
2015-01-01
We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first one is: which algorithm is the "best" one? and the second one is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.
Thermal reactor benchmark testing of 69 group library
International Nuclear Information System (INIS)
Liu Guisheng; Wang Yaoqing; Liu Ping; Zhang Baocheng
1994-01-01
Using a code system NSLINK, AMPX master library in WIMS 69 groups structure are made from nuclides relating to 4 newest evaluated nuclear data libraries. Some integrals of 10 thermal reactor benchmark assemblies recommended by the U.S. CSEWG are calculated using rectified PASC-1 code system and compared with foreign results, the authors results are in good agreement with others. 69 group libraries of evaluated data bases in TPFAP interface file are generated with NJOY code system. The k ∞ values of 6 cell lattice assemblies are calculated by the code CBM. The calculated results are analysed and compared
International Nuclear Information System (INIS)
Akie, Hiroshi; Ishiguro, Yukio; Takano, Hideki
1988-10-01
The results of the NEACRP HCLWR cell burnup benchmark calculations are summarized in this report. Fifteen organizations from eight countries participated in this benchmark and submitted twenty solutions. Large differences are still observed among the calculated values of void reactivities and conversion ratios. These differences are mainly caused from the discrepancies in the reaction rates of U-238, Pu-239 and fission products. The physics problems related to these results are briefly investigated in the report. In the specialists' meeting on this benchmark calculations held in April 1988, it was recommended to perform continuous energy Monte Carlo calculations in order to obtain reference solutions for design codes. The conclusions resulted from the specialists' meeting are also presented. (author)
Results from the IAEA benchmark of spallation models
International Nuclear Information System (INIS)
Leray, S.; David, J.C.; Khandaker, M.; Mank, G.; Mengoni, A.; Otsuka, N.; Filges, D.; Gallmeier, F.; Konobeyev, A.; Michel, R.
2011-01-01
Spallation reactions play an important role in a wide domain of applications. In the simulation codes used in this field, the nuclear interaction cross-sections and characteristics are computed by spallation models. The International Atomic Energy Agency (IAEA) has recently organised a benchmark of the spallation models used or that could be used in the future into high-energy transport codes. The objectives were, first, to assess the prediction capabilities of the different spallation models for the different mass and energy regions and the different exit channels and, second, to understand the reason for the success or deficiency of the models. Results of the benchmark concerning both the analysis of the prediction capabilities of the models and the first conclusions on the physics of spallation models are presented. (authors)
Straight velocity boundaries in the lattice Boltzmann method
Latt, Jonas; Chopard, Bastien; Malaspinas, Orestis; Deville, Michel; Michler, Andreas
2008-05-01
Various ways of implementing boundary conditions for the numerical solution of the Navier-Stokes equations by a lattice Boltzmann method are discussed. Five commonly adopted approaches are reviewed, analyzed, and compared, including local and nonlocal methods. The discussion is restricted to velocity Dirichlet boundary conditions, and to straight on-lattice boundaries which are aligned with the horizontal and vertical lattice directions. The boundary conditions are first inspected analytically by applying systematically the results of a multiscale analysis to boundary nodes. This procedure makes it possible to compare boundary conditions on an equal footing, although they were originally derived from very different principles. It is concluded that all five boundary conditions exhibit second-order accuracy, consistent with the accuracy of the lattice Boltzmann method. The five methods are then compared numerically for accuracy and stability through benchmarks of two-dimensional and three-dimensional flows. None of the methods is found to be throughout superior to the others. Instead, the choice of a best boundary condition depends on the flow geometry, and on the desired trade-off between accuracy and stability. From the findings of the benchmarks, the boundary conditions can be classified into two major groups. The first group comprehends boundary conditions that preserve the information streaming from the bulk into boundary nodes and complete the missing information through closure relations. Boundary conditions in this group are found to be exceptionally accurate at low Reynolds number. Boundary conditions of the second group replace all variables on boundary nodes by new values. They exhibit generally much better numerical stability and are therefore dedicated for use in high Reynolds number flows.
Results of LWR core transient benchmarks
International Nuclear Information System (INIS)
Finnemann, H.; Bauer, H.; Galati, A.; Martinelli, R.
1993-10-01
LWR core transient (LWRCT) benchmarks, based on well defined problems with a complete set of input data, are used to assess the discrepancies between three-dimensional space-time kinetics codes in transient calculations. The PWR problem chosen is the ejection of a control assembly from an initially critical core at hot zero power or at full power, each for three different geometrical configurations. The set of problems offers a variety of reactivity excursions which efficiently test the coupled neutronic/thermal - hydraulic models of the codes. The 63 sets of submitted solutions are analyzed by comparison with a nodal reference solution defined by using a finer spatial and temporal resolution than in standard calculations. The BWR problems considered are reactivity excursions caused by cold water injection and pressurization events. In the present paper, only the cold water injection event is discussed and evaluated in some detail. Lacking a reference solution the evaluation of the 8 sets of BWR contributions relies on a synthetic comparative discussion. The results of this first phase of LWRCT benchmark calculations are quite satisfactory, though there remain some unresolved issues. It is therefore concluded that even more challenging problems can be successfully tackled in a suggested second test phase. (authors). 46 figs., 21 tabs., 3 refs
PHISICS/RELAP5-3D RESULTS FOR EXERCISES II-1 AND II-2 OF THE OECD/NEA MHTGR-350 BENCHMARK
Energy Technology Data Exchange (ETDEWEB)
Strydom, Gerhard [Idaho National Laboratory
2016-03-01
The Idaho National Laboratory (INL) Advanced Reactor Technologies (ART) High-Temperature Gas-Cooled Reactor (HTGR) Methods group currently leads the Modular High-Temperature Gas-Cooled Reactor (MHTGR) 350 benchmark. The benchmark consists of a set of lattice-depletion, steady-state, and transient problems that can be used by HTGR simulation groups to assess the performance of their code suites. The paper summarizes the results obtained for the first two transient exercises defined for Phase II of the benchmark. The Parallel and Highly Innovative Simulation for INL Code System (PHISICS), coupled with the INL system code RELAP5-3D, was used to generate the results for the Depressurized Conduction Cooldown (DCC) (exercise II-1a) and Pressurized Conduction Cooldown (PCC) (exercise II-2) transients. These exercises require the time-dependent simulation of coupled neutronics and thermal-hydraulics phenomena, and utilize the steady-state solution previously obtained for exercise I-3 of Phase I. This paper also includes a comparison of the benchmark results obtained with a traditional system code “ring” model against a more detailed “block” model that include kinetics feedback on an individual block level and thermal feedbacks on a triangular sub-mesh. The higher spatial fidelity that can be obtained by the block model is illustrated with comparisons of the maximum fuel temperatures, especially in the case of natural convection conditions that dominate the DCC and PCC events. Differences up to 125 K (or 10%) were observed between the ring and block model predictions of the DCC transient, mostly due to the block model’s capability of tracking individual block decay powers and more detailed helium flow distributions. In general, the block model only required DCC and PCC calculation times twice as long as the ring models, and it therefore seems that the additional development and calculation time required for the block model could be worth the gain that can be
Monte Carlo burnup simulation of the TAKAHAMA-3 benchmark experiment
International Nuclear Information System (INIS)
Dalle, Hugo M.
2009-01-01
High burnup PWR fuel is currently being studied at CDTN/CNEN-MG. Monte Carlo burnup code system MONTEBURNS is used to characterize the neutronic behavior of the fuel. In order to validate the code system and calculation methodology to be used in this study the Japanese Takahama-3 Benchmark was chosen, as it is the single burnup benchmark experimental data set freely available that partially reproduces the conditions of the fuel under evaluation. The burnup of the three PWR fuel rods of the Takahama-3 burnup benchmark was calculated by MONTEBURNS using the simplest infinite fuel pin cell model and also a more complex representation of an infinite heterogeneous fuel pin cells lattice. Calculations results for the mass of most isotopes of Uranium, Neptunium, Plutonium, Americium, Curium and some fission products, commonly used as burnup monitors, were compared with the Post Irradiation Examinations (PIE) values for all the three fuel rods. Results have shown some sensitivity to the MCNP neutron cross-section data libraries, particularly affected by the temperature in which the evaluated nuclear data files were processed. (author)
Updated lattice results for parton distributions
International Nuclear Information System (INIS)
Alexandrou, Constantia; Cichy, Krzysztof; Hadjiyiannakou, Kyriakos; Jansen, Karl; Steffens, Fernanda; Wiese, Christian
2017-07-01
We provide an analysis of the x-dependence of the bare unpolarized, helicity and transversity iso-vector parton distribution functions (PDFs) from lattice calculations employing (maximally) twisted mass fermions. The x-dependence of the calculated PDFs resembles the one of the phenomenological parameterizations, a feature that makes this approach very promising. Furthermore, we apply momentum smearing for the relevant matrix elements to compute the lattice PDFs and find a large improvement factor when compared to conventional Gaussian smearing. This allows us to extend the lattice computation of the distributions to higher values of the nucleon momentum, which is essential for the prospects of a reliable extraction of the PDFs in the future.
Updated lattice results for parton distributions
Energy Technology Data Exchange (ETDEWEB)
Alexandrou, Constantia [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; The Cyprus Institute, Nicosia (Cyprus); Cichy, Krzysztof [Frankfurt Univ. (Germany). Inst. fuer Theoretische Physik; Poznan Univ. (Poland). Faculty of Physics; Constantinou, Martha [Temple Univ., Philadelphia, PA (United States); Hadjiyiannakou, Kyriakos [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Jansen, Karl; Steffens, Fernanda; Wiese, Christian [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC
2017-07-15
We provide an analysis of the x-dependence of the bare unpolarized, helicity and transversity iso-vector parton distribution functions (PDFs) from lattice calculations employing (maximally) twisted mass fermions. The x-dependence of the calculated PDFs resembles the one of the phenomenological parameterizations, a feature that makes this approach very promising. Furthermore, we apply momentum smearing for the relevant matrix elements to compute the lattice PDFs and find a large improvement factor when compared to conventional Gaussian smearing. This allows us to extend the lattice computation of the distributions to higher values of the nucleon momentum, which is essential for the prospects of a reliable extraction of the PDFs in the future.
Benchmark results in radiative transfer
International Nuclear Information System (INIS)
Garcia, R.D.M.; Siewert, C.E.
1986-02-01
Several aspects of the F N method are reported, and the method is used to solve accurately some benchmark problems in radiative transfer in the field of atmospheric physics. The method was modified to solve cases of pure scattering and an improved process was developed for computing the radiation intensity. An algorithms for computing several quantities used in the F N method was done. An improved scheme to evaluate certain integrals relevant to the method is done, and a two-term recursion relation that has proved useful for the numerical evaluation of matrix elements, basic for the method, is given. The methods used to solve the encountered linear algebric equations are discussed, and the numerical results are evaluated. (M.C.K.) [pt
The Medical Library Association Benchmarking Network: results*
Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C.; Smith, Bernie Todd
2006-01-01
Objective: This article presents some limited results from the Medical Library Association (MLA) Benchmarking Network survey conducted in 2002. Other uses of the data are also presented. Methods: After several years of development and testing, a Web-based survey opened for data input in December 2001. Three hundred eighty-five MLA members entered data on the size of their institutions and the activities of their libraries. The data from 344 hospital libraries were edited and selected for reporting in aggregate tables and on an interactive site in the Members-Only area of MLANET. The data represent a 16% to 23% return rate and have a 95% confidence level. Results: Specific questions can be answered using the reports. The data can be used to review internal processes, perform outcomes benchmarking, retest a hypothesis, refute a previous survey findings, or develop library standards. The data can be used to compare to current surveys or look for trends by comparing the data to past surveys. Conclusions: The impact of this project on MLA will reach into areas of research and advocacy. The data will be useful in the everyday working of small health sciences libraries as well as provide concrete data on the current practices of health sciences libraries. PMID:16636703
A simplified 2D HTTR benchmark problem
International Nuclear Information System (INIS)
Zhang, Z.; Rahnema, F.; Pounders, J. M.; Zhang, D.; Ougouag, A.
2009-01-01
To access the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of relevant whole core configurations. In this paper we have created a numerical benchmark problem in 2D configuration typical of a high temperature gas cooled prismatic core. This problem was derived from the HTTR start-up experiment. For code-to-code verification, complex details of geometry and material specification of the physical experiments are not necessary. To this end, the benchmark problem presented here is derived by simplifications that remove the unnecessary details while retaining the heterogeneity and major physics properties from the neutronics viewpoint. Also included here is a six-group material (macroscopic) cross section library for the benchmark problem. This library was generated using the lattice depletion code HELIOS. Using this library, benchmark quality Monte Carlo solutions are provided for three different configurations (all-rods-in, partially-controlled and all-rods-out). The reference solutions include the core eigenvalue, block (assembly) averaged fuel pin fission density distributions, and absorption rate in absorbers (burnable poison and control rods). (authors)
International Nuclear Information System (INIS)
D'Hondt, P.; Gehin, J.; Na, B.C.; Sartori, E.; Wiesenack, W.
2001-01-01
One of the options envisaged for disposing of weapons grade plutonium, declared surplus for national defence in the Russian Federation and Usa, is to burn it in nuclear power reactors. The scientific/technical know-how accumulated in the use of MOX as a fuel for electricity generation is of great relevance for the plutonium disposition programmes. An Expert Group of the OECD/Nea is carrying out a series of benchmarks with the aim of facilitating the use of this know-how for meeting this objective. This paper describes the background that led to establishing the Expert Group, and the present status of results from these benchmarks. The benchmark studies cover a theoretical reactor physics benchmark on a VVER-1000 core loaded with MOX, two experimental benchmarks on MOX lattices and a benchmark concerned with MOX fuel behaviour for both solid and hollow pellets. First conclusions are outlined as well as future work. (author)
The Monte Carlo performance benchmark test - AIMS, specifications and first results
Energy Technology Data Exchange (ETDEWEB)
Hoogenboom, J. Eduard, E-mail: j.e.hoogenboom@tudelft.nl [Faculty of Applied Sciences, Delft University of Technology (Netherlands); Martin, William R., E-mail: wrm@umich.edu [Nuclear Engineering and Radiological Sciences, University of Michigan, Ann Arbor, MI (United States); Petrovic, Bojan, E-mail: Bojan.Petrovic@gatech.edu [Nuclear and Radiological Engineering, Georgia Institute of Technology, Atlanta, GA (United States)
2011-07-01
The Monte Carlo performance benchmark for detailed power density calculation in a full-size reactor core is organized under the auspices of the OECD NEA Data Bank. It aims at monitoring over a range of years the increase in performance, measured in terms of standard deviation and computer time, of Monte Carlo calculation of the power density in small volumes. A short description of the reactor geometry and composition is discussed. One of the unique features of the benchmark exercise is the possibility to upload results from participants at a web site of the NEA Data Bank which enables online analysis of results and to graphically display how near we are at the goal of doing a detailed power distribution calculation with acceptable statistical uncertainty in an acceptable computing time. First results are discussed which show that 10 to 100 billion histories must be simulated to reach a standard deviation of a few percent in the estimated power of most of the requested the fuel zones. Even when using a large supercomputer, a considerable speedup is still needed to reach the target of 1 hour computer time. An outlook is given of what to expect from this benchmark exercise over the years. Possible extensions of the benchmark for specific issues relevant in current Monte Carlo calculation for nuclear reactors are also discussed. (author)
The Monte Carlo performance benchmark test - AIMS, specifications and first results
International Nuclear Information System (INIS)
Hoogenboom, J. Eduard; Martin, William R.; Petrovic, Bojan
2011-01-01
The Monte Carlo performance benchmark for detailed power density calculation in a full-size reactor core is organized under the auspices of the OECD NEA Data Bank. It aims at monitoring over a range of years the increase in performance, measured in terms of standard deviation and computer time, of Monte Carlo calculation of the power density in small volumes. A short description of the reactor geometry and composition is discussed. One of the unique features of the benchmark exercise is the possibility to upload results from participants at a web site of the NEA Data Bank which enables online analysis of results and to graphically display how near we are at the goal of doing a detailed power distribution calculation with acceptable statistical uncertainty in an acceptable computing time. First results are discussed which show that 10 to 100 billion histories must be simulated to reach a standard deviation of a few percent in the estimated power of most of the requested the fuel zones. Even when using a large supercomputer, a considerable speedup is still needed to reach the target of 1 hour computer time. An outlook is given of what to expect from this benchmark exercise over the years. Possible extensions of the benchmark for specific issues relevant in current Monte Carlo calculation for nuclear reactors are also discussed. (author)
Development of a set of benchmark problems to verify numerical methods for solving burnup equations
International Nuclear Information System (INIS)
Lago, Daniel; Rahnema, Farzad
2017-01-01
Highlights: • Description transmutation chain benchmark problems. • Problems for validating numerical methods for solving burnup equations. • Analytical solutions for the burnup equations. • Numerical solutions for the burnup equations. - Abstract: A comprehensive set of transmutation chain benchmark problems for numerically validating methods for solving burnup equations was created. These benchmark problems were designed to challenge both traditional and modern numerical methods used to solve the complex set of ordinary differential equations used for tracking the change in nuclide concentrations over time due to nuclear phenomena. Given the development of most burnup solvers is done for the purpose of coupling with an established transport solution method, these problems provide a useful resource in testing and validating the burnup equation solver before coupling for use in a lattice or core depletion code. All the relevant parameters for each benchmark problem are described. Results are also provided in the form of reference solutions generated by the Mathematica tool, as well as additional numerical results from MATLAB.
Precise determination of lattice phase shifts and mixing angles
Energy Technology Data Exchange (ETDEWEB)
Lu, Bing-Nan, E-mail: b.lu@fz-juelich.de [Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); Lähde, Timo A. [Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); Lee, Dean [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Meißner, Ulf-G. [Helmholtz-Institut für Strahlen- und Kernphysik and Bethe Center for Theoretical Physics, Universität Bonn, D-53115 Bonn (Germany); Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); JARA – High Performance Computing, Forschungszentrum Jülich, D-52425 Jülich (Germany)
2016-09-10
We introduce a general and accurate method for determining lattice phase shifts and mixing angles, which is applicable to arbitrary, non-cubic lattices. Our method combines angular momentum projection, spherical wall boundaries and an adjustable auxiliary potential. This allows us to construct radial lattice wave functions and to determine phase shifts at arbitrary energies. For coupled partial waves, we use a complex-valued auxiliary potential that breaks time-reversal invariance. We benchmark our method using a system of two spin-1/2 particles interacting through a finite-range potential with a strong tensor component. We are able to extract phase shifts and mixing angles for all angular momenta and energies, with precision greater than that of extant methods. We discuss a wide range of applications from nuclear lattice simulations to optical lattice experiments.
Lattice Boltzmann simulation of flow around a confined circular cyclinder
International Nuclear Information System (INIS)
Ashrafizaadeh, M.; Zadehgol, A.
2002-01-01
A two dimensional lattice Boltzmann model (LBM) based on a single time relaxation BGK model has been developed. Several benchmark problems including the Poiseuille flow, the lid driven cavity flow and the flow around a circular cylinder have been performed employing a d2q9 lattice. The laminar flow around a circular cylinder within a channel has been extensively investigated using the present lattice Boltzmann model. Both symmetric and asymmetric placement configurations of the circular cylinder within the channel have been considered. A new treatment for the outlet velocity and pressure (density) boundary conditions has been proposed and validated. The present LBM results are in excellent agreement with those of the other existing CFD results. Careful examination of the LBM results and an appropriate calculation of the lift coefficient based on the rectangular lattice representation of the circular cylinder reveals that the periodic oscillation of the lift coefficient has a second harmonic when the cylinder is placed asymmetrically within the channel. The second harmonic could be associated with an asymmetrical shedding pattern of the vortices behind the cylinder from the upper and lower sides of the cylinder. (author)
D.C. Blitz (David)
2011-01-01
textabstractBenchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.
MCNP analysis of the nine-cell LWR gadolinium benchmark
International Nuclear Information System (INIS)
Arkuszewski, J.J.
1988-01-01
The Monte Carlo results for a 9-cell fragment of the light water reactor square lattice with a central gadolinium-loaded pin are presented. The calculations are performed with the code MCNP-3A and the ENDF-B/5 library and compared with the results obtained from the BOXER code system and the JEF-1 library. The objective of this exercise is to study the feasibility of BOXER for the analysis of a Gd-loaded LWR lattice in the broader framework of GAP International Benchmark Analysis. A comparison of results indicates that, apart from unavoidable discrepancies originating from different data evaluations, the BOXER code overestimates the multiplication factor by 1.4 % and underestimates the power release in a Gd cell by 4.66 %. It is hoped that further similar studies with use of the JEF-1 library for both BOXER and MCNP will help to isolate and explain these discrepancies in a cleaner way. (author) 4 refs., 9 figs., 10 tabs
International Nuclear Information System (INIS)
Chakir, E.; Erradi, L.; Bardouni, T El.; Khoukhi, T El.; Boukhal, H.; Meroun, O.; Bakkari, B El
2007-01-01
Full text: In a previous work, we have analysed the main french experiments available on the reactivity temperature coefficient (RTC) : CREAOLE and Mistral experiments. In these experiments, the RTC has been measured in both UO2 and UO2-PuO2 PWR type lattices. Our calculations, using APPOLO2 code with CEA93 library based on JEF2.2 evaluation, have shown that the calculation error in UO2 lattices is less than 1 pcm/Deg C which is considered as the target accuracy. On the other hand the calculation error in the MOX lattices is more significant in both low and high temperature ranges : an average error of -2 ± 0.5 pcm/Deg C is observed in low temperatures and an error of +3±2 pcm/Deg C is obtained for temperature higher than 250Deg C. In the present work, we analysed additional experimental benchmarks on the RTC of UO2 and MOX light water moderated lattices. To analyze these benchmarks and with the aim of minimizing uncertainties related to modelling of the experimental set up, we chose the Monte Carlo Method which has the advantage of taking into account in the most exact manner the geometry of the experimental configurations. Thus we have used the code MCNP5, for its recognized power and its availability. This analysis shows for the UO2 lattices, an average experiment-calculation deviation of about 0,5 pcm/Deg C, which is largely below the target accuracy for this type of lattices, that we estimate at approximately 1 pcm/Deg C. For the KAMINI experiment, which relates to the measurement of the RTC in light water moderated lattice using U-233 as fuel our analysis shows that the Endf/B6 library gives the best result, with an experiment -calculation deviation of the order of -0,16 pcm/Deg C. The analysis of the benchmarks using MOX fuel made it possible to highlight a discrepancy between experiment and calculation on the RTC of about -0.7pcm/Deg C ( for a range of temperature going from 20 to 248 Deg C) and -1.2 pcm/Deg C ( for a range of temperature going from 20 to
International Nuclear Information System (INIS)
Shekhar Kumar; Koganti, S.B.
2003-07-01
Benchmarking and application of a computer code SIMPSEX for high plutonium FBR flowsheets was reported recently in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high Pu region (Pu Aq >30 g/L) resulted in better results in the 75% Pu flowsheet benchmark. Below 30 g/L Pu Aq concentration, results were identical to those from the earlier version (SIMPSEX Version 3, code compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets have a wide range of feed concentrations and few of them are β-γ active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with listed predictions from conventional SEPHIS, PUMA, PUNE and PUBG. SIMPSEX results were found to be comparable and better than the result from above listed codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of the benchmarking SIMPSEX with these 14 benchmark flowsheets are discussed in this report. (author)
Results of the benchmark for blade structural models, part A
DEFF Research Database (Denmark)
Lekou, D.J.; Chortis, D.; Belen Fariñas, A.
2013-01-01
A benchmark on structural design methods for blades was performed within the InnWind.Eu project under WP2 “Lightweight Rotor” Task 2.2 “Lightweight structural design”. The present document is describes the results of the comparison simulation runs that were performed by the partners involved within...... Task 2.2 of the InnWind.Eu project. The benchmark is based on the reference wind turbine and the reference blade provided by DTU [1]. "Structural Concept developers/modelers" of WP2 were provided with the necessary input for a comparison numerical simulation run, upon definition of the reference blade...
Entropic multirelaxation lattice Boltzmann models for turbulent flows
Bösch, Fabian; Chikatamarla, Shyam S.; Karlin, Ilya V.
2015-10-01
We present three-dimensional realizations of a class of lattice Boltzmann models introduced recently by the authors [I. V. Karlin, F. Bösch, and S. S. Chikatamarla, Phys. Rev. E 90, 031302(R) (2014), 10.1103/PhysRevE.90.031302] and review the role of the entropic stabilizer. Both coarse- and fine-grid simulations are addressed for the Kida vortex flow benchmark. We show that the outstanding numerical stability and performance is independent of a particular choice of the moment representation for high-Reynolds-number flows. We report accurate results for low-order moments for homogeneous isotropic decaying turbulence and second-order grid convergence for most assessed statistical quantities. It is demonstrated that all the three-dimensional lattice Boltzmann realizations considered herein converge to the familiar lattice Bhatnagar-Gross-Krook model when the resolution is increased. Moreover, thanks to the dynamic nature of the entropic stabilizer, the present model features less compressibility effects and maintains correct energy and enstrophy dissipation. The explicit and efficient nature of the present lattice Boltzmann method renders it a promising candidate for both engineering and scientific purposes for highly turbulent flows.
JNC results of BN-600 benchmark calculation (phase 4)
International Nuclear Information System (INIS)
Ishikawa, Makoto
2003-01-01
The present work is the results of JNC, Japan, for the Phase 4 of the BN-600 core benchmark problem (Hex-Z fully MOX fuelled core model) organized by IAEA. The benchmark specification is based on 1) the RCM report of IAEA CRP on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of LMFR Reactivity Effects, Action 3.12' (Calculations for BN-600 fully fuelled MOX core for subsequent transient analyses). JENDL-3.2 nuclear data library was used for calculating 70 group ABBN-type group constants. Cell models for fuel assembly and control rod calculations were applied: homogeneous and heterogeneous (cylindrical supercell) model. Basic diffusion calculation was three-dimensional Hex-Z model, 18 group (Citation code). Transport calculations were 18 group, three-dimensional (NSHEC code) based on Sn-transport nodal method developed at JNC. The generated thermal power per fission was based on Sher's data corrected on the basis of ENDF/B-IV data library. Calculation results are presented in Tables for intercomparison
Actinides transmutation - a comparison of results for PWR benchmark
International Nuclear Information System (INIS)
Claro, Luiz H.
2009-01-01
The physical aspects involved in the Partitioning and Transmutation (P and T) of minor actinides (MA) and fission products (FP) generated by reactors PWR are of great interest in the nuclear industry. Besides these the reduction in the storage of radioactive wastes are related with the acceptability of the nuclear electric power. From the several concepts for partitioning and transmutation suggested in literature, one of them involves PWR reactors to burn the fuel containing plutonium and minor actinides reprocessed of UO 2 used in previous stages. In this work are presented the results of the calculations of a benchmark in P and T carried with WIMSD5B program using its new cross sections library generated from the ENDF-B-VII and the comparison with the results published in literature by other calculations. For comparison, was used the benchmark transmutation concept based in a typical PWR cell and the analyzed results were the k∞ and the atomic density of the isotopes Np-239, Pu-241, Pu-242 and Am-242m, as function of burnup considering discharge of 50 GWd/tHM. (author)
Integral benchmark test of JENDL-4.0 for U-233 systems with ICSBEP handbook
International Nuclear Information System (INIS)
Kuwagaki, Kazuki; Nagaya, Yasunobu
2017-03-01
The integral benchmark test of JENDL-4.0 for U-233 systems using the continuous-energy Monte Carlo code MVP was conducted. The previous benchmark test was performed only for U-233 thermal solution and fast metallic systems in the ICSBEP handbook. In this study, MVP input files were prepared for uninvestigated benchmark problems in the handbook including compound thermal systems (mainly lattice systems) and integral benchmark test was performed. The prediction accuracy of JENDL-4.0 was evaluated for effective multiplication factors (k eff 's) of the U-233 systems. As a result, a trend of underestimation was observed for all the categories of U-233 systems. In the benchmark test of ENDF/B-VII.1 for U-233 systems with the ICSBEP handbook, it is reported that a decreasing trend of calculated k eff values in association with a parameter ATFF (Above-Thermal Fission Fraction) is observed. The ATFF values were also calculated in this benchmark test of JENDL-4.0 and the same trend as ENDF/B-VII.1 was observed. A CD-ROM is attached as an appendix. (J.P.N.)
OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results
International Nuclear Information System (INIS)
DeHart, M.D.
1993-01-01
Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods are within 11% agreement about the average for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are 149 Sm, 151 Sm, and 155 Gd
Boiling water reactor turbine trip (TT) benchmark. Volume II: Summary Results of Exercise 1
International Nuclear Information System (INIS)
Akdeniz, Bedirhan; Ivanov, Kostadin N.; Olson, Andy M.
2005-06-01
The OECD Nuclear Energy Agency (NEA) completed under US Nuclear Regulatory Commission (NRC) sponsorship a PWR main steam line break (MSLB) benchmark against coupled system three-dimensional (3-D) neutron kinetics and thermal-hydraulic codes. Another OECD/NRC coupled-code benchmark was recently completed for a BWR turbine trip (TT) transient and is the object of the present report. Turbine trip transients in a BWR are pressurisation events in which the coupling between core space-dependent neutronic phenomena and system dynamics plays an important role. The data made available from actual experiments carried out at the Peach Bottom 2 plant make the present benchmark particularly valuable. While defining and coordinating the BWR TT benchmark, a systematic approach and level methodology not only allowed for a consistent and comprehensive validation process, but also contributed to the study of key parameters of pressurisation transients. The benchmark consists of three separate exercises, two initial states and five transient scenarios. The BWR TT Benchmark will be published in four volumes as NEA reports. CD-ROMs will also be prepared and will include the four reports and the transient boundary conditions, decay heat values as a function of time, cross-section libraries and supplementary tables and graphs not published in the paper version. BWR TT Benchmark - Volume I: Final Specifications was issued in 2001 [NEA/NSC/DOC(2001)]. The benchmark team [Pennsylvania State University (PSU) in co-operation with Exelon Nuclear and the NEA] has been responsible for coordinating benchmark activities, answering participant questions and assisting them in developing their models, as well as analysing submitted solutions and providing reports summarising the results for each phase. The benchmark team has also been involved in the technical aspects of the benchmark, including sensitivity studies for the different exercises. Volume II summarises the results for Exercise 1 of the
A lattice Boltzmann coupled to finite volumes method for solving phase change problems
Directory of Open Access Journals (Sweden)
El Ganaoui Mohammed
2009-01-01
Full Text Available A numerical scheme coupling lattice Boltzmann and finite volumes approaches has been developed and qualified for test cases of phase change problems. In this work, the coupled partial differential equations of momentum conservation equations are solved with a non uniform lattice Boltzmann method. The energy equation is discretized by using a finite volume method. Simulations show the ability of this developed hybrid method to model the effects of convection, and to predict transfers. Benchmarking is operated both for conductive and convective situation dominating solid/liquid transition. Comparisons are achieved with respect to available analytical solutions and experimental results.
The Medical Library Association Benchmarking Network: results.
Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C; Smith, Bernie Todd
2006-04-01
This article presents some limited results from the Medical Library Association (MLA) Benchmarking Network survey conducted in 2002. Other uses of the data are also presented. After several years of development and testing, a Web-based survey opened for data input in December 2001. Three hundred eighty-five MLA members entered data on the size of their institutions and the activities of their libraries. The data from 344 hospital libraries were edited and selected for reporting in aggregate tables and on an interactive site in the Members-Only area of MLANET. The data represent a 16% to 23% return rate and have a 95% confidence level. Specific questions can be answered using the reports. The data can be used to review internal processes, perform outcomes benchmarking, retest a hypothesis, refute a previous survey findings, or develop library standards. The data can be used to compare to current surveys or look for trends by comparing the data to past surveys. The impact of this project on MLA will reach into areas of research and advocacy. The data will be useful in the everyday working of small health sciences libraries as well as provide concrete data on the current practices of health sciences libraries.
Phase transitions in cooperative coinfections: Simulation results for networks and lattices
Grassberger, Peter; Chen, Li; Ghanbarnejad, Fakhteh; Cai, Weiran
2016-04-01
We study the spreading of two mutually cooperative diseases on different network topologies, and with two microscopic realizations, both of which are stochastic versions of a susceptible-infected-removed type model studied by us recently in mean field approximation. There it had been found that cooperativity can lead to first order transitions from spreading to extinction. However, due to the rapid mixing implied by the mean field assumption, first order transitions required nonzero initial densities of sick individuals. For the stochastic model studied here the results depend strongly on the underlying network. First order transitions are found when there are few short but many long loops: (i) No first order transitions exist on trees and on 2-d lattices with local contacts. (ii) They do exist on Erdős-Rényi (ER) networks, on d -dimensional lattices with d ≥4 , and on 2-d lattices with sufficiently long-ranged contacts. (iii) On 3-d lattices with local contacts the results depend on the microscopic details of the implementation. (iv) While single infected seeds can always lead to infinite epidemics on regular lattices, on ER networks one sometimes needs finite initial densities of infected nodes. (v) In all cases the first order transitions are actually "hybrid"; i.e., they display also power law scaling usually associated with second order transitions. On regular lattices, our model can also be interpreted as the growth of an interface due to cooperative attachment of two species of particles. Critically pinned interfaces in this model seem to be in different universality classes than standard critically pinned interfaces in models with forbidden overhangs. Finally, the detailed results mentioned above hold only when both diseases propagate along the same network of links. If they use different links, results can be rather different in detail, but are similar overall.
International Nuclear Information System (INIS)
Rubin, Adam; Avramova, Maria; Velazquez-Lozada, Alexander
2016-03-01
This report summarised the first phase of the Nuclear Energy Agency (NEA) and the US Nuclear Regulatory Commission Benchmark based on NUPEC PWR Sub-channel and Bundle Tests (PSBT), which was intended to provide data for the verification of void distribution models in participants' codes. This phase was composed of four exercises; Exercise 1: steady-state single sub-channel benchmark, Exercise 2: steady-state rod bundle benchmark, Exercise 3: transient rod bundle benchmark and Exercise 4: a pressure drop benchmark. The experimental data provided to the participants of this benchmark is from a series of void measurement tests using full-size mock-up tests for both Boiling Water Reactors (BWRs) and Pressurised Water Reactors (PWRs). These tests were performed from 1987 to 1995 by the Nuclear Power Engineering Corporation (NUPEC) in Japan and made available by the Japan Nuclear Energy Safety Organisation (JNES) for the purposes of this benchmark, which was organised by Pennsylvania State University. Twenty-one institutions from nine countries participated in this benchmark. Seventeen different computer codes were used in Exercises 1, 2, 3 and 4. Among the computer codes were porous media, sub-channel, systems thermal-hydraulic code and Computational Fluid Dynamics (CFD) codes. It was observed that the codes tended to overpredict the thermal equilibrium quality at lower elevations and under predict it at higher elevations. There was also a tendency to overpredict void fraction at lower elevations and underpredict it at high elevations for the bundle test cases. The overprediction of void fraction at low elevations is likely caused by the x-ray densitometer measurement method used. Under sub-cooled boiling conditions, the voids accumulate at heated surfaces (and are therefore not seen in the centre of the sub-channel, where the measurements are being taken), so the experimentally-determined void fractions will be lower than the actual void fraction. Some of the best
International Nuclear Information System (INIS)
Erradi, L.; Chetaine, A.; Chakir, E.; Kharchaf, A.; Elbardouni, T.; Elkhoukhi, T.
2005-01-01
In a previous work, we have analysed the main French experiments available on the reactivity temperature coefficient (RTC): CREOLE and MISTRAL experiments. In these experiments, the RTC has been measured in both UO 2 and UO 2 -PuO 2 PWR type lattices. Our calculations, using APOLLO2 code with CEA93 library based on JEF2.2 evaluation, have shown that the calculation error in UO 2 lattices is less than 1 pcm/C degrees which is considered as the target accuracy. On the other hand the calculation error in the MOX lattices is more significant in both low and high temperature ranges: an average error of -2 ± 0.5 pcm/C degrees is observed in low temperatures and an error of +3 ± 2 pcm/C degrees is obtained for temperatures higher than 250 C degrees. In the present work, we analysed additional experimental benchmarks on the RTC of UO 2 and MOX light water moderated lattices. To analyze these benchmarks and with the aim of minimizing uncertainties related to modelling of the experimental set up, we chose the Monte Carlo method which has the advantage of taking into account in the most exact manner the geometry of the experimental configurations. This analysis shows for the UO 2 lattices, a maximum experiment-calculation deviation of about 0,7 pcm/C degrees, which is below the target accuracy for this type of lattices. For the KAMINI experiment, which relates to the measurement of the RTC in a light water moderated lattice using U-233 as fuel our analysis shows that the ENDF/B6 library gives the best result, with an experiment-calculation deviation of the order of -0,16 pcm/C degrees. The analysis of the benchmarks using MOX fuel made it possible to highlight a discrepancy between experiment and calculation on the RTC of about -0.7 pcm/C degrees (for a range of temperatures going from 20 to 248 C degrees) and -1,2 pcm/C degrees (for a range of temperatures going from 20 to 80 C degrees). This result, in particular the tendency which has the error to decrease when the
OECD/NEA Burnup Credit Calculational Criticality Benchmark Phase I-B Results
Energy Technology Data Exchange (ETDEWEB)
DeHart, M.D.
1993-01-01
Burnup credit is an ongoing technical concern for many countries that operate commercial nuclear power reactors. In a multinational cooperative effort to resolve burnup credit issues, a Burnup Credit Working Group has been formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development. This working group has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide, and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods are in agreement to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods are within 11% agreement about the average for all fission products studied. Furthermore, most deviations are less than 10%, and many are less than 5%. The exceptions are {sup 149}Sm, {sup 151}Sm, and {sup 155}Gd.
Comparison of RSYST and WIMSD-4 performance for gadolinium poisoned lattices
Energy Technology Data Exchange (ETDEWEB)
Kulikowska, T; Szczesna, B; Sadowska, B
1992-06-01
The participation in the Co-ordinated Research Programme on `Safe Core Management with Burnable Absorbers in VVERs` has created a possibility of validation of our basic calculational tools for advanced lattice calculations. A systematic analysis of the performance of WIMSD-4 and the recently adapted RSYST modular systems has been carried out on the basis of two benchmarks with gadolinium bearing pins. The report consists of a detailed comparison of methods and models available in RSYST and WIMSD-4 followed by calculational results and their discussion. Finally, the conclusions are drawn concerning the applicability of the two codes for clean fuel and gadolinium poisoned reactor lattices. (author). 26 refs, 19 figs, 19 tabs.
Resonance shielding in thermal reactor lattices
International Nuclear Information System (INIS)
Rothenstein, W.; Taviv, E.; Aminpour, M.
1982-01-01
The theoretical foundations of a new methodology for the accurate treatment of resonance absorption in thermal reactor lattice analysis are presented. This methodology is based on the solution of the point-energy transport equation in its integral or integro-differential form for a heterogeneous lattice using detailed resonance cross-section profiles. The methodology is applied to LWR benchmark analysis, with emphasis on temperature dependence of resonance absorption during fuel depletion, spatial and mutual self-shielding, integral parameter analysis and treatment of cluster geometry. The capabilities of the OZMA code, which implements the new methodology are discussed. These capabilities provide a means against which simpler and more rapid resonance absorption algorithms can be checked. (author)
Simplified two and three dimensional HTTR benchmark problems
International Nuclear Information System (INIS)
Zhang Zhan; Rahnema, Farzad; Zhang Dingkang; Pounders, Justin M.; Ougouag, Abderrafi M.
2011-01-01
To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of whole core configurations. In this paper we have created two and three dimensional numerical benchmark problems typical of high temperature gas cooled prismatic cores. Additionally, a single cell and single block benchmark problems are also included. These problems were derived from the HTTR start-up experiment. Since the primary utility of the benchmark problems is in code-to-code verification, minor details regarding geometry and material specification of the original experiment have been simplified while retaining the heterogeneity and the major physics properties of the core from a neutronics viewpoint. A six-group material (macroscopic) cross section library has been generated for the benchmark problems using the lattice depletion code HELIOS. Using this library, Monte Carlo solutions are presented for three configurations (all-rods-in, partially-controlled and all-rods-out) for both the 2D and 3D problems. These solutions include the core eigenvalues, the block (assembly) averaged fission densities, local peaking factors, the absorption densities in the burnable poison and control rods, and pin fission density distribution for selected blocks. Also included are the solutions for the single cell and single block problems.
featsel: A framework for benchmarking of feature selection algorithms and cost functions
Marcelo S. Reis; Gustavo Estrela; Carlos Eduardo Ferreira; Junior Barrera
2017-01-01
In this paper, we introduce featsel, a framework for benchmarking of feature selection algorithms and cost functions. This framework allows the user to deal with the search space as a Boolean lattice and has its core coded in C++ for computational efficiency purposes. Moreover, featsel includes Perl scripts to add new algorithms and/or cost functions, generate random instances, plot graphs and organize results into tables. Besides, this framework already comes with dozens of algorithms and co...
Review of lattice results concerning low-energy particle physics
International Nuclear Information System (INIS)
Aoki, S.; Aoki, Y.; Brookhaven National Laboratory, Upton, NY; Becirevic, D.
2016-07-01
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle physics community. More specifically, we report on the determination of the light-quark masses, the form factor f_+(0), arising in the semileptonic K→π transition at zero momentum transfer, as well as the decay constant ratio f_K/f_π and its consequences for the CKM matrix elements V_u_s and V_u_d. Furthermore, we describe the results obtained on the lattice for some of the low-energy constants of SU(2)_L x SU(2)_R and SU(3)_L x SU(3)_R Chiral Perturbation Theory. We review the determination of the B_K parameter of neutral kaon mixing as well as the additional four B parameters that arise in theories of physics beyond the Standard Model. The latter quantities are an addition compared to the previous review. For the heavy-quark sector, we provide results for m_c and m_b (also new compared to the previous review), as well as those for D- and B-meson decay constants, form factors, and mixing parameters. These are the heavy-quark quantities most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. Finally, we review the status of lattice determinations of the strong coupling constant α_s.
Results of the event sequence reliability benchmark exercise
International Nuclear Information System (INIS)
Silvestri, E.
1990-01-01
The Event Sequence Reliability Benchmark Exercise is the fourth of a series of benchmark exercises on reliability and risk assessment, with specific reference to nuclear power plant applications, and is the logical continuation of the previous benchmark exercises on System Analysis Common Cause Failure and Human Factors. The reference plant is the Nuclear Power Plant at Grohnde Federal Republic of Germany a 1300 MW PWR plant of KWU design. The specific objective of the Exercise is to model, to quantify and to analyze such event sequences initiated by the occurrence of a loss of offsite power that involve the steam generator feed. The general aim is to develop a segment of a risk assessment, which ought to include all the specific aspects and models of quantification, such as common canal failure, Human Factors and System Analysis, developed in the previous reliability benchmark exercises, with the addition of the specific topics of dependences between homologous components belonging to different systems featuring in a given event sequence and of uncertainty quantification, to end up with an overall assessment of: - the state of the art in risk assessment and the relative influences of quantification problems in a general risk assessment framework. The Exercise has been carried out in two phases, both requiring modelling and quantification, with the second phase adopting more restrictive rules and fixing certain common data, as emerged necessary from the first phase. Fourteen teams have participated in the Exercise mostly from EEC countries, with one from Sweden and one from the USA. (author)
Review of lattice results concerning low-energy particle physics
Energy Technology Data Exchange (ETDEWEB)
Aoki, S. [Kyoto University, Yukawa Institute for Theoretical Physics, Kyoto (Japan); Aoki, Y. [Nagoya University, Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI), Nagoya (Japan); Brookhaven National Laboratory, RIKEN BNL Research Center, Upton, NY (United States); Bernard, C. [Washington University, Department of Physics, Saint Louis, MO (United States); Blum, T. [Brookhaven National Laboratory, RIKEN BNL Research Center, Upton, NY (United States); University of Connecticut, Physics Department, Storrs, CT (United States); Colangelo, G.; Leutwyler, H.; Necco, S.; Wenger, U. [Institut fuer theoretische Physik, Universitaet Bern, Albert Einstein Center for Fundamental Physics, Bern (Switzerland); Della Morte, M. [University of Southern Denmark, CP3-Origins and Danish IAS, Odense M (Denmark); IFIC (CSIC), Paterna (Spain); Duerr, S. [Bergische Universitaet Wuppertal, Wuppertal (Germany); Juelich Supercomputing Center, Juelich (Germany); El-Khadra, A.X. [University of Illinois, Department of Physics, Urbana, IL (United States); Fukaya, H.; Onogi, T. [Osaka University, Department of Physics, Osaka (Japan); Horsley, R. [University of Edinburgh, School of Physics, Edinburgh (United Kingdom); Juettner, A.; Sachrajda, C.T. [University of Southampton, School of Physics and Astronomy, Southampton (United Kingdom); Kaneko, T. [High Energy Accelerator Research Organization (KEK), Ibaraki (Japan); Laiho, J. [University of Glasgow, SUPA, Department of Physics and Astronomy, Glasgow (United Kingdom); Syracuse University, Department of Physics, Syracuse, New York (United States); Lellouch, L. [Aix-Marseille Universite, CNRS, CPT, UMR 7332, Marseille (France); Universite de Toulon, CNRS, CPT, UMR 7332, La Garde (France); Lubicz, V. [Universita Roma Tre, Dipartimento di Matematica e Fisica, Rome (Italy); Sezione di Roma Tre, INFN, Rome (Italy); Lunghi, E. [Indiana University, Physics Department, Bloomington, IN (United States); Pena, C. [Universidad Autonoma de Madrid, Instituto de Fisica Teorica UAM/CSIC and Departamento de Fisica Teorica, Madrid (Spain); Sharpe, S.R. [University of Washington, Physics Department, Seattle, WA (United States); Simula, S. [Sezione di Roma Tre, INFN, Rome (Italy); Sommer, R. [NIC rate at DESY, Zeuthen (Germany); Water, R.S.V. de [Fermi National Accelerator Laboratory, Batavia, IL (United States); Vladikas, A. [Universita di Roma Tor Vergata, INFN, Sezione di Tor Vergata, c/o Dipartimento di Fisica, Rome (Italy); Wittig, H. [University of Mainz, PRISMA Cluster of Excellence, Institut fuer Kernphysik and Helmholtz Institute Mainz, Mainz (Germany); Collaboration: FLAG Working Group
2014-09-15
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle-physics community. More specifically, we report on the determination of the lightquark masses, the form factor f{sub +}(0), arising in semileptonic K → π transition at zero momentum transfer, as well as the decay-constant ratio f{sub K}/f{sub π} of decay constants and its consequences for the CKM matrix elements V{sub us} and V{sub ud}. Furthermore, we describe the results obtained on the lattice for some of the low-energy constants of SU(2){sub L} x SU(2){sub R} andSU(3)L{sub L} x SU(3){sub R} Chiral Perturbation Theory and review the determination of the BK parameter of neutral kaon mixing. The inclusion of heavy-quark quantities significantly expands the FLAG scope with respect to the previous review. Therefore, we focus here on D- and B-meson decay constants, form factors, and mixing parameters, since these are most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. In addition we review the status of lattice determinations of the strong coupling constant α{sub s}. (orig.)
Lattice Boltzmann model for three-phase viscoelastic fluid flow
Xie, Chiyu; Lei, Wenhai; Wang, Moran
2018-02-01
A lattice Boltzmann (LB) framework is developed for simulation of three-phase viscoelastic fluid flows in complex geometries. This model is based on a Rothman-Keller type model for immiscible multiphase flows which ensures mass conservation of each component in porous media even for a high density ratio. To account for the viscoelastic effects, the Maxwell constitutive relation is correctly introduced into the momentum equation, which leads to a modified lattice Boltzmann evolution equation for Maxwell fluids by removing the normal but excess viscous term. Our simulation tests indicate that this excess viscous term may induce significant errors. After three benchmark cases, the displacement processes of oil by dispersed polymer are studied as a typical example of three-phase viscoelastic fluid flow. The results show that increasing either the polymer intrinsic viscosity or the elastic modulus will enhance the oil recovery.
Classical Logic and Quantum Logic with Multiple and Common Lattice Models
Directory of Open Access Journals (Sweden)
Mladen Pavičić
2016-01-01
Full Text Available We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (algebra underlying Hilbert (quantum space. We give an equivalent proof for the classical logic which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, that enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit computer and a nondigital (say, a six-subset computer (with appropriate chips and circuits. With quantum logic, the same six-element common lattice can serve us as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.
Benchmark Calculations on Halden IFA-650 LOCA Test Results
International Nuclear Information System (INIS)
Ek, Mirkka; Kekkonen, Laura; Kelppe, Seppo; Stengaard, J.O.; Josek, Radomir; Wiesenack, Wolfgang; Aounallah, Yacine; Wallin, Hannu; Grandjean, Claude; Herb, Joachim; Lerchl, Georg; Trambauer, Klaus; Sonnenburg, Heinz-Guenther; Nakajima, Tetsuo; Spykman, Gerold; Struzik, Christine
2010-01-01
through several blow-downs and heat-ups and reached peak clad temperatures of more than 1000 C. In the second run, where the rod was sufficiently pre-pressurised, ballooning and burst was obtained. The first benchmark consisted of three rounds of code calculations related to IFA-650.3: 1. Pre-test calculations: Participants were provided with information regarding the setup of the Halden LOCA test facility, data from the commissioning runs, and information about the test pin and power conditions to be applied in the execution of the test. 2. Post-test calculations I: In addition to the information from the first round, participants were provided with the in-pile results from the test. 3. Post-test calculations II, unified thermal-hydraulic boundary conditions: Calculations were repeated using a cladding temperature distribution calculated with ATHLET-CD at GRS. Since the test, when executed, did not produce the expected ballooning and fuel relocation, it was decided to continue with a second benchmark using tests 650.4 and 650.5, this time as post-test calculations. The fourth test of the series, IFA-650.4 conducted in April 2006, caused particular attention in the international nuclear community. The fuel used in the experiment had a high burnup, 92 MWd/kgU, and a low pre-test hydrogen content of about 50 ppm. The cladding burst at about 790 deg. C caused a marked temperature increase at the lower end of the segment and a decrease at the upper end, indicating that fuel relocation had occurred. Subsequent gamma scanning showed that approximately 19 cm (40%) of the fuel stack were missing from the upper part of the rod. PIE at the IFE-Kjeller hot cells corroborated this evidence of substantial fuel relocation. This report presents the results of the codes which participated in the various benchmarks. The two main parts, on benchmark I and II, each start with a brief description of the most important experimental data. Then, the code calculation results follow
Wilson Dslash Kernel From Lattice QCD Optimization
Energy Technology Data Exchange (ETDEWEB)
Joo, Balint [Jefferson Lab, Newport News, VA; Smelyanskiy, Mikhail [Parallel Computing Lab, Intel Corporation, California, USA; Kalamkar, Dhiraj D. [Parallel Computing Lab, Intel Corporation, India; Vaidyanathan, Karthikeyan [Parallel Computing Lab, Intel Corporation, India
2015-07-01
Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in Theoretical Nuclear and High Energy Physics. LQCD is traditionally one of the first applications ported to many new high performance computing architectures and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal to illustrate several optimization techniques. In this chapter we will detail our work in optimizing the Wilson-Dslash kernels for Intel Xeon Phi, however, as we will show the technique gives excellent performance on regular Xeon Architecture as well.
Review of lattice results concerning low-energy particle physics
Energy Technology Data Exchange (ETDEWEB)
Aoki, S. [Kyoto University, Center for Gravitational Physics, Yukawa Institute for Theoretical Physics, Kyoto (Japan); Aoki, Y. [Nagoya University, Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI), Nagoya (Japan); Brookhaven National Laboratory, RIKEN BNL Research Center, Upton, NY (United States); High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Becirevic, D. [Universite Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR8627), CNRS, Orsay (France); Bernard, C. [Washington University, Department of Physics, Saint Louis, MO (United States); Blum, T. [Brookhaven National Laboratory, RIKEN BNL Research Center, Upton, NY (United States); University of Connecticut, Physics Department, Storrs, CT (United States); Colangelo, G.; Leutwyler, H.; Wenger, U. [Universitaet Bern, Albert Einstein Center for Fundamental Physics, Institut fuer Theoretische Physik, Bern (Switzerland); Della Morte, M. [University of Southern Denmark, CP3-Origins and Danish IAS, Odense M (Denmark); IFIC (CSIC), Paterna (Spain); Dimopoulos, P. [Centro Fermi-Museo Storico della Fisica e Centro Studi e Ricerche Enrico Fermi Compendio del Viminale, Rome (Italy); Universita di Roma Tor Vergata, c/o Dipartimento di Fisica, Rome (Italy); Duerr, S. [University of Wuppertal, Wuppertal (Germany); Juelich Supercomputing Center, Forschungszentrum Juelich, Juelich (Germany); Fukaya, H.; Onogi, T. [Osaka University, Department of Physics, Toyonaka, Osaka (Japan); Golterman, M. [San Francisco State University, Department of Physics and Astronomy, San Francisco, CA (United States); Gottlieb, Steven; Lunghi, E. [Indiana University, Department of Physics, Bloomington, IN (United States); Hashimoto, S.; Kaneko, T. [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); The Graduate University for Advanced Studies (Sokendai), School of High Energy Accelerator Science, Tsukuba (Japan); Heller, U.M. [American Physical Society (APS), Ridge, NY (United States); Horsley, R. [University of Edinburgh, Higgs Centre for Theoretical Physics, School of Physics and Astronomy, Edinburgh (United Kingdom); Juettner, A.; Sachrajda, C.T. [University of Southampton, School of Physics and Astronomy, Southampton (United Kingdom); Lellouch, L. [CNRS, Aix-Marseille Universite, Universite de Toulon, Centre de Physique Theorique, UMR 7332, Marseille (France); Lin, C.J.D. [CNRS, Aix-Marseille Universite, Universite de Toulon, Centre de Physique Theorique, UMR 7332, Marseille (France); National Chiao-Tung University, Institute of Physics, Hsinchu (China); Lubicz, V. [Universita Roma Tre, Dipartimento di Matematica e Fisica, Rome (Italy); INFN, Sezione di Roma Tre, Rome (Italy); Mawhinney, R. [Columbia University, Physics Department, New York, NY (United States); Pena, C. [Universidad Autonoma de Madrid, Departamento de Fisica Teorica, Instituto de Fisica Teorica UAM/CSIC, Madrid (Spain); Sharpe, S.R. [University of Washington, Physics Department, Seattle, WA (United States); Simula, S. [INFN, Sezione di Roma Tre, Rome (Italy); Sommer, R. [DESY, John von Neumann Institute for Computing (NIC), Zeuthen (Germany); Vladikas, A. [Universita di Roma ' ' Tor Vergata' ' , Dipartimento di Fisica, Rome (Italy); INFN, Rome (Italy); Wittig, H. [University of Mainz, PRISMA Cluster of Excellence, Institut fuer Kernphysik and Helmholtz Institute Mainz, Mainz (Germany); Collaboration: Flavour Lattice Averaging Group (FLAG)
2017-02-15
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle-physics community. More specifically, we report on the determination of the light-quark masses, the form factor f{sub +}(0), arising in the semileptonic K → π transition at zero momentum transfer, as well as the decay constant ratio f{sub K}/f{sub π} and its consequences for the CKM matrix elements V{sub us} and V{sub ud}. Furthermore, we describe the results obtained on the lattice for some of the low-energy constants of SU(2){sub L} x SU(2){sub R} and SU(3){sub L} x SU(3){sub R} Chiral Perturbation Theory. We review the determination of the B{sub K} parameter of neutral kaon mixing as well as the additional four B parameters that arise in theories of physics beyond the Standard Model. The latter quantities are an addition compared to the previous review. For the heavy-quark sector, we provide results for m{sub c} and m{sub b} (also new compared to the previous review), as well as those for D- and B-meson-decay constants, form factors, and mixing parameters. These are the heavy-quark quantities most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. Finally, we review the status of lattice determinations of the strong coupling constant α{sub s}. (orig.)
Review of lattice results concerning low-energy particle physics
Energy Technology Data Exchange (ETDEWEB)
Aoki, S. [Kyoto Univ. (Japan). Yukawa Inst. for Theoretical Physics; Aoki, Y. [Nagoya Univ. (Japan). Kobayashi-Maskawa Inst. for the Origin of Particles and the Universe; Brookhaven National Laboratory, Upton, NY (United States). RIKEN BNL Research Center; Becirevic, D. [Univ. Paris-Saclay, Orsay (France). CNRS; Collaboration: FLAG Working Group; and others
2016-07-15
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle physics community. More specifically, we report on the determination of the light-quark masses, the form factor f{sub +}(0), arising in the semileptonic K→π transition at zero momentum transfer, as well as the decay constant ratio f{sub K}/f{sub π} and its consequences for the CKM matrix elements V{sub us} and V{sub ud}. Furthermore, we describe the results obtained on the lattice for some of the low-energy constants of SU(2){sub L} x SU(2){sub R} and SU(3){sub L} x SU(3){sub R} Chiral Perturbation Theory. We review the determination of the B{sub K} parameter of neutral kaon mixing as well as the additional four B parameters that arise in theories of physics beyond the Standard Model. The latter quantities are an addition compared to the previous review. For the heavy-quark sector, we provide results for m{sub c} and m{sub b} (also new compared to the previous review), as well as those for D- and B-meson decay constants, form factors, and mixing parameters. These are the heavy-quark quantities most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. Finally, we review the status of lattice determinations of the strong coupling constant α{sub s}.
Meylianti S., Brigita
1999-01-01
Benchmarking has different meaning to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry / functional benchmarking, process / generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable to a specific application. This paper will discuss those five types of benchmarking in detail, includ...
Depletion benchmarks calculation of random media using explicit modeling approach of RMC
International Nuclear Information System (INIS)
Liu, Shichang; She, Ding; Liang, Jin-gang; Wang, Kan
2016-01-01
Highlights: • Explicit modeling of RMC is applied to depletion benchmark for HTGR fuel element. • Explicit modeling can provide detailed burnup distribution and burnup heterogeneity. • The results would serve as a supplement for the HTGR fuel depletion benchmark. • The method of adjacent burnup regions combination is proposed for full-core problems. • The combination method can reduce memory footprint, keeping the computing accuracy. - Abstract: Monte Carlo method plays an important role in accurate simulation of random media, owing to its advantages of the flexible geometry modeling and the use of continuous-energy nuclear cross sections. Three stochastic geometry modeling methods including Random Lattice Method, Chord Length Sampling and explicit modeling approach with mesh acceleration technique, have been implemented in RMC to simulate the particle transport in the dispersed fuels, in which the explicit modeling method is regarded as the best choice. In this paper, the explicit modeling method is applied to the depletion benchmark for HTGR fuel element, and the method of combination of adjacent burnup regions has been proposed and investigated. The results show that the explicit modeling can provide detailed burnup distribution of individual TRISO particles, and this work would serve as a supplement for the HTGR fuel depletion benchmark calculations. The combination of adjacent burnup regions can effectively reduce the memory footprint while keeping the computational accuracy.
Validation of WIMS-CANDU using Pin-Cell Lattices
International Nuclear Information System (INIS)
Kim, Won Young; Min, Byung Joo; Park, Joo Hwan
2006-01-01
The WIMS-CANDU is a lattice code which has a depletion capability for the analysis of reactor physics problems related to a design and safety. The WIMS-CANDU code has been developed from the WIMSD5B, a version of the WIMS code released from the OECD/NEA data bank in 1998. The lattice code POWDERPUFS-V (PPV) has been used for the physics design and analysis of a natural uranium fuel for the CANDU reactor. However since the application of PPV is limited to a fresh fuel due to its empirical correlations, the WIMS-AECL code has been developed by AECL to substitute the PPV. Also, the WIMS-CANDU code is being developed to perform the physics analysis of the present operating CANDU reactors as a replacement of PPV. As one of the developing work of WIMS-CANDU, the U 238 absorption cross-section in the nuclear data library of WIMS-CANDU was updated and WIMS-CANDU was validated using the benchmark problems for pin-cell lattices such as TRX-1, TRX-2, Bapl-1, Bapl-2 and Bapl-3. The results by the WIMS-CANDU and the WIMS-AECL were compared with the experimental data
OECD/NEA burnup credit calculational criticality benchmark Phase I-B results
Energy Technology Data Exchange (ETDEWEB)
DeHart, M.D.; Parks, C.V. [Oak Ridge National Lab., TN (United States); Brady, M.C. [Sandia National Labs., Las Vegas, NV (United States)
1996-06-01
In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree within 11% about the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are Sm 149, Sm 151, and Gd 155.
OECD/NEA burnup credit calculational criticality benchmark Phase I-B results
International Nuclear Information System (INIS)
DeHart, M.D.; Parks, C.V.; Brady, M.C.
1996-06-01
In most countries, criticality analysis of LWR fuel stored in racks and casks has assumed that the fuel is fresh with the maximum allowable initial enrichment. This assumption has led to the design of widely spaced and/or highly poisoned storage and transport arrays. If credit is assumed for fuel burnup, initial enrichment limitations can be raised in existing systems, and more compact and economical arrays can be designed. Such reliance on the reduced reactivity of spent fuel for criticality control is referred to as burnup credit. The Burnup Credit Working Group, formed under the auspices of the Nuclear Energy Agency of the Organization for Economic Cooperation and Development, has established a set of well-defined calculational benchmarks designed to study significant aspects of burnup credit computational methods. These benchmarks are intended to provide a means for the intercomparison of computer codes, methods, and data applied in spent fuel analysis. The benchmarks have been divided into multiple phases, each phase focusing on a particular feature of burnup credit analysis. This report summarizes the results and findings of the Phase I-B benchmark, which was proposed to provide a comparison of the ability of different code systems and data libraries to perform depletion analysis for the prediction of spent fuel isotopic concentrations. Results included here represent 21 different sets of calculations submitted by 16 different organizations worldwide and are based on a limited set of nuclides determined to have the most important effect on the neutron multiplication factor of light-water-reactor spent fuel. A comparison of all sets of results demonstrates that most methods agree to within 10% in the ability to estimate the spent fuel concentrations of most actinides. All methods agree within 11% about the average for all fission products studied. Most deviations are less than 10%, and many are less than 5%. The exceptions are Sm 149, Sm 151, and Gd 155
Fast burner reactor benchmark results from the NEA working party on physics of plutonium recycle
International Nuclear Information System (INIS)
Hill, R.N.; Wade, D.C.; Palmiotti, G.
1995-01-01
As part of a program proposed by the OECD/NEA Working Party on Physics of Plutonium Recycling (WPPR) to evaluate different scenarios for the use of plutonium, fast reactor physics benchmarks were developed; fuel cycle scenarios using either PUREX/TRUEX (oxide fuel) or pyrometallurgical (metal fuel) separation technologies were specified. These benchmarks were designed to evaluate the nuclear performance and radiotoxicity impact of a transuranic-burning fast reactor system. International benchmark results are summarized in this paper; and key conclusions are highlighted
The reactive transport benchmark proposed by GdR MoMaS: presentation and first results
Energy Technology Data Exchange (ETDEWEB)
Carrayrou, J. [Institut de Mecanique des Fluides et des Solides, UMR ULP-CNRS 7507, 67 - Strasbourg (France); Lagneau, V. [Ecole des Mines de Paris, Centre de Geosciences, 77 - Fontainebleau (France)
2007-07-01
We present here the actual context of reactive transport modelling and the major numerical challenges. GdR MoMaS proposes a benchmark on reactive transport. We present this benchmark and some results obtained on it by two reactive transport codes HYTEC and SPECY. (authors)
The reactive transport benchmark proposed by GdR MoMaS: presentation and first results
International Nuclear Information System (INIS)
Carrayrou, J.; Lagneau, V.
2007-01-01
We present here the actual context of reactive transport modelling and the major numerical challenges. GdR MoMaS proposes a benchmark on reactive transport. We present this benchmark and some results obtained on it by two reactive transport codes HYTEC and SPECY. (authors)
Systems reliability Benchmark exercise part 1-Description and results
International Nuclear Information System (INIS)
Amendola, A.
1986-01-01
The report describes aims, rules and results of the Systems Reliability Benchmark Exercise, which has been performed in order to assess methods and procedures for reliability analysis of complex systems and involved a large number of European organizations active in NPP safety evaluation. The exercise included both qualitative and quantitative methods and was structured in such a way that separation of the effects of uncertainties in modelling and in data on the overall spread was made possible. Part I describes the way in which RBE has been performed, its main results and conclusions
JNC results of BN-600 benchmark calculation (phase 3)
International Nuclear Information System (INIS)
Ishikawa, M.
2002-01-01
The present work is the result of phase 3 BN-600 core benchmark problem, meaning burnup and heterogeneity. Analytical method applied consisted of: JENDL-3.2 nuclear data library, group constants (70 group, ABBN type self shielding transport factors), heterogeneous cell model for fuel and control rod, basic diffusion calculation (CITATION code), transport theory and mesh size correction (NSHEX code based on SN transport nodal method developed by JNC). Burnup and heterogeneity calculation results are presented obtained by applying both diffusion and transport approach for beginning and end of cycle
Simulating colloid hydrodynamics with lattice Boltzmann methods
International Nuclear Information System (INIS)
Cates, M E; Stratford, K; Adhikari, R; Stansell, P; Desplat, J-C; Pagonabarraga, I; Wagner, A J
2004-01-01
We present a progress report on our work on lattice Boltzmann methods for colloidal suspensions. We focus on the treatment of colloidal particles in binary solvents and on the inclusion of thermal noise. For a benchmark problem of colloids sedimenting and becoming trapped by capillary forces at a horizontal interface between two fluids, we discuss the criteria for parameter selection, and address the inevitable compromise between computational resources and simulation accuracy
International Nuclear Information System (INIS)
Leszczynski, Francisco
2002-01-01
The IAEA-WIMS Library Update Project (WLUP) is on the end stage. The final library will be released on 2002. It is a result of research and development made by more than ten investigators during 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. It is presented the organization, name conventions, contents and documentation of WLUP benchmarks, and an updated list of the main parameters for all cases. First, the benchmarks objectives and types are given. Then, comparisons of results from different WIMSD libraries are included. Finally it is described the program QVALUE for analysis and plot of results. Some examples are given. The set of benchmarks implemented on this work is a fundamental tool for testing new multigroup libraries. (author)
Non-grey benchmark results for two temperature non-equilibrium radiative transfer
International Nuclear Information System (INIS)
Su, B.; Olson, G.L.
1999-01-01
Benchmark solutions to time-dependent radiative transfer problems involving non-equilibrium coupling to the material temperature field are crucial for validating time-dependent radiation transport codes. Previous efforts on generating analytical solutions to non-equilibrium radiative transfer problems were all restricted to the one-group grey model. In this paper, a non-grey model, namely the picket-fence model, is considered for a two temperature non-equilibrium radiative transfer problem in an infinite medium. The analytical solutions, as functions of space and time, are constructed in the form of infinite integrals for both the diffusion description and transport description. These expressions are evaluated numerically and the benchmark results are generated. The asymptotic solutions for large and small times are also derived in terms of elementary functions and are compared with the exact results. Comparisons are given between the transport and diffusion solutions and between the grey and non-grey solutions. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
A consumer`s guide to lattice QCD results
Energy Technology Data Exchange (ETDEWEB)
DeGrand, T. [Univ. of Colorado, Boulder, CO (United States)
1994-12-01
The author presents an overview of recent lattice QCD results on hadron spectroscopy and matrix elements. Case studies include light quark spectroscopy, the determination of {alpha}{sub s} from heavy quark spectroscopy, the D-meson decay constant, a calculation of the Isgur-Wise function, and some examples of the (lack of) effect of sea quarks on matrix elements. The review is intended for the nonexpert.
Montecarlo calculation for a benchmark on interactive effects of Gadolinium poisoned pins in BWRs
International Nuclear Information System (INIS)
Borgia, M.G.; Casali, F.; Cepraga, D.
1985-01-01
K infinite and burn-up calculations have been done in the frame of a benchmark organized by Physic Reactor Committee of NEA. The calculations, performed by the Montecarlo code KIM, concerned BWR lattices having UO*L2 fuel rodlets with and without gadolinium oxide
3-D extension C5G7 MOX benchmark results using PARTISN
Energy Technology Data Exchange (ETDEWEB)
Dahl, J.A. [Los Alamos National Laboratory, CCS-4 Transport Methods Group, Los Alamos, NM (United States)
2005-07-01
We have participated in the Expert Group of 3-D Radiation Transport Benchmarks' proposed 3-dimensional Extension C5G7 MOX problems using the discrete ordinate transport code PARTISN. The computational mesh was created using the FRAC-IN-THE-BOX code, which produces a volume fraction Cartesian mesh from combinatorial geometry descriptions. k{sub eff} eigenvalues, maximum pin powers, and average fuel assembly powers are reported and compared to a benchmark quality Monte Carlo solution. We also present a two dimensional mesh convergence study examining the affects of using volume fractions to approximate the water-pin cell interface. It appears that the control rod pin cell must be meshed twice as fine as a fuel pin cell in order to achieve the same spatial error when using the volume fraction method to define water channel-pin cell interfaces. It is noted that the previous PARTISN results provided to the OECD/NEA Expert Group on 3-dimensional Radiation Benchmarks contained a cross section error, and therefore should be disregarded.
3-D extension C5G7 MOX benchmark results using PARTISN
International Nuclear Information System (INIS)
Dahl, J.A.
2005-01-01
We have participated in the Expert Group of 3-D Radiation Transport Benchmarks' proposed 3-dimensional Extension C5G7 MOX problems using the discrete ordinate transport code PARTISN. The computational mesh was created using the FRAC-IN-THE-BOX code, which produces a volume fraction Cartesian mesh from combinatorial geometry descriptions. k eff eigenvalues, maximum pin powers, and average fuel assembly powers are reported and compared to a benchmark quality Monte Carlo solution. We also present a two dimensional mesh convergence study examining the affects of using volume fractions to approximate the water-pin cell interface. It appears that the control rod pin cell must be meshed twice as fine as a fuel pin cell in order to achieve the same spatial error when using the volume fraction method to define water channel-pin cell interfaces. It is noted that the previous PARTISN results provided to the OECD/NEA Expert Group on 3-dimensional Radiation Benchmarks contained a cross section error, and therefore should be disregarded
Theory and application of deterministic multidimensional pointwise energy lattice physics methods
International Nuclear Information System (INIS)
Zerkle, M.L.
1999-01-01
The theory and application of deterministic, multidimensional, pointwise energy lattice physics methods are discussed. These methods may be used to solve the neutron transport equation in multidimensional geometries using near-continuous energy detail to calculate equivalent few-group diffusion theory constants that rigorously account for spatial and spectral self-shielding effects. A dual energy resolution slowing down algorithm is described which reduces the computer memory and disk storage requirements for the slowing down calculation. Results are presented for a 2D BWR pin cell depletion benchmark problem
Directory of Open Access Journals (Sweden)
Aiman El-Saed
2013-10-01
Full Text Available Summary: Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI, which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including the advantages and limitations are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments including regulations should be taken into consideration when considering such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons. Keywords: Benchmarking, Comparison, Surveillance, Healthcare-associated infections
Validation of VHTRC calculation benchmark of critical experiment using the MCB code
Directory of Open Access Journals (Sweden)
Stanisz Przemysław
2016-01-01
Full Text Available The calculation benchmark problem Very High Temperature Reactor Critical (VHTR a pin-in-block type core critical assembly has been investigated with the Monte Carlo Burnup (MCB code in order to validate the latest version of Nuclear Data Library based on ENDF format. Executed benchmark has been made on the basis of VHTR benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values. This allows to improve accuracy of the neutron transport calculations that may help in designing the high performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results that, in turn depend on the accuracy of nuclear data libraries. Thus, evaluation of the libraries applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of available nuclear data (ENDF-B-VII.1 and JEFF-3.2 prepared for required temperatures. Calculations have been performed with the MCB code which allows to obtain very precise representation of complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with impact of nuclear data, we discuss also the impact of different lattice modelling inside the fuel pins. The discrepancies of keff have been successfully observed and show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies observed, we proposed appropriate corrections in experimental constants which can improve the reactivity coefficient dependency. Obtained results confirm the accuracy of the new Nuclear Data Libraries.
Heavy nucleus resonant absorption calculation benchmarks
International Nuclear Information System (INIS)
Tellier, H.; Coste, H.; Raepsaet, C.; Van der Gucht, C.
1993-01-01
The calculation of the space and energy dependence of the heavy nucleus resonant absorption in a heterogeneous lattice is one of the hardest tasks in reactor physics. Because of the computer time and memory needed, it is impossible to represent finely the cross-section behavior in the resonance energy range for everyday computations. Consequently, reactor physicists use a simplified formalism, the self-shielding formalism. As no clean and detailed experimental results are available to validate the self-shielding calculations, Monte Carlo computations are used as a reference. These results, which were obtained with the TRIPOLI continuous-energy Monte Carlo code, constitute a set of numerical benchmarks than can be used to evaluate the accuracy of the techniques or formalisms that are included in any reactor physics codes. Examples of such evaluations, for the new assembly code APOLLO2 and the slowing-down code SECOL, are given for cases of 238 U and 232 Th fuel elements
International Nuclear Information System (INIS)
Mishra, Subhash C.; Vernekar, Rohan Ranganath
2012-01-01
Application of the lattice Boltzmann method (LBM) recently proposed by Asinari et al. [Asinari P, Mishra SC, Borchiellini R. A lattice Boltzmann formulation to the analysis of radiative heat transfer problems in a participating medium. Numer Heat Transfer B 2010; 57:126–146] is extended to the analysis of transport of collimated radiation in a planar participating medium. To deal with azimuthally symmetric radiation in planar medium, a new lattice structure for the LBM is used. The transport of the collimated component in the medium is analysed by two different, viz., flux splitting and direct approaches. For different angles of incidence of the collimated radiation, the LBM formulation is tested for the effects of the extinction coefficient, the anisotropy factor, and the boundary emissivities on heat flux and emissive power distributions. Results are compared with the benchmark results obtained using the finite volume method. Both the approaches in LBM provide accurate results. -- Highlights: ► Transport of collimated radiation in participating media is studied. ► Usage of Lattice Boltzmann method (LBM) is extended in this study. ► In LBM, flux splitting and direct approaches are proposed. ► Effects of various parameters are studied on heat flux and temperature profiles. ► In all cases, LBM provides correct results.
International Nuclear Information System (INIS)
Werner, W.
1975-01-01
In 1973, NEACRP and CSNI posed a number of kinetic benchmark problems intended to be solved by different groups. Comparison of the submitted results should lead to estimates on the accuracy and efficiency of the employed codes. This was felt to be of great value since the codes involved become more and more important in the field of reactor safety. In this paper the results of the 2d and 3d benchmark problem for a BWR are presented. The specification of the problem is included in the appendix of this survey. For the 2d benchmark problem, 5 contributions have been obtained, while for the 3d benchmark problem 2 contributions have been submitted. (orig./RW) [de
Definition and Analysis of Heavy Water Reactor Benchmarks for Testing New Wims-D Libraries
International Nuclear Information System (INIS)
Leszczynski, Francisco
2000-01-01
This work is part of the IAEA-WIMS Library Update Project (WLUP). A group of heavy water reactor benchmarks have been selected for testing new WIMS-D libraries, including calculations with WIMSD5B program and the analysis of results.These benchmarks cover a wide variety of reactors and conditions, from fresh fuels to high burnup, and from natural to enriched uranium.Besides, each benchmark includes variations in lattice pitch and in coolants (normally heavy water and void).Multiplication factors with critical experimental bucklings and other parameters are calculated and compared with experimental reference values.The WIMS libraries used for the calculations were generated with basic data from JEF-2.2 Rev.3 (JEF) and ENDF/B-VI iNReleaseln 5 (E6) Results obtained with WIMS-86 (W86) library, included with WIMSD5B package, from Windfrith, UK with adjusted data, are included also, for showing the improvements obtained with the new -not adjusted- libraries.The calculations with WIMSD5B were made with two methods (input program options): PIJ (two-dimension collision probability method) and DSN (one-dimension Sn method, with homogenization of materials by ring).The general conclusions are: the library based on JEF data and the DSN meted give the best results, that in average are acceptable
Energy Technology Data Exchange (ETDEWEB)
Leszczynski, Francisco [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)
2000-07-01
This work is part of the IAEA-WIMS Library Update Project (WLUP). A group of heavy water reactor benchmarks have been selected for testing new WIMS-D libraries, including calculations with WIMSD5B program and the analysis of results.These benchmarks cover a wide variety of reactors and conditions, from fresh fuels to high burnup, and from natural to enriched uranium.Besides, each benchmark includes variations in lattice pitch and in coolants (normally heavy water and void).Multiplication factors with critical experimental bucklings and other parameters are calculated and compared with experimental reference values.The WIMS libraries used for the calculations were generated with basic data from JEF-2.2 Rev.3 (JEF) and ENDF/B-VI iNReleaseln 5 (E6) Results obtained with WIMS-86 (W86) library, included with WIMSD5B package, from Windfrith, UK with adjusted data, are included also, for showing the improvements obtained with the new -not adjusted- libraries.The calculations with WIMSD5B were made with two methods (input program options): PIJ (two-dimension collision probability method) and DSN (one-dimension Sn method, with homogenization of materials by ring).The general conclusions are: the library based on JEF data and the DSN meted give the best results, that in average are acceptable.
Monte Carlo code criticality benchmark comparisons for waste packaging
International Nuclear Information System (INIS)
Alesso, H.P.; Annese, C.E.; Buck, R.M.; Pearson, J.S.; Lloyd, W.R.
1992-07-01
COG is a new point-wise Monte Carlo code being developed and tested at Lawrence Livermore National Laboratory (LLNL). It solves the Boltzmann equation for the transport of neutrons and photons. The objective of this paper is to report on COG results for criticality benchmark experiments both on a Cray mainframe and on a HP 9000 workstation. COG has been recently ported to workstations to improve its accessibility to a wider community of users. COG has some similarities to a number of other computer codes used in the shielding and criticality community. The recently introduced high performance reduced instruction set (RISC) UNIX workstations provide computational power that approach mainframes at a fraction of the cost. A version of COG is currently being developed for the Hewlett Packard 9000/730 computer with a UNIX operating system. Subsequent porting operations will move COG to SUN, DEC, and IBM workstations. In addition, a CAD system for preparation of the geometry input for COG is being developed. In July 1977, Babcock ampersand Wilcox Co. (B ampersand W) was awarded a contract to conduct a series of critical experiments that simulated close-packed storage of LWR-type fuel. These experiments provided data for benchmarking and validating calculational methods used in predicting K-effective of nuclear fuel storage in close-packed, neutron poisoned arrays. Low enriched UO2 fuel pins in water-moderated lattices in fuel storage represent a challenging criticality calculation for Monte Carlo codes particularly when the fuel pins extend out of the water. COG and KENO calculational results of these criticality benchmark experiments are presented
International Nuclear Information System (INIS)
Hadek, J.
1999-01-01
The paper gives a brief survey of the fifth three-dimensional dynamic Atomic Energy Research benchmark calculation results received with the code DYN3D/ATHLET at NRI Rez. This benchmark was defined at the seventh Atomic Energy Research Symposium (Hoernitz near Zittau, 1997). Its initiating event is a symmetrical break of the main steam header at the end of the first fuel cycle and hot shutdown conditions with one stuck out control rod group. The calculations were performed with the externally coupled codes ATHLET Mod.1.1 Cycle C and DYN3DH1.1/M3. The standard WWER-440/213 input deck of ATHLET code was adopted for benchmark purposes and for coupling with the code DYN3D. The first part of paper contains a brief characteristics of NPP input deck and reactor core model. The second part shows the time dependencies of important global and local parameters. In comparison with the results published at the eighth Atomic Energy Research Symposium (Bystrice nad Pernstejnem, 1998), the results published in this paper are based on improved ATHLET descriptions of control and safety systems. (Author)
Calculation of Single Cell and Fuel Assembly IRIS Benchmarks Using WIMSD5B and GNOMER Codes
International Nuclear Information System (INIS)
Pevec, D.; Grgic, D.; Jecmenica, R.
2002-01-01
IRIS reactor (an acronym for International Reactor Innovative and Secure) is a modular, integral, light water cooled, small to medium power (100-335 MWe/module) reactor, which addresses the requirements defined by the United States Department of Energy for Generation IV nuclear energy systems, i.e., proliferation resistance, enhanced safety, improved economics, and waste reduction. An international consortium led by Westinghouse/BNFL was created for development of IRIS reactor; it includes universities, institutes, commercial companies, and utilities. Faculty of Electrical Engineering and Computing, University of Zagreb joined the consortium in year 2001, with the aim to take part in IRIS neutronics design and safety analyses of IRIS transients. A set of neutronic benchmarks for IRIS reactor was defined with the objective to compare results of all participants with exactly the same assumptions. In this paper a calculation of Benchmark 44 for IRIS reactor is described. Benchmark 44 is defined as a core depletion benchmark problem for specified IRIS reactor operating conditions (e.g., temperatures, moderator density) without feedback. Enriched boron, inhomogeneously distributed in axial direction, is used as an integral fuel burnable absorber (IFBA). The aim of this benchmark was to enable a more direct comparison of results of different code systems. Calculations of Benchmark 44 were performed using the modified CORD-2 code package. The CORD-2 code package consists of WIMSD and GNOMER codes. WIMSD is a well-known lattice spectrum calculation code. GNOMER solves the neutron diffusion equation in three-dimensional Cartesian geometry by the Green's function nodal method. The following parameters were obtained in Benchmark 44 analysis: effective multiplication factor as a function of burnup, nuclear peaking factor as a function of burnup, axial offset as a function of burnup, core-average axial power profile, core radial power profile, axial power profile for selected
Criticality benchmark guide for light-water-reactor fuel in transportation and storage packages
International Nuclear Information System (INIS)
Lichtenwalter, J.J.; Bowman, S.M.; DeHart, M.D.; Hopper, C.M.
1997-03-01
This report is designed as a guide for performing criticality benchmark calculations for light-water-reactor (LWR) fuel applications. The guide provides documentation of 180 criticality experiments with geometries, materials, and neutron interaction characteristics representative of transportation packages containing LWR fuel or uranium oxide pellets or powder. These experiments should benefit the U.S. Nuclear Regulatory Commission (NRC) staff and licensees in validation of computational methods used in LWR fuel storage and transportation concerns. The experiments are classified by key parameters such as enrichment, water/fuel volume, hydrogen-to-fissile ratio (H/X), and lattice pitch. Groups of experiments with common features such as separator plates, shielding walls, and soluble boron are also identified. In addition, a sample validation using these experiments and a statistical analysis of the results are provided. Recommendations for selecting suitable experiments and determination of calculational bias and uncertainty are presented as part of this benchmark guide
Effects of neutron data libraries and criticality codes on IAEA criticality benchmark problems
International Nuclear Information System (INIS)
Sarker, Md.M.; Takano, Makoto; Masukawa, Fumihiro; Naito, Yoshitaka
1993-10-01
In order to compare the effects of neutron data libraries and criticality codes to thermal reactors (LWR), the IAEA criticality benchmark calculations have been performed. The experiments selected in this study include TRX-1 and TRX-2 with a simple geometric configuration. Reactor lattice calculation codes WIMS-D/4, MCNP-4, JACS (MGCL, KENO), and SRAC were used in the present calculations. The TRX cores were analyzed by WIMS-D/4 using WIMS original library and also by MCNP-4, JACS (MGCL, KENO), and SRAC using the libraries generated from JENDL-3 and ENDF/B-IV nuclear data files. An intercomparison work for the above mentioned code systems and cross section libraries was performed by analyzing the LWR benchmark experiments TRX-1 and TRX-2. The TRX cores were also analyzed for supercritical and subcritical conditions and these results were compared. In the case of critical condition, the results were in good agreement. But for the supercritical and subcritical conditions, the difference of the results obtained by using the different cross section libraries become larger than for the critical condition. (author)
The NEPTUN experiments on LOCA thermal-hydraulics for tight-lattice PWRs
International Nuclear Information System (INIS)
Dreier, J.; Chawla, R.; Rouge, N.; Yanar, S.
1990-01-01
The NEPTUN test facility at the Paul Scherrer Institute is currently being used to provide a broad data base for the validation of thermal-hydraulics codes used in predicting the reflooding behaviour of a tight-lattice PWR (light water highb conversion reactor, LWHCR). The present paper gives a description of the facility and the matrix to be covered in the experimental program. Results are presented from a number of forced-feed, bottom-reflooding experiments, comparisons being made with (a) measurements carried out earlier for standard-PWR geometry and (b) the results of a calculational benchmark exercise conducted in the framework of a Swiss/German LWHCR-development agreement. Rewetting for the tight, hexagonal-geometry (p/d = 1.13) NEPTUN-III test bundle has been found to occur in all tests carried out to date, in which reasonably LWHCR-representative values for the various thermal-hydraulics parameters are used. Results of the calculational benchmark exercise have confirmed the need for further code development efforts for achieving reliable predictions of LWHCR reflooding behaviour. (author) 11 figs., 3 tabs., 3 refs
Theory and application of the RAZOR two-dimensional continuous energy lattice physics code
International Nuclear Information System (INIS)
Zerkle, M.L.; Abu-Shumays, I.K.; Ott, M.W.; Winwood, J.P.
1997-01-01
The theory and application of the RAZOR two-dimensional, continuous energy lattice physics code are discussed. RAZOR solves the continuous energy neutron transport equation in one- and two-dimensional geometries, and calculates equivalent few-group diffusion theory constants that rigorously account for spatial and spectral self-shielding effects. A dual energy resolution slowing down algorithm is used to reduce computer memory and disk storage requirements for the slowing down calculation. Results are presented for a 2D BWR pin cell depletion benchmark problem
International Nuclear Information System (INIS)
Strydom, G.; Reitsma, F.; Ngeleka, P.T.; Ivanov, K.N.
2010-01-01
The PBMR is a High-Temperature Gas-cooled Reactor (HTGR) concept developed to be built in South Africa. The analysis tools used for core neutronic design and core safety analysis need to be verified and validated, and code-to-code comparisons are an essential part of the V and V plans. As part of this plan the PBMR 400 MWth design and a representative set of transient exercises are defined as an OECD benchmark. The scope of the benchmark is to establish a series of well defined multi-dimensional computational benchmark problems with a common given set of cross sections, to compare methods and tools in coupled neutronics and thermal hydraulics analysis with a specific focus on transient events. This paper describes the current status of the benchmark project and shows the results for the six transient exercises, consisting of three Loss of Cooling Accidents, two Control Rod Withdrawal transients, a power load-follow transient, and a Helium over-cooling Accident. The participants' results are compared using a statistical method and possible areas of future code improvement are identified. (authors)
Lattice QCD results on soft and hard probes of strongly interacting matter
Kaczmarek, Olaf
2017-11-01
We present recent results from lattice QCD relevant for the study of strongly interacting matter as it is produced in heavy ion collision experiments. The equation of state at non-vanishing density from a Taylor expansion up to 6th order will be discussed for a strangeness neutral system and using the expansion coefficients of the series limits on the critical point are estimated. Chemical freeze-out temperatures from the STAR and ALICE Collaborations will be compared to lines of constant physics calculated from the Taylor expansion of QCD bulk thermodynamic quantities. We show that qualitative features of the √{sNN} dependence of skewness and kurtosis ratios of net proton-number fluctuations measured by the STAR Collaboration can be understood from QCD results for cumulants of conserved baryon-number fluctuations. As an example for recent progress towards the determination of spectral and transport properties of the QGP from lattice QCD, we will present constraints on the thermal photon rate determined from a spectral reconstruction of continuum extrapolated lattice correlation functions in combination with input from most recent perturbative calculations.
Experiment vs simulation RT WFNDEC 2014 benchmark: CIVA results
International Nuclear Information System (INIS)
Tisseur, D.; Costin, M.; Rattoni, B.; Vienne, C.; Vabre, A.; Cattiaux, G.; Sollier, T.
2015-01-01
The French Atomic Energy Commission and Alternative Energies (CEA) has developed for years the CIVA software dedicated to simulation of NDE techniques such as Radiographic Testing (RT). RT modelling is achieved in CIVA using combination of a determinist approach based on ray tracing for transmission beam simulation and a Monte Carlo model for the scattered beam computation. Furthermore, CIVA includes various detectors models, in particular common x-ray films and a photostimulable phosphor plates. This communication presents the results obtained with the configurations proposed in the World Federation of NDEC 2014 RT modelling benchmark with the RT models implemented in the CIVA software
Experiment vs simulation RT WFNDEC 2014 benchmark: CIVA results
Energy Technology Data Exchange (ETDEWEB)
Tisseur, D., E-mail: david.tisseur@cea.fr; Costin, M., E-mail: david.tisseur@cea.fr; Rattoni, B., E-mail: david.tisseur@cea.fr; Vienne, C., E-mail: david.tisseur@cea.fr; Vabre, A., E-mail: david.tisseur@cea.fr; Cattiaux, G., E-mail: david.tisseur@cea.fr [CEA LIST, CEA Saclay 91191 Gif sur Yvette Cedex (France); Sollier, T. [Institut de Radioprotection et de Sûreté Nucléaire, B.P.17 92262 Fontenay-Aux-Roses (France)
2015-03-31
The French Atomic Energy Commission and Alternative Energies (CEA) has developed for years the CIVA software dedicated to simulation of NDE techniques such as Radiographic Testing (RT). RT modelling is achieved in CIVA using combination of a determinist approach based on ray tracing for transmission beam simulation and a Monte Carlo model for the scattered beam computation. Furthermore, CIVA includes various detectors models, in particular common x-ray films and a photostimulable phosphor plates. This communication presents the results obtained with the configurations proposed in the World Federation of NDEC 2014 RT modelling benchmark with the RT models implemented in the CIVA software.
International Nuclear Information System (INIS)
Kliem, S.
1998-01-01
The fifth dynamic benchmark was defined at seventh AER-Symposium, held in Hoernitz, Germany in 1997. It is the first benchmark for coupled thermohydraulic system/three-dimensional hexagonal neutron kinetic core models. In this benchmark the interaction between the components of a WWER-440 NPP with the reactor core has been investigated. The initiating event is a symmetrical break of the main steam header at the end of the first fuel cycle and hot shutdown conditions with one control rod group stucking. This break causes an overcooling of the primary circuit. During this overcooling the scram reactivity is compensated and the scrammed reactor becomes re critical. The calculation was continued until the highly-borated water from the high pressure injection system terminated the power excursion. Each participant used own best-estimate nuclear cross section data. Only the initial subcriticality at the beginning of the transient was given. Solutions were received from Kurchatov Institute Russia with the code BIPR8/ATHLET, VTT Energy Finland with HEXTRAN/SMABRE, NRI Rez Czech Republic with DYN3/ATHLET, KFKI Budapest Hungary with KIKO3D/ATHLET and from FZR Germany with the code DYN3D/ATHLET.In this paper the results are compared. Beside the comparison of global results, the behaviour of several thermohydraulic and neutron kinetic parameters is presented to discuss the revealed differences between the solutions.(Authors)
Tsimihodimos, Vasilis; Kostapanos, Michael S.; Moulis, Alexandros; Nikas, Nikos; Elisaf, Moses S.
2015-01-01
Objectives: To investigate the effect of benchmarking on the quality of type 2 diabetes (T2DM) care in Greece. Methods: The OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study [ClinicalTrials.gov identifier: NCT00681850] was an international multicenter, prospective cohort study. It included physicians randomized 3:1 to either receive benchmarking for glycated hemoglobin (HbA1c), systolic blood pressure (SBP) and low-density lipoprotein cholesterol (LDL-C) treatment targets (benchmarking group) or not (control group). The proportions of patients achieving the targets of the above-mentioned parameters were compared between groups after 12 months of treatment. Also, the proportions of patients achieving those targets at 12 months were compared with baseline in the benchmarking group. Results: In the Greek region, the OPTIMISE study included 797 adults with T2DM (570 in the benchmarking group). At month 12 the proportion of patients within the predefined targets for SBP and LDL-C was greater in the benchmarking compared with the control group (50.6 versus 35.8%, and 45.3 versus 36.1%, respectively). However, these differences were not statistically significant. No difference between groups was noted in the percentage of patients achieving the predefined target for HbA1c. At month 12 the increase in the percentage of patients achieving all three targets was greater in the benchmarking (5.9–15.0%) than in the control group (2.7–8.1%). In the benchmarking group more patients were on target regarding SBP (50.6% versus 29.8%), LDL-C (45.3% versus 31.3%) and HbA1c (63.8% versus 51.2%) at 12 months compared with baseline (p Benchmarking may comprise a promising tool for improving the quality of T2DM care. Nevertheless, target achievement rates of each, and of all three, quality indicators were suboptimal, indicating there are still unmet needs in the management of T2DM. PMID:26445642
High Energy Physics (HEP) benchmark program
International Nuclear Information System (INIS)
Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.
1993-01-01
High Energy Physics (HEP) benchmark programs are indispensable tools to select suitable computer for HEP application system. Industry standard benchmark programs can not be used for this kind of particular selection. The CERN and the SSC benchmark suite are famous HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC one includes event generators. In this paper, we found that the results from these two suites are not consistent. And, the result from the industry benchmark does not agree with either of these two. Besides, we describe comparison of benchmark results using EGS4 Monte Carlo simulation program with ones from two HEP benchmark suites. Then, we found that the result from EGS4 in not consistent with the two ones. The industry standard of SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of a standardization of HEP benchmark suites. Also, EGS4 benchmark suite should be developed for users of applications such as medical science, nuclear power plant, nuclear physics and high energy physics. (author)
Reactor calculation benchmark PCA blind test results
International Nuclear Information System (INIS)
Kam, F.B.K.; Stallmann, F.W.
1980-01-01
Further improvement in calculational procedures or a combination of calculations and measurements is necessary to attain 10 to 15% (1 sigma) accuracy for neutron exposure parameters (flux greater than 0.1 MeV, flux greater than 1.0 MeV, and dpa). The calculational modeling of power reactors should be benchmarked in an actual LWR plant to provide final uncertainty estimates for end-of-life predictions and limitations for plant operations. 26 references, 14 figures, 6 tables
Reactor calculation benchmark PCA blind test results
Energy Technology Data Exchange (ETDEWEB)
Kam, F.B.K.; Stallmann, F.W.
1980-01-01
Further improvement in calculational procedures or a combination of calculations and measurements is necessary to attain 10 to 15% (1 sigma) accuracy for neutron exposure parameters (flux greater than 0.1 MeV, flux greater than 1.0 MeV, and dpa). The calculational modeling of power reactors should be benchmarked in an actual LWR plant to provide final uncertainty estimates for end-of-life predictions and limitations for plant operations. 26 references, 14 figures, 6 tables.
International Nuclear Information System (INIS)
Jeong, Chang-Joon; Okumura, Keisuke; Ishiguro, Yukio; Tanaka, Ken-ichi
1990-01-01
Validation tests were made for the accuracy of cell calculation methods used in analyses of tight lattices of a mixed-oxide (MOX) fuel core in a high conversion light water reactor (HCLWR). A series of cell calculations was carried out for the lattices referred from an international HCLWR benchmark comparison, with emphasis placed on the resonance calculation methods; the NR, IR approximations, the collision probability method with ultra-fine energy group. Verification was also performed for the geometrical modelling; a hexagonal/cylindrical cell, and the boundary condition; mirror/white reflection. In the calculations, important reactor physics parameters, such as the neutron multiplication factor, the conversion ratio and the void coefficient, were evaluated using the above methods for various HCLWR lattices with different moderator to fuel volume ratios, fuel materials and fissile plutonium enrichments. The calculated results were compared with each other, and the accuracy and applicability of each method were clarified by comparison with continuous energy Monte Carlo calculations. It was verified that the accuracy of the IR approximation became worse when the neutron spectrum became harder. It was also concluded that the cylindrical cell model with the white boundary condition was not so suitable for MOX fuelled lattices, as for UO 2 fuelled lattices. (author)
A lattice Boltzmann model for solute transport in open channel flow
Wang, Hongda; Cater, John; Liu, Haifei; Ding, Xiangyi; Huang, Wei
2018-01-01
A lattice Boltzmann model of advection-dispersion problems in one-dimensional (1D) open channel flows is developed for simulation of solute transport and pollutant concentration. The hydrodynamics are calculated based on a previous lattice Boltzmann approach to solving the 1D Saint-Venant equations (LABSVE). The advection-dispersion model is coupled with the LABSVE using the lattice Boltzmann method. Our research recovers the advection-dispersion equations through the Chapman-Enskog expansion of the lattice Boltzmann equation. The model differs from the existing schemes in two points: (1) the lattice Boltzmann numerical method is adopted to solve the advection-dispersion problem by meso-scopic particle distribution; (2) and the model describes the relation between discharge, cross section area and solute concentration, which increases the applicability of the water quality model in practical engineering. The model is verified using three benchmark tests: (1) instantaneous solute transport within a short distance; (2) 1D point source pollution with constant velocity; (3) 1D point source pollution in a dam break flow. The model is then applied to a 50-year flood point source pollution accident on the Yongding River, which showed good agreement with a MIKE 11 solution and gauging data.
International Nuclear Information System (INIS)
Reitsma, F.; Han, J.; Ivanov, K.; Sartori, E.
2008-01-01
The PBMR is a High-Temperature Gas-cooled Reactor (HTGR) concept developed to be built in South Africa. The analysis tools used for core neutronic design and core safety analysis need to be verified and validated. Since only a few pebble-bed HTR experimental facilities or plant data are available the use of code-to-code comparisons are an essential part of the V and V plans. As part of this plan the PBMR 400 MW design and a representative set of transient cases is defined as an OECD benchmark. The scope of the benchmark is to establish a series of well-defined multi-dimensional computational benchmark problems with a common given set of cross-sections, to compare methods and tools in coupled neutronics and thermal hydraulics analysis with a specific focus on transient events. The OECD benchmark includes steady-state and transients cases. Although the focus of the benchmark is on the modelling of the transient behaviour of the PBMR core, it was also necessary to define some steady-state cases to ensure consistency between the different approaches before results of transient cases could be compared. This paper describes the status of the benchmark project and shows the results for the three steady state exercises defined as a standalone neutronics calculation, a standalone thermal-hydraulic core calculation, and a coupled neutronics/thermal-hydraulic simulation. (authors)
Validation of the WIMSD4M cross-section generation code with benchmark results
International Nuclear Information System (INIS)
Deen, J.R.; Woodruff, W.L.; Leal, L.E.
1995-01-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D 2 O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented
Validation of the WIMSD4M cross-section generation code with benchmark results
Energy Technology Data Exchange (ETDEWEB)
Deen, J.R.; Woodruff, W.L. [Argonne National Lab., IL (United States); Leal, L.E. [Oak Ridge National Lab., TN (United States)
1995-01-01
The WIMSD4 code has been adopted for cross-section generation in support of the Reduced Enrichment Research and Test Reactor (RERTR) program at Argonne National Laboratory (ANL). Subsequently, the code has undergone several updates, and significant improvements have been achieved. The capability of generating group-collapsed micro- or macroscopic cross sections from the ENDF/B-V library and the more recent evaluation, ENDF/B-VI, in the ISOTXS format makes the modified version of the WIMSD4 code, WIMSD4M, very attractive, not only for the RERTR program, but also for the reactor physics community. The intent of the present paper is to validate the WIMSD4M cross-section libraries for reactor modeling of fresh water moderated cores. The results of calculations performed with multigroup cross-section data generated with the WIMSD4M code will be compared against experimental results. These results correspond to calculations carried out with thermal reactor benchmarks of the Oak Ridge National Laboratory (ORNL) unreflected HEU critical spheres, the TRX LEU critical experiments, and calculations of a modified Los Alamos HEU D{sub 2}O moderated benchmark critical system. The benchmark calculations were performed with the discrete-ordinates transport code, TWODANT, using WIMSD4M cross-section data. Transport calculations using the XSDRNPM module of the SCALE code system are also included. In addition to transport calculations, diffusion calculations with the DIF3D code were also carried out, since the DIF3D code is used in the RERTR program for reactor analysis and design. For completeness, Monte Carlo results of calculations performed with the VIM and MCNP codes are also presented.
Initial Mechanical Testing of Superalloy Lattice Block Structures Conducted
Krause, David L.; Whittenberger, J. Daniel
2002-01-01
The first mechanical tests of superalloy lattice block structures produced promising results for this exciting new lightweight material system. The testing was performed in-house at NASA Glenn Research Center's Structural Benchmark Test Facility, where small subelement-sized compression and beam specimens were loaded to observe elastic and plastic behavior, component strength levels, and fatigue resistance for hundreds of thousands of load cycles. Current lattice block construction produces a flat panel composed of thin ligaments arranged in a three-dimensional triangulated trusslike structure. Investment casting of lattice block panels has been developed and greatly expands opportunities for using this unique architecture in today's high-performance structures. In addition, advances made in NASA's Ultra-Efficient Engine Technology Program have extended the lattice block concept to superalloy materials. After a series of casting iterations, the nickel-based superalloy Inconel 718 (IN 718, Inco Alloys International, Inc., Huntington, WV) was successfully cast into lattice block panels; this combination offers light weight combined with high strength, high stiffness, and elevated-temperature durability. For tests to evaluate casting quality and configuration merit, small structural compression and bend test specimens were machined from the 5- by 12- by 0.5-in. panels. Linear elastic finite element analyses were completed for several specimen layouts to predict material stresses and deflections under proposed test conditions. The structural specimens were then subjected to room-temperature static and cyclic loads in Glenn's Life Prediction Branch's material test machine. Surprisingly, the test results exceeded analytical predictions: plastic strains greater than 5 percent were obtained, and fatigue lives did not depreciate relative to the base material. These assets were due to the formation of plastic hinges and the redundancies inherent in lattice block construction
Results of the GABLS3 diurnal-cycle benchmark for wind energy applications
DEFF Research Database (Denmark)
Rodrigo, J. Sanz; Allaerts, D.; Avila, M.
2017-01-01
errors are used to quantify model performance. The results of the benchmark are used to discuss input uncertainties from mesoscale modelling, different meso-micro coupling strategies (online vs offline) and consistency between RANS and LES codes when dealing with boundary-layer mean flow quantities....... Overall, all the microscale simulations produce a consistent coupling with mesoscale forcings....
Analysis of the European results on the HTTR's core physics benchmarks
International Nuclear Information System (INIS)
Raepsaet, X.; Damian, F.; Ohlig, U.A.; Brockmann, H.J.; Haas, J.B.M. de; Wallerboss, E.M.
2002-01-01
Within the frame of the European contract HTR-N1 calculations are performed on the benchmark problems of the HTTR's start-up core physics experiments initially proposed by the IAEA in a Co-ordinated Research Programme. Three European partners, the FZJ in Germany, NRG and IRI in the Netherlands, and CEA in France, have joined this work package with the aim to validate their calculational methods. Pre-test and post-test calculational results, obtained by the partners, are compared with each other and with the experiment. Parts of the discrepancies between experiment and pre-test predictions are analysed and tackled by different treatments. In the case of the Monte Carlo code TRIPOLI4, used by CEA, the discrepancy between measurement and calculation at the first criticality is reduced to Δk/k∼0.85%, when considering the revised data of the HTTR benchmark. In the case of the diffusion codes, this discrepancy is reduced to: Δk/k∼0.8% (FZJ) and 2.7 or 1.8% (CEA). (author)
Energy Technology Data Exchange (ETDEWEB)
Hannstein, Volker; Sommer, Fabian
2017-05-15
The report summarizes the performed studies and results in the frame of the phase II benchmarks of the expert group of used nuclear fuel (EGUNF) of the working party of nuclear criticality safety (WPNCS) of the nuclear energy agency (NEA) of the organization for economic co-operation and development (OECD). The studies specified within the benchmarks have been realized to the full extent. The scope of the benchmarks was the comparison of a generic BWR fuel element with gadolinium containing fuel rods with several computer codes and cross section libraries of different international working groups and institutions. The used computational model allows the evaluation of the accuracy of fuel rod and their influence of the inventory calculations and the respective influence on BWR burnout credit calculations.
Scalar meson in dynamical and partially quenched two-flavor QCD: Lattice results and chiral loops
International Nuclear Information System (INIS)
Prelovsek, S.; Dawson, C.; Izubuchi, T.; Orginos, K.; Soni, A.
2004-01-01
This is an exploratory study of the lightest nonsinglet scalar qq state on the lattice with two dynamical quarks. Domain wall fermions are used for both sea and valence quarks on a 16 3 x32 lattice with an inverse lattice spacing of 1.7 GeV. We extract the scalar meson mass 1.58±0.34 GeV from the exponential time dependence of the dynamical correlators with m val =m sea and N f =2. Since this statistical error bar from dynamical correlators is rather large, we analyze also the partially quenched lattice correlators with m val ≠m sea . They are positive for m val ≥m sea and negative for m val sea . In order to understand this striking effect of partial quenching, we derive the scalar correlator within the partially quenched chiral perturbation theory (ChPT) and find it describes lattice correlators well. The leading unphysical contribution in partially quenched ChPT comes from the exchange of the two pseudoscalar fields and is also positive for m val ≥m sea and negative for m val sea at large t. After the subtraction of this unphysical contribution from the partially quenched lattice correlators, the correlators are positive and exponentially falling. The resulting scalar meson mass 1.51±0.19 GeV from the partially quenched correlators is consistent with the dynamical result and has an appreciably smaller error bar
Benchmark calculation for GT-MHR using HELIOS/MASTER code package and MCNP
International Nuclear Information System (INIS)
Lee, Kyung Hoon; Kim, Kang Seog; Noh, Jae Man; Song, Jae Seung; Zee, Sung Quun
2005-01-01
The latest research associated with the very high temperature gas-cooled reactor (VHTR) is focused on the verification of a system performance and safety under operating conditions for the VHTRs. As a part of those, an international gas-cooled reactor program initiated by IAEA is going on. The key objectives of this program are the validation of analytical computer codes and the evaluation of benchmark models for the projected and actual VHTRs. New reactor physics analysis procedure for the prismatic VHTR is under development by adopting the conventional two-step procedure. In this procedure, a few group constants are generated through the transport lattice calculations using the HELIOS code, and the core physics analysis is performed by the 3-dimensional nodal diffusion code MASTER. We evaluated the performance of the HELIOS/MASTER code package through the benchmark calculations related to the GT-MHR (Gas Turbine-Modular Helium Reactor) to dispose weapon plutonium. In parallel, MCNP is employed as a reference code to verify the results of the HELIOS/MASTER procedure
Lattice-induced nonadiabatic frequency shifts in optical lattice clocks
International Nuclear Information System (INIS)
Beloy, K.
2010-01-01
We consider the frequency shift in optical lattice clocks which arises from the coupling of the electronic motion to the atomic motion within the lattice. For the simplest of three-dimensional lattice geometries this coupling is shown to affect only clocks based on blue-detuned lattices. We have estimated the size of this shift for the prospective strontium lattice clock operating at the 390-nm blue-detuned magic wavelength. The resulting fractional frequency shift is found to be on the order of 10 -18 and is largely overshadowed by the electric quadrupole shift. For lattice clocks based on more complex geometries or other atomic systems, this shift could potentially be a limiting factor in clock accuracy.
Cold dilute neutron matter on the lattice. II. Results in the unitary limit
International Nuclear Information System (INIS)
Lee, Dean; Schaefer, Thomas
2006-01-01
This is the second of two articles that investigate cold dilute neutron matter on the lattice using pionless effective field theory. In the unitary limit, where the effective range is zero and scattering length is infinite, simple scaling relations relate thermodynamic functions at different temperatures. When the second virial coefficient is properly tuned, we find that the lattice results obey these scaling relations. We compute the energy per particle, pressure, spin susceptibility, dineutron correlation function, and an upper bound for the superfluid critical temperature
Sensitivity and Uncertainty Analysis of IAEA CRP HTGR Benchmark Using McCARD
International Nuclear Information System (INIS)
Jang, Sang Hoon; Shim, Hyung Jin
2016-01-01
The benchmark consists of 4 phases starting from the local standalone modeling (Phase I) to the safety calculation of coupled system with transient situation (Phase IV). As a preliminary study of UAM on HTGR, this paper covers the exercise 1 and 2 of Phase I which defines the unit cell and lattice geometry of MHTGR-350 (General Atomics). The objective of these exercises is to quantify the uncertainty of the multiplication factor induced by perturbing nuclear data as well as to analyze the specific features of HTGR such as double heterogeneity and self-shielding treatment. The uncertainty quantification of IAEA CRP HTGR UAM benchmarks were conducted using first-order AWP method in McCARD. Uncertainty of the multiplication factor was estimated only for the microscopic cross section perturbation. To reduce the computation time and memory shortage, recently implemented uncertainty analysis module in MC wielandt calculation was adjusted. The covariance data of cross section was generated by NJOY/ERRORR module with ENDF/B-VII.1. The numerical result was compared with evaluation result of DeCART/MUSAD code system developed by KAERI. IAEA CRP HTGR UAM benchmark problems were analyzed using McCARD. The numerical results were compared with Serpent for eigenvalue calculation and DeCART/MUSAD for S/U analysis. In eigenvalue calculation, inconsistencies were found in the result with ENDF/B-VII.1 cross section library and it was found to be the effect of thermal scattering data of graphite. As to S/U analysis, McCARD results matched well with DeCART/MUSAD, but showed some discrepancy in 238U capture regarding implicit uncertainty.
Characterizing a four-qubit planar lattice for arbitrary error detection
Chow, Jerry M.; Srinivasan, Srikanth J.; Magesan, Easwar; Córcoles, A. D.; Abraham, David W.; Gambetta, Jay M.; Steffen, Matthias
2015-05-01
Quantum error correction will be a necessary component towards realizing scalable quantum computers with physical qubits. Theoretically, it is possible to perform arbitrarily long computations if the error rate is below a threshold value. The two-dimensional surface code permits relatively high fault-tolerant thresholds at the ~1% level, and only requires a latticed network of qubits with nearest-neighbor interactions. Superconducting qubits have continued to steadily improve in coherence, gate, and readout fidelities, to become a leading candidate for implementation into larger quantum networks. Here we describe characterization experiments and calibration of a system of four superconducting qubits arranged in a planar lattice, amenable to the surface code. Insights into the particular qubit design and comparison between simulated parameters and experimentally determined parameters are given. Single- and two-qubit gate tune-up procedures are described and results for simultaneously benchmarking pairs of two-qubit gates are given. All controls are eventually used for an arbitrary error detection protocol described in separate work [Corcoles et al., Nature Communications, 6, 2015].
On the characterization and software implementation of general protein lattice models.
Directory of Open Access Journals (Sweden)
Alessio Bechini
Full Text Available models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that also lattices with higher coordination numbers, possibly in higher dimensional spaces, can be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights on the use of square and triangular lattices in a range of dimensions. The trend of potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called "move types" is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes to look at protein structure issues from a more general and essential perspective, making
Mixed-oxide (MOX) fuel performance benchmark. Summary of the results for the PRIMO MOX rod BD8
International Nuclear Information System (INIS)
Ott, L.J.; Sartori, E.; Costa, A.; ); Sobolev, V.; Lee, B-H.; Alekseev, P.N.; Shestopalov, A.A.; Mikityuk, K.O.; Fomichenko, P.A.; Shatrova, L.P.; Medvedev, A.V.; Bogatyr, S.M.; Khvostov, G.A.; Kuznetsov, V.I.; Stoenescu, R.; Chatwin, C.P.
2009-01-01
The OECD/NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, nuclear fuel performance, and fuel cycle issues related to the disposition of weapons-grade plutonium as MOX fuel. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close cooperation with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A major part of these activities includes benchmark studies. This report describes the results of the PRIMO rod BD8 benchmark exercise, the second benchmark by the TFRPD relative to MOX fuel behaviour. The corresponding PRIMO experimental data have been released, compiled and reviewed for the International Fuel Performance Experiments (IFPE) database. The observed ranges (as noted in the text) in the predicted thermal and FGR responses are reasonable given the variety and combination of thermal conductivity and FGR models employed by the benchmark participants with their respective fuel performance codes
International Nuclear Information System (INIS)
Maroufi, Arman; Aghanajafi, Cyrus
2013-01-01
This article deals with the analysis of solidification of a 2-D semitransparent material using the lattice Boltzmann method (LBM). Both conduction and radiation terms in governing energy equation were computed using the LBM. First, the LBM formulation regarding conduction component was validated and the results analyzed. Next, the results involving phase change or radiation term in the LBM were compared with the finite volume method (FVM). The results show good accuracy and less time consumption during LBM implementation. Finally, temperature distribution, the location of solid-liquid front, mushy zone thickness and the effects of heat transfer parameters were studied. -- Highlights: ► Solidification of 2-D semitransparent material is studied. ► Both conduction and radiation were computed using lattice Boltzmann method (LBM). ► LBM results validated by solving three benchmark problems. ► Effects of various parameters were studied on temperature distributions. ► Results show good accuracy and less time consumption during LBM implementation.
Light hadron spectrum from quenched lattice QCD. Results from the CP-PACS
International Nuclear Information System (INIS)
Yoshie, Tomoteru
2001-01-01
Deriving the light hadron spectrum from first principles of QCD has been a fundamental issue in elementary particle physics since the mid-1970s, when QCD was established. With this goal in mind, we have carried out large-scale simulations of lattice QCD on the CP-PACS computer. In this article, we present results for the light hadron spectrum derived in the quenched approximation to lattice QCD. We find that although the global structure of the observed spectrum is reproduced, the quenched spectrum systematically deviates from experiment when examined with an accuracy at better than a 10% level. Results for light quark masses are also reported. Another simulation of full QCD done recently (also on the CP-PACS computer) shows indications that the discrepancy observed in quenched QCD is significantly reduced by the introduction of two flavors of light dynamical quarks. (author)
Mitchell, L
1996-01-01
The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
HS06 Benchmark for an ARM Server
Kluth, Stefan
2014-06-01
We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
HS06 benchmark for an ARM server
International Nuclear Information System (INIS)
Kluth, Stefan
2014-01-01
We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
International Nuclear Information System (INIS)
Cho, Moon-Sung; Kim, Y. M.; Lee, Y. W.; Jeong, K. C.; Kim, Y. K.; Oh, S. C.
2006-01-01
The fundamental design for a gas-cooled reactor relies on an understanding of the behavior of a coated particle fuel. KAERI, which has been carrying out the Korean VHTR (Very High Temperature modular gas cooled Reactor) Project since 2004, is developing a fuel performance analysis code for a VHTR named COPA (COated Particle fuel Analysis). COPA predicts temperatures, stresses, a fission gas release and failure probabilities of a coated particle fuel in normal operating conditions. Validation of COPA in the process of its development is realized partly by participating in the benchmark section of the international CRP-6 program led by IAEA which provides comprehensive benchmark problems and analysis results obtained from the CRP-6 member countries. Apart from the validation effort through the CRP-6, a validation of COPA was attempted by comparing its benchmark results with the visco-elastic solutions obtained from the ABAQUS code calculations for the same CRP-6 TRISO coated particle benchmark problems involving creep, swelling, and pressure. The study shows the calculation results of the IAEA-CRP-6 benchmark cases 5 through 7 by using the ABAQUS FE model for a comparison with the COPA results
First results from simulations of supersymmetric lattices
Catterall, Simon
2009-01-01
We conduct the first numerical simulations of lattice theories with exact supersymmetry arising from the orbifold constructions of \\cite{Cohen:2003xe,Cohen:2003qw,Kaplan:2005ta}. We consider the Script Q = 4 theory in D = 0,2 dimensions and the Script Q = 16 theory in D = 0,2,4 dimensions. We show that the U(N) theories do not possess vacua which are stable non-perturbatively, but that this problem can be circumvented after truncation to SU(N). We measure the distribution of scalar field eigenvalues, the spectrum of the fermion operator and the phase of the Pfaffian arising after integration over the fermions. We monitor supersymmetry breaking effects by measuring a simple Ward identity. Our results indicate that simulations of Script N = 4 super Yang-Mills may be achievable in the near future.
Hadronic corrections to electroweak observables from twisted mass lattice QCD
International Nuclear Information System (INIS)
Pientka, Grit
2015-01-01
For several benchmark quantities investigated to detect signs for new physics beyond the standard model of elementary particle physics, lattice QCD currently constitutes the only ab initio approach available at small momentum transfers for the computation of non-perturbative hadronic contributions. Among those observables are the lepton anomalous magnetic moments and the running of the electroweak coupling constants. We compute the leading QCD contribution to the muon anomalous magnetic moment by performing lattice QCD calculations on ensembles incorporating N f =2+1+1 dynamical twisted mass fermions. Considering active up, down, strange, and charm quarks, admits for the first time a direct comparison of the lattice data for the muon anomaly with phenomenological results because both the latter as well as the experimentally obtained values are sensitive to the complete first two generations of quarks at the current level of precision. Recently, it has been noted that improved measurements of the electron and tau anomalous magnetic moments might also provide ways of detecting new physics contributions. Therefore, we also compute their leading QCD contributions, which simultaneously serve as cross-checks of the value obtained for the muon. Additionally, we utilise the obtained data to compute the leading hadronic contribution to the running of the fine structure constant, which enters all perturbative QED calculations. Furthermore, we show that even for the weak mixing angle the leading QCD contribution can be computed from this data. In this way, we identify a new prime observable in the search for new physics whose hadronic contributions can be obtained from lattice QCD. With the results obtained in this thesis, we are able to exclude unsuitable phenomenologically necessary flavour separations and thus directly assist the presently more precise phenomenological determinations of this eminent quantity.
Parton distributions and lattice QCD calculations: A community white paper
Lin, Huey-Wen; Nocera, Emanuele R.; Olness, Fred; Orginos, Kostas; Rojo, Juan; Accardi, Alberto; Alexandrou, Constantia; Bacchetta, Alessandro; Bozzi, Giuseppe; Chen, Jiunn-Wei; Collins, Sara; Cooper-Sarkar, Amanda; Constantinou, Martha; Del Debbio, Luigi; Engelhardt, Michael; Green, Jeremy; Gupta, Rajan; Harland-Lang, Lucian A.; Ishikawa, Tomomi; Kusina, Aleksander; Liu, Keh-Fei; Liuti, Simonetta; Monahan, Christopher; Nadolsky, Pavel; Qiu, Jian-Wei; Schienbein, Ingo; Schierholz, Gerrit; Thorne, Robert S.; Vogelsang, Werner; Wittig, Hartmut; Yuan, C.-P.; Zanotti, James
2018-05-01
In the framework of quantum chromodynamics (QCD), parton distribution functions (PDFs) quantify how the momentum and spin of a hadron are divided among its quark and gluon constituents. Two main approaches exist to determine PDFs. The first approach, based on QCD factorization theorems, realizes a QCD analysis of a suitable set of hard-scattering measurements, often using a variety of hadronic observables. The second approach, based on first-principle operator definitions of PDFs, uses lattice QCD to compute directly some PDF-related quantities, such as their moments. Motivated by recent progress in both approaches, in this document we present an overview of lattice-QCD and global-analysis techniques used to determine unpolarized and polarized proton PDFs and their moments. We provide benchmark numbers to validate present and future lattice-QCD calculations and we illustrate how they could be used to reduce the PDF uncertainties in current unpolarized and polarized global analyses. This document represents a first step towards establishing a common language between the two communities, to foster dialogue and to further improve our knowledge of PDFs.
Medical school benchmarking - from tools to programmes.
Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T
2015-02-01
Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.
Cousineau, Sarah M
2005-01-01
Space charge effects are a major contributor to beam halo and emittance growth leading to beam loss in high intensity, low energy accelerators. As future accelerators strive towards unprecedented levels of beam intensity and beam loss control, a more comprehensive understanding of space charge effects is required. A wealth of simulation tools have been developed for modeling beams in linacs and rings, and with the growing availability of high-speed computing systems, computationally expensive problems that were inconceivable a decade ago are now being handled with relative ease. This has opened the field for realistic simulations of space charge effects, including detailed benchmarks with experimental data. A great deal of effort is being focused in this direction, and several recent benchmark studies have produced remarkably successful results. This paper reviews the achievements in space charge benchmarking in the last few years, and discusses the challenges that remain.
Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions
Energy Technology Data Exchange (ETDEWEB)
Mathew, Paul; Sartor, Dale; Tschudi, William
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
Verification of HELIOS-MASTER system through benchmark of critical experiments
International Nuclear Information System (INIS)
Kim, H. Y.; Kim, K. Y.; Cho, B. O.; Lee, C. C.; Zee, S. O.
1999-01-01
The HELIOS-MASTER code system is verified through the benchmark of the critical experiments that were performed by RRC 'Kurchatov Institute' with water-moderated hexagonally pitched lattices of highly enriched Uranium fuel rods (80w/o). We also used the same input by using the MCNP code that was described in the evaluation report, and compared our results with those of the evaluation report. HELIOS, developed by Scandpower A/S, is a two-dimensional transport program for the generation of group cross-sections, and MASTER, developed by KAERI, is a three-dimensional nuclear design and analysis code based on the two-group diffusion theory. It solves neutronics model with the AFEN (Analytic Function Expansion Nodal) method for hexagonal geometry. The results show that the HELIOS-MASTER code system is fast and accurate enough to be used as nuclear core analysis tool for hexagonal geometry
VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4
Energy Technology Data Exchange (ETDEWEB)
Ellis, RJ
2001-02-02
The Task Force on Reactor-Based Plutonium Disposition, now an Expert Group, was set up through the Organization for Economic Cooperation and Development/Nuclear Energy Agency to facilitate technical assessments of burning weapons-grade plutonium mixed-oxide (MOX) fuel in U.S. pressurized-water reactors and Russian VVER nuclear reactors. More than ten countries participated to advance the work of the Task Force in a major initiative, which was a blind benchmark study to compare code benchmark calculations against experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At the Oak Ridge National Laboratory, the HELIOS-1.4 code was used to perform a comprehensive study of pin-cell and core calculations for the VENUS-2 benchmark.
Benchmarking and the laboratory
Galloway, M; Nadin, L
2001-01-01
This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112
Results of the ISPRS benchmark on urban object detection and 3D building reconstruction
Rottensteiner, F.; Sohn, G.; Gerke, M.; Wegner, J.D.; Breitkopf, U.; Jung, J.
2014-01-01
For more than two decades, many efforts have been made to develop methods for extracting urban objects from data acquired by airborne sensors. In order to make the results of such algorithms more comparable, benchmarking data sets are of paramount importance. Such a data set, consisting of airborne
Verification and validation benchmarks.
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, William Louis; Trucano, Timothy Guy
2007-02-01
Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
Review of lattice results concerning low energy particle physics
DEFF Research Database (Denmark)
Aoki, Sinya; Aoki, Yasumichi; Bernard, Claude
2014-01-01
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle physics community. More specifically, we report on the determination of the light-quark masses, the form factor f+(0), arising in semileptonic K -> pi transition...... Theory and review the determination of the BK parameter of neutral kaon mixing. The inclusion of heavy-quark quantities significantly expands the FLAG scope with respect to the previous review. Therefore, for this review, we focus on D- and B-meson decay constants, form factors, and mixing parameters...
Benchmark tests of JENDL-3.2 for thermal and fast reactors
International Nuclear Information System (INIS)
Takano, Hideki
1995-01-01
Benchmark calculations for a variety of thermal and fast reactors have been performed by using the newly evaluated JENDL-3 Version-2 (JENDL-3.2) file. In the thermal reactor calculations for the uranium and plutonium fueled cores of TRX and TCA, the k eff and lattice parameters were well predicted. The fast reactor calculations for ZPPR-9 and FCA assemblies showed that the k eff , reactivity worth of Doppler, sodium void and control rod, and reaction rate distribution were in a very good agreement with the experiments. (author)
Finite-lattice-spacing corrections to masses and g factors on a lattice
International Nuclear Information System (INIS)
Roskies, R.; Wu, J.C.
1986-01-01
We suggest an alternative method for extracting masses and g factors from lattice calculations. Our method takes account of more of the infrared and ultraviolet lattice effects. It leads to more reasonable results in simulations of QED on a lattice
Improvement of the instability of compressible lattice Boltzmann model by shockdetecting sensor
International Nuclear Information System (INIS)
Esfahanian, Vahid; Ghadyani, Mohsen
2015-01-01
Recently, lattice Boltzmann method (LBM) has drawn attention as an alternative and promising numerical technique for simulating fluid flows. The stability of LBM is a challenging problem in the simulation of compressible flows with different types of embedded discontinuities. This study, proposes a complementary scheme for simulating inviscid flows by a compressible lattice Boltzmann model in order to improve the instability using a shock-detecting procedure. The advantages and disadvantages of using a numerical hybrid filter on the primitive or conservative variables, in addition to, macroscopic or mesoscopic variables are investigated. The study demonstrates that the robustness of the utilized LB model is improved for inviscid compressible flows by implementation of the complementary scheme on mesoscopic variables. The validity of the procedure to capture shocks and resolve contact discontinuity and rarefaction waves in well-known benchmark problems is investigated. The numerical results show that the scheme is capable of generating more robust solutions in the simulation of compressible flows and prevents the formation of oscillations. Good agreements are obtained for all test cases.
Improvement of the instability of compressible lattice Boltzmann model by shockdetecting sensor
Energy Technology Data Exchange (ETDEWEB)
Esfahanian, Vahid [University of Tehran, Tehran (Iran, Islamic Republic of); Ghadyani, Mohsen [Islamic Azad University, Tehran (Iran, Islamic Republic of)
2015-05-15
Recently, lattice Boltzmann method (LBM) has drawn attention as an alternative and promising numerical technique for simulating fluid flows. The stability of LBM is a challenging problem in the simulation of compressible flows with different types of embedded discontinuities. This study, proposes a complementary scheme for simulating inviscid flows by a compressible lattice Boltzmann model in order to improve the instability using a shock-detecting procedure. The advantages and disadvantages of using a numerical hybrid filter on the primitive or conservative variables, in addition to, macroscopic or mesoscopic variables are investigated. The study demonstrates that the robustness of the utilized LB model is improved for inviscid compressible flows by implementation of the complementary scheme on mesoscopic variables. The validity of the procedure to capture shocks and resolve contact discontinuity and rarefaction waves in well-known benchmark problems is investigated. The numerical results show that the scheme is capable of generating more robust solutions in the simulation of compressible flows and prevents the formation of oscillations. Good agreements are obtained for all test cases.
Elimination of spurious lattice fermion solutions and noncompact lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Lee, T.D.
1997-09-22
It is well known that the Dirac equation on a discrete hyper-cubic lattice in D dimension has 2{sup D} degenerate solutions. The usual method of removing these spurious solutions encounters difficulties with chiral symmetry when the lattice spacing l {ne} 0, as exemplified by the persistent problem of the pion mass. On the other hand, we recall that in any crystal in nature, all the electrons do move in a lattice and satisfy the Dirac equation; yet there is not a single physical result that has ever been entangled with a spurious fermion solution. Therefore it should not be difficult to eliminate these unphysical elements. On a discrete lattice, particle hop from point to point, whereas in a real crystal the lattice structure in embedded in a continuum and electrons move continuously from lattice cell to lattice cell. In a discrete system, the lattice functions are defined only on individual points (or links as in the case of gauge fields). However, in a crystal the electron state vector is represented by the Bloch wave functions which are continuous functions in {rvec {gamma}}, and herein lies one of the essential differences.
A partial entropic lattice Boltzmann MHD simulation of the Orszag-Tang vortex
Flint, Christopher; Vahala, George
2018-02-01
Karlin has introduced an analytically determined entropic lattice Boltzmann (LB) algorithm for Navier-Stokes turbulence. Here, this is partially extended to an LB model of magnetohydrodynamics, on using the vector distribution function approach of Dellar for the magnetic field (which is permitted to have field reversal). The partial entropic algorithm is benchmarked successfully against standard simulations of the Orszag-Tang vortex [Orszag, S.A.; Tang, C.M. J. Fluid Mech. 1979, 90 (1), 129-143].
Benchmarking Evaluation Results for Prototype Extravehicular Activity Gloves
Aitchison, Lindsay; McFarland, Shane
2012-01-01
The Space Suit Assembly (SSA) Development Team at NASA Johnson Space Center has invested heavily in the advancement of rear-entry planetary exploration suit design but largely deferred development of extravehicular activity (EVA) glove designs, and accepted the risk of using the current flight gloves, Phase VI, for unique mission scenarios outside the Space Shuttle and International Space Station (ISS) Program realm of experience. However, as design reference missions mature, the risks of using heritage hardware have highlighted the need for developing robust new glove technologies. To address the technology gap, the NASA Game-Changing Technology group provided start-up funding for the High Performance EVA Glove (HPEG) Project in the spring of 2012. The overarching goal of the HPEG Project is to develop a robust glove design that increases human performance during EVA and creates pathway for future implementation of emergent technologies, with specific aims of increasing pressurized mobility to 60% of barehanded capability, increasing the durability by 100%, and decreasing the potential of gloves to cause injury during use. The HPEG Project focused initial efforts on identifying potential new technologies and benchmarking the performance of current state of the art gloves to identify trends in design and fit leading to establish standards and metrics against which emerging technologies can be assessed at both the component and assembly levels. The first of the benchmarking tests evaluated the quantitative mobility performance and subjective fit of four prototype gloves developed by Flagsuit LLC, Final Frontier Designs, LLC Dover, and David Clark Company as compared to the Phase VI. All of the companies were asked to design and fabricate gloves to the same set of NASA provided hand measurements (which corresponded to a single size of Phase Vi glove) and focus their efforts on improving mobility in the metacarpal phalangeal and carpometacarpal joints. Four test
Benchmarking Using Basic DBMS Operations
Crolotte, Alain; Ghazal, Ahmad
The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, the TPC-H became less representative of industry trends as vendors keep tuning their database to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
Comparison of typical inelastic analysis predictions with benchmark problem experimental results
International Nuclear Information System (INIS)
Clinard, J.A.; Corum, J.M.; Sartory, W.K.
1975-01-01
The results of exemplary inelastic analyses are presented for a series of experimental benchmark problems. Consistent analytical procedures and constitutive relations were used in each of the analyses, and published material behavior data were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for Type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)
Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions
Energy Technology Data Exchange (ETDEWEB)
Mathew, Paul; Greenberg, Steve; Sartor, Dale
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
Recent results of EPR and Moessbauer investigations on lattice dynamics in ammonium sulphate
Grecu, M N; Grecu, V V
2003-01-01
Recent results of the lattice dynamics investigation on ammonium sulfate are reported based on recent experiments carried out using using the non-destructive experimental technique of EPR and NGR. The main results confirm the presence and the contribution of a soft mode, which accompanied the paraferroelectric phase transition in the investigated crystal. (authors)
Verification and validation benchmarks
International Nuclear Information System (INIS)
Oberkampf, William Louis; Trucano, Timothy Guy
2007-01-01
Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
Verification and validation benchmarks
International Nuclear Information System (INIS)
Oberkampf, William L.; Trucano, Timothy G.
2008-01-01
Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the
Numisheet2005 Benchmark Analysis on Forming of an Automotive Underbody Cross Member: Benchmark 2
International Nuclear Information System (INIS)
Buranathiti, Thaweepat; Cao Jian
2005-01-01
This report presents an international cooperation benchmark effort focusing on simulations of a sheet metal stamping process. A forming process of an automotive underbody cross member using steel and aluminum blanks is used as a benchmark. Simulation predictions from each submission are analyzed via comparison with the experimental results. A brief summary of various models submitted for this benchmark study is discussed. Prediction accuracy of each parameter of interest is discussed through the evaluation of cumulative errors from each submission
Lattice Boltzmann methods for global linear instability analysis
Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis
2017-12-01
Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time and have been proposed previously in the literature as linearization of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flow and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point potential limitations particular to the LBM approach. The known issue of appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvements in order to make the proposed methodology competitive with established approaches for global instability analysis are discussed.
IRPhEP-handbook, International Handbook of Evaluated Reactor Physics Benchmark Experiments
International Nuclear Information System (INIS)
Sartori, Enrico; Blair Briggs, J.
2008-01-01
experimental series that were performed at 17 different reactor facilities. The Handbook is organized in a manner that allows easy inclusion of additional evaluations, as they become available. Additional evaluations are in progress and will be added to the handbook periodically. Content: FUND - Fundamental; GCR - Gas Cooled (Thermal) Reactor; HWR - Heavy Water Moderated Reactor; LMFR - Liquid Metal Fast Reactor; LWR - Light Water Moderated Reactor; PWR - Pressurized Water Reactor; VVER - VVER Reactor; Evaluations published as drafts 2 - Related Information: International Criticality Safety Benchmark Evaluation Project (ICSBEP); IRPHE/B and W-SS-LATTICE, Spectral Shift Reactor Lattice Experiments; IRPHE-JAPAN, Reactor Physics Experiments carried out in Japan ; IRPHE/JOYO MK-II, JOYO MK-II core management and characteristics database ; IRPhE/RRR-SEG, Reactor Physics Experiments from Fast-Thermal Coupled Facility; IRPHE-SNEAK, KFK SNEAK Fast Reactor Experiments, Primary Documentation ; IRPhE/STEK, Reactor Physics Experiments from Fast-Thermal Coupled Facility ; IRPHE-ZEBRA, AEEW Fast Reactor Experiments, Primary Documentation ; IRPHE-DRAGON-DPR, OECD High Temperature Reactor Dragon Project, Primary Documents; IRPHE-ARCH-01, Archive of HTR Primary Documents ; IRPHE/AVR, AVR High Temperature Reactor Experience, Archival Documentation ; IRPHE-KNK-II-ARCHIVE, KNK-II fast reactor documents, power history and measured parameters; IRPhE/BERENICE, effective delayed neutron fraction measurements ; IRPhE-TAPIRO-ARCHIVE, fast neutron source reactor primary documents, reactor physics experiments. The International Handbook of Evaluated Reactor Physics Benchmark Experiments was prepared by a working party comprised of experienced reactor physics personnel from Belgium, Brazil, Canada, P.R. of China, Germany, Hungary, Japan, Republic of Korea, Russian Federation, Switzerland, United Kingdom, and the United States of America. The IRPhEP Handbook is available to authorised requesters from the
ANN-Benchmarks: A Benchmarking Tool for Approximate Nearest Neighbor Algorithms
DEFF Research Database (Denmark)
Aumüller, Martin; Bernhardsson, Erik; Faithfull, Alexander
2017-01-01
This paper describes ANN-Benchmarks, a tool for evaluating the performance of in-memory approximate nearest neighbor algorithms. It provides a standard interface for measuring the performance and quality achieved by nearest neighbor algorithms on different standard data sets. It supports several...... visualise these as images, Open image in new window plots, and websites with interactive plots. ANN-Benchmarks aims to provide a constantly updated overview of the current state of the art of k-NN algorithms. In the short term, this overview allows users to choose the correct k-NN algorithm and parameters...... for their similarity search task; in the longer term, algorithm designers will be able to use this overview to test and refine automatic parameter tuning. The paper gives an overview of the system, evaluates the results of the benchmark, and points out directions for future work. Interestingly, very different...
The calculational VVER burnup Credit Benchmark No.3 results with the ENDF/B-VI rev.5 (1999)
Energy Technology Data Exchange (ETDEWEB)
Rodriguez Gual, Maritza [Centro de Tecnologia Nuclear, La Habana (Cuba). E-mail: mrgual@ctn.isctn.edu.cu
2000-07-01
The purpose of this papers to present the results of CB3 phase of the VVER calculational benchmark with the recent evaluated nuclear data library ENDF/B-VI Rev.5 (1999). This results are compared with the obtained from the other participants in the calculations (Czech Republic, Finland, Hungary, Slovaquia, Spain and the United Kingdom). The phase (CB3) of the VVER calculation benchmark is similar to the Phase II-A of the OECD/NEA/INSC BUC Working Group benchmark for PWR. The cases without burnup profile (BP) were performed with the WIMS/D-4 code. The rest of the cases have been carried with DOTIII discrete ordinates code. The neutron library used was the ENDF/B-VI rev. 5 (1999). The WIMS/D-4 (69 groups) is used to collapse cross sections from the ENDF/B-VI Rev. 5 (1999) to 36 groups working library for 2-D calculations. This work also comprises the results of CB1 (obtained with ENDF/B-VI rev. 5 (1999), too) and CB3 for cases with Burnup of 30 MWd/TU and cooling time of 1 and 5 years and for case with Burnup of 40 MWd/TU and cooling time of 1 year. (author)
Benchmarking Swiss electricity grids
International Nuclear Information System (INIS)
Walti, N.O.; Weber, Ch.
2001-01-01
This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed and the results of the pilot study are presented that are to form the basis of benchmarking procedures for the grid regulation authorities under the planned Switzerland's electricity market law. Examples of the practical use of the benchmarking methods are given and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article
Benchmarking in Czech Higher Education
Plaček Michal; Ochrana František; Půček Milan
2015-01-01
The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Base...
Chen, Yuntian; Zhang, Yan; Femius Koenderink, A
2017-09-04
We study semi-analytically the light emission and absorption properties of arbitrary stratified photonic structures with embedded two-dimensional magnetoelectric point scattering lattices, as used in recent plasmon-enhanced LEDs and solar cells. By employing dyadic Green's function for the layered structure in combination with the Ewald lattice summation to deal with the particle lattice, we develop an efficient method to study the coupling between planar 2D scattering lattices of plasmonic, or metamaterial point particles, coupled to layered structures. Using the 'array scanning method' we deal with localized sources. Firstly, we apply our method to light emission enhancement of dipole emitters in slab waveguides, mediated by plasmonic lattices. We benchmark the array scanning method against a reciprocity-based approach to find that the calculated radiative rate enhancement in k-space below the light cone shows excellent agreement. Secondly, we apply our method to study absorption-enhancement in thin-film solar cells mediated by periodic Ag nanoparticle arrays. Lastly, we study the emission distribution in k-space of a coupled waveguide-lattice system. In particular, we explore the dark mode excitation on the plasmonic lattice using the so-called array scanning method. Our method could be useful for simulating a broad range of complex nanophotonic structures, i.e., metasurfaces, plasmon-enhanced light emitting systems and photovoltaics.
The impact of ENDF/B-VI Rev. 3 data on thermal reactor lattices
International Nuclear Information System (INIS)
Trkov, A.
1995-10-01
The ENDF/B-VI Revision 3 files have been released through the International Atomic Energy Agency. The data for hydrogen, aluminium and uranium-235 were processed to prepare an updated WIMS-D library. Thermal benchmark lattices TRX, BAPL and DIMPLE were analyzed. The new data for the thermal scattering laws of hydrogen bound in water had no significant influence on the integral parameters. The effect of the new uranium-235 data was to reduce the lattice multiplication factor by up to 0.3% Δ k/k. The effect of the new aluminium data was also non-negligible. It was traced to the change in the interpolation law for the total and the capture cross sections, which seems incorrect. (author). 8 refs, 1 fig., 2 tabs
IRPHE/B and W-SS-LATTICE, Spectral Shift Reactor Lattice Experiments
International Nuclear Information System (INIS)
2003-01-01
Description: B and W has performed and analysed a series of physics experiments basically concerned with the technology of heterogeneous reactors moderated and cooled by a variable mixture of heavy and light water. A reactor so moderated is termed Spectral Shift Control Reactor (S SCR). In the practical application of this concept, the moderator mixture is rich in heavy water at the beginning of core life, so a relatively large fraction of the neutrons are epithermal and are absorbed in the fertile material. As fuel is consumed, the moderator is diluted with light water. In this way the neutron spectrum is shifted, thereby increasing the proportion of thermal neutrons and the reactivity of the system. The general objective of the S SCR Basic Physics Program was to study the nuclear properties of rod lattices moderated by D 2 O-H 2 O mixtures. The volume ratio of moderator to non-moderator in all lattices was approximately 1.0, and the fuel was either 4%-enriched UO 2 clad in stainless steel or 93%-enriched UO 2 -ThO 2 (Nth/N 15) pellets clad in aluminum. The D 2 O concentration in the moderator ranged from zero to about 90 mole %. The experimental program includes critical experiments with both types of fuel, exponential experiments at room temperature with both types of fuel, exponential experiments at elevated temperatures with the 4%-enriched UO 2 fuel, and neutron age measurements in ThO 2 lattices. The theoretical program included the development of calculation methods applicable to these systems, and the analysis and correlation of the experimental data. A first report provides the results of critical experiments performed under the Spectral Shift Control Reactor Basic Physics Program. A second report documents experimental results and theoretical interpretation of a series of twenty uniform lattice critical experiments in which the neutron spectrum is varied over a fairly broad range. A third report addresses issues that bear on the problems associated with
Comparison of typical inelastic analysis predictions with benchmark problem experimental results
International Nuclear Information System (INIS)
Clinard, J.A.; Corum, J.M.; Sartory, W.K.
1975-01-01
The results of exemplary inelastic analyses for experimental benchmark problems on reactor components are presented. Consistent analytical procedures and constitutive relations were used in each of the analyses, and the material behavior data presented in the Appendix were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses for the types of problems discussed, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)
Radiation Detection Computational Benchmark Scenarios
Energy Technology Data Exchange (ETDEWEB)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for
Benchmarks for GADRAS performance validation
International Nuclear Information System (INIS)
Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.
2009-01-01
The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.
Benchmarking in Czech Higher Education
Directory of Open Access Journals (Sweden)
Plaček Michal
2015-12-01
Full Text Available The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which lie in the design of a model of a collaborative benchmarking for Czech economics and management in higher-education programs. Because the fully complex model cannot be implemented immediately – which is also confirmed by structured interviews with academics who have practical experience with benchmarking –, the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.
Better than $1/Mflops substained: a scalable PC-based parallel computer for lattice QCD
International Nuclear Information System (INIS)
Fodor, Z.; Papp, G.
2002-02-01
We study the feasibility of a PC-based parallel computer for medium to large scale lattice QCD simulations. Our cluster built at the Eoetvoes Univ., Inst. Theor. Phys. consists of 137 Intel P4-1.7 GHz nodes with 512 MB RDRAM. The 32-bit, single precision sustained performance for dynamical QCD without communication is 1510 Mflops/node with Wilson and 970 Mflops/node with staggered fermions. This gives a total performance of 208 Gflops for Wilson and 133 Gflops for staggered QCD, respectively (for 64-bit applications the performance is approximately halved). The novel feature of our system is its communication architecture. In order to have a scalable, cost-effective machine we use Gigabit Ethernet cards for nearest-neighbor communications in a two-dimensional mesh. This type of communication is cost effective (only 30% of the hardware costs is spent on the communication). According to our benchmark measurements this type of communication results in around 40% communication time fraction for lattices upto 48 3 . 96 in full QCD simulations. The price/sustained-perfomance ratio for full QCD is better than $1/Mflops for Wilson (and around $1.5/Mflops for staggered) quarks for practically any lattice size, which can fit in our parallel computer. (orig.)
International Nuclear Information System (INIS)
Catterall, Simon
2013-01-01
Discretization of supersymmetric theories is an old problem in lattice field theory. It has resisted solution until quite recently when new ideas drawn from orbifold constructions and topological field theory have been brought to bear on the question. The result has been the creation of a new class of lattice gauge theory in which the lattice action is invariant under one or more supersymmetries. The resultant theories are local and free of doublers and in the case of Yang-Mills theories also possess exact gauge invariance. In principle they form the basis for a truly non-perturbative definition of the continuum supersymmetric field theory. In this talk these ideas are reviewed with particular emphasis being placed on N = 4 super Yang-Mills theory.
Lattice QCD results for the HVP contribution to the anomalous magnetic moments of leptons
Directory of Open Access Journals (Sweden)
Borsanyi Szabolcs
2018-01-01
Full Text Available We present lattice QCD results by the Budapest-Marseille-Wuppertal (BMW Collaboration for the leading-order contribution of the hadron vacuum polarization (LOHVP to the anomalous magnetic moments of all charged leptons. Calculations are performed with u, d, s and c quarks at their physical masses, in volumes of linear extent larger than 6 fm, and at six values of the lattice spacing, allowing for controlled continuum extrapolations. All connected and disconnected contributions are calculated for not only the muon but also the electron and tau anomalous magnetic moments. Systematic uncertainties are thoroughly discussed and comparisons with other calculations and phenomenological estimates are made.
Lattice QCD results for the HVP contribution to the anomalous magnetic moments of leptons
2018-03-01
We present lattice QCD results by the Budapest-Marseille-Wuppertal (BMW) Collaboration for the leading-order contribution of the hadron vacuum polarization (LOHVP) to the anomalous magnetic moments of all charged leptons. Calculations are performed with u, d, s and c quarks at their physical masses, in volumes of linear extent larger than 6 fm, and at six values of the lattice spacing, allowing for controlled continuum extrapolations. All connected and disconnected contributions are calculated for not only the muon but also the electron and tau anomalous magnetic moments. Systematic uncertainties are thoroughly discussed and comparisons with other calculations and phenomenological estimates are made.
Infinite projected entangled-pair state algorithm for ruby and triangle-honeycomb lattices
Jahromi, Saeed S.; Orús, Román; Kargarian, Mehdi; Langari, Abdollah
2018-03-01
The infinite projected entangled-pair state (iPEPS) algorithm is one of the most efficient techniques for studying the ground-state properties of two-dimensional quantum lattice Hamiltonians in the thermodynamic limit. Here, we show how the algorithm can be adapted to explore nearest-neighbor local Hamiltonians on the ruby and triangle-honeycomb lattices, using the corner transfer matrix (CTM) renormalization group for 2D tensor network contraction. Additionally, we show how the CTM method can be used to calculate the ground-state fidelity per lattice site and the boundary density operator and entanglement entropy (EE) on an infinite cylinder. As a benchmark, we apply the iPEPS method to the ruby model with anisotropic interactions and explore the ground-state properties of the system. We further extract the phase diagram of the model in different regimes of the couplings by measuring two-point correlators, ground-state fidelity, and EE on an infinite cylinder. Our phase diagram is in agreement with previous studies of the model by exact diagonalization.
Analysis of the international criticality benchmark no 19 of a realistic fuel dissolver
International Nuclear Information System (INIS)
Smith, H.J.; Santamarina, A.
1991-01-01
The dispersion of the order of 12000 pcm in the results of the international criticality fuel dissolver benchmark calculation, exercise OECD/19, showed the necessity of analysing the calculational methods used in this case. The APOLLO/PIC method developed to treat this type of problem permits us to propose international reference values. The problem studied here, led us to investigate two supplementary parameters in addition to the double heterogeneity of the fuel: the reactivity variation as a function of moderation and the effects of the size of the fuel pellets during dissolution. The following conclusions were obtained: The fast cross-section sets used by the international SCALE package introduces a bias of - 3000 pcm in undermoderated lattices. More generally, the fast and resonance nuclear data in criticality codes are not sufficiently reliable. Geometries with micro-pellets led to an underestimation of reactivity at the end of dissolution of 3000 pcm in certain 1988 Sn calculations; this bias was avoided in the up-dated 1990 computation because of a correct use of calculation tools. The reactivity introduced by the dissolved fuel is underestimated by 3000 pcm in contributions based on the standard NITAWL module in the SCALE code. More generally, the neutron balance analysis pointed out that standard ND self shielding formalism cannot account for 238 U resonance mutual self-shielding in the pellet-fissile liquor interaction. The combination of these three types of bias explain the underestimation of all of the international contributions of the reactivity of dissolver lattices by -2000 to -6000 pcm. The improved 1990 calculations confirm the need to use rigorous methods in the calculation of systems which involve the fuel double heterogeneity. This study points out the importance of periodic benchmarking exercises for probing the efficacity of criticality codes, data libraries and the users
International Nuclear Information System (INIS)
Creutz, M.
1983-04-01
In the last few years lattice gauge theory has become the primary tool for the study of nonperturbative phenomena in gauge theories. The lattice serves as an ultraviolet cutoff, rendering the theory well defined and amenable to numerical and analytical work. Of course, as with any cutoff, at the end of a calculation one must consider the limit of vanishing lattice spacing in order to draw conclusions on the physical continuum limit theory. The lattice has the advantage over other regulators that it is not tied to the Feynman expansion. This opens the possibility of other approximation schemes than conventional perturbation theory. Thus Wilson used a high temperature expansion to demonstrate confinement in the strong coupling limit. Monte Carlo simulations have dominated the research in lattice gauge theory for the last four years, giving first principle calculations of nonperturbative parameters characterizing the continuum limit. Some of the recent results with lattice calculations are reviewed
Toxicological Benchmarks for Wildlife
Energy Technology Data Exchange (ETDEWEB)
Sample, B.E. Opresko, D.M. Suter, G.W.
1993-01-01
Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, based toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
Shielding Benchmark Computational Analysis
International Nuclear Information System (INIS)
Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.
2000-01-01
Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)
International Nuclear Information System (INIS)
Boyarinov, V. F.; Bryzgalov, V. I.; Davidenko, V. D.; Fomichenko, P. A.; Glushkov, E. S.; Gomin, E. A.; Gurevich, M. I.; Kodochigov, N. G.; Marova, E. V.; Mitenkova, E. F.; Novikov, N. V.; Osipov, S. L.; Sukharev, Y. P.; Tsibulsky, V. F.; Yudkevich, M. S.
2008-01-01
The paper presents a description of benchmark cases, achieved results, analysis of possible reasons of differences of calculation results obtained by various neutronic codes. The comparative analysis is presented showing the benchmark-results obtained with reference and design codes by Russian specialists (WIMS-D, JAR-HTGR, UNK, MCU, MCNP5-MONTEBURNS1.0-ORIGEN2.0), by French specialists (AP0LL02, TRIP0LI4 codes), and by Korean specialists (HELIOS, MASTER, MCNP5 codes). The analysis of possible reasons for deviations was carried out, which was aimed at the decrease of uncertainties in calculated characteristics. This additional investigation was conducted with the use of 2D models of a fuel assembly cell and a reactor plane section. (authors)
The Isprs Benchmark on Indoor Modelling
Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.
2017-09-01
Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: html"target="_blank">http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
Criticality safety benchmarking of PASC-3 and ECNJEF1.1
International Nuclear Information System (INIS)
Li, J.
1992-09-01
To validate the code system PASC-3 and the multigroup cross section library ECNJEF1.1 on various applications many benchmarks are required. This report presents the results of critically safety benchmarking for five calculational and four experimental benchmarks. These benchmarks are related to the transport package of fissile materials such as spent fuel. The fissile nuclides in these benchmarks are 235 U and 239 Pu. The modules of PASC-3 which have been used for the calculations are BONAMI, NITAWL and KENO.5A. The final results for the experimental benchmarks do agree well with experimental data. For the calculational benchmarks the results presented here are in reasonable agreement with the results from other investigations. (author). 8 refs.; 20 figs.; 5 tabs
Benchmarking the energy efficiency of commercial buildings
International Nuclear Information System (INIS)
Chung, William; Hui, Y.V.; Lam, Y. Miu
2006-01-01
Benchmarking energy-efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed in a simple benchmark table (percentile table) of energy use, which is normalized with floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method
Full sphere hydrodynamic and dynamo benchmarks
Marti, P.
2014-01-26
Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that alloweasy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.
Benchmarking in University Toolbox
Directory of Open Access Journals (Sweden)
Katarzyna Kuźmicz
2015-06-01
Full Text Available In the face of global competition and rising challenges that higher education institutions (HEIs meet, it is imperative to increase innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess institution’s competitive position and learn from the best in order to improve. The primary purpose of the paper is to present in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains detailed examination of types, approaches and scope of benchmarking initiatives. The thorough insight of benchmarking applications enabled developing classification of benchmarking undertakings in HEIs. The paper includes review of the most recent benchmarking projects and relating them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity. The presented examples were chosen in order to exemplify different approaches to benchmarking in higher education setting. The study was performed on the basis of the published reports from benchmarking projects, scientific literature and the experience of the author from the active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.
On Traveling Waves in Lattices: The Case of Riccati Lattices
Dimitrova, Zlatinka
2012-09-01
The method of simplest equation is applied for analysis of a class of lattices described by differential-difference equations that admit traveling-wave solutions constructed on the basis of the solution of the Riccati equation. We denote such lattices as Riccati lattices. We search for Riccati lattices within two classes of lattices: generalized Lotka-Volterra lattices and generalized Holling lattices. We show that from the class of generalized Lotka-Volterra lattices only the Wadati lattice belongs to the class of Riccati lattices. Opposite to this many lattices from the Holling class are Riccati lattices. We construct exact traveling wave solutions on the basis of the solution of Riccati equation for three members of the class of generalized Holling lattices.
Vortex lattices in layered superconductors
International Nuclear Information System (INIS)
Prokic, V.; Davidovic, D.; Dobrosavljevic-Grujic, L.
1995-01-01
We study vortex lattices in a superconductor--normal-metal superlattice in a parallel magnetic field. Distorted lattices, resulting from the shear deformations along the layers, are found to be unstable. Under field variation, nonequilibrium configurations undergo an infinite sequence of continuous transitions, typical for soft lattices. The equilibrium vortex arrangement is always a lattice of isocell triangles, without shear
LATTICE: an interactive lattice computer code
International Nuclear Information System (INIS)
Staples, J.
1976-10-01
LATTICE is a computer code which enables an interactive user to calculate the functions of a synchrotron lattice. This program satisfies the requirements at LBL for a simple interactive lattice program by borrowing ideas from both TRANSPORT and SYNCH. A fitting routine is included
Directory of Open Access Journals (Sweden)
Wiji Suwarno
2017-02-01
Full Text Available The term benchmarking has been encountered in the implementation of total quality (TQM or in Indonesian termed holistic quality management because benchmarking is a tool to look for ideas or learn from the library. Benchmarking is a processof measuring and comparing for continuous business process of systematic and continuous measurement, the process of measuring and comparing for continuous business process of an organization to get information that can help these organization improve their performance efforts.
Exact results on the one-dimensional Potts lattice gas
International Nuclear Information System (INIS)
Riera, R.; Chaves, C.M.G.F.
1982-12-01
An exact calculation of the Potts Lattice Gas in one dimension is presented. Close to T=O 0 K, the uniform susceptibility presents an essencial singularity, when the excharge parameter is positive, and a power law behaviour with critical exponent γ=1, when this parameter is negative. (Author) [pt
Exact results on the one-dimensional Potts lattice gas
International Nuclear Information System (INIS)
Riera, R.; Chaves, C.M.G.F.
1983-01-01
An exact calculation of the Potts Lattice Gas in one dimension is presented. Close to T=O 0 K, the uniform susceptibility presents an essential singularity, when the exchange parameter is positive, and a power law behaviour with critical exponent γ=1, when this parameter is negative. (Author) [pt
FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark
International Nuclear Information System (INIS)
Sawan, M.E.
1994-12-01
During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)
Entropic Lattice Boltzmann: an implicit Large-Eddy Simulation?
Tauzin, Guillaume; Biferale, Luca; Sbragaglia, Mauro; Gupta, Abhineet; Toschi, Federico; Ehrhardt, Matthias; Bartel, Andreas
2017-11-01
We study the modeling of turbulence implied by the unconditionally stable Entropic Lattice Boltzmann Method (ELBM). We first focus on 2D homogeneous turbulence, for which we conduct numerical simulations for a wide range of relaxation times τ. For these simulations, we analyze the effective viscosity obtained by numerically differentiating the kinetic energy and enstrophy balance equations averaged over sub-domains of the computational grid. We aim at understanding the behavior of the implied sub-grid scale model and verify a formulation previously derived using Chapman-Enskog expansion. These ELBM benchmark simulations are thus useful to understand the range of validity of ELBM as a turbulence model. Finally, we will discuss an extension of the previously obtained results to the 3D case. Supported by the European Unions Framework Programme for Research and Innovation Horizon 2020 (2014-2020) under the Marie Sklodowska-Curie Grant Agreement No. 642069 and by the European Research Council under the ERC Grant Agreement No. 339032.
International Nuclear Information System (INIS)
Shindler, A.
2007-07-01
I review the theoretical foundations, properties as well as the simulation results obtained so far of a variant of the Wilson lattice QCD formulation: Wilson twisted mass lattice QCD. Emphasis is put on the discretization errors and on the effects of these discretization errors on the phase structure for Wilson-like fermions in the chiral limit. The possibility to use in lattice simulations different lattice actions for sea and valence quarks to ease the renormalization patterns of phenomenologically relevant local operators, is also discussed. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Shindler, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC
2007-07-15
I review the theoretical foundations, properties as well as the simulation results obtained so far of a variant of the Wilson lattice QCD formulation: Wilson twisted mass lattice QCD. Emphasis is put on the discretization errors and on the effects of these discretization errors on the phase structure for Wilson-like fermions in the chiral limit. The possibility to use in lattice simulations different lattice actions for sea and valence quarks to ease the renormalization patterns of phenomenologically relevant local operators, is also discussed. (orig.)
Benchmarking and Performance Measurement.
Town, J. Stephen
This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…
WWER-1000 Burnup Credit Benchmark (CB5)
International Nuclear Information System (INIS)
Manolova, M.A.
2002-01-01
In the paper the specification of WWER-1000 Burnup Credit Benchmark first phase (depletion calculations), given. The second phase - criticality calculations for the WWER-1000 fuel pin cell, will be given after the evaluation of the results, obtained at the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field (Author)
Analysis of a computational benchmark for a high-temperature reactor using SCALE
International Nuclear Information System (INIS)
Goluoglu, S.
2006-01-01
Several proposed advanced reactor concepts require methods to address effects of double heterogeneity. In doubly heterogeneous systems, heterogeneous fuel particles in a moderator matrix form the fuel region of the fuel element and thus constitute the first level of heterogeneity. Fuel elements themselves are also heterogeneous with fuel and moderator or reflector regions, forming the second level of heterogeneity. The fuel elements may also form regular or irregular lattices. A five-phase computational benchmark for a high-temperature reactor (HTR) fuelled with uranium or reactor-grade plutonium has been defined by the Organization for Economic Cooperation and Development, Nuclear Energy Agency (OECD NEA), Nuclear Science Committee, Working Party on the Physics of Plutonium Fuels and Innovative Fuel Cycles. This paper summarizes the analysis results using the latest SCALE code system (to be released in CY 2006 as SCALE 5.1). (authors)
Benchmarking gate-based quantum computers
Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans
2017-11-01
With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
A Heterogeneous Medium Analytical Benchmark
International Nuclear Information System (INIS)
Ganapol, B.D.
1999-01-01
A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions which are kernels to coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate benchmark qualities (4- to 5- places of accuracy) results
Benchmarking in the Netherlands
International Nuclear Information System (INIS)
1999-01-01
In two articles an overview is given of the activities in the Dutch industry and energy sector with respect to benchmarking. In benchmarking operational processes of different competitive businesses are compared to improve your own performance. Benchmark covenants for energy efficiency between the Dutch government and industrial sectors contribute to a growth of the number of benchmark surveys in the energy intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies
Numerical study of natural convection in porous media (metals) using Lattice Boltzmann Method (LBM)
Energy Technology Data Exchange (ETDEWEB)
Zhao, C.Y., E-mail: c.y.zhao@warwick.ac.u [School of Engineering, University of Warwick, Coventry CV4 7AL (United Kingdom); School of Energy and Power Engineering, Xi' an Jiaotong University, Xi' an, Shaanxi 710049 (China); Dai, L.N.; Tang, G.H.; Qu, Z.G.; Li, Z.Y. [School of Energy and Power Engineering, Xi' an Jiaotong University, Xi' an, Shaanxi 710049 (China)
2010-10-15
A thermal lattice BGK model with doubled populations is proposed to simulate the two-dimensional natural convection flow in porous media (porous metals). The accuracy of this method is validated by the benchmark solutions. The detailed flow and heat transfer at the pore level are revealed. The effects of pore density (cell size) and porosity on the natural convection are examined. Also the effect of porous media configuration (shape) on natural convection is investigated. The results showed that the overall heat transfer will be enhanced by lowering the porosity and cell size. The square porous medium can have a higher heat transfer performance than spheres due to the strong flow mixing and more surface area.
ZZ ECN-BUBEBO, ECN-Petten Burnup Benchmark Book, Inventories, Afterheat
International Nuclear Information System (INIS)
Kloosterman, Jan Leen
1999-01-01
Description of program or function: Contains experimental benchmarks which can be used for the validation of burnup code systems and accompanied data libraries. Although the benchmarks presented here are thoroughly described in literature, it is in many cases not straightforward to retrieve unambiguously the correct input data and corresponding results from the benchmark Descriptions. Furthermore, results which can easily be measured, are sometimes difficult to calculate because of conversions to be made. Therefore, emphasis has been put to clarify the input of the benchmarks and to present the benchmark results in such a way that they can easily be calculated and compared. For more thorough Descriptions of the benchmarks themselves, the literature referred to here should be consulted. This benchmark book is divided in 11 chapters/files containing the following in text and tabular form: chapter 1: Introduction; chapter 2: Burnup Credit Criticality Benchmark Phase 1-B; chapter 3: Yankee-Rowe Core V Fuel Inventory Study; chapter 4: H.B. Robinson Unit 2 Fuel Inventory Study; chapter 5: Turkey Point Unit 3 Fuel Inventory Study; chapter 6: Turkey Point Unit 3 Afterheat Power Study; chapter 7: Dickens Benchmark on Fission Product Energy Release of U-235; chapter 8: Dickens Benchmark on Fission Product Energy Release of Pu-239; chapter 9: Yarnell Benchmark on Decay Heat Measurements of U-233; chapter 10: Yarnell Benchmark on Decay Heat Measurements of U-235; chapter 11: Yarnell Benchmark on Decay Heat Measurements of Pu-239
Energy Technology Data Exchange (ETDEWEB)
Mackenzie, Paul
1989-03-15
The forty-year dream of understanding the properties of the strongly interacting particles from first principles is now approaching reality. Quantum chromodynamics (QCD - the field theory of the quark and gluon constituents of strongly interacting particles) was initially handicapped by the severe limitations of the conventional (perturbation) approach in this picture, but Ken Wilson's inventions of lattice gauge theory and renormalization group methods opened new doors, making calculations of masses and other particle properties possible. Lattice gauge theory became a major industry around 1980, when Monte Carlo methods were introduced, and the first prototype calculations yielded qualitatively reasonable results. The promising developments over the past year were highlighted at the 1988 Symposium on Lattice Field Theory - Lattice 88 - held at Fermilab.
International Nuclear Information System (INIS)
Mackenzie, Paul
1989-01-01
The forty-year dream of understanding the properties of the strongly interacting particles from first principles is now approaching reality. Quantum chromodynamics (QCD - the field theory of the quark and gluon constituents of strongly interacting particles) was initially handicapped by the severe limitations of the conventional (perturbation) approach in this picture, but Ken Wilson's inventions of lattice gauge theory and renormalization group methods opened new doors, making calculations of masses and other particle properties possible. Lattice gauge theory became a major industry around 1980, when Monte Carlo methods were introduced, and the first prototype calculations yielded qualitatively reasonable results. The promising developments over the past year were highlighted at the 1988 Symposium on Lattice Field Theory - Lattice 88 - held at Fermilab
Benchmarking infrastructure for mutation text mining
2014-01-01
Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600
Review of lattice results concerning low-energy particle physics
DEFF Research Database (Denmark)
Aoki, Sinya; Aoki, Yasumichi; Bečirević, D.
2017-01-01
We review lattice results related to pion, kaon, D- and B-meson physics with the aim of making them easily accessible to the particle-physics community. More specifically, we report on the determination of the light-quark masses, the form factor f+(0) , arising in the semileptonic K→ π transition...... review the determination of the BK parameter of neutral kaon mixing as well as the additional four B parameters that arise in theories of physics beyond the Standard Model. The latter quantities are an addition compared to the previous review. For the heavy-quark sector, we provide results for mc and mb...... (also new compared to the previous review), as well as those for D- and B-meson-decay constants, form factors, and mixing parameters. These are the heavy-quark quantities most relevant for the determination of CKM matrix elements and the global CKM unitarity-triangle fit. Finally, we review the status...
Salomons, Erik M; Lohman, Walter J A; Zhou, Han
2016-01-01
Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equations of acoustics. It is found that the LBM works well for sound waves, but dissipation of sound waves with the LBM is generally much larger than real dissipation of sound waves in air. To circumvent this problem it is proposed here to use the LBM for assessing the excess sound level, i.e. the difference between the sound level and the free-field sound level. The effect of dissipation on the excess sound level is much smaller than the effect on the sound level, so the LBM can be used to estimate the excess sound level for a non-dissipative atmosphere, which is a useful quantity in atmospheric acoustics. To reduce dissipation in an LBM simulation two approaches are considered: i) reduction of the kinematic viscosity and ii) reduction of the lattice spacing.
Benchmarking electricity distribution
Energy Technology Data Exchange (ETDEWEB)
Watts, K. [Department of Justice and Attorney-General, QLD (Australia)
1995-12-31
Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.
Jet Substructure at the Tevatron and LHC: New results, new tools, new benchmarks
Altheimer, A; Asquith, L; Brooijmans, G; Butterworth, J; Campanelli, M; Chapleau, B; Cholakian, A E; Chou, J P; Dasgupta, M; Davison, A; Dolen, J; Ellis, S D; Essig, R; Fan, J J; Field, R; Fregoso, A; Gallicchio, J; Gershtein, Y; Gomes, A; Haas, A; Halkiadakis, E; Halyo, V; Hoeche, S; Hook, A; Hornig, A; Huang, P; Izaguirre, E; Jankowiak, M; Kribs, G; Krohn, D; Larkoski, A J; Lath, A; Lee, C; Lee, S J; Loch, P; Maksimovic, P; Martinez, M; Miller, D W; Plehn, T; Prokofiev, K; Rahmat, R; Rappoccio, S; Safonov, A; Salam, G P; Schumann, S; Schwartz, M D; Schwartzman, A; Seymour, M; Shao, J; Sinervo, P; Son, M; Soper, D E; Spannowsky, M; Stewart, I W; Strassler, M; Strauss, E; Takeuchi, M; Thaler, J; Thomas, S; Tweedie, B; Vasquez Sierra, R; Vermilion, C K; Villaplana, M; Vos, M; Wacker, J; Walker, D; Walsh, J R; Wang, L-T; Wilbur, S; Yavin, I; Zhu, W
2012-01-01
In this report we review recent theoretical progress and the latest experimental results in jet substructure from the Tevatron and the LHC. We review the status of and outlook for calculation and simulation tools for studying jet substructure. Following up on the report of the Boost 2010 workshop, we present a new set of benchmark comparisons of substructure techniques, focusing on the set of variables and grooming methods that are collectively known as "top taggers". To facilitate further exploration, we have attempted to collect, harmonise, and publish software implementations of these techniques.
DEFF Research Database (Denmark)
Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela
This paper studies three related questions: To what extent otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm......, founders human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...
Energy Technology Data Exchange (ETDEWEB)
Freudenreich, W.E.; Gruppelaar, H
1998-12-01
This report contains the results of calculations made at ECN-Petten of a benchmark to study the neutronic potential of a modular fast spectrum ADS (Accelerator-Driven System) for radiotoxic waste transmutation. The study is focused on the incineration of TRans-Uranium elements (TRU), Minor Actinides (MA) and Long-Lived Fission Products (LLFP), in this case {sup 99}Tc. The benchmark exercise is made in the framework of an IAEA Co-ordinated Research Programme. A simplified description of an ADS, restricted to the reactor part, with TRU or MA fuel (k{sub eff}=0.96) has been analysed. All spectrum calculations have been performed with the Monte Carlo code MCNP-4A. The burnup calculations have been performed with the code FISPACT coupled to MCNP-4A by means of our OCTOPUS system. The cross sections are based upon JEF-2.2 for transport calculations and supplemented with EAF-4 data for inventory calculations. The determined quantities are: core dimensions, fuel inventories, system power, sensitivity on external source spectrum and waste transmutation rates. The main conclusions are: The MA-burner requires only a small accelerator current increase during burnup, in contrast to the TRU-burner. The {sup 99} Tc-burner has a large initial loading; a more effective design may be possible. 5 refs.
International Nuclear Information System (INIS)
Freudenreich, W.E.; Gruppelaar, H.
1998-12-01
This report contains the results of calculations made at ECN-Petten of a benchmark to study the neutronic potential of a modular fast spectrum ADS (Accelerator-Driven System) for radiotoxic waste transmutation. The study is focused on the incineration of TRans-Uranium elements (TRU), Minor Actinides (MA) and Long-Lived Fission Products (LLFP), in this case 99 Tc. The benchmark exercise is made in the framework of an IAEA Co-ordinated Research Programme. A simplified description of an ADS, restricted to the reactor part, with TRU or MA fuel (k eff =0.96) has been analysed. All spectrum calculations have been performed with the Monte Carlo code MCNP-4A. The burnup calculations have been performed with the code FISPACT coupled to MCNP-4A by means of our OCTOPUS system. The cross sections are based upon JEF-2.2 for transport calculations and supplemented with EAF-4 data for inventory calculations. The determined quantities are: core dimensions, fuel inventories, system power, sensitivity on external source spectrum and waste transmutation rates. The main conclusions are: The MA-burner requires only a small accelerator current increase during burnup, in contrast to the TRU-burner. The 99 Tc-burner has a large initial loading; a more effective design may be possible. 5 refs
Computing nucleon EDM on a lattice
Abramczyk, Michael; Aoki, Sinya; Blum, Tom; Izubuchi, Taku; Ohki, Hiroshi; Syritsyn, Sergey
2018-03-01
I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.
Computing nucleon EDM on a lattice
Energy Technology Data Exchange (ETDEWEB)
Abramczyk, Michael; Izubuchi, Taku
2017-06-18
I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.
Benchmarking semantic web technology
García-Castro, R
2009-01-01
This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:
Grenier, Christophe; Roux, Nicolas; Anbergen, Hauke; Collier, Nathaniel; Costard, Francois; Ferrry, Michel; Frampton, Andrew; Frederick, Jennifer; Holmen, Johan; Jost, Anne; Kokh, Samuel; Kurylyk, Barret; McKenzie, Jeffrey; Molson, John; Orgogozo, Laurent; Rivière, Agnès; Rühaak, Wolfram; Selroos, Jan-Olof; Therrien, René; Vidstrand, Patrik
2015-04-01
The impacts of climate change in boreal regions has received considerable attention recently due to the warming trends that have been experienced in recent decades and are expected to intensify in the future. Large portions of these regions, corresponding to permafrost areas, are covered by water bodies (lakes, rivers) that interact with the surrounding permafrost. For example, the thermal state of the surrounding soil influences the energy and water budget of the surface water bodies. Also, these water bodies generate taliks (unfrozen zones below) that disturb the thermal regimes of permafrost and may play a key role in the context of climate change. Recent field studies and modeling exercises indicate that a fully coupled 2D or 3D Thermo-Hydraulic (TH) approach is required to understand and model the past and future evolution of landscapes, rivers, lakes and associated groundwater systems in a changing climate. However, there is presently a paucity of 3D numerical studies of permafrost thaw and associated hydrological changes, and the lack of study can be partly attributed to the difficulty in verifying multi-dimensional results produced by numerical models. Numerical approaches can only be validated against analytical solutions for a purely thermic 1D equation with phase change (e.g. Neumann, Lunardini). When it comes to the coupled TH system (coupling two highly non-linear equations), the only possible approach is to compare the results from different codes to provided test cases and/or to have controlled experiments for validation. Such inter-code comparisons can propel discussions to try to improve code performances. A benchmark exercise was initialized in 2014 with a kick-off meeting in Paris in November. Participants from USA, Canada, Germany, Sweden and France convened, representing altogether 13 simulation codes. The benchmark exercises consist of several test cases inspired by existing literature (e.g. McKenzie et al., 2007) as well as new ones. They
DEFF Research Database (Denmark)
Peña, Alfredo
This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...
International Nuclear Information System (INIS)
Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.
1991-01-01
Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems
International Nuclear Information System (INIS)
Jersak, J.
1986-01-01
This year has brought a sudden interest in lattice Higgs models. After five years of only modest activity we now have many new results obtained both by analytic and Monte Carlo methods. This talk is a review of the present state of lattice Higgs models with particular emphasis on the recent development
Directory of Open Access Journals (Sweden)
Epelbaum E.
2010-04-01
Full Text Available We review recent progress on nuclear lattice simulations using chiral eﬀective ﬁeld theory. We discuss lattice results for dilute neutron matter at next-to-leading order, three-body forces at next-to-next-toleading order, isospin-breaking and Coulomb eﬀects, and the binding energy of light nuclei.
International Nuclear Information System (INIS)
Pesic, M.
1998-01-01
A selected set of the RB reactor benchmark cores is presented in this paper. The first results of validation of the well-known Monte Carlo MCNP TM code and adjoining neutron cross section libraries are given. They confirm the idea for the proposal of the new U-D 2 O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA Project: International Handbook of Evaluated Criticality Safety Experiment, in near future. (author)
Unquenched lattice upsilon spectroscopy
International Nuclear Information System (INIS)
Marcantonio, L.M.
2001-03-01
A non-relativistic effective theory of QCD (NRQCD) is used in calculations of the upsilon spectrum. Simultaneous multi-correlation fitting routines are used to yield lattice channel energies and amplitudes. The lattice configurations used were both dynamical, with two flavours of sea quarks included in the action; and quenched, with no sea quarks. These configurations were generated by the UKQCD collaboration. The dynamical configurations used were ''matched'', having the same lattice spacing, but differing in the sea quark mass. Thus, it was possible to analyse trends of observables with sea quark mass, in the certainty that the trend isn't partially due to varying lattice spacing. The lattice spacing used for spectroscopy was derived from the lattice 1 1 P 1 - 1 3 S 1 splitting. On each set of configurations two lattice bare b quark masses were used, giving kinetic masses bracketing the physical Υ mass. The only quantity showing a strong dependence on these masses was the hyperfine splitting, so it was interpolated to the real Υ mass. The radial and orbital splittings gave good agreement with experiment. The hyperfine splitting results showed a clear signal for unquenching and the dynamical hyperfine splitting results were extrapolated to a physical sea quark mass. This result, combined with the quenched result yielded a value for the hyperfine splitting at n f = 3, predicting an η b mass of 9.517(4) GeV. The NRQCD technique for obtaining a value of the strong coupling constant in the M-barS-bar scheme was followed. Using quenched and dynamical results a value was extrapolated to n f = 3. Employing a three loop beta function to run the coupling, with suitable matching conditions at heavy quark thresholds, the final result was obtained for n f = 5 at a scale equal to the Z boson mass. This result was α(5)/MS(Mz)=0.110(4). Two methods for finding the mass of the b quark in the MS scheme were employed. The results of both methods agree within error but the
Reactor fuel depletion benchmark of TINDER
International Nuclear Information System (INIS)
Martin, W.J.; Oliveira, C.R.E. de; Hecht, A.A.
2014-01-01
Highlights: • A reactor burnup benchmark of TINDER, coupling MCNP6 to CINDER2008, was performed. • TINDER is a poor candidate for fuel depletion calculations using its current libraries. • Data library modification is necessary if fuel depletion is desired from TINDER. - Abstract: Accurate burnup calculations are key to proper nuclear reactor design, fuel cycle modeling, and disposal estimations. The TINDER code, originally designed for activation analyses, has been modified to handle full burnup calculations, including the widely used predictor–corrector feature. In order to properly characterize the performance of TINDER for this application, a benchmark calculation was performed. Although the results followed the trends of past benchmarked codes for a UO 2 PWR fuel sample from the Takahama-3 reactor, there were obvious deficiencies in the final result, likely in the nuclear data library that was used. Isotopic comparisons versus experiment and past code benchmarks are given, as well as hypothesized areas of deficiency and future work
Energy Technology Data Exchange (ETDEWEB)
DeHart, Mark D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mausolff, Zander [Univ. of Florida, Gainesville, FL (United States); Weems, Zach [Univ. of Florida, Gainesville, FL (United States); Popp, Dustin [Univ. of Florida, Gainesville, FL (United States); Smith, Kristin [Univ. of Florida, Gainesville, FL (United States); Shriver, Forrest [Univ. of Florida, Gainesville, FL (United States); Goluoglu, Sedat [Univ. of Florida, Gainesville, FL (United States); Prince, Zachary [Texas A & M Univ., College Station, TX (United States); Ragusa, Jean [Texas A & M Univ., College Station, TX (United States)
2016-08-01
One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data has shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will valuable in preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake\\cite{Rattlesnake} and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validation of the transient solution of Rattlesnake using other time dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considered both two-dimensional (2D) and 3D configurations for a total number of 26 different transients. All are negative reactivity insertions, typically returning to the critical state after some time.
International Nuclear Information System (INIS)
DeHart, Mark D.; Mausolff, Zander; Weems, Zach; Popp, Dustin; Smith, Kristin; Shriver, Forrest; Goluoglu, Sedat; Prince, Zachary; Ragusa, Jean
2016-01-01
One goal of the MAMMOTH M&S project is to validate the analysis capabilities within MAMMOTH. Historical data has shown limited value for validation of full three-dimensional (3D) multi-physics methods. Initial analysis considered the TREAT startup minimum critical core and one of the startup transient tests. At present, validation is focusing on measurements taken during the M8CAL test calibration series. These exercises will valuable in preliminary assessment of the ability of MAMMOTH to perform coupled multi-physics calculations; calculations performed to date are being used to validate the neutron transport solver Rattlesnake\\citelesnake) and the fuels performance code BISON. Other validation projects outside of TREAT are available for single-physics benchmarking. Because the transient solution capability of Rattlesnake is one of the key attributes that makes it unique for TREAT transient simulations, validation of the transient solution of Rattlesnake using other time dependent kinetics benchmarks has considerable value. The Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development (OECD) has recently developed a computational benchmark for transient simulations. This benchmark considered both two-dimensional (2D) and 3D configurations for a total number of 26 different transients. All are negative reactivity insertions, typically returning to the critical state after some time.
Comparative analysis of exercise 2 results of the OECD WWER-1000 MSLB benchmark
International Nuclear Information System (INIS)
Kolev, N.; Petrov, N.; Royer, E.; Ivanov, B.; Ivanov, K.
2006-01-01
In the framework of joint effort between OECD/NEA, US DOE and CEA France a coupled three-dimensional (3D) thermal-hydraulic/neutron kinetics benchmark for WWER-1000 was defined. Phase 2 of this benchmark is labeled W1000CT-2 and consists of calculation of a vessel mixing experiment and main steam line break (MSLB) transients. The reference plant is Kozloduy-6 in Bulgaria. Plant data are available for code validation consisting of one experiment of pump start-up (W1000CT-1) and one experiment of steam generator isolation (W1000CT-2). The validated codes can be used to calculate asymmetric MSLB transients involving similar mixing patterns. This paper summarizes a comparison of the available results for W1000CT-2 Exercise 2 devoted to core-vessel calculation with imposed MSLB vessel boundary conditions. Because of the recent re-calculation of the cross-section libraries, core physics results from PARCS and CRONOS codes could be compared only. The comparison is code-to-code (including BIPR7A/TVS-M lib) and code vs. plant measured data in a steady state close to the MSLB initial state. The results provide a test of the cross-section libraries and show a good agreement of plant measured and computed data. The comparison of full vessel calculations was made from the point of view of vessel mixing, considering mainly the coarse-mesh features of the flow. The FZR and INRNE results from multi-1D calculations with different mixing models are similar, while the FZK calculations with a coarse-3D vessel model show deviations from the others. These deviations seem to be due to an error in the use of a boundary condition after flow reversal (Authors)
International Nuclear Information System (INIS)
Reitsma, F.; Tyobeka, B.
2010-01-01
The verification and validation of computer codes used in the analysis of high temperature gas cooled pebble bed reactor systems has not been an easy goal to achieve. A limited amount of tests and operating reactor measurements are available. Code-to-code comparisons for realistic pebble bed reactor designs often exhibit differences that are difficult to explain and are often blamed on the complexity of the core models or the variety of analysis methods and cross section data sets employed. For this reason, within the framework of the IAEA CRP5, the 'Pebble Box' benchmark was formulated as a simple way to compare various treatments of neutronics phenomena. The problem is comprised of six test cases which were designed to investigate the treatments and effects of leakage and heterogeneity. This paper presents the preliminary results of the benchmark exercise as received during the CRP and suggests possible future steps towards the resolution of discrepancies between the results. Although few participants took part in the benchmarking exercise, the results presented here show that there is still a need for further evaluation and in-depth understanding in order to build the confidence that all the different methods, codes and cross-section data sets have the capability to handle the various neutronics effects for such systems. (authors)
Computing Moment-Based Probability Tables for Self-Shielding Calculations in Lattice Codes
International Nuclear Information System (INIS)
Hebert, Alain; Coste, Mireille
2002-01-01
As part of the self-shielding model used in the APOLLO2 lattice code, probability tables are required to compute self-shielded cross sections for coarse energy groups (typically with 99 or 172 groups). This paper describes the replacement of the multiband tables (typically with 51 subgroups) with moment-based tables in release 2.5 of APOLLO2. An improved Ribon method is proposed to compute moment-based probability tables, allowing important savings in CPU resources while maintaining the accuracy of the self-shielding algorithm. Finally, a validation is presented where the absorption rates obtained with each of these techniques are compared with exact values obtained using a fine-group elastic slowing-down calculation in the resolved energy domain. Other results, relative to the Rowland's benchmark and to three assembly production cases, are also presented
International Nuclear Information System (INIS)
Di Renzo, F.; Onofri, E.; Marchesini, G.; Marenzoni, P.
1994-01-01
We describe a stochastic technique which allows one to compute numerically the coefficients of the weak-coupling perturbative expansion of any observable in Lattice Gauge Theory. The idea is to insert the exponential representation of the link variables U μ (x) →exp {A μ (x)/√(β)} into the Langevin algorithm and the observables and to perform the expansion in β -1/2 . The Langevin algorithm is converted into an infinite hierarchy of maps which can be exactly truncated at any order. We give the result for the simple plaquette of SU(3) up to fourth loop order (β -4 ) which extends by one loop the previously known series. ((orig.))
International Nuclear Information System (INIS)
Woloshyn, R.M.
1988-03-01
The basic concepts of the Lagrangian formulation of lattice field theory are discussed. The Wilson and staggered schemes for dealing with fermions on the lattice are described. Some recent results for hadron masses and vector and axial vector current matrix elements in lattice QCD are reviewed. (Author) (118 refs., 16 figs.)
Benchmark analysis of MCNP trademark ENDF/B-VI iron
International Nuclear Information System (INIS)
Court, J.D.; Hendricks, J.S.
1994-12-01
The MCNP ENDF/B-VI iron cross-section data was subjected to four benchmark studies as part of the Hiroshima/Nagasaki dose re-evaluation for the National Academy of Science and the Defense Nuclear Agency. The four benchmark studies were: (1) the iron sphere benchmarks from the Lawrence Livermore Pulsed Spheres; (2) the Oak Ridge National Laboratory Fusion Reactor Shielding Benchmark; (3) a 76-cm diameter iron sphere benchmark done at the University of Illinois; (4) the Oak Ridge National Laboratory Benchmark for Neutron Transport through Iron. MCNP4A was used to model each benchmark and computational results from the ENDF/B-VI iron evaluations were compared to ENDF/B-IV, ENDF/B-V, the MCNP Recommended Data Set (which includes Los Alamos National Laboratory Group T-2 evaluations), and experimental data. The results show that the ENDF/B-VI iron evaluations are as good as, or better than, previous data sets
International Nuclear Information System (INIS)
Hasenfratz, P.
1983-01-01
The author presents a general introduction to lattice gauge theories and discusses non-perturbative methods in the gauge sector. He then shows how the lattice works in obtaining the string tension in SU(2). Lattice QCD at finite physical temperature is discussed. Universality tests in SU(2) lattice QCD are presented. SU(3) pure gauge theory is briefly dealt with. Finally, fermions on the lattice are considered. (Auth.)
Benchmarking for Cost Improvement. Final report
Energy Technology Data Exchange (ETDEWEB)
1993-09-01
The US Department of Energy`s (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.
Lattice gauge theory using parallel processors
International Nuclear Information System (INIS)
Lee, T.D.; Chou, K.C.; Zichichi, A.
1987-01-01
The book's contents include: Lattice Gauge Theory Lectures: Introduction and Current Fermion Simulations; Monte Carlo Algorithms for Lattice Gauge Theory; Specialized Computers for Lattice Gauge Theory; Lattice Gauge Theory at Finite Temperature: A Monte Carlo Study; Computational Method - An Elementary Introduction to the Langevin Equation, Present Status of Numerical Quantum Chromodynamics; Random Lattice Field Theory; The GF11 Processor and Compiler; and The APE Computer and First Physics Results; Columbia Supercomputer Project: Parallel Supercomputer for Lattice QCD; Statistical and Systematic Errors in Numerical Simulations; Monte Carlo Simulation for LGT and Programming Techniques on the Columbia Supercomputer; Food for Thought: Five Lectures on Lattice Gauge Theory
IAEA sodium void reactivity benchmark calculations
International Nuclear Information System (INIS)
Hill, R.N.; Finck, P.J.
1992-01-01
In this paper, the IAEA-1 992 ''Benchmark Calculation of Sodium Void Reactivity Effect in Fast Reactor Core'' problem is evaluated. The proposed design is a large axially heterogeneous oxide-fueled fast reactor as described in Section 2; the core utilizes a sodium plenum above the core to enhance leakage effects. The calculation methods used in this benchmark evaluation are described in Section 3. In Section 4, the calculated core performance results for the benchmark reactor model are presented; and in Section 5, the influence of steel and interstitial sodium heterogeneity effects is estimated
High order spectral difference lattice Boltzmann method for incompressible hydrodynamics
Li, Weidong
2017-09-01
This work presents a lattice Boltzmann equation (LBE) based high order spectral difference method for incompressible flows. In the present method, the spectral difference (SD) method is adopted to discretize the convection and collision term of the LBE to obtain high order (≥3) accuracy. Because the SD scheme represents the solution as cell local polynomials and the solution polynomials have good tensor-product property, the present spectral difference lattice Boltzmann method (SD-LBM) can be implemented on arbitrary unstructured quadrilateral meshes for effective and efficient treatment of complex geometries. Thanks to only first oder PDEs involved in the LBE, no special techniques, such as hybridizable discontinuous Galerkin method (HDG), local discontinuous Galerkin method (LDG) and so on, are needed to discrete diffusion term, and thus, it simplifies the algorithm and implementation of the high order spectral difference method for simulating viscous flows. The proposed SD-LBM is validated with four incompressible flow benchmarks in two-dimensions: (a) the Poiseuille flow driven by a constant body force; (b) the lid-driven cavity flow without singularity at the two top corners-Burggraf flow; and (c) the unsteady Taylor-Green vortex flow; (d) the Blasius boundary-layer flow past a flat plate. Computational results are compared with analytical solutions of these cases and convergence studies of these cases are also given. The designed accuracy of the proposed SD-LBM is clearly verified.
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Tvede, Mich
2002-01-01
Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms; efficiency and comprehensive monotonicity characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...
Results of the Monte Carlo 'simple case' benchmark exercise
International Nuclear Information System (INIS)
2003-11-01
A new 'simple case' benchmark intercomparison exercise was launched, intended to study the importance of the fundamental nuclear data constants, physics treatments and geometry model approximations, employed by Monte Carlo codes in common use. The exercise was also directed at determining the level of agreement which can be expected between measured and calculated quantities, using current state or the art modelling codes and techniques. To this end, measurements and Monte Carlo calculations of the total (or gross) neutron count rates have been performed using a simple moderated 3 He cylindrical proportional counter array or 'slab monitor' counting geometry, deciding to select a very simple geometry for this exercise
Benchmarking school nursing practice: the North West Regional Benchmarking Group
Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn
2016-01-01
It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...
Energy Technology Data Exchange (ETDEWEB)
Fayers, F J; Terry, M J [General Reactor Physics Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)
1967-07-15
Aluminium is often used as a structural material or fuel cladding in lattice experiments with light water moderators. In particular most of the experiments with regular rod lattices of plutonium fuel have contained significant quantities of aluminium. This report examines the importance of scattering data for aluminium in leakage calculations for light water systems. It is shown that some discrepancy exists between calculated plane moments and experimentally measured moments, which may be corrected by an 'ad hoc' adjustment of inelastic scattering data for aluminium. WIMS results are presented for some Battelle plutonium fuelled rod lattices, and it is shown that this adjustment of inelastic data leads to a noticeable correction for the predicted reactivities of these experiments. The influence of scattering data for aluminium on results for some other lattices of interest has been shown to be less important. (author)
Fuel lattice design using heuristics and new strategies
Energy Technology Data Exchange (ETDEWEB)
Ortiz S, J. J.; Castillo M, J. A.; Torres V, M.; Perusquia del Cueto, R. [ININ, Carretera Mexico-Toluca s/n, Ocoyoacac 52750, Estado de Mexico (Mexico); Pelta, D. A. [ETS Ingenieria Informatica y Telecomunicaciones, Universidad de Granada, Daniel Saucedo Aranda s/n, 18071 Granada (Spain); Campos S, Y., E-mail: juanjose.ortiz@inin.gob.m [IPN, Escuela Superior de Fisica y Matematicas, Unidad Profesional Adolfo Lopez Mateos, Edif. 9, 07738 Mexico D. F. (Mexico)
2010-10-15
This work show some results of the fuel lattice design in BWRs when some allocation pin rod rules are not taking into account. Heuristics techniques like Path Re linking and Greedy to design fuel lattices were used. The scope of this work is to search about how do classical rules in design fuel lattices affect the heuristics techniques results and the fuel lattice quality. The fuel lattices quality is measured by Power Peaking Factor and Infinite Multiplication Factor at the beginning of the fuel lattice life. CASMO-4 code to calculate these parameters was used. The analyzed rules are the following: pin rods with lowest uranium enrichment are only allocated in the fuel lattice corner, and pin rods with gadolinium cannot allocated in the fuel lattice edge. Fuel lattices with and without gadolinium in the main diagonal were studied. Some fuel lattices were simulated in an equilibrium cycle fuel reload, using Simulate-3 to verify their performance. So, the effective multiplication factor and thermal limits can be verified. The obtained results show a good performance in some fuel lattices designed, even thought, the knowing rules were not implemented. A fuel lattice performance and fuel lattice design characteristics analysis was made. To the realized tests, a dell workstation was used, under Li nux platform. (Author)
Fuel lattice design using heuristics and new strategies
International Nuclear Information System (INIS)
Ortiz S, J. J.; Castillo M, J. A.; Torres V, M.; Perusquia del Cueto, R.; Pelta, D. A.; Campos S, Y.
2010-10-01
This work show some results of the fuel lattice design in BWRs when some allocation pin rod rules are not taking into account. Heuristics techniques like Path Re linking and Greedy to design fuel lattices were used. The scope of this work is to search about how do classical rules in design fuel lattices affect the heuristics techniques results and the fuel lattice quality. The fuel lattices quality is measured by Power Peaking Factor and Infinite Multiplication Factor at the beginning of the fuel lattice life. CASMO-4 code to calculate these parameters was used. The analyzed rules are the following: pin rods with lowest uranium enrichment are only allocated in the fuel lattice corner, and pin rods with gadolinium cannot allocated in the fuel lattice edge. Fuel lattices with and without gadolinium in the main diagonal were studied. Some fuel lattices were simulated in an equilibrium cycle fuel reload, using Simulate-3 to verify their performance. So, the effective multiplication factor and thermal limits can be verified. The obtained results show a good performance in some fuel lattices designed, even thought, the knowing rules were not implemented. A fuel lattice performance and fuel lattice design characteristics analysis was made. To the realized tests, a dell workstation was used, under Li nux platform. (Author)
Mountrakis, L.; Lorenz, E.; Hoekstra, A. G.
2017-07-01
The immersed-boundary lattice-Boltzmann method (IB-LBM) is increasingly being used in simulations of dense suspensions. These systems are computationally very expensive and can strongly benefit from lower resolutions that still maintain the desired accuracy for the quantities of interest. IB-LBM has a number of free parameters that have to be defined, often without exact knowledge of the tradeoffs, since their behavior in low resolutions is not well understood. Such parameters are the lattice constant Δ x , the number of vertices Nv, the interpolation kernel ϕ , and the LBM relaxation time τ . We investigate the effect of these IB-LBM parameters on a number of straightforward but challenging benchmarks. The systems considered are (a) the flow of a single sphere in shear flow, (b) the collision of two spheres in shear flow, and (c) the lubrication interaction of two spheres. All benchmarks are performed in three dimensions. The first two systems are used for determining two effective radii: the hydrodynamic radius rhyd and the particle interaction radius rinter. The last system is used to establish the numerical robustness of the lubrication forces, used to probe the hydrodynamic interactions in the limit of small gaps. Our results show that lower spatial resolutions result in larger hydrodynamic and interaction radii, while surface densities should be chosen above two vertices per LU2 result to prevent fluid penetration in underresolved meshes. Underresolved meshes also failed to produce the migration of particles toward the center of the domain due to lift forces in Couette flow, mostly noticeable for IBM-kernel ϕ2. Kernel ϕ4, despite being more robust toward mesh resolution, produces a notable membrane thickness, leading to the breakdown of the lubrication forces in larger gaps, and its use in dense suspensions where the mean particle distances are small can result in undesired behavior. rhyd is measured to be different from rinter, suggesting that there is
The VENUS-7 benchmarks. Results from state-of-the-art transport codes and nuclear data
International Nuclear Information System (INIS)
Zwermann, Winfried; Pautz, Andreas; Timm, Wolf
2010-01-01
For the validation of both nuclear data and computational methods, comparisons with experimental data are necessary. Most advantageous are assemblies where not only the multiplication factors or critical parameters were measured, but also additional quantities like reactivity differences or pin-wise fission rate distributions have been assessed. Currently there is a comprehensive activity to evaluate such measure-ments and incorporate them in the International Handbook of Evaluated Reactor Physics Benchmark Experiments. A large number of such experiments was performed at the VENUS zero power reactor at SCK/CEN in Belgium in the sixties and seventies. The VENUS-7 series was specified as an international benchmark within the OECD/NEA Working Party on Scientific Issues of Reactor Systems (WPRS), and results obtained with various codes and nuclear data evaluations were summarized. In the present paper, results of high-accuracy transport codes with full spatial resolution with up-to-date nuclear data libraries from the JEFF and ENDF/B evaluations are presented. The comparisons of the results, both code-to-code and with the measured data are augmented by uncertainty and sensitivity analyses with respect to nuclear data uncertainties. For the multiplication factors, these are performed with the TSUNAMI-3D code from the SCALE system. In addition, uncertainties in the reactivity differences are analyzed with the TSAR code which is available from the current SCALE-6 version. (orig.)
HPCG Benchmark Technical Specification
Energy Technology Data Exchange (ETDEWEB)
Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)
2013-10-01
The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
Energy Technology Data Exchange (ETDEWEB)
Xu Xixiang, E-mail: xu_xixiang@hotmail.co [College of Science, Shandong University of Science and Technology, Qingdao, 266510 (China)
2010-01-04
An integrable coupling family of Merola-Ragnisco-Tu lattice systems is derived from a four-by-four matrix spectral problem. The Hamiltonian structure of the resulting integrable coupling family is established by the discrete variational identity. Each lattice system in the resulting integrable coupling family is proved to be integrable discrete Hamiltonian system in Liouville sense. Ultimately, a nonisospectral integrable lattice family associated with the resulting integrable lattice family is constructed through discrete zero curvature representation.
International Nuclear Information System (INIS)
Xu Xixiang
2010-01-01
An integrable coupling family of Merola-Ragnisco-Tu lattice systems is derived from a four-by-four matrix spectral problem. The Hamiltonian structure of the resulting integrable coupling family is established by the discrete variational identity. Each lattice system in the resulting integrable coupling family is proved to be integrable discrete Hamiltonian system in Liouville sense. Ultimately, a nonisospectral integrable lattice family associated with the resulting integrable lattice family is constructed through discrete zero curvature representation.
International Nuclear Information System (INIS)
Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.
2015-01-01
Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B by TRX (thermal reactor-one region lattice) and BAPL (Bettis Atomic Power Laboratory-one region lattice) critical experiments of light water reactors for neutronics analysis of 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This analysis is achieved through the analysis of integral parameters of five light water reactor critical experiments TRX-1, TRX-2, BAPL-UO 2 -1, BAPL-UO 2 -2 and BAPL-UO 2 -3 based on evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered as standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of the said critical experiments are calculated using the reactor physics lattice transport code WIMSD-5B. The calculated integral parameters are compared to the measured values as well as the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0 for assessment of deterministic calculation. It was found that the calculated integral parameters give mostly reasonable and globally consistent results with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 between two data files have been compared using WIMS library utility code WILLIE and it was found that the group constants are well consistent with each other. Therefore, this analysis reveals the validation study of the reactor physics lattice transport code WIMSD-5B based on JEFF-3.1 and ENDF/B-VII.1 libraries and can also be essential to
Vver-1000 Mox core computational benchmark
International Nuclear Information System (INIS)
2006-01-01
The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities include benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the
DEFF Research Database (Denmark)
Lawson, Lartey; Nielsen, Kurt
2005-01-01
We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional...... in the suggested benchmarking tool. The study investigates how different characteristics on dairy farms influences the technical efficiency....
Criteria of benchmark selection for efficient flexible multibody system formalisms
Directory of Open Access Journals (Sweden)
Valášek M.
2007-10-01
Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes for benchmark selection are investigated. The result of this analysis is the formulation of the criteria of benchmark selection for flexible multibody formalisms. Based on them the initial set of suitable benchmarks is described. Besides that the evaluation measures are revised and extended.
QCD thermodynamics from an imaginary μB: Results on the four flavor lattice model
International Nuclear Information System (INIS)
D'Elia, Massimo; Lombardo, Maria-Paola
2004-01-01
We study four flavor QCD at nonzero temperature and density by analytic continuation from an imaginary chemical potential. The explored region is T=0.95T c c , and the baryochemical potentials range from 0 to ≅500 MeV. Observables include the number density, the order parameter for chiral symmetry, and the pressure, which is calculated via an integral method at fixed temperature and quark mass. The simulations are carried out on a 16 3 x4 lattice, and the mass dependence of the results is estimated by exploiting the Maxwell relations. In the hadronic region, we confirm that the results are consistent with a simple resonance hadron gas model, and we estimate the critical density by combining the results for the number density with those for the critical line. In the hot phase, above the end point of the Roberge-Weiss transition T E ≅1.1T c , the results are consistent with a free lattice model with a fixed effective number of flavor slightly different from four. We confirm that confinement and chiral symmetry are coincident by a further analysis of the critical line, and we discuss the interrelation between thermodynamics and critical behavior. We comment on the strength and weakness of the method, and propose further developments
The KMAT: Benchmarking Knowledge Management.
de Jager, Martha
Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…
Benchmarking in Mobarakeh Steel Company
Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati
2008-01-01
Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...
Benchmarking of the PHOENIX-P/ANC [Advanced Nodal Code] advanced nuclear design system
International Nuclear Information System (INIS)
Nguyen, T.Q.; Liu, Y.S.; Durston, C.; Casadei, A.L.
1988-01-01
At Westinghouse, an advanced neutronic methods program was designed to improve the quality of the predictions, enhance flexibility in designing advanced fuel and related products, and improve design lead time. Extensive benchmarking data is presented to demonstrate the accuracy of the Advanced Nodal Code (ANC) and the PHOENIX-P advanced lattice code. Qualification data to demonstrate the accuracy of ANC include comparison of key physics parameters against a fine-mesh diffusion theory code, TORTISE. Benchmarking data to demonstrate the validity of the PHOENIX-P methodologies include comparison of physics predictions against critical experiments, isotopics measurements and measured power distributions from spatial criticals. The accuracy of the PHOENIX-P/ANC Advanced Design System is demonstrated by comparing predictions of hot zero power physics parameters and hot full power core follow against measured data from operating reactors. The excellent performance of this system for a broad range of comparisons establishes the basis for implementation of these tools for core design, licensing and operational follow of PWR [pressurized water reactor] cores at Westinghouse
DEFF Research Database (Denmark)
Seabrooke, Leonard; Wigan, Duncan
2015-01-01
Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views...... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested...... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained....
Energy Technology Data Exchange (ETDEWEB)
Ryu, Seungyeob, E-mail: syryu@kaeri.re.kr [Korea Atomic Energy Research Institute (KAERI), 1045 Daeduk-daero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Kim, Youngin; Yoon, Juhyeon [Korea Atomic Energy Research Institute (KAERI), 1045 Daeduk-daero, Yuseong-gu, Daejeon 305-353 (Korea, Republic of); Ko, Sungho, E-mail: sunghoko@cnu.ac.kr [Department of Mechanical Design Engineering, Chungnam National University, 220 Gung-dong, Yuseong-gu, Daejeon 305-764 (Korea, Republic of)
2014-01-15
Highlights: • We directly simulate circular-cap bubbles in low viscous liquids. • The counter diffusion multiphase lattice Boltzmann method is proposed. • The present method is validated through benchmark tests and experimental results. • The high-Reynolds-number bubbles can be simulated without any turbulence models. • The present method is feasible for the direct simulation of bubbly flows. -- Abstract: The counter diffusion lattice Boltzmann method (LBM) is used to directly simulate rising circular-cap bubbles in low viscous liquids. A counter diffusion model for single phase flows has been extended to multiphase flows, and the implicit formulation is converted into an explicit one for easy calculation. Bubbles at high Reynolds numbers ranging from O(10{sup 2}) to O(10{sup 4}) are simulated successfully without any turbulence models, which cannot be done for the existing LBM versions. The characteristics of the circular-cap bubbles are studied for a wide range of Morton numbers and compared with the previous literature. Calculated results agree with the theoretical and experimental data. Consequently, the wake phenomena of circular-cap bubbles and bubble induced turbulence are presented.
Benchmarking in Mobarakeh Steel Company
Directory of Open Access Journals (Sweden)
Sasan Ghasemi
2008-05-01
Full Text Available Benchmarking is considered as one of the most effective ways of improving performance incompanies. Although benchmarking in business organizations is a relatively new concept and practice, ithas rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahans Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aimsto share the process deployed for the benchmarking project in this company and illustrate how the projectsystematic implementation led to succes.
Energy Technology Data Exchange (ETDEWEB)
Zhaoyuan Liu; Kord Smith; Benoit Forget; Javier Ortensi
2016-05-01
A new method for computing homogenized assembly neutron transport cross sections and dif- fusion coefficients that is both rigorous and computationally efficient is proposed in this paper. In the limit of a homogeneous hydrogen slab, the new method is equivalent to the long-used, and only-recently-published CASMO transport method. The rigorous method is used to demonstrate the sources of inaccuracy in the commonly applied “out-scatter” transport correction. It is also demonstrated that the newly developed method is directly applicable to lattice calculations per- formed by Monte Carlo and is capable of computing rigorous homogenized transport cross sections for arbitrarily heterogeneous lattices. Comparisons of several common transport cross section ap- proximations are presented for a simple problem of infinite medium hydrogen. The new method has also been applied in computing 2-group diffusion data for an actual PWR lattice from BEAVRS benchmark.
U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...
International Nuclear Information System (INIS)
Pavlovichev, A.M.
2001-01-01
Actual regulations while designing of new fuel cycles for nuclear power installations comprise a calculational justification to be performed by certified computer codes. It guarantees that obtained calculational results will be within the limits of declared uncertainties that are indicated in a certificate issued by Gosatomnadzor of Russian Federation (GAN) and concerning a corresponding computer code. A formal justification of declared uncertainties is the comparison of calculational results obtained by a commercial code with the results of experiments or of calculational tests that are calculated with an uncertainty defined by certified precision codes of MCU type or of other one. The actual level of international cooperation provides an enlarging of the bank of experimental and calculational benchmarks acceptable for a certification of commercial codes that are being used for a design of fuel loadings with MOX fuel. In particular, the work is practically finished on the forming of calculational benchmarks list for a certification of code TVS-M as applied to MOX fuel assembly calculations. The results on these activities are presented
Strategic behaviour under regulatory benchmarking
Energy Technology Data Exchange (ETDEWEB)
Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management
2004-09-01
In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulation benchmarking can influence the ''regulation game,'' the subject has received limited attention. This paper discusses how strategic behaviour can result in inefficient behaviour by firms. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)
3-D neutron transport benchmarks
International Nuclear Information System (INIS)
Takeda, T.; Ikeda, H.
1991-03-01
A set of 3-D neutron transport benchmark problems proposed by the Osaka University to NEACRP in 1988 has been calculated by many participants and the corresponding results are summarized in this report. The results of K eff , control rod worth and region-averaged fluxes for the four proposed core models, calculated by using various 3-D transport codes are compared and discussed. The calculational methods used were: Monte Carlo, Discrete Ordinates (Sn), Spherical Harmonics (Pn), Nodal Transport and others. The solutions of the four core models are quite useful as benchmarks for checking the validity of 3-D neutron transport codes
DEFF Research Database (Denmark)
Agrell, Per J.; Bogetoft, Peter
2017-01-01
Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The appli......Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators....... The application of bench-marking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...
DEFF Research Database (Denmark)
Agrell, Per J.; Bogetoft, Peter
2017-01-01
Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The appli......Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators....... The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...
Nuclear lattice simulations using symmetry-sign extrapolation
Energy Technology Data Exchange (ETDEWEB)
Laehde, Timo A.; Luu, Thomas [Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Lee, Dean [North Carolina State University, Department of Physics, Raleigh, NC (United States); Meissner, Ulf G. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik and Bethe Center for Theoretical Physics, Bonn (Germany); Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Forschungszentrum Juelich, JARA - High Performance Computing, Juelich (Germany); Epelbaum, Evgeny; Krebs, Hermann [Ruhr-Universitaet Bochum, Institut fuer Theoretische Physik II, Bochum (Germany); Rupak, Gautam [Mississippi State University, Department of Physics and Astronomy, Mississippi State, MS (United States)
2015-07-15
Projection Monte Carlo calculations of lattice Chiral Effective Field Theory suffer from sign oscillations to a varying degree dependent on the number of protons and neutrons. Hence, such studies have hitherto been concentrated on nuclei with equal numbers of protons and neutrons, and especially on the alpha nuclei where the sign oscillations are smallest. Here, we introduce the ''symmetry-sign extrapolation'' method, which allows us to use the approximate Wigner SU(4) symmetry of the nuclear interaction to systematically extend the Projection Monte Carlo calculations to nuclear systems where the sign problem is severe. We benchmark this method by calculating the ground-state energies of the {sup 12}C, {sup 6}He and {sup 6}Be nuclei, and discuss its potential for studies of neutron-rich halo nuclei and asymmetric nuclear matter. (orig.)
Benchmarking for Higher Education.
Jackson, Norman, Ed.; Lund, Helen, Ed.
The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…
Benchmarking and Learning in Public Healthcare
DEFF Research Database (Denmark)
Buckmaster, Natalie; Mouritsen, Jan
2017-01-01
This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...... applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking...... can enable learning in public settings but that this requires actors to invest in ensuring that benchmark data are directed towards improvement....
Dynamic benchmarking of simulation codes
International Nuclear Information System (INIS)
Henry, R.E.; Paik, C.Y.; Hauser, G.M.
1996-01-01
Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
Statistical hydrodynamics of lattice-gas automata
Grosfils, Patrick; Boon, Jean-Pierre; Brito López, Ricardo; Ernst, M. H.
1993-01-01
We investigate the space and time behavior of spontaneous thermohydrodynamic fluctuations in a simple fluid modeled by a lattice-gas automaton and develop the statistical-mechanical theory of thermal lattice gases to compute the dynamical structure factor, i.e., the power spectrum of the density correlation function. A comparative analysis of the theoretical predictions with our lattice gas simulations is presented. The main results are (i) the spectral function of the lattice-gas fluctuation...
Regression Benchmarking: An Approach to Quality Assurance in Performance
Bulej, Lubomír
2005-01-01
The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specially, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...
International Nuclear Information System (INIS)
Thorn, C.B.
1988-01-01
The possibility of studying non-perturbative effects in string theory using a world sheet lattice is discussed. The light-cone lattice string model of Giles and Thorn is studied numerically to assess the accuracy of ''coarse lattice'' approximations. For free strings a 5 by 15 lattice seems sufficient to obtain better than 10% accuracy for the bosonic string tachyon mass squared. In addition a crude lattice model simulating string like interactions is studied to find out how easily a coarse lattice calculation can pick out effects such as bound states which would qualitatively alter the spectrum of the free theory. The role of the critical dimension in obtaining a finite continuum limit is discussed. Instead of the ''gaussian'' lattice model one could use one of the vertex models, whose continuum limit is the same as a gaussian model on a torus of any radius. Indeed, any critical 2 dimensional statistical system will have a stringy continuum limit in the absence of string interactions. 8 refs., 1 fig. , 9 tabs
Staff Association
2017-01-01
On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...
Benchmarking infrastructure for mutation text mining.
Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo
2014-02-25
Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
Maximally twisted mass lattice QCD at the physical pion mass
International Nuclear Information System (INIS)
Kostrzewa, Bartosz
2016-01-01
In computer simulations of Lattice Quantum Chromodynamics, the usage of unphysically large quark masses and the subsequent extrapolation of results to the physical value of the quark masses are major sources of systematic uncertainty. In this thesis, the feasibility and practicality of numerical simulations of Quantum Chromodynamics with physically light up and down quarks using the Wilson twisted mass quark discretisation are explored. Working in this regime is complicated firstly by the numerical expense of these simulations and secondly by the presence of potentially large lattice artefacts. The twisted mass discretisation is affected by an unphysical mass difference between the charged and neutral pions, rendering simulations at the physical charged pion mass infeasible if this mass splitting is too large. With the aim of reducing it, the Sheikholeslami-Wohlert term is added to the twisted mass fermion action and simulations with mass degenerate up and down quarks are then performed as a proof of concept. It is demonstrated that these simulations are stable and that the parameters of the lattice theory can be successfully tuned to correspond to the physical charged pion mass. Subsequently, the parameter tuning for simulations with mass degenerate up and down quarks as well as strange and charm quarks is explored and it is shown that it can be carried out in steps. As benchmark observables, the masses and decay constants of pseudoscalar mesons with light, strange and charm valence quarks are calculated and seen to largely reproduce their phenomenological values, even though continuum and infinite volume extrapolations are not performed. Light, strange and charm quark mass estimates are determined based on this data and also seen to coincide with phenomenological and other lattice determinations. In this analysis, a particular emphasis is placed on the systematic error due to the choice of fit range for pseudoscalar correlation functions and a weighting method is
Benchmarking reference services: an introduction.
Marshall, J G; Buchanan, H S
1995-01-01
Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.
Results of Magnetic Axis Measurements on a Prototype Main Lattice Quadrupole for the LHC
Smirnov, N; Deferne, G; Parma, V; Rohmig, P; Tortschanoff, Theodor
2004-01-01
More than 470 twin aperture lattice quadrupoles are needed for the Large Hadron Collider (LHC) under construction at CERN. The lattice quadrupole, assembled with correction magnets in its helium enclosure - the cold mass and integrated in a common cryostat called the Short Straight Section (SSS). All SSS cold mass prototypes have been developed and built by CEA (Saclay) in collaboration with CNRS (Orsay, France). The last SSS prototype (SSS5) was used to investigate the behavior of the magnetic axis through various steps of the installation cycle for the series quadrupoles: including transportation, thermal-cycles, and being lowered into the tunnel. Results of extensive measurements before and after each of these stages are presented here, showing that the effect of transport is weak and within the window of measurement resolution. Also shown is that the long-term stability observed during two years is comparable with the requirements from magnet tolerances. To minimize systematic errors, all tests were perfo...
DEFF Research Database (Denmark)
Bogetoft, Peter; Nielsen, Kurt
2005-01-01
We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...
MCNP simulation of the TRIGA Mark II benchmark experiment
International Nuclear Information System (INIS)
Jeraj, R.; Glumac, B.; Maucec, M.
1996-01-01
The complete 3D MCNP model of the TRIGA Mark II reactor is presented. It enables precise calculations of some quantities of interest in a steady-state mode of operation. Calculational results are compared to the experimental results gathered during reactor reconstruction in 1992. Since the operating conditions were well defined at that time, the experimental results can be used as a benchmark. It may be noted that this benchmark is one of very few high enrichment benchmarks available. In our simulations experimental conditions were thoroughly simulated: fuel elements and control rods were precisely modeled as well as entire core configuration and the vicinity of the core. ENDF/B-VI and ENDF/B-V libraries were used. Partial results of benchmark calculations are presented. Excellent agreement of core criticality, excess reactivity and control rod worths can be observed. (author)
Geometry of lattice field theory
International Nuclear Information System (INIS)
Honan, T.J.
1986-01-01
Using some tools of algebraic topology, a general formalism for lattice field theory is presented. The lattice is taken to be a simplicial complex that is also a manifold and is referred to as a simplicial manifold. The fields on this lattice are cochains, that are called lattice forms to emphasize the connections with differential forms in the continuum. This connection provides a new bridge between lattice and continuum field theory. A metric can be put onto this simplicial manifold by assigning lengths to every link or I-simplex of the lattice. Regge calculus is a way of defining general relativity on this lattice. A geometric discussion of Regge calculus is presented. The Regge action, which is a discrete form of the Hilbert action, is derived from the Hilbert action using distribution valued forms. This is a new derivation that emphasizes the underlying geometry. Kramers-Wannier duality in statistical mechanics is discussed in this general setting. Nonlinear field theories, which include gauge theories and nonlinear sigma models are discussed in the continuum and then are put onto a lattice. The main new result here is the generalization to curved spacetime, which consists of making the theory compatible with Regge calculus
Pilati, Sebastiano; Zintchenko, Ilia; Troyer, Matthias; Ancilotto, Francesco
2018-04-01
We benchmark the ground state energies and the density profiles of atomic repulsive Fermi gases in optical lattices (OLs) computed via density functional theory (DFT) against the results of diffusion Monte Carlo (DMC) simulations. The main focus is on a half-filled one-dimensional OLs, for which the DMC simulations performed within the fixed-node approach provide unbiased results. This allows us to demonstrate that the local spin-density approximation (LSDA) to the exchange-correlation functional of DFT is very accurate in the weak and intermediate interactions regime, and also to underline its limitations close to the strongly-interacting Tonks-Girardeau limit and in very deep OLs. We also consider a three-dimensional OL at quarter filling, showing also in this case the high accuracy of the LSDA in the moderate interaction regime. The one-dimensional data provided in this study may represent a useful benchmark to further develop DFT methods beyond the LSDA and they will hopefully motivate experimental studies to accurately measure the equation of state of Fermi gases in higher-dimensional geometries. Supplementary material in the form of one pdf file available from the Journal web page at http://https://doi.org/10.1140/epjb/e2018-90021-1.
Benchmarking in academic pharmacy departments.
Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann
2010-10-11
Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.
Benchmarking: applications to transfusion medicine.
Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M
2012-10-01
Benchmarking is as a structured continuous collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.
WIPP Benchmark calculations with the large strain SPECTROM codes
International Nuclear Information System (INIS)
Callahan, G.D.; DeVries, K.L.
1995-08-01
This report provides calculational results from the updated Lagrangian structural finite-element programs SPECTROM-32 and SPECTROM-333 for the purpose of qualifying these codes to perform analyses of structural situations in the Waste Isolation Pilot Plant (WIPP). Results are presented for the Second WIPP Benchmark (Benchmark II) Problems and for a simplified heated room problem used in a parallel design calculation study. The Benchmark II problems consist of an isothermal room problem and a heated room problem. The stratigraphy involves 27 distinct geologic layers including ten clay seams of which four are modeled as frictionless sliding interfaces. The analyses of the Benchmark II problems consider a 10-year simulation period. The evaluation of nine structural codes used in the Benchmark II problems shows that inclusion of finite-strain effects is not as significant as observed for the simplified heated room problem, and a variety of finite-strain and small-strain formulations produced similar results. The simplified heated room problem provides stratigraphic complexity equivalent to the Benchmark II problems but neglects sliding along the clay seams. The simplified heated problem does, however, provide a calculational check case where the small strain-formulation produced room closures about 20 percent greater than those obtained using finite-strain formulations. A discussion is given of each of the solved problems, and the computational results are compared with available published results. In general, the results of the two SPECTROM large strain codes compare favorably with results from other codes used to solve the problems
S/sub n/ analysis of the TRX metal lattices with ENDF/B version III data
International Nuclear Information System (INIS)
Wheeler, F.J.
1975-01-01
Two critical assemblies, designated as thermal-reactor benchmarks TRX-1 and TRX-2 for ENDF/B data testing, were analyzed using the one-dimensional S/sub n/-theory code SCAMP. The two assemblies were simple lattices of aluminum-clad, uranium-metal fuel rods in triangular arrays with D 2 O as moderator and reflector. The fuel was low-enriched (1.3 percent 235 U), 0.387-inch in diameter and had an active height of 48 inches. The volume ratio of water to uranium was 2.35 for the TRX-1 lattice and 4.02 for TRX-2. Full-core S/sub n/ calculations based on Version III data were performed for these assemblies and the results obtained were compared with the measured values of the multiplication factors, the ratio of epithermal-to-thermal neutron capture in 238 U, the ratio of epithermal-to-thermal fission in 235 U, the ratio of 238 U fission to 235 U fission, and the ratio of capture in 238 U to fission in 235 U. Reaction rates were obtained from a central region of the full-core problems. Multigroup cross sections for the reactor calculation were obtained from S/sub n/ cell calculations with resonance self-shielding calculated using the RABBLE treatment. The results of the analyses are generally consistent with results obtained by other investigators
International Nuclear Information System (INIS)
Xolocostli M, J. V.; Vargas E, S.; Gomez T, A. M.; Reyes F, M. del C.; Del Valle G, E.
2014-10-01
In this paper a comparison is made in analyzing the suite of Benchmark problems for reactors type BWR between CASMO-4, MCNP6 and Serpent code. The Benchmark problem consists of two different geometries: a fuel cell of a pin and assembly type BWR. To facilitate the study of reactors physics in the fuel pin their nuclear characteristics are provided to detail, such as burnt dependence, the reactivity of selected nuclide, etc. With respect to the fuel assembly, the presented results are regarding to infinite multiplication factor for burning different steps and different vacuum conditions. Making the analysis of this set of Benchmark problems provides comprehensive test problems for the next fuels generation of BWR reactors with high extended burned. It is important to note that when making this comparison the purpose is to validate the methodologies used in modeling for different operating conditions, if the case is of other BWR assembly. The results will be within a range with some uncertainty, considering that does not depend on code that is used. Escuela Superior de Fisica y Matematicas of Instituto Politecnico Nacional (IPN (Mexico) has accumulated some experience in using Serpent, due to the potential of this code over other commercial codes such as CASMO and MCNP. The obtained results for the infinite multiplication factor are encouraging and motivate the studies to continue with the generation of the X S of a core to a next step a respective nuclear data library is constructed and this can be used by codes developed as part of the development project of the Mexican Analysis Platform of Nuclear Reactors AZTLAN. (Author)
RISKIND verification and benchmark comparisons
Energy Technology Data Exchange (ETDEWEB)
Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.
1997-08-01
This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.
RISKIND verification and benchmark comparisons
International Nuclear Information System (INIS)
Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.
1997-08-01
This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models
Introduction to lattice gauge theories
International Nuclear Information System (INIS)
La Cock, P.
1988-03-01
A general introduction to Lattice Gauge Theory (LGT) is given. The theory is discussed from first principles to facilitate an understanding of the techniques used in LGT. These include lattice formalism, gauge invariance, fermions on the lattice, group theory and integration, strong coupling methods and mean field techniques. A review of quantum chromodynamics on the lattice at finite temperature and density is also given. Monte Carlo results and analytical methods are discussed. An attempt has been made to include most relevant data up to the end of 1987, and to update some earlier reviews existing on the subject. 224 refs., 33 figs., 14 tabs
Analysis of an OECD/NEA high-temperature reactor benchmark
International Nuclear Information System (INIS)
Hosking, J. G.; Newton, T. D.; Koeberl, O.; Morris, P.; Goluoglu, S.; Tombakoglu, T.; Colak, U.; Sartori, E.
2006-01-01
This paper describes analyses of the OECD/NEA HTR benchmark organized by the 'Working Party on the Scientific Issues of Reactor Systems (WPRS)', formerly the 'Working Party on the Physics of Plutonium Fuels and Innovative Fuel Cycles'. The benchmark was specifically designed to provide inter-comparisons for plutonium and thorium fuels when used in HTR systems. Calculations considering uranium fuel have also been included in the benchmark, in order to identify any increased uncertainties when using plutonium or thorium fuels. The benchmark consists of five phases, which include cell and whole-core calculations. Analysis of the benchmark has been performed by a number of international participants, who have used a range of deterministic and Monte Carlo code schemes. For each of the benchmark phases, neutronics parameters have been evaluated. Comparisons are made between the results of the benchmark participants, as well as comparisons between the predictions of the deterministic calculations and those from detailed Monte Carlo calculations. (authors)
WWER in-core fuel management benchmark definition
International Nuclear Information System (INIS)
Apostolov, T.; Alekova, G.; Prodanova, R.; Petrova, T.; Ivanov, K.
1994-01-01
Two benchmark problems for WWER-440, including design parameters, operating conditions and measured quantities are discussed in this paper. Some benchmark results for infinitive multiplication factor -K eff , natural boron concentration - C β and relative power distribution - K q obtained by use of the code package are represented. (authors). 5 refs., 3 tabs
Defining a methodology for benchmarking spectrum unfolding codes
International Nuclear Information System (INIS)
Meyer, W.; Kirmser, P.G.; Miller, W.H.; Hu, K.K.
1976-01-01
It has long been recognized that different neutron spectrum unfolding codes will produce significantly different results when unfolding the same measured data. In reviewing the results of such analyses it has been difficult to determine which result if any is the best representation of what was measured by the spectrometer detector. A proposal to develop a benchmarking procedure for spectrum unfolding codes is presented. The objective of the procedure will be to begin to develop a methodology and a set of data with a well established and documented result that could be used to benchmark and standardize the various unfolding methods and codes. It is further recognized that development of such a benchmark must involve a consensus of the technical community interested in neutron spectrum unfolding
Numerical methods: Analytical benchmarking in transport theory
International Nuclear Information System (INIS)
Ganapol, B.D.
1988-01-01
Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computer and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered
The CMSSW benchmarking suite: Using HEP code to measure CPU performance
International Nuclear Information System (INIS)
Benelli, G
2010-01-01
The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and actual performances of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework, to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box, to test and compare several machines in terms of CPU performance and report with the wanted level of detail the different benchmarking scores (e.g. by processing step) and results. In this talk we describe briefly the CMSSW software performance suite, and in detail the CMSSW benchmarking suite client/server design, the performance data analysis and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.
Benchmark problems for numerical implementations of phase field models
International Nuclear Information System (INIS)
Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; Warren, J.; Heinonen, O. G.
2016-01-01
Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
Results for the η' mass from two-flavour lattice QCD
International Nuclear Information System (INIS)
Lesk, V.I.; Aoki, S.; Burkhalter, R.; Fukugita, M.; Ishikawa, K.I.; Ishizuka, N.; Iwasaki, Y.; Kanay, K.; Kaneko, T.; Kuramashi, Y.; Okawa, M.; Taniguchi, Y.; Ukawa, A.; Umeda, T.; Yoshie, T.
2003-01-01
We present results for the mass of the η' meson for two-flavor lattice QCD in the continuum limit, calculated on the CP-PACS computer, using an RG-improved gauge action and clover fermion action with tadpole-improved c sw . Measurements are made at three couplings corresponding to a ∼ 0.22, 0.16, 0.11 fm for four quark masses corresponding to m π /m ρ ∼ 0.8, 0.75, 0.7, 0.6. The two-loop diagrams are evaluated using a noisy source method. Quark smearing for both one- and two- loop diagrams is successfully applied to obtain ground state signals in the η' channel. We obtain m η' =0.960(87) -0.286 +0.036 in the continuum limit, where the second error represents the systematic uncertainty coming from varying the functional form for chiral and continuum extrapolations
Hospital benchmarking: are U.S. eye hospitals ready?
de Korne, Dirk F; van Wijngaarden, Jeroen D H; Sol, Kees J C A; Betz, Robert; Thomas, Richard C; Schein, Oliver D; Klazinga, Niek S
2012-01-01
Benchmarking is increasingly considered a useful management instrument to improve quality in health care, but little is known about its applicability in hospital settings. The aims of this study were to assess the applicability of a benchmarking project in U.S. eye hospitals and compare the results with an international initiative. We evaluated multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis entailed 46 semistructured face-to-face interviews with stakeholders, document analyses, and questionnaires. The case studies only partially met the conditions of the evaluation frame. Although learning and quality improvement were stated as overall purposes, the benchmarking initiative was at first focused on efficiency only. No ophthalmic outcomes were included, and clinicians were skeptical about their reporting relevance and disclosure. However, in contrast with earlier findings in international eye hospitals, all U.S. hospitals worked with internal indicators that were integrated in their performance management systems and supported benchmarking. Benchmarking can support performance management in individual hospitals. Having a certain number of comparable institutes provide similar services in a noncompetitive milieu seems to lay fertile ground for benchmarking. International benchmarking is useful only when these conditions are not met nationally. Although the literature focuses on static conditions for effective benchmarking, our case studies show that it is a highly iterative and learning process. The journey of benchmarking seems to be more important than the destination. Improving patient value (health outcomes per unit of cost) requires, however, an integrative perspective where clinicians and administrators closely cooperate on both quality and efficiency issues. If these worlds do not share such a relationship, the added
Graphene antidot lattice waveguides
DEFF Research Database (Denmark)
Pedersen, Jesper Goor; Gunst, Tue; Markussen, Troels
2012-01-01
We introduce graphene antidot lattice waveguides: nanostructured graphene where a region of pristine graphene is sandwiched between regions of graphene antidot lattices. The band gaps in the surrounding antidot lattices enable localized states to emerge in the central waveguide region. We model...... the waveguides via a position-dependent mass term in the Dirac approximation of graphene and arrive at analytical results for the dispersion relation and spinor eigenstates of the localized waveguide modes. To include atomistic details we also use a tight-binding model, which is in excellent agreement...... with the analytical results. The waveguides resemble graphene nanoribbons, but without the particular properties of ribbons that emerge due to the details of the edge. We show that electrons can be guided through kinks without additional resistance and that transport through the waveguides is robust against...
Salomons, E.M.; Lohman, W.J.A.; Zhou, H.
2016-01-01
Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases:
Elastic lattice in an incommensurate background
International Nuclear Information System (INIS)
Dickman, R.; Chudnovsky, E.M.
1995-01-01
We study a harmonic triangular lattice, which relaxes in the presence of an incommensurate short-wavelength potential. Monte Carlo simulations reveal that the elastic lattice exhibits only short-ranged translational correlations, despite the absence of defects in either lattice. Extended orientational order, however, persists in the presence of the background. Translational correlation lengths exhibit approximate power-law dependence upon cooling rate and background strength. Our results may be relevant to Wigner crystals, atomic monolayers on crystals surfaces, and flux-line and magnetic bubble lattices
Energy Technology Data Exchange (ETDEWEB)
Ivanova, T.; Laville, C. [Institut de Radioprotection et de Surete Nucleaire IRSN, BP 17, 92262 Fontenay aux Roses (France); Dyrda, J. [Atomic Weapons Establishment AWE, Aldermaston, Reading, RG7 4PR (United Kingdom); Mennerdahl, D. [E Mennerdahl Systems EMS, Starvaegen 12, 18357 Taeby (Sweden); Golovko, Y.; Raskach, K.; Tsiboulia, A. [Inst. for Physics and Power Engineering IPPE, 1, Bondarenko sq., 249033 Obninsk (Russian Federation); Lee, G. S.; Woo, S. W. [Korea Inst. of Nuclear Safety KINS, 62 Gwahak-ro, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Bidaud, A.; Sabouri, P. [Laboratoire de Physique Subatomique et de Cosmologie LPSC, CNRS-IN2P3/UJF/INPG, Grenoble (France); Patel, A. [U.S. Nuclear Regulatory Commission (NRC), Washington, DC 20555-0001 (United States); Bledsoe, K.; Rearden, B. [Oak Ridge National Laboratory ORNL, M.S. 6170, P.O. Box 2008, Oak Ridge, TN 37831 (United States); Gulliford, J.; Michel-Sendis, F. [OECD/NEA, 12, Bd des Iles, 92130 Issy-les-Moulineaux (France)
2012-07-01
The sensitivities of the k{sub eff} eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)
Rosenberg, Peter; Shi, Hao; Zhang, Shiwei
2017-12-01
We present an ab initio, numerically exact study of attractive fermions in square lattices with Rashba spin-orbit coupling. The ground state of this system is a supersolid, with coexisting charge and superfluid order. The superfluid is composed of both singlet and triplet pairs induced by spin-orbit coupling. We perform large-scale calculations using the auxiliary-field quantum Monte Carlo method to provide the first full, quantitative description of the charge, spin, and pairing properties of the system. In addition to characterizing the exotic physics, our results will serve as essential high-accuracy benchmarks for the intense theoretical and especially experimental efforts in ultracold atoms to realize and understand an expanding variety of quantum Hall and topological superconductor systems.
Generalized isothermic lattices
International Nuclear Information System (INIS)
Doliwa, Adam
2007-01-01
We study multi-dimensional quadrilateral lattices satisfying simultaneously two integrable constraints: a quadratic constraint and the projective Moutard constraint. When the lattice is two dimensional and the quadric under consideration is the Moebius sphere one obtains, after the stereographic projection, the discrete isothermic surfaces defined by Bobenko and Pinkall by an algebraic constraint imposed on the (complex) cross-ratio of the circular lattice. We derive the analogous condition for our generalized isothermic lattices using Steiner's projective structure of conics, and we present basic geometric constructions which encode integrability of the lattice. In particular, we introduce the Darboux transformation of the generalized isothermic lattice and we derive the corresponding Bianchi permutability principle. Finally, we study two-dimensional generalized isothermic lattices, in particular geometry of their initial boundary value problem
International Nuclear Information System (INIS)
Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.
1995-01-01
This paper proposes EGS4 Benchmark Suite which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. This paper also evaluates optimization methods of recent RISC/UNIX systems, such as IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler option and math library were included in the evaluation process, system performed significantly better. Observed performance of some of the RISC/UNIX systems were beyond some so-called Mainframes of IBM, Hitachi or Fujitsu. The computer performance of EGS4 Code System on an HP9000/735 (99MHz) was defined to be the unit of EGS4 Unit. The EGS4 Benchmark Suite also run on various PCs such as Pentiums, i486 and DEC alpha and so forth. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have been evaluated with correlation of industry benchmark programs, namely, SPECmark. (author)
Energy Technology Data Exchange (ETDEWEB)
Randjbar-Daemi, S
1995-12-01
The so-called doubling problem in the lattice description of fermions led to a proof that under certain circumstances chiral gauge theories cannot be defined on the lattice. This is called the no-go theorem. It implies that if {Gamma}/sub/A is defined on a lattice then its infrared limit, which should correspond to the quantum description of the classical action for the slowly varying fields on lattice scale, is inevitably a vector like theory. In particular, if not circumvented, the no-go theorem implies that there is no lattice formulation of the Standard Weinberg-Salam theory or SU(5) GUT, even though the fermions belong to anomaly-free representations of the gauge group. This talk aims to explain one possible attempt at bypassing the no-go theorem. 20 refs.
International Nuclear Information System (INIS)
Randjbar-Daemi, S.
1995-12-01
The so-called doubling problem in the lattice description of fermions led to a proof that under certain circumstances chiral gauge theories cannot be defined on the lattice. This is called the no-go theorem. It implies that if Γ/sub/A is defined on a lattice then its infrared limit, which should correspond to the quantum description of the classical action for the slowly varying fields on lattice scale, is inevitably a vector like theory. In particular, if not circumvented, the no-go theorem implies that there is no lattice formulation of the Standard Weinberg-Salam theory or SU(5) GUT, even though the fermions belong to anomaly-free representations of the gauge group. This talk aims to explain one possible attempt at bypassing the no-go theorem. 20 refs
International Nuclear Information System (INIS)
Orii, Shigeo
1998-06-01
A benchmark specification for performance evaluation of parallel computers for numerical analysis is proposed. Level 1 benchmark, which is a conventional type benchmark using processing time, measures performance of computers running a code. Level 2 benchmark proposed in this report is to give the reason of the performance. As an example, scalar-parallel computer SP2 is evaluated with this benchmark specification in case of a molecular dynamics code. As a result, the main causes to suppress the parallel performance are maximum band width and start-up time of communication between nodes. Especially the start-up time is proportional not only to the number of processors but also to the number of particles. (author)
VENUS-2 Benchmark Problem Analysis with HELIOS-1.9
International Nuclear Information System (INIS)
Jeong, Hyeon-Jun; Choe, Jiwon; Lee, Deokjung
2014-01-01
Since there are reliable results of benchmark data from the OECD/NEA report of the VENUS-2 MOX benchmark problem, by comparing benchmark results users can identify the credibility of code. In this paper, the solution of the VENUS-2 benchmark problem from HELIOS 1.9 using the ENDF/B-VI library(NJOY91.13) is compared with the result from HELIOS 1.7 with consideration of the MCNP-4B result as reference data. The comparison contains the results of pin cell calculation, assembly calculation, and core calculation. The eigenvalues from those are considered by comparing the results from other codes. In the case of UOX and MOX assemblies, the differences from the MCNP-4B results are about 10 pcm. However, there is some inaccuracy in baffle-reflector condition, and relatively large differences were found in the MOX-reflector assembly and core calculation. Although HELIOS 1.9 utilizes an inflow transport correction, it seems that it has a limited effect on the error in baffle-reflector condition
Benchmarking in the globalised world and its impact on South ...
African Journals Online (AJOL)
In order to understand the potential impact of international benchmarking on South African institutions, it is important to explore the future role of benchmarking on the international level. In this regard, examples of transnational benchmarking activities will be considered. As a result of the involvement of South African ...
MoMaS reactive transport benchmark using PFLOTRAN
Park, H.
2017-12-01
MoMaS benchmark was developed to enhance numerical simulation capability for reactive transport modeling in porous media. The benchmark was published in late September of 2009; it is not taken from a real chemical system, but realistic and numerically challenging tests. PFLOTRAN is a state-of-art massively parallel subsurface flow and reactive transport code that is being used in multiple nuclear waste repository projects at Sandia National Laboratories including Waste Isolation Pilot Plant and Used Fuel Disposition. MoMaS benchmark has three independent tests with easy, medium, and hard chemical complexity. This paper demonstrates how PFLOTRAN is applied to this benchmark exercise and shows results of the easy benchmark test case which includes mixing of aqueous components and surface complexation. Surface complexations consist of monodentate and bidentate reactions which introduces difficulty in defining selectivity coefficient if the reaction applies to a bulk reference volume. The selectivity coefficient becomes porosity dependent for bidentate reaction in heterogeneous porous media. The benchmark is solved by PFLOTRAN with minimal modification to address the issue and unit conversions were made properly to suit PFLOTRAN.
Frustrated lattices of Ising chains
International Nuclear Information System (INIS)
Kudasov, Yurii B; Korshunov, Aleksei S; Pavlov, V N; Maslov, Dmitrii A
2012-01-01
The magnetic structure and magnetization dynamics of systems of plane frustrated Ising chain lattices are reviewed for three groups of compounds: Ca 3 Co 2 O 6 , CsCoCl 3 , and Sr 5 Rh 4 O 12 . The available experimental data are analyzed and compared in detail. It is shown that a high-temperature magnetic phase on a triangle lattice is normally and universally a partially disordered antiferromagnetic (PDA) structure. The diversity of low-temperature phases results from weak interactions that lift the degeneracy of a 2D antiferromagnetic Ising model on the triangle lattice. Mean-field models, Monte Carlo simulation results on the static magnetization curve, and results on slow magnetization dynamics obtained with Glauber's theory are discussed in detail. (reviews of topical problems)
Sachs, Ulrich; Akkerman, Remko; Fetfatsidis, K.; Vidal-Sallé, E.; Schumacher, J.; Ziegmann, G.; Allaoui, S.; Hivet, G.; Maron, B.; Vanclooster, K.; Lomov, S.V.
2014-01-01
A benchmark exercise was conducted to compare various friction test set-ups with respect to the measured coefficients of friction. The friction was determined between Twintex®PP, a fabric of commingled yarns of glass and polypropylene filaments, and a metal surface. The same material was supplied to
California commercial building energy benchmarking
Energy Technology Data Exchange (ETDEWEB)
Kinney, Satkartar; Piette, Mary Ann
2003-07-01
Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with ''typical'' and ''best-practice'' benchmarks while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none that were based solely on California data. Most available benchmarking information, including the Energy Star performance rating, were developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the
Benchmarking in Foodservice Operations
National Research Council Canada - National Science Library
Johnson, Bonnie
1998-01-01
The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...
Flavor extrapolation in lattice QCD
International Nuclear Information System (INIS)
Duffy, W.C.
1984-01-01
Explicit calculation of the effect of virtual quark-antiquark pairs in lattice QCD has eluded researchers. To include their effect explicitly one must calculate the determinant of the fermion-fermion coupling matrix. Owing to the large number of sites in a continuum limit size lattice, direct evaluation of this term requires an unrealistic amount of computer time. The effect of the virtual pairs can be approximated by ignoring this term and adjusting lattice couplings to reproduce experimental results. This procedure is called the valence approximation since it ignores all but the minimal number of quarks needed to describe hadrons. In this work the effect of the quark-antiquark pairs has been incorporated in a theory with an effective negative number of quark flavors contributing to the closed loops. Various particle masses and decay constants have been calculated for this theory and for one with no virtual pairs. The author attempts to extrapolate results towards positive numbers of quark flavors. The results show approximate agreement with experimental measurements and demonstrate the smoothness of lattice expectations in the number of quark flavors
The development of code benchmarks
International Nuclear Information System (INIS)
Glass, R.E.
1986-01-01
Sandia National Laboratories has undertaken a code benchmarking effort to define a series of cask-like problems having both numerical solutions and experimental data. The development of the benchmarks includes: (1) model problem definition, (2) code intercomparison, and (3) experimental verification. The first two steps are complete and a series of experiments are planned. The experiments will examine the elastic/plastic behavior of cylinders for both the end and side impacts resulting from a nine meter drop. The cylinders will be made from stainless steel and aluminum to give a range of plastic deformations. This paper presents the results of analyses simulating the model's behavior using materials properties for stainless steel and aluminum
Benchmark calculations of power distribution within assemblies
International Nuclear Information System (INIS)
Cavarec, C.; Perron, J.F.; Verwaerde, D.; West, J.P.
1994-09-01
The main objective of this Benchmark is to compare different techniques for fine flux prediction based upon coarse mesh diffusion or transport calculations. We proposed 5 ''core'' configurations including different assembly types (17 x 17 pins, ''uranium'', ''absorber'' or ''MOX'' assemblies), with different boundary conditions. The specification required results in terms of reactivity, pin by pin fluxes and production rate distributions. The proposal for these Benchmark calculations was made by J.C. LEFEBVRE, J. MONDOT, J.P. WEST and the specification (with nuclear data, assembly types, core configurations for 2D geometry and results presentation) was distributed to correspondents of the OECD Nuclear Energy Agency. 11 countries and 19 companies answered the exercise proposed by this Benchmark. Heterogeneous calculations and homogeneous calculations were made. Various methods were used to produce the results: diffusion (finite differences, nodal...), transport (P ij , S n , Monte Carlo). This report presents an analysis and intercomparisons of all the results received
International Nuclear Information System (INIS)
Murakami, Kiyonobu; Miyoshi, Yoshinori; Hirose, Hideyuki; Suzaki, Takenori
1985-03-01
Reactivity effects and reactivity-interference effects of absorber rods were measured with a cylindrical core aiming to obtain bench-marks for verification of the calculational methods. The core consisted of 2.6 w/o enriched UO 2 fuel rods lattice of which water-to-fuel volume ratio was 1.83. In the experiment, the critical water levels were measured changing neutron absorber content of absorber rods and the distance between two absorber rods in the core center. Monte Calro codes KENO-IV and MULTI-KENO were used to calculate reactivity worthes of absorber rods. The calculational results of effective multiplication factors ranged from 0.978 to 0.999 for the 60 cases of critical cores with inserted absorber rods. The calculational results of absorber worthes agreed to the experimental results within twice of the standerd deviation accompanied with the Monte Calro calculation. (author)
Introduction to lattice gauge theory
International Nuclear Information System (INIS)
Gupta, R.
1987-01-01
The lattice formulation of Quantum Field Theory (QFT) can be exploited in many ways. We can derive the lattice Feynman rules and carry out weak coupling perturbation expansions. The lattice then serves as a manifestly gauge invariant regularization scheme, albeit one that is more complicated than standard continuum schemes. Strong coupling expansions: these give us useful qualitative information, but unfortunately no hard numbers. The lattice theory is amenable to numerical simulations by which one calculates the long distance properties of a strongly interacting theory from first principles. The observables are measured as a function of the bare coupling g and a gauge invariant cut-off ≅ 1/α, where α is the lattice spacing. The continuum (physical) behavior is recovered in the limit α → 0, at which point the lattice artifacts go to zero. This is the more powerful use of lattice formulation, so in these lectures the author focuses on setting up the theory for the purpose of numerical simulations to get hard numbers. The numerical techniques used in Lattice Gauge Theories have their roots in statistical mechanics, so it is important to develop an intuition for the interconnection between quantum mechanics and statistical mechanics. This will be the emphasis of the first lecture. In the second lecture, the author reviews the essential ingredients of formulating QCD on the lattice and discusses scaling and the continuum limit. In the last lecture the author summarizes the status of some of the main results. He also mentions the bottlenecks and possible directions for research. 88 refs
Numisheet2005 Benchmark Analysis on Forming of an Automotive Deck Lid Inner Panel: Benchmark 1
International Nuclear Information System (INIS)
Buranathiti, Thaweepat; Cao Jian
2005-01-01
Numerical simulations in sheet metal forming processes have been a very challenging topic in industry. There are many computer codes and modeling techniques existing today. However, there are many unknowns affecting the prediction accuracy. Systematic benchmark tests are needed to accelerate the future implementations and to provide as a reference. This report presents an international cooperative benchmark effort for an automotive deck lid inner panel. Predictions from simulations are analyzed and discussed against the corresponding experimental results. The correlations between accuracy of each parameter of interest are discussed in this report
Vitreous in lattice degeneration of retina.
Foos, R Y; Simons, K B
1984-05-01
A localized pocket of missing vitreous invariably overlies lattice degeneration of the retina. Subjects with lattice also have a higher rate of rhegmatogenous retinal detachment, which is usually a complication of retinal tears. The latter are in turn a result of alterations in the central vitreous--that is, synchysis senilis leading to posterior vitreous detachment. In order to determine if there is either an association or a deleterious interaction between the local and central lesions of the vitreous in eyes with lattice, a comparison was made in autopsy eyes with and without lattice the degree of synchysis and rate of vitreous detachment. Results show no association between the local and central vitreous lesions, indicating that a higher rate of vitreous detachment is not the basis for the higher rate of retinal detachment in eyes with lattice. Also, there was no suggestion of deleterious interaction between the local and central vitreous lesions, either through vitreodonesis as a basis for precocious vitreous detachment, or through a greater degree of synchysis as a basis for interconnection of local and central lacunae (which could extend the localized retinal detachment in eyes with holes in lattice degeneration).
DEFF Research Database (Denmark)
Risager, Morten S.; Södergren, Carl Anders
2017-01-01
It is well known that the angles in a lattice acting on hyperbolic n -space become equidistributed. In this paper we determine a formula for the pair correlation density for angles in such hyperbolic lattices. Using this formula we determine, among other things, the asymptotic behavior of the den......It is well known that the angles in a lattice acting on hyperbolic n -space become equidistributed. In this paper we determine a formula for the pair correlation density for angles in such hyperbolic lattices. Using this formula we determine, among other things, the asymptotic behavior...... of the density function in both the small and large variable limits. This extends earlier results by Boca, Pasol, Popa and Zaharescu and Kelmer and Kontorovich in dimension 2 to general dimension n . Our proofs use the decay of matrix coefficients together with a number of careful estimates, and lead...
Irreversible stochastic processes on lattices
International Nuclear Information System (INIS)
Nord, R.S.
1986-01-01
Models for irreversible random or cooperative filling of lattices are required to describe many processes in chemistry and physics. Since the filling is assumed to be irreversible, even the stationary, saturation state is not in equilibrium. The kinetics and statistics of these processes are described by recasting the master equations in infinite hierarchical form. Solutions can be obtained by implementing various techniques: refinements in these solution techniques are presented. Programs considered include random dimer, trimer, and tetramer filling of 2D lattices, random dimer filling of a cubic lattice, competitive filling of two or more species, and the effect of a random distribution of inactive sites on the filling. Also considered is monomer filling of a linear lattice with nearest neighbor cooperative effects and solve for the exact cluster-size distribution for cluster sizes up to the asymptotic regime. Additionally, a technique is developed to directly determine the asymptotic properties of the cluster size distribution. Finally cluster growth is considered via irreversible aggregation involving random walkers. In particular, explicit results are provided for the large-lattice-size asymptotic behavior of trapping probabilities and average walk lengths for a single walker on a lattice with multiple traps. Procedures for exact calculation of these quantities on finite lattices are also developed
Benchmarking i den offentlige sektor
DEFF Research Database (Denmark)
Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels
2008-01-01
I artiklen vil vi kort diskutere behovet for benchmarking i fraværet af traditionelle markedsmekanismer. Herefter vil vi nærmere redegøre for, hvad benchmarking er med udgangspunkt i fire forskellige anvendelser af benchmarking. Regulering af forsyningsvirksomheder vil blive behandlet, hvorefter...
Regional Competitive Intelligence: Benchmarking and Policymaking
Huggins , Robert
2010-01-01
Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...
RELAP5-3D Results for Phase I (Exercise 2) of the OECD/NEA MHTGR-350 MW Benchmark
Energy Technology Data Exchange (ETDEWEB)
Gerhard Strydom
2012-06-01
The coupling of the PHISICS code suite to the thermal hydraulics system code RELAP5-3D has recently been initiated at the Idaho National Laboratory (INL) to provide a fully coupled prismatic Very High Temperature Reactor (VHTR) system modeling capability as part of the NGNP methods development program. The PHISICS code consists of three modules: INSTANT (performing 3D nodal transport core calculations), MRTAU (depletion and decay heat generation) and a perturbation/mixer module. As part of the verification and validation activities, steady state results have been obtained for Exercise 2 of Phase I of the newly-defined OECD/NEA MHTGR-350 MW Benchmark. This exercise requires participants to calculate a steady-state solution for an End of Equilibrium Cycle 350 MW Modular High Temperature Reactor (MHTGR), using the provided geometry, material, and coolant bypass flow description. The paper provides an overview of the MHTGR Benchmark and presents typical steady state results (e.g. solid and gas temperatures, thermal conductivities) for Phase I Exercise 2. Preliminary results are also provided for the early test phase of Exercise 3 using a two-group cross-section library and the Relap5-3D model developed for Exercise 2.
Experimental generation of optical coherence lattices
Energy Technology Data Exchange (ETDEWEB)
Chen, Yahong; Cai, Yangjian, E-mail: serpo@dal.ca, E-mail: yangjiancai@suda.edu.cn [College of Physics, Optoelectronics and Energy and Collaborative Innovation Center of Suzhou Nano Science and Technology, Soochow University, Suzhou 215006 (China); Key Lab of Advanced Optical Manufacturing Technologies of Jiangsu Province and Key Lab of Modern Optical Technologies of Education Ministry of China, Soochow University, Suzhou 215006 (China); Ponomarenko, Sergey A., E-mail: serpo@dal.ca, E-mail: yangjiancai@suda.edu.cn [Department of Electrical and Computer Engineering, Dalhousie University, Halifax, Nova Scotia B3J 2X4 (Canada)
2016-08-08
We report experimental generation and measurement of recently introduced optical coherence lattices. The presented optical coherence lattice realization technique hinges on a superposition of mutually uncorrelated partially coherent Schell-model beams with tailored coherence properties. We show theoretically that information can be encoded into and, in principle, recovered from the lattice degree of coherence. Our results can find applications to image transmission and optical encryption.
Performance Evaluation of Supercomputers using HPCC and IMB Benchmarks
Saini, Subhash; Ciotti, Robert; Gunney, Brian T. N.; Spelce, Thomas E.; Koniges, Alice; Dossa, Don; Adamidis, Panagiotis; Rabenseifner, Rolf; Tiyyagura, Sunil R.; Mueller, Matthias;
2006-01-01
The HPC Challenge (HPCC) benchmark suite and the Intel MPI Benchmark (IMB) are used to compare and evaluate the combined performance of processor, memory subsystem and interconnect fabric of five leading supercomputers - SGI Altix BX2, Cray XI, Cray Opteron Cluster, Dell Xeon cluster, and NEC SX-8. These five systems use five different networks (SGI NUMALINK4, Cray network, Myrinet, InfiniBand, and NEC IXS). The complete set of HPCC benchmarks are run on each of these systems. Additionally, we present Intel MPI Benchmarks (IMB) results to study the performance of 11 MPI communication functions on these systems.
Light Hadron Spectroscopy on course lattices with
Lee, F
1999-01-01
The masses and dispersions of light hadrons are calculated in lattice QCD using an O(a sup 2) tadpole-improved gluon action and an O(a sup 2) tadpole-improved next-nearest-neighbor fermion action originally proposed by Hamber and Wu. Two lattices of constant volume with lattice spacings of approximately 0.40 fm and 0.24 fm are considered. The results reveal some scaling violations at the coarser lattice spacing on the order of 5%. At the finer lattice spacing, the nucleon to rho mass ratio reproduces state-of-the-art results using unimproved actions. Good dispersion and rotational invariance up to momenta of pa approx = 1 are also found. The relative merit of alternative choices for improvement operators is assessed through close comparisons with other plaquette-based tadpole-improved actions.
Cluster computing for lattice QCD simulations
International Nuclear Information System (INIS)
Coddington, P.D.; Williams, A.G.
2000-01-01
main application is lattice QCD calculations. We have a number of programs for generating and analysing lattice QCD configurations. These programs are written in a data parallel style using Fortran 90 array syntax. Initially they were run on the CM-5 by using CM Fortran compiler directives for specifying data distribution among the processors of the parallel machine. It was a simple task to convert these codes to use the equivalent directives for High Performance Fortran (HPF), which is a standard, portable data parallel language that can be used on clusters. We have used the Portland Group HPF compiler (PGHPF), which offers good support for cluster computing. We benchmarked our codes on a number of different types of machine, before eventually deciding to purchase a large cluster from Sun Microsystems, which was installed at Adelaide University in June 2000. With a peak performance of 144 GFlops, it is currently the fastest computer in Australia. The machine is a new product from Sun, known as a Sun Technical Compute Farm (TCF). It consists of a cluster of Sun E420R workstations, each of which has four 450MHz UltraSPARC II processors, with a peak speed of 3.6 GFlops per workstation. The NCFLGT cluster consists of 40 E420R workstations, giving a total of 160 processors, 160 GBytes of memory, 640 MBytes of cache memory, and 720 GBytes of disk. The standard Sun TCF product comes with either Fast or Gigabit Ethernet, with an option for using a very high-bandwidth, low-latency SCI network targeted at parallel computing applications. For most parallel lattice QCD codes, Ethernet does not offer a low enough communications latency, while SCI is very expensive and is overkill for our applications. We therefore decided upon a third-party solution for the network, and will soon be installing a high-speed Myrinet 2000 network. Currently we only have very preliminary performance results for our lattice QCD codes, which look quite promising. We will present detailed performance
Burn-up TRIGA Mark II benchmark experiment
International Nuclear Information System (INIS)
Persic, A.; Ravnik, M.; Zagar, T.
1998-01-01
Different reactor codes are used for calculations of reactor parameters. The accuracy of the programs is tested through comparison of the calculated values with the experimental results. Well-defined and accurately measured benchmarks are required. The experimental results of reactivity measurements, fuel element reactivity worth distribution and fuel-up measurements are presented in this paper. The experiments were performed with partly burnt reactor core. The experimental conditions were well defined, so that the results can be used as a burn-up benchmark test case for a TRIGA Mark II reactor calculations.(author)
Canadian Health Libraries Association.
Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…
Lattice gravity near the continuum limit
International Nuclear Information System (INIS)
Feinberg, G.; Friedberg, R.; Lee, T.D.; Ren, H.C.
1984-01-01
We prove that the lattice gravity always approaches the usual continuum limit when the link length l -> 0, provided that certain general boundary conditions are satisfied. This result holds for any lattice, regular or irregular. Furthermore, for a given lattice, the deviation from its continuum limit can be expressed as a power series in l 2 . General formulas for such a perturbative calculation are given, together with a number of illustrative examples, including the graviton propagator. The lattice gravity satisfies all the invariance properties of Einstein's theory of general relativity. In addition, it is symmetric under a new class of transformations that are absent in the usual continuum theory. The possibility that the lattice theory (with a nonzero l) may be more fundamental is discussed. (orig.)
Lattices for the TRIUMF KAON factory
International Nuclear Information System (INIS)
Servranckx, R.V.; Craddock, M.K.
1989-09-01
Separated-function racetrack lattices have been developed for the KAON Factory accelerators that have more flexibility than the old circular lattices. The arcs of the large rings have a regular FODO structure with a superimposed six-fold symmetric modulation of the betafunction in order to raise γ t to infinity. Straight sections with zero dispersion are provided for rf cavities and fast injection and extraction, and with controlled dispersion for H - injection and slow extraction. For the small rings, sixfold symmetric circular lattices with high γ t are retained. In the Accumulator lattice, a straight section with double waist and controlled η function allows for H - injection and phase-space painting. The ion-optical properties of the lattices and the results from tracking studies are discussed
International Nuclear Information System (INIS)
Mack, G.
1982-01-01
After a description of a pure Yang-Mills theory on a lattice, the author considers a three-dimensional pure U(1) lattice gauge theory. Thereafter he discusses the exact relation between lattice gauge theories with the gauge groups SU(2) and SO(3). Finally he presents Monte Carlo data on phase transitions in SU(2) and SO(3) lattice gauge models. (HSI)
International Nuclear Information System (INIS)
Hennig, D.; Nechvatal, L.
1996-09-01
The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs
Internal space decimation for lattice gauge theories
International Nuclear Information System (INIS)
Flyvbjerg, H.
1984-01-01
By a systematic decimation of internal space lattice gauge theories with continuous symmetry groups are mapped into effective lattice gauge theories with finite symmetry groups. The decimation of internal space makes a larger lattice tractable with the same computational resources. In this sense the method is an alternative to Wilson's and Symanzik's programs of improved actions. As an illustrative test of the method U(1) is decimated to Z(N) and the results compared with Monte Carlo data for Z(4)- and Z(5)-invariant lattice gauge theories. The result of decimating SU(3) to its 1080-element crystal-group-like subgroup is given and discussed. (orig.)
Testing the holographic principle using lattice simulations
Directory of Open Access Journals (Sweden)
Jha Raghav G.
2018-01-01
Full Text Available The lattice studies of maximally supersymmetric Yang-Mills (MSYM theory at strong coupling and large N is important for verifying gauge/gravity duality. Due to the progress made in the last decade, based on ideas from topological twisting and orbifolding, it is now possible to study these theories on the lattice while preserving an exact supersymmetry on the lattice. We present some results from the lattice studies of two-dimensional MSYM which is related to Type II supergravity. Our results agree with the thermodynamics of different black hole phases on the gravity side and the phase transition (Gregory–Laflamme between them.
A Global Vision over Benchmarking Process: Benchmarking Based Enterprises
Sitnikov, Catalina; Giurca Vasilescu, Laura
2008-01-01
Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from the analysis of the performance and underlying the strengths and weaknesses of the enterprise it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...
The benchmark testing of 9Be of CENDL-3
International Nuclear Information System (INIS)
Liu Ping
2002-01-01
CENDL-3, the latest version of China Evaluated Nuclear Data Library was finished. The data of 9 Be were updated, and distributed for benchmark analysis recently. The calculated results were presented, and compared with the experimental data and the results based on other evaluated nuclear data libraries. The results show that CENDL-3 is better than others for most benchmarks
Application of PHEBUS results to benchmarking of nuclear plant safety codes
International Nuclear Information System (INIS)
Birchley, J.; Cripps, R.; Guentay, S.; Hosemann, J.P.
2001-01-01
The PHEBUS Fission Product project comprises six nuclear reactor severe accident simulations, using prototypic core materials and representative geometry and boundary conditions for the coolant loop and containment. The data thus produced are being used to benchmark the computer tools used for nuclear plant accident analysis to reduce the excessive conservatism typical for estimates of the radiological source term. A set of calculations has been carried out to simulate the results of experiment PHEBUS FPT-1 through each of its main stages, using computer models and methods analogous to those currently employed at PSI for assessments of Swiss nuclear plants. Good agreement for the core degradation and containment behaviour builds confidence in the models, while some open questions remain concerning some aspects of the release of fission products from the fuel, their transport and chemical speciation. Of potentially great importance to the reduction in source term estimates is the formation of the non-volatile species, silver iodide. Current investigations are focused on the uncertainty concerning fission product behaviour and the stability of silver iodide under irradiation. (author)
Argonne Code Center: Benchmark problem book.
Energy Technology Data Exchange (ETDEWEB)
None, None
1977-06-01
This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book which was first published in February, 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December, 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement. Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February, 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below followed by the contributors to the earlier editions of the benchmark book.
A large-scale benchmark of gene prioritization methods.
Guala, Dimitri; Sonnhammer, Erik L L
2017-04-21
In order to maximize the use of results from high-throughput experimental studies, e.g. GWAS, for identification and diagnostics of new disease-associated genes, it is important to have properly analyzed and benchmarked gene prioritization tools. While prospective benchmarks are underpowered to provide statistically significant results in their attempt to differentiate the performance of gene prioritization tools, a strategy for retrospective benchmarking has been missing, and new tools usually only provide internal validations. The Gene Ontology(GO) contains genes clustered around annotation terms. This intrinsic property of GO can be utilized in construction of robust benchmarks, objective to the problem domain. We demonstrate how this can be achieved for network-based gene prioritization tools, utilizing the FunCoup network. We use cross-validation and a set of appropriate performance measures to compare state-of-the-art gene prioritization algorithms: three based on network diffusion, NetRank and two implementations of Random Walk with Restart, and MaxLink that utilizes network neighborhood. Our benchmark suite provides a systematic and objective way to compare the multitude of available and future gene prioritization tools, enabling researchers to select the best gene prioritization tool for the task at hand, and helping to guide the development of more accurate methods.
Developing a benchmark for emotional analysis of music.
Aljanaki, Anna; Yang, Yi-Hsuan; Soleymani, Mohammad
2017-01-01
Music emotion recognition (MER) field rapidly expanded in the last decade. Many new methods and new audio features are developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of the new methods because of the data representation diversity and scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, a MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons with 2Hz time resolution). Using DEAM, we organized the 'Emotion in Music' task at MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature-sets. We also describe the design of the benchmark, the evaluation procedures and the data cleaning and transformations that we suggest. The results from the benchmark suggest that the recurrent neural network based approaches combined with large feature-sets work best for dynamic MER.
The growth of minicircle networks on regular lattices
International Nuclear Information System (INIS)
Diao, Y; Hinson, K; Arsuaga, J
2012-01-01
The mitochondrial DNA of trypanosomes is organized into a network of topologically linked minicircles. In order to investigate how key topological properties of the network change with minicircle density, the authors introduced, in an earlier study, a mathematical model in which randomly oriented minicircles were placed on the vertices of the simple square lattice. Using this model, the authors rigorously showed that when the density of minicircles increases, percolation clusters form. For higher densities, these percolation clusters are the backbones for networks of minicircles that saturate the entire lattice. An important relevant question is whether these findings are generally true. That is, whether these results are independent of the choice of the lattices on which the model is based. In this paper, we study two additional lattices (namely the honeycomb and the triangular lattices). These regular lattices are selected because they have been proposed for trypanosomes before and after replication. We compare our findings with our earlier results on the square lattice and show that the mathematical statements derived for the square lattice can be extended to these other lattices qualitatively. This finding suggests the universality of these properties. Furthermore, we performed a numerical study which provided data that are consistent with our theoretical analysis, and showed that the effect of the choice of lattices on the key network topological characteristics is rather small. (paper)
International Nuclear Information System (INIS)
Jo, Jong Chull; Roh, Kyung Wan; Jhung, Myung Jo
2006-12-01
During this work period, a preliminary research has been conducted in the three different and related areas as stated in the proposal: literature survey, preliminary feasibility study of LBM and FEM coupling for FSI problems, and benchmark problems. As far as the literature review was concerned, approximately one hundred articles were found for the LBM techniques and critical review has been performed. The reviewed articles were classified into several topics that are useful for a subsequent development of the proposed computer program. Those topics included immiscible multicomponent flows, flow with energy transport, coupled multi-physics applications, application of the boundary conditions, irregular lattices, and turbulence. Furthermore, some fundamental review of the LBM was also included in this report. Secondly, a description of the LBM and FEM coupling program, which has been developed so far, was described here along with some demonstration examples. The preliminary study showed a great potential of the proposed technique for FSI application. A sample computer program list is also attached as Appendix A. As a future benchmark study, a set of test cases were proposed so that experimental data would be obtained in the next phase of the study. These data would be beneficial to understand the fundamental physics of the FSI nature under different basic conditions, and also provide benchmark results against which the developed program at a later stage could be validated. Finally, the future research direction as the extension of the present work is provided with emphasis on its goal, as well as merits and benefits resulting from the proposed research for the regulatory evaluation activities of KINS and the associated technical activities of industries such as design, manufacturing, fabrication, operation and maintenance
Amiri Delouei, A.; Nazari, M.; Kayhani, M. H.; Kang, S. K.; Succi, S.
2016-04-01
In the current study, a direct-forcing immersed boundary-non-Newtonian lattice Boltzmann method (IB-NLBM) is developed to investigate the sedimentation and interaction of particles in shear-thinning and shear-thickening fluids. In the proposed IB-NLBM, the non-linear mechanics of non-Newtonian particulate flows is detected by combination of the most desirable features of immersed boundary and lattice Boltzmann methods. The noticeable roles of non-Newtonian behavior on particle motion, settling velocity and generalized Reynolds number are investigated by simulating benchmark problem of one-particle sedimentation under the same generalized Archimedes number. The effects of extra force due to added accelerated mass are analyzed on the particle motion which have a significant impact on shear-thinning fluids. For the first time, the phenomena of interaction among the particles, such as Drafting, Kissing, and Tumbling in non-Newtonian fluids are investigated by simulation of two-particle sedimentation and twelve-particle sedimentation. The results show that increasing the shear-thickening behavior of fluid leads to a significant increase in the kissing time. Moreover, the transverse position of particles for shear-thinning fluids during the tumbling interval is different from Newtonian and the shear-thickening fluids. The present non-Newtonian particulate study can be applied in several industrial and scientific applications, like the non-Newtonian sedimentation behavior of particles in food industrial and biological fluids.
Scott, Paul
2006-01-01
A lattice is a (rectangular) grid of points, usually pictured as occurring at the intersections of two orthogonal sets of parallel, equally spaced lines. Polygons that have lattice points as vertices are called lattice polygons. It is clear that lattice polygons come in various shapes and sizes. A very small lattice triangle may cover just 3…
VENUS-2 MOX Core Benchmark: Results of ORNL Calculations Using HELIOS-1.4 - Revised Report
Energy Technology Data Exchange (ETDEWEB)
Ellis, RJ
2001-06-01
The Task Force on Reactor-Based Plutonium Disposition (TFRPD) was formed by the Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) to study reactor physics, fuel performance, and fuel cycle issues related to the disposition of weapons-grade (WG) plutonium as mixed-oxide (MOX) reactor fuel. To advance the goals of the TFRPD, 10 countries and 12 institutions participated in a major TFRPD activity: a blind benchmark study to compare code calculations to experimental data for the VENUS-2 MOX core at SCK-CEN in Mol, Belgium. At Oak Ridge National Laboratory, the HELIOS-1.4 code system was used to perform the comprehensive study of pin-cell and MOX core calculations for the VENUS-2 MOX core benchmark study.
Heavy water critical experiments on plutonium lattice
International Nuclear Information System (INIS)
Miyawaki, Yoshio; Shiba, Kiminori
1975-06-01
This report is the summary of physics study on plutonium lattice made in Heavy Water Critical Experiment Section of PNC. By using Deuterium Critical Assembly, physics study on plutonium lattice has been carried out since 1972. Experiments on following items were performed in a core having 22.5 cm square lattice pitch. (1) Material buckling (2) Lattice parameters (3) Local power distribution factor (4) Gross flux distribution in two region core (5) Control rod worth. Experimental results were compared with theoretical ones calculated by METHUSELAH II code. It is concluded from this study that calculation by METHUSELAH II code has acceptable accuracy in the prediction on plutonium lattice. (author)
Benchmarking of HEU mental annuli critical assemblies with internally reflected graphite cylinder
Directory of Open Access Journals (Sweden)
Xiaobo Liu
2017-01-01
Full Text Available Three experimental configurations of critical assemblies, performed in 1963 at the Oak Ridge Critical Experiment Facility, which are assembled using three different diameter HEU annuli (15-9 inches, 15-7 inches and 13-7 inches metal annuli with internally reflected graphite cylinder are evaluated and benchmarked. The experimental uncertainties which are 0.00057, 0.00058 and 0.00057 respectively, and biases to the benchmark models which are − 0.00286, − 0.00242 and − 0.00168 respectively, were determined, and the experimental benchmark keff results were obtained for both detailed and simplified models. The calculation results for both detailed and simplified models using MCNP6-1.0 and ENDF/B-VII.1 agree well to the benchmark experimental results within difference less than 0.2%. The benchmarking results were accepted for the inclusion of ICSBEP Handbook.
Energy Technology Data Exchange (ETDEWEB)
Kemp, R.; Ward, D.J., E-mail: richard.kemp@ccfe.ac.uk [EURATOM/CCFE Association, Culham Centre for Fusion Energy, Abingdon (United Kingdom); Nakamura, M.; Tobita, K. [Japan Atomic Energy Agency, Rokkasho (Japan); Federici, G. [EFDA Garching, Max Plank Institut fur Plasmaphysik, Garching (Germany)
2012-09-15
Full text: Recent systems studies work within the Broader Approach framework has focussed on benchmarking the EU systems code PROCESS against the Japanese code TPC for conceptual DEMO designs. This paper describes benchmarking work for a conservative, pulsed DEMO and an advanced, steady-state, high-bootstrap fraction DEMO. The resulting former machine is an R{sub 0} = 10 m, a = 2.5 m, {beta}{sub N} < 2.0 device with no enhancement in energy confinement over IPB98. The latter machine is smaller (R{sub 0} = 8 m, a = 2.7 m), with {beta}{sub N} = 3.0, enhanced confinement, and high bootstrap fraction f{sub BS} = 0.8. These options were chosen to test the codes across a wide range of parameter space. While generally in good agreement, some of the code outputs differ. In particular, differences have been identified in the impurity radiation models and flux swing calculations. The global effects of these differences are described and approaches to identifying the best models, including future experiments, are discussed. Results of varying some of the assumptions underlying the modelling are also presented, demonstrating the sensitivity of the solutions to technological limitations and providing guidance for where further research could be focussed. (author)
Criticality Analysis Of TCA Critical Lattices With MNCP-4C Monte Carlo Calculation
International Nuclear Information System (INIS)
Zuhair
2002-01-01
The use of uranium-plutonium mixed oxide (MOX) fuel in electric generation light water reactor (PWR, BWR) is being planned in Japan. Therefore, the accuracy evaluations of neutronic analysis code for MOX cores have been employed by many scientists and reactor physicists. Benchmark evaluations for TCA was done using various calculation methods. The Monte Carlo become the most reliable method to predict criticality of various reactor types. In this analysis, the MCNP-4C code was chosen because various superiorities the code has. All in all, the MCNP-4C calculation for TCA core with 38 MOX critical lattice configurations gave the results with high accuracy. The JENDL-3.2 library showed significantly closer results to the ENDF/B-V. The k eff values calculated with the ENDF/B-VI library gave underestimated results. The ENDF/B-V library gave the best estimation. It can be concluded that MCNP-4C calculation, especially with ENDF/B-V and JENDL-3.2 libraries, for MOX fuel utilized NPP design in reactor core is the best choice
Dervaux, Benoît; Baseilhac, Eric; Fagon, Jean-Yves; Biot, Claire; Blachier, Corinne; Braun, Eric; Debroucker, Frédérique; Detournay, Bruno; Ferretti, Carine; Granger, Muriel; Jouan-Flahault, Chrystel; Lussier, Marie-Dominique; Meyer, Arlette; Muller, Sophie; Pigeon, Martine; De Sahb, Rima; Sannié, Thomas; Sapède, Claudine; Vray, Muriel
2014-01-01
Decree No. 2012-1116 of 2 October 2012 on medico-economic assignments of the French National Authority for Health (Haute autorité de santé, HAS) significantly alters the conditions for accessing the health products market in France. This paper presents a theoretical framework for interpreting the results of the economic evaluation of health technologies and summarises the facts available in France for developing benchmarks that will be used to interpret incremental cost-effectiveness ratios. This literature review shows that it is difficult to determine a threshold value but it is also difficult to interpret then incremental cost effectiveness ratio (ICER) results without a threshold value. In this context, round table participants favour a pragmatic approach based on "benchmarks" as opposed to a threshold value, based on an interpretative and normative perspective, i.e. benchmarks that can change over time based on feedback. © 2014 Société Française de Pharmacologie et de Thérapeutique.
Minimal knotted polygons in cubic lattices
International Nuclear Information System (INIS)
Van Rensburg, E J Janse; Rechnitzer, A
2011-01-01
In this paper we examine numerically the properties of minimal length knotted lattice polygons in the simple cubic, face-centered cubic, and body-centered cubic lattices by sieving minimal length polygons from a data stream of a Monte Carlo algorithm, implemented as described in Aragão de Carvalho and Caracciolo (1983 Phys. Rev. B 27 1635), Aragão de Carvalho et al (1983 Nucl. Phys. B 215 209) and Berg and Foester (1981 Phys. Lett. B 106 323). The entropy, mean writhe, and mean curvature of minimal length polygons are computed (in some cases exactly). While the minimal length and mean curvature are found to be lattice dependent, the mean writhe is found to be only weakly dependent on the lattice type. Comparison of our results to numerical results for the writhe obtained elsewhere (see Janse van Rensburg et al 1999 Contributed to Ideal Knots (Series on Knots and Everything vol 19) ed Stasiak, Katritch and Kauffman (Singapore: World Scientific), Portillo et al 2011 J. Phys. A: Math. Theor. 44 275004) shows that the mean writhe is also insensitive to the length of a knotted polygon. Thus, while these results for the mean writhe and mean absolute writhe at minimal length are not universal, our results demonstrate that these values are quite close the those of long polygons regardless of the underlying lattice and length
OWL2 benchmarking for the evaluation of knowledge based systems.
Directory of Open Access Journals (Sweden)
Sher Afgun Khan
Full Text Available OWL2 semantics are becoming increasingly popular for the real domain applications like Gene engineering and health MIS. The present work identifies the research gap that negligible attention has been paid to the performance evaluation of Knowledge Base Systems (KBS using OWL2 semantics. To fulfil this identified research gap, an OWL2 benchmark for the evaluation of KBS is proposed. The proposed benchmark addresses the foundational blocks of an ontology benchmark i.e. data schema, workload and performance metrics. The proposed benchmark is tested on memory based, file based, relational database and graph based KBS for performance and scalability measures. The results show that the proposed benchmark is able to evaluate the behaviour of different state of the art KBS on OWL2 semantics. On the basis of the results, the end users (i.e. domain expert would be able to select a suitable KBS appropriate for his domain.
Impact testing and analysis for structural code benchmarking
International Nuclear Information System (INIS)
Glass, R.E.
1989-01-01
Sandia National Laboratories, in cooperation with industry and other national laboratories, has been benchmarking computer codes (''Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Cask,'' R.E. Glass, Sandia National Laboratories, 1985; ''Sample Problem Manual for Benchmarking of Cask Analysis Codes,'' R.E. Glass, Sandia National Laboratories, 1988; ''Standard Thermal Problem Set for the Evaluation of Heat Transfer Codes Used in the Assessment of Transportation Packages, R.E. Glass, et al., Sandia National Laboratories, 1988) used to predict the structural, thermal, criticality, and shielding behavior of radioactive materials packages. The first step in the benchmarking of the codes was to develop standard problem sets and to compare the results from several codes and users. This step for structural analysis codes has been completed as described in ''Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Casks,'' R.E. Glass, Sandia National Laboratories, 1985. The problem set is shown in Fig. 1. This problem set exercised the ability of the codes to predict the response to end (axisymmetric) and side (plane strain) impacts with both elastic and elastic/plastic materials. The results from these problems showed that there is good agreement in predicting elastic response. Significant differences occurred in predicting strains for the elastic/plastic models. An example of the variation in predicting plastic behavior is given, which shows the hoop strain as a function of time at the impacting end of Model B. These differences in predicting plastic strains demonstrated a need for benchmark data for a cask-like problem. 6 refs., 5 figs
Lattice QCD: Status and Prospect
International Nuclear Information System (INIS)
Ukawa, Akira
2006-01-01
A brief review is given of the current status and near-future prospect of lattice QCD studies of the Standard Model. After summarizing a bit of history, we describe current attempts toward inclusion of dynamical up, down and strange quarks. Recent results on the light hadron mass spectrum as well as those on the heavy quark quantities are described. Recent work on lattice pentaquark search is summarized. We touch upon the PACS-CS Project for building our next machine for lattice QCD, and conclude with a summary of computer situation and the physics possibilities over the next several years
Energy Technology Data Exchange (ETDEWEB)
Fakhari, Abbas, E-mail: afakhari@nd.edu [Department of Civil and Environmental Engineering and Earth Sciences, University of Notre Dame, Notre Dame, IN 46556 (United States); Geier, Martin [TU Braunschweig, Institute for Computational Modeling in Civil Engineering (iRMB), TU-Braunschweig, Pockelsstr. 3, 38106 Braunschweig (Germany); Lee, Taehun [Department of Mechanical Engineering, The City College of the City University of New York, New York, NY 10031 (United States)
2016-06-15
A mass-conserving lattice Boltzmann method (LBM) for multiphase flows is presented in this paper. The proposed LBM improves a previous model (Lee and Liu, 2010 [21]) in terms of mass conservation, speed-up, and efficiency, and also extends its capabilities for implementation on non-uniform grids. The presented model consists of a phase-field lattice Boltzmann equation (LBE) for tracking the interface between different fluids and a pressure-evolution LBM for recovering the hydrodynamic properties. In addition to the mass conservation property and the simplicity of the algorithm, the advantages of the current phase-field LBE are that it is an order of magnitude faster than the previous interface tracking LBE proposed by Lee and Liu (2010) [21] and it requires less memory resources for data storage. Meanwhile, the pressure-evolution LBM is equipped with a multi-relaxation-time (MRT) collision operator to facilitate attainability of small relaxation rates thereby allowing simulation of multiphase flows at higher Reynolds numbers. Additionally, we reformulate the presented MRT-LBM on nonuniform grids within an adaptive mesh refinement (AMR) framework. Various benchmark studies such as a rising bubble and a falling drop under buoyancy, droplet splashing on a wet surface, and droplet coalescence onto a fluid interface are conducted to examine the accuracy and versatility of the proposed AMR-LBM. The proposed model is further validated by comparing the results with other LB models on uniform grids. A factor of about 20 in savings of computational resources is achieved by using the proposed AMR-LBM. As a more demanding application, the Kelvin–Helmholtz instability (KHI) of a shear-layer flow is investigated for both density-matched and density-stratified binary fluids. The KHI results of the density-matched fluids are shown to be in good agreement with the benchmark AMR results based on the sharp-interface approach. When a density contrast between the two fluids exists, a
International Nuclear Information System (INIS)
Fakhari, Abbas; Geier, Martin; Lee, Taehun
2016-01-01
A mass-conserving lattice Boltzmann method (LBM) for multiphase flows is presented in this paper. The proposed LBM improves a previous model (Lee and Liu, 2010 [21]) in terms of mass conservation, speed-up, and efficiency, and also extends its capabilities for implementation on non-uniform grids. The presented model consists of a phase-field lattice Boltzmann equation (LBE) for tracking the interface between different fluids and a pressure-evolution LBM for recovering the hydrodynamic properties. In addition to the mass conservation property and the simplicity of the algorithm, the advantages of the current phase-field LBE are that it is an order of magnitude faster than the previous interface tracking LBE proposed by Lee and Liu (2010) [21] and it requires less memory resources for data storage. Meanwhile, the pressure-evolution LBM is equipped with a multi-relaxation-time (MRT) collision operator to facilitate attainability of small relaxation rates thereby allowing simulation of multiphase flows at higher Reynolds numbers. Additionally, we reformulate the presented MRT-LBM on nonuniform grids within an adaptive mesh refinement (AMR) framework. Various benchmark studies such as a rising bubble and a falling drop under buoyancy, droplet splashing on a wet surface, and droplet coalescence onto a fluid interface are conducted to examine the accuracy and versatility of the proposed AMR-LBM. The proposed model is further validated by comparing the results with other LB models on uniform grids. A factor of about 20 in savings of computational resources is achieved by using the proposed AMR-LBM. As a more demanding application, the Kelvin–Helmholtz instability (KHI) of a shear-layer flow is investigated for both density-matched and density-stratified binary fluids. The KHI results of the density-matched fluids are shown to be in good agreement with the benchmark AMR results based on the sharp-interface approach. When a density contrast between the two fluids exists, a
Tadpole-improved SU(2) lattice gauge theory
Shakespeare, Norman H.; Trottier, Howard D.
1999-01-01
A comprehensive analysis of tadpole-improved SU(2) lattice gauge theory is made. Simulations are done on isotropic and anisotropic lattices, with and without improvement. Two tadpole renormalization schemes are employed, one using average plaquettes, the other using mean links in the Landau gauge. Simulations are done with spatial lattice spacings as in the range of about 0.1-0.4 fm. Results are presented for the static quark potential, the renormalized lattice anisotropy at/as (where at is the ``temporal'' lattice spacing), and for the scalar and tensor glueball masses. Tadpole improvement significantly reduces discretization errors in the static quark potential and in the scalar glueball mass, and results in very little renormalization of the bare anisotropy that is input to the action. We also find that tadpole improvement using mean links in the Landau gauge results in smaller discretization errors in the scalar glueball mass (as well as in the static quark potential), compared to when average plaquettes are used. The possibility is also raised that further improvement in the scalar glueball mass may result when the coefficients of the operators which correct for discretization errors in the action are computed beyond the tree level.
Relationships between lattice energies of inorganic ionic solids
Kaya, Savaş
2018-06-01
Lattice energy, which is a measure of the stabilities of inorganic ionic solids, is the energy required to decompose a solid into its constituent independent gaseous ions. In the present work, the relationships between lattice energies of many diatomic and triatomic inorganic ionic solids are revealed and a simple rule that can be used for the prediction of the lattice energies of inorganic ionic solids is introduced. According to this rule, the lattice energy of an AB molecule can be predicted with the help of the lattice energies of AX, BY and XY molecules in agreement with the experimental data. This rule is valid for not only diatomic molecules but also triatomic molecules. The lattice energy equations proposed in this rule provides compatible results with previously published lattice energy equations by Jenkins, Kaya, Born-Lande, Born-Mayer, Kapustinskii and Reddy. For a large set of tested molecules, calculated percent standard deviation values considering experimental data and the results of the equations proposed in this work are in general between %1-2%.
Lattice polytopes in coding theory
Directory of Open Access Journals (Sweden)
Ivan Soprunov
2015-05-01
Full Text Available In this paper we discuss combinatorial questions about lattice polytopes motivated by recent results on minimum distance estimation for toric codes. We also include a new inductive bound for the minimum distance of generalized toric codes. As an application, we give new formulas for the minimum distance of generalized toric codes for special lattice point configurations.
Results of the reliability benchmark exercise and the future CEC-JRC program
International Nuclear Information System (INIS)
Amendola, A.
1985-01-01
As a contribution towards identifying problem areas and for assessing probabilistic safety assessment (PSA) methods and procedures of analysis, JRC has organized a wide-range Benchmark Exercise on systems reliability. This has been executed by ten different teams involving seventeen organizations from nine European countries. The exercise has been based on a real case (Auxiliary Feedwater System of EDF Paluel PWR 1300 MWe Unit), starting from analysis of technical specifications, logical and topological layout and operational procedures. Terms of references included both qualitative and quantitative analyses. The subdivision of the exercise into different phases and the rules adopted allowed assessment of the different components of the spread of the overall results. It appeared that modelling uncertainties may overwhelm data uncertainties and major efforts must be spent in order to improve consistency and completeness of qualitative analysis. After successful completion of the first exercise, CEC-JRC program has planned separate exercises on analysis of dependent failures and human factors before approaching the evaluation of a complete accident sequence
Benchmarking af kommunernes sagsbehandling
DEFF Research Database (Denmark)
Amilon, Anna
Fra 2007 skal Ankestyrelsen gennemføre benchmarking af kommuernes sagsbehandlingskvalitet. Formålet med benchmarkingen er at udvikle praksisundersøgelsernes design med henblik på en bedre opfølgning og at forbedre kommunernes sagsbehandling. Dette arbejdspapir diskuterer metoder for benchmarking...
New integrable lattice hierarchies
International Nuclear Information System (INIS)
Pickering, Andrew; Zhu Zuonong
2006-01-01
In this Letter we give a new integrable four-field lattice hierarchy, associated to a new discrete spectral problem. We obtain our hierarchy as the compatibility condition of this spectral problem and an associated equation, constructed herein, for the time-evolution of eigenfunctions. We consider reductions of our hierarchy, which also of course admit discrete zero curvature representations, in detail. We find that our hierarchy includes many well-known integrable hierarchies as special cases, including the Toda lattice hierarchy, the modified Toda lattice hierarchy, the relativistic Toda lattice hierarchy, and the Volterra lattice hierarchy. We also obtain here a new integrable two-field lattice hierarchy, to which we give the name of Suris lattice hierarchy, since the first equation of this hierarchy has previously been given by Suris. The Hamiltonian structure of the Suris lattice hierarchy is obtained by means of a trace identity formula
The world according to lattice QCD
International Nuclear Information System (INIS)
Sharpe, S.R.
1988-12-01
A non-technical introduction to lattice calculations is given. The successes and problems of current calculations are emphasized. A summary of lattice results on non-exotic meson and baryon masses indicates that while calculations in the quenched approximation are becoming reliable, the results differ in systematic ways from the physical values. Results for exotic mesons (glueballs and hybrids) are then presented. The future prospects are discussed. 23 refs., 4 figs
Energy Technology Data Exchange (ETDEWEB)
Son, Sung Wan; Ha, Man Yeong; Yoon, Hyun Sik [Pusan National University, Busan (Korea, Republic of); Jeong, Hae Kwon [POSCO, Pohang (Korea, Republic of); Balachandar, S. [University of Florida, Florida (United States)
2013-02-15
We investigate the discrete lattice effect of various forcing methods in the lattice Boltzmann method (LBM) to include the body force obtained from the immersed boundary method (IBM). In the immersed boundary lattice Boltzmann method (IB-LBM), the LBM needs a forcing method to involve the body force on a forcing point near the immersed boundary that is calculated by IBM. The proper forcing method in LBM is derived to include the body force, which appears to resolve problems such as multiphase flow, non-ideal gas behavior, etc. Many researchers have adopted different forcing methods in LBM to involve the body force from IBM, even when they solved similar problems. However, it is necessary to evaluate the discrete lattice effect, which originates from different forcing methods in LBM, to include the effect of the body force from IBM on the results. Consequently, in this study, a rigorous analysis of the discrete lattice effect for different forcing methods in IB-LBM is performed by solving various problems.
International Nuclear Information System (INIS)
Son, Sung Wan; Ha, Man Yeong; Yoon, Hyun Sik; Jeong, Hae Kwon; Balachandar, S.
2013-01-01
We investigate the discrete lattice effect of various forcing methods in the lattice Boltzmann method (LBM) to include the body force obtained from the immersed boundary method (IBM). In the immersed boundary lattice Boltzmann method (IB-LBM), the LBM needs a forcing method to involve the body force on a forcing point near the immersed boundary that is calculated by IBM. The proper forcing method in LBM is derived to include the body force, which appears to resolve problems such as multiphase flow, non-ideal gas behavior, etc. Many researchers have adopted different forcing methods in LBM to involve the body force from IBM, even when they solved similar problems. However, it is necessary to evaluate the discrete lattice effect, which originates from different forcing methods in LBM, to include the effect of the body force from IBM on the results. Consequently, in this study, a rigorous analysis of the discrete lattice effect for different forcing methods in IB-LBM is performed by solving various problems.
Nucleon electromagnetic form factors from lattice QCD
International Nuclear Information System (INIS)
Alexandrou, C.; Koutsou, G.; Negele, J. W.; Tsapalis, A.
2006-01-01
We evaluate the isovector nucleon electromagnetic form factors in quenched and unquenched QCD on the lattice using Wilson fermions. In the quenched theory we use a lattice of spatial size 3 fm at β=6.0 enabling us to reach low momentum transfers and a lowest pion mass of about 400 MeV. In the unquenched theory we use two degenerate flavors of dynamical Wilson fermions on a lattice of spatial size 1.9 fm at β=5.6 and lowest pion mass of about 380 MeV enabling comparison with the results obtained in the quenched theory. that unquenching effects are small for the pion masses considered in this work. We compare our lattice results to the isovector part of the experimentally measured form factors
International Nuclear Information System (INIS)
Choy, J.H.
1979-06-01
A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base
P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel
1998-01-01
textabstractData Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It
Full CKM matrix with lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Okamoto, Masataka; /Fermilab
2004-12-01
The authors show that it is now possible to fully determine the CKM matrix, for the first time, using lattice QCD. |V{sub cd}|, |V{sub cs}|, |V{sub ub}|, |V{sub cb}| and |V{sub us}| are, respectively, directly determined with the lattice results for form factors of semileptonic D {yields} {pi}lv, D {yields} Klv, B {yields} {pi}lv, B {yields} Dlv and K {yields} {pi}lv decays. The error from the quenched approximation is removed by using the MILC unquenced lattice gauge configurations, where the effect of u, d and s quarks is included. The error from the ''chiral'' extrapolation (m{sub l} {yields} m{sub ud}) is greatly reduced by using improved staggered quarks. The accuracy is comparable to that of the Particle Data Group averages. In addition, |V{sub ud}|, |V{sub ts}|, |V{sub ts}| and |V{sub td}| are determined by using unitarity of the CKM matrix and the experimental result for sin (2{beta}). In this way, they obtain all 9 CKM matrix elements, where the only theoretical input is lattice QCD. They also obtain all the Wolfenstein parameters, for the first time, using lattice QCD.
Convection-diffusion lattice Boltzmann scheme for irregular lattices
Sman, van der R.G.M.; Ernst, M.H.
2000-01-01
In this paper, a lattice Boltzmann (LB) scheme for convection diffusion on irregular lattices is presented, which is free of any interpolation or coarse graining step. The scheme is derived using the axioma that the velocity moments of the equilibrium distribution equal those of the
Thermodynamics of lattice QCD with 2 sextet quarks on Nt=8 lattices
International Nuclear Information System (INIS)
Kogut, J. B.; Sinclair, D. K.
2011-01-01
We continue our lattice simulations of QCD with 2 flavors of color-sextet quarks as a model for conformal or walking technicolor. A 2-loop perturbative calculation of the β function which describes the evolution of this theory's running coupling constant predicts that it has a second zero at a finite coupling. This nontrivial zero would be an infrared stable fixed point, in which case the theory with massless quarks would be a conformal field theory. However, if the interaction between quarks and antiquarks becomes strong enough that a chiral condensate forms before this IR fixed point is reached, the theory is QCD-like with spontaneously broken chiral symmetry and confinement. However, the presence of the nearby IR fixed point means that there is a range of couplings for which the running coupling evolves very slowly, i.e. it ''walks.'' We are simulating the lattice version of this theory with staggered quarks at finite temperature, studying the changes in couplings at the deconfinement and chiral-symmetry restoring transitions as the temporal extent (N t ) of the lattice, measured in lattice units, is increased. Our earlier results on lattices with N t =4, 6 show both transitions move to weaker couplings as N t increases consistent with walking behavior. In this paper we extend these calculations to N t =8. Although both transitions again move to weaker couplings, the change in the coupling at the chiral transition from N t =6 to N t =8 is appreciably smaller than that from N t =4 to N t =6. This indicates that at N t =4, 6 we are seeing strong-coupling effects and that we will need results from N t >8 to determine if the chiral-transition coupling approaches zero as N t →∞, as needed for the theory to walk.
Spectral Gaps in Graphene Antidot Lattices
DEFF Research Database (Denmark)
Barbaroux, Jean-Marie; Cornean, Decebal Horia; Stockmeyer, Edgardo
2017-01-01
We consider the gap creation problem in an antidot graphene lattice, i.e. a sheet of graphene with periodically distributed obstacles. We prove several spectral results concerning the size of the gap and its dependence on different natural parameters related to the antidot lattice....
Kinetic models for irreversible processes on a lattice
Energy Technology Data Exchange (ETDEWEB)
Wolf, N.O.
1979-04-01
The development and application of kinetic lattice models are considered. For the most part, the discussions are restricted to lattices in one-dimension. In Chapter 1, a brief overview of kinetic lattice model formalisms and an extensive literature survey are presented. A review of the kinetic models for non-cooperative lattice events is presented in Chapter 2. The development of cooperative lattice models and solution of the resulting kinetic equations for an infinite and a semi-infinite lattice are thoroughly discussed in Chapters 3 and 4. The cooperative models are then applied to the problem of theoretically dtermining the sticking coefficient for molecular chemisorption in Chapter 5. In Chapter 6, other possible applications of these models and several model generalizations are considered. Finally, in Chapter 7, an experimental study directed toward elucidating the mechanistic factors influencing the chemisorption of methane on single crystal tungsten is reported. In this it differs from the rest of the thesis which deals with the statistical distributions resulting from a given mechanism.
Kinetic models for irreversible processes on a lattice
International Nuclear Information System (INIS)
Wolf, N.O.
1979-04-01
The development and application of kinetic lattice models are considered. For the most part, the discussions are restricted to lattices in one-dimension. In Chapter 1, a brief overview of kinetic lattice model formalisms and an extensive literature survey are presented. A review of the kinetic models for non-cooperative lattice events is presented in Chapter 2. The development of cooperative lattice models and solution of the resulting kinetic equations for an infinite and a semi-infinite lattice are thoroughly discussed in Chapters 3 and 4. The cooperative models are then applied to the problem of theoretically dtermining the sticking coefficient for molecular chemisorption in Chapter 5. In Chapter 6, other possible applications of these models and several model generalizations are considered. Finally, in Chapter 7, an experimental study directed toward elucidating the mechanistic factors influencing the chemisorption of methane on single crystal tungsten is reported. In this it differs from the rest of the thesis which deals with the statistical distributions resulting from a given mechanism
Benchmarking and Learning in Public Healthcare
DEFF Research Database (Denmark)
Buckmaster, Natalie; Mouritsen, Jan
2017-01-01
This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...
Strong dynamics and lattice gauge theory
Schaich, David
and other properties of the new particles predicted by these theories. I find S ≳ 0.1 in the specific theories I study. Although this result still disagrees with experiment, it is much closer to the experimental value than is the conventional wisdom S ≳ 0.3. These results encourage further lattice studies to search for experimentally viable strongly-interacting theories of EWSB.
Benchmarking Danish Vocational Education and Training Programmes
DEFF Research Database (Denmark)
Bogetoft, Peter; Wittrup, Jesper
This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes...... attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes....
Benchmarking & European Sustainable Transport Policies
DEFF Research Database (Denmark)
Gudmundsson, H.
2003-01-01
, Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to...... contribution to the discussions within the Eusponsored BEST Thematic Network (Benchmarking European Sustainable Transport) which ran from 2000 to 2003....
Benchmarking – A tool for judgment or improvement?
DEFF Research Database (Denmark)
Rasmussen, Grane Mikael Gregaard
2010-01-01
perceptions of benchmarking will be presented; public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to enhance which effects, possibilities and challenges that follow in the wake of using this kind......Change in construction is high on the agenda for the Danish government and a comprehensive effort is done in improving quality and efficiency. This has led to an initiated governmental effort in bringing benchmarking into the Danish construction sector. This paper is an appraisal of benchmarking...... as it is presently carried out in the Danish construction sector. Many different perceptions of benchmarking and the nature of the construction sector, lead to an uncertainty in how to perceive and use benchmarking, hence, generating an uncertainty in understanding the effects of benchmarking. This paper addresses...
Storage-Intensive Supercomputing Benchmark Study
Energy Technology Data Exchange (ETDEWEB)
Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A
2007-10-30
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared performance of software-only to GPU-accelerated. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows
BONFIRE: benchmarking computers and computer networks
Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker
2011-01-01
The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...
Lattices with unique complements
Saliĭ, V N
1988-01-01
The class of uniquely complemented lattices properly contains all Boolean lattices. However, no explicit example of a non-Boolean lattice of this class has been found. In addition, the question of whether this class contains any complete non-Boolean lattices remains unanswered. This book focuses on these classical problems of lattice theory and the various attempts to solve them. Requiring no specialized knowledge, the book is directed at researchers and students interested in general algebra and mathematical logic.
A highly simplified 3D BWR benchmark problem
International Nuclear Information System (INIS)
Douglass, Steven; Rahnema, Farzad
2010-01-01
The resurgent interest in reactor development associated with the nuclear renaissance has paralleled significant advancements in computer technology, and allowed for unprecedented computational power to be applied to the numerical solution of neutron transport problems. The current generation of core-level solvers relies on a variety of approximate methods (e.g. nodal diffusion theory, spatial homogenization) to efficiently solve reactor problems with limited computer power; however, in recent years, the increased availability of high-performance computer systems has created an interest in the development of new methods and codes (deterministic and Monte Carlo) to directly solve whole-core reactor problems with full heterogeneity (lattice and core level). This paper presents the development of a highly simplified heterogeneous 3D benchmark problem with physics characteristic of boiling water reactors. The aim of this work is to provide a problem for developers to use to validate new whole-core methods and codes which take advantage of the advanced computational capabilities that are now available. Additionally, eigenvalues and an overview of the pin fission density distribution are provided for the benefit of the reader. (author)
International Nuclear Information System (INIS)
Choi, Sooyoung; Khassenov, Azamat; Lee, Deokjung
2016-01-01
This paper presents a new method of resonance interference effect treatment using resonance interference factor for high fidelity analysis of light water reactors (LWRs). Although there have been significant improvements in the lattice physics calculations over the several decades, there exist still relatively large errors in the resonance interference treatment, in the order of ∼300 pcm in the reactivity prediction of LWRs. In the newly developed method, the impact of resonance interference to the multi-group cross-sections has been quantified and tabulated in a library which can be used in lattice physics calculation as adjustment factors of multi-group cross-sections. The verification of the new method has been performed with Mosteller benchmark, UO_2 and MOX pin-cell depletion problems, and a 17×17 fuel assembly loaded with gadolinia burnable poison, and significant improvements were demonstrated in the accuracy of reactivity and pin power predictions, with reactivity errors down to the order of ∼100 pcm. (author)
SKaMPI: A Comprehensive Benchmark for Public Benchmarking of MPI
Directory of Open Access Journals (Sweden)
Ralf Reussner
2002-01-01
Full Text Available The main objective of the MPI communication library is to enable portable parallel programming with high performance within the message-passing paradigm. Since the MPI standard has no associated performance model, and makes no performance guarantees, comprehensive, detailed and accurate performance figures for different hardware platforms and MPI implementations are important for the application programmer, both for understanding and possibly improving the behavior of a given program on a given platform, as well as for assuring a degree of predictable behavior when switching to another hardware platform and/or MPI implementation. We term this latter goal performance portability, and address the problem of attaining performance portability by benchmarking. We describe the SKaMPI benchmark which covers a large fraction of MPI, and incorporates well-accepted mechanisms for ensuring accuracy and reliability. SKaMPI is distinguished among other MPI benchmarks by an effort to maintain a public performance database with performance data from different hardware platforms and MPI implementations.
Teaching Benchmark Strategy for Fifth-Graders in Taiwan
Yang, Der-Ching; Lai, M. L.
2013-01-01
The key purpose of this study was how we taught the use of benchmark strategy when comparing fraction for fifth-graders in Taiwan. 26 fifth graders from a public elementary in south Taiwan were selected to join this study. Results of this case study showed that students had a much progress on the use of benchmark strategy when comparing fraction…
Entropy-based benchmarking methods
Temurshoev, Umed
2012-01-01
We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a bench-marked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth
Fatigue design of a mechanically biocompatible lattice for a proof-of-concept femoral stem.
Arabnejad Khanoki, Sajad; Pasini, Damiano
2013-06-01
A methodology is proposed to design a spatially periodic microarchitectured material for a two-dimensional femoral implant under walking gait conditions. The material is composed of a graded lattice with controlled property distribution that minimizes concurrently bone resorption and interface failure. The periodic microstructure of the material is designed for fatigue fracture caused by cyclic loadings on the hip joint as a result of walking. The bulk material of the lattice is Ti6AL4V and its microstructure is assumed free of defects. The Soderberg diagram is used for the fatigue design under multiaxial loadings. Two cell topologies, square and Kagome, are chosen to obtain optimized property gradients for a two-dimensional implant. Asymptotic homogenization (AH) theory is used to address the multiscale mechanics of the implant as well as to capture the stress and strain distribution at both the macro and the microscale. The microstress distribution found with AH is also compared with that obtained from a detailed finite element analysis. For the maximum value of the von Mises stress, we observe a deviation of 18.6% in unit cells close to the implant boundary, where the AH assumption of spatial periodicity of the fluctuating fields ceases to hold. In the second part of the paper, the metrics of bone resorption and interface shear stress are used to benchmark the graded cellular implant with existing prostheses made of fully dense titanium implant. The results show that the amount of initial postoperative bone loss for square and Kagome lattice implants decreases, respectively, by 53.8% and 58%. In addition, the maximum shear interface failure at the distal end is significantly reduced by about 79%. A set of proof-of-concepts of planar implants have been fabricated via Electron Beam Melting (EBM) to demonstrate the manufacturability of Ti6AL4V into graded lattices with alternative cell size. Optical microscopy has been used to measure the morphological parameters
The Lattice-Valued Turing Machines and the Lattice-Valued Type 0 Grammars
Directory of Open Access Journals (Sweden)
Juan Tang
2014-01-01
Full Text Available Purpose. The purpose of this paper is to study a class of the natural languages called the lattice-valued phrase structure languages, which can be generated by the lattice-valued type 0 grammars and recognized by the lattice-valued Turing machines. Design/Methodology/Approach. From the characteristic of natural language, this paper puts forward a new concept of the l-valued Turing machine. It can be used to characterize recognition, natural language processing, and dynamic characteristics. Findings. The mechanisms of both the generation of grammars for the lattice-valued type 0 grammar and the dynamic transformation of the lattice-valued Turing machines were given. Originality/Value. This paper gives a new approach to study a class of natural languages by using lattice-valued logic theory.
C5 Benchmark Problem with Discrete Ordinate Radiation Transport Code DENOVO
Energy Technology Data Exchange (ETDEWEB)
Yesilyurt, Gokhan [ORNL; Clarno, Kevin T [ORNL; Evans, Thomas M [ORNL; Davidson, Gregory G [ORNL; Fox, Patricia B [ORNL
2011-01-01
The C5 benchmark problem proposed by the Organisation for Economic Co-operation and Development/Nuclear Energy Agency was modeled to examine the capabilities of Denovo, a three-dimensional (3-D) parallel discrete ordinates (S{sub N}) radiation transport code, for problems with no spatial homogenization. Denovo uses state-of-the-art numerical methods to obtain accurate solutions to the Boltzmann transport equation. Problems were run in parallel on Jaguar, a high-performance supercomputer located at Oak Ridge National Laboratory. Both the two-dimensional (2-D) and 3-D configurations were analyzed, and the results were compared with the reference MCNP Monte Carlo calculations. For an additional comparison, SCALE/KENO-V.a Monte Carlo solutions were also included. In addition, a sensitivity analysis was performed for the optimal angular quadrature and mesh resolution for both the 2-D and 3-D infinite lattices of UO{sub 2} fuel pin cells. Denovo was verified with the C5 problem. The effective multiplication factors, pin powers, and assembly powers were found to be in good agreement with the reference MCNP and SCALE/KENO-V.a Monte Carlo calculations.
Natural uranium lattice in heavy water
International Nuclear Information System (INIS)
Girard, Y.; Koechlin, J.C.; Moreau, J.; Naudet, R.
1959-01-01
A group of Laplacian determinations have been made under critical running conditions in a heavy water pile specially constructed to this end using either complete lattices or samples of lattices employing a two-zone method. The experimental equipment is briefly described: it has been devised to allow rapid modifications of the charge. The methods of measurement employed are also summarily described one operates either by flux charts in the case of lattices which are then used as references, or by progressive replacement of the bars by concentric rings and measurements of the reactivity. In this case, one attempts to obtain the difference between the material laplacian of the central unknown lattice and that of the reference lattice. The method has been specially develop ped to give precision. Results of Laplacian measurements for all these lattice types are presented, allowing the construction of a set of curves as a function of the separation. Various other effects have also been measured: the equivalent reactivity of a mm of water - anisotropy - temperature effect, etc. However in this first attack on the problem, the measurement of a large variety of Laplacian has been carried out, rather than careful measurements in particular cases. It is in this spirit that the interpretation of the results has been made. As a large number of very complex phenomena still escape the possibilities of the calculation, it is considered that a certain number of adjustments are necessary; now these can only give the desired efficiency in forecasting results if they refer to a sufficiently great number of experimental data. It is necessary then to connect the measurements closely on with the other whilst, at the same time, subdividing them according to logically deduced formulae. The principal source of trouble has been that of coherence. The rules governing the calculations employed in the interpretation of the data are given. In the first instance simple formula are used: first of
3D Metallic Lattices for Accelerator Applications
Shapiro, Michael A; Sirigiri, Jagadishwar R; Temkin, Richard J
2005-01-01
We present the results of research on 3D metallic lattices operating at microwave frequencies for application in (1) accelerator structures with higher order mode suppression, (2) Smith-Purcell radiation beam diagnostics, and (3) polaritonic materials for laser acceleration. Electromagnetic waves in a 3D simple cubic lattice formed by metal wires are calculated using HFSS. The bulk modes in the lattice are determined using single cell calculations with different phase advances in all three directions. The Brillouin diagram for the bulk modes is presented and indicates the absence of band gaps in simple lattices except the band below the cutoff. Lattices with thin wires as well as with thick wires have been analyzed. The Brillouin diagram also indicates the presence of low frequency 3D plasmon mode as well as the two degenerate photon modes analogous to those in a 2D lattice. Surface modes for a semi-infinite cubic lattice are modeled as a stack of cells with different phase advances in the two directions alon...
Power reactor pressure vessel benchmarks
International Nuclear Information System (INIS)
Rahn, F.J.
1978-01-01
A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)
Castle, Toen; Sussman, Daniel M; Tanis, Michael; Kamien, Randall D
2016-09-01
Kirigami uses bending, folding, cutting, and pasting to create complex three-dimensional (3D) structures from a flat sheet. In the case of lattice kirigami, this cutting and rejoining introduces defects into an underlying 2D lattice in the form of points of nonzero Gaussian curvature. A set of simple rules was previously used to generate a wide variety of stepped structures; we now pare back these rules to their minimum. This allows us to describe a set of techniques that unify a wide variety of cut-and-paste actions under the rubric of lattice kirigami, including adding new material and rejoining material across arbitrary cuts in the sheet. We also explore the use of more complex lattices and the different structures that consequently arise. Regardless of the choice of lattice, creating complex structures may require multiple overlapping kirigami cuts, where subsequent cuts are not performed on a locally flat lattice. Our additive kirigami method describes such cuts, providing a simple methodology and a set of techniques to build a huge variety of complex 3D shapes.
Stylized whole-core benchmark of the Integral Inherently Safe Light Water Reactor (I2S-LWR) concept
International Nuclear Information System (INIS)
Hon, Ryan; Kooreman, Gabriel; Rahnema, Farzad; Petrovic, Bojan
2017-01-01
Highlights: • A stylized benchmark specification of the I2S-LWR core. • A library of cross sections were generated in both 8 and 47 groups. • Monte Carlo solutions generated for the 8 group library using MCNP5. • Cross sections and pin fission densities provided in journal’s repository. - Abstract: The Integral, Inherently Safe Light Water Reactor (I 2 S-LWR) is a pressurized water reactor (PWR) concept under development by a multi-institutional team led by Georgia Tech. The core is similar in size to small 2-loop PWRs while having the power level of current large reactors (∼1000 MWe) but using uranium silicide fuel and advanced stainless steel cladding. A stylized benchmark specification of the I 2 S-LWR core has been developed in order to test whole-core neutronics codes and methods. For simplification the core was split into 57 distinct material regions for cross section generation. Cross sections were generated using the lattice physics code HELIOS version 1.10 in both 8 and 47 groups. Monte Carlo solutions, including eigenvalue and pin fission densities, were generated for the 8 group library using MCNP5. Due to space limitations in this paper, the full cross section library and normalized pin fission density results are provided in the journal’s electronic repository.
Shielding benchmark problems, (2)
International Nuclear Information System (INIS)
Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.
1980-02-01
Shielding benchmark problems prepared by Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design in the Atomic Energy Society of Japan were compiled by Shielding Laboratory in Japan Atomic Energy Research Institute. Fourteen shielding benchmark problems are presented newly in addition to twenty-one problems proposed already, for evaluating the calculational algorithm and accuracy of computer codes based on discrete ordinates method and Monte Carlo method and for evaluating the nuclear data used in codes. The present benchmark problems are principally for investigating the backscattering and the streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)
Energy Technology Data Exchange (ETDEWEB)
NONE
2006-07-01
The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed with the aim of facilitating schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to create increased attention to the electricity consumption of each separate school by publishing benchmarks which take the schools' age and number of pupils as well as after school activities into account. Benchmarks can be used to make green accounts and work as markers in e.g. energy conservation campaigns, energy management and for educational purposes. The internet tool can be found on www.energiguiden.dk. (BA)
Construction of a Benchmark for the User Experience Questionnaire (UEQ
Directory of Open Access Journals (Sweden)
Martin Schrepp
2017-08-01
Full Text Available Questionnaires are a cheap and highly efficient tool for achieving a quantitative measure of a product’s user experience (UX. However, it is not always easy to decide, if a questionnaire result can really show whether a product satisfies this quality aspect. So a benchmark is useful. It allows comparing the results of one product to a large set of other products. In this paper we describe a benchmark for the User Experience Questionnaire (UEQ, a widely used evaluation tool for interactive products. We also describe how the benchmark can be applied to the quality assurance process for concrete projects.
Lattice dynamics and lattice thermal conductivity of thorium dicarbide
Energy Technology Data Exchange (ETDEWEB)
Liao, Zongmeng [Institute of Theoretical Physics and Department of Physics, East China Normal University, Shanghai 200241 (China); Huai, Ping, E-mail: huaiping@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Qiu, Wujie [Institute of Theoretical Physics and Department of Physics, East China Normal University, Shanghai 200241 (China); State Key Laboratory of High Performance Ceramics and Superfine Microstructure, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050 (China); Ke, Xuezhi, E-mail: xzke@phy.ecnu.edu.cn [Institute of Theoretical Physics and Department of Physics, East China Normal University, Shanghai 200241 (China); Zhang, Wenqing [State Key Laboratory of High Performance Ceramics and Superfine Microstructure, Shanghai Institute of Ceramics, Chinese Academy of Sciences, Shanghai 200050 (China); Zhu, Zhiyuan [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China)
2014-11-15
The elastic and thermodynamic properties of ThC{sub 2} with a monoclinic symmetry have been studied by means of density functional theory and direct force-constant method. The calculated properties including the thermal expansion, the heat capacity and the elastic constants are in a good agreement with experiment. Our results show that the vibrational property of the C{sub 2} dimer in ThC{sub 2} is similar to that of a free standing C{sub 2} dimer. This indicates that the C{sub 2} dimer in ThC{sub 2} is not strongly bonded to Th atoms. The lattice thermal conductivity for ThC{sub 2} was calculated by means of the Debye–Callaway model. As a comparison, the conductivity of ThC was also calculated. Our results show that the ThC and ThC{sub 2} contributions of the lattice thermal conductivity to the total conductivity are 29% and 17%, respectively.
Supersymmetry on a space-time lattice
International Nuclear Information System (INIS)
Kaestner, Tobias
2008-01-01
In this thesis the WZ model in one and two dimensions has been thoroughly investigated. With the help of the Nicolai map it was possible to construct supersymmetrically improved lattice actions that preserve one of several supersymmetries. For the WZ model in one dimension SLAC fermions were utilized for the first time leading to a near-perfect elimination of lattice artifacts. In addition the lattice superpotential does not get modified which in two dimensions becomes important when further (discrete) symmetries of the continuum action are considered. For Wilson fermions two new improvements have been suggested and were shown to yield far better results than standard Wilson fermions concerning lattice artifacts. In the one-dimensional theory Ward Identities were studied.However, supersymmetry violations due to broken supersymmetry could only be detected at coarse lattices and very strong couplings. For the two-dimensional models a detailed analysis of supersymmetric improvement terms was given, both for Wilson and SLAC fermions. (orig.)
Supersymmetry on a space-time lattice
Energy Technology Data Exchange (ETDEWEB)
Kaestner, Tobias
2008-10-28
In this thesis the WZ model in one and two dimensions has been thoroughly investigated. With the help of the Nicolai map it was possible to construct supersymmetrically improved lattice actions that preserve one of several supersymmetries. For the WZ model in one dimension SLAC fermions were utilized for the first time leading to a near-perfect elimination of lattice artifacts. In addition the lattice superpotential does not get modified which in two dimensions becomes important when further (discrete) symmetries of the continuum action are considered. For Wilson fermions two new improvements have been suggested and were shown to yield far better results than standard Wilson fermions concerning lattice artifacts. In the one-dimensional theory Ward Identities were studied.However, supersymmetry violations due to broken supersymmetry could only be detected at coarse lattices and very strong couplings. For the two-dimensional models a detailed analysis of supersymmetric improvement terms was given, both for Wilson and SLAC fermions. (orig.)
Impact testing and analysis for structural code benchmarking
International Nuclear Information System (INIS)
Glass, R.E.
1989-01-01
Sandia National Laboratories, in cooperation with industry and other national laboratories, has been benchmarking computer codes used to predict the structural, thermal, criticality, and shielding behavior of radioactive materials packages. The first step in the benchmarking of the codes was to develop standard problem sets and to compare the results from several codes and users. This step for structural analysis codes has been completed as described in Structural Code Benchmarking for the Analysis of Impact Response of Nuclear Material Shipping Casks, R.E. Glass, Sandia National Laboratories, 1985. The problem set is shown in Fig. 1. This problem set exercised the ability of the codes to predict the response to end (axisymmetric) and side (plane strain) impacts with both elastic and elastic/plastic materials. The results from these problems showed that there is good agreement in predicting elastic response. Significant differences occurred in predicting strains for the elastic/plastic models. An example of the variation in predicting plastic behavior is given, which shows the hoop strain as a function of time at the impacting end of Model B. These differences in predicting plastic strains demonstrated a need for benchmark data for a cask-like problem
Lattice Automata for Control of Self-Reconfigurable Robots
DEFF Research Database (Denmark)
Støy, Kasper
2015-01-01
are extreme versatility and robustness. The organisation of self-reconfigurable robots in a lattice structure and the emphasis on local communication between modules mean that lattice automata are a useful basis for control of self-reconfigurable robots. However, there are significant differences which arise...... mainly from the physical nature of self-reconfigurable robots as opposed to the virtual nature of lattice automata. The problems resulting from these differences are mutual exclusion, handling motion constraints of modules, and unrealistic assumption about global, spatial orientation. Despite...... these problems the self-reconfigurable robot community has successfully applied lattice automata to simple control problems. However, for more complex problems hybrid solutions based on lattice automata and distributed algorithms are used. Hence, lattice automata have shown to have potential for the control...
Thought Experiment to Examine Benchmark Performance for Fusion Nuclear Data
Murata, Isao; Ohta, Masayuki; Kusaka, Sachie; Sato, Fuminobu; Miyamaru, Hiroyuki
2017-09-01
There are many benchmark experiments carried out so far with DT neutrons especially aiming at fusion reactor development. These integral experiments seemed vaguely to validate the nuclear data below 14 MeV. However, no precise studies exist now. The author's group thus started to examine how well benchmark experiments with DT neutrons can play a benchmarking role for energies below 14 MeV. Recently, as a next phase, to generalize the above discussion, the energy range was expanded to the entire region. In this study, thought experiments with finer energy bins have thus been conducted to discuss how to generally estimate performance of benchmark experiments. As a result of thought experiments with a point detector, the sensitivity for a discrepancy appearing in the benchmark analysis is "equally" due not only to contribution directly conveyed to the deterctor, but also due to indirect contribution of neutrons (named (A)) making neutrons conveying the contribution, indirect controbution of neutrons (B) making the neutrons (A) and so on. From this concept, it would become clear from a sensitivity analysis in advance how well and which energy nuclear data could be benchmarked with a benchmark experiment.
Thought Experiment to Examine Benchmark Performance for Fusion Nuclear Data
Directory of Open Access Journals (Sweden)
Murata Isao
2017-01-01
Full Text Available There are many benchmark experiments carried out so far with DT neutrons especially aiming at fusion reactor development. These integral experiments seemed vaguely to validate the nuclear data below 14 MeV. However, no precise studies exist now. The author’s group thus started to examine how well benchmark experiments with DT neutrons can play a benchmarking role for energies below 14 MeV. Recently, as a next phase, to generalize the above discussion, the energy range was expanded to the entire region. In this study, thought experiments with finer energy bins have thus been conducted to discuss how to generally estimate performance of benchmark experiments. As a result of thought experiments with a point detector, the sensitivity for a discrepancy appearing in the benchmark analysis is “equally” due not only to contribution directly conveyed to the deterctor, but also due to indirect contribution of neutrons (named (A making neutrons conveying the contribution, indirect controbution of neutrons (B making the neutrons (A and so on. From this concept, it would become clear from a sensitivity analysis in advance how well and which energy nuclear data could be benchmarked with a benchmark experiment.
Criticality benchmark comparisons leading to cross-section upgrades
International Nuclear Information System (INIS)
Alesso, H.P.; Annese, C.E.; Heinrichs, D.P.; Lloyd, W.R.; Lent, E.M.
1993-01-01
For several years criticality benchmark calculations with COG. COG is a point-wise Monte Carlo code developed at Lawrence Livermore National Laboratory (LLNL). It solves the Boltzmann equation for the transport of neutrons and photons. The principle consideration in developing COG was that the resulting calculation would be as accurate as the point-wise cross-sectional data, since no physics computational approximations were used. The objective of this paper is to report on COG results for criticality benchmark experiments in concert with MCNP comparisons which are resulting in corrections an upgrades to the point-wise ENDL cross-section data libraries. Benchmarking discrepancies reported here indicated difficulties in the Evaluated Nuclear Data Livermore (ENDL) cross-sections for U-238 at thermal neutron energy levels. This led to a re-evaluation and selection of the appropriate cross-section values from several cross-section sets available (ENDL, ENDF/B-V). Further cross-section upgrades anticipated
Performance Targets and External Benchmarking
DEFF Research Database (Denmark)
Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.
Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rates systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking...... as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards...... the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight to the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...
Lattice results for low moments of light meson distribution amplitudes
Energy Technology Data Exchange (ETDEWEB)
Arthur, R.; Boyle, P.A. [Edinburgh Univ. (United Kingdom). SUPA, School of Physics; Broemmel, D.; Flynn, J.M.; Rae, T.D.; Sachrajda, C.T.C. [Southampton Univ. (United Kingdom). School of Physics and Astronomy; Donnellan, M.A. [NIC/DESY Zeuthen (Germany); Juettner, A. [CERN, Geneva (Switzerland). Physics Dept.
2010-12-15
As part of the UKQCD and RBC collaborations' N{sub f} = 2+1 domain-wall fermion phenomenology programme, we calculate the first two moments of the light-cone distribution amplitudes of the pseudoscalar mesons {pi} and K and the (longitudinally-polarised) vector mesons {rho}, K{sup *} and {phi}. We obtain the desired quantities with good precision and are able to discern the expected quark-mass dependence of SU(3)-flavour breaking effects. An important ingredient of the calculation is the nonperturbative renormalisation of lattice operators using the RI{sup '}/MOM technique. (orig.)
Lattice Results for Low Moments of Light Meson Distribution Amplitudes
Arthur, R; Brommel, D; Donnellan, M A; Flynn, J M; Juttner, A; Rae, T D; Sachrajda, C T.C
2011-01-01
As part of the UKQCD and RBC collaborations' N_f=2+1 domain-wall fermion phenomenology programme, we calculate the first two moments of the light-cone distribution amplitudes of the pseudoscalar mesons pion and kaon and the (longitudinally-polarised) vector mesons rho, K-star and phi. We obtain the desired quantities with good precision and are able to discern the expected quark-mass dependence of SU(3)-flavour breaking effects. An important ingredient of the calculation is the nonperturbative renormalisation of lattice operators using the RI'/MOM technique.
Lattice results for low moments of light meson distribution amplitudes
International Nuclear Information System (INIS)
Arthur, R.; Boyle, P.A.; Juettner, A.
2010-12-01
As part of the UKQCD and RBC collaborations' N f = 2+1 domain-wall fermion phenomenology programme, we calculate the first two moments of the light-cone distribution amplitudes of the pseudoscalar mesons π and K and the (longitudinally-polarised) vector mesons ρ, K * and φ. We obtain the desired quantities with good precision and are able to discern the expected quark-mass dependence of SU(3)-flavour breaking effects. An important ingredient of the calculation is the nonperturbative renormalisation of lattice operators using the RI ' /MOM technique. (orig.)
An improved benchmark model for the Big Ten critical assembly - 021
International Nuclear Information System (INIS)
Mosteller, R.D.
2010-01-01
A new benchmark specification is developed for the BIG TEN uranium critical assembly. The assembly has a fast spectrum, and its core contains approximately 10 wt.% enriched uranium. Detailed specifications for the benchmark are provided, and results from the MCNP5 Monte Carlo code using a variety of nuclear-data libraries are given for this benchmark and two others. (authors)
Benchmarking criticality safety calculations with subcritical experiments
International Nuclear Information System (INIS)
Mihalczo, J.T.
1984-06-01
Calculation of the neutron multiplication factor at delayed criticality may be necessary for benchmarking calculations but it may not be sufficient. The use of subcritical experiments to benchmark criticality safety calculations could result in substantial savings in fuel material costs for experiments. In some cases subcritical configurations could be used to benchmark calculations where sufficient fuel to achieve delayed criticality is not available. By performing a variety of measurements with subcritical configurations, much detailed information can be obtained which can be compared directly with calculations. This paper discusses several measurements that can be performed with subcritical assemblies and presents examples that include comparisons between calculation and experiment where possible. Where not, examples from critical experiments have been used but the measurement methods could also be used for subcritical experiments
Nucleon structure from lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Dinter, Simon
2012-11-13
In this thesis we compute within lattice QCD observables related to the structure of the nucleon. One part of this thesis is concerned with moments of parton distribution functions (PDFs). Those moments are essential elements for the understanding of nucleon structure and can be extracted from a global analysis of deep inelastic scattering experiments. On the theoretical side they can be computed non-perturbatively by means of lattice QCD. However, since the time lattice calculations of moments of PDFs are available, there is a tension between these lattice calculations and the results from a global analysis of experimental data. We examine whether systematic effects are responsible for this tension, and study particularly intensively the effects of excited states by a dedicated high precision computation. Moreover, we carry out a first computation with four dynamical flavors. Another aspect of this thesis is a feasibility study of a lattice QCD computation of the scalar quark content of the nucleon, which is an important element in the cross-section of a heavy particle with the nucleon mediated by a scalar particle (e.g. Higgs particle) and can therefore have an impact on Dark Matter searches. Existing lattice QCD calculations of this quantity usually have a large error and thus a low significance for phenomenological applications. We use a variance-reduction technique for quark-disconnected diagrams to obtain a precise result. Furthermore, we introduce a new stochastic method for the calculation of connected 3-point correlation functions, which are needed to compute nucleon structure observables, as an alternative to the usual sequential propagator method. In an explorative study we check whether this new method is competitive to the standard one. We use Wilson twisted mass fermions at maximal twist in all our calculations, such that all observables considered here have only O(a{sup 2}) discretization effects.
Nucleon structure from lattice QCD
International Nuclear Information System (INIS)
Dinter, Simon
2012-01-01
In this thesis we compute within lattice QCD observables related to the structure of the nucleon. One part of this thesis is concerned with moments of parton distribution functions (PDFs). Those moments are essential elements for the understanding of nucleon structure and can be extracted from a global analysis of deep inelastic scattering experiments. On the theoretical side they can be computed non-perturbatively by means of lattice QCD. However, since the time lattice calculations of moments of PDFs are available, there is a tension between these lattice calculations and the results from a global analysis of experimental data. We examine whether systematic effects are responsible for this tension, and study particularly intensively the effects of excited states by a dedicated high precision computation. Moreover, we carry out a first computation with four dynamical flavors. Another aspect of this thesis is a feasibility study of a lattice QCD computation of the scalar quark content of the nucleon, which is an important element in the cross-section of a heavy particle with the nucleon mediated by a scalar particle (e.g. Higgs particle) and can therefore have an impact on Dark Matter searches. Existing lattice QCD calculations of this quantity usually have a large error and thus a low significance for phenomenological applications. We use a variance-reduction technique for quark-disconnected diagrams to obtain a precise result. Furthermore, we introduce a new stochastic method for the calculation of connected 3-point correlation functions, which are needed to compute nucleon structure observables, as an alternative to the usual sequential propagator method. In an explorative study we check whether this new method is competitive to the standard one. We use Wilson twisted mass fermions at maximal twist in all our calculations, such that all observables considered here have only O(a 2 ) discretization effects.
Benchmark simulation models, quo vadis?
Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D
2013-01-01
As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.
Using the fuzzy linear regression method to benchmark the energy efficiency of commercial buildings
International Nuclear Information System (INIS)
Chung, William
2012-01-01
Highlights: ► Fuzzy linear regression method is used for developing benchmarking systems. ► The systems can be used to benchmark energy efficiency of commercial buildings. ► The resulting benchmarking model can be used by public users. ► The resulting benchmarking model can capture the fuzzy nature of input–output data. -- Abstract: Benchmarking systems from a sample of reference buildings need to be developed to conduct benchmarking processes for the energy efficiency of commercial buildings. However, not all benchmarking systems can be adopted by public users (i.e., other non-reference building owners) because of the different methods in developing such systems. An approach for benchmarking the energy efficiency of commercial buildings using statistical regression analysis to normalize other factors, such as management performance, was developed in a previous work. However, the field data given by experts can be regarded as a distribution of possibility. Thus, the previous work may not be adequate to handle such fuzzy input–output data. Consequently, a number of fuzzy structures cannot be fully captured by statistical regression analysis. This present paper proposes the use of fuzzy linear regression analysis to develop a benchmarking process, the resulting model of which can be used by public users. An illustrative example is given as well.
A Seafloor Benchmark for 3-dimensional Geodesy
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone
Utilizing benchmark data from the ANL-ZPR diagnostic cores program
International Nuclear Information System (INIS)
Schaefer, R. W.; McKnight, R. D.
2000-01-01
The support of the criticality safety community is allowing the production of benchmark descriptions of several assemblies from the ZPR Diagnostic Cores Program. The assemblies have high sensitivities to nuclear data for a few isotopes. This can highlight limitations in nuclear data for selected nuclides or in standard methods used to treat these data. The present work extends the use of the simplified model of the U9 benchmark assembly beyond the validation of k eff . Further simplifications have been made to produce a data testing benchmark in the style of the standard CSEWG benchmark specifications. Calculations for this data testing benchmark are compared to results obtained with more detailed models and methods to determine their biases. These biases or corrections factors can then be applied in the use of the less refined methods and models. Data testing results using Versions IV, V, and VI of the ENDF/B nuclear data are presented for k eff , f 28 /f 25 , c 28 /f 25 , and β eff . These limited results demonstrate the importance of studying other integral parameters in addition to k eff in trying to improve nuclear data and methods and the importance of accounting for methods and/or modeling biases when using data testing results to infer the quality of the nuclear data files
The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example
Steyn, H. J.
2015-01-01
Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…
Status and future of lattice gauge theory
International Nuclear Information System (INIS)
Hoek, J.
1989-07-01
The current status of lattice Quantum Chromo Dynamics (QCD) calculations, the computer requirements to obtain physical results and the direction computing is taking are described. First of all, there is a lot of evidence that QCD is the correct theory of strong interactions. Since it is an asymptotically free theory we can use perturbation theory to solve it in the regime of very hard collisions. However even in the case of very hard parton collisions the end-results of the collisions are bound states of quarks and perturbation theory is not sufficient to calculate these final stages. The way to solve the theory in this regime was opened by Wilson. He contemplated replacing the space-time continuum by a discrete lattice, with a lattice spacing a. Continuum physics is then recovered in the limit where the correlation length of the theory, say ξ. is large with respect to the lattice spacing. This will be true if the lattice spacing becomes very small, which for asymptotically free theories also implies that the coupling g becomes small. The lattice approach to QCD is in many respects analogous to the use of finite element methods to solve classical field theories. These finite element methods are easy to apply in 2-dimensional simulations but are computationally demanding in the 3-dimensional case. Therefore it is not unexpected that the 4-dimensional simulations needed for lattice gauge theories have led to an explosion in demand for computing power by theorists. (author)
Moment-based boundary conditions for lattice Boltzmann simulations of natural convection in cavities
Allen, Rebecca
2016-06-29
We study a multiple relaxation time lattice Boltzmann model for natural convection with moment-based boundary conditions. The unknown primary variables of the algorithm at a boundary are found by imposing conditions directly upon hydrodynamic moments, which are then translated into conditions for the discrete velocity distribution functions. The method is formulated so that it is consistent with the second order implementation of the discrete velocity Boltzmann equations for fluid flow and temperature. Natural convection in square cavities is studied for Rayleigh numbers ranging from 103 to 108. An excellent agreement with benchmark data is observed and the flow fields are shown to converge with second order accuracy. Copyright © 2016 Inderscience Enterprises Ltd.
The congruences of a finite lattice a "proof-by-picture" approach
Grätzer, George
2016-01-01
This is a self-contained exposition by one of the leading experts in lattice theory, George Grätzer, presenting the major results of the last 70 years on congruence lattices of finite lattices, featuring the author's signature Proof-by-Picture method. Key features: * Insightful discussion of techniques to construct "nice" finite lattices with given congruence lattices and "nice" congruence-preserving extensions * Contains complete proofs, an extensive bibliography and index, and over 140 illustrations * This new edition includes two new parts on Planar Semimodular Lattices and The Order of Principle Congruences, covering the research of the last 10 years The book is appropriate for a one-semester graduate course in lattice theory, and it is a practical reference for researchers studying lattices. Reviews of the first edition: "There exist a lot of interesting results in this area of lattice theory, and some of them are presented in this book. [This] monograph…is an exceptional work in lattice theory, like ...
Lattices for laymen: a non-specialist's introduction to lattice gauge theory
International Nuclear Information System (INIS)
Callaway, D.J.E.
1985-01-01
The review on lattice gauge theory is based upon a series of lectures given to the Materials Science and Technology Division at Argonne National Laboratory. Firstly the structure of gauge theories in the continuum is discussed. Then the lattice formulation of these theories is presented, including quantum electrodynamics and non-abelian lattice gauge theories. (U.K.)
Aerodynamic Benchmarking of the Deepwind Design
DEFF Research Database (Denmark)
Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge
2015-01-01
The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... the blade solicitation and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts: this shape is considered as a fixed parameter in the benchmarking...
Commensurability effects in holographic homogeneous lattices
International Nuclear Information System (INIS)
Andrade, Tomas; Krikun, Alexander
2016-01-01
An interesting application of the gauge/gravity duality to condensed matter physics is the description of a lattice via breaking translational invariance on the gravity side. By making use of global symmetries, it is possible to do so without scarifying homogeneity of the pertinent bulk solutions, which we thus term as “homogeneous holographic lattices.' Due to their technical simplicity, these configurations have received a great deal of attention in the last few years and have been shown to correctly describe momentum relaxation and hence (finite) DC conductivities. However, it is not clear whether they are able to capture other lattice effects which are of interest in condensed matter. In this paper we investigate this question focusing our attention on the phenomenon of commensurability, which arises when the lattice scale is tuned to be equal to (an integer multiple of) another momentum scale in the system. We do so by studying the formation of spatially modulated phases in various models of homogeneous holographic lattices. Our results indicate that the onset of the instability is controlled by the near horizon geometry, which for insulating solutions does carry information about the lattice. However, we observe no sharp connection between the characteristic momentum of the broken phase and the lattice pitch, which calls into question the applicability of these models to the physics of commensurability.
Benchmarking GNU Radio Kernels and Multi-Processor Scheduling
2013-01-14
Linux, GNU Radio, and file system for target machine • Run benchmarking scripts with VOLK enabled • Run benchmarking scripts with VOLK disabled ...numpy.arange(1,11) pipes3d, stages3d = numpy.meshgrid(pipes, stages) # print results.size 86 # print pipes.size # print stages.size surf = s1.plot_surface...pipes3d, stages3d, results, rstride=1, cstride=1,cmap=cm.jet, linewidth=0, antialiased=True) 90 # f1.colorbar( surf , shrink=0.5, aspect=5) # s1.set_zlim3d
Energy Technology Data Exchange (ETDEWEB)
Schaefer, Stefan [DESY (Germany). Neumann Inst. for Computing
2016-11-01
These configurations are currently in use in many on-going projects carried out by researchers throughout Europe. In particular this data will serve as an essential input into the computation of the coupling constant of QCD, where some of the simulations are still on-going. But also projects computing the masses of hadrons and investigating their structure are underway as well as activities in the physics of heavy quarks. As this initial project of gauge field generation has been successful, it is worthwhile to extend the currently available ensembles with further points in parameter space. These will allow to further study and control systematic effects like the ones introduced by the finite volume, the non-physical quark masses and the finite lattice spacing. In particular certain compromises have still been made in the region where pion masses and lattice spacing are both small. This is because physical pion masses require larger lattices to keep the effects of the finite volume under control. At light pion masses, a precise control of the continuum extrapolation is therefore difficult, but certainly a main goal of future simulations. To reach this goal, algorithmic developments as well as faster hardware will be needed.
Hamiltonian lattice studies of chiral meson field theories
International Nuclear Information System (INIS)
Chin, S.A.
1998-01-01
The latticization of the non-linear sigma model reduces a chiral meson field theory to an O(4) spin lattice system with quantum fluctuations. The result is an interesting marriage between quantum many-body theory and classical spin systems. By solving the resulting lattice Hamiltonian by Monte Carlo methods, the dynamics and thermodynamics of pions can be determined non-perturbatively. In a variational 16 3 lattice study, the ground state chiral phase transition is shown to be first order. Moreover, as the chiral phase transition is approached, the mass gap of pionic collective modes with quantum number of the ω vector meson drops toward zero. (Copyright (1998) World Scientific Publishing Co. Pte. Ltd)
International Nuclear Information System (INIS)
Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.
1978-09-01
Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Comittee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithm and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method and for evaluating the nuclear data used in the codes. (author)
International Nuclear Information System (INIS)
DeGrand, T.
1997-01-01
These lectures provide an introduction to lattice methods for nonperturbative studies of Quantum Chromodynamics. Lecture 1: Basic techniques for QCD and results for hadron spectroscopy using the simplest discretizations; lecture 2: Improved actions--what they are and how well they work; lecture 3: SLAC physics from the lattice-structure functions, the mass of the glueball, heavy quarks and α s (M z ), and B-anti B mixing. 67 refs., 36 figs
Directory of Open Access Journals (Sweden)
Jahn, Franziska
2015-08-01
Full Text Available Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system’s and information management’s costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme observes both general conditions and examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing costs and quality application systems, physical data processing systems, organizational structures of information management and IT services processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.
Few quantum particles on one dimensional lattices
Energy Technology Data Exchange (ETDEWEB)
Valiente Cifuentes, Manuel
2010-06-18
There is currently a great interest in the physics of degenerate quantum gases and low-energy few-body scattering due to the recent experimental advances in manipulation of ultracold atoms by light. In particular, almost perfect periodic potentials, called optical lattices, can be generated. The lattice spacing is fixed by the wavelength of the laser field employed and the angle betwen the pair of laser beams; the lattice depth, defining the magnitude of the different band gaps, is tunable within a large interval of values. This flexibility permits the exploration of different regimes, ranging from the ''free-electron'' picture, modified by the effective mass for shallow optical lattices, to the tight-binding regime of a very deep periodic potential. In the latter case, effective single-band theories, widely used in condensed matter physics, can be implemented with unprecedent accuracy. The tunability of the lattice depth is nowadays complemented by the use of magnetic Feshbach resonances which, at very low temperatures, can vary the relevant atom-atom scattering properties at will. Moreover, optical lattices loaded with gases of effectively reduced dimensionality are experimentally accessible. This is especially important for one spatial dimension, since most of the exactly solvable models in many-body quantum mechanics deal with particles on a line; therefore, experiments with one-dimensional gases serve as a testing ground for many old and new theories which were regarded as purely academic not so long ago. The physics of few quantum particles on a one-dimensional lattice is the topic of this thesis. Most of the results are obtained in the tight-binding approximation, which is amenable to exact numerical or analytical treatment. For the two-body problem, theoretical methods for calculating the stationary scattering and bound states are developed. These are used to obtain, in closed form, the two-particle solutions of both the Hubbard and
Few quantum particles on one dimensional lattices
International Nuclear Information System (INIS)
Valiente Cifuentes, Manuel
2010-01-01
There is currently a great interest in the physics of degenerate quantum gases and low-energy few-body scattering due to the recent experimental advances in manipulation of ultracold atoms by light. In particular, almost perfect periodic potentials, called optical lattices, can be generated. The lattice spacing is fixed by the wavelength of the laser field employed and the angle betwen the pair of laser beams; the lattice depth, defining the magnitude of the different band gaps, is tunable within a large interval of values. This flexibility permits the exploration of different regimes, ranging from the ''free-electron'' picture, modified by the effective mass for shallow optical lattices, to the tight-binding regime of a very deep periodic potential. In the latter case, effective single-band theories, widely used in condensed matter physics, can be implemented with unprecedent accuracy. The tunability of the lattice depth is nowadays complemented by the use of magnetic Feshbach resonances which, at very low temperatures, can vary the relevant atom-atom scattering properties at will. Moreover, optical lattices loaded with gases of effectively reduced dimensionality are experimentally accessible. This is especially important for one spatial dimension, since most of the exactly solvable models in many-body quantum mechanics deal with particles on a line; therefore, experiments with one-dimensional gases serve as a testing ground for many old and new theories which were regarded as purely academic not so long ago. The physics of few quantum particles on a one-dimensional lattice is the topic of this thesis. Most of the results are obtained in the tight-binding approximation, which is amenable to exact numerical or analytical treatment. For the two-body problem, theoretical methods for calculating the stationary scattering and bound states are developed. These are used to obtain, in closed form, the two-particle solutions of both the Hubbard and extended Hubbard models
Theory-motivated benchmark models and superpartners at the Fermilab Tevatron
International Nuclear Information System (INIS)
Kane, G.L.; Nelson, Brent D.; Wang Liantao; Wang, Ting T.; Lykken, J.; Mrenna, Stephen
2003-01-01
Recently published benchmark models have contained rather heavy superpartners. To test the robustness of this result, several benchmark models have been constructed based on theoretically well-motivated approaches, particularly string-based ones. These include variations on anomaly- and gauge-mediated models, as well as gravity mediation. The resulting spectra often have light gauginos that are produced in significant quantities at the Fermilab Tevatron collider, or will be at a 500 GeV linear collider. The signatures also provide interesting challenges for the CERN LHC. In addition, these models are capable of accounting for electroweak symmetry breaking with less severe cancellations among soft supersymmetry breaking parameters than previous benchmark models
Benchmarking Sustainability Practices Use throughout Industrial Construction Project Delivery
Directory of Open Access Journals (Sweden)
Sungmin Yun
2017-06-01
Full Text Available Despite the efforts for sustainability studies in building and infrastructure construction, the sustainability issues in industrial construction remain understudied. Further, few studies evaluate sustainability and benchmark sustainability issues in industrial construction from a management perspective. This study presents a phase-based benchmarking framework for evaluating sustainability practices use focusing on industrial facilities project. Based on the framework, this study quantifies and assesses sustainability practices use, and further sorts the results by project phase and major project characteristics, including project type, project nature, and project delivery method. The results show that sustainability practices were implemented higher in the construction and startup phases relative to other phases, with a very broad range. An assessment by project type and project nature showed significant differences in sustainability practices use, but no significant difference in practices use by project delivery method. This study contributes to providing a benchmarking method for sustainability practices in industrial facilities projects at the project phase level. This study also discusses and provides an application of phase-based benchmarking for sustainability in industrial construction.
Origami lattices with free-form surface ornaments
Janbaz, S.; Noordzij, N.; Widyaratih, Dwisetya Safirna; Hagen, C.W.; Fratila-Apachitei, E.L.; Zadpoor, A.A.
2017-01-01
Lattice structures are used in the design of metamaterials to achieve unusual physical, mechanical, or biological properties. The properties of such metamaterials result from the topology of the lattice structures, which are usually three-dimensionally (3D) printed. To incorporate advanced
On singularities of lattice varieties
Mukherjee, Himadri
2013-01-01
Toric varieties associated with distributive lattices arise as a fibre of a flat degeneration of a Schubert variety in a minuscule. The singular locus of these varieties has been studied by various authors. In this article we prove that the number of diamonds incident on a lattice point $\\a$ in a product of chain lattices is more than or equal to the codimension of the lattice. Using this we also show that the lattice varieties associated with product of chain lattices is smooth.
Memory transfer optimization for a lattice Boltzmann solver on Kepler architecture nVidia GPUs
Mawson, Mark J.; Revell, Alistair J.
2014-10-01
The Lattice Boltzmann method (LBM) for solving fluid flow is naturally well suited to an efficient implementation for massively parallel computing, due to the prevalence of local operations in the algorithm. This paper presents and analyses the performance of a 3D lattice Boltzmann solver, optimized for third generation nVidia GPU hardware, also known as 'Kepler'. We provide a review of previous optimization strategies and analyse data read/write times for different memory types. In LBM, the time propagation step (known as streaming), involves shifting data to adjacent locations and is central to parallel performance; here we examine three approaches which make use of different hardware options. Two of which make use of 'performance enhancing' features of the GPU; shared memory and the new shuffle instruction found in Kepler based GPUs. These are compared to a standard transfer of data which relies instead on optimized storage to increase coalesced access. It is shown that the more simple approach is most efficient; since the need for large numbers of registers per thread in LBM limits the block size and thus the efficiency of these special features is reduced. Detailed results are obtained for a D3Q19 LBM solver, which is benchmarked on nVidia K5000M and K20C GPUs. In the latter case the use of a read-only data cache is explored, and peak performance of over 1036 Million Lattice Updates Per Second (MLUPS) is achieved. The appearance of a periodic bottleneck in the solver performance is also reported, believed to be hardware related; spikes in iteration-time occur with a frequency of around 11 Hz for both GPUs, independent of the size of the problem.
International benchmark tests of the FENDL-1 Nuclear Data Library
International Nuclear Information System (INIS)
Fischer, U.
1997-01-01
An international benchmark validation task has been conducted to validate the fusion evaluated nuclear data library FENDL-1 through data tests against integral 14 MeV neutron experiments. The main objective of this task was to qualify the FENDL-1 working libraries for fusion applications and to elaborate recommendations for further data improvements. Several laboratories and institutions from the European Union, Japan, the Russian Federation and US have contributed to the benchmark task. A large variety of existing integral 14 MeV benchmark experiments was analysed with the FENDL-1 working libraries for continuous energy Monte Carlo and multigroup discrete ordinate calculations. Results of the benchmark analyses have been collected, discussed and evaluated. The major findings, conclusions and recommendations are presented in this paper. With regard to the data quality, it is summarised that fusion nuclear data have reached a high confidence level with the available FENDL-1 data library. With few exceptions this holds for the materials of highest importance for fusion reactor applications. As a result of the performed benchmark analyses, some existing deficiencies and discrepancies have been identified that are recommended for removal in theforthcoming FENDL-2 data file. (orig.)
Issues in Benchmark Metric Selection
Crolotte, Alain
It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
Benchmarking clinical photography services in the NHS.
Arbon, Giles
2015-01-01
Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.
HTR-PROTEUS benchmark calculations. Pt. 1. Unit cell results LEUPRO-1 and LEUPRO-2
International Nuclear Information System (INIS)
Hogenbirk, A.; Stad, R.C.L. van der; Janssen, A.J.; Klippel, H.T.; Kuijper, J.C.
1995-09-01
In the framework of the IAEA Co-ordinated Research Programme (CRP) on 'Validation of Safety Related Physics Calculations for Low-Enriched (LEU) HTGRs' calculational benchmarks are performed on the basis of LEU-HTR pebble-bed critical experiments carried out in the PROTEUS facility at PSI, Switzerland. Of special interest is the treatment of the double heterogeneity of the fuel and the spherical fuel elements of these pebble bed core configurations. Also of interest is the proper calculation of the safety related physics parameters like the effect of water ingress and control rod worth. This document describes the ECN results of the LEUPRO-1 and LEUPRO-2 unitcell calculations performed with the codes WIMS-E, SCALE-4 and MCNP4A. Results of the LEUPRO-1 unit cell with 20% water ingress in the void is also reported for both the single and the double heterogeneous case. Emphasis is put on the intercomparison of the results obtained by the deterministic codes WIMS-E and SCALE-4, and the Monte Carlo code MCNP4A. The LEUPRO whole core calculations will be reported later. (orig.)
Benchmarking Danish Industries
DEFF Research Database (Denmark)
Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette
2003-01-01
compatible survey. The International Manufacturing Strategy Survey (IMSS) doesbring up the question of supply chain management, but unfortunately, we did not have access to thedatabase. Data from the members of the SCOR-model, in the form of benchmarked performance data,may exist, but are nonetheless...... not public. The survey is a cooperative project "Benchmarking DanishIndustries" with CIP/Aalborg University, the Danish Technological University, the DanishTechnological Institute and Copenhagen Business School as consortia partners. The project has beenfunded by the Danish Agency for Trade and Industry...
International Nuclear Information System (INIS)
Haney, Aoife Brophy; Pollitt, Michael G.
2013-01-01
Benchmarking of electricity networks has a key role in sharing the benefits of efficiency improvements with consumers and ensuring regulated companies earn a fair return on their investments. This paper analyses and contrasts the theory and practice of international benchmarking of electricity transmission by regulators. We examine the literature relevant to electricity transmission benchmarking and discuss the results of a survey of 25 national electricity regulators. While new panel data techniques aimed at dealing with unobserved heterogeneity and the validity of the comparator group look intellectually promising, our survey suggests that they are in their infancy for regulatory purposes. In electricity transmission, relative to electricity distribution, choosing variables is particularly difficult, because of the large number of potential variables to choose from. Failure to apply benchmarking appropriately may negatively affect investors’ willingness to invest in the future. While few of our surveyed regulators acknowledge that regulatory risk is currently an issue in transmission benchmarking, many more concede it might be. In the meantime new regulatory approaches – such as those based on tendering, negotiated settlements, a wider range of outputs or longer term grid planning – are emerging and will necessarily involve a reduced role for benchmarking. -- Highlights: •We discuss how to benchmark electricity transmission. •We report survey results from 25 national energy regulators. •Electricity transmission benchmarking is more challenging than benchmarking distribution. •Many regulators concede benchmarking may raise capital costs. •Many regulators are considering new regulatory approaches
Benchmarking of human resources management
Directory of Open Access Journals (Sweden)
David M. Akinnusi
2008-11-01
Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.
Criticality benchmark results for the ENDF60 library with MCNP trademark
International Nuclear Information System (INIS)
Keen, N.D.; Frankle, S.C.; MacFarlane, R.E.
1995-01-01
The continuous-energy neutron data library ENDF60, for use with the Monte Carlo N-Particle radiation transport code MCNP4A, was released in the fall of 1994. The ENDF60 library is comprised of 124 nuclide data files based on the ENDF/B-VI (B-VI) evaluations through Release 2. Fifty-two percent of these B-VI evaluations are translations from ENDF/B-V (B-V). The remaining forty-eight percent are new evaluations which have sometimes changed significantly. Among these changes are greatly increased use of isotopic evaluations, more extensive resonance-parameter evaluations, and energy-angle correlated distributions for secondary particles. In particular, the upper energy limit for the resolved resonance region of 235 U, 238 U and 239 Pu has been extended from 0.082, 4.0, and 0.301 keV to 2..25, 10.0, and 2.5 keV respectively. As regulatory oversight has advanced and performing critical experiments has become more difficult, there has been an increased reliance on computational methods. For the criticality safety community, the performance of the combined transport code and data library is of interest. The purpose of this abstract is to provide benchmarking results to aid the user in determining the best data library for their application
Cuttance, Peter
This paper provides a synthesis of the literature on the role of benchmarking, with a focus on its use in the public sector. Benchmarking is discussed in the context of quality systems, of which it is an important component. The paper describes the basic types of benchmarking, pertinent research about its application in the public sector, the…
Energy Technology Data Exchange (ETDEWEB)
DeGrand, T. [Univ. of Colorado, Boulder, CO (United States). Dept. of Physics
1997-06-01
These lectures provide an introduction to lattice methods for nonperturbative studies of Quantum Chromodynamics. Lecture 1: Basic techniques for QCD and results for hadron spectroscopy using the simplest discretizations; lecture 2: Improved actions--what they are and how well they work; lecture 3: SLAC physics from the lattice-structure functions, the mass of the glueball, heavy quarks and {alpha}{sub s} (M{sub z}), and B-{anti B} mixing. 67 refs., 36 figs.