Uddin, M.N.; Sarker, M.M.; Khan, M.J.H.; Islam, S.M.A.
2009-01-01
The aim of this paper is to present the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through the analysis of the integral parameters of TRX and BAPL benchmark lattices of thermal reactors, for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated from the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 serve as standard benchmarks for testing nuclear data files and have been selected for this analysis. The integral parameters of the said lattices were calculated using the lattice transport code WIMSD-5B based on the generated 69-group cross-section library. The calculated integral parameters were compared to the measured values as well as the results of the Monte Carlo code MCNP. It was found that in most cases, the values of the integral parameters show a good agreement with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 were compared between the two data files using the WIMS library utility code WILLIE, and it was found that the group constants are nearly identical, with only insignificant differences. Therefore, this analysis reflects the validation of the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking the integral parameters of TRX and BAPL lattices, and can also be essential for further neutronic analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.
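Comparisons of calculated integral parameters against measured values, as described above, are conventionally expressed as a calculated-to-experiment (C/E) ratio or a percent deviation. A minimal sketch of that convention follows; the numbers are made up for illustration and are not values from the paper:

```python
# Hypothetical C/E comparison for a lattice integral parameter.
# The numeric values below are illustrative only, not from the paper.

def c_over_e(calculated: float, measured: float) -> float:
    """Calculated-to-experiment ratio; close to unity indicates agreement."""
    return calculated / measured

def percent_deviation(calculated: float, measured: float) -> float:
    """Relative deviation of calculation from experiment, in percent."""
    return 100.0 * (calculated - measured) / measured

# Example: an epithermal-to-thermal U-238 capture ratio (made-up values)
calc, meas = 1.320, 1.311
print(round(c_over_e(calc, meas), 4))
print(round(percent_deviation(calc, meas), 2))
```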
Khan, M.J.H.; Sarker, M.M.; Islam, S.M.A.
2013-01-01
Highlights: ► To validate the SRAC2006 code system for TRIGA neutronics calculations. ► TRX and BAPL lattices are treated as standard benchmarks for this purpose. ► To compare the calculated results with experiment as well as MCNP values in this study. ► The study demonstrates a good agreement with the experiment and the MCNP results. ► Thus, this analysis reflects the validation study of the SRAC2006 code system. - Abstract: The goal of this study is to present the validation of the SRAC2006 code system based on the evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3 for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. This study is achieved through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors. In integral measurements, the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 are treated as standard benchmarks for validating/testing the SRAC2006 code system as well as nuclear data libraries. The integral parameters of the said lattices are calculated using the collision probability transport code PIJ of the SRAC2006 code system at room temperature (20 °C) based on the above libraries. The calculated integral parameters are compared to the measured values as well as the MCNP values based on the Chinese evaluated nuclear data library CENDL-3.0. It was found that in most cases, the values of the integral parameters demonstrate a good agreement with the experiment and the MCNP results. In addition, the group constants in SRAC format for the TRX and BAPL lattices, in the fast and thermal energy ranges respectively, are compared between the above libraries, and it was found that the group constants are nearly identical, with only insignificant differences. Therefore, this analysis reflects the validation of the SRAC2006 code system based on the evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0, and can also be essential for further neutronics calculations.
Khan, M.J.H.; Alam, A.B.M.K.; Ahsan, M.H.; Mamun, K.A.A.; Islam, S.M.A.
2015-01-01
Highlights: • To validate the reactor physics lattice code WIMSD-5B by this analysis. • To model TRX and BAPL critical experiments using WIMSD-5B. • To compare the calculated results with experiment and MCNP results. • To rely on the WIMSD-5B code for TRIGA calculations. - Abstract: The aim of this analysis is to validate the reactor physics lattice transport code WIMSD-5B by TRX (thermal reactor, one-region lattice) and BAPL (Bettis Atomic Power Laboratory, one-region lattice) critical experiments of light water reactors, for neutronics analysis of the 3 MW TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh. This is achieved through the analysis of integral parameters of five light water reactor critical experiments, TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3, based on the evaluated nuclear data libraries JEFF-3.1 and ENDF/B-VII.1. In integral measurements, these experiments are considered standard benchmark lattices for validating the reactor physics lattice transport code WIMSD-5B as well as evaluated nuclear data libraries. The integral parameters of the said critical experiments are calculated using the reactor physics lattice transport code WIMSD-5B. The calculated integral parameters are compared to the measured values as well as the earlier published MCNP results based on the Chinese evaluated nuclear data library CENDL-3.0 for assessment of the deterministic calculation. It was found that the calculated integral parameters give mostly reasonable and globally consistent results with the experiment and the MCNP results. Besides, the group constants in WIMS format for the isotopes U-235 and U-238 were compared between the two data files using the WIMS library utility code WILLIE, and it was found that the group constants are well consistent with each other. Therefore, this analysis reveals the validation of the reactor physics lattice transport code WIMSD-5B based on the JEFF-3.1 and ENDF/B-VII.1 libraries, and can also be essential to further neutronics analysis of the 3 MW TRIGA Mark-II research reactor.
Paratte, J.M.
1985-07-01
The EIR code system for LWR arrays is based on cross sections extracted from ENDF/B-IV and ENDF/B-V by the code ETOBOX. The calculation method for the arrays (code BOXER) as well as the cross sections were applied to the CSEWG benchmark experiments TRX-1 to 4 and BAPL-UO2-1 to 3. The results are compared to the measured values and to some calculations of other institutions. This demonstrates that the deviations of the parameters calculated by BOXER are typical of the cross sections used. A large number of critical experiments were calculated using the measured material bucklings in order to bring to light possible trends in the calculation of the multiplication factor keff. It first emerged that the error bounds of B_m^2 evaluated in the measurements are often optimistic. Two-dimensional calculations improved the results of the cell calculations. With a mean scatter of 4 to 5 mk in the normal arrays, the multiplication factors calculated by BOXER are satisfactory. However, one has to take into account a slight trend of keff to grow with the moderator-to-fuel ratio and the enrichment. (author)
Benchmarking computer platforms for lattice QCD applications
Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.
2003-09-01
We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)
Benchmarking computer platforms for lattice QCD applications
Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.
2004-01-01
We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC.
Benchmarking of the 99-group ANSL-V library
Wright, R.Q.; Ford, W.E. III; Greene, N.M.; Petrie, L.M.; Primm, R.T. III; Westfall, R.M.
1987-01-01
The purpose of this paper is to present thermal benchmark data testing results for the BAPL-1, TRX-1, and SEEP-1 lattices, using selected processed cross-sections from the ANSL-V 99-group library. 7 refs., 1 tab
HELIOS calculations for UO2 lattice benchmarks
Mosteller, R.D.
1998-01-01
Calculations for the ANS UO2 lattice benchmark have been performed with the HELIOS lattice-physics code and six of its cross-section libraries. The results obtained from the different libraries permit conclusions to be drawn regarding the adequacy of the energy group structures and of the ENDF/B-VI evaluation for U-238. Scandpower A/S, the developer of HELIOS, provided Los Alamos National Laboratory with six different cross-section libraries. Three of the libraries were derived directly from Release 3 of ENDF/B-VI (ENDF/B-VI.3) and differ only in the number of groups (34, 89 or 190). The other three libraries are identical to the first three except for a modification to the cross sections for U-238 in the resonance range
Analysis of benchmark lattices with endf/b-vi, jef-2.2 and jendl-3 data
Saglam, M.
1995-01-01
The NJOY Nuclear Data Processing System has been used to process the ENDF/B-VI, JEF-2.2 and JENDL-3 nuclear cross-section data bases into multigroup form. A brief description of the data bases is given, and the assumptions made in processing the data from evaluated nuclear data file format to multigroup format are presented. The differences and similarities of the evaluated nuclear data files have been investigated by producing four-group cross sections with the GROUPIE code and by calculating thermal, fission-spectrum-averaged and 2200 m/s cross sections and resonance integrals with the INTER code. It has been shown that the evaluated data for U238 in JEF and ENDF/B-VI are essentially the same, while for U235 the same is true of JENDL and ENDF/B-VI. The evaluations for U233 and Th232 differ among all three evaluated files. Several utility codes have been written to convert the multigroup library into a WIMS-D4 compatible binary library. The performance and suitability of the generated libraries have been tested with the metal-fueled TRX lattices, the uranium-oxide-fueled BAPL lattices and the Th232-U233 fueled BNL lattices. The use of a new thermal scattering matrix for hydrogen from ENDF/B-VI increased keff by 0.5%, while the use of ENDF/B-VI U238 decreased it by 2.5%. Although the original WIMS library performed well for the effective multiplication factor of the lattices, the new data improve the epithermal-to-thermal capture rate of U238 in the TRX and BAPL lattices. The effect of the fission spectrum is investigated for the BNL lattices, and it is shown that using the U233 fission spectrum instead of the original U235 spectrum gives a keff which agrees better with the experimental value. The results obtained with the new multigroup data are generally acceptable and within the experimental error range. They especially improve the prediction of the reaction-rate-dependent benchmark parameters
Pelloni, S.; Grimm, P.; Mathews, D.; Paratte, J.M.
1989-06-01
In this report the capability of various code systems widely used at PSI (such as WIMS-D, BOXER, and the AARE modules TRAMIX and MICROX-2 in connection with the one-dimensional transport code ONEDANT) and JEF-1 based nuclear data libraries to compute LWR lattices is analysed by comparing results from thermal reactor benchmarks TRX and BAPL with experiment and with previously published values. It is shown that with the JEF-1 evaluation eigenvalues are generally well predicted within 8 mk (1 mk = 0.001) or less by all code systems, and that all methods give reasonable results for the measured reaction rate within or not too far from the experimental uncertainty. This is consistent with previous similar studies. (author) 7 tabs., 36 refs
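The eigenvalue agreement above is quoted in mk, where 1 mk = 0.001 in reactivity. A minimal sketch of that conversion, with purely illustrative eigenvalues (not values from the report):

```python
# Reactivity difference between two eigenvalues, expressed in mk.
# Eigenvalues below are illustrative only, not from the report.

def reactivity(k: float) -> float:
    """Reactivity rho = (k - 1) / k, dimensionless."""
    return (k - 1.0) / k

def delta_rho_mk(k_calc: float, k_ref: float) -> float:
    """Reactivity difference in mk (1 mk = 0.001 in reactivity)."""
    return 1000.0 * (reactivity(k_calc) - reactivity(k_ref))

# A calculated eigenvalue vs. a reference one (made-up numbers)
print(round(delta_rho_mk(1.005, 0.998), 2))  # within the 8 mk band quoted above
```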
Validation of WIMS-CANDU using Pin-Cell Lattices
Kim, Won Young; Min, Byung Joo; Park, Joo Hwan
2006-01-01
WIMS-CANDU is a lattice code with a depletion capability for the analysis of reactor physics problems related to design and safety. The WIMS-CANDU code has been developed from WIMSD5B, a version of the WIMS code released by the OECD/NEA data bank in 1998. The lattice code POWDERPUFS-V (PPV) has been used for the physics design and analysis of natural uranium fuel for the CANDU reactor. However, since the application of PPV is limited to fresh fuel due to its empirical correlations, the WIMS-AECL code was developed by AECL to replace PPV. The WIMS-CANDU code is likewise being developed to perform the physics analysis of the presently operating CANDU reactors as a replacement for PPV. As part of this development work, the U-238 absorption cross section in the nuclear data library of WIMS-CANDU was updated, and WIMS-CANDU was validated using benchmark problems for pin-cell lattices such as TRX-1, TRX-2, BAPL-1, BAPL-2 and BAPL-3. The results of WIMS-CANDU and WIMS-AECL were compared with the experimental data
Uranium-fuel thermal reactor benchmark testing of CENDL-3
Liu Ping
2001-01-01
CENDL-3, the new version of the China Evaluated Nuclear Data Library, has recently been processed and distributed for thermal reactor benchmark analysis. The processing was carried out using the NJOY nuclear data processing system. Calculations and analyses of the uranium-fuel thermal assemblies TRX-1,2, BAPL-1,2,3 and ZEEP-1,2,3 were done with the lattice code WIMSD5A. The results were compared with the experimental results, the results of the '1986' WIMS library and the results based on ENDF/B-VI. (author)
HELIOS2: Benchmarking against experiments for hexagonal and square lattices
Simeonov, T.
2009-01-01
HELIOS2 is a 2D transport theory program for fuel burnup and gamma-flux calculation. It solves the neutron and gamma transport equations in a general, two-dimensional geometry bounded by a polygon of straight lines. The transport solver may be chosen between the Method of Collision Probabilities (CP) and the Method of Characteristics (MoC). The former is well known for its successful application in the preparation of cross-section data banks for 3D simulators for all lattice types of WWER, PWR, BWR, AGR, RBMK and CANDU reactors. The latter, MoC, helps in areas where the computational requirements of CP become too large for practical application. The application of HELIOS2 and the Method of Characteristics to some computationally large benchmarks is presented in this paper. The analysis combines comparisons to measured data from the Hungarian ZR-6 reactor and the JAERI Tank-type Critical Assembly (TCA) to verify and validate HELIOS2 and MoC for WWER assembly imitators; configurations with different absorber types (ZrB2, B4C, Eu2O3 and Gd2O3); and critical configurations with stainless steel in the reflector. Core eigenvalues and reaction rates are compared. Accounting for the uncertainties, the results are generally excellent. Particular attention is given to the effect of an iron radial reflector. Comparisons to measurements from TIC and TCA for stainless-steel- and iron-reflected cores are presented. The reactivity effect calculated by HELIOS2 is in very good agreement with the measurements. (author)
Benchmarking lattice physics data and methods for boiling water reactor analysis
Cacciapouti, R.J.; Edenius, M.; Harris, D.R.; Hebert, M.J.; Kapitz, D.M.; Pilat, E.E.; VerPlanck, D.M.
1983-01-01
The objective of the work reported was to verify the adequacy of lattice physics modeling for the analysis of the Vermont Yankee BWR using a multigroup, two-dimensional transport theory code. The BWR lattice physics methods have been benchmarked against reactor physics experiments, higher order calculations, and actual operating data
Benchmark test of CP-PACS for lattice QCD
Yoshie, Tomoteru
1996-01-01
The CP-PACS is a massively parallel computer dedicated to calculations in computational physics, which will be in operation in the spring of 1996 at the Center for Computational Physics, University of Tsukuba. In this paper, we describe the architecture of the CP-PACS and report estimates of its performance for typical lattice QCD calculations. (author)
Review of international solutions to NEACRP benchmark BWR lattice cell problems
Halsall, M.J.
1977-12-01
This paper summarises international solutions to a set of BWR benchmark problems. The problems, posed as an activity sponsored by the Nuclear Energy Agency Committee on Reactor Physics, were as follows: 9-pin supercell with central burnable poison pin, mini-BWR with 4 pin-cells and water gaps and control rod cruciform, full 7 x 7 pin BWR lattice cell with differential U-235 enrichment, and full 8 x 8 pin BWR lattice cell with water-hole, Pu-loading, burnable poison, and homogenised cruciform control rod. Solutions have been contributed by Denmark, Japan, Sweden, Switzerland and the UK. (author)
Meylianti S., Brigita
1999-01-01
Benchmarking means different things to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable for a specific application. This paper discusses those five types of benchmarking in detail, includ...
The impact of ENDF/B-VI Rev. 3 data on thermal reactor lattices
Trkov, A.
1995-10-01
The ENDF/B-VI Revision 3 files have been released through the International Atomic Energy Agency. The data for hydrogen, aluminium and uranium-235 were processed to prepare an updated WIMS-D library. The thermal benchmark lattices TRX, BAPL and DIMPLE were analyzed. The new data for the thermal scattering laws of hydrogen bound in water had no significant influence on the integral parameters. The effect of the new uranium-235 data was to reduce the lattice multiplication factor by up to 0.3% Δk/k. The effect of the new aluminium data was also non-negligible. It was traced to the change in the interpolation law for the total and capture cross sections, which seems incorrect. (author). 8 refs, 1 fig., 2 tabs
Benchmarking of epithermal methods in the lattice-physics code EPRI-CELL
Williams, M.L.; Wright, R.Q.; Barhen, J.; Rothenstein, W.; Toney, B.
1982-01-01
The epithermal cross-section shielding methods used in the lattice physics code EPRI-CELL (E-C) have been extensively studied to determine its major approximations and to examine the sensitivity of computed results to these approximations. The study has resulted in several improvements to the original methodology. These include: treatment of the external moderator source with intermediate resonance (IR) theory, development of a new Dancoff factor expression to account for clad interactions, development of a new method for treating resonance interference, and application of a generalized least-squares method to compute best-estimate values for the Bell factor and group-dependent IR parameters. The modified E-C code with its new ENDF/B-V cross-section library is tested on several numerical benchmark problems. Integral parameters computed by E-C are compared with those obtained from point-cross-section Monte Carlo calculations, and E-C fine-group cross sections are benchmarked against point-cross-section discrete-ordinates calculations. It is found that the code modifications improve agreement between E-C and the more sophisticated methods. E-C shows excellent agreement on the integral parameters and usually agrees within a few percent on fine-group shielded cross sections
Benchmark calculation of APOLLO-2 and SLAROM-UF in a fast reactor lattice
Hazama, T.
2009-07-01
A lattice cell benchmark calculation is carried out for APOLLO2 and SLAROM-UF on the infinite lattice of a simple pin cell representative of a fast reactor. The accuracy in k-infinity and reaction rates is investigated in their reference and standard level calculations. In the first reference level calculation, APOLLO2 and SLAROM-UF agree with the reference value of k-infinity obtained by a continuous-energy Monte Carlo calculation within 50 pcm. However, larger errors are observed in a particular reaction rate and energy range. The major problem common to both codes is in the cross-section library of Pu-239 in the unresolved energy range. In the second reference level calculation, which is based on the ECCO 1968-group structure, both results for k-infinity agree with the reference value within 100 pcm. A resonance overlap effect of several percent is observed in the cross sections of heavy nuclides. In the standard level calculation based on the APOLLO2 library creation methodology, a discrepancy of more than 300 pcm appears. A restriction is revealed in APOLLO2: its standard cross-section library does not have a sufficiently small background cross section to evaluate the self-shielding effect on Fe-56 cross sections. The restriction can be removed by introducing the mixture self-shielding treatment recently added to APOLLO2. The SLAROM-UF original standard level calculation, based on the JFS-3 library creation methodology, is the best among the standard level calculations. The improvement over the SLAROM-UF standard level calculation is achieved mainly by use of a proper weight function for light and intermediate nuclides. (author)
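The 50-300 pcm discrepancies quoted above are differences in reactivity, with 1 pcm = 1e-5. A small sketch of how such a code-to-reference eigenvalue discrepancy is commonly evaluated; the eigenvalues here are illustrative and not taken from the benchmark:

```python
# Eigenvalue discrepancy in pcm between a code result and a reference value.
# Eigenvalues below are illustrative only, not from the benchmark.

def pcm_discrepancy(k_calc: float, k_ref: float) -> float:
    """Reactivity difference in pcm (1 pcm = 1e-5).

    Using rho = 1 - 1/k:  rho_calc - rho_ref = (k_calc - k_ref) / (k_calc * k_ref).
    """
    return 1.0e5 * (k_calc - k_ref) / (k_calc * k_ref)

# A deterministic lattice result vs. a continuous-energy Monte Carlo reference
print(round(pcm_discrepancy(1.00100, 1.00050), 1))  # within the 50 pcm criterion above
```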
Freudenreich, W.E.; Aaldijk, J.K.
1994-08-01
The Working Party on Plutonium Recycling of the Nuclear Science Committee of the OECD Nuclear Energy Agency has initiated a benchmark study on the calculation of the void reactivity effect in MOX lattices. The results presented here were obtained with the continuous-energy, generalized-geometry Monte Carlo transport code MCNP. The cross-section libraries used were processed from the JEF-2.2 evaluation taking into account self-shielding in the unresolved resonance ranges (self-shielding in the resolved resonance ranges is treated by MCNP). For an infinite lattice of unit cells a positive void reactivity effect was found only for the MOX fuel with the largest Pu content. For an infinite lattice of macro cells (a voidable inner zone with different fuel mixtures surrounded by an outer zone of UO2 fuel with moderator) a positive void reactivity effect was obtained for the three MOX fuel types considered. The results are not representative of MOX-loaded power reactor lattices, but serve only to intercompare reactor physics codes and libraries. (orig.)
Beretta Sergio; Dossi Andrea; Grove Hugh
2000-01-01
Due to their particular nature, benchmarking methodologies tend to exceed the boundaries of management techniques and to enter the territory of managerial culture. This culture is also destined to break into the accounting area, not only by strongly supporting the possibility of fixing targets and of measuring and comparing performance (an aspect that is already innovative and worthy of attention), but also by questioning one of the principles (or taboos) of the accounting or...
Benchmark testing of CENDL-2 for U-fuel thermal reactors
Zhang Baocheng; Liu Guisheng; Liu Ping
1995-01-01
Based on CENDL-2, the NJOY-WIMS code system was used to generate 69-group constants and to perform benchmark testing for TRX-1,2; BAPL-UO2-1,2,3; and ZEEP-1,2,3. All the results showed that CENDL-2 is reliable for thermal reactor calculations. (3 tabs.)
D.C. Blitz (David)
2011-01-01
Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine whether current benchmark asset pricing models adequately describe the cross-section of stock returns.
Thermal lattice benchmarks for testing basic evaluated data files, developed with MCNP4B
Maucec, M.; Glumac, B.
1996-01-01
The development of unit cell and full reactor core models of the DIMPLE S01A and TRX-1 and TRX-2 benchmark experiments, using the Monte Carlo computer code MCNP4B, is presented. Nuclear data from the ENDF/B-V and ENDF/B-VI cross-section libraries were used in the calculations. In addition, a comparison is presented with results obtained using similar models and cross-section data from the EJ2-MCNPlib library (which is based upon the JEF-2.2 evaluation) developed at IRC Petten, Netherlands. The results of the criticality calculation with the ENDF/B-VI data library, and a comparison to results obtained using the JEF-2.2 evaluation, confirm the MCNP4B full core model of the DIMPLE reactor as a good benchmark for testing basic evaluated data files. On the other hand, the criticality results obtained using the TRX full core models show less agreement with experiment. It is obvious that without additional data about the TRX geometry, our TRX models are not suitable as Monte Carlo benchmarks. (author)
Assessment of neutron transport codes for application to CANDU fuel lattices analysis
Roh, Gyu Hong; Choi, Hang Bok
1999-08-01
In order to assess the applicability of the WIMS-AECL and HELIOS codes to CANDU fuel lattice analysis, physics calculations have been carried out for the standard CANDU fuel and DUPIC fuel lattices, and the results were compared with those of the Monte Carlo code MCNP-4B. In this study, in order to account for the full isotopic composition and the temperature effect, new MCNP libraries were generated from ENDF/B-VI release 3 and validated against typical benchmark problems. The TRX-1,2 and BAPL-1,2,3 pin-cell lattice and KENO criticality safety benchmark calculations were performed with the new MCNP libraries, and the results showed that the new MCNP library is sufficiently accurate for physics calculations. The lattice codes were then benchmarked against the MCNP code for the major physics parameters, such as burnup reactivity, void reactivity, relative pin power and Doppler coefficient, for the standard CANDU fuel and DUPIC fuel lattices. For the standard CANDU fuel lattice, it was found that the results of the WIMS-AECL calculations are consistent with those of MCNP. For the DUPIC fuel lattice, however, the WIMS-AECL calculations with the ENDF/B-V library showed that the discrepancy from the MCNP results increases when the fuel burnup is relatively high. The burnup reactivities of the WIMS-AECL calculations with the ENDF/B-VI library showed excellent agreement with those of the MCNP calculations for both the standard CANDU and DUPIC fuel lattices. However, the Doppler coefficient shows relatively large discrepancies compared with the MCNP calculations, and the difference increases as the fuel burns. On the other hand, the results of the HELIOS calculations are consistent with those of MCNP, even though the discrepancy is slightly larger than in the case of the standard CANDU fuel lattice. This study has shown that WIMS-AECL produces reliable results for natural uranium fuel. However, it is recommended that the WIMS
McCoy, D.R.
1981-01-01
S_N computational benchmark solutions are generated for a one-group and multigroup fuel-void slab lattice cell which is a rough model of a gas-cooled fast reactor (GCFR) lattice cell. The reactivity induced by the extrusion of the fuel material into the voided region is determined for a series of partially extruded lattice cell configurations. A special modified Gauss S_N ordinate array design is developed in order to obtain eigenvalues with errors less than 0.03% in all of the configurations considered. The modified Gauss S_N ordinate array design has substantially improved eigenvalue angular convergence behavior when compared to existing S_N ordinate array designs used in neutron streaming applications. The angular refinement computations are performed in some cases by using a perturbation theory method which enables one to obtain high-order S_N eigenvalue estimates at greatly reduced computational cost
Sharpe, J.; Salaun, F.; Hummel, D.; Moghrabi, A.; Nowak, M.; Pencer, J.; Novog, D.; Buijs, A.
2015-01-01
Discrepancies in key lattice physics parameters have been observed between various deterministic (e.g. DRAGON and WIMS-AECL) and stochastic (MCNP, KENO) neutron transport codes in modeling previous versions of the Canadian SCWR lattice cell. Further, inconsistencies in these parameters have also been observed when using different nuclear data libraries. In this work, the predictions of k∞, various reactivity coefficients, and relative ring-averaged pin powers have been re-evaluated using these codes and libraries with the most recent 64-element fuel assembly geometry. A benchmark problem has been defined to quantify the dissimilarities between code results for a number of responses along the fuel channel under prescribed hot full power (HFP), hot zero power (HZP) and cold zero power (CZP) conditions and at several fuel burnups (0, 25 and 50 MW·d·kg^-1 [HM]). Results from deterministic (TRITON, DRAGON) and stochastic codes (MCNP6, KENO V.a and KENO-VI) are presented. (author)
Akie, Hiroshi; Ishiguro, Yukio; Takano, Hideki
1988-10-01
The results of the NEACRP HCLWR cell burnup benchmark calculations are summarized in this report. Fifteen organizations from eight countries participated in this benchmark and submitted twenty solutions. Large differences are still observed among the calculated values of void reactivities and conversion ratios. These differences are mainly caused by discrepancies in the reaction rates of U-238, Pu-239 and fission products. The physics problems related to these results are briefly investigated in the report. At the specialists' meeting on these benchmark calculations held in April 1988, it was recommended to perform continuous-energy Monte Carlo calculations in order to obtain reference solutions for design codes. The conclusions of the specialists' meeting are also presented. (author)
Sharpe, J.; Salaun, F.; Hummel, D.; Moghrabi, A., E-mail: sharpejr@mcmaster.ca [McMaster University, Hamilton, ON (Canada); Nowak, M. [McMaster University, Hamilton, ON (Canada); Institut National Polytechnique de Grenoble, Phelma, Grenoble (France); Pencer, J. [McMaster University, Hamilton, ON (Canada); Canadian Nuclear Laboratories, Chalk River, ON, (Canada); Novog, D.; Buijs, A. [McMaster University, Hamilton, ON (Canada)
2015-07-01
Discrepancies in key lattice physics parameters have been observed between various deterministic (e.g. DRAGON and WIMS-AECL) and stochastic (MCNP, KENO) neutron transport codes in modeling previous versions of the Canadian SCWR lattice cell. Further, inconsistencies in these parameters have also been observed when using different nuclear data libraries. In this work, the predictions of k∞, various reactivity coefficients, and relative ring-averaged pin powers have been re-evaluated using these codes and libraries with the most recent 64-element fuel assembly geometry. A benchmark problem has been defined to quantify the dissimilarities between code results for a number of responses along the fuel channel under prescribed hot full power (HFP), hot zero power (HZP) and cold zero power (CZP) conditions and at several fuel burnups (0, 25 and 50 MW·d·kg⁻¹ [HM]). Results from deterministic (TRITON, DRAGON) and stochastic codes (MCNP6, KENO V.a and KENO-VI) are presented. (author)
Salomons, Erik M; Lohman, Walter J A; Zhou, Han
2016-01-01
Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equations of acoustics. It is found that the LBM works well for sound waves, but dissipation of sound waves with the LBM is generally much larger than real dissipation of sound waves in air. To circumvent this problem it is proposed here to use the LBM for assessing the excess sound level, i.e. the difference between the sound level and the free-field sound level. The effect of dissipation on the excess sound level is much smaller than the effect on the sound level, so the LBM can be used to estimate the excess sound level for a non-dissipative atmosphere, which is a useful quantity in atmospheric acoustics. To reduce dissipation in an LBM simulation two approaches are considered: i) reduction of the kinematic viscosity and ii) reduction of the lattice spacing.
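The excess sound level proposed above is simply the difference between the actual level and the free-field level at the same receiver, i.e. a pressure ratio on a dB scale. A minimal sketch (the pressure values are hypothetical, not taken from the article):

```python
import math

def excess_level_db(p_rms, p_free_rms):
    """Excess sound level: difference between the sound level and the
    free-field sound level at the same receiver position, in dB."""
    return 20.0 * math.log10(p_rms / p_free_rms)

# A receiver behind a barrier seeing half the free-field pressure amplitude
# shows about -6 dB excess level (i.e. 6 dB insertion loss).
print(round(excess_level_db(0.5, 1.0), 1))  # → -6.0
```

Because dissipation affects the actual and free-field fields similarly, this difference is far less sensitive to the LBM's exaggerated viscous damping than the absolute level is.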
TRX and UO2 criticality benchmarks with SAM-CE
Beer, M.; Troubetzkoy, E.S.; Lichtenstein, H.; Rose, P.F.
1980-01-01
A set of thermal reactor benchmark calculations with SAM-CE, conducted at both MAGI and BNL, is described. Their purpose was both validation of the SAM-CE reactor eigenvalue capability developed by MAGI and a substantial contribution to the data testing of the ENDF/B-IV and ENDF/B-V libraries. This experience also resulted in increased calculational efficiency of the code, and an example is given. The benchmark analysis included the TRX-1 infinite cell using both ENDF/B-IV and ENDF/B-V cross-section sets, and calculations using ENDF/B-IV of the TRX-1 full core and the TRX-2 cell. BAPL-UO2-1 calculations were conducted for the cell using both ENDF/B-IV and ENDF/B-V, and for the full core with ENDF/B-V.
Wiji Suwarno
2017-02-01
The term benchmarking is encountered in the implementation of total quality management (TQM), termed "holistic quality management" in Indonesian, because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a process of systematic and continuous measurement: comparing an organization's business processes against those of other organizations to obtain information that can help the organization improve its performance.
Lawson, Lartey; Nielsen, Kurt
2005-01-01
We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional … in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.
Peña, Alfredo
This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...
Hougaard, Jens Leth; Tvede, Mich
2002-01-01
Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added … in order to obtain a unique selection…
Analysis of a molten salt reactor benchmark
Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.
2013-01-01
This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)
Leszczynski, Francisco
2002-01-01
The IAEA-WIMS Library Update Project (WLUP) is in its final stage. The final library will be released in 2002. It is the result of research and development carried out by more than ten investigators over 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the benchmark objectives and types are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for analysis and plotting of results is described, with some examples. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)
Agrell, Per J.; Bogetoft, Peter
2017-01-01
Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques…
Hasenfratz, P.
1983-01-01
The author presents a general introduction to lattice gauge theories and discusses non-perturbative methods in the gauge sector. He then shows how the lattice works in obtaining the string tension in SU(2). Lattice QCD at finite physical temperature is discussed. Universality tests in SU(2) lattice QCD are presented. SU(3) pure gauge theory is briefly dealt with. Finally, fermions on the lattice are considered. (Auth.)
On the thermal scattering law data for reactor lattice calculations
Trkov, A.; Mattes, M.
2004-01-01
Thermal scattering law data for hydrogen bound in water, hydrogen bound in zirconium hydride and deuterium bound in heavy water have been re-evaluated. The influence of the thermal scattering law data on critical lattices has been studied with detailed Monte Carlo calculations and a summary of results is presented for a numerical benchmark and for the TRIGA reactor benchmark. Systematics for a large sequence of benchmarks analysed with the WIMS-D lattice code are also presented. (author)
Benchmarking in Foodservice Operations
Johnson, Bonnie
1998-01-01
The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...
Polarization response of RHIC electron lens lattices
Ranjbar, V. H.; Méot, F.; Bai, M.; Abell, D. T.; Meiser, D.
2016-01-01
Depolarization response for a system of two orthogonal snakes at irrational tunes is studied in depth using lattice independent spin integration. Particularly, we consider the effect of overlapping spin resonances in this system, to understand the impact of phase, tune, relative location and threshold strengths of the spin resonances. Furthermore, these results are benchmarked and compared to two dimensional direct tracking results for the RHIC e-lens lattice and the standard lattice. We then consider the effect of longitudinal motion via chromatic scans using direct six dimensional lattice tracking.
Chadderton, L.T.; Johnson, E.; Wohlenberg, T.
1976-01-01
Void lattices in metals apparently owe their stability to elastically anisotropic interactions. An ordered array of voids on the anion sublattice in fluorite does not fit so neatly into this scheme of things. Crowdions may play a part in the formation of the void lattice, and stability may derive from other sources. (Auth.)
Randjbar-Daemi, S.
1995-12-01
The so-called doubling problem in the lattice description of fermions led to a proof that under certain circumstances chiral gauge theories cannot be defined on the lattice. This is called the no-go theorem. It implies that if Γ_A is defined on a lattice then its infrared limit, which should correspond to the quantum description of the classical action for the slowly varying fields on lattice scale, is inevitably a vector-like theory. In particular, if not circumvented, the no-go theorem implies that there is no lattice formulation of the Standard Weinberg-Salam theory or SU(5) GUT, even though the fermions belong to anomaly-free representations of the gauge group. This talk aims to explain one possible attempt at bypassing the no-go theorem. 20 refs.
Thorn, C.B.
1988-01-01
The possibility of studying non-perturbative effects in string theory using a world sheet lattice is discussed. The light-cone lattice string model of Giles and Thorn is studied numerically to assess the accuracy of "coarse lattice" approximations. For free strings a 5 by 15 lattice seems sufficient to obtain better than 10% accuracy for the bosonic string tachyon mass squared. In addition a crude lattice model simulating string-like interactions is studied to find out how easily a coarse lattice calculation can pick out effects such as bound states which would qualitatively alter the spectrum of the free theory. The role of the critical dimension in obtaining a finite continuum limit is discussed. Instead of the "gaussian" lattice model one could use one of the vertex models, whose continuum limit is the same as a gaussian model on a torus of any radius. Indeed, any critical 2-dimensional statistical system will have a stringy continuum limit in the absence of string interactions. 8 refs., 1 fig., 9 tabs.
Benchmarking and Performance Measurement.
Town, J. Stephen
This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…
Benchmarking in the Netherlands
1999-01-01
In two articles an overview is given of the activities in the Dutch industry and energy sector with respect to benchmarking. In benchmarking operational processes of different competitive businesses are compared to improve your own performance. Benchmark covenants for energy efficiency between the Dutch government and industrial sectors contribute to a growth of the number of benchmark surveys in the energy intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies
Smith, L.
1975-01-01
An analysis is given of a number of variants of the basic lattice of the planned ISABELLE storage rings. The variants were formed by removing cells from the normal part of the lattice and juggling the lengths of magnets, cells, and insertions in order to maintain a rational relation of circumference to that of the AGS and approximately the same dispersion. Special insertions, correction windings, and the working line with nonlinear resonances are discussed
A 3D stylized half-core CANDU benchmark problem
Pounders, Justin M.; Rahnema, Farzad; Serghiuta, Dumitru; Tholammakkil, John
2011-01-01
A 3D stylized half-core Canadian deuterium uranium (CANDU) reactor benchmark problem is presented. The benchmark problem is comprised of a heterogeneous lattice of 37-element natural uranium fuel bundles, heavy water moderated, heavy water cooled, with adjuster rods included as reactivity control devices. Furthermore, a 2-group macroscopic cross section library has been developed for the problem to increase the utility of this benchmark for full-core deterministic transport methods development. Monte Carlo results are presented for the benchmark problem in cooled, checkerboard void, and full coolant void configurations.
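A 2-group macroscopic library like the one developed for this benchmark supports quick infinite-medium sanity checks. The following is a generic sketch of the standard two-group k∞ formula (all fission neutrons born fast, no upscatter); the cross-section values are hypothetical illustrations, not the benchmark's library data:

```python
def k_infinity_two_group(nu_sf1, nu_sf2, sa1, sa2, ss12):
    """Infinite-medium multiplication factor for a two-group model.
    Fast balance:    (sa1 + ss12) * phi1 = (nu_sf1*phi1 + nu_sf2*phi2) / k
    Thermal balance:  sa2 * phi2 = ss12 * phi1
    """
    phi2_over_phi1 = ss12 / sa2          # thermal-to-fast flux ratio
    return (nu_sf1 + nu_sf2 * phi2_over_phi1) / (sa1 + ss12)

# Illustrative macroscopic cross sections in cm^-1 (hypothetical values):
k = k_infinity_two_group(nu_sf1=0.008, nu_sf2=0.135, sa1=0.010, sa2=0.100, ss12=0.016)
print(round(k, 3))  # → 1.138
```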
Quantum lattice model solver HΦ
Kawamura, Mitsuaki; Yoshimi, Kazuyoshi; Misawa, Takahiro; Yamaji, Youhei; Todo, Synge; Kawashima, Naoki
2017-08-01
HΦ [aitch-phi] is a program package based on the Lanczos-type eigenvalue solution applicable to a broad range of quantum lattice models, i.e., arbitrary quantum lattice models with two-body interactions, including the Heisenberg model, the Kitaev model, the Hubbard model and the Kondo-lattice model. While it works well on PCs and PC-clusters, HΦ also runs efficiently on massively parallel computers, which considerably extends the tractable range of the system size. In addition, unlike most existing packages, HΦ supports finite-temperature calculations through the method of thermal pure quantum (TPQ) states. In this paper, we explain the theoretical background and user interface of HΦ. We also show benchmark results of HΦ on supercomputers such as the K computer at RIKEN Advanced Institute for Computational Science (AICS) and SGI ICE XA (Sekirei) at the Institute for Solid State Physics (ISSP).
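The Lanczos-type eigenvalue solution that HΦ builds on can be illustrated on a toy problem. This is a generic textbook sketch, not HΦ's implementation; the 4-site open Heisenberg chain is an arbitrary small example:

```python
import numpy as np

def heisenberg_chain(n):
    """Dense S=1/2 Heisenberg chain H = sum_i S_i . S_{i+1} (open ends, J=1)."""
    sx = np.array([[0, 1], [1, 0]]) / 2
    sy = np.array([[0, -1j], [1j, 0]]) / 2
    sz = np.array([[1, 0], [0, -1]]) / 2
    eye = np.eye(2)
    def site_op(op, i):
        mats = [eye] * n
        mats[i] = op
        out = mats[0]
        for m in mats[1:]:
            out = np.kron(out, m)
        return out
    h = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        for op in (sx, sy, sz):
            h += site_op(op, i) @ site_op(op, i + 1)
    return h

def lanczos_ground_energy(h, m=30, seed=0):
    """Plain Lanczos: build an m-step tridiagonal matrix, return its lowest eigenvalue."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=h.shape[0]) + 0j
    v /= np.linalg.norm(v)
    v_prev = np.zeros_like(v)
    beta = 0.0
    alphas, betas = [], []
    for _ in range(m):
        w = h @ v - beta * v_prev
        alpha = np.vdot(v, w).real
        w -= alpha * v
        beta = np.linalg.norm(w)
        alphas.append(alpha)
        if beta < 1e-12:      # Krylov space exhausted
            break
        betas.append(beta)
        v_prev, v = v, w / beta
    k = len(alphas)
    t = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    return np.linalg.eigvalsh(t)[0]

h = heisenberg_chain(4)
e_lanczos = lanczos_ground_energy(h)
e_exact = np.linalg.eigvalsh(h)[0]
print(abs(e_lanczos - e_exact) < 1e-6)  # Lanczos matches exact diagonalization
```

HΦ's advantage over such a dense sketch is that it never stores the Hamiltonian matrix explicitly, which is what makes large sparse lattice models tractable.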
Catterall, Simon
2013-01-01
Discretization of supersymmetric theories is an old problem in lattice field theory. It has resisted solution until quite recently when new ideas drawn from orbifold constructions and topological field theory have been brought to bear on the question. The result has been the creation of a new class of lattice gauge theory in which the lattice action is invariant under one or more supersymmetries. The resultant theories are local and free of doublers and in the case of Yang-Mills theories also possess exact gauge invariance. In principle they form the basis for a truly non-perturbative definition of the continuum supersymmetric field theory. In this talk these ideas are reviewed with particular emphasis being placed on N = 4 super Yang-Mills theory.
U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...
Benchmarking for Higher Education.
Jackson, Norman, Ed.; Lund, Helen, Ed.
The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…
Creutz, M.
1984-01-01
After reviewing some recent developments in supercomputer access, the author discusses a few areas where perturbation theory and lattice gauge simulations make contact. The author concludes with a brief discussion of a deterministic dynamics for the Ising model. This may be useful for numerical studies of nonequilibrium phenomena. 13 references
Benchmarking semantic web technology
García-Castro, R
2009-01-01
This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other. The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:
Benchmarking in University Toolbox
Katarzyna Kuźmicz
2015-06-01
In the face of global competition and rising challenges that higher education institutions (HEIs) meet, it is imperative to increase innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains a detailed examination of types, approaches and scope of benchmarking initiatives. The thorough insight into benchmarking applications enabled developing a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in the higher education setting. The study was performed on the basis of published reports from benchmarking projects, scientific literature and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.
EPRI depletion benchmark calculations using PARAGON
Kucukboyaci, Vefa N.
2015-01-01
Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII based data reduces the excess conservatism and brings the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for 10, 20, 30, 40, 50, and 60 GWd/MTU and 3 cooling times 100 h, 5 years, and 15 years. These benchmark cases are analyzed with PARAGON and the SCALE package and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty
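The depletion reactivity decrements and the 5% Kopp-memo penalty discussed above reduce to simple arithmetic on k-eff values. A sketch with hypothetical k-eff inputs (not values from the EPRI benchmarks):

```python
def reactivity_pcm(k):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1e-5)."""
    return (k - 1.0) / k * 1e5

def depletion_reactivity_decrement(k_fresh, k_depleted):
    """Reactivity worth of depletion between fresh and depleted fuel, in pcm,
    together with the classic 5% depletion-uncertainty penalty on that worth."""
    decrement = reactivity_pcm(k_fresh) - reactivity_pcm(k_depleted)
    return decrement, 0.05 * decrement

# Hypothetical rack k-eff values for fresh and depleted fuel:
dec, penalty = depletion_reactivity_decrement(1.10, 0.95)
print(round(dec), round(penalty))  # → 14354 718
```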
Schaefer, Stefan [DESY (Germany). Neumann Inst. for Computing
2016-11-01
These configurations are currently in use in many on-going projects carried out by researchers throughout Europe. In particular this data will serve as an essential input into the computation of the coupling constant of QCD, where some of the simulations are still on-going. But also projects computing the masses of hadrons and investigating their structure are underway as well as activities in the physics of heavy quarks. As this initial project of gauge field generation has been successful, it is worthwhile to extend the currently available ensembles with further points in parameter space. These will allow to further study and control systematic effects like the ones introduced by the finite volume, the non-physical quark masses and the finite lattice spacing. In particular certain compromises have still been made in the region where pion masses and lattice spacing are both small. This is because physical pion masses require larger lattices to keep the effects of the finite volume under control. At light pion masses, a precise control of the continuum extrapolation is therefore difficult, but certainly a main goal of future simulations. To reach this goal, algorithmic developments as well as faster hardware will be needed.
A simplified 2D HTTR benchmark problem
Zhang, Z.; Rahnema, F.; Pounders, J. M.; Zhang, D.; Ougouag, A.
2009-01-01
To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of relevant whole core configurations. In this paper we have created a numerical benchmark problem in a 2D configuration typical of a high temperature gas cooled prismatic core. This problem was derived from the HTTR start-up experiment. For code-to-code verification, complex details of geometry and material specification of the physical experiments are not necessary. To this end, the benchmark problem presented here is derived by simplifications that remove the unnecessary details while retaining the heterogeneity and major physics properties from the neutronics viewpoint. Also included here is a six-group material (macroscopic) cross section library for the benchmark problem. This library was generated using the lattice depletion code HELIOS. Using this library, benchmark quality Monte Carlo solutions are provided for three different configurations (all-rods-in, partially-controlled and all-rods-out). The reference solutions include the core eigenvalue, block (assembly) averaged fuel pin fission density distributions, and absorption rate in absorbers (burnable poison and control rods). (authors)
Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.
1991-01-01
Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems
Efficient LBM visual simulation on face-centered cubic lattices.
Petkov, Kaloian; Qiu, Feng; Fan, Zhe; Kaufman, Arie E; Mueller, Klaus
2009-01-01
The Lattice Boltzmann method (LBM) for visual simulation of fluid flow generally employs cubic Cartesian (CC) lattices such as the D3Q13 and D3Q19 lattices for the particle transport. However, the CC lattices lead to suboptimal representation of the simulation space. We introduce the face-centered cubic (FCC) lattice, fD3Q13, for LBM simulations. Compared to the CC lattices, the fD3Q13 lattice creates a more isotropic sampling of the simulation domain and its single lattice speed (i.e., link length) simplifies the computations and data storage. Furthermore, the fD3Q13 lattice can be decomposed into two independent interleaved lattices, one of which can be discarded, which doubles the simulation speed. The resulting LBM simulation can be efficiently mapped to the GPU, further increasing the computational performance. We show the numerical advantages of the FCC lattice on channeled flow in 2D and the flow-past-a-sphere benchmark in 3D. In both cases, the comparison is against the corresponding CC lattices using the analytical solutions for the systems as well as velocity field visualizations. We also demonstrate the performance advantages of the fD3Q13 lattice for interactive simulation and rendering of hot smoke in an urban environment using thermal LBM.
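The single-speed property of the fD3Q13 velocity set is easy to verify: the 12 FCC nearest-neighbour links are the signed permutations of (1, 1, 0), all of length √2. A small generic sketch (not the authors' code):

```python
from itertools import product

def fcc_links():
    """The 12 nearest-neighbour link vectors of the FCC lattice: all signed
    permutations of (1, 1, 0). Every link has squared length 2, which is the
    single-speed property the fD3Q13 model exploits (12 links + 1 rest vector)."""
    links = set()
    for axes in ((0, 1), (0, 2), (1, 2)):       # which two axes are nonzero
        for s1, s2 in product((1, -1), repeat=2):
            v = [0, 0, 0]
            v[axes[0]], v[axes[1]] = s1, s2
            links.add(tuple(v))
    return sorted(links)

vs = fcc_links()
print(len(vs))                                       # → 12
print(all(x*x + y*y + z*z == 2 for x, y, z in vs))   # → True
```

Note also that each link changes the coordinate-sum parity by 0 or ±2, which is what lets the lattice decompose into the two independent interleaved sublattices mentioned above.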
Marck, Steven C. van der
2006-01-01
benchmarks deviates only 0.017% from the measured benchmark value. Moreover, no clear trends (with e.g. enrichment, lattice pitch, or spectrum) have been observed. Also for fast spectrum benchmarks, both for intermediately or highly enriched uranium and for plutonium, clear improvements are apparent from the calculations. The results for bare assemblies have improved, as well as those with a depleted or natural uranium reflector. On the other hand, the results for plutonium solutions (PU-SOL-THERM) are still high, on average (over 120 benchmarks) roughly 0.6%. Furthermore there still is a bias for a range of benchmarks based on cores in the Zero Power Reactor (ANL) with sizable amounts of tungsten in them. The results for the fusion shielding benchmarks have not changed significantly, compared to ENDF/B-VI.8, for most materials. The delayed neutron testing shows that the values for both thermal and fast spectrum cases are now well predicted, which is an improvement when compared with ENDF/B-VI.8
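The 0.017% average deviation quoted above is a relative C/E-type discrepancy between calculated and measured benchmark k-eff values; a sketch with hypothetical numbers:

```python
def ce_deviation_percent(calculated, measured):
    """Relative deviation of a calculated benchmark value from the measured one,
    (C/E - 1), expressed in percent."""
    return (calculated / measured - 1.0) * 100.0

# A calculated k-eff of 1.00017 against a measured benchmark value of 1.0000
# deviates by 0.017% (values hypothetical, chosen to mirror the quoted figure).
print(round(ce_deviation_percent(1.00017, 1.0000), 3))  # → 0.017
```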
Benchmarking af kommunernes sagsbehandling [Benchmarking of municipal case processing]
Amilon, Anna
From 2007, the Danish National Social Appeals Board (Ankestyrelsen) is to carry out benchmarking of the quality of the municipalities' case processing. The purpose of the benchmarking is to develop the design of the practice investigations with a view to better follow-up, and to improve the municipalities' case processing. This working paper discusses methods for benchmarking…
Bogetoft, Peter; Nielsen, Kurt
2005-01-01
We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...
P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel
1998-01-01
Data Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It…
Canadian Health Libraries Association.
Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…
Seabrooke, Leonard; Wigan, Duncan
2015-01-01
Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views … are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested … interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained…
Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.
1995-01-01
This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. This paper also evaluates optimization methods of recent RISC/UNIX systems, such as IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler options and math libraries were included in the evaluation process, systems performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of some so-called mainframes from IBM, Hitachi or Fujitsu. The computer performance of the EGS4 Code System on an HP9000/735 (99 MHz) was defined to be one EGS4 Unit. The EGS4 Benchmark Suite was also run on various PCs such as Pentium, i486 and DEC Alpha machines. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have been evaluated in correlation with industry benchmark programs, namely SPECmark. (author)
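The "EGS4 Unit" defined above is a simple run-time ratio against the HP9000/735 reference machine. A sketch with hypothetical wall-clock timings (not the paper's measurements):

```python
def egs4_units(t_ref_seconds, t_sys_seconds):
    """Relative performance in 'EGS4 Units': the HP9000/735 (99 MHz) reference
    run time defines 1.0, and a machine finishing twice as fast scores 2.0."""
    return t_ref_seconds / t_sys_seconds

# Hypothetical wall-clock times for one of the benchmark programs:
print(egs4_units(600.0, 300.0))  # → 2.0
```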
Benchmark calculation of nuclear design code for HCLWR
Suzuki, Katsuo; Saji, Etsuro; Gakuhari, Kazuhiko; Akie, Hiroshi; Takano, Hideki; Ishiguro, Yukio.
1986-01-01
In the calculation of the lattice cell for High Conversion Light Water Reactors, large differences in nuclear design parameters appear between the results obtained by various methods and nuclear data libraries. The validity of a calculation can be verified by critical experiments, but since not many measured data are available, benchmark calculations are also an efficient way to estimate validity over a wide range of lattice parameters and burnup. The benchmark calculations were done by JAERI and MAPI, using SRAC and WIMS-E respectively. The problem covered a wide range of lattice parameters, from tight lattices to the current PWR lattice. The comparison was made on the effective multiplication factor, conversion ratio, and reaction rate of each nuclide, including burnup and void effects. The difference in the results is largest for the tightest lattice, but even there the difference in the effective multiplication factor is only 1.4%. The main cause of the difference is the U-238 neutron absorption rate in the resonance energy region. The differences in other nuclear design parameters and their causes were also identified. (author)
Scott, Paul
2006-01-01
A lattice is a (rectangular) grid of points, usually pictured as occurring at the intersections of two orthogonal sets of parallel, equally spaced lines. Polygons that have lattice points as vertices are called lattice polygons. It is clear that lattice polygons come in various shapes and sizes. A very small lattice triangle may cover just 3…
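A classical way to quantify the "various shapes and sizes" of lattice polygons is Pick's theorem, A = I + B/2 - 1, which relates area to interior (I) and boundary (B) lattice points. As an illustrative sketch (not code from the article), the area and boundary count can be computed directly and the interior count recovered:

```python
from math import gcd

def polygon_area2(pts):
    # Twice the signed area via the shoelace formula (always an integer
    # for lattice polygons).
    n = len(pts)
    s = 0
    for i in range(n):
        x1, y1 = pts[i]
        x2, y2 = pts[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s)

def boundary_points(pts):
    # Lattice points on the boundary: gcd(|dx|, |dy|) per edge.
    n = len(pts)
    return sum(gcd(abs(pts[(i + 1) % n][0] - pts[i][0]),
                   abs(pts[(i + 1) % n][1] - pts[i][1])) for i in range(n))

def interior_points(pts):
    # Pick's theorem: A = I + B/2 - 1  =>  I = A - B/2 + 1, done in
    # integer arithmetic using 2A.
    return (polygon_area2(pts) - boundary_points(pts) + 2) // 2

# Example lattice triangle with vertices (0,0), (4,0), (0,4):
tri = [(0, 0), (4, 0), (0, 4)]
area = polygon_area2(tri) / 2   # 8.0
B = boundary_points(tri)        # 12
I = interior_points(tri)        # 3
```

The triangle's interior points (1,1), (1,2), (2,1) confirm the count.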
Verification and validation benchmarks.
Oberkampf, William Louis; Trucano, Timothy Guy
2007-02-01
Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
Benchmarking and the laboratory
Galloway, M; Nadin, L
2001-01-01
This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112
Studies of thermal-reactor benchmark-data interpretation: experimental corrections
Sher, R.; Fiarman, S.
1976-10-01
Experimental values of integral parameters of the lattices studied in this report, i.e., the MIT(D2O) and TRX benchmark lattices, have been re-examined and revised. The revisions correct several systematic errors that have previously been ignored or considered insignificant. These systematic errors are discussed in detail, and the final corrected values are presented
Shielding benchmark problems, (2)
Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.
1980-02-01
Shielding benchmark problems prepared by Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design in the Atomic Energy Society of Japan were compiled by Shielding Laboratory in Japan Atomic Energy Research Institute. Fourteen shielding benchmark problems are presented newly in addition to twenty-one problems proposed already, for evaluating the calculational algorithm and accuracy of computer codes based on discrete ordinates method and Monte Carlo method and for evaluating the nuclear data used in codes. The present benchmark problems are principally for investigating the backscattering and the streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)
Toxicological Benchmarks for Wildlife
Sample, B.E. Opresko, D.M. Suter, G.W.
1993-01-01
Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, based toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
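The first-tier screening described above is mechanically simple: compare measured concentrations against the benchmarks and retain exceedances as COPCs. A minimal sketch, with entirely hypothetical benchmark and concentration values (not from the report), might look like:

```python
# Hypothetical NOAEL-based benchmarks and measured media concentrations
# (illustrative numbers only, same units assumed throughout).
benchmarks = {"cadmium": 0.5, "lead": 2.0, "zinc": 10.0}
measured = {"cadmium": 0.2, "lead": 3.5, "zinc": 12.0}

def screen(measured, benchmarks):
    # Tier-1 screening: a contaminant is retained as a contaminant of
    # potential concern (COPC) when its concentration exceeds its benchmark.
    return sorted(name for name, conc in measured.items()
                  if name in benchmarks and conc > benchmarks[name])

copcs = screen(measured, benchmarks)   # ['lead', 'zinc']
```

Cadmium falls below its benchmark and is excluded from further consideration; lead and zinc would proceed to the second-tier baseline assessment.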
LATTICE: an interactive lattice computer code
Staples, J.
1976-10-01
LATTICE is a computer code which enables an interactive user to calculate the functions of a synchrotron lattice. This program satisfies the requirements at LBL for a simple interactive lattice program by borrowing ideas from both TRANSPORT and SYNCH. A fitting routine is included
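The functions such a code evaluates are linear-optics maps built by multiplying element transfer matrices. As an illustrative sketch (not LATTICE itself; the cell and parameters are made up), a thin-lens FODO cell can be composed from 2x2 drift and quadrupole matrices and checked for stability via |Tr M| <= 2:

```python
def matmul(a, b):
    # 2x2 matrix product.
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def drift(L):
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    # Thin-lens quadrupole; focusing for f > 0, defocusing for f < 0.
    return [[1.0, 0.0], [-1.0/f, 1.0]]

L, f = 1.0, 1.2   # hypothetical drift length and focal length
# One FODO cell in the horizontal plane: QF, drift, QD, drift.
cell = matmul(thin_quad(-f), matmul(drift(L), matmul(thin_quad(f), drift(L))))
trace = cell[0][0] + cell[1][1]   # equals 2 - L**2/f**2 for this cell
stable = abs(trace) <= 2.0        # stability criterion |Tr M| <= 2
```

For these values the trace is about 1.306, so the toy cell supports stable betatron motion.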
Thermal reactor benchmark tests on JENDL-2
Takano, Hideki; Tsuchihashi, Keichiro; Yamane, Tsuyoshi; Akino, Fujiyoshi; Ishiguro, Yukio; Ido, Masaru.
1983-11-01
A group constant library for the thermal reactor standard nuclear design code system SRAC was produced using the evaluated nuclear data JENDL-2. In addition, the group constants for U-235 were also calculated from ENDF/B-V. Thermal reactor benchmark calculations were performed using the produced group constant library. The selected benchmark cores are two water-moderated lattices (TRX-1 and 2), two heavy-water-moderated cores (DCA and ETA-1), two graphite-moderated cores (SHE-8 and 13) and eight critical experiments for criticality safety. The effective multiplication factors and lattice cell parameters were calculated and compared with the experimental values. The results are summarized as follows. (1) Effective multiplication factors: the results with JENDL-2 are considerably improved in comparison with those from ENDF/B-IV. The best agreement is obtained by using JENDL-2 together with ENDF/B-V (U-235 only) data. (2) Lattice cell parameters: for rho-28 (the ratio of epithermal to thermal U-238 captures) and C* (the ratio of U-238 captures to U-235 fissions), the values calculated with JENDL-2 are in good agreement with the experimental values. The delta-28 values (the ratio of U-238 to U-235 fissions) are overestimated, as was also found for the fast reactor benchmarks. The rho-02 values (the ratio of epithermal to thermal Th-232 captures) calculated with JENDL-2 or ENDF/B-IV are considerably underestimated. The functions of the SRAC system have continued to be extended according to the needs of its users. A brief description of the extended parts of the SRAC system, together with the input specification, is given in Appendix B. (author)
Diagnostic Algorithm Benchmarking
Poll, Scott
2011-01-01
A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.
Benchmarking Swiss electricity grids
Walti, N.O.; Weber, Ch.
2001-01-01
This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented; these are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and still-open cost-efficiency questions in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article
Agrell, Per J.; Bogetoft, Peter
. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...
Financial Integrity Benchmarks
City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measure the City's debt ratio and bond ratings....
Benchmarking in Foodservice Operations
Johnson, Bonnie
1998-01-01
.... The design of this study included two parts: (1) eleven expert panelists involved in a Delphi technique to identify and rate importance of foodservice performance measures and rate the importance of benchmarking activities, and (2...
Choy, J.H.
1979-06-01
A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base
Accelerator shielding benchmark problems
Hirayama, H.; Ban, S.; Nakamura, T.
1993-01-01
Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)
Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.
1978-09-01
Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Comittee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithm and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method and for evaluating the nuclear data used in the codes. (author)
Benchmarking electricity distribution
Watts, K. [Department of Justice and Attorney-General, QLD (Australia)
1995-12-31
Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.
Simulating colloid hydrodynamics with lattice Boltzmann methods
Cates, M E; Stratford, K; Adhikari, R; Stansell, P; Desplat, J-C; Pagonabarraga, I; Wagner, A J
2004-01-01
We present a progress report on our work on lattice Boltzmann methods for colloidal suspensions. We focus on the treatment of colloidal particles in binary solvents and on the inclusion of thermal noise. For a benchmark problem of colloids sedimenting and becoming trapped by capillary forces at a horizontal interface between two fluids, we discuss the criteria for parameter selection, and address the inevitable compromise between computational resources and simulation accuracy
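A defining property of lattice Boltzmann schemes such as the one reported here is that the discrete equilibrium distribution reproduces the fluid's density and momentum exactly as its low-order moments. A minimal D2Q9 sketch in the standard textbook form (not the authors' code) illustrates this:

```python
# Standard D2Q9 lattice: weights and discrete velocities.
W = [4/9] + [1/9]*4 + [1/36]*4
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]

def f_eq(rho, ux, uy):
    # Second-order BGK equilibrium distribution f_i^eq for sound speed
    # cs^2 = 1/3 (lattice units).
    usq = ux*ux + uy*uy
    out = []
    for w, (cx, cy) in zip(W, C):
        cu = cx*ux + cy*uy
        out.append(w * rho * (1.0 + 3.0*cu + 4.5*cu*cu - 1.5*usq))
    return out

rho, ux, uy = 1.2, 0.05, -0.02   # illustrative fluid state
f = f_eq(rho, ux, uy)
# Zeroth and first moments recover density and momentum exactly.
density = sum(f)
mom_x = sum(fi*cx for fi, (cx, cy) in zip(f, C))
mom_y = sum(fi*cy for fi, (cx, cy) in zip(f, C))
```

Because the D2Q9 weights satisfy the required velocity-moment identities, the quadratic terms cancel in the sums and density and momentum are conserved by construction, which is what makes the collide-and-stream update hydrodynamically consistent.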
Thermal and fast reactor benchmark testing of ENDF/B-6.4
Liu Guisheng
1999-01-01
The benchmark testing of ENDF/B-6.4 was done with the same benchmark experiments and calculation method as for ENDF/B-6.2. The effective multiplication factors k-eff, the central reaction rate ratios of fast assemblies, and the lattice cell reaction rate ratios of thermal lattice cell assemblies were calculated and compared with the testing results for ENDF/B-6.2 and CENDL-2. It is obvious that the U-238 data files are most important for the calculations of large fast reactors and thermal lattice reactors. However, the U-238 data in the new version of ENDF/B-6 have not been renewed; only the data for U-235, Al-27, N-14 and H-2 (deuterium) have been renewed in ENDF/B-6.4. It is therefore shown that the thermal reactor benchmark testing results are remarkably improved, while the fast reactor benchmark testing results are not
The KMAT: Benchmarking Knowledge Management.
de Jager, Martha
Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…
Mack, G.
1982-01-01
After a description of a pure Yang-Mills theory on a lattice, the author considers a three-dimensional pure U(1) lattice gauge theory. Thereafter he discusses the exact relation between lattice gauge theories with the gauge groups SU(2) and SO(3). Finally he presents Monte Carlo data on phase transitions in SU(2) and SO(3) lattice gauge models. (HSI)
Benchmarking the Netherlands. Benchmarking for growth
2003-01-01
This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy, in other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs: prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc.) sense. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades, the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity growth. Throughout
Benchmarking in Mobarakeh Steel Company
Sasan Ghasemi
2008-05-01
Benchmarking is considered one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.
Lattices with unique complements
Saliĭ, V N
1988-01-01
The class of uniquely complemented lattices properly contains all Boolean lattices. However, no explicit example of a non-Boolean lattice of this class has been found. In addition, the question of whether this class contains any complete non-Boolean lattices remains unanswered. This book focuses on these classical problems of lattice theory and the various attempts to solve them. Requiring no specialized knowledge, the book is directed at researchers and students interested in general algebra and mathematical logic.
Shielding Benchmark Computational Analysis
Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.
2000-01-01
Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)
Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela
This paper studies three related questions: To what extent otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm......, founders human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...
HPCG Benchmark Technical Specification
Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)
2013-10-01
The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
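HPCG's kernel is a preconditioned conjugate gradient iteration on a sparse stencil problem. Stripped of the symmetric Gauss-Seidel preconditioner and the sparse data structures, the underlying solver loop can be sketched as follows (a plain, unpreconditioned CG on a tiny dense SPD system, for illustration only):

```python
def conjugate_gradient(A, b, tol=1e-8, max_iter=100):
    # Unpreconditioned CG for A x = b with symmetric positive definite A.
    n = len(b)
    x = [0.0] * n
    r = b[:]                        # residual r = b - A*x, with x = 0
    p = r[:]                        # initial search direction
    rs_old = sum(ri*ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i]*Ap[i] for i in range(n))
        x = [x[i] + alpha*p[i] for i in range(n)]
        r = [r[i] - alpha*Ap[i] for i in range(n)]
        rs_new = sum(ri*ri for ri in r)
        if rs_new < tol*tol:        # converged on the residual norm
            break
        p = [r[i] + (rs_new/rs_old)*p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Tiny Laplacian-like SPD system, a toy stand-in for HPCG's sparse stencil.
A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [3.0, 2.0, 3.0]
x = conjugate_gradient(A, b)        # converges to [1.0, 1.0, 1.0]
```

HPCG replaces the dense matrix-vector product here with a fixed 27-point sparse stencil and wraps the residual update in a Gauss-Seidel sweep, which is what makes its memory-access pattern representative of real applications.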
Benchmarking for Best Practice
Zairi, Mohamed
1998-01-01
Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. .It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l
Benchmarking Danish Industries
Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette
2003-01-01
compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...... not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortia partners. The project has been funded by the Danish Agency for Trade and Industry...
Bonnet, F; Solignac, S; Marty, J
2008-03-01
The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmark business cases performed inside medical plants on items such as nosocomial diseases or the organization of surgery facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced score numbers and mappings, so that the comparison between different anesthesia-reanimation services that are willing to start an improvement program is easy and relevant. This ready-made application is all the more accurate when detailed tariffs of activities are implemented.
Pesic, M.
1998-01-01
A selected set of the RB reactor benchmark cores is presented in this paper. The first results of the validation of the well-known Monte Carlo code MCNP and the adjoining neutron cross-section libraries are given. They confirm the idea for the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA project, the International Handbook of Evaluated Criticality Safety Experiments, in the near future. (author)
Status on benchmark testing of CENDL-3
Liu Ping
2002-01-01
CENDL-3, the newest version of the China Evaluated Nuclear Data Library, has been finished and recently distributed for benchmark analysis. The processing was carried out using the NJOY nuclear data processing code system. The calculations and analysis of benchmarks were done with the Monte Carlo code MCNP and the reactor lattice code WIMSD5A. The calculated results were compared with the experimental results and with results based on ENDF/B-VI. In most thermal and fast uranium criticality benchmarks, the k_eff values calculated with CENDL-3 were in good agreement with experimental results. In the plutonium fast cores, the k_eff values were improved significantly with CENDL-3. This is due to the reevaluation of the fission spectrum and elastic angular distributions of 239Pu and 240Pu. CENDL-3 underestimated the k_eff values compared with other evaluated data libraries for most spherical or cylindrical assemblies of plutonium or uranium with beryllium
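The usual figure of merit in this kind of data-library validation is the ratio of calculated to experimental k_eff, the C/E ratio, with values near 1.0 indicating good agreement. A minimal sketch (the benchmark names and values below are hypothetical placeholders, not the CENDL-3 results):

```python
def c_over_e(calculated, experimental):
    # Ratio of calculated to experimental k_eff for each benchmark case;
    # values close to 1.0 indicate good agreement with experiment.
    return {name: calculated[name] / experimental[name] for name in calculated}

# Hypothetical example values, for illustration only:
calc = {"thermal-U": 0.9987, "fast-Pu": 1.0042}
expt = {"thermal-U": 1.0000, "fast-Pu": 1.0000}
ratios = c_over_e(calc, expt)
```

A criticality benchmark is exactly critical by construction, so experimental k_eff is typically 1.0 and C/E reduces to the calculated value itself.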
Benchmarking and Performance Management
Adrian TANTAU
2010-12-01
The relevance of the chosen topic follows from the meaning of the concept of firm efficiency: firm efficiency means the revealed performance (how well the firm performs in the actual market environment) given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality and work organization; other factors can be a cause even if they are not directly observed by the researcher. The critical need for managers to continuously improve their firm's efficiency and effectiveness, and the need for managers to know the success factors and the determinants of competitiveness, determine which performance measures are most critical to their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking firm-level performance are critical interdependent activities. Firm-level variables, used to infer performance, are often interdependent for operational reasons. Hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other profitability measures. It uses econometric models to describe, and then proposes a method to forecast and benchmark, performance.
Bers, Trudy
2012-01-01
Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…
New integrable lattice hierarchies
Pickering, Andrew; Zhu Zuonong
2006-01-01
In this Letter we give a new integrable four-field lattice hierarchy, associated to a new discrete spectral problem. We obtain our hierarchy as the compatibility condition of this spectral problem and an associated equation, constructed herein, for the time-evolution of eigenfunctions. We consider reductions of our hierarchy, which of course also admit discrete zero curvature representations, in detail. We find that our hierarchy includes many well-known integrable hierarchies as special cases, including the Toda lattice hierarchy, the modified Toda lattice hierarchy, the relativistic Toda lattice hierarchy, and the Volterra lattice hierarchy. We also obtain a new integrable two-field lattice hierarchy, to which we give the name Suris lattice hierarchy, since the first equation of this hierarchy was previously given by Suris. The Hamiltonian structure of the Suris lattice hierarchy is obtained by means of a trace identity formula.
Dark matter, constrained minimal supersymmetric standard model, and lattice QCD.
Giedt, Joel; Thomas, Anthony W; Young, Ross D
2009-11-13
Recent lattice measurements have given accurate estimates of the quark condensates in the proton. We use these results to significantly improve the dark matter predictions in benchmark models within the constrained minimal supersymmetric standard model. The predicted spin-independent cross sections are at least an order of magnitude smaller than previously suggested and our results have significant consequences for dark matter searches.
Precise determination of lattice phase shifts and mixing angles
Lu, Bing-Nan, E-mail: b.lu@fz-juelich.de [Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); Lähde, Timo A. [Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); Lee, Dean [Department of Physics, North Carolina State University, Raleigh, NC 27695 (United States); Meißner, Ulf-G. [Helmholtz-Institut für Strahlen- und Kernphysik and Bethe Center for Theoretical Physics, Universität Bonn, D-53115 Bonn (Germany); Institute for Advanced Simulation, Institut für Kernphysik, and Jülich Center for Hadron Physics, Forschungszentrum Jülich, D-52425 Jülich (Germany); JARA – High Performance Computing, Forschungszentrum Jülich, D-52425 Jülich (Germany)
2016-09-10
We introduce a general and accurate method for determining lattice phase shifts and mixing angles, which is applicable to arbitrary, non-cubic lattices. Our method combines angular momentum projection, spherical wall boundaries and an adjustable auxiliary potential. This allows us to construct radial lattice wave functions and to determine phase shifts at arbitrary energies. For coupled partial waves, we use a complex-valued auxiliary potential that breaks time-reversal invariance. We benchmark our method using a system of two spin-1/2 particles interacting through a finite-range potential with a strong tensor component. We are able to extract phase shifts and mixing angles for all angular momenta and energies, with precision greater than that of extant methods. We discuss a wide range of applications from nuclear lattice simulations to optical lattice experiments.
Benchmarking i den offentlige sektor
Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels
2008-01-01
In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, taking four different applications of benchmarking as our point of departure. The regulation of utility companies is treated, after which......
Cloud benchmarking for performance
Varghese, Blesson; Akgun, Ozgur; Miguel, Ian; Thai, Long; Barker, Adam
2014-01-01
Date of Acceptance: 20/09/2014 How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The question is addressed by proposing a six-step benchmarking methodology in which a user provides a set of four weights that indicate how important each of the following groups is: memory, processor, computa...
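The core of such a weighted methodology can be sketched as a simple ranking function. This is a generic illustration, not the paper's actual method: the group names beyond "memory" and "processor" are truncated in the abstract, so the ones used below ("storage", "network") and all numbers are hypothetical:

```python
def rank_vms(scores, weights):
    # scores:  {vm_name: {group: normalized benchmark score, higher is better}}
    # weights: {group: user-supplied importance}
    # Returns VM names ordered from best to worst weighted value.
    total = sum(weights.values())
    value = {vm: sum(weights[g] * s[g] for g in weights) / total
             for vm, s in scores.items()}
    return sorted(value, key=value.get, reverse=True)

# Hypothetical example: two VM types scored on four illustrative groups.
scores = {
    "vm.small": {"memory": 0.6, "processor": 0.5, "storage": 0.7, "network": 0.8},
    "vm.large": {"memory": 0.9, "processor": 0.9, "storage": 0.6, "network": 0.7},
}
weights = {"memory": 3, "processor": 4, "storage": 1, "network": 2}
ranking = rank_vms(scores, weights)
```

Shifting the weights toward, say, memory-bound workloads changes the ranking accordingly, which is the point of letting the user express importance per group.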
Generalized isothermic lattices
Doliwa, Adam
2007-01-01
We study multi-dimensional quadrilateral lattices satisfying simultaneously two integrable constraints: a quadratic constraint and the projective Moutard constraint. When the lattice is two-dimensional and the quadric under consideration is the Moebius sphere, one obtains, after the stereographic projection, the discrete isothermic surfaces defined by Bobenko and Pinkall by an algebraic constraint imposed on the (complex) cross-ratio of the circular lattice. We derive the analogous condition for our generalized isothermic lattices using Steiner's projective structure of conics, and we present basic geometric constructions which encode the integrability of the lattice. In particular, we introduce the Darboux transformation of the generalized isothermic lattice and derive the corresponding Bianchi permutability principle. Finally, we study two-dimensional generalized isothermic lattices, in particular the geometry of their initial boundary value problem.
Benchmarking reference services: an introduction.
Marshall, J G; Buchanan, H S
1995-01-01
Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.
Two-dimensional benchmark calculations for PNL-30 through PNL-35
Mosteller, R.D.
1997-01-01
Interest in critical experiments with lattices of mixed-oxide (MOX) fuel pins has been revived by the possibility that light water reactors will be used for disposition of weapons-grade plutonium. A series of six experiments with MOX lattices, designated PNL-30 through PNL-35, was performed at Pacific Northwest Laboratories in 1975 and 1976, and a set of benchmark specifications for these experiments subsequently was adopted by the Cross Section Evaluation Working Group (CSEWG). Although there appear to be some problems with these experiments, they remain the only CSEWG benchmarks for MOX lattices. The number of fuel pins in these experiments is relatively low, corresponding to fewer than 4 typical pressurized-water-reactor fuel assemblies. Accordingly, they are more appropriate as benchmarks for lattice-physics codes than for reactor-core simulator codes. Unfortunately, the CSEWG specifications retain the full three-dimensional (3D) detail of the experiments, while lattice-physics codes almost universally are limited to two dimensions (2D). This paper proposes an extension of the benchmark specifications to include a 2D model, and it justifies that extension by comparing results from the MCNP Monte Carlo code for the 2D and 3D specifications
Heavy nucleus resonant absorption calculation benchmarks
Tellier, H.; Coste, H.; Raepsaet, C.; Van der Gucht, C.
1993-01-01
The calculation of the space and energy dependence of the heavy nucleus resonant absorption in a heterogeneous lattice is one of the hardest tasks in reactor physics. Because of the computer time and memory needed, it is impossible to represent finely the cross-section behavior in the resonance energy range for everyday computations. Consequently, reactor physicists use a simplified formalism, the self-shielding formalism. As no clean and detailed experimental results are available to validate the self-shielding calculations, Monte Carlo computations are used as a reference. These results, which were obtained with the TRIPOLI continuous-energy Monte Carlo code, constitute a set of numerical benchmarks that can be used to evaluate the accuracy of the techniques or formalisms that are included in any reactor physics code. Examples of such evaluations, for the new assembly code APOLLO2 and the slowing-down code SECOL, are given for cases of 238U and 232Th fuel elements
Podlekareva, Daria; Reekie, Joanne; Mocroft, Amanda
2012-01-01
BACKGROUND: State-of-the-art care involving the utilisation of multiple health care interventions is the basis for an optimal long-term clinical prognosis for HIV patients. We evaluated health care for HIV patients based on four key indicators. METHODS: Four indicators of health care we...... document pronounced regional differences in adherence to guidelines and can help to identify gaps and direct targeted interventions. It may serve as a tool for assessing and benchmarking the clinical management of HIV patients in any setting worldwide.
Benchmarking Cloud Storage Systems
Wang, Xing
2014-01-01
With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...
Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius
2006-01-01
An infrastructure is emerging that enables the positioning of populations of on-line, mobile service users. In step with this, research in the management of moving objects has attracted substantial attention. In particular, quite a few proposals now exist for the indexing of moving objects...... takes into account that the available positions of the moving objects are inaccurate, an aspect largely ignored in previous indexing research. The concepts of data and query enlargement are introduced for addressing inaccuracy. As proof of concepts of the benchmark, the paper covers the application...
Lattice theory for nonspecialists
Hari Dass, N.D.
1984-01-01
These lectures were delivered as part of the academic training programme at NIKHEF-H. They were intended primarily for experimentalists, and for theorists not specializing in lattice methods. The goal was to present the essential spirit behind the lattice approach, and consequently the author has concentrated mostly on issues of principle rather than on presenting a large amount of detail. In particular, the author emphasizes the deep theoretical infrastructure that has made lattice studies meaningful. At the same time, he has avoided the use of heavy formalisms, as they tend to obscure the basic issues for people trying to approach this subject for the first time. The essential ideas are illustrated with elementary soluble examples not involving complicated mathematics. The following subjects are discussed: three ways of solving the harmonic oscillator problem; latticization; gauge fields on a lattice; QCD observables; how to solve lattice theories. (Auth.)
Creutz, M.
1983-04-01
In the last few years lattice gauge theory has become the primary tool for the study of nonperturbative phenomena in gauge theories. The lattice serves as an ultraviolet cutoff, rendering the theory well defined and amenable to numerical and analytical work. Of course, as with any cutoff, at the end of a calculation one must consider the limit of vanishing lattice spacing in order to draw conclusions on the physical continuum limit theory. The lattice has the advantage over other regulators that it is not tied to the Feynman expansion. This opens the possibility of other approximation schemes than conventional perturbation theory. Thus Wilson used a high temperature expansion to demonstrate confinement in the strong coupling limit. Monte Carlo simulations have dominated the research in lattice gauge theory for the last four years, giving first principle calculations of nonperturbative parameters characterizing the continuum limit. Some of the recent results with lattice calculations are reviewed
Resonance shielding in thermal reactor lattices
Rothenstein, W.; Taviv, E.; Aminpour, M.
1982-01-01
The theoretical foundations of a new methodology for the accurate treatment of resonance absorption in thermal reactor lattice analysis are presented. This methodology is based on the solution of the point-energy transport equation in its integral or integro-differential form for a heterogeneous lattice using detailed resonance cross-section profiles. The methodology is applied to LWR benchmark analysis, with emphasis on temperature dependence of resonance absorption during fuel depletion, spatial and mutual self-shielding, integral parameter analysis and treatment of cluster geometry. The capabilities of the OZMA code, which implements the new methodology are discussed. These capabilities provide a means against which simpler and more rapid resonance absorption algorithms can be checked. (author)
Wilson Dslash Kernel From Lattice QCD Optimization
Joo, Balint [Jefferson Lab, Newport News, VA; Smelyanskiy, Mikhail [Parallel Computing Lab, Intel Corporation, California, USA; Kalamkar, Dhiraj D. [Parallel Computing Lab, Intel Corporation, India; Vaidyanathan, Karthikeyan [Parallel Computing Lab, Intel Corporation, India
2015-07-01
Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in theoretical nuclear and high energy physics. LQCD is traditionally one of the first applications ported to many new high performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal for illustrating several optimization techniques. In this chapter we detail our work in optimizing the Wilson-Dslash kernels for the Intel Xeon Phi; however, as we will show, the technique gives excellent performance on the regular Xeon architecture as well.
On Traveling Waves in Lattices: The Case of Riccati Lattices
Dimitrova, Zlatinka
2012-09-01
The method of simplest equation is applied for analysis of a class of lattices described by differential-difference equations that admit traveling-wave solutions constructed on the basis of the solution of the Riccati equation. We denote such lattices as Riccati lattices. We search for Riccati lattices within two classes of lattices: generalized Lotka-Volterra lattices and generalized Holling lattices. We show that from the class of generalized Lotka-Volterra lattices only the Wadati lattice belongs to the class of Riccati lattices. Opposite to this many lattices from the Holling class are Riccati lattices. We construct exact traveling wave solutions on the basis of the solution of Riccati equation for three members of the class of generalized Holling lattices.
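The building block of such traveling-wave solutions can be checked directly. As a generic illustration (not the paper's computation), w(ξ) = tanh(ξ) solves the simplest Riccati-type equation w' = 1 - w², which is the kind of kernel from which the lattice solutions are assembled; the check below verifies this numerically with a central difference:

```python
import math

def riccati_residual(xi, h=1e-6):
    # For w(xi) = tanh(xi), the Riccati equation w' = 1 - w^2 should hold;
    # return the residual w'(xi) - (1 - w(xi)^2), approximating the
    # derivative by a central difference of step h.
    w = math.tanh(xi)
    dw = (math.tanh(xi + h) - math.tanh(xi - h)) / (2 * h)
    return dw - (1 - w * w)
```

The residual vanishes (to discretization accuracy) at every point, confirming that tanh is a solution; substituting such a kernel into the lattice ansatz is what the method of simplest equation systematizes.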
Benchmarking multimedia performance
Zandi, Ahmad; Sudharsanan, Subramania I.
1998-03-01
With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital audio or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee, as part of the Standard Performance Evaluation Corp., to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured, and the system is classified accordingly. At the next step the performance of the system is measured. In many multimedia applications, such as DVD playback, the application needs to run at a specific rate; in this case the measurement of the excess processing power makes all the difference. All of this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem will be presented and analyzed.
Pavlovichev, A.M.
2001-01-01
Current regulations require that the design of new fuel cycles for nuclear power installations include a calculational justification performed by certified computer codes. This guarantees that the obtained calculational results will be within the limits of the declared uncertainties indicated in the certificate issued by the Gosatomnadzor of the Russian Federation (GAN) for the corresponding computer code. A formal justification of the declared uncertainties is the comparison of calculational results obtained by a commercial code with the results of experiments, or of calculational tests computed, with a defined uncertainty, by certified precision codes of the MCU type or others. The current level of international cooperation allows an enlargement of the bank of experimental and calculational benchmarks acceptable for the certification of commercial codes used for the design of fuel loadings with MOX fuel. In particular, work is practically finished on forming the list of calculational benchmarks for the certification of the code TVS-M as applied to MOX fuel assembly calculations. The results of these activities are presented
H. Groessing
2015-02-01
A benchmark study for permeability measurement is presented. Past studies by other research groups, which focused on the reproducibility of 1D permeability measurements, showed high standard deviations (25%) in the obtained permeability values, even though a defined test rig with required specifications was used. Within this study, the reproducibility of capacitive in-plane permeability testing systems was benchmarked by comparing the results of two research sites using this technology. The reproducibility was compared using a glass fibre woven textile and a carbon fibre non-crimp fabric (NCF). These two material types were chosen because of the different electrical properties of glass and carbon with respect to the dielectric capacitive sensors of the permeability measurement systems. In order to determine the unsaturated permeability characteristics as a function of fibre volume content, the measurements were executed at three different fibre volume contents, with five repetitions each. It was found that the stability and reproducibility of the presented in-plane permeability measurement system is very good in the case of the glass fibre woven textiles. This is true for the comparison of the repetition measurements as well as for the comparison between the two different permeameters. These positive results were confirmed by a comparison with permeability values for the same textile obtained with an older-generation permeameter applying the same measurement technology. It was also shown that a correct determination of the grammage and the material density is crucial for a correct correlation of measured permeability values and fibre volume contents.
Benchmarking Using Basic DBMS Operations
Crolotte, Alain; Ghazal, Ahmad
The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and to compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
Lattice degeneracies of fermions
Raszillier, H.
1983-10-01
We present a detailed description of the minimal degeneracies of geometric (Kaehler) fermions on all the lattices of maximal symmetries in n = 1, ..., 4 dimensions. We also determine the isolated orbits of the maximal symmetry groups, which are related to the minimal numbers of ''naive'' fermions on the reciprocals of these lattices. It turns out that on the self-reciprocal lattices the minimal numbers of naive fermions are equal to the minimal numbers of degrees of freedom of geometric fermions. The description we give relies on the close connection of the maximal lattice symmetry groups with (affine) Weyl groups of root systems of (semi-) simple Lie algebras. (orig.)
Shindler, A.
2007-07-01
I review the theoretical foundations, properties as well as the simulation results obtained so far of a variant of the Wilson lattice QCD formulation: Wilson twisted mass lattice QCD. Emphasis is put on the discretization errors and on the effects of these discretization errors on the phase structure for Wilson-like fermions in the chiral limit. The possibility to use in lattice simulations different lattice actions for sea and valence quarks to ease the renormalization patterns of phenomenologically relevant local operators, is also discussed. (orig.)
Low-energy scattering on the lattice
Bour Bour, Shahin
2014-01-01
In this thesis we present precision benchmark calculations for two-component fermions in the unitarity limit using an ab initio method, namely the Hamiltonian lattice formalism. We calculate the ground state energy of four unpolarized particles (Fermi gas) in a periodic cube as a fraction of the ground state energy of the non-interacting system for two independent representations of the lattice Hamiltonian. We obtain the values 0.211(2) and 0.210(2). These results are in full agreement with Euclidean lattice and fixed-node diffusion Monte Carlo calculations. We also give an expression for the energy corrections to the binding energy of a bound state in a moving frame. These corrections contain information about the mass and number of the constituents, are topological in origin, and will have broad applications to lattice calculations of nucleons, nuclei, hadronic molecules and cold atoms. As one application, we use this expression to determine the low-energy parameters for fermion-dimer elastic scattering in the shallow binding limit. For our lattice calculations we use Luescher's finite volume method. From the lattice calculations we find κa_fd = 1.174(9) and κr_fd = -0.029(13), where κ represents the binding momentum of the dimer and a_fd (r_fd) denotes the scattering length (effective range). These results are confirmed by continuum calculations using the Skorniakov-Ter-Martirosian integral equation, which gives 1.17907(1) and -0.0383(3) for the scattering length and effective range, respectively.
Benchmarking & European Sustainable Transport Policies
Gudmundsson, H.
2003-01-01
Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to...... contribution to the discussions within the EU-sponsored BEST Thematic Network (Benchmarking European Sustainable Transport), which ran from 2000 to 2003.
Benchmarking in Czech Higher Education
Plaček Michal; Ochrana František; Půček Milan
2015-01-01
The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used at this level today, but most actors show some interest in its introduction. The expressed need for it, and the importance of benchmarking as a very suitable performance-management tool in less developed countries, are the impetus for the second part of our article. Base...
Power reactor pressure vessel benchmarks
Rahn, F.J.
1978-01-01
A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)
Simplified two and three dimensional HTTR benchmark problems
Zhang Zhan; Rahnema, Farzad; Zhang Dingkang; Pounders, Justin M.; Ougouag, Abderrafi M.
2011-01-01
To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of whole core configurations. In this paper we have created two- and three-dimensional numerical benchmark problems typical of high temperature gas cooled prismatic cores. Additionally, single-cell and single-block benchmark problems are included. These problems were derived from the HTTR start-up experiment. Since the primary utility of the benchmark problems is in code-to-code verification, minor details regarding the geometry and material specification of the original experiment have been simplified while retaining the heterogeneity and the major physics properties of the core from a neutronics viewpoint. A six-group material (macroscopic) cross-section library has been generated for the benchmark problems using the lattice depletion code HELIOS. Using this library, Monte Carlo solutions are presented for three configurations (all-rods-in, partially-controlled and all-rods-out) for both the 2D and 3D problems. These solutions include the core eigenvalues, the block (assembly) averaged fission densities, local peaking factors, the absorption densities in the burnable poison and control rods, and the pin fission density distribution for selected blocks. Also included are the solutions for the single-cell and single-block problems.
MOx Depletion Calculation Benchmark
San Felice, Laurence; Eschbach, Romain; Dewi Syarifah, Ratna; Maryam, Seif-Eddine; Hesketh, Kevin
2016-01-01
Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of Reactor Systems (WPRS) has been established to study reactor physics, fuel performance, radiation transport and shielding, and the uncertainties associated with modelling of these phenomena in present and future nuclear power systems. The WPRS has different expert groups to cover a wide range of scientific issues in these fields. The Expert Group on Reactor Physics and Advanced Nuclear Systems (EGRPANS) was created in 2011 to perform specific tasks associated with reactor physics aspects of present and future nuclear power systems. EGRPANS provides expert advice to the WPRS and the nuclear community on the development needs (data and methods, validation experiments, scenario studies) for different reactor systems, and also provides specific technical information regarding core reactivity characteristics (including fuel depletion effects), core power/flux distributions, and core dynamics and reactivity control. In 2013 EGRPANS published a report that investigated fuel depletion effects in a Pressurised Water Reactor (PWR), entitled 'International Comparison of a Depletion Calculation Benchmark on Fuel Cycle Issues' NEA/NSC/DOC(2013), which documented a benchmark exercise for UO2 fuel rods. This report documents a complementary benchmark exercise that focused on PuO2/UO2 Mixed Oxide (MOX) fuel rods. The results are especially relevant to the back end of the fuel cycle, including irradiated fuel transport, reprocessing, interim storage and waste repository. Saint-Laurent B1 (SLB1) was the first French reactor to use MOX assemblies. SLB1 is a 900 MWe PWR with 30% MOX fuel loading. The standard MOX assemblies used in the Saint-Laurent B1 reactor include three zones with different plutonium enrichments: high Pu content (5.64%) in the center zone, medium Pu content (4.42%) in the intermediate zone and low Pu content (2.91%) in the peripheral zone.
Benchmarking Academic Anatomic Pathologists
Barbara S. Ducatman MD
2016-10-01
The most common benchmarks for faculty productivity are derived from the Medical Group Management Association (MGMA) or Vizient-AAMC Faculty Practice Solutions Center® (FPSC) databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathologists/surgical pathologists, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization's methods, the median results for the anatomic pathologists/surgical pathologists general practice model compared to MGMA and FPSC results for anatomic and/or surgical pathology were quite close. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical "full-time faculty" (0.60 clinical full-time equivalent and above). The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs-reported median work relative
Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions
Mathew, Paul; Sartor, Dale; Tschudi, William
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions
Mathew, Paul; Greenberg, Steve; Sartor, Dale
2009-07-13
This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.
Kawai, Masayoshi
1984-01-01
Iron data in JENDL-2 have been tested by analyzing shielding benchmark experiments on neutron transmission through iron blocks, performed at KFK using a Cf-252 neutron source and at ORNL using a collimated neutron beam from a reactor. The analyses were made with RADHEAT-V4, a shielding analysis code system developed at JAERI. The calculated results are compared with the measured data. For the KFK experiments, the C/E values are about 1.1. For the ORNL experiments, the calculated values agree with the measured data within an accuracy of 33% for the off-center geometry. The d-t neutron transmission measurements through a carbon sphere made at LLNL are also analyzed preliminarily using the revised JENDL data for fusion neutronics calculations. (author)
Epelbaum E.
2010-04-01
We review recent progress on nuclear lattice simulations using chiral effective field theory. We discuss lattice results for dilute neutron matter at next-to-leading order, three-body forces at next-to-next-to-leading order, isospin-breaking and Coulomb effects, and the binding energy of light nuclei.
Jersak, J.
1986-01-01
This year has brought a sudden interest in lattice Higgs models. After five years of only modest activity we now have many new results, obtained both by analytic and by Monte Carlo methods. This talk is a review of the present state of lattice Higgs models, with particular emphasis on recent developments.
Benchmarking monthly homogenization algorithms
Venema, V. K. C.; Mestre, O.; Aguilar, E.; Auer, I.; Guijarro, J. A.; Domonkos, P.; Vertacnik, G.; Szentimrey, T.; Stepanek, P.; Zahradnicek, P.; Viarre, J.; Müller-Westermeier, G.; Lakatos, M.; Williams, C. N.; Menne, M.; Lindau, R.; Rasol, D.; Rustemeier, E.; Kolokythas, K.; Marinova, T.; Andresen, L.; Acquaotta, F.; Fratianni, S.; Cheval, S.; Klancar, M.; Brunetti, M.; Gruber, C.; Prohom Duran, M.; Likso, T.; Esteban, P.; Brandsma, T.
2011-08-01
The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added. Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data
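One of the performance metrics named above, the centered root mean square error, is easy to make concrete. The short Python sketch below (the series values are invented for illustration; only the formula is standard) shows how centering both series first removes any constant offset between a homogenized contribution and the true homogeneous series, so that only errors in the variations are penalized:

```python
import numpy as np

# Hypothetical monthly temperature series: the "truth" (homogeneous) and a
# homogenized contribution returned by one algorithm. Values are made up.
truth = np.array([10.2, 11.0, 9.8, 10.5, 10.9, 10.1])
homog = np.array([10.4, 10.9, 10.0, 10.8, 11.1, 10.0])

# Centered RMSE: subtract each series' own mean first, so a constant bias
# (which does not affect trends or anomalies) is not penalized.
crmse = np.sqrt(np.mean(((homog - homog.mean()) - (truth - truth.mean())) ** 2))
print(crmse)
```

By construction the centered RMSE is never larger than the plain RMSE of the same pair of series, since removing the mean difference is the offset that minimizes the squared error.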
Benchmarking foreign electronics technologies
Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.
1994-12-01
This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.
1987-01-01
This paper presents the latest results of the ongoing program entitled Standard Problems for Structural Computer Codes, currently being worked on at BNL for the USNRC, Office of Nuclear Regulatory Research. During FY 1986, efforts were focused on three tasks, namely: (1) an investigation of ground water effects on the response of Category I structures, (2) the Soil-Structure Interaction Workshop, and (3) studies on structural benchmarks associated with Category I structures. The objective of the studies on ground water effects is to verify the applicability and the limitations of the SSI methods currently used by the industry in performing seismic evaluations of nuclear plants which are located at sites with high water tables. In a previous study by BNL (NUREG/CR-4588), it was concluded that the pore water can influence the soil-structure interaction process significantly. This result, however, is based on the assumption of fully saturated soil profiles. Consequently, the work was further extended to include cases associated with variable water table depths. In this paper, results related to cut-off depths, beyond which the pore water effects can be ignored in seismic calculations, are addressed. Comprehensive numerical data are given for soil configurations typical of those encountered at nuclear plant sites. These data were generated using a modified version of the SLAM code, which is capable of handling problems related to the dynamic response of saturated soils. Further, the paper presents some key aspects of the Soil-Structure Interaction Workshop (NUREG/CP-0054), which was held in Bethesda, MD on June 1, 1986. Finally, recent efforts related to the task on structural benchmarks are described.
Review for session K - benchmarks
McCracken, A.K.
1980-01-01
Eight of the papers to be considered in Session K are directly concerned, at least in part, with the Pool Critical Assembly (P.C.A.) benchmark at Oak Ridge. The remaining seven papers in this session, the subject of this review, are concerned with a variety of topics related to the general theme of Benchmarks and will be considered individually
Internal Benchmarking for Institutional Effectiveness
Ronco, Sharron L.
2012-01-01
Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multiple campuses or a…
Entropy-based benchmarking methods
Temurshoev, Umed
2012-01-01
We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth
Benchmark simulation models, quo vadis?
Jeppsson, U.; Alex, J; Batstone, D. J.
2013-01-01
As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...
EPA's Benchmark Dose Modeling Software
The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods in EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...
Benchmark for Strategic Performance Improvement.
Gohlke, Annette
1997-01-01
Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)
Benchmarking: A Process for Improvement.
Peischl, Thomas M.
One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, partially solves that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…
Staff Association
2017-01-01
On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...
On singularities of lattice varieties
Mukherjee, Himadri
2013-01-01
Toric varieties associated with distributive lattices arise as fibres of a flat degeneration of a Schubert variety in a minuscule. The singular locus of these varieties has been studied by various authors. In this article we prove that the number of diamonds incident on a lattice point $\alpha$ in a product of chain lattices is greater than or equal to the codimension of the lattice. Using this we also show that the lattice varieties associated with products of chain lattices are smooth.
Benchmarking: applications to transfusion medicine.
Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M
2012-10-01
Benchmarking is a structured continuous collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.
Towards a physical interpretation of the entropic lattice Boltzmann method
Malaspinas, Orestis; Deville, Michel; Chopard, Bastien
2008-12-01
The entropic lattice Boltzmann method (ELBM) is one among several different versions of the lattice Boltzmann method for the simulation of hydrodynamics. The collision term of the ELBM is characterized by a nonincreasing H function, guaranteed by a variable relaxation time. We propose here an analysis of the ELBM using the Chapman-Enskog expansion. We show that it can be interpreted as some kind of subgrid model, where viscosity correction scales like the strain rate tensor. We confirm our analytical results by the numerical computations of the relaxation time modifications on the two-dimensional dipole-wall interaction benchmark.
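The entropic ingredient described above can be made concrete with a short sketch. The Python below is not the authors' code: it evaluates the discrete H function H(f) = Σ_i f_i ln(f_i / w_i) on a standard D2Q9 lattice and checks that a BGK-type relaxation step toward the known entropic (H-minimizing) product-form equilibrium does not increase H; the population, velocity and relaxation parameter are arbitrary choices for illustration.

```python
import numpy as np

# D2Q9 lattice (standard): weights and discrete velocities
w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def H(f):
    """Discrete H function: H(f) = sum_i f_i ln(f_i / w_i)."""
    return float(np.sum(f * np.log(f / w)))

def feq(rho, u):
    """Entropic (H-minimizing) equilibrium for D2Q9 in product form."""
    prod = np.ones(9)
    for a in range(2):
        s = np.sqrt(1.0 + 3.0 * u[a] ** 2)
        prod *= (2.0 - s) * ((2.0 * u[a] + s) / (1.0 - u[a])) ** c[:, a]
    return rho * w * prod

# Build a slightly non-equilibrium (but positive) population
rng = np.random.default_rng(0)
f = feq(1.0, np.array([0.05, 0.02])) * (1.0 + 0.01 * rng.standard_normal(9))

# The conserved moments of f fix its local equilibrium
rho = f.sum()
u = (f @ c) / rho

h0 = H(f)
omega = 0.8                              # under-relaxation: post-collision f
f_post = f + omega * (feq(rho, u) - f)   # is a convex mix of f and feq
h1 = H(f_post)

print(h0, h1)   # H cannot increase: feq minimizes H and H is convex
```

For omega in (0, 1] the post-collision state is a convex combination of f and the equilibrium, so convexity of H plus the minimizing property of feq guarantees the H decrease; the ELBM's variable relaxation time generalizes this guarantee to over-relaxation.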
Montecarlo calculation for a benchmark on interactive effects of Gadolinium poisoned pins in BWRs
Borgia, M.G.; Casali, F.; Cepraga, D.
1985-01-01
k-infinity and burn-up calculations have been performed in the frame of a benchmark organized by the Reactor Physics Committee of the NEA. The calculations, performed with the Monte Carlo code KIM, concerned BWR lattices having UO2 fuel rodlets with and without gadolinium oxide.
Benchmarking school nursing practice: the North West Regional Benchmarking Group
Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn
2016-01-01
It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...
D'Hondt, P.; Gehin, J.; Na, B.C.; Sartori, E.; Wiesenack, W.
2001-01-01
One of the options envisaged for disposing of weapons-grade plutonium, declared surplus for national defence in the Russian Federation and the USA, is to burn it in nuclear power reactors. The scientific/technical know-how accumulated in the use of MOX as a fuel for electricity generation is of great relevance for the plutonium disposition programmes. An Expert Group of the OECD/NEA is carrying out a series of benchmarks with the aim of facilitating the use of this know-how for meeting this objective. This paper describes the background that led to establishing the Expert Group, and the present status of results from these benchmarks. The benchmark studies cover a theoretical reactor physics benchmark on a VVER-1000 core loaded with MOX, two experimental benchmarks on MOX lattices, and a benchmark concerned with MOX fuel behaviour for both solid and hollow pellets. First conclusions are outlined, as well as future work. (author)
Benchmarking Nuclear Power Plants
Jakic, I.
2016-01-01
One of the main tasks an owner has is to keep its business competitive on the market while delivering its product. Owning a nuclear power plant bears the same (or even more complex and stern) responsibility, due to safety risks and costs. In the past, nuclear power plant managements could (partly) ignore profit, or it was simply expected and to some degree assured through the various regulatory processes governing electricity rate design. It is obvious now that, with deregulation, utility privatization and a competitive electricity market, the key measures of success used at nuclear power plants must include traditional metrics of a successful business (return on investment, earnings and revenue generation) as well as those of plant performance, safety and reliability. In order to analyze the business performance of a (specific) nuclear power plant, benchmarking, a well-established concept and common method, was used. The domain was conservatively designed, with a well-adjusted framework, but the results still have limited application due to many differences, gaps and uncertainties. (author).
Virtual machine performance benchmarking.
Langer, Steve G; French, Todd
2011-10-01
The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever tightened reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
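A minimal sketch of one such metric, memory bandwidth, is shown below; it is not the authors' harness, just an illustration of the "run on both platforms and compare" approach, and the array size and repeat count are arbitrary choices:

```python
import time
import numpy as np

# Rough memory-bandwidth probe of the kind one might run on both a virtual
# machine and bare metal and then compare (sizes are arbitrary for the demo).
N = 64 * 1024 * 1024 // 8          # 64 MiB of float64
src = np.ones(N)
dst = np.empty_like(src)

best = float("inf")
for _ in range(5):                 # keep the best of several runs to
    t0 = time.perf_counter()       # suppress scheduling noise
    np.copyto(dst, src)
    best = min(best, time.perf_counter() - t0)

# Bytes moved: read src + write dst
gbps = 2 * src.nbytes / best / 1e9
print(f"copy bandwidth ≈ {gbps:.1f} GB/s")
```

Running the same probe inside a guest and on the host gives the kind of virtual-versus-"bare metal" ratio the study reports; as the abstract notes, results for disk and network I/O pass through more driver layers and can behave quite differently from in-memory workloads.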
AER benchmark specification sheet
Aszodi, A.; Toth, S.
2009-01-01
In the VVER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers, and they have an important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational Fluid Dynamics (CFD) codes and experiments can help to better understand these mixing processes, and they can provide information which can support a more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D CFD modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the 23rd cycle of the Paks NPP's Unit 3 are investigated. One of them has a symmetrical pin power profile and the other an inclined profile. (authors)
AER Benchmark Specification Sheet
Aszodi, A.; Toth, S.
2009-01-01
In the WWER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers, and they have an important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational fluid dynamics codes and experiments can help to better understand these mixing processes, and they can provide information which can support a more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D computational fluid dynamics modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the twenty-third cycle of the Paks NPP's Unit 3 are investigated. One of them has a symmetrical pin power profile and the other an inclined profile. (Authors)
Benchmarking biofuels; Biobrandstoffen benchmarken
Croezen, H.; Kampman, B.; Bergsma, G.
2012-03-15
A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) greenhouse gas emissions; (2) land use; and (3) nutrient consumption.
Benchmarking in academic pharmacy departments.
Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann
2010-10-11
Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments, are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately to plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is also used internally to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues, such as department size and composition as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather these data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.
Issues in Benchmark Metric Selection
Crolotte, Alain
It is true that a metric can influence a benchmark, but will esoteric metrics create more problems than they solve? We answer this question affirmatively by examining the case of the TPC-D metric, which used the much-debated geometric mean for the single-stream test. We show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives, our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
California commercial building energy benchmarking
Kinney, Satkartar; Piette, Mary Ann
2003-07-01
Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with "typical" and "best-practice" benchmarks while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none that were based solely on California data. Most available benchmarking information, including the Energy Star performance rating, were developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the
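As a minimal sketch of the idea (using invented peer data, not values from Cal-Arch or CBECS), whole-building benchmarking amounts to placing one building's energy use intensity within a peer distribution:

```python
# Illustrative peer energy use intensities (EUI, kWh/m2/yr) for buildings
# of a similar type and climate region; the numbers are made up.
peer_euis = [95, 110, 120, 135, 140, 150, 165, 180, 210, 240]
building_eui = 130

# Fraction of peers that use more energy than this building.
better_than = sum(1 for e in peer_euis if e > building_eui) / len(peer_euis)
print(f"Uses less energy than {better_than:.0%} of peers")  # Uses less energy than 70% of peers
```

A ranking like this is the inexpensive screening step mentioned above: buildings near the bottom of the peer distribution are the natural candidates for audits, tune-ups and retrofits.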
A Heterogeneous Medium Analytical Benchmark
Ganapol, B.D.
1999-01-01
A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane-parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions, which are kernels of coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate benchmark-quality results (4 to 5 places of accuracy)
Monte Carlo benchmark calculations for the 400 MWth PBMR core
Kim, H. C.; Kim, J. K.; Kim, S. Y.; Noh, J. M.
2007-01-01
A large interest in high-temperature gas-cooled reactors (HTGRs) has arisen in connection with hydrogen production in recent years. In this study, as part of the work to establish a Monte Carlo computation system for HTGR core analysis, benchmark calculations for a pebble-type HTGR were carried out using the MCNP5 code. The core of the 400 MWth Pebble-bed Modular Reactor (PBMR) was selected as the benchmark model. Recently, the IAEA CRP5 neutronics and thermal-hydraulics benchmark problem was proposed for testing existing methods of analyzing the neutronic and thermal-hydraulic behavior of HTGRs for the design and safety evaluation of the PBMR. This study deals with the neutronic benchmark problems proposed for the PBMR: fresh fuel and cold conditions (Case F-1), and first core loading with given number densities (Case F-2). After detailed MCNP modeling of the whole facility, benchmark calculations were performed. The spherical fuel region of a fuel pebble, which contains on average 15000 CFPs (Coated Fuel Particles), is divided into cubic lattice elements, each containing one CFP at its center. In this study, the side length of each cubic lattice element giving the same amount of fuel was calculated to be 0.1635 cm. The remaining volume of each lattice element was filled with graphite. All 5 concentric shells of the CFP were modeled. The PBMR annular core consists of approximately 452000 pebbles in the benchmark problems. In Case F-1, where the core was filled with only fresh fuel pebbles, a BCC (body-centered-cubic) lattice model was employed in order to achieve a randomly packed core with a packing fraction of 0.61. In Case F-2, the BCC lattice was also employed, with the size of the moderator pebble increased in a manner that reproduces the specified F/M ratio of 1:2 while preserving the packing fraction of 0.61. The calculations were pursued with the ENDF/B-VI cross-section library and used the sab2002 S(α,β) thermal scattering data
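The quoted lattice-element size can be checked with a short calculation. The abstract does not state the radius of the pebble's fuelled zone, so the standard PBMR value of 2.5 cm is assumed here:

```python
import math

# Check of the cubic lattice element side length quoted in the benchmark:
# the pebble's fuelled zone is divided into 15000 equal cubes, one CFP each.
fuel_zone_radius_cm = 2.5   # assumption: standard PBMR pebble fuel-zone radius
n_cfp = 15000               # coated fuel particles per pebble (from the abstract)

fuel_zone_volume = 4.0 / 3.0 * math.pi * fuel_zone_radius_cm ** 3
side = (fuel_zone_volume / n_cfp) ** (1.0 / 3.0)
print(round(side, 3))  # 0.163
```

The result agrees with the 0.1635 cm in the abstract to within rounding, supporting the assumed fuel-zone radius.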
Mackenzie, Paul
1989-01-01
The forty-year dream of understanding the properties of the strongly interacting particles from first principles is now approaching reality. Quantum chromodynamics (QCD - the field theory of the quark and gluon constituents of strongly interacting particles) was initially handicapped by the severe limitations of the conventional (perturbation) approach in this picture, but Ken Wilson's inventions of lattice gauge theory and renormalization group methods opened new doors, making calculations of masses and other particle properties possible. Lattice gauge theory became a major industry around 1980, when Monte Carlo methods were introduced, and the first prototype calculations yielded qualitatively reasonable results. The promising developments over the past year were highlighted at the 1988 Symposium on Lattice Field Theory - Lattice 88 - held at Fermilab
Risager, Morten S.; Södergren, Carl Anders
2017-01-01
It is well known that the angles in a lattice acting on hyperbolic n-space become equidistributed. In this paper we determine a formula for the pair correlation density for angles in such hyperbolic lattices. Using this formula we determine, among other things, the asymptotic behavior of the density function in both the small and large variable limits. This extends earlier results by Boca, Pasol, Popa and Zaharescu and Kelmer and Kontorovich in dimension 2 to general dimension n. Our proofs use the decay of matrix coefficients together with a number of careful estimates.
Kulikowska, T.
1999-01-01
The main goal of the present lecture is to show how transport lattice calculations are realised in a standard computer code. This is illustrated with the example of the WIMSD code, one of the most popular tools for reactor calculations. Most of the approaches discussed here can easily be adapted to any other lattice code. The description of the code assumes basic knowledge of the reactor lattice, at the level given in the lecture on 'Reactor lattice transport calculations'. For a more advanced explanation of the WIMSD code the reader is directed to the detailed descriptions of the code cited in the References. The discussion of the methods and models included in the code is followed by the generally used homogenisation procedure and several numerical examples of discrepancies in calculated multiplication factors based on different sources of library data. (author)
Christ, Norman H
2000-01-01
The architecture and capabilities of the computers currently in use for large-scale lattice QCD calculations are described and compared. Based on this present experience, possible future directions are discussed
Kulikowska, T.
2001-01-01
The description of reactor lattice codes is carried out using the example of the WIMSD-5B code. The WIMS code, in its various versions, is the most widely recognised lattice code. It is used in all parts of the world for calculations of research and power reactors. The version WIMSD-5B is distributed free of charge by the NEA Data Bank. The description of its main features given in the present lecture follows the aspects defined previously for lattice calculations in the lecture on Reactor Lattice Transport Calculations. The spatial models are described, and the approach to the energy treatment is given. Finally, the specific algorithm applied in fuel depletion calculations is outlined. (author)
A Global Vision over Benchmarking Process: Benchmarking Based Enterprises
Sitnikov, Catalina; Giurca Vasilescu, Laura
2008-01-01
Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from an analysis of performance that underlines the strengths and weaknesses of the enterprise, it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...
Benchmarking and Learning in Public Healthcare
Buckmaster, Natalie; Mouritsen, Jan
2017-01-01
This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking can enable learning in public settings but that this requires actors to invest in ensuring that benchmark data are directed towards improvement.
Petronzio, R.
1992-01-01
Lattice gauge theories are about fifteen years old and I will report on the present status of the field without making the elementary introduction that can be found in the proceedings of the last two conferences. The talk briefly covers the following subjects: the determination of α_s, the status of spectroscopy, heavy quark physics and in particular the calculation of their hadronic weak matrix elements, high temperature QCD, non-perturbative Higgs bounds, chiral theories on the lattice, and induced theories
Kiefel, Martin; Jampani, Varun; Gehler, Peter V.
2014-01-01
This paper presents a convolutional layer that is able to process sparse input features. As an example, for image recognition problems this allows an efficient filtering of signals that do not lie on a dense grid (like pixel position), but of more general features (such as color values). The presented algorithm makes use of the permutohedral lattice data structure. The permutohedral lattice was introduced to efficiently implement a bilateral filter, a commonly used image processing operation....
Castle, Toen; Sussman, Daniel M; Tanis, Michael; Kamien, Randall D
2016-09-01
Kirigami uses bending, folding, cutting, and pasting to create complex three-dimensional (3D) structures from a flat sheet. In the case of lattice kirigami, this cutting and rejoining introduces defects into an underlying 2D lattice in the form of points of nonzero Gaussian curvature. A set of simple rules was previously used to generate a wide variety of stepped structures; we now pare back these rules to their minimum. This allows us to describe a set of techniques that unify a wide variety of cut-and-paste actions under the rubric of lattice kirigami, including adding new material and rejoining material across arbitrary cuts in the sheet. We also explore the use of more complex lattices and the different structures that consequently arise. Regardless of the choice of lattice, creating complex structures may require multiple overlapping kirigami cuts, where subsequent cuts are not performed on a locally flat lattice. Our additive kirigami method describes such cuts, providing a simple methodology and a set of techniques to build a huge variety of complex 3D shapes.
Classical Logic and Quantum Logic with Multiple and Common Lattice Models
Mladen Pavičić
2016-01-01
We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (the algebra underlying Hilbert (quantum) space). We give an equivalent proof for classical logic, which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, this enables us to run the same classical logic both on a digital (standard, two-subset, 0-1 bit) computer and on a nondigital (say, six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve as a benchmark for efficient evaluation of equations of bigger lattice models or theorems of the logic.
Lattice regularized chiral perturbation theory
Borasoy, Bugra; Lewis, Randy; Ouimet, Pierre-Philippe A.
2004-01-01
Chiral perturbation theory can be defined and regularized on a spacetime lattice. A few motivations are discussed here, and an explicit lattice Lagrangian is reviewed. A particular aspect of the connection between lattice chiral perturbation theory and lattice QCD is explored through a study of the Wess-Zumino-Witten term
Vortex lattices in layered superconductors
Prokic, V.; Davidovic, D.; Dobrosavljevic-Grujic, L.
1995-01-01
We study vortex lattices in a superconductor/normal-metal superlattice in a parallel magnetic field. Distorted lattices, resulting from shear deformations along the layers, are found to be unstable. Under field variation, nonequilibrium configurations undergo an infinite sequence of continuous transitions, typical for soft lattices. The equilibrium vortex arrangement is always a lattice of isosceles triangles, without shear
Performance Targets and External Benchmarking
Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.
Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rate systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advantages. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...
Benchmarking and Sustainable Transport Policy
Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy
2004-01-01
In order to learn from the best, the European Commission in 2000 initiated research to explore benchmarking as a tool to promote policies for sustainable transport. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move the transport policy agenda forward. However, there are major conditions and limitations. First of all, it is not always straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly, sustainable transport evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly, policies are not directly comparable across space and context. For these reasons, attempting to benchmark sustainable transport policies against one another would be a highly complex task, which...
Processing of nuclear data for reactor applications
Trkov, A.; Ravnik, M.
1996-01-01
A brief description is given of the processing and validation of nuclear data in connection with the TRX-1, TRX-2, BAPL-1 and BAPL-2 thermal reactor benchmarks, among others, and in connection with the JEF-1, JENDL-3 and WIMS libraries. The validation of the WLUP results is also briefly discussed. 8 refs, 5 tabs
Thermal reactor benchmark testing of 69 group library
Liu Guisheng; Wang Yaoqing; Liu Ping; Zhang Baocheng
1994-01-01
Using the NSLINK code system, AMPX master libraries in the WIMS 69-group structure were generated from nuclides in the four newest evaluated nuclear data libraries. Integral parameters of 10 thermal reactor benchmark assemblies recommended by the U.S. CSEWG were calculated using the rectified PASC-1 code system and compared with foreign results; the authors' results are in good agreement with others. 69-group libraries of the evaluated data bases in the TPFAP interface file were generated with the NJOY code system. The k∞ values of 6 cell lattice assemblies were calculated with the CBM code. The calculated results are analysed and compared
Benchmarking: contexts and details matter.
Zheng, Siyuan
2017-07-05
Benchmarking is an essential step in the development of computational tools. We take this opportunity to pitch in our opinions on tool benchmarking, in light of two correspondence articles published in Genome Biology.Please see related Li et al. and Newman et al. correspondence articles: www.dx.doi.org/10.1186/s13059-017-1256-5 and www.dx.doi.org/10.1186/s13059-017-1257-4.
Handbook of critical experiments benchmarks
Durst, B.M.; Bierman, S.R.; Clayton, E.D.
1978-03-01
Data from critical experiments have been collected together for use as benchmarks in evaluating calculational techniques and nuclear data. These benchmarks have been selected from the numerous experiments performed on homogeneous plutonium systems. No attempt has been made to reproduce all of the data that exists. The primary objective in the collection of these data is to present representative experimental data defined in a concise, standardized format that can easily be translated into computer code input
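The "concise, standardized format" idea can be sketched as a simple record that maps directly onto code input. All field names and values below are invented for illustration, not taken from the handbook:

```python
# Hypothetical standardized benchmark record for a critical experiment,
# structured so it can be translated mechanically into code input.
benchmark = {
    "name": "pu-solution-sphere-example",  # invented identifier
    "fissile": "plutonium",
    "geometry": "sphere",
    "radius_cm": 14.0,
    "density_g_cm3": 1.2,
    "reflector": "water",
    "measured_keff": 1.000,
    "uncertainty": 0.003,
}

def to_code_input(record):
    # Render the record as simple keyword input lines for a transport code.
    return "\n".join(f"{key} = {value}" for key, value in record.items())

print(to_code_input(benchmark))
```

Keeping every experiment in one such schema is what makes the collection usable as a validation suite: the same script can drive a calculational method through all benchmarks and compare against each measured value.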
Analysis of Benchmark 2 results
Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.
1994-01-01
The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performance through our results, especially in the thick-target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks allow possible improvements of the physical models used in our codes. Thereafter, a scheme for a radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab
Benchmarks for GADRAS performance validation
Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.
2009-01-01
The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.
Benchmarking in Czech Higher Education
Plaček Michal
2015-12-01
The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which lie in the design of a model of collaborative benchmarking for Czech economics and management higher-education programs. Because the fully complex model cannot be implemented immediately, which is also confirmed by structured interviews with academics who have practical experience with benchmarking, the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.
Dynamic benchmarking of simulation codes
Henry, R.E.; Paik, C.Y.; Hauser, G.M.
1996-01-01
Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been known from industrial experience, large integral experiments and separate effects tests. Typically, simulation codes are benchmarked with some of these; the level of agreement necessary being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, the capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks are included in the source code and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients; large integral experiments; and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
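A minimal sketch of the "dynamic benchmarking" pattern described above, with invented names and a stand-in model rather than MAAP4's actual implementation:

```python
# Reference results are archived alongside the simulation code and
# re-exercised on every release, so upgrades cannot silently break
# previously validated transients.
BENCHMARKS = {
    # name: (inputs, expected result, relative tolerance) -- all invented
    "plant_transient_A": ({"power": 1.0}, 2.0, 0.05),
    "integral_experiment_B": ({"power": 2.5}, 5.0, 0.05),
}

def simulate(inputs):
    # Stand-in for the real simulation code under test.
    return 2.0 * inputs["power"]

def run_dynamic_benchmarks():
    """Return the names of archived benchmarks the current code fails."""
    failures = []
    for name, (inputs, expected, tol) in BENCHMARKS.items():
        result = simulate(inputs)
        if abs(result - expected) > tol * abs(expected):
            failures.append(name)
    return failures

print(run_dynamic_benchmarks())  # [] -- all archived benchmarks still pass
```

Embedding the benchmark set in the source tree, as the abstract describes for MAAP4, means the check runs whenever the archive codes are upgraded and distributed, preserving trust in the code as it evolves.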
DRAGON solutions to the 3D transport benchmark over a range in parameter space
Martin, Nicolas; Hebert, Alain; Marleau, Guy
2010-01-01
DRAGON solutions to the 'NEA suite of benchmarks for 3D transport methods and codes over a range in parameter space' are discussed in this paper. A description of the benchmark is first provided, followed by a detailed review of the different computational models used in the lattice code DRAGON. Two numerical methods were selected for generating the required quantities for the 729 configurations of this benchmark. First, S_N calculations were performed using fully symmetric angular quadratures and high-order diamond differencing for spatial discretization. To compare the S_N results with those of another deterministic method, the method of characteristics (MoC) was also considered for this benchmark. Comparisons between the reference solutions, S_N and MoC results illustrate the advantages and drawbacks of each method for this 3D transport problem.
Chodos, A.
1978-01-01
A version of lattice gauge theory is presented in which the shape of the lattice is not assumed at the outset but is a consequence of the dynamics. Other related features which are not specified a priori include the internal and space-time symmetry groups and the dimensionality of space-time. The theory possesses a much larger invariance group than the usual gauge group on a lattice, and has associated with it an integer k_0 analogous to the topological quantum number of quantum chromodynamics. Families of semiclassical solutions are found which are labeled by k_0 and a second integer x, but the analysis is not carried far enough to determine which space-time and internal symmetry groups characterize the lowest-lying states of the theory
Graphene antidot lattice waveguides
Pedersen, Jesper Goor; Gunst, Tue; Markussen, Troels
2012-01-01
We introduce graphene antidot lattice waveguides: nanostructured graphene where a region of pristine graphene is sandwiched between regions of graphene antidot lattices. The band gaps in the surrounding antidot lattices enable localized states to emerge in the central waveguide region. We model the waveguides via a position-dependent mass term in the Dirac approximation of graphene and arrive at analytical results for the dispersion relation and spinor eigenstates of the localized waveguide modes. To include atomistic details we also use a tight-binding model, which is in excellent agreement with the analytical results. The waveguides resemble graphene nanoribbons, but without the particular properties of ribbons that emerge due to the details of the edge. We show that electrons can be guided through kinks without additional resistance and that transport through the waveguides is robust against...
Regional Competitive Intelligence: Benchmarking and Policymaking
Huggins , Robert
2010-01-01
Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...
Catterall, Simon; Kaplan, David B.; Unsal, Mithat
2009-03-31
We provide an introduction to recent lattice formulations of supersymmetric theories which are invariant under one or more real supersymmetries at nonzero lattice spacing. These include the especially interesting case of N = 4 SYM in four dimensions. We discuss approaches based both on twisted supersymmetry and orbifold-deconstruction techniques and show their equivalence in the case of gauge theories. The presence of an exact supersymmetry reduces and in some cases eliminates the need for fine tuning to achieve a continuum limit invariant under the full supersymmetry of the target theory. We discuss open problems.
Benchmarking of human resources management
David M. Akinnusi
2008-11-01
This paper reviews the role of human resource management (HRM) which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.
Benchmark simulation models, quo vadis?
Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D
2013-01-01
As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.
Radiation Detection Computational Benchmark Scenarios
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for
Krojts, M.
1987-01-01
The book by the well-known American theoretical physicist M. Creutz is the first monograph in the world literature in which a promising new direction in elementary particle physics and quantum field theory, the lattice formulation of gauge theories, is presented systematically. Practically all the main ideas of this direction are covered. The material is presented in a systematic and accessible form.
Phenomenology Using Lattice QCD
Gupta, R.
2005-08-01
This talk provides a brief summary of the status of lattice QCD calculations of the light quark masses and the kaon bag parameter B_K. Precise estimates of these four fundamental parameters of the standard model, i.e., m_u, m_d, m_s and the CP-violating parameter η, help constrain grand unified models and could provide a window to new physics.
Bali, G.S.
2005-01-01
I comment on progress of lattice QCD techniques and calculations. Recent results on pentaquark masses as well as of the spectrum of excited baryons are summarized and interpreted. The present state of calculations of quantities related to the nucleon structure and of electromagnetic transition form factors is surveyed
Finite lattice extrapolation algorithms
Henkel, M.; Schuetz, G.
1987-08-01
Two algorithms for sequence extrapolation, due to von den Broeck and Schwartz and to Bulirsch and Stoer, are reviewed and critically compared. Applications to three-state and six-state quantum chains and to the (2+1)D Ising model show that the algorithm of Bulirsch and Stoer is superior, in particular if only very few finite-lattice data are available. (orig.)
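To make the idea of finite-lattice extrapolation concrete, here is a minimal sketch using Neville's polynomial extrapolation in 1/L, a simpler cousin of the Bulirsch-Stoer rational extrapolation discussed above; the data series below is hypothetical, chosen only to illustrate recovering an L → ∞ limit from a handful of finite-lattice values.

```python
# Illustrative finite-lattice extrapolation: Neville's algorithm evaluates, at
# x = 0, the polynomial through the points (1/L_i, f(L_i)), estimating the
# L -> infinity limit from a few finite-lattice values. This is a sketch of
# the general idea, not the Bulirsch-Stoer algorithm itself.

def neville_extrapolate(xs, ys, x=0.0):
    """Evaluate at x the interpolating polynomial through (xs[i], ys[i])."""
    p = list(ys)
    n = len(xs)
    for m in range(1, n):
        for i in range(n - m):
            p[i] = ((x - xs[i + m]) * p[i] + (xs[i] - x) * p[i + 1]) \
                   / (xs[i] - xs[i + m])
    return p[0]

# Hypothetical finite-lattice data with corrections in powers of 1/L:
# f(L) = 2 + 3/L + 5/L^2, so the true L -> infinity value is 2.
sizes = [2, 4, 8, 16]
data = [2 + 3 / L + 5 / L**2 for L in sizes]
estimate = neville_extrapolate([1 / L for L in sizes], data)
```

Since the mock data is exactly polynomial in 1/L, the extrapolation recovers the limit to machine precision; for real finite-lattice data the convergence of the table diagonals is what signals a reliable estimate.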
Williamson, S. Gill
2010-01-01
Will the cosmological multiverse, when described mathematically, have easily stated properties that are impossible to prove or disprove using mathematical physics? We explore this question by constructing lattice multiverses which exhibit such behavior even though they are much simpler mathematically than any likely cosmological multiverse.
de Raedt, Hans; von der Linden, W.; Binder, K
1995-01-01
In this chapter we review methods currently used to perform Monte Carlo calculations for quantum lattice models. A detailed exposition is given of the formalism underlying the construction of the simulation algorithms. We discuss the fundamental and technical difficulties that are encountered and
Scott, Paul
2006-01-01
A "convex" polygon is one with no re-entrant angles. Alternatively one can use the standard convexity definition, asserting that for any two points of the convex polygon, the line segment joining them is contained completely within the polygon. In this article, the author provides a solution to a problem involving convex lattice polygons.
Autin, B.
1984-01-01
After a description of the constraints imposed by the cooling of Antiprotons on the lattice of the rings, the reasons which motivate the shape and the structure of these machines are surveyed. Linear and non-linear beam optics properties are treated with a special amplification to the Antiproton Accumulator. (orig.)
3-D neutron transport benchmarks
Takeda, T.; Ikeda, H.
1991-03-01
A set of 3-D neutron transport benchmark problems proposed by Osaka University to NEACRP in 1988 has been calculated by many participants, and the corresponding results are summarized in this report. The results for k_eff, control rod worth and region-averaged fluxes for the four proposed core models, calculated using various 3-D transport codes, are compared and discussed. The calculational methods used were: Monte Carlo, discrete ordinates (Sn), spherical harmonics (Pn), nodal transport and others. The solutions of the four core models are quite useful as benchmarks for checking the validity of 3-D neutron transport codes.
Strategic behaviour under regulatory benchmarking
Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management
2004-09-01
In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulation benchmarking can influence the "regulation game", the subject has received limited attention. This paper discusses how strategic behaviour can result in inefficient behaviour by firms. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)
Atomic Energy Research benchmark activity
Makai, M.
1998-01-01
The test problems utilized in the validation and verification of computer programs in Atomic Energy Research are collected here in one place. This is the first step towards issuing a volume in which tests for VVER are collected, along with reference solutions and a number of solutions. The benchmarks do not include the ZR-6 experiments because they have been published, along with a number of comparisons, in the final reports of TIC. The present collection focuses on operational and mathematical benchmarks which cover almost the entire range of reactor calculations. (Author)
Unquenched lattice upsilon spectroscopy
Marcantonio, L.M.
2001-03-01
A non-relativistic effective theory of QCD (NRQCD) is used in calculations of the upsilon spectrum. Simultaneous multi-correlation fitting routines are used to yield lattice channel energies and amplitudes. The lattice configurations used were both dynamical, with two flavours of sea quarks included in the action, and quenched, with no sea quarks. These configurations were generated by the UKQCD collaboration. The dynamical configurations used were "matched", having the same lattice spacing but differing in the sea quark mass. Thus, it was possible to analyse trends of observables with sea quark mass, in the certainty that the trend is not partially due to varying lattice spacing. The lattice spacing used for spectroscopy was derived from the lattice 1¹P₁-1³S₁ splitting. On each set of configurations two lattice bare b quark masses were used, giving kinetic masses bracketing the physical Υ mass. The only quantity showing a strong dependence on these masses was the hyperfine splitting, so it was interpolated to the real Υ mass. The radial and orbital splittings gave good agreement with experiment. The hyperfine splitting results showed a clear signal for unquenching, and the dynamical hyperfine splitting results were extrapolated to a physical sea quark mass. This result, combined with the quenched result, yielded a value for the hyperfine splitting at n_f = 3, predicting an η_b mass of 9.517(4) GeV. The NRQCD technique for obtaining a value of the strong coupling constant in the MS-bar scheme was followed. Using quenched and dynamical results a value was extrapolated to n_f = 3. Employing a three-loop beta function to run the coupling, with suitable matching conditions at heavy quark thresholds, the final result was obtained for n_f = 5 at a scale equal to the Z boson mass. This result was α_MS-bar^(5)(M_Z) = 0.110(4). Two methods for finding the mass of the b quark in the MS-bar scheme were employed. The results of both methods agree within error but the
WIMS library up-date project: first stage results
Prati, A.; Claro, L.H.
1990-01-01
The following benchmarks: TRX1, TRX2, BAPL-UO₂-1, BAPL-UO₂-2 and BAPL-UO₂-3 have been calculated with the WIMSD/4 code, as a contribution of CTA/IEAv to the first stage of the WIMS Library Update Project, coordinated by the International Atomic Energy Agency. The card-image input for each benchmark is attached and the major input options/parameters are commented. The version of the WIMSD/4 code and the multigroup cross-section library used to run the benchmarks are specified. Results for the major integral parameters are presented and discussed. (author)
Monte Carlo burnup simulation of the TAKAHAMA-3 benchmark experiment
Dalle, Hugo M.
2009-01-01
High burnup PWR fuel is currently being studied at CDTN/CNEN-MG. The Monte Carlo burnup code system MONTEBURNS is used to characterize the neutronic behavior of the fuel. In order to validate the code system and the calculation methodology to be used in this study, the Japanese Takahama-3 benchmark was chosen, as it is the only freely available burnup benchmark experimental data set that partially reproduces the conditions of the fuel under evaluation. The burnup of the three PWR fuel rods of the Takahama-3 burnup benchmark was calculated by MONTEBURNS using the simplest infinite fuel pin cell model and also a more complex representation of an infinite lattice of heterogeneous fuel pin cells. Calculated results for the masses of most isotopes of Uranium, Neptunium, Plutonium, Americium, Curium and some fission products, commonly used as burnup monitors, were compared with the Post Irradiation Examination (PIE) values for all three fuel rods. The results have shown some sensitivity to the MCNP neutron cross-section data libraries, particularly to the temperature at which the evaluated nuclear data files were processed. (author)
Superspace approach to lattice supersymmetry
Kostelecky, V.A.; Rabin, J.M.
1984-01-01
We construct a cubic lattice of discrete points in superspace, as well as a discrete subgroup of the supersymmetry group which maps this "superlattice" into itself. We discuss the connection between this structure and previous versions of lattice supersymmetry. Our approach clarifies the mathematical problems of formulating supersymmetric lattice field theories and suggests new methods for attacking them.
Basis reduction for layered lattices
Torreão Dassen, Erwin
2011-01-01
We develop the theory of layered Euclidean spaces and layered lattices. We present algorithms to compute both Gram-Schmidt and reduced bases in this generalized setting. A layered lattice can be seen as a lattice in which certain directions have infinite weight. It can also be
Woloshyn, R.M.
1988-03-01
The basic concepts of the Lagrangian formulation of lattice field theory are discussed. The Wilson and staggered schemes for dealing with fermions on the lattice are described. Some recent results for hadron masses and vector and axial vector current matrix elements in lattice QCD are reviewed. (Author) (118 refs., 16 figs.)
Basis reduction for layered lattices
E.L. Torreão Dassen (Erwin)
2011-01-01
We develop the theory of layered Euclidean spaces and layered lattices. With this new theory, certain problems that are usually solved by using classical lattices with a "weighting" gain a new, more natural form. Using the layered lattice basis reduction algorithms introduced here these
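As background for the two basis-reduction abstracts above, here is a minimal sketch of reduction in the classical (non-layered) setting: the Lagrange-Gauss algorithm for a rank-2 integer lattice. The layered generalization developed in the thesis is more involved; this only illustrates the basic reduction steps (subtract the nearest-integer multiple of the shorter vector, swap by length).

```python
# Lagrange-Gauss reduction of a 2D integer lattice basis: repeatedly subtract
# the nearest-integer projection of b2 onto b1 and keep the vectors ordered by
# length. Terminates with a shortest possible basis of the same lattice.

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

def gauss_reduce(b1, b2):
    """Return a reduced basis of the rank-2 lattice spanned by b1 and b2."""
    while True:
        if dot(b1, b1) > dot(b2, b2):
            b1, b2 = b2, b1                      # keep b1 the shorter vector
        m = round(dot(b1, b2) / dot(b1, b1))     # nearest-integer projection
        if m == 0:
            return b1, b2                        # |mu| <= 1/2: basis is reduced
        b2 = (b2[0] - m * b1[0], b2[1] - m * b1[1])

r1, r2 = gauss_reduce((1, 1), (3, 4))
```

The returned basis spans the same lattice (the change-of-basis matrix is unimodular) but consists of the shortest achievable vectors; for the example basis above, both reduced vectors have squared length 1.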
Benchmarked Library Websites Comparative Study
Ramli, Rindra M.; Tyhurst, Janis
2015-01-01
This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study includes a comparison of these websites against a list of criteria and presents a list of services that are most commonly deployed by the selected websites. In addition, the investigators proposed a list of services that could be provided via the KAUST library website.
Prismatic Core Coupled Transient Benchmark
Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.
2011-01-01
The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated in the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art compared to LWR reactor technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluations of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, to compare methods and tools in core simulation and thermal hydraulics analysis with a specific focus on transient events. The benchmark-working group is currently seeking OECD/NEA sponsorship. This benchmark is being pursued and is heavily based on the success of the PBMR-400 exercise.
Lattice QCD calculations on commodity clusters at DESY
Gellrich, A.; Pop, D.; Wegner, P.; Wittig, H.; Hasenbusch, M.; Jansen, K.
2003-06-01
Lattice Gauge Theory is an integral part of particle physics that requires high performance computing in the multi-Tflops regime. These requirements are motivated by the rich research program and the physics milestones to be reached by the lattice community. Over the last years the enormous gains in processor performance, memory bandwidth, and external I/O bandwidth for parallel applications have made commodity clusters exploiting PCs or workstations suitable also for large Lattice Gauge Theory applications. For more than one year two clusters have been operated at the two DESY sites in Hamburg and Zeuthen, consisting of 32 and 16 dual-CPU PCs, respectively, equipped with Intel Pentium 4 Xeon processors. Interconnection of the nodes is done by way of Myrinet. Linux was chosen as the operating system. In the course of the projects, benchmark programs for architectural studies were developed. The performance of the Wilson-Dirac operator (also in an even-odd preconditioned version) as the inner loop of the Lattice QCD (LQCD) algorithms plays the most important role in classifying the hardware basis to be used. Using the SIMD streaming extensions (SSE/SSE2) on Intel's Pentium 4 Xeon CPUs gives promising results for both the single-CPU and the parallel version. The parallel performance, in addition to the CPU power and the memory throughput, is nevertheless strongly influenced by the behavior of hardware components like the PC chip-set and the communication interfaces. The paper starts by giving a short explanation of the physics background and the motivation for using PC clusters for Lattice QCD. Subsequently, the concept, implementation, and operating experiences of the two clusters are discussed. Finally, the paper presents benchmark results and discusses comparisons to systems with different hardware components including Myrinet-, GigaBit-Ethernet-, and Infiniband-based interconnects. (orig.)
Buechner, O. [Zentralinstitut fuer Angewandte Mathematik ZAM, 52425 Juelich (Germany); Ernst, M. [Deutsches Elektronen-Synchrotron DESY, 22603 Hamburg (Germany); Jansen, K. [John von Neumann-Institut fuer Computing NIC/DESY, 15738 Zeuthen (Germany); Lippert, Th. [Zentralinstitut fuer Angewandte Mathematik ZAM, 52425 Juelich (Germany); Melkumyan, D. [Deutsches Elektronen-Synchrotron DESY, 15738 Zeuthen (Germany); Orth, B. [Zentralinstitut fuer Angewandte Mathematik ZAM, 52425 Juelich (Germany); Pleiter, D. [John von Neumann-Institut fuer Computing NIC/DESY, 15738 Zeuthen (Germany)]. E-mail: dirk.pleiter@desy.de; Stueben, H. [Konrad-Zuse-Institut fuer Informationstechnik ZIB, 14195 Berlin (Germany); Wegner, P. [Deutsches Elektronen-Synchrotron DESY, 15738 Zeuthen (Germany); Wollny, S. [Konrad-Zuse-Institut fuer Informationstechnik ZIB, 14195 Berlin (Germany)
2006-04-01
As the need for computing resources to carry out numerical simulations of Quantum Chromodynamics (QCD) formulated on a lattice has increased significantly, efficient use of the generated data has become a major concern. To improve on this, groups plan to share their configurations on a worldwide level within the International Lattice DataGrid (ILDG). Doing so requires standardized description of the configurations, standards on binary file formats and common middleware interfaces. We describe the requirements and problems, and discuss solutions. Furthermore, an overview is given on the implementation of the LatFor DataGrid [http://www-zeuthen.desy.de/latfor/ldg], a France/German/Italian grid that will be one of the regional grids within the ILDG grid-of-grids concept.
Borsanyi, Sz.; Kampert, K.H.; Fodor, Z.; Forschungszentrum Juelich; Eoetvoes Univ., Budapest
2016-06-01
We present a full result for the equation of state (EoS) in 2+1+1 flavour lattice QCD (up/down, strange and charm quarks are present). We extend this analysis and give the equation of state in 2+1+1+1 flavour QCD. In order to describe the evolution of the universe from temperatures of several hundred GeV down to the MeV scale we also include the known effects of the electroweak theory and give the effective degrees of freedom. As another application of lattice QCD we calculate the topological susceptibility (χ) up to the few-GeV temperature region. These two results, EoS and χ, can be used to predict the dark matter axion's mass in the post-inflation scenario and/or give the relationship between the axion's mass and the universal axionic angle, which acts as an initial condition of our universe.
Lutz, H.D.; Willich, P.
1977-01-01
The FIR absorption spectra of pyrite-type compounds RuS₂, RuS₂₋ₓSeₓ, RuSe₂, RuTe₂, OsS₂, OsSe₂, and PtP₂ as well as loellingite-type phosphides FeP₂, RuP₂, and OsP₂ are reported. For RuS₂, RuSe₂, RuTe₂, OsS₂, and PtP₂ all five infrared-allowed modes (k = 0) are observed. As a first result of a numerical normal coordinate treatment, vibration forms of the pyrite structure are communicated. The spectra show that lattice forces of corresponding sulfides, tellurides, and phosphides are of about the same strength, but increase strongly on substitution of iron by ruthenium and especially of ruthenium by osmium. The lattice constants of the RuS₂₋ₓSeₓ solid solution obey Vegard's rule. (author)
Solórzano, S.; Mendoza, M.; Succi, S.; Herrmann, H. J.
2018-01-01
We present a numerical scheme to solve the Wigner equation, based on a lattice discretization of momentum space. The moments of the Wigner function are recovered exactly, up to the desired order given by the number of discrete momenta retained in the discretization, which also determines the accuracy of the method. The Wigner equation is equipped with an additional collision operator, designed in such a way as to ensure numerical stability without affecting the evolution of the relevant moments of the Wigner function. The lattice Wigner scheme is validated for the case of quantum harmonic and anharmonic potentials, showing good agreement with theoretical results. It is further applied to the study of the transport properties of one- and two-dimensional open quantum systems with potential barriers. Finally, the computational viability of the scheme for the case of three-dimensional open systems is also illustrated.
Lattice Quantum Chromodynamics
Sachrajda, C T
2016-01-01
I review the application of the lattice formulation of QCD and large-scale numerical simulations to the evaluation of non-perturbative hadronic effects in Standard Model phenomenology. I present an introduction to the elements of the calculations and discuss the limitations, both in the range of quantities which can be studied and in the precision of the results. I focus particularly on the extraction of the QCD parameters, i.e. the quark masses and the strong coupling constant, and on important quantities in flavour physics. Lattice QCD is playing a central role in quantifying the hadronic effects necessary for the development of precision flavour physics and its use in exploring the limits of the Standard Model and in searches for inconsistencies which would signal the presence of new physics.
Lattices of dielectric resonators
Trubin, Alexander
2016-01-01
This book provides the analytical theory of complex systems composed of a large number of high-Q dielectric resonators. Spherical and cylindrical dielectric resonators with lower-order and also whispering-gallery oscillations, arranged in various lattices, are considered. A new approach to S-matrix parameter calculations based on perturbation theory of Maxwell equations, developed for a number of high-Q dielectric bodies, is introduced. All physical relationships are obtained in analytical form and are suitable for further computations. Essential attention is given to a new unified formalism for the description of scattering processes. The general scattering task for coupled eigen-oscillations of the whole system of dielectric resonators is described. The equations for the expansion coefficients are explained in an applicable way. The temporal Green functions for the dielectric resonator are presented. The scattering process of short pulses in dielectric filter structures, dielectric antennas and lattices of d...
Hasenfratz, A.; Hasenfratz, P.
1985-01-01
This paper deals almost exclusively with applications in QCD. Presumably QCD will remain in the center of lattice calculations in the near future. The existing techniques and the available computer resources should be able to produce trustworthy results in pure SU(3) gauge theory and in quenched hadron spectroscopy. Going beyond the quenched approximation might require some technical breakthrough or exceptional computer resources, or both. Computational physics has entered high-energy physics. From this point of view, lattice QCD is only one (although the most important, at present) of the research fields. Increasing attention is devoted to the study of other QFTs. It is certain that the investigation of nonasymptotically free theories, the Higgs phenomenon, or field theories that are not perturbatively renormalizable will be important research areas in the future
Lattice degeneracies of geometric fermions
Raszillier, H.
1983-05-01
We give the minimal numbers of degrees of freedom carried by geometric fermions on all lattices of maximal symmetries in d = 2, 3, and 4 dimensions. These numbers are lattice dependent, but in the (free) continuum limit part of the degrees of freedom have to escape to infinity by a built-in Wilson mechanism, and 2^d survive for any lattice. On self-reciprocal lattices we compare the minimal numbers of degrees of freedom of geometric fermions with the minimal numbers of naive fermions on these lattices and argue that these numbers are equal. (orig.)
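The factor 2^d in the abstract is the familiar species-doubling count for naive lattice fermions, which can be illustrated numerically: the naive dispersion involves sin(p_μ a), which vanishes both at p_μ = 0 and at the zone edge p_μ = π, so each dimension contributes a factor of 2. A small sketch:

```python
import math

# Count the zeros of |sin p| over the 1D Brillouin zone momenta p = 2*pi*n/N.
# Each zero is a fermion species; with d dimensions the species multiply,
# giving the 2^d doublers of the naive lattice discretization.

def zeros_per_dimension(N):
    """Number of lattice momenta p = 2*pi*n/N (n = 0..N-1) with sin(p) ~ 0."""
    return sum(1 for n in range(N)
               if abs(math.sin(2 * math.pi * n / N)) < 1e-12)

per_dim = zeros_per_dimension(16)          # zeros at p = 0 and p = pi
doublers = {d: per_dim**d for d in (2, 3, 4)}
```

This reproduces the counting only; the Wilson mechanism mentioned above removes the extra species by giving them masses that diverge in the continuum limit.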
1962-01-01
The panel was attended by prominent physicists from most of the well-known laboratories in the field of light-water lattices, who exchanged the latest information on the status of work in their countries and discussed both the theoretical and the experimental aspects of the subjects. The supporting papers covered most problems, including criticality, resonance absorption, thermal utilization, spectrum calculations and the physics of plutonium bearing systems. Refs, figs and tabs
Diffusion in heterogeneous lattices
Tarasenko, Alexander; Jastrabík, Lubomír
2010-01-01
Vol. 256, No. 17 (2010), pp. 5137-5144, ISSN 0169-4332. R&D Projects: GA AV ČR KAN301370701; GA MŠk (CZ) 1M06002. Institutional research plan: CEZ:AV0Z10100522. Keywords: lattice-gas systems; diffusion; Monte Carlo simulations. Subject RIV: BM - Solid Matter Physics; Magnetism. Impact factor: 1.795, year: 2010
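The record above pairs lattice-gas diffusion with Monte Carlo simulation; a minimal sketch of that technique is a tracer random walk on a lattice, here simplified to a homogeneous 1D lattice (unit spacing, unbiased hops), where the mean-square displacement after t hops is ⟨x²⟩ = t. Heterogeneous lattices as in the paper would enter through position-dependent hop probabilities; this sketch is illustrative only.

```python
import random

# Monte Carlo estimate of tracer diffusion on a homogeneous 1D lattice:
# average the squared displacement of many independent random walkers.
# For an unbiased nearest-neighbour walk, <x^2> after n_steps hops equals
# n_steps (unit lattice spacing), i.e. D = 1/2 in these units.

def mean_square_displacement(n_walkers, n_steps, rng):
    total = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += 1 if rng.random() < 0.5 else -1   # hop left or right
        total += x * x
    return total / n_walkers

rng = random.Random(1)                 # fixed seed for reproducibility
msd = mean_square_displacement(10000, 100, rng)   # expect roughly 100
```

With 10,000 walkers the statistical error on the estimate is of order 1, so the result lands close to the exact value of 100.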
Automated lattice data generation
Ayyar Venkitesh
2018-01-01
The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
Automated lattice data generation
Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.
2018-03-01
The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.
Kumar, J [Agra Coll. (India). Dept. of Physics
1977-03-01
In the present work, a local model pseudopotential has been proposed to study the lattice dynamics of thorium. The model potential depends on the core and ionic radii, and accounts for the s-d-f hybridization effects in a phenomenological way. When this form of potential is applied to derive the phonon dispersion curves of Th, sufficiently good agreement is found between the computed and experimental results.
Bowler, Ken
1990-01-01
One of the major recent developments in particle theory has been the use of very high performance computers to obtain approximate numerical solutions of quantum field theories by formulating them on a finite space-time lattice. The great virtue of this new technique is that it avoids the straitjacket of perturbation theory and can thus attack new, but very fundamental problems, such as the calculation of hadron masses in quark-gluon field theory (quantum chromodynamics - QCD)
Adamatzky, Andrew
2015-01-01
The book gives a comprehensive overview of state-of-the-art research and engineering in the theory and application of Lattice Automata in the design and control of autonomous robots. Automata and robots share the same notional meaning. Automata (from the Latinization of the Greek word “αυτόματον”), self-operating autonomous machines invented since ancient times, can easily be considered the first steps of robot-like efforts. Automata are mathematical models of robots, and they are also integral parts of robotic control systems. A Lattice Automaton is a regular array or a collective of finite state machines, or automata. The automata update their states by the same rules, depending on the states of their immediate neighbours. In the context of this book, Lattice Automata are used in developing modular reconfigurable robotic systems, path planning and map exploration for robots, robot controllers, synchronisation of robot collectives, robot vision, and parallel robotic actuators. All chapters are...
Digital lattice gauge theories
Zohar, Erez; Farace, Alessandro; Reznik, Benni; Cirac, J. Ignacio
2017-02-01
We propose a general scheme for a digital construction of lattice gauge theories with dynamical fermions. In this method, the four-body interactions arising in models with 2+1 dimensions and higher are obtained stroboscopically, through a sequence of two-body interactions with ancillary degrees of freedom. This yields stronger interactions than the ones obtained through perturbative methods, as typically done in previous proposals, and removes an important bottleneck in the road towards experimental realizations. The scheme applies to generic gauge theories with Lie or finite symmetry groups, both Abelian and non-Abelian. As a concrete example, we present the construction of a digital quantum simulator for a Z3 lattice gauge theory with dynamical fermionic matter in 2+1 dimensions, using ultracold atoms in optical lattices, involving three atomic species, representing the matter, gauge, and auxiliary degrees of freedom, that are separated in three different layers. By moving the ancilla atoms with a proper sequence of steps, we show how we can obtain the desired evolution in a clean, controlled way.
Dielectric lattice gauge theory
Mack, G.
1983-06-01
Dielectric lattice gauge theory models are introduced. They involve variables Φ(b) ∈ g that are attached to the links b = (x+e_μ, x) of the lattice and take their values in the linear space g which consists of real linear combinations of matrices in the gauge group G. The polar decomposition Φ(b) = U(b)σ_μ(x) specifies an ordinary lattice gauge field U(b) and a kind of dielectric field ε_ij ∝ σ_i σ_j* δ_ij. A gauge-invariant positive semidefinite kinetic term for the Φ-field is found, and it is shown how to incorporate Wilson fermions in a way which preserves Osterwalder-Schrader positivity. Theories with G = SU(2) and without matter fields are studied in some detail. It is proved that confinement holds, in the sense that Wilson loop expectation values show an area-law decay, if the Euclidean action has certain qualitative features which imply that Φ = 0 (i.e. dielectric field ≡ 0) is the unique maximum of the action. (orig.)
Dielectric lattice gauge theory
Mack, G.
1984-01-01
Dielectric lattice gauge theory models are introduced. They involve variables Φ(b) ∈ g that are attached to the links b = (x+e_μ, x) of the lattice and take their values in the linear space g which consists of real linear combinations of matrices in the gauge group G. The polar decomposition Φ(b) = U(b)σ_μ(x) specifies an ordinary lattice gauge field U(b) and a kind of dielectric field ε_ij ∝ σ_i σ_j* δ_ij. A gauge-invariant positive semidefinite kinetic term for the Φ-field is found, and it is shown how to incorporate Wilson fermions in a way which preserves Osterwalder-Schrader positivity. Theories with G = SU(2) and without matter fields are studied in some detail. It is proved that confinement holds, in the sense that Wilson-loop expectation values show an area-law decay, if the euclidean action has certain qualitative features which imply that Φ = 0 (i.e. dielectric field ≡ 0) is the unique maximum of the action. (orig.)
Toward lattice fractional vector calculus
Tarasov, Vasily E.
2014-09-01
An analog of fractional vector calculus for physical lattice models is suggested. We use an approach based on the models of three-dimensional lattices with long-range inter-particle interactions. The lattice analogs of fractional partial derivatives are represented by kernels of lattice long-range interactions, where the Fourier series transformations of these kernels have a power-law form with respect to wave vector components. In the continuum limit, these lattice partial derivatives give derivatives of non-integer order with respect to coordinates. In the three-dimensional description of the non-local continuum, the fractional differential operators have the form of fractional partial derivatives of the Riesz type. As examples of the applications of the suggested lattice fractional vector calculus, we give lattice models with long-range interactions for the fractional Maxwell equations of non-local continuous media and for the fractional generalization of the Mindlin and Aifantis continuum models of gradient elasticity.
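The power-law Fourier kernel described above can be sketched numerically. The following minimal Python example (an illustration under simplified assumptions, not the paper's own implementation) applies a Riesz-type fractional derivative of order alpha on a periodic one-dimensional lattice via the FFT, and checks that alpha = 2 recovers the ordinary second derivative, consistent with the stated continuum limit.

```python
import numpy as np

def riesz_derivative(f, length, alpha):
    """Riesz-type fractional derivative of order alpha on a periodic 1D
    lattice, implemented through the power-law Fourier kernel -|k|^alpha."""
    n = len(f)
    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)   # lattice wave numbers
    return np.fft.ifft(-np.abs(k) ** alpha * np.fft.fft(f)).real

# For alpha = 2 the operator must reduce to the ordinary second derivative,
# so applying it to cos(x) on [0, 2*pi) should return -cos(x).
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
err = np.max(np.abs(riesz_derivative(np.cos(x), 2 * np.pi, 2.0) + np.cos(x)))
```

For non-integer alpha the same kernel yields the lattice analog of the Riesz fractional derivative referred to in the abstract.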
Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners
Luštický Martin
2012-03-01
Tourism development plays an irreplaceable role in the regional policy of almost all countries, owing to its undeniable benefits for the local population in the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and consequently find themselves in a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and of the final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field for benchmarking methods, such approaches can be applied successfully. The paper focuses on a key phase of the benchmarking process: the search for suitable benchmarking partners. The partners are selected to meet general requirements that ensure the quality of strategies; following from this, specific characteristics are developed according to the SMART approach. The paper tests this procedure with an expert evaluation of eight selected regional tourism strategies from regions in the Czech Republic, Slovakia and Great Britain, thereby validating the selected criteria in an international setting. This makes it possible to identify strengths and weaknesses of the selected strategies and, at the same time, facilitates the discovery of suitable benchmarking partners.
BONFIRE: benchmarking computers and computer networks
Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker
2011-01-01
The benchmarking concept is not new in the field of computing or computer networking. With "benchmarking tools", one usually refers to a program or set of programs used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Service...
Benchmarking clinical photography services in the NHS.
Arbon, Giles
2015-01-01
Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.
How Benchmarking and Higher Education Came Together
Levy, Gary D.; Ronco, Sharron L.
2012-01-01
This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…
WWER-1000 Burnup Credit Benchmark (CB5)
Manolova, M.A.
2002-01-01
In this paper the specification of the first phase (depletion calculations) of the WWER-1000 Burnup Credit Benchmark is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be specified after evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field (Author)
Benchmarking and Learning in Public Healthcare
Buckmaster, Natalie; Mouritsen, Jan
2017-01-01
This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...
Integral benchmark test of JENDL-4.0 for U-233 systems with ICSBEP handbook
Kuwagaki, Kazuki; Nagaya, Yasunobu
2017-03-01
The integral benchmark test of JENDL-4.0 for U-233 systems using the continuous-energy Monte Carlo code MVP was conducted. The previous benchmark test was performed only for U-233 thermal-solution and fast metallic systems in the ICSBEP handbook. In this study, MVP input files were prepared for previously uninvestigated benchmark problems in the handbook, including compound thermal systems (mainly lattice systems), and the integral benchmark test was performed. The prediction accuracy of JENDL-4.0 was evaluated for the effective multiplication factors (k_eff) of the U-233 systems. As a result, a trend of underestimation was observed for all the categories of U-233 systems. In the benchmark test of ENDF/B-VII.1 for U-233 systems with the ICSBEP handbook, a decreasing trend of calculated k_eff values with the parameter ATFF (Above-Thermal Fission Fraction) has been reported. The ATFF values were also calculated in this benchmark test of JENDL-4.0, and the same trend as for ENDF/B-VII.1 was observed. A CD-ROM is attached as an appendix. (J.P.N.)
featsel: A framework for benchmarking of feature selection algorithms and cost functions
Marcelo S. Reis; Gustavo Estrela; Carlos Eduardo Ferreira; Junior Barrera
2017-01-01
In this paper, we introduce featsel, a framework for benchmarking of feature selection algorithms and cost functions. This framework allows the user to deal with the search space as a Boolean lattice and has its core coded in C++ for computational efficiency purposes. Moreover, featsel includes Perl scripts to add new algorithms and/or cost functions, generate random instances, plot graphs and organize results into tables. Besides, this framework already comes with dozens of algorithms and co...
Geothermal Heat Pump Benchmarking Report
None
1997-01-17
A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry; however, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) top management marketing commitment; (2) an understanding of the fundamentals of marketing and business development; and (3) an aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.
The development of code benchmarks
Glass, R.E.
1986-01-01
Sandia National Laboratories has undertaken a code benchmarking effort to define a series of cask-like problems having both numerical solutions and experimental data. The development of the benchmarks includes: (1) model problem definition, (2) code intercomparison, and (3) experimental verification. The first two steps are complete and a series of experiments is planned. The experiments will examine the elastic/plastic behavior of cylinders for both the end and side impacts resulting from a nine-meter drop. The cylinders will be made from stainless steel and aluminum to give a range of plastic deformations. This paper presents the results of analyses simulating the model's behavior using material properties for stainless steel and aluminum.
Benchmarking Variable Selection in QSAR.
Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars
2012-02-01
Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Closed-loop neuromorphic benchmarks
Stewart, TC
2015-11-01
Terrence C. Stewart, Travis DeWolf, Ashley Kleinhans, Chris Eliasmith. Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, ON, Canada; Mobile Intelligent Autonomous Systems group, Council for Scientific and Industrial Research, Pretoria, South Africa. Submitted to: Frontiers in Neuroscience.
Investible benchmarks & hedge fund liquidity
Freed, Marc S; McMillan, Ben
2011-01-01
A lack of commonly accepted benchmarks for hedge fund performance has permitted hedge fund managers to attribute to skill returns that may actually accrue from market risk factors and illiquidity. Recent innovations in hedge fund replication permit us to estimate the extent of this misattribution. Using an option-based model, we find evidence that the value of the liquidity options that investors implicitly grant managers when they invest may account for part or even all of hedge fund returns. C...
Lattice-induced nonadiabatic frequency shifts in optical lattice clocks
Beloy, K.
2010-01-01
We consider the frequency shift in optical lattice clocks which arises from the coupling of the electronic motion to the atomic motion within the lattice. For the simplest of three-dimensional lattice geometries this coupling is shown to affect only clocks based on blue-detuned lattices. We have estimated the size of this shift for the prospective strontium lattice clock operating at the 390-nm blue-detuned magic wavelength. The resulting fractional frequency shift is found to be on the order of 10^-18 and is largely overshadowed by the electric quadrupole shift. For lattice clocks based on more complex geometries or other atomic systems, this shift could potentially be a limiting factor in clock accuracy.
RISKIND verification and benchmark comparisons
Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.
1997-08-01
This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models
Lattice topology dictates photon statistics.
Kondakci, H Esat; Abouraddy, Ayman F; Saleh, Bahaa E A
2017-08-21
Propagation of coherent light through a disordered network is accompanied by randomization and possible conversion into thermal light. Here, we show that network topology plays a decisive role in determining the statistics of the emerging field if the underlying lattice is endowed with chiral symmetry. In such lattices, eigenmodes come in skew-symmetric pairs with oppositely signed eigenvalues. By examining one-dimensional arrays of randomly coupled waveguides arranged on linear and ring topologies, we are led to a remarkable prediction: the field circularity and the photon statistics in a ring lattice are dictated by its parity, while the same quantities are insensitive to the parity of a linear lattice. For a ring lattice, adding or subtracting a single lattice site can switch the photon statistics from super-thermal to sub-thermal, or vice versa. This behavior is understood by examining the real and imaginary fields on a lattice exhibiting chiral symmetry, which form two strands that interleave along the lattice sites. These strands can be fully braided around an even-sited ring lattice, thereby producing super-thermal photon statistics, while an odd-sited lattice is incommensurate with such an arrangement and the statistics become sub-thermal.
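The parity effect described above can be illustrated with a small numerical sketch (hypothetical couplings, not the authors' code): for purely nearest-neighbour couplings on a ring, the coupling matrix anticommutes with the chiral (sublattice) operator only when the ring is bipartite, so the eigenvalues come in oppositely signed pairs for an even number of sites but not for an odd one.

```python
import numpy as np

def ring_hamiltonian(n, rng):
    """Coupling matrix for n waveguides on a ring with random real
    nearest-neighbour couplings and no on-site (diagonal) terms."""
    h = np.zeros((n, n))
    for i in range(n):
        j = (i + 1) % n        # couple site i to its neighbour around the ring
        h[i, j] = h[j, i] = rng.uniform(0.5, 1.5)
    return h

def has_chiral_pairs(h, tol=1e-9):
    """True if the eigenvalues come in oppositely signed (chiral) pairs."""
    ev = np.sort(np.linalg.eigvalsh(h))
    return bool(np.allclose(ev, -ev[::-1], atol=tol))

rng = np.random.default_rng(1)
even_paired = has_chiral_pairs(ring_hamiltonian(6, rng))  # even-sited ring
odd_paired = has_chiral_pairs(ring_hamiltonian(5, rng))   # odd-sited ring
```

An even-sited ring is bipartite, so `even_paired` holds, while the odd-sited ring breaks the pairing: adding or removing one site toggles the symmetry, mirroring the parity switch of the photon statistics in the abstract.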
HS06 Benchmark for an ARM Server
Kluth, Stefan
2014-06-01
We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
Kilcup, G.
1986-01-01
A progress report on a lattice project at Los Alamos is presented. The projects are basically of two sorts: approaching the continuum (determination of MCRG flows under the blocking transformation, and beta-function along Wilson and improved action lines); and arriving at the continuum (hadron spectrum, coupling constants, and matrix elements). Since the ultimate goal is to determine matrix elements for which chiral symmetry is very relevant, the authors choose the formalism whose chiral properties are easier to understand, i.e., staggered fermions
Lattice of quantum predictions
Drieschner, Michael
1993-10-01
What is the structure of reality? Physics is supposed to answer this question, but a purely empiristic view is not sufficient to explain its ability to do so. Quantum mechanics has forced us to think more deeply about what a physical theory is. There are preconditions every physical theory must fulfill. It has to contain, e.g., rules for empirically testable predictions. Those preconditions give physics a structure that is “a priori” in the Kantian sense. An example is given how the lattice structure of quantum mechanics can be understood along these lines.
Lattice Vibrations in Chlorobenzenes:
Reynolds, P. A.; Kjems, Jørgen; White, J. W.
1974-01-01
Lattice vibrational dispersion curves for the "intermolecular" modes in the triclinic, one-molecule-per-unit-cell β phase of p-C6D4Cl2 and p-C6H4Cl2 have been obtained by inelastic neutron scattering. The deuterated sample was investigated at 295 and at 90 K, and a linear extrapolation to 0 K was applied in order to correct for anharmonic effects. Calculations based on the atom-atom model for van der Waals interaction and on general potential parameters for the aromatic compounds agree reasonably well with the experimental observations. There is no substantial improvement in fit obtained either...
Diamond lattice Heisenberg antiferromagnet
Oitmaa, J.
2018-04-01
We investigate ground-state and high-temperature properties of the nearest-neighbour Heisenberg antiferromagnet on the three-dimensional diamond lattice, using series expansion methods. The ground-state energy and magnetization, as well as the magnon spectrum, are calculated and found to be in good agreement with first-order spin-wave theory, with a quantum renormalization factor of about 1.13. High-temperature series are derived for the free energy, and physical and staggered susceptibilities for spin S = 1/2, 1 and 3/2, and analysed to obtain the corresponding Curie and Néel temperatures.
Lattice cell burnup calculation
Pop-Jordanov, J.
1977-01-01
Accurate burnup prediction is a key item for design and operation of a power reactor. It should supply information on isotopic changes at each point in the reactor core and the consequences of these changes on the reactivity, power distribution, kinetic characteristics, control rod patterns, fuel cycles and operating strategy. A basic stage in the burnup prediction is the lattice cell burnup calculation. This series of lectures attempts to give a review of the general principles and calculational methods developed and applied in this area of burnup physics
Crisafulli, M.; Martinelli, G.; Sachrajda, Christopher T.; Crisafulli, M; Gimenez, V; Martinelli, G; Sachrajda, C T
1994-01-01
We present the first lattice calculation of the B-meson binding energy Λ̄ and of the kinetic energy λ_1/(2m_Q) of the heavy quark inside the pseudoscalar B-meson. In order to cancel the ambiguities due to the ultraviolet renormalons present in the operator matrix elements, this calculation has required the non-perturbative subtraction of the power divergences present in the Lagrangian operator and in the kinetic energy operator. The non-perturbative renormalization of the relevant operators has been implemented by imposing suitable renormalization conditions on quark matrix elements in the Landau gauge.
Vidovsky, I.; Kereszturi, A.
1991-11-01
The results of experiments and calculations on Gd lattices are presented, and a comparison of experimental and calculational data is given. The comparison can be divided into four groups: the first concerns criticality parameters, the second 2D distributions, the third intra-macrocell distributions, and the fourth spectral parameters. For the comparison, the computer code RFIT, based on strict statistical criteria, has been used. The calculated and measured results agree sufficiently well in most cases. (R.P.) 11 refs.; 13 figs.; 9 tabs
Benchmark of the CASMO-3G/MICROBURN-B codes for Commonwealth Edison boiling water reactors
Wheeler, J.K.; Pallotta, A.S.
1992-01-01
The Commonwealth Edison Company has performed an extensive benchmark against measured data from three boiling water reactors using the Studsvik lattice physics code CASMO-3G and the Siemens Nuclear Power three-dimensional simulator code MICROBURN-B. The measured data of interest for this benchmark are the hot and cold reactivity and the core power distributions, as measured by the traversing incore probe system and by gamma scan data for fuel pins and assemblies. A total of nineteen unit-cycles were evaluated. The database included fuel product lines manufactured by General Electric and Siemens Nuclear Power, with assemblies containing 7 x 7 to 9 x 9 pin configurations, several water rod designs, various enrichments and gadolinia loadings, and axially varying lattice designs throughout the enriched portion of the bundle. The results of the benchmark present evidence that the CASMO-3G/MICROBURN-B code package can adequately model the range of fuel and core types in the benchmark, and the codes are acceptable for performing neutronic analyses of Commonwealth Edison's boiling water reactors
Argonne Code Center: Benchmark problem book.
None, None
1977-06-01
This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. It is the second supplement to the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.
Lattice Boltzmann model for three-phase viscoelastic fluid flow
Xie, Chiyu; Lei, Wenhai; Wang, Moran
2018-02-01
A lattice Boltzmann (LB) framework is developed for simulation of three-phase viscoelastic fluid flows in complex geometries. This model is based on a Rothman-Keller type model for immiscible multiphase flows which ensures mass conservation of each component in porous media even for a high density ratio. To account for the viscoelastic effects, the Maxwell constitutive relation is correctly introduced into the momentum equation, which leads to a modified lattice Boltzmann evolution equation for Maxwell fluids by removing the normal but excess viscous term. Our simulation tests indicate that this excess viscous term may induce significant errors. After three benchmark cases, the displacement processes of oil by dispersed polymer are studied as a typical example of three-phase viscoelastic fluid flow. The results show that increasing either the polymer intrinsic viscosity or the elastic modulus will enhance the oil recovery.
Lattice Transparency of Graphene.
Chae, Sieun; Jang, Seunghun; Choi, Won Jin; Kim, Youn Sang; Chang, Hyunju; Lee, Tae Il; Lee, Jeong-O
2017-03-08
Here, we demonstrated the transparency of graphene to the atomic arrangement of a substrate surface, i.e., the "lattice transparency" of graphene, by using hydrothermally grown ZnO nanorods as a model system. The growth behaviors of ZnO nanocrystals on graphene-coated and uncoated substrates with various crystal structures were investigated. The atomic arrangements of the nucleating ZnO nanocrystals exhibited a close match with those of the respective substrates despite the substrates being bound to the other side of the graphene. By using first-principles calculations based on density functional theory, we confirmed the energetic favorability of the nucleating phase following the atomic arrangement of the substrate even with the graphene layer present in between. In addition to transmitting information about the atomic lattice of the substrate, graphene also protected its surface. This dual role enabled the hydrothermal growth of ZnO nanorods on a Cu substrate, which otherwise dissolved in the reaction conditions when graphene was absent.
Introduction to lattice gauge theories
La Cock, P.
1988-03-01
A general introduction to Lattice Gauge Theory (LGT) is given. The theory is discussed from first principles to facilitate an understanding of the techniques used in LGT. These include lattice formalism, gauge invariance, fermions on the lattice, group theory and integration, strong coupling methods and mean field techniques. A review of quantum chromodynamics on the lattice at finite temperature and density is also given. Monte Carlo results and analytical methods are discussed. An attempt has been made to include most relevant data up to the end of 1987, and to update some earlier reviews existing on the subject. 224 refs., 33 figs., 14 tabs
Hadron structure from lattice QCD
Schaefer, Andreas
2008-01-01
Some elements and current developments of lattice QCD are reviewed, with special emphasis on hadron structure observables. In principle, high-precision experimental and lattice data nowadays provide a very detailed picture of the internal structure of hadrons. However, to relate the two, very good control of perturbative QCD is needed in many cases. Finally, chiral perturbation theory is extremely helpful for boosting the precision of lattice calculations. The mutual need and benefit of all four elements (experiment, lattice QCD, perturbative QCD and chiral perturbation theory) is the main topic of this review
Lattice formulations of reggeon interactions
Brower, R.C.; Ellis, J.; Savit, R.; Zinn-Justin, J.
1976-01-01
A class of lattice analogues of reggeon field theory is examined. First the transition from a continuum to a lattice field theory is discussed, emphasizing the necessity of a Wick rotation and the consideration of symmetry properties. Next the theory is transformed to a discrete system with two spins at each lattice site, and the problems of the triple-reggeon interaction and the reggeon energy gap are discussed. It is pointed out that transferring the theory from the continuum to a lattice necessarily introduces new relevant operators not normally present in reggeon field theory. (Auth.)
Straight velocity boundaries in the lattice Boltzmann method
Latt, Jonas; Chopard, Bastien; Malaspinas, Orestis; Deville, Michel; Michler, Andreas
2008-05-01
Various ways of implementing boundary conditions for the numerical solution of the Navier-Stokes equations by a lattice Boltzmann method are discussed. Five commonly adopted approaches are reviewed, analyzed, and compared, including local and nonlocal methods. The discussion is restricted to velocity Dirichlet boundary conditions and to straight on-lattice boundaries which are aligned with the horizontal and vertical lattice directions. The boundary conditions are first inspected analytically by systematically applying the results of a multiscale analysis to boundary nodes. This procedure makes it possible to compare boundary conditions on an equal footing, although they were originally derived from very different principles. It is concluded that all five boundary conditions exhibit second-order accuracy, consistent with the accuracy of the lattice Boltzmann method. The five methods are then compared numerically for accuracy and stability through benchmarks of two-dimensional and three-dimensional flows. None of the methods is found to be uniformly superior to the others. Instead, the choice of a best boundary condition depends on the flow geometry and on the desired trade-off between accuracy and stability. From the findings of the benchmarks, the boundary conditions can be classified into two major groups. The first group comprises boundary conditions that preserve the information streaming from the bulk into boundary nodes and complete the missing information through closure relations. Boundary conditions in this group are found to be exceptionally accurate at low Reynolds number. Boundary conditions of the second group replace all variables on boundary nodes by new values. They exhibit generally much better numerical stability and are therefore well suited for use in high-Reynolds-number flows.
Lattice Boltzmann simulation of flow around a confined circular cylinder
Ashrafizaadeh, M.; Zadehgol, A.
2002-01-01
A two-dimensional lattice Boltzmann model (LBM) based on a single-time-relaxation BGK model has been developed. Several benchmark problems, including the Poiseuille flow, the lid-driven cavity flow and the flow around a circular cylinder, have been computed employing a D2Q9 lattice. The laminar flow around a circular cylinder within a channel has been extensively investigated using the present lattice Boltzmann model. Both symmetric and asymmetric placement configurations of the circular cylinder within the channel have been considered. A new treatment for the outlet velocity and pressure (density) boundary conditions has been proposed and validated. The present LBM results are in excellent agreement with other existing CFD results. Careful examination of the LBM results and an appropriate calculation of the lift coefficient based on the rectangular-lattice representation of the circular cylinder reveal that the periodic oscillation of the lift coefficient has a second harmonic when the cylinder is placed asymmetrically within the channel. The second harmonic could be associated with an asymmetrical shedding pattern of the vortices from the upper and lower sides of the cylinder. (author)
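As a minimal illustration of the single-relaxation-time D2Q9 BGK scheme used in such studies (a generic sketch with assumed parameters, not the authors' code), the following Python program relaxes a decaying shear wave on a fully periodic lattice and compares the measured damping with the theoretical kinematic viscosity ν = (τ − 1/2)/3 in lattice units.

```python
import numpy as np

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Second-order D2Q9 equilibrium distribution."""
    cu = np.einsum('qd,dxy->qxy', c, u)
    usq = np.einsum('dxy,dxy->xy', u, u)
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f, tau):
    """One BGK collision followed by periodic streaming."""
    rho = f.sum(axis=0)
    u = np.einsum('qd,qxy->dxy', c, f) / rho
    f += (equilibrium(rho, u) - f) / tau
    for q in range(9):
        f[q] = np.roll(f[q], (c[q, 0], c[q, 1]), axis=(0, 1))
    return f

# Decaying shear wave: u_x(y, 0) = U sin(2*pi*y/N). The amplitude should
# decay as exp(-nu k^2 t) with nu = (tau - 1/2)/3 and k = 2*pi/N.
N, tau, U, steps = 32, 0.8, 0.01, 200
u0 = np.zeros((2, N, N))
u0[0] = U * np.sin(2*np.pi*np.arange(N)[None, :] / N)   # u_x varies along y
f = equilibrium(np.ones((N, N)), u0)
for _ in range(steps):
    f = step(f, tau)
amp = np.abs(np.einsum('qd,qxy->dxy', c, f)[0] / f.sum(axis=0)).max()
expected = U * np.exp(-((tau - 0.5)/3) * (2*np.pi/N)**2 * steps)
```

The measured amplitude should match the analytic decay closely at this resolution, while total mass is conserved exactly by the BGK collision.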
Entropic multirelaxation lattice Boltzmann models for turbulent flows
Bösch, Fabian; Chikatamarla, Shyam S.; Karlin, Ilya V.
2015-10-01
We present three-dimensional realizations of a class of lattice Boltzmann models introduced recently by the authors [I. V. Karlin, F. Bösch, and S. S. Chikatamarla, Phys. Rev. E 90, 031302(R) (2014), 10.1103/PhysRevE.90.031302] and review the role of the entropic stabilizer. Both coarse- and fine-grid simulations are addressed for the Kida vortex flow benchmark. We show that the outstanding numerical stability and performance is independent of a particular choice of the moment representation for high-Reynolds-number flows. We report accurate results for low-order moments for homogeneous isotropic decaying turbulence and second-order grid convergence for most assessed statistical quantities. It is demonstrated that all the three-dimensional lattice Boltzmann realizations considered herein converge to the familiar lattice Bhatnagar-Gross-Krook model when the resolution is increased. Moreover, thanks to the dynamic nature of the entropic stabilizer, the present model features less compressibility effects and maintains correct energy and enstrophy dissipation. The explicit and efficient nature of the present lattice Boltzmann method renders it a promising candidate for both engineering and scientific purposes for highly turbulent flows.
Convection-diffusion lattice Boltzmann scheme for irregular lattices
Sman, van der R.G.M.; Ernst, M.H.
2000-01-01
In this paper, a lattice Boltzmann (LB) scheme for convection-diffusion on irregular lattices is presented, which is free of any interpolation or coarse graining step. The scheme is derived using the axiom that the velocity moments of the equilibrium distribution equal those of the
Elimination of spurious lattice fermion solutions and noncompact lattice QCD
Lee, T.D.
1997-09-22
It is well known that the Dirac equation on a discrete hyper-cubic lattice in D dimensions has 2^D degenerate solutions. The usual method of removing these spurious solutions encounters difficulties with chiral symmetry when the lattice spacing l ≠ 0, as exemplified by the persistent problem of the pion mass. On the other hand, we recall that in any crystal in nature, all the electrons do move in a lattice and satisfy the Dirac equation; yet there is not a single physical result that has ever been entangled with a spurious fermion solution. Therefore it should not be difficult to eliminate these unphysical elements. On a discrete lattice, particles hop from point to point, whereas in a real crystal the lattice structure is embedded in a continuum and electrons move continuously from lattice cell to lattice cell. In a discrete system, the lattice functions are defined only on individual points (or links as in the case of gauge fields). However, in a crystal the electron state vector is represented by the Bloch wave functions, which are continuous functions of the position vector r, and herein lies one of the essential differences.
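The doubling counted above is easy to exhibit in one dimension, where the naive lattice dispersion |sin k| vanishes at both k = 0 and the zone edge, giving 2^D = 2 species. The sketch below also shows the usual Wilson-term remedy mentioned in the abstract (the momentum-dependent mass that removes the doubler at the price of breaking chiral symmetry at finite spacing); it is a generic illustration, not this paper's proposed method:

```python
import numpy as np

def naive_energy(k):
    """Massless naive lattice fermion in 1D (lattice spacing a = 1):
    E(k) = |sin k| vanishes at k = 0 and at the zone edge (the doubler)."""
    return np.abs(np.sin(k))

def wilson_energy(k, r=1.0, m=0.0):
    """Wilson term r*(1 - cos k) gives the doubler an energy gap 2r
    while leaving the k = 0 mode massless (for m = 0)."""
    return np.sqrt(np.sin(k)**2 + (m + r*(1.0 - np.cos(k)))**2)

k = -np.pi + 2.0*np.pi*np.arange(2000)/2000.0   # Brillouin zone [-pi, pi)
n_naive = int(np.isclose(naive_energy(k), 0.0, atol=1e-12).sum())
n_wilson = int(np.isclose(wilson_energy(k), 0.0, atol=1e-12).sum())
```

Counting the zeros on the grid gives two massless species for the naive action and one for the Wilson action.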
NASA Software Engineering Benchmarking Effort
Godfrey, Sally; Rarick, Heather
2012-01-01
Benchmarking was very interesting and provided a wealth of information: (1) we saw potential solutions to some of our "top 10" issues; (2) we obtained an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and to provide feedback on procedures; (2) they welcomed the opportunity to provide feedback on working with NASA.
NEACRP thermal fission product benchmark
Halsall, M.J.; Taubman, C.J.
1989-09-01
The objective of the thermal fission product benchmark was to compare the range of fission product data in use at the present time. A simple homogeneous problem was set with 200 atoms H/1 atom U235, to be burnt up to 1000 days and then decay for 1000 days. The problem was repeated with 200 atoms H/1 atom Pu239, 20 atoms H/1 atom U235 and 20 atoms H/1 atom Pu239. There were ten participants and the submissions received are detailed in this report. (author)
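The irradiate-then-decay history of this benchmark (burnup to 1000 days, then 1000 days of decay) can be sketched for a single two-member fission product chain, I-135 → Xe-135, solved exactly as a linear system. The yields, decay constants, and absorption rate below are approximate illustrative values, not the benchmark's evaluated data:

```python
import numpy as np

# Illustrative (approximate) constants for the I-135 -> Xe-135 chain;
# NOT the benchmark's evaluated nuclear data.
lam_I = np.log(2)/6.57             # I-135 decay constant, 1/h
lam_Xe = np.log(2)/9.14            # Xe-135 decay constant, 1/h
gamma_I, gamma_Xe = 0.064, 0.002   # fission yields (approximate)
sig_phi = 3.5e-5                   # Xe-135 absorption rate sigma*phi, 1/h (assumed)
fission_rate = 1.0                 # Sigma_f*phi, normalized

def system(phi_on):
    """Matrix A and source b of dN/dt = A N + b; production and
    neutron absorption vanish when the flux is switched off."""
    sp = sig_phi if phi_on else 0.0
    A = np.array([[-lam_I, 0.0],
                  [lam_I, -(lam_Xe + sp)]])
    b = (fission_rate if phi_on else 0.0)*np.array([gamma_I, gamma_Xe])
    return A, b

def evolve(N0, t, phi_on):
    """Exact solution of the linear chain via eigendecomposition."""
    A, b = system(phi_on)
    wv, V = np.linalg.eig(A)
    Neq = -np.linalg.solve(A, b)     # equilibrium (particular) solution
    return Neq + (V*np.exp(wv*t)) @ np.linalg.solve(V, np.asarray(N0, float) - Neq)

N_irr = evolve([0.0, 0.0], 1000*24.0, phi_on=True)    # 1000 d irradiation
N_dec = evolve(N_irr, 1000*24.0, phi_on=False)        # 1000 d decay
```

After 1000 days at constant flux the chain sits at its equilibrium concentrations, and after 1000 days of decay both short-lived nuclides have vanished.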
Benchmark neutron porosity log calculations
Little, R.C.; Michael, M.; Verghese, K.; Gardner, R.P.
1989-01-01
Calculations have been made for a benchmark neutron porosity log problem with the general purpose Monte Carlo code MCNP and the specific purpose Monte Carlo code McDNL. For accuracy and timing comparison purposes the CRAY XMP and MicroVax II computers have been used with these codes. The CRAY has been used for an analog version of the MCNP code while the MicroVax II has been used for the optimized variance reduction versions of both codes. Results indicate that the two codes give the same results within calculated standard deviations. Comparisons are given and discussed for accuracy (precision) and computation times for the two codes
On the characterization and software implementation of general protein lattice models.
Alessio Bechini
Lattice models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that also lattices with higher coordination numbers, possibly in higher dimensional spaces, can be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights on the use of square and triangular lattices in a range of dimensions. The trend of potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called "move types" is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes to look at protein structure issues from a more general and essential perspective, making
Lattice quantum chromodynamics
Hassenfratz, P.
1983-01-01
It is generally accepted that relativistic field theory is relevant in high energy physics. It is also recognized that even in QCD, which is asymptotically free, the scope of perturbation theory is very limited. Despite the tremendous theoretical and experimental effort to study scaling, scaling violations, e+e−, lepton pair creation, jets, etc., the answer to the question of whether, and to what extent, QCD is the theory of strong interactions remains vague. At present-day energies it is difficult to disentangle perturbative and non-perturbative effects. The author states that QCD must be understood and that quantitative non-perturbative methods are needed. He states that the lattice formulation of field theories is a promising approach to meeting this need and discusses the formulation in detail in this paper
Geometry of lattice field theory
Honan, T.J.
1986-01-01
Using some tools of algebraic topology, a general formalism for lattice field theory is presented. The lattice is taken to be a simplicial complex that is also a manifold and is referred to as a simplicial manifold. The fields on this lattice are cochains, that are called lattice forms to emphasize the connections with differential forms in the continuum. This connection provides a new bridge between lattice and continuum field theory. A metric can be put onto this simplicial manifold by assigning lengths to every link or 1-simplex of the lattice. Regge calculus is a way of defining general relativity on this lattice. A geometric discussion of Regge calculus is presented. The Regge action, which is a discrete form of the Hilbert action, is derived from the Hilbert action using distribution valued forms. This is a new derivation that emphasizes the underlying geometry. Kramers-Wannier duality in statistical mechanics is discussed in this general setting. Nonlinear field theories, which include gauge theories and nonlinear sigma models are discussed in the continuum and then are put onto a lattice. The main new result here is the generalization to curved spacetime, which consists of making the theory compatible with Regge calculus
Homogenization theory in reactor lattices
Benoist, P.
1986-02-01
The purpose of the theory of homogenization of reactor lattices is to determine, by means of transport theory, the constants of a homogeneous medium equivalent to a given lattice, which makes it possible to treat the reactor as a whole by diffusion theory. In this note, the problem is presented by laying emphasis on simplicity, as far as possible [fr]
Remarks on lattice gauge models
Grosse, H.
1981-01-01
The author reports a study of the phase structure of lattice gauge models where one takes as a gauge group a non-abelian discrete subgroup of SU(3). In addition he comments on a lattice action proposed recently by Manton and observes that it violates a positivity property. (Auth.)
Lattices, supersymmetry and Kaehler fermions
Scott, D.M.
1984-01-01
It is shown that a graded extension of the space group of a (generalised) simple cubic lattice exists in any space dimension, D. The fermionic variables which arise admit a Kaehlerian interpretation. Each graded space group is a subgroup of a graded extension of the appropriate Euclidean group, E(D). The relevance of this to the construction of lattice theories is discussed. (author)
Lattice polytopes in coding theory
Ivan Soprunov
2015-05-01
In this paper we discuss combinatorial questions about lattice polytopes motivated by recent results on minimum distance estimation for toric codes. We also include a new inductive bound for the minimum distance of generalized toric codes. As an application, we give new formulas for the minimum distance of generalized toric codes for special lattice point configurations.
Reevaluation of the Jezebel Benchmark
Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2014-03-10
Every nuclear engineering student is familiar with Jezebel, the homogeneous bare sphere of plutonium first assembled at Los Alamos in 1954-1955. The actual Jezebel assembly was neither homogeneous, nor bare, nor spherical; nor was it singular – there were hundreds of Jezebel configurations assembled. The Jezebel benchmark has been reevaluated for the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook. Logbooks, original drawings, mass accountability statements, internal reports, and published reports have been used to model four actual three-dimensional Jezebel assemblies with high fidelity. Because the documentation available today is often inconsistent, three major assumptions were made regarding plutonium part masses and dimensions. The first was that the assembly masses given in Los Alamos report LA-4208 (1969) were correct, and the second was that the original drawing dimension for the polar height of a certain major part was correct. The third assumption was that a change notice indicated on the original drawing was not actually implemented. This talk will describe these assumptions, the alternatives, and the implications. Since the publication of the 2013 ICSBEP Handbook, the actual masses of the major components have turned up. Our assumption regarding the assembly masses was proven correct, but we had the mass distribution incorrect. Work to incorporate the new information is ongoing, and this talk will describe the latest assessment.
SCWEB, Scientific Workstation Evaluation Benchmark
Raffenetti, R C [Computing Services-Support Services Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, Illinois 60439 (United States)
1988-06-16
1 - Description of program or function: The SCWEB (Scientific Workstation Evaluation Benchmark) software includes 16 programs which are executed in a well-defined scenario to measure the following performance capabilities of a scientific workstation: implementation of FORTRAN77, processor speed, memory management, disk I/O, monitor (or display) output, scheduling of processing (multiprocessing), and scheduling of print tasks (spooling). 2 - Method of solution: The benchmark programs are: DK1, DK2, and DK3, which do Fourier series fitting based on spline techniques; JC1, which checks the FORTRAN function routines which produce numerical results; JD1 and JD2, which solve dense systems of linear equations in double- and single-precision, respectively; JD3 and JD4, which perform matrix multiplication in single- and double-precision, respectively; RB1, RB2, and RB3, which perform substantial amounts of I/O processing on files other than the input and output files; RR1, which does intense single-precision floating-point multiplication in a tight loop; RR2, which initializes a 512x512 integer matrix in a manner which skips around in the address space rather than initializing each consecutive memory cell in turn; RR3, which writes alternating text buffers to the output file; RR4, which evaluates the timer routines and demonstrates that they conform to the specification; and RR5, which determines whether the workstation is capable of executing a 4-megabyte program.
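A benchmark driver of this kind reduces to timing well-defined kernels. The sketch below (in Python rather than the suite's FORTRAN77, purely to illustrate the scenario) times a JD3/JD4-style dense matrix multiplication and an RB-style scratch-file write/rewind/read; the kernel sizes and repeat count are arbitrary assumptions:

```python
import os
import tempfile
import time

import numpy as np

def bench(kernel, repeat=3):
    """Best-of-N wall-clock timing, the usual benchmark-driver discipline."""
    best = float("inf")
    for _ in range(repeat):
        t0 = time.perf_counter()
        kernel()
        best = min(best, time.perf_counter() - t0)
    return best

def flops_kernel(n=200):
    """JD3/JD4-style dense matrix multiplication."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    return a @ b

def io_kernel(nbytes=1 << 20):
    """RB-style I/O: write, rewind, and read back a scratch file."""
    with tempfile.TemporaryFile() as fh:
        fh.write(os.urandom(nbytes))
        fh.seek(0)
        fh.read()
```

Reporting the best of several repeats, as here, reduces the influence of scheduling noise on the measured times.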
Pynamic: the Python Dynamic Benchmark
Lee, G L; Ahn, D H; de Supinksi, B R; Gyllenhaal, J C; Miller, P J
2007-07-10
Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large scale MPI-based applications can create significant file IO and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide-range of the DLL usage of Python-based applications for large scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we will highlight some of the issues discovered in our large scale system software and tools using Pynamic.
The Isprs Benchmark on Indoor Modelling
Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.
2017-09-01
Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
Computing the writhe on lattices
Laing, C; Sumners, D W
2006-01-01
Given a polygonal closed curve on a lattice or space group, we describe a method for computing the writhe of the curve as the average of weighted projected writhing numbers of the polygon in a few directions. These directions are determined by the lattice geometry, the weights are determined by areas of regions on the unit 2-sphere, and the regions are formed by the tangent indicatrix to the polygonal curve. We give a new formula for the writhe of polygons on the face centred cubic lattice and prove that the writhe of polygons on the body centred cubic lattice, the hexagonal simple lattice, and the diamond space group is always a rational number, and discuss applications to ring polymers
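The directional-averaging idea above can be sketched numerically: the writhe of a closed polygon is the average, over projection directions, of the signed crossing number of the projected diagram. The plain Monte Carlo average below uses uniformly random directions and ignores the lattice-specific weights derived in the paper; it is only an illustration of the underlying definition:

```python
import numpy as np

def _frame(d):
    """Right-handed orthonormal basis (u, v, d) for projection along d."""
    d = np.asarray(d, float); d = d/np.linalg.norm(d)
    a = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(d, a); u = u/np.linalg.norm(u)
    return u, np.cross(d, u), d

def projected_writhe(verts, direction):
    """Sum of signed crossings of the polygon's diagram viewed along `direction`."""
    u, v, n = _frame(direction)
    P = np.asarray(verts, float)
    xy = P @ np.column_stack([u, v])   # projected coordinates
    z = P @ n                          # height along the viewing direction
    m = len(P); wr = 0
    for i in range(m):
        for j in range(i + 1, m):
            if (j - i) % m == 1 or (i - j) % m == 1:
                continue               # adjacent edges never cross properly
            a1, a2 = xy[i], xy[(i + 1) % m]
            b1, b2 = xy[j], xy[(j + 1) % m]
            da, db, r = a2 - a1, b2 - b1, b1 - a1
            den = da[0]*db[1] - da[1]*db[0]
            if abs(den) < 1e-14:
                continue               # parallel in projection
            s = (r[0]*db[1] - r[1]*db[0])/den
            t = (r[0]*da[1] - r[1]*da[0])/den
            if 0.0 < s < 1.0 and 0.0 < t < 1.0:
                za = z[i] + s*(z[(i + 1) % m] - z[i])  # heights decide over/under
                zb = z[j] + t*(z[(j + 1) % m] - z[j])
                wr += int(np.sign(den)*np.sign(za - zb))
    return wr

def writhe_estimate(verts, ndirs=200, seed=0):
    """Plain Monte Carlo average over random directions (uniform, unweighted)."""
    rng = np.random.default_rng(seed)
    return float(np.mean([projected_writhe(verts, d)
                          for d in rng.normal(size=(ndirs, 3))]))
```

A planar polygon has writhe zero in every projection, which gives a quick sanity check; a nonplanar quadrilateral whose projection has a single negative crossing gives projected writhe -1.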
Benchmarking in external financial reporting and auditing
Thinggaard, Frank; Kiertzner, Lars
2001-01-01
continuously in a benchmarking process. This chapter broadly examines to what extent the benchmarking concept can justifiably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual report, while Section 7.2 addresses auditing. The final section of the chapter summarizes the considerations on benchmarking in connection with both areas.
Computational Chemistry Comparison and Benchmark Database
SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access) The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.
Monte Carlo code criticality benchmark comparisons for waste packaging
Alesso, H.P.; Annese, C.E.; Buck, R.M.; Pearson, J.S.; Lloyd, W.R.
1992-07-01
COG is a new point-wise Monte Carlo code being developed and tested at Lawrence Livermore National Laboratory (LLNL). It solves the Boltzmann equation for the transport of neutrons and photons. The objective of this paper is to report on COG results for criticality benchmark experiments both on a Cray mainframe and on a HP 9000 workstation. COG has been recently ported to workstations to improve its accessibility to a wider community of users. COG has some similarities to a number of other computer codes used in the shielding and criticality community. The recently introduced high performance reduced instruction set (RISC) UNIX workstations provide computational power that approaches that of mainframes at a fraction of the cost. A version of COG is currently being developed for the Hewlett Packard 9000/730 computer with a UNIX operating system. Subsequent porting operations will move COG to SUN, DEC, and IBM workstations. In addition, a CAD system for preparation of the geometry input for COG is being developed. In July 1977, Babcock & Wilcox Co. (B&W) was awarded a contract to conduct a series of critical experiments that simulated close-packed storage of LWR-type fuel. These experiments provided data for benchmarking and validating calculational methods used in predicting K-effective of nuclear fuel storage in close-packed, neutron poisoned arrays. Low enriched UO2 fuel pins in water-moderated lattices in fuel storage represent a challenging criticality calculation for Monte Carlo codes, particularly when the fuel pins extend out of the water. COG and KENO calculational results of these criticality benchmark experiments are presented
Aerodynamic Benchmarking of the Deepwind Design
Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge
2015-01-01
The aerodynamic benchmarking for the DeepWind rotor is conducted by comparing different rotor geometries and solutions while keeping the comparison as fair as possible. The objective of the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize the blade loading and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts; this shape is considered as a fixed parameter in the benchmarking.
HPC Benchmark Suite NMx, Phase I
National Aeronautics and Space Administration — Intelligent Automation Inc. (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...
High Energy Physics (HEP) benchmark program
Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.
1993-01-01
High Energy Physics (HEP) benchmark programs are indispensable tools for selecting suitable computers for HEP application systems. Industry-standard benchmark programs cannot be used for this kind of particular selection. The CERN and the SSC benchmark suites are famous HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC one includes event generators. In this paper, we found that the results from these two suites are not consistent. Moreover, the result from the industry benchmark does not agree with either of these two. Besides, we describe a comparison of benchmark results using the EGS4 Monte Carlo simulation program with ones from the two HEP benchmark suites. We found that the result from EGS4 is not consistent with either of them. Industry-standard SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of a standardization of HEP benchmark suites. Also, an EGS4 benchmark suite should be developed for users of applications such as medical science, nuclear power plants, nuclear physics and high energy physics. (author)
Establishing benchmarks and metrics for utilization management.
Melanson, Stacy E F
2014-01-01
The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.
Professional Performance and Bureaucratic Benchmarking Information
Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz
Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based on archival, publicly disclosed, professional performance data for 191 German orthopedics departments, matched with survey data on bureaucratic benchmarking information given to chief orthopedists by the administration. We find a positive association between bureaucratic benchmarking information provision and professional performance.
MCNP analysis of the nine-cell LWR gadolinium benchmark
Arkuszewski, J.J.
1988-01-01
The Monte Carlo results for a 9-cell fragment of the light water reactor square lattice with a central gadolinium-loaded pin are presented. The calculations are performed with the code MCNP-3A and the ENDF-B/5 library and compared with the results obtained from the BOXER code system and the JEF-1 library. The objective of this exercise is to study the feasibility of BOXER for the analysis of a Gd-loaded LWR lattice in the broader framework of GAP International Benchmark Analysis. A comparison of results indicates that, apart from unavoidable discrepancies originating from different data evaluations, the BOXER code overestimates the multiplication factor by 1.4 % and underestimates the power release in a Gd cell by 4.66 %. It is hoped that further similar studies with use of the JEF-1 library for both BOXER and MCNP will help to isolate and explain these discrepancies in a cleaner way. (author) 4 refs., 9 figs., 10 tabs
Lattice gas cellular automata and lattice Boltzmann models an introduction
Wolf-Gladrow, Dieter A
2000-01-01
Lattice-gas cellular automata (LGCA) and lattice Boltzmann models (LBM) are relatively new and promising methods for the numerical solution of nonlinear partial differential equations. The book provides an introduction for graduate students and researchers. Working knowledge of calculus is required and experience in PDEs and fluid dynamics is recommended. Some peculiarities of cellular automata are outlined in Chapter 2. The properties of various LGCA and special coding techniques are discussed in Chapter 3. Concepts from statistical mechanics (Chapter 4) provide the necessary theoretical background for LGCA and LBM. The properties of lattice Boltzmann models and a method for their construction are presented in Chapter 5.
Benchmarking of nuclear economics tools
Moore, Megan; Korinny, Andriy; Shropshire, David; Sadhankar, Ramesh
2017-01-01
Highlights: • INPRO and GIF economic tools exhibited good alignment in total capital cost estimation. • Subtle discrepancies in the cost result from differences in financing and the fuel cycle assumptions. • A common set of assumptions was found to reduce the discrepancies to 1% or less. • Opportunities for harmonisation of economic tools exists. - Abstract: Benchmarking of the economics methodologies developed by the Generation IV International Forum (GIF) and the International Atomic Energy Agency’s International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), was performed for three Generation IV nuclear energy systems. The Economic Modeling Working Group of GIF developed an Excel based spreadsheet package, G4ECONS (Generation 4 Excel-based Calculation Of Nuclear Systems), to calculate the total capital investment cost (TCIC) and the levelised unit energy cost (LUEC). G4ECONS is sufficiently generic in the sense that it can accept the types of projected input, performance and cost data that are expected to become available for Generation IV systems through various development phases and that it can model both open and closed fuel cycles. The Nuclear Energy System Assessment (NESA) Economic Support Tool (NEST) was developed to enable an economic analysis using the INPRO methodology to easily calculate outputs including the TCIC, LUEC and other financial figures of merit including internal rate of return, return of investment and net present value. NEST is also Excel based and can be used to evaluate nuclear reactor systems using the open fuel cycle, MOX (mixed oxide) fuel recycling and closed cycles. A Super Critical Water-cooled Reactor system with an open fuel cycle and two Fast Reactor systems, one with a break-even fuel cycle and another with a burner fuel cycle, were selected for the benchmarking exercise. Published data on capital and operating costs were used for economics analyses using G4ECONS and NEST tools. Both G4ECONS and
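Both tools ultimately compute a levelised unit energy cost of the familiar discounted-cash-flow form: discounted lifetime costs divided by discounted lifetime energy. The sketch below shows only that general form, not the G4ECONS or NEST algorithms, and the plant cash flows are invented for illustration:

```python
def luec(costs, energy, rate):
    """Levelised unit energy cost: discounted lifetime costs divided by
    discounted lifetime energy production (per-year cash flows and output)."""
    disc = [(1.0 + rate)**(-t) for t in range(len(costs))]
    return (sum(c*d for c, d in zip(costs, disc))
            / sum(e*d for e, d in zip(energy, disc)))

# Invented example plant: 2 construction years (capital outlay, no output),
# then 3 operating years with O&M + fuel costs and energy production.
example = luec(costs=[1000, 1000, 100, 100, 100],
               energy=[0, 0, 800, 800, 800],
               rate=0.05)
```

One useful property for cross-checking tools: with constant annual costs and constant annual output over the same period, the discount factors cancel and the LUEC reduces to cost per unit energy regardless of the discount rate.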
A lattice Boltzmann coupled to finite volumes method for solving phase change problems
El Ganaoui Mohammed
2009-01-01
A numerical scheme coupling lattice Boltzmann and finite volume approaches has been developed and qualified on test cases of phase change problems. In this work, the coupled partial differential equations of momentum conservation are solved with a non-uniform lattice Boltzmann method. The energy equation is discretized by using a finite volume method. Simulations show the ability of this hybrid method to model the effects of convection and to predict transfers. Benchmarking is performed for both conduction-dominated and convection-dominated solid/liquid transitions. Comparisons are made with available analytical solutions and experimental results.
FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark
Sawan, M.E.
1994-12-01
During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)
Initial Mechanical Testing of Superalloy Lattice Block Structures Conducted
Krause, David L.; Whittenberger, J. Daniel
2002-01-01
The first mechanical tests of superalloy lattice block structures produced promising results for this exciting new lightweight material system. The testing was performed in-house at NASA Glenn Research Center's Structural Benchmark Test Facility, where small subelement-sized compression and beam specimens were loaded to observe elastic and plastic behavior, component strength levels, and fatigue resistance for hundreds of thousands of load cycles. Current lattice block construction produces a flat panel composed of thin ligaments arranged in a three-dimensional triangulated trusslike structure. Investment casting of lattice block panels has been developed and greatly expands opportunities for using this unique architecture in today's high-performance structures. In addition, advances made in NASA's Ultra-Efficient Engine Technology Program have extended the lattice block concept to superalloy materials. After a series of casting iterations, the nickel-based superalloy Inconel 718 (IN 718, Inco Alloys International, Inc., Huntington, WV) was successfully cast into lattice block panels; this combination offers light weight combined with high strength, high stiffness, and elevated-temperature durability. For tests to evaluate casting quality and configuration merit, small structural compression and bend test specimens were machined from the 5- by 12- by 0.5-in. panels. Linear elastic finite element analyses were completed for several specimen layouts to predict material stresses and deflections under proposed test conditions. The structural specimens were then subjected to room-temperature static and cyclic loads in Glenn's Life Prediction Branch's material test machine. Surprisingly, the test results exceeded analytical predictions: plastic strains greater than 5 percent were obtained, and fatigue lives did not depreciate relative to the base material. These assets were due to the formation of plastic hinges and the redundancies inherent in lattice block construction
Chen, Yuntian; Zhang, Yan; Femius Koenderink, A
2017-09-04
We study semi-analytically the light emission and absorption properties of arbitrary stratified photonic structures with embedded two-dimensional magnetoelectric point-scattering lattices, as used in recent plasmon-enhanced LEDs and solar cells. By employing the dyadic Green's function of the layered structure in combination with Ewald lattice summation to handle the particle lattice, we develop an efficient method to study the coupling between planar 2D scattering lattices of plasmonic or metamaterial point particles and layered structures. Using the 'array scanning method' we deal with localized sources. First, we apply our method to light-emission enhancement of dipole emitters in slab waveguides, mediated by plasmonic lattices. We benchmark the array scanning method against a reciprocity-based approach and find that the calculated radiative rate enhancement in k-space below the light cone shows excellent agreement. Second, we apply our method to study absorption enhancement in thin-film solar cells mediated by periodic Ag nanoparticle arrays. Lastly, we study the emission distribution in k-space of a coupled waveguide-lattice system; in particular, we explore dark-mode excitation on the plasmonic lattice using the array scanning method. Our method could be useful for simulating a broad range of complex nanophotonic structures, i.e., metasurfaces, plasmon-enhanced light-emitting systems and photovoltaics.
Human factors reliability Benchmark exercise
Poucet, A.
1989-06-01
The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organized around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors made during the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise
Experimental and computational benchmark tests
Gilliam, D.M.; Briesmeister, J.F.
1994-01-01
A program involving principally NIST, LANL, and ORNL has been in progress for about four years now to establish a series of benchmark measurements and calculations related to the moderation and leakage of 252Cf neutrons from a source surrounded by spherical aqueous moderators of various thicknesses and compositions. The motivation for these studies comes from problems in criticality calculations concerning arrays of multiplying components, where the leakage from one component acts as a source for the other components. This talk compares experimental and calculated values for the fission rates of four nuclides (235U, 239Pu, 238U, and 237Np) in the leakage spectrum from moderator spheres of diameters 76.2 mm, 101.6 mm, and 127.0 mm, with either pure water or enriched B-10 solutions as the moderator. Very detailed Monte Carlo calculations were done with the MCNP code, using a "light water" S(α,β) scattering kernel
ENVIRONMENTAL BENCHMARKING FOR LOCAL AUTHORITIES
Marinela GHEREŞ
2010-01-01
This paper is an attempt to clarify and present the many definitions of benchmarking. It also attempts to explain the basic steps of benchmarking, to show how this tool can be applied by local authorities, as well as to discuss its potential benefits and limitations. It is our strong belief that if cities use indicators and progressively introduce targets to improve management and related urban life quality, and to measure progress towards more sustainable development, we will also create a new type of competition among cities and foster innovation. This is seen to be important because local authorities' actions play a vital role in responding to the challenges of enhancing the state of the environment, not only in policy-making but also in the provision of services and in the planning process. Local communities therefore need to be aware of their own sustainability performance levels and should be able to engage in exchange of best practices to respond effectively to the eco-economic challenges of the century.
Benchmark results in radiative transfer
Garcia, R.D.M.; Siewert, C.E.
1986-02-01
Several aspects of the F_N method are reported, and the method is used to solve accurately some benchmark problems in radiative transfer in the field of atmospheric physics. The method was modified to handle cases of pure scattering, and an improved process was developed for computing the radiation intensity. An algorithm for computing several quantities used in the F_N method was developed. An improved scheme to evaluate certain integrals relevant to the method is presented, and a two-term recursion relation that has proved useful for the numerical evaluation of matrix elements basic to the method is given. The methods used to solve the resulting linear algebraic equations are discussed, and the numerical results are evaluated. (M.C.K.)
Irreversible stochastic processes on lattices
Nord, R.S.
1986-01-01
Models for irreversible random or cooperative filling of lattices are required to describe many processes in chemistry and physics. Since the filling is assumed to be irreversible, even the stationary, saturation state is not in equilibrium. The kinetics and statistics of these processes are described by recasting the master equations in infinite hierarchical form. Solutions can be obtained by implementing various techniques; refinements of these solution techniques are presented. Problems considered include random dimer, trimer, and tetramer filling of 2D lattices, random dimer filling of a cubic lattice, competitive filling of two or more species, and the effect of a random distribution of inactive sites on the filling. Also considered is monomer filling of a linear lattice with nearest-neighbor cooperative effects, for which the exact cluster-size distribution is solved for cluster sizes up to the asymptotic regime. Additionally, a technique is developed to directly determine the asymptotic properties of the cluster-size distribution. Finally, cluster growth is considered via irreversible aggregation involving random walkers. In particular, explicit results are provided for the large-lattice-size asymptotic behavior of trapping probabilities and average walk lengths for a single walker on a lattice with multiple traps. Procedures for exact calculation of these quantities on finite lattices are also developed
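As a concrete illustration of the irreversible filling processes described in this abstract, random sequential adsorption of dimers on a one-dimensional lattice can be simulated in a few lines. This sketch is not from the thesis itself; the function name and parameters are my own, and the jamming coverage it approaches, Flory's 1 - e^(-2) ≈ 0.8647, is the classical exact result for this simplest case.

```python
import random

def dimer_rsa(n_sites, seed=0):
    """Random sequential adsorption of dimers on a 1D lattice.
    Each bond (i, i+1) is attempted once in random order; a dimer is
    placed iff both sites are still empty.  Because sites only ever
    fill, a bond that is blocked once stays blocked, so one randomized
    pass reaches the jammed (saturation) state."""
    rng = random.Random(seed)
    occupied = [False] * n_sites
    bonds = list(range(n_sites - 1))
    rng.shuffle(bonds)
    for i in bonds:
        if not occupied[i] and not occupied[i + 1]:
            occupied[i] = occupied[i + 1] = True
    return sum(occupied) / n_sites

# Flory's exact jamming coverage for dimers on a line is 1 - e^-2 ~ 0.8647
print(dimer_rsa(200_000))
```

Note that the saturated state reached here is indeed not an equilibrium state: no further dimer fits, yet isolated empty sites remain frozen in.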
Toward lattice fractional vector calculus
Tarasov, Vasily E
2014-01-01
An analog of fractional vector calculus for physical lattice models is suggested. We use an approach based on the models of three-dimensional lattices with long-range inter-particle interactions. The lattice analogs of fractional partial derivatives are represented by kernels of lattice long-range interactions, where the Fourier series transformations of these kernels have a power-law form with respect to wave vector components. In the continuum limit, these lattice partial derivatives give derivatives of non-integer order with respect to coordinates. In the three-dimensional description of the non-local continuum, the fractional differential operators have the form of fractional partial derivatives of the Riesz type. As examples of the applications of the suggested lattice fractional vector calculus, we give lattice models with long-range interactions for the fractional Maxwell equations of non-local continuous media and for the fractional generalization of the Mindlin and Aifantis continuum models of gradient elasticity. (papers)
Introduction to lattice gauge theory
Gupta, R.
1987-01-01
The lattice formulation of Quantum Field Theory (QFT) can be exploited in many ways. We can derive the lattice Feynman rules and carry out weak coupling perturbation expansions. The lattice then serves as a manifestly gauge invariant regularization scheme, albeit one that is more complicated than standard continuum schemes. Strong coupling expansions: these give us useful qualitative information, but unfortunately no hard numbers. The lattice theory is amenable to numerical simulations by which one calculates the long distance properties of a strongly interacting theory from first principles. The observables are measured as a function of the bare coupling g and a gauge invariant cut-off ≅ 1/α, where α is the lattice spacing. The continuum (physical) behavior is recovered in the limit α → 0, at which point the lattice artifacts go to zero. This is the more powerful use of lattice formulation, so in these lectures the author focuses on setting up the theory for the purpose of numerical simulations to get hard numbers. The numerical techniques used in Lattice Gauge Theories have their roots in statistical mechanics, so it is important to develop an intuition for the interconnection between quantum mechanics and statistical mechanics. This will be the emphasis of the first lecture. In the second lecture, the author reviews the essential ingredients of formulating QCD on the lattice and discusses scaling and the continuum limit. In the last lecture the author summarizes the status of some of the main results. He also mentions the bottlenecks and possible directions for research. 88 refs
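To make the statistical-mechanics connection stressed in the first lecture concrete, here is a minimal Metropolis simulation of the 2D Ising model, the prototype of the importance-sampling algorithms used in lattice gauge theory (with the Boltzmann weight exp(-E/T) standing in for exp(-S)). This is an illustrative sketch, not code from the lectures; all names and parameter choices are my own.

```python
import math
import random

def metropolis_ising(L=16, T=1.0, sweeps=200, seed=0):
    """Minimal Metropolis sampling of the 2D Ising model on an L x L
    periodic lattice: propose single-spin flips and accept with
    probability min(1, exp(-dE/T))."""
    rng = random.Random(seed)
    s = [[1] * L for _ in range(L)]          # ordered (cold) start
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = s[(i + 1) % L][j] + s[(i - 1) % L][j] \
           + s[i][(j + 1) % L] + s[i][(j - 1) % L]
        dE = 2 * s[i][j] * nb                # energy cost of flipping s[i][j]
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    return sum(map(sum, s)) / (L * L)        # magnetization per site

# Deep in the ordered phase (T well below T_c ~ 2.269) the
# magnetization stays close to 1.
print(metropolis_ising())
```

Replacing the spin variables by link variables in a gauge group, and dE by the change in the plaquette action, turns this same loop into the basic lattice-gauge-theory update.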
NASA Software Engineering Benchmarking Study
Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.
2013-01-01
To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths
Lattice Methods for Quantum Chromodynamics
DeGrand, Thomas
2006-01-01
Numerical simulation of lattice-regulated QCD has become an important source of information about strong interactions. In the last few years there has been an explosion of techniques for performing ever more accurate studies on the properties of strongly interacting particles. Lattice predictions directly impact many areas of particle and nuclear physics theory and phenomenology. This book provides a thorough introduction to the specialized techniques needed to carry out numerical simulations of QCD: a description of lattice discretizations of fermions and gauge fields, methods for actually do
Localized structures in Kagome lattices
Saxena, Avadh B [Los Alamos National Laboratory; Bishop, Alan R [Los Alamos National Laboratory; Law, K J H [UNIV OF MASSACHUSETTS; Kevrekidis, P G [UNIV OF MASSACHUSETTS
2009-01-01
We investigate the existence and stability of gap vortices and multi-pole gap solitons in a Kagome lattice with a defocusing nonlinearity both in a discrete case and in a continuum one with periodic external modulation. In particular, predictions are made based on expansion around a simple and analytically tractable anti-continuum (zero coupling) limit. These predictions are then confirmed for a continuum model of an optically-induced Kagome lattice in a photorefractive crystal obtained by a continuous transformation of a honeycomb lattice.
Lattice QCD: Status and Prospect
Ukawa, Akira
2006-01-01
A brief review is given of the current status and near-future prospect of lattice QCD studies of the Standard Model. After summarizing a bit of history, we describe current attempts toward inclusion of dynamical up, down and strange quarks. Recent results on the light hadron mass spectrum as well as those on the heavy quark quantities are described. Recent work on lattice pentaquark search is summarized. We touch upon the PACS-CS Project for building our next machine for lattice QCD, and conclude with a summary of computer situation and the physics possibilities over the next several years
Borwein, J M; McPhedran, R C
2013-01-01
The study of lattice sums began when early investigators wanted to go from mechanical properties of crystals to the properties of the atoms and ions from which they were built (the literature of Madelung's constant). A parallel literature was built around the optical properties of regular lattices of atoms (initiated by Lord Rayleigh, Lorentz and Lorenz). For over a century many famous scientists and mathematicians have delved into the properties of lattices, sometimes unwittingly duplicating the work of their predecessors. Here, at last, is a comprehensive overview of the substantial body of
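The Madelung-constant literature mentioned above is easy to sample numerically. The sketch below uses Evjen's method for the NaCl lattice sum, weighting sites on the surface of the summation cube by 1/2 per face they touch so that each truncation shell is nearly neutral and the conditionally convergent sum settles rapidly near 1.747565. This is my own illustration, not an excerpt from the book.

```python
import math

def madelung_evjen(n):
    """Evjen's method for the NaCl Madelung constant: sum the charges
    (-1)^(i+j+k) over a cube of half-width n, weighting surface sites
    by 1/2 per boundary coordinate (faces 1/2, edges 1/4, corners 1/8)
    so the truncated cluster stays neutral."""
    total = 0.0
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            for k in range(-n, n + 1):
                if i == j == k == 0:
                    continue
                w = 1.0
                for c in (i, j, k):
                    if abs(c) == n:
                        w *= 0.5
                total += w * (-1) ** (i + j + k) / math.sqrt(i * i + j * j + k * k)
    return -total  # sign convention giving the positive constant ~ 1.7476

print(madelung_evjen(10))
```

Without the surface weights, the partial sums over growing cubes oscillate and converge only conditionally, which is precisely the difficulty that drove much of the historical lattice-sum literature.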
The role of benchmarking for yardstick competition
Burns, Phil; Jenkins, Cloda; Riechmann, Christoph
2005-01-01
With the increasing interest in yardstick regulation, there is a need to understand the most appropriate method for realigning tariffs at the outset. Benchmarking is the tool used for such realignment and is therefore a necessary first-step in the implementation of yardstick competition. A number of concerns have been raised about the application of benchmarking, making some practitioners reluctant to move towards yardstick based regimes. We assess five of the key concerns often discussed and find that, in general, these are not as great as perceived. The assessment is based on economic principles and experiences with applying benchmarking to regulated sectors, e.g. in the electricity and water industries in the UK, The Netherlands, Austria and Germany in recent years. The aim is to demonstrate that clarity on the role of benchmarking reduces the concern about its application in different regulatory regimes. We find that benchmarking can be used in regulatory settlements, although the range of possible benchmarking approaches that are appropriate will be small for any individual regulatory question. Benchmarking is feasible as total cost measures and environmental factors are better defined in practice than is commonly appreciated and collusion is unlikely to occur in environments with more than 2 or 3 firms (where shareholders have a role in monitoring and rewarding performance). Furthermore, any concern about companies under-recovering costs is a matter to be determined through the regulatory settlement and does not affect the case for using benchmarking as part of that settlement. (author)
Benchmarking set for domestic smart grid management
Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria
2010-01-01
In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used
Medical school benchmarking - from tools to programmes.
Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T
2015-02-01
Benchmarking among medical schools is essential, but may result in unwanted effects. Our aim was to apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that posed five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.
Benchmarking in digital circuit design automation
Jozwiak, L.; Gawlowski, D.M.; Slusarczyk, A.S.
2008-01-01
This paper focuses on benchmarking, which is the main experimental approach to the design method and EDA-tool analysis, characterization and evaluation. We discuss the importance and difficulties of benchmarking, as well as the recent research effort related to it. To resolve several serious
Benchmark Two-Good Utility Functions
de Jaegher, K.
Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own-price elasticity.
Repeated Results Analysis for Middleware Regression Benchmarking
Bulej, Lubomír; Kalibera, T.; Tůma, P.
2005-01-01
Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005
Benchmarking the energy efficiency of commercial buildings
Chung, William; Hui, Y.V.; Lam, Y. Miu
2006-01-01
Benchmarking energy-efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed in a simple benchmark table (percentile table) of energy use, which is normalized with floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method
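The benchmarking process described above can be sketched end to end: regress EUI on explanatory factors, normalize each observed EUI by removing the effect of its deviance in those factors, and read percentiles off the empirical distribution of the normalized values. The data below are synthetic, and the single factor (weekly operating hours) stands in for the full set of significant factors the paper considers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sample: annual energy-use intensity (EUI, kWh/m^2) for
# 200 supermarkets, with weekly operating hours as explanatory factor.
hours = rng.uniform(60, 168, size=200)
eui = 300 + 2.5 * (hours - hours.mean()) + rng.normal(0, 40, size=200)

# 1. Regress EUI on the explanatory factor(s).
X = np.column_stack([np.ones_like(hours), hours])
beta, *_ = np.linalg.lstsq(X, eui, rcond=None)

# 2. Normalize: remove the effect of deviance from the mean factor level.
eui_norm = eui - beta[1] * (hours - hours.mean())

# 3. Benchmark table: empirical percentiles of the normalized EUI.
table = {p: np.percentile(eui_norm, p) for p in (10, 25, 50, 75, 90)}

# Benchmark an observed building: its percentile rank after normalization.
observed_eui = 360.0
observed_hours = 150.0
observed_norm = observed_eui - beta[1] * (observed_hours - hours.mean())
rank = (eui_norm < observed_norm).mean() * 100
print(table, rank)
```

The advantage over a raw percentile table is visible in step 2: a store that is open long hours is no longer penalized for the consumption those hours explain.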
Benchmarking, Total Quality Management, and Libraries.
Shaughnessy, Thomas W.
1993-01-01
Discussion of the use of Total Quality Management (TQM) in higher education and academic libraries focuses on the identification, collection, and use of reliable data. Methods for measuring quality, including benchmarking, are described; performance measures are considered; and benchmarking techniques are examined. (11 references) (MES)
A Seafloor Benchmark for 3-dimensional Geodesy
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone
SP2Bench: A SPARQL Performance Benchmark
Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg
A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.
Benchmarking of refinery emissions performance : Executive summary
2003-07-01
This study was undertaken to collect emissions performance data for Canadian and comparable American refineries. The objective was to examine parameters that affect refinery air emissions performance and develop methods or correlations to normalize emissions performance. Another objective was to correlate and compare the performance of Canadian refineries to comparable American refineries. For the purpose of this study, benchmarking involved the determination of levels of emission performance that are being achieved for generic groups of facilities. A total of 20 facilities were included in the benchmarking analysis, and 74 American refinery emission correlations were developed. The recommended benchmarks, and the application of those correlations for comparison between Canadian and American refinery performance, were discussed. The benchmarks were: sulfur oxides, nitrogen oxides, carbon monoxide, particulate, volatile organic compounds, ammonia and benzene. For each refinery in Canada, benchmark emissions were developed. Several factors can explain differences in Canadian and American refinery emission performance. 4 tabs., 7 figs
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
Gerhard Strydom; Javier Ortensi; Sonat Sen; Hans Hammer
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solution for Exercise 3 is reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III
Vver-1000 Mox core computational benchmark
2006-01-01
The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities include benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the
Tallarita, Gianni; Peterson, Adam
2018-04-01
We perform a numerical study of the phase diagram of the model proposed in [M. Shifman, Phys. Rev. D 87, 025025 (2013), doi:10.1103/PhysRevD.87.025025], which is a simple model containing non-Abelian vortices. As in the case of Abrikosov vortices, we map out a region of parameter space in which the system prefers the formation of vortices in ordered lattice structures. These are generalizations of Abrikosov vortex lattices with extra orientational moduli in the vortex cores. At sufficiently large lattice spacing the low-energy theory is described by a sum of CP(1) theories, each located on a vortex site. As the lattice spacing becomes smaller, when the self-interaction of the orientational field becomes relevant, only an overall rotation in internal space survives.
A partial entropic lattice Boltzmann MHD simulation of the Orszag-Tang vortex
Flint, Christopher; Vahala, George
2018-02-01
Karlin has introduced an analytically determined entropic lattice Boltzmann (LB) algorithm for Navier-Stokes turbulence. Here, this is partially extended to an LB model of magnetohydrodynamics, on using the vector distribution function approach of Dellar for the magnetic field (which is permitted to have field reversal). The partial entropic algorithm is benchmarked successfully against standard simulations of the Orszag-Tang vortex [Orszag, S.A.; Tang, C.M. J. Fluid Mech. 1979, 90 (1), 129-143].
Lattice Studies of Hyperon Spectroscopy
Richards, David G. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)
2016-04-01
I describe recent progress in studying the spectrum of hadrons containing the strange quark through lattice QCD calculations. I emphasise in particular the richness of the spectrum revealed by lattice studies, with a spectrum of states at least as rich as that of the quark model. I conclude with prospects for future calculations, including in particular the determination of the decay amplitudes for the excited states.
Harmonic oscillator on a lattice
Ader, J.P.; Bonnier, B.; Hontebeyrie, M.; Meyers, C.
1983-01-01
The continuum limit of the ground state energy for the harmonic oscillator with discrete time is derived for all possible choices of the lattice derivative. The occurrence of unphysical values is shown to arise whenever the lattice laplacian is not strictly positive on its Brillouin zone. These undesirable limits can either be finite and arbitrary (multiple spectrum) or infinite (overlapping sublattices with multiple spectrum). (orig.)
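The positivity condition identified in this abstract can be checked directly from a lattice Laplacian's Fourier symbol on the Brillouin zone. The sketch below compares the standard nearest-neighbour Laplacian, whose symbol 2(1 - cos k) vanishes only at k = 0, with a Laplacian built by squaring the centred first derivative, whose symbol sin²k also vanishes at the zone edge k = ±π; the latter is an example of a choice that is not strictly positive on the zone and so supports a spurious extra branch of states. Function names are my own, and the lattice spacing is set to 1.

```python
import numpy as np

# Fourier symbols of two candidate lattice Laplacians on the
# Brillouin zone k in (-pi, pi], in units where the spacing is 1.

def nn_laplacian(k):
    """Nearest-neighbour Laplacian psi(x+1) - 2 psi(x) + psi(x-1);
    symbol 2(1 - cos k), strictly positive for k != 0."""
    return 2 * (1 - np.cos(k))

def centred_first_sq(k):
    """Laplacian built by squaring the centred first derivative;
    symbol sin^2 k, which vanishes again at the zone edge."""
    return np.sin(k) ** 2

# At the zone edge the nearest-neighbour symbol is maximal (4) while
# the squared centred difference vanishes, signalling the degeneracy
# behind the unphysical continuum limits discussed above.
print(nn_laplacian(np.pi), centred_first_sq(np.pi))
```

The same zone-edge zero is, in the fermionic setting, the origin of the well-known doubling problem, which is why strict positivity of the symbol away from k = 0 is the natural criterion.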
DeGrand, T.
1997-01-01
These lectures provide an introduction to lattice methods for nonperturbative studies of Quantum Chromodynamics. Lecture 1: Basic techniques for QCD and results for hadron spectroscopy using the simplest discretizations; lecture 2: Improved actions, what they are and how well they work; lecture 3: SLAC physics from the lattice: structure functions, the mass of the glueball, heavy quarks and α_s(M_Z), and B-B̄ mixing. 67 refs., 36 figs.
Takami, A.; Hashimoto, T.; Horibe, M.; Hayashi, A.
2000-01-01
The Wigner functions on the one-dimensional lattice are studied. Contrary to a previous claim in the literature, Wigner functions exist on the lattice with any number of sites, whether even or odd. There are infinitely many solutions satisfying the conditions which reasonable Wigner functions should respect. After presenting a heuristic method to obtain Wigner functions, we give the general form of the solutions. Quantum mechanical expectation values in terms of Wigner functions are also ...
Thermal Performance Benchmarking: Annual Report
Feng, Xuhui [National Renewable Energy Laboratory (NREL), Golden, CO (United States). Transportation and Hydrogen Systems Center
2017-10-19
In FY16, the thermal performance of the 2014 Honda Accord Hybrid power electronics thermal management system was benchmarked. Both experiments and numerical simulation were utilized to thoroughly study the thermal resistances and temperature distribution in the power module. Experimental results obtained from the water-ethylene glycol tests provided the junction-to-liquid thermal resistance. The finite element analysis (FEA) and computational fluid dynamics (CFD) models were found to yield a good match with experimental results. Both experimental and modeling results demonstrate that the passive stack is the dominant thermal resistance for both the motor and power electronics systems. The 2014 Accord power electronics system yields steady-state thermal resistance values around 42-50 mm²·K/W, depending on the flow rate. At a typical flow rate of 10 liters per minute, the thermal resistance of the Accord system was found to be about 44 percent lower than that of the 2012 Nissan LEAF system that was benchmarked in FY15. The main reason for the difference is that the Accord power module used a metalized-ceramic substrate and eliminated the thermal interface material layers. FEA models were developed to study the transient performance of the 2012 Nissan LEAF, the 2014 Accord, and two other systems that feature conventional power module designs. The simulation results indicate that the 2012 LEAF power module has the lowest thermal impedance at time scales less than one second. This is probably due to moving low thermally conductive materials further away from the heat source and enhancing the heat spreading effect from the copper-molybdenum plate close to the insulated gate bipolar transistors. When approaching steady state, the Honda system shows lower thermal impedance. Measurement results of the thermal resistance of the 2015 BMW i3 power electronic system indicate that the i3 insulated gate bipolar transistor module has significantly lower junction
Parton distributions and lattice QCD calculations: A community white paper
Lin, Huey-Wen; Nocera, Emanuele R.; Olness, Fred; Orginos, Kostas; Rojo, Juan; Accardi, Alberto; Alexandrou, Constantia; Bacchetta, Alessandro; Bozzi, Giuseppe; Chen, Jiunn-Wei; Collins, Sara; Cooper-Sarkar, Amanda; Constantinou, Martha; Del Debbio, Luigi; Engelhardt, Michael; Green, Jeremy; Gupta, Rajan; Harland-Lang, Lucian A.; Ishikawa, Tomomi; Kusina, Aleksander; Liu, Keh-Fei; Liuti, Simonetta; Monahan, Christopher; Nadolsky, Pavel; Qiu, Jian-Wei; Schienbein, Ingo; Schierholz, Gerrit; Thorne, Robert S.; Vogelsang, Werner; Wittig, Hartmut; Yuan, C.-P.; Zanotti, James
2018-05-01
In the framework of quantum chromodynamics (QCD), parton distribution functions (PDFs) quantify how the momentum and spin of a hadron are divided among its quark and gluon constituents. Two main approaches exist to determine PDFs. The first approach, based on QCD factorization theorems, realizes a QCD analysis of a suitable set of hard-scattering measurements, often using a variety of hadronic observables. The second approach, based on first-principle operator definitions of PDFs, uses lattice QCD to compute directly some PDF-related quantities, such as their moments. Motivated by recent progress in both approaches, in this document we present an overview of lattice-QCD and global-analysis techniques used to determine unpolarized and polarized proton PDFs and their moments. We provide benchmark numbers to validate present and future lattice-QCD calculations and we illustrate how they could be used to reduce the PDF uncertainties in current unpolarized and polarized global analyses. This document represents a first step towards establishing a common language between the two communities, to foster dialogue and to further improve our knowledge of PDFs.
Benchmarking of EPRI-cell epithermal methods with the point-energy discrete-ordinates code (OZMA)
Williams, M.L.; Wright, R.Q.; Barhen, J.; Rothenstein, W.
1982-01-01
The purpose of the present study is to benchmark E-C resonance-shielding and cell-averaging methods against a rigorous deterministic solution on a fine-group level (approx. 30 groups between 1 eV and 5.5 keV). The benchmark code used is OZMA, which solves the space-dependent slowing-down equations using continuous-energy discrete ordinates or integral transport theory to produce fine-group cross sections. Results are given for three water-moderated lattices: a mixed oxide, a uranium metal, and a tight-pitch high-conversion uranium oxide configuration. The latter two lattices were chosen because of the strong self-shielding of the 238U resonances
Racetrack lattices for the TRIUMF KAON factory
Servranckx, R.V.; Wienands, U.; Craddock, M.K.; Rees, G.H.
1989-03-01
Separated-function racetrack lattices have been developed for the KAON Factory accelerators that have more flexibility than the old circular lattices. Straight sections with zero dispersion are provided for rf cavities and fast injection and extraction, and with controlled dispersion for H⁻ injection and slow extraction. In addition the new lattices have fewer depolarizing resonances than the old circular lattices
What Randomized Benchmarking Actually Measures
Proctor, Timothy; Rudinger, Kenneth; Young, Kevin; Sarovar, Mohan; Blume-Kohout, Robin
2017-01-01
Randomized benchmarking (RB) is widely used to measure an error rate of a set of quantum gates, by performing random circuits that would do nothing if the gates were perfect. In the limit of no finite-sampling error, the exponential decay rate of the observable survival probabilities, versus circuit length, yields a single error metric r. For Clifford gates with arbitrary small errors described by process matrices, r was believed to reliably correspond to the mean, over all Clifford gates, of the average gate infidelity between the imperfect gates and their ideal counterparts. We show that this quantity is not a well-defined property of a physical gate set. It depends on the representations used for the imperfect and ideal gates, and the variant typically computed in the literature can differ from r by orders of magnitude. We present new theories of the RB decay that are accurate for all small errors describable by process matrices, and show that the RB decay curve is a simple exponential for all such errors. Here, these theories allow explicit computation of the error rate that RB measures (r), but as far as we can tell it does not correspond to the infidelity of a physically allowed (completely positive) representation of the imperfect gates.
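The decay described above, survival probability falling exponentially with circuit length, is conventionally extracted by fitting P(m) = A·pᵐ + B and converting the decay constant p to r = (1 − p)(d − 1)/d. A minimal sketch on synthetic single-qubit data; all parameter values are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 2                       # single-qubit Hilbert-space dimension
A, B, p = 0.5, 0.5, 0.995   # illustrative RB decay parameters

m = np.arange(1, 201, 10)                    # RB sequence lengths
P = A * p**m + B                             # ideal survival probabilities
P_noisy = P + rng.normal(0.0, 1e-4, m.size)  # finite-sampling noise

# Fit log(P - B) = log(A) + m*log(p); B is taken as known for brevity.
slope, _ = np.polyfit(m, np.log(P_noisy - B), 1)
p_fit = np.exp(slope)

# Standard conversion from the decay constant to the RB error rate r.
r = (1 - p_fit) * (d - 1) / d
print(p_fit, r)
```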
Benchmarking Commercial Conformer Ensemble Generators.
Friedrich, Nils-Ole; de Bruyn Kops, Christina; Flachsenberg, Florian; Sommer, Kai; Rarey, Matthias; Kirchmair, Johannes
2017-11-27
We assess and compare the performance of eight commercial conformer ensemble generators (ConfGen, ConfGenX, cxcalc, iCon, MOE LowModeMD, MOE Stochastic, MOE Conformation Import, and OMEGA) and one leading free algorithm, the distance geometry algorithm implemented in RDKit. The comparative study is based on a new version of the Platinum Diverse Dataset, a high-quality benchmarking dataset of 2859 protein-bound ligand conformations extracted from the PDB. Differences in the performance of commercial algorithms are much smaller than those observed for free algorithms in our previous study (J. Chem. Inf. 2017, 57, 529-539). For commercial algorithms, the median minimum root-mean-square deviations measured between protein-bound ligand conformations and ensembles of a maximum of 250 conformers are between 0.46 and 0.61 Å. Commercial conformer ensemble generators are characterized by their high robustness, with at least 99% of all input molecules successfully processed and few or even no substantial geometrical errors detectable in their output conformations. The RDKit distance geometry algorithm (with minimization enabled) appears to be a good free alternative since its performance is comparable to that of the midranked commercial algorithms. Based on a statistical analysis, we elaborate on which algorithms to use and how to parametrize them for best performance in different application scenarios.
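The "minimum root-mean-square deviation" quoted above is the RMSD after optimal superposition, conventionally computed with the Kabsch algorithm. A minimal sketch on toy coordinates, not tied to any of the benchmarked tools:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Minimum RMSD between two (N, 3) coordinate sets after
    optimal translation and rotation (Kabsch algorithm)."""
    P = P - P.mean(axis=0)          # remove translations
    Q = Q - Q.mean(axis=0)
    H = P.T @ Q                     # covariance matrix
    U, S, Vt = np.linalg.svd(H)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, sign])   # exclude improper rotations
    R = Vt.T @ D @ U.T              # optimal rotation
    return np.sqrt(np.mean(np.sum((P @ R.T - Q)**2, axis=1)))

# A rotated copy of a "conformer" should give RMSD ≈ 0.
P = np.array([[0, 0, 0], [1.5, 0, 0], [1.5, 1.5, 0], [0, 1.5, 1.0]], float)
theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
print(round(kabsch_rmsd(P @ Rz.T, P), 6))   # 0.0
```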
Kikuchi, Yasuyuki; Hasegawa, Akira; Takano, Hideki; Kamei, Takanobu; Hojuyama, Takeshi; Sasaki, Makoto; Seki, Yuji; Zukeran, Atsushi; Otake, Iwao.
1982-02-01
Various benchmark tests were made on JENDL-1. At the first stage, various core center characteristics were tested for many critical assemblies with a one-dimensional model. At the second stage, the applicability of JENDL-1 was further tested on more sophisticated problems for the MOZART and ZPPR-3 assemblies with a two-dimensional model. It was proved that JENDL-1 predicted various quantities of fast reactors satisfactorily as a whole. However, the following problems were pointed out: 1) there exists a discrepancy of 0.9% in the k_eff values between the Pu- and U-cores; 2) the fission rate ratio of 239Pu to 235U is underestimated by 3%; 3) the Doppler reactivity coefficients are overestimated by about 10%; 4) the control rod worths are underestimated by 4%; 5) the fission rates of 235U and 239Pu are underestimated considerably in the outer core and radial blanket regions; 6) the negative sodium void reactivities are overestimated when the sodium is removed from the outer core. As a whole, most of the problems of JENDL-1 seem to be related to the neutron leakage and the neutron spectrum. It was found through further study that most of these problems came from too small diffusion coefficients and too large elastic removal cross sections above 100 keV, which might probably be caused by overestimation of the total and elastic scattering cross sections for structural materials in the unresolved resonance region up to several MeV. (author)
Human factors reliability benchmark exercise
Poucet, A.
1989-08-01
The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (TPM) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses these contributions on a comparative basis. The aim of this analysis was to compare the procedures, modelling techniques and quantification methods used, to obtain insight into the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches, and to get an understanding of the current state of the art in the field, identifying the limitations still inherent in the different approaches.
Lattice gauge theory using parallel processors
Lee, T.D.; Chou, K.C.; Zichichi, A.
1987-01-01
The book's contents include: Lattice Gauge Theory Lectures: Introduction and Current Fermion Simulations; Monte Carlo Algorithms for Lattice Gauge Theory; Specialized Computers for Lattice Gauge Theory; Lattice Gauge Theory at Finite Temperature: A Monte Carlo Study; Computational Method - An Elementary Introduction to the Langevin Equation, Present Status of Numerical Quantum Chromodynamics; Random Lattice Field Theory; The GF11 Processor and Compiler; and The APE Computer and First Physics Results; Columbia Supercomputer Project: Parallel Supercomputer for Lattice QCD; Statistical and Systematic Errors in Numerical Simulations; Monte Carlo Simulation for LGT and Programming Techniques on the Columbia Supercomputer; Food for Thought: Five Lectures on Lattice Gauge Theory
Revaluering benchmarking - A topical theme for the construction industry
Rasmussen, Grane Mikael Gregaard
2011-01-01
and questioning the concept objectively. This paper addresses the underlying nature of benchmarking, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in organizations. To understand these sociological impacts, benchmarking research needs to transcend … the perception of benchmarking systems as secondary and derivative and instead studying benchmarking as constitutive of social relations and as irredeemably social phenomena. I have attempted to do so in this paper by treating benchmarking using a calculative practice perspective, and describing how …
Developing integrated benchmarks for DOE performance measurement
Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.
1992-09-30
The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome measures in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.
IAEA sodium void reactivity benchmark calculations
Hill, R.N.; Finck, P.J.
1992-01-01
In this paper, the IAEA 1992 "Benchmark Calculation of Sodium Void Reactivity Effect in Fast Reactor Core" problem is evaluated. The proposed design is a large axially heterogeneous oxide-fueled fast reactor as described in Section 2; the core utilizes a sodium plenum above the core to enhance leakage effects. The calculation methods used in this benchmark evaluation are described in Section 3. In Section 4, the calculated core performance results for the benchmark reactor model are presented; and in Section 5, the influence of steel and interstitial sodium heterogeneity effects is estimated
Benchmarking gate-based quantum computers
Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans
2017-11-01
With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
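The identity-circuit idea above can be mimicked classically: compose an imperfect gate with its ideal inverse repeatedly and watch the survival probability decay with circuit length. The over-rotation noise model below is an assumption for illustration, not the paper's error model:

```python
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

eps = 0.02              # assumed small systematic over-rotation per gate
gate = rx(np.pi + eps)  # imperfect X gate
inv = rx(-np.pi)        # ideal inverse

psi0 = np.array([1.0, 0.0], dtype=complex)
for n in (1, 10, 50):
    psi = psi0.copy()
    for _ in range(n):           # n repetitions of (ideal X^-1)(imperfect X)
        psi = inv @ (gate @ psi)
    survival = abs(psi[0])**2    # probability of returning to |0>
    print(n, survival)           # decays as cos^2(n*eps/2)
```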
Benchmark Imagery FY11 Technical Report
Roberts, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pope, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2011-06-14
This report details the work performed in FY11 under project LL11-GS-PD06, “Benchmark Imagery for Assessing Geospatial Semantic Extraction Algorithms.” The original LCP for the Benchmark Imagery project called for creating a set of benchmark imagery for verifying and validating algorithms that extract semantic content from imagery. More specifically, the first year was slated to deliver real imagery that had been annotated, the second year to deliver real imagery that had composited features, and the final year was to deliver synthetic imagery modeled after the real imagery.
Embedded Lattice and Properties of Gram Matrix
Futa Yuichi
2017-03-01
In this article, we formalize in Mizar [14] the definition of embedding of lattice and its properties. We formally define an inner product on an embedded module. We also formalize properties of the Gram matrix. We formally prove that an inverse of the Gram matrix for a rational lattice exists. Lattices over Z-modules are necessary for lattice problems, the LLL (Lenstra-Lenstra-Lovász) basis reduction algorithm [16], and lattice-based cryptographic systems [17].
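The invertibility result formalized above can be checked on a toy example: for a full-rank rational lattice basis, the Gram matrix of inner products has nonzero determinant, so its inverse exists and is again rational. A sketch with an assumed 2x2 basis (illustrative only, unrelated to the Mizar formalization):

```python
from fractions import Fraction as F

# Toy rational lattice basis (rows are basis vectors); illustrative only.
b1 = [F(1), F(1, 2)]
b2 = [F(0), F(1, 3)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Gram matrix of pairwise inner products.
G = [[dot(b1, b1), dot(b1, b2)],
     [dot(b2, b1), dot(b2, b2)]]

# For a full-rank rational lattice det(G) != 0, so the inverse exists;
# for a 2x2 matrix it is the adjugate divided by the determinant.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
G_inv = [[ G[1][1] / det, -G[0][1] / det],
         [-G[1][0] / det,  G[0][0] / det]]

# Exact check: G * G_inv is the identity, with exact rational arithmetic.
prod = [[sum(G[i][k] * G_inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod == [[1, 0], [0, 1]])   # True
```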
Depletion benchmarks calculation of random media using explicit modeling approach of RMC
Liu, Shichang; She, Ding; Liang, Jin-gang; Wang, Kan
2016-01-01
Highlights: • Explicit modeling of RMC is applied to depletion benchmark for HTGR fuel element. • Explicit modeling can provide detailed burnup distribution and burnup heterogeneity. • The results would serve as a supplement for the HTGR fuel depletion benchmark. • The method of adjacent burnup regions combination is proposed for full-core problems. • The combination method can reduce memory footprint, keeping the computing accuracy. - Abstract: Monte Carlo method plays an important role in accurate simulation of random media, owing to its advantages of the flexible geometry modeling and the use of continuous-energy nuclear cross sections. Three stochastic geometry modeling methods including Random Lattice Method, Chord Length Sampling and explicit modeling approach with mesh acceleration technique, have been implemented in RMC to simulate the particle transport in the dispersed fuels, in which the explicit modeling method is regarded as the best choice. In this paper, the explicit modeling method is applied to the depletion benchmark for HTGR fuel element, and the method of combination of adjacent burnup regions has been proposed and investigated. The results show that the explicit modeling can provide detailed burnup distribution of individual TRISO particles, and this work would serve as a supplement for the HTGR fuel depletion benchmark calculations. The combination of adjacent burnup regions can effectively reduce the memory footprint while keeping the computational accuracy.
Development of a set of benchmark problems to verify numerical methods for solving burnup equations
Lago, Daniel; Rahnema, Farzad
2017-01-01
Highlights: • Description of transmutation chain benchmark problems. • Problems for validating numerical methods for solving burnup equations. • Analytical solutions for the burnup equations. • Numerical solutions for the burnup equations. - Abstract: A comprehensive set of transmutation chain benchmark problems for numerically validating methods for solving burnup equations was created. These benchmark problems were designed to challenge both traditional and modern numerical methods used to solve the complex set of ordinary differential equations that tracks the change in nuclide concentrations over time due to nuclear phenomena. Given that most burnup solvers are developed for coupling with an established transport solution method, these problems provide a useful resource for testing and validating a burnup equation solver before coupling for use in a lattice or core depletion code. All the relevant parameters for each benchmark problem are described. Results are also provided in the form of reference solutions generated by the Mathematica tool, as well as additional numerical results from MATLAB.
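The burnup (Bateman) equations referred to above are linear ODE systems dN/dt = A·N, and the validation idea can be illustrated on the simplest two-nuclide chain, where the analytical solution is known. A minimal sketch with illustrative decay constants, unrelated to the benchmark set itself:

```python
import numpy as np

# Two-nuclide decay chain 1 -> 2 -> (removed): dN/dt = A N.
l1, l2 = 0.3, 0.05            # illustrative decay constants (1/s)
A = np.array([[-l1, 0.0],
              [ l1, -l2]])
N0 = np.array([1.0, 0.0])     # start with pure nuclide 1
t = 10.0

# "Numerical" solver: matrix exponential via eigendecomposition.
w, V = np.linalg.eig(A)
N_num = (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V) @ N0).real

# Bateman's analytical solution for the same chain.
N1 = np.exp(-l1 * t)
N2 = l1 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))
print(np.allclose(N_num, [N1, N2]))   # True
```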
Criticality benchmark guide for light-water-reactor fuel in transportation and storage packages
Lichtenwalter, J.J.; Bowman, S.M.; DeHart, M.D.; Hopper, C.M.
1997-03-01
This report is designed as a guide for performing criticality benchmark calculations for light-water-reactor (LWR) fuel applications. The guide provides documentation of 180 criticality experiments with geometries, materials, and neutron interaction characteristics representative of transportation packages containing LWR fuel or uranium oxide pellets or powder. These experiments should benefit the U.S. Nuclear Regulatory Commission (NRC) staff and licensees in validation of computational methods used in LWR fuel storage and transportation concerns. The experiments are classified by key parameters such as enrichment, water/fuel volume, hydrogen-to-fissile ratio (H/X), and lattice pitch. Groups of experiments with common features such as separator plates, shielding walls, and soluble boron are also identified. In addition, a sample validation using these experiments and a statistical analysis of the results are provided. Recommendations for selecting suitable experiments and determination of calculational bias and uncertainty are presented as part of this benchmark guide
How benchmarking can improve patient nutrition.
Ellis, Jane
Benchmarking is a tool that originated in business to enable organisations to compare their services with industry-wide best practice. Early last year the Department of Health published The Essence of Care, a benchmarking toolkit adapted for use in health care. It focuses on eight elements of care that are crucial to patients' experiences. Nurses and other health care professionals at a London NHS trust have begun a trust-wide benchmarking project. The aim is to improve patients' experiences of health care by sharing and comparing information, and by identifying examples of good practice and areas for improvement. The project began with two of the eight elements of The Essence of Care, with the intention of covering the rest later. This article describes the benchmarking process for nutrition and some of the consequent improvements in care.
Benchmarking and validation activities within JEFF project
Cabellos O.
2017-01-01
The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus, requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper will also review the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.
Measuring Distribution Performance? Benchmarking Warrants Your Attention
Ericson, Sean J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Alvarez, Paul [The Wired Group
2018-04-13
Identifying, designing, and measuring performance metrics is critical to securing customer value, but can be a difficult task. This article examines the use of benchmarks based on publicly available performance data to set challenging, yet fair, metrics and targets.
Numerical methods: Analytical benchmarking in transport theory
Ganapol, B.D.
1988-01-01
Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computer and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered
Benchmarking Linked Open Data Management Systems
R. Angles Rojas (Renzo); M.-D. Pham (Minh-Duc); P.A. Boncz (Peter)
2014-01-01
With inherent support for storing and analysing highly interconnected data, graph and RDF databases appear as natural solutions for developing Linked Open Data applications. However, current benchmarks for these database technologies do not fully attain the desirable characteristics
Benchmarks for dynamic multi-objective optimisation
Helbig, M
2013-06-01
When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...
Professional Performance and Bureaucratic Benchmarking Information
Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz
controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based...
Second benchmark problem for WIPP structural computations
Krieg, R.D.; Morgan, H.S.; Hunter, T.O.
1980-12-01
This report describes the second benchmark problem for comparison of the structural codes used in the WIPP project. The first benchmark problem consisted of heated and unheated drifts at a depth of 790 m, whereas this problem considers a shallower level (650 m) more typical of the repository horizon. But more important, the first problem considered a homogeneous salt configuration, whereas this problem considers a configuration with 27 distinct geologic layers, including 10 clay layers - 4 of which are to be modeled as possible slip planes. The inclusion of layering introduces complications in structural and thermal calculations that were not present in the first benchmark problem. These additional complications will be handled differently by the various codes used to compute drift closure rates. This second benchmark problem will assess these codes by evaluating the treatment of these complications
Reactor fuel depletion benchmark of TINDER
Martin, W.J.; Oliveira, C.R.E. de; Hecht, A.A.
2014-01-01
Highlights: • A reactor burnup benchmark of TINDER, coupling MCNP6 to CINDER2008, was performed. • TINDER is a poor candidate for fuel depletion calculations using its current libraries. • Data library modification is necessary if fuel depletion is desired from TINDER. - Abstract: Accurate burnup calculations are key to proper nuclear reactor design, fuel cycle modeling, and disposal estimations. The TINDER code, originally designed for activation analyses, has been modified to handle full burnup calculations, including the widely used predictor-corrector feature. In order to properly characterize the performance of TINDER for this application, a benchmark calculation was performed. Although the results followed the trends of past benchmarked codes for a UO2 PWR fuel sample from the Takahama-3 reactor, there were obvious deficiencies in the final result, likely in the nuclear data library that was used. Isotopic comparisons versus experiment and past code benchmarks are given, as well as hypothesized areas of deficiency and future work
XWeB: The XML Warehouse Benchmark
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
Benchmarking and validation activities within JEFF project
Cabellos, O.; Alvarez-Velarde, F.; Angelone, M.; Diez, C. J.; Dyrda, J.; Fiorito, L.; Fischer, U.; Fleming, M.; Haeck, W.; Hill, I.; Ichou, R.; Kim, D. H.; Klix, A.; Kodeli, I.; Leconte, P.; Michel-Sendis, F.; Nunnenmann, E.; Pecchia, M.; Peneliau, Y.; Plompen, A.; Rochman, D.; Romojaro, P.; Stankovskiy, A.; Sublet, J. Ch.; Tamagno, P.; Marck, S. van der
2017-09-01
The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus, requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper will also review the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.
Hadronic corrections to electroweak observables from twisted mass lattice QCD
Pientka, Grit
2015-01-01
For several benchmark quantities investigated to detect signs of new physics beyond the standard model of elementary particle physics, lattice QCD currently constitutes the only ab initio approach available at small momentum transfers for the computation of non-perturbative hadronic contributions. Among those observables are the lepton anomalous magnetic moments and the running of the electroweak coupling constants. We compute the leading QCD contribution to the muon anomalous magnetic moment by performing lattice QCD calculations on ensembles incorporating N_f = 2+1+1 dynamical twisted mass fermions. Considering active up, down, strange, and charm quarks admits, for the first time, a direct comparison of the lattice data for the muon anomaly with phenomenological results, because both the latter and the experimentally obtained values are sensitive to the complete first two generations of quarks at the current level of precision. Recently, it has been noted that improved measurements of the electron and tau anomalous magnetic moments might also provide ways of detecting new physics contributions. Therefore, we also compute their leading QCD contributions, which simultaneously serve as cross-checks of the value obtained for the muon. Additionally, we utilise the obtained data to compute the leading hadronic contribution to the running of the fine structure constant, which enters all perturbative QED calculations. Furthermore, we show that even for the weak mixing angle the leading QCD contribution can be computed from this data. In this way, we identify a new prime observable in the search for new physics whose hadronic contributions can be obtained from lattice QCD. With the results obtained in this thesis, we are able to exclude unsuitable phenomenologically necessary flavour separations and thus directly assist the presently more precise phenomenological determinations of this eminent quantity.
Benchmark tests of JENDL-3.2 for thermal and fast reactors
Takano, Hideki
1995-01-01
Benchmark calculations for a variety of thermal and fast reactors have been performed by using the newly evaluated JENDL-3 Version-2 (JENDL-3.2) file. In the thermal reactor calculations for the uranium and plutonium fueled cores of TRX and TCA, the k_eff and lattice parameters were well predicted. The fast reactor calculations for ZPPR-9 and FCA assemblies showed that the k_eff, the reactivity worths of Doppler, sodium void and control rod, and the reaction rate distribution were in very good agreement with the experiments. (author)
Benchmarking Danish Vocational Education and Training Programmes
Bogetoft, Peter; Wittrup, Jesper
This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes … The models attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes.
A framework for benchmarking land models
Y. Q. Luo
2012-10-01
Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties
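The scoring idea in item (2) above, combining data–model mismatches into a single figure, can be illustrated with a minimal sketch. The function name, the weighting scheme, and the process names are hypothetical; the paper itself does not prescribe a specific formula.

```python
def benchmark_score(model, benchmark, weights):
    """Combine per-process data-model mismatches into one score in [0, 1].

    model, benchmark: dicts mapping a process name to its simulated /
    reference value. weights: relative importance per process (sums to 1).
    A score of 1 means the model matches every benchmark exactly.
    NOTE: an illustrative scheme, not the one defined in the paper.
    """
    score = 0.0
    for process, w in weights.items():
        ref = benchmark[process]
        # Relative mismatch, clipped so one huge error cannot drive
        # the total score below zero.
        mismatch = min(abs(model[process] - ref) / abs(ref), 1.0)
        score += w * (1.0 - mismatch)
    return score

# Hypothetical processes and values, for illustration only.
weights = {"carbon_flux": 0.5, "water_flux": 0.3, "energy_flux": 0.2}
obs = {"carbon_flux": 2.0, "water_flux": 1.0, "energy_flux": 100.0}
sim = {"carbon_flux": 2.2, "water_flux": 1.0, "energy_flux": 90.0}
print(round(benchmark_score(sim, obs, weights), 3))  # → 0.93
```

A real framework would additionally aggregate such scores across temporal and spatial scales and compare them against a priori acceptance thresholds, as the paper discusses.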
Finite-lattice-spacing corrections to masses and g factors on a lattice
Roskies, R.; Wu, J.C.
1986-01-01
We suggest an alternative method for extracting masses and g factors from lattice calculations. Our method takes into account more of the infrared and ultraviolet lattice effects. It leads to more reasonable results in simulations of QED on a lattice.
Numisheet2005 Benchmark Analysis on Forming of an Automotive Underbody Cross Member: Benchmark 2
Buranathiti, Thaweepat; Cao, Jian
2005-01-01
This report presents an international cooperation benchmark effort focusing on simulations of a sheet metal stamping process. A forming process of an automotive underbody cross member using steel and aluminum blanks is used as a benchmark. Simulation predictions from each submission are analyzed via comparison with the experimental results. A brief summary of various models submitted for this benchmark study is discussed. Prediction accuracy of each parameter of interest is discussed through the evaluation of cumulative errors from each submission
Frustrated lattices of Ising chains
Kudasov, Yurii B; Korshunov, Aleksei S; Pavlov, V N; Maslov, Dmitrii A
2012-01-01
The magnetic structure and magnetization dynamics of systems of plane frustrated Ising chain lattices are reviewed for three groups of compounds: Ca3Co2O6, CsCoCl3, and Sr5Rh4O12. The available experimental data are analyzed and compared in detail. It is shown that a high-temperature magnetic phase on a triangular lattice is normally and universally a partially disordered antiferromagnetic (PDA) structure. The diversity of low-temperature phases results from weak interactions that lift the degeneracy of a 2D antiferromagnetic Ising model on the triangular lattice. Mean-field models, Monte Carlo simulation results on the static magnetization curve, and results on slow magnetization dynamics obtained with Glauber's theory are discussed in detail. (reviews of topical problems)
Lattice QCD for nuclear physics
Meyer, Harvey
2015-01-01
With ever increasing computational resources and improvements in algorithms, new opportunities are emerging for lattice gauge theory to address key questions in strongly interacting systems, such as nuclear matter. Calculations today use dynamical gauge-field ensembles with degenerate light up/down quarks and the strange quark and it is possible now to consider including charm-quark degrees of freedom in the QCD vacuum. Pion masses and other sources of systematic error, such as finite-volume and discretization effects, are beginning to be quantified systematically. Altogether, an era of precision calculation has begun, and many new observables will be calculated at the new computational facilities. The aim of this set of lectures is to provide graduate students with a grounding in the application of lattice gauge theory methods to strongly interacting systems, and in particular to nuclear physics. A wide variety of topics are covered, including continuum field theory, lattice discretizations, hadron spect...
Benchmarking infrastructure for mutation text mining.
Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo
2014-02-25
Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
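The abstract above describes computing performance metrics over gold-standard and system annotations (via SPARQL in the actual infrastructure). The underlying arithmetic, precision, recall, and F1 over annotation sets, can be sketched as follows; the tuple format and mutation identifiers are hypothetical.

```python
def prf1(gold, predicted):
    """Precision, recall and F1 of predicted annotations against a gold set.

    Annotations are hashable items, e.g. (document_id, mutation) tuples.
    A sketch of the metric arithmetic only; the actual infrastructure
    computes these with SPARQL over RDF-encoded corpora.
    """
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)  # true positives: annotations found in both
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical mutation-grounding annotations.
gold = {("doc1", "E545K"), ("doc1", "G12D"), ("doc2", "V600E")}
pred = {("doc1", "E545K"), ("doc2", "V600E"), ("doc2", "L858R")}
p, r, f = prf1(gold, pred)  # 2 of 3 predictions correct, 2 of 3 gold found
```

Representing the same data in RDF, as the infrastructure does, lets such metrics be computed declaratively without task-specific code.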
Ad hoc committee on reactor physics benchmarks
Diamond, D.J.; Mosteller, R.D.; Gehin, J.C.
1996-01-01
In the spring of 1994, an ad hoc committee on reactor physics benchmarks was formed under the leadership of two American Nuclear Society (ANS) organizations. The ANS-19 Standards Subcommittee of the Reactor Physics Division and the Computational Benchmark Problem Committee of the Mathematics and Computation Division had both seen a need for additional benchmarks to help validate computer codes used for light water reactor (LWR) neutronics calculations. Although individual organizations had employed various means to validate the reactor physics methods that they used for fuel management, operations, and safety, additional work in code development and refinement is under way, and to increase accuracy, there is a need for a corresponding increase in validation. Both organizations thought that there was a need to promulgate benchmarks based on measured data to supplement the LWR computational benchmarks that have been published in the past. By having an organized benchmark activity, the participants also gain by being able to discuss their problems and achievements with others traveling the same route
Benchmarking for controllere: metoder, teknikker og muligheder
Bukh, Per Nikolaj; Sandalgaard, Niels Erik; Dietrichson, Lars Grubbe
2008-01-01
Benchmarking enters into the management practice of both private and public organisations in many ways. In management accounting, benchmark-based indicators (or key figures) are used, for example, when setting targets in performance contracts or to specify the desired level of certain key figures in a Balanced Scorecard or similar performance-management models. The article explains the concept of benchmarking by presenting and discussing its different facets, and describes four different uses of benchmarking to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project. It then treats the difference between results benchmarking and process benchmarking, followed by the use of internal versus external benchmarking, and the use of benchmarking in budgeting and budget follow-up.
Pool critical assembly pressure vessel facility benchmark
Remec, I.; Kam, F.B.K.
1997-07-01
This pool critical assembly (PCA) pressure vessel wall facility benchmark (PCA benchmark) is described and analyzed in this report. Analysis of the PCA benchmark can be used for partial fulfillment of the requirements for the qualification of the methodology for pressure vessel neutron fluence calculations, as required by the US Nuclear Regulatory Commission regulatory guide DG-1053. Section 1 of this report describes the PCA benchmark and provides all data necessary for the benchmark analysis. The measured quantities, to be compared with the calculated values, are the equivalent fission fluxes. In Section 2 the analysis of the PCA benchmark is described. Calculations with the computer code DORT, based on the discrete-ordinates method, were performed for three ENDF/B-VI-based multigroup libraries: BUGLE-93, SAILOR-95, and BUGLE-96. An excellent agreement of the calculated (C) and measured (M) equivalent fission fluxes was obtained. The arithmetic average C/M for all the dosimeters (total of 31) was 0.93 ± 0.03 and 0.92 ± 0.03 for the SAILOR-95 and BUGLE-96 libraries, respectively. The average C/M ratio, obtained with the BUGLE-93 library, for the 28 measurements was 0.93 ± 0.03 (the neptunium measurements in the water and air regions were overpredicted and excluded from the average). No systematic decrease in the C/M ratios with increasing distance from the core was observed for any of the libraries used
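The C/M averaging used above, a mean and spread of calculated-to-measured ratios over a set of dosimeters, is simple to state explicitly. The numbers below are illustrative only, not the dosimeter data from the report.

```python
import statistics

def cm_summary(calculated, measured):
    """Mean and sample standard deviation of calculated-to-measured
    (C/M) ratios, the summary statistic quoted in benchmark reports."""
    ratios = [c / m for c, m in zip(calculated, measured)]
    return statistics.mean(ratios), statistics.stdev(ratios)

# Illustrative fluxes (arbitrary units), NOT the report's data.
measured = [1.0, 2.0, 4.0]
calculated = [0.9, 1.9, 3.7]
mean_cm, sd_cm = cm_summary(calculated, measured)  # 0.925 ± 0.025
```

A C/M mean close to 1 with a small spread, as reported for the PCA benchmark, indicates the calculational methodology reproduces the measurements without systematic bias.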
Benchmarking for Cost Improvement. Final report
1993-09-01
The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.
Nucleon structure from lattice QCD
Dinter, Simon
2012-11-13
In this thesis we compute within lattice QCD observables related to the structure of the nucleon. One part of this thesis is concerned with moments of parton distribution functions (PDFs). Those moments are essential elements for the understanding of nucleon structure and can be extracted from a global analysis of deep inelastic scattering experiments. On the theoretical side they can be computed non-perturbatively by means of lattice QCD. However, since lattice calculations of moments of PDFs have been available, there has been a tension between these lattice calculations and the results from a global analysis of experimental data. We examine whether systematic effects are responsible for this tension, and study particularly intensively the effects of excited states by a dedicated high precision computation. Moreover, we carry out a first computation with four dynamical flavors. Another aspect of this thesis is a feasibility study of a lattice QCD computation of the scalar quark content of the nucleon, which is an important element in the cross-section of a heavy particle with the nucleon mediated by a scalar particle (e.g. Higgs particle) and can therefore have an impact on Dark Matter searches. Existing lattice QCD calculations of this quantity usually have a large error and thus a low significance for phenomenological applications. We use a variance-reduction technique for quark-disconnected diagrams to obtain a precise result. Furthermore, we introduce a new stochastic method for the calculation of connected 3-point correlation functions, which are needed to compute nucleon structure observables, as an alternative to the usual sequential propagator method. In an explorative study we check whether this new method is competitive to the standard one. We use Wilson twisted mass fermions at maximal twist in all our calculations, such that all observables considered here have only O(a^2) discretization effects.
Aiman El-Saed
2013-10-01
Summary: Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including the advantages and limitations are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments including regulations should be taken into consideration when considering such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons. Keywords: Benchmarking, Comparison, Surveillance, Healthcare-associated infections
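Indirect standardization, one of the risk-adjustment methods named above, compares observed infections against the count expected if benchmark rates applied to the local patient mix. A minimal sketch follows; the strata, rates, and device-day counts are hypothetical.

```python
def standardized_infection_ratio(observed, strata):
    """Indirect standardization: the expected count is obtained by
    applying benchmark rates (per 1000 device-days) to local exposure
    in each stratum; the ratio O/E is then the SIR-style metric.
    A sketch only; real surveillance systems add confidence intervals."""
    expected = sum(rate * days / 1000.0 for rate, days in strata)
    return observed / expected

# Hypothetical strata: (benchmark rate per 1000 device-days, local device-days),
# e.g. a medical ICU and a surgical ICU.
strata = [(2.0, 4000), (5.0, 1000)]
sir = standardized_infection_ratio(13, strata)  # 13 observed vs 13 expected
```

A ratio near 1 means the facility's infection burden matches the benchmark after adjusting for its case mix, which is exactly the comparison a crude overall rate cannot provide.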
Kondo length in bosonic lattices
Giuliano, Domenico; Sodano, Pasquale; Trombettoni, Andrea
2017-09-01
Motivated by the fact that the low-energy properties of the Kondo model can be effectively simulated in spin chains, we study the realization of the Kondo effect with bond impurities in ultracold bosonic lattices at half filling. After presenting a discussion of the effective theory and of the mapping of the bosonic chain onto a lattice spin Hamiltonian, we provide estimates for the Kondo length as a function of the parameters of the bosonic model. We point out that the Kondo length can be extracted from the integrated real-space correlation functions, which are experimentally accessible quantities in cold-atom experiments.
Supersymmetry on the noncommutative lattice
Nishimura, Jun; Rey, Soo-Jong; Sugino, Fumihiko
2003-01-01
Building upon the proposal of Kaplan et al. (hep-lat/0206109), we construct noncommutative lattice gauge theory with manifest supersymmetry. We show that such a theory is naturally implementable via orbifold conditions generalizing those used by Kaplan et al. We present the prescription in detail and illustrate it for noncommutative gauge theories latticized partially in two dimensions. We point out a deformation freedom in the defining theory by a complex parameter, reminiscent of discrete torsion in string theory. We show that, in the continuum limit, the supersymmetry is enhanced only at a particular value of the deformation parameter, determined solely by the size of the noncommutativity. (author)
Machines for lattice gauge theory
Mackenzie, P.B.
1989-05-01
The most promising approach to the solution of the theory of strong interactions is large scale numerical simulation using the techniques of lattice gauge theory. At the present time, computing requirements for convincing calculations of the properties of hadrons exceed the capabilities of even the most powerful commercial supercomputers. This has led to the development of massively parallel computers dedicated to lattice gauge theory. This talk will discuss the computing requirements behind these machines, and general features of the components and architectures of the half dozen major projects now in existence. 20 refs., 1 fig
Graphene on graphene antidot lattices
Gregersen, Søren Schou; Pedersen, Jesper Goor; Power, Stephen
2015-01-01
Graphene bilayer systems are known to exhibit a band gap when the layer symmetry is broken by applying a perpendicular electric field. The resulting band structure resembles that of a conventional semiconductor with a parabolic dispersion. Here, we introduce a bilayer graphene heterostructure, where single-layer graphene is placed on top of another layer of graphene with a regular lattice of antidots. We dub this class of graphene systems GOAL: graphene on graphene antidot lattice. By varying the structure geometry, band-structure engineering can be performed to obtain linearly dispersing…
Unconventional superconductivity in honeycomb lattice
P Sahebsara
2013-03-01
The possibility of symmetrical s-wave superconductivity in the honeycomb lattice is studied within a strongly correlated regime, using the Hubbard model. The superconducting order parameter is defined by introducing the Green function, which is obtained by calculating the density of the electrons. This study shows that the superconducting order parameter appears in the doping interval between 0 and 0.5, and that x = 0.25 is the optimum doping for s-wave superconductivity in the honeycomb lattice.
[Lattice degeneration of the retina].
Boĭko, E V; Suetov, A A; Mal'tsev, D S
2014-01-01
Lattice degeneration of the retina is a clinically important type of peripheral retinal dystrophy due to its role in the pathogenesis of rhegmatogenous retinal detachment. In spite of extensive epidemiological, morphological, and clinical data, the question of the causes of this particular type of retinal dystrophy currently remains debatable. Existing hypotheses on the pathogenesis of retinal structural changes in lattice degeneration explain it only to a certain extent. In clinical ophthalmology it is necessary to pay close attention to this kind of degeneration and to distinguish between cases requiring preventive treatment and those requiring monitoring.
Lattice calculations in gauge theory
Rebbi, C.
1985-01-01
The lattice formulation of quantum gauge theories is discussed as a viable technique for quantitative studies of nonperturbative effects in QCD. Evidence is presented to ascertain that whole classes of lattice actions produce a universal continuum limit. Discrepancies between numerical results from Monte Carlo simulations for the pure gauge system and for the system with gauge and quark fields are discussed. Numerical calculations for QCD require very substantial computational resources. The use of powerful vector processors or special-purpose machines in extending the scope and magnitude of the calculations is considered, and one may reasonably expect that in the near future good quantitative predictions will be obtained for QCD
Chiral symmetry on the lattice
Creutz, M.
1994-11-01
The author reviews some of the difficulties associated with chiral symmetry in the context of a lattice regulator. The author discusses the structure of Wilson Fermions when the hopping parameter is in the vicinity of its critical value. Here the one-flavor case contrasts sharply with the case of more flavors, where a residual chiral symmetry survives anomalies. The author briefly discusses the surface mode approach, the use of mirror Fermions to cancel anomalies, and finally speculates on the problems with lattice versions of the standard model
Nuclear Physics from Lattice QCD
Detmold, William; Beane, Silas; Orginos, Konstantinos; Savage, Martin
2011-01-01
We review recent progress toward establishing lattice Quantum Chromodynamics as a predictive calculational framework for nuclear physics. A survey of the current techniques that are used to extract low-energy hadronic scattering amplitudes and interactions is followed by a review of recent two-body and few-body calculations by the NPLQCD collaboration and others. An outline of the nuclear physics that is expected to be accomplished with Lattice QCD in the next decade, along with estimates of the required computational resources, is presented.
Benchmarking - a validation of UTDefect
Niklasson, Jonas; Bostroem, Anders; Wirdelius, Haakan
2006-06-01
New and stronger demands on the reliability of NDE/NDT procedures and methods have stimulated the development of simulation tools for NDT. Modelling of ultrasonic non-destructive testing is useful for a number of reasons, e.g. physical understanding, parametric studies and the qualification of procedures and personnel. The traditional way of qualifying a procedure is to generate a technical justification by employing experimental verification of the chosen technique. The manufacturing of test pieces is often very expensive and time consuming. It also tends to introduce a number of possible misalignments between the actual NDT situation and the proposed experimental simulation. The UTDefect computer code (SUNDT/simSUNDT) has been developed, together with the Dept. of Mechanics at Chalmers Univ. of Technology, over a decade and simulates the entire ultrasonic testing situation. A thoroughly validated model has the ability to be an alternative and a complement to the experimental work in order to reduce the extensive cost. The validation can be accomplished by comparisons with other models, but ultimately by comparisons with experiments. This project addresses the last alternative but provides an opportunity to, in a later stage, compare with other software when all data are made public and available. The comparison has been made with experimental data from an international benchmark study initiated by the World Federation of NDE Centers. The experiments have been conducted with planar and spherically focused immersion transducers. The defects considered are side-drilled holes, flat-bottomed holes, and a spherical cavity. The data from the experiments are a reference signal used for calibration (the signal from the front surface of the test block at normal incidence) and the raw output from the scattering experiment. In all, more than forty cases have been compared. The agreement between UTDefect and the experiments was in general good (deviation less than 2dB) when the
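The "deviation less than 2 dB" criterion quoted above is the standard decibel ratio of signal amplitudes. A short sketch makes the arithmetic explicit; the function name is ours, not UTDefect's.

```python
import math

def db_deviation(simulated, measured):
    """Amplitude deviation in decibels between a simulated and a measured
    ultrasonic signal: 20*log10(A_sim / A_meas). In the benchmark comparison
    described above, |deviation| < 2 dB counts as good agreement."""
    return 20.0 * math.log10(simulated / measured)

# Equal amplitudes give 0 dB; a 20% amplitude overshoot stays under 2 dB.
print(db_deviation(1.0, 1.0))   # → 0.0
print(db_deviation(1.2, 1.0) < 2.0)  # → True
```

Because the scale is logarithmic, 2 dB corresponds to roughly a 26% amplitude ratio, so the criterion tolerates modest amplitude differences while catching gross mismatches.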
Representation theory of lattice current algebras
Alekseev, A.Yu.; Eidgenoessische Technische Hochschule, Zurich; Faddeev, L.D.; Froehlich, L.D.; Schomerus, V.; Kyoto Univ.
1996-04-01
Lattice current algebras were introduced as a regularization of the left- and right-moving degrees of freedom in the WZNW model. They provide examples of lattice theories with a local quantum symmetry U_q(G). Their representation theory is studied in detail. In particular, we construct all irreducible representations along with a lattice analogue of the fusion product for representations of the lattice current algebra. It is shown that for an arbitrary number of lattice sites, the representation categories of the lattice current algebras agree with their continuum counterparts. (orig.)
Jahn, Franziska
2015-08-01
Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system's and information management's costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme observes both general conditions and examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. The costs and quality of application systems, physical data-processing systems, organizational structures of information management, and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.
Raising Quality and Achievement. A College Guide to Benchmarking.
Owen, Jane
This booklet introduces the principles and practices of benchmarking as a way of raising quality and achievement at further education colleges in Britain. Section 1 defines the concept of benchmarking. Section 2 explains what benchmarking is not and the steps that should be taken before benchmarking is initiated. The following aspects and…
Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies
Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2017-05-23
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
Benchmarks: The Development of a New Approach to Student Evaluation.
Larter, Sylvia
The Toronto Board of Education Benchmarks are libraries of reference materials that demonstrate student achievement at various levels. Each library contains video benchmarks, print benchmarks, a staff handbook, and summary and introductory documents. This book is about the development and the history of the benchmark program. It has taken over 3…
Storage-Intensive Supercomputing Benchmark Study
Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A
2007-10-30
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the 40 GB parallel NAND Flash disk array, the Fusion-io. The Fusion system specs are as follows
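The out-of-core graph benchmark in item (2) performs level-set expansion, i.e. repeated breadth-first frontier growth from a set of seed vertices. A minimal in-memory sketch of that access pattern (the real benchmark streams adjacency data from storage; the graph and names here are illustrative):

```python
def level_set_expansion(adj, seeds, max_levels):
    """Expand level sets (BFS frontiers) from a set of seed vertices.

    adj is a dict vertex -> neighbour list, an in-memory stand-in for the
    out-of-core adjacency storage the benchmark streams from disk.
    Returns one vertex set per level, starting with the seeds.
    """
    visited = set(seeds)
    frontier = set(seeds)
    levels = [frontier]
    for _ in range(max_levels):
        nxt = set()
        for v in frontier:
            for w in adj.get(v, ()):
                if w not in visited:
                    visited.add(w)
                    nxt.add(w)
        if not nxt:
            break
        levels.append(nxt)
        frontier = nxt
    return levels

# Toy scale-free-ish graph: vertex 0 is a hub.
adj = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 4], 3: [0], 4: [2, 5], 5: [4]}
print([sorted(s) for s in level_set_expansion(adj, [0], 10)])
# prints [[0], [1, 2, 3], [4], [5]]
```

On a scale-free graph the frontiers grow very quickly through the hubs, which is what makes the disk-resident version an interesting storage benchmark.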
Calculation of Single Cell and Fuel Assembly IRIS Benchmarks Using WIMSD5B and GNOMER Codes
Pevec, D.; Grgic, D.; Jecmenica, R.
2002-01-01
IRIS reactor (an acronym for International Reactor Innovative and Secure) is a modular, integral, light water cooled, small to medium power (100-335 MWe/module) reactor, which addresses the requirements defined by the United States Department of Energy for Generation IV nuclear energy systems, i.e., proliferation resistance, enhanced safety, improved economics, and waste reduction. An international consortium led by Westinghouse/BNFL was created for development of the IRIS reactor; it includes universities, institutes, commercial companies, and utilities. The Faculty of Electrical Engineering and Computing, University of Zagreb, joined the consortium in 2001, with the aim of taking part in IRIS neutronics design and safety analyses of IRIS transients. A set of neutronic benchmarks for the IRIS reactor was defined with the objective of comparing results of all participants under exactly the same assumptions. In this paper a calculation of Benchmark 44 for the IRIS reactor is described. Benchmark 44 is defined as a core depletion benchmark problem for specified IRIS reactor operating conditions (e.g., temperatures, moderator density) without feedback. Enriched boron, inhomogeneously distributed in the axial direction, is used as an integral fuel burnable absorber (IFBA). The aim of this benchmark was to enable a more direct comparison of results of different code systems. Calculations of Benchmark 44 were performed using the modified CORD-2 code package. The CORD-2 code package consists of the WIMSD and GNOMER codes. WIMSD is a well-known lattice spectrum calculation code. GNOMER solves the neutron diffusion equation in three-dimensional Cartesian geometry by the Green's function nodal method. The following parameters were obtained in the Benchmark 44 analysis: effective multiplication factor as a function of burnup, nuclear peaking factor as a function of burnup, axial offset as a function of burnup, core-average axial power profile, core radial power profile, axial power profile for selected
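Two of the Benchmark 44 quantities are simple functionals of the computed power distribution. As a hedged illustration (the definitions below are the common ones, not taken from the benchmark specification, and the profile values are invented), the axial offset and nuclear peaking factor can be evaluated from a discretized power profile:

```python
def axial_offset(power):
    """Axial offset from a nodal axial power profile (bottom to top):
    AO = (P_top - P_bottom) / (P_top + P_bottom) over the two core halves."""
    n = len(power) // 2
    p_bot = sum(power[:n])
    p_top = sum(power[-n:])
    return (p_top - p_bot) / (p_top + p_bot)

def peaking_factor(power):
    """Nuclear peaking factor: maximum nodal power over core-average power."""
    return max(power) / (sum(power) / len(power))

profile = [0.6, 0.9, 1.2, 1.3, 1.1, 0.9]  # hypothetical normalized profile
print(round(axial_offset(profile), 4), round(peaking_factor(profile), 4))
# prints 0.1 1.3
```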
Full sphere hydrodynamic and dynamo benchmarks
Marti, P.
2014-01-26
Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.
Interlaboratory computational comparisons of critical fast test reactor pin lattices
Mincey, J.F.; Kerr, H.T.; Durst, B.M.
1979-01-01
An objective of the Consolidated Fuel Reprocessing Program's (CFRP) nuclear engineering group at Oak Ridge National Laboratory (ORNL) is to ensure that chemical equipment components designed for the reprocessing of spent LMFBR fuel (among other fuel types) are safe from a criticality standpoint. As existing data are inadequate for the general validation of computational models describing mixed plutonium-uranium oxide systems with isotopic compositions typical of LMFBR fuel, a program of critical experiments has been initiated at the Battelle Pacific Northwest Laboratories (PNL). The first series of benchmark experiments consisted of five square-pitched lattices of unirradiated Fast Test Reactor (FTR) fuel moderated and reflected by light water. Calculations of these five experiments have been conducted by both ORNL/CFRP and PNL personnel with the purpose of exploring how accurately various computational models will predict k_eff values for such neutronic systems and whether differences between k_eff values obtained with these different models are significant
Nuclear lattice simulations using symmetry-sign extrapolation
Laehde, Timo A.; Luu, Thomas [Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Lee, Dean [North Carolina State University, Department of Physics, Raleigh, NC (United States); Meissner, Ulf G. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik and Bethe Center for Theoretical Physics, Bonn (Germany); Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Forschungszentrum Juelich, JARA - High Performance Computing, Juelich (Germany); Epelbaum, Evgeny; Krebs, Hermann [Ruhr-Universitaet Bochum, Institut fuer Theoretische Physik II, Bochum (Germany); Rupak, Gautam [Mississippi State University, Department of Physics and Astronomy, Mississippi State, MS (United States)
2015-07-15
Projection Monte Carlo calculations of lattice Chiral Effective Field Theory suffer from sign oscillations to a varying degree dependent on the number of protons and neutrons. Hence, such studies have hitherto been concentrated on nuclei with equal numbers of protons and neutrons, and especially on the alpha nuclei where the sign oscillations are smallest. Here, we introduce the "symmetry-sign extrapolation" method, which allows us to use the approximate Wigner SU(4) symmetry of the nuclear interaction to systematically extend the Projection Monte Carlo calculations to nuclear systems where the sign problem is severe. We benchmark this method by calculating the ground-state energies of the ¹²C, ⁶He and ⁶Be nuclei, and discuss its potential for studies of neutron-rich halo nuclei and asymmetric nuclear matter. (orig.)
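The extrapolation step can be caricatured as fitting the observable along an interpolation parameter where the sign problem is mild and evaluating the fit at the physical point. The sketch below uses invented energy values and a plain polynomial fit; the actual method constrains the extrapolation far more carefully, so all names and numbers here are illustrative assumptions:

```python
import numpy as np

# Hypothetical ground-state energies E(d_h) computed at interpolation
# parameters d_h where the sign oscillations are small; the physical
# point is d_h = 1 (values are invented, in MeV).
d_h = np.array([0.2, 0.4, 0.6, 0.7, 0.8])
E = np.array([-92.1, -91.4, -90.5, -90.0, -89.4])

# Fit a low-order polynomial in d_h and extrapolate to the physical point,
# mirroring the spirit (not the detail) of symmetry-sign extrapolation.
coeffs = np.polyfit(d_h, E, deg=2)
E_phys = np.polyval(coeffs, 1.0)
print(f"extrapolated E(d_h=1) = {E_phys:.2f} MeV")
```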
Computers for lattice field theories
Iwasaki, Y.
1994-01-01
Parallel computers dedicated to lattice field theories are reviewed with emphasis on the three recent projects, the Teraflops project in the US, the CP-PACS project in Japan and the 0.5-Teraflops project in the US. Some new commercial parallel computers are also discussed. Recent development of semiconductor technologies is briefly surveyed in relation to possible approaches toward Teraflops computers. (orig.)
Synthesis of spatially variant lattices.
Rumpf, Raymond C; Pazos, Javier
2012-07-02
It is often desired to functionally grade and/or spatially vary a periodic structure like a photonic crystal or metamaterial, yet no general method for doing this has been offered in the literature. A straightforward procedure is described here that allows many properties of the lattice to be spatially varied at the same time while producing a final lattice that is still smooth and continuous. Properties include unit cell orientation, lattice spacing, fill fraction, and more. This adds many degrees of freedom to a design such as spatially varying the orientation to exploit directional phenomena. The method is not a coordinate transformation technique so it can more easily produce complicated and arbitrary spatial variance. To demonstrate, the algorithm is used to synthesize a spatially variant self-collimating photonic crystal to flow a Gaussian beam around a 90° bend. The performance of the structure was confirmed through simulation and it showed virtually no scattering around the bend that would have arisen if the lattice had defects or discontinuities.
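A minimal sketch of the core synthesis step, under the assumption (consistent with the description above, though not the paper's actual code) that each spatial harmonic of the lattice is built by finding a phase function whose gradient matches a spatially varying grating vector in the least-squares sense; the grid size and the rotating grating vector are illustrative:

```python
import numpy as np

# Given a spatially varying grating vector K(x, y), find a phase phi with
# grad(phi) ~= K (least squares), then form the smooth grating cos(phi).
n, L = 24, 10.0
x = np.linspace(0, L, n)
h = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")

# Grating vector whose orientation rotates from 0 to 90 degrees across the domain.
theta = (np.pi / 2) * X / L
Kx, Ky = 2 * np.cos(theta), 2 * np.sin(theta)

# Forward-difference gradient operators assembled as (dense) matrices.
N = n * n
Dx = np.zeros((N, N)); Dy = np.zeros((N, N))
idx = lambda i, j: i * n + j
for i in range(n):
    for j in range(n):
        k = idx(i, j)
        if i + 1 < n: Dx[k, idx(i + 1, j)] += 1 / h; Dx[k, k] -= 1 / h
        if j + 1 < n: Dy[k, idx(i, j + 1)] += 1 / h; Dy[k, k] -= 1 / h

# Least-squares solve of grad(phi) = K; the result is smooth even though
# the rotating K field is not exactly integrable.
A = np.vstack([Dx, Dy])
b = np.concatenate([Kx.ravel(), Ky.ravel()])
phi, *_ = np.linalg.lstsq(A, b, rcond=None)
lattice = np.cos(phi.reshape(n, n))  # continuous spatially variant grating
print(lattice.shape)
```

Summing several such harmonics with different K fields reproduces a full lattice whose orientation, spacing, and fill fraction vary in space while staying free of defects.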
From lattice gases to polymers
Frenkel, D.
1990-01-01
The modification of a technique that was developed to study time correlations in lattice-gas cellular automata to facilitate the numerical simulation of chain molecules is described. As an example, the calculation of the excess chemical potential of an ideal polymer in a dense colloidal
Flavor extrapolation in lattice QCD
Duffy, W.C.
1984-01-01
Explicit calculation of the effect of virtual quark-antiquark pairs in lattice QCD has eluded researchers. To include their effect explicitly one must calculate the determinant of the fermion-fermion coupling matrix. Owing to the large number of sites in a continuum limit size lattice, direct evaluation of this term requires an unrealistic amount of computer time. The effect of the virtual pairs can be approximated by ignoring this term and adjusting lattice couplings to reproduce experimental results. This procedure is called the valence approximation since it ignores all but the minimal number of quarks needed to describe hadrons. In this work the effect of the quark-antiquark pairs has been incorporated in a theory with an effective negative number of quark flavors contributing to the closed loops. Various particle masses and decay constants have been calculated for this theory and for one with no virtual pairs. The author attempts to extrapolate results towards positive numbers of quark flavors. The results show approximate agreement with experimental measurements and demonstrate the smoothness of lattice expectations in the number of quark flavors
Nuclear physics on the lattice?
Koonin, S.E.
1985-01-01
The goal of the paper is to adapt lattice gauge theory, building in some biases, so that it becomes applicable to nuclear physics. In so doing the calculations are made more precise, and the author can address questions like the size of the nucleon, the nucleon-nucleon potential, the modifications of the nucleon in the nuclear medium, etc. (Auth.)
Differential geometry of group lattices
Dimakis, Aristophanes; Mueller-Hoissen, Folkert
2003-01-01
In a series of publications we developed "differential geometry" on discrete sets based on concepts of noncommutative geometry. In particular, it turned out that first-order differential calculi (over the algebra of functions) on a discrete set are in bijective correspondence with digraph structures where the vertices are given by the elements of the set. A particular class of digraphs are Cayley graphs, also known as group lattices. They are determined by a discrete group G and a finite subset S. There is a distinguished subclass of "bicovariant" Cayley graphs with the property ad(S)S ⊆ S. We explore the properties of differential calculi which arise from Cayley graphs via the above correspondence. The first-order calculi extend to higher orders and then allow us to introduce further differential geometric structures. Furthermore, we explore the properties of "discrete" vector fields which describe deterministic flows on group lattices. A Lie derivative with respect to a discrete vector field and an inner product with forms is defined. The Lie-Cartan identity then holds on all forms for a certain subclass of discrete vector fields. We develop elements of gauge theory and construct an analog of the lattice gauge theory (Yang-Mills) action on an arbitrary group lattice. Also linear connections are considered and a simple geometric interpretation of the torsion is established. By taking a quotient with respect to some subgroup of the discrete group, generalized differential calculi associated with so-called Schreier diagrams are obtained
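The construction of a group lattice from a pair (G, S), and the bicovariance condition ad(S)S ⊆ S, can be made concrete with a small sketch (helper names are illustrative; any finite group given by a multiplication rule works the same way):

```python
def cayley_graph(elements, mult, S):
    """Group lattice: directed edge g -> g*s for every g in G and s in S."""
    return {g: [mult(g, s) for s in S] for g in elements}

def is_bicovariant(mult, inv, S):
    """Bicovariance condition ad(S)S subset of S: s1*s2*s1^{-1} in S."""
    return all(mult(mult(s1, s2), inv(s1)) in S for s1 in S for s2 in S)

# Example: the cyclic group Z_6 with generating set S = (1, 5).
n = 6
elements = range(n)
mult = lambda a, b: (a + b) % n
inv = lambda a: (-a) % n
S = (1, 5)

graph = cayley_graph(elements, mult, S)
print(graph[0])                       # successors of the identity: [1, 5]
print(is_bicovariant(mult, inv, S))   # abelian groups are always bicovariant
```

For a non-abelian group (e.g. S_3 with a set of transpositions) the same check distinguishes bicovariant generating sets from generic ones.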
Lattice dynamics of lithium oxide
Abstract. Li2O finds several important technological applications, as it is used in solid-state batteries, can be used as a blanket breeding material in nuclear fusion reactors, etc. Li2O exhibits a fast-ion phase, characterized by a thermally induced dynamic disorder in the cationic sub-lattice of Li+, at elevated temperatures ...
Lattice fields and strong interactions
Creutz, M.
1989-06-01
I review the lattice formulation of gauge theories and the use of numerical methods to investigate nonperturbative phenomena. These methods are directly applicable to studying hadronic matter at high temperatures. Considerable recent progress has been made in numerical algorithms for including dynamical fermions in such calculations. Dealing with a nonvanishing baryon density adds new unsolved challenges. 33 refs
Borgs, C.; Chayes, J.T.; Hofstad, van der R.W.; Slade, G.
1999-01-01
We introduce a mean-field model of lattice trees based on embeddings into Z^d of abstract trees having a critical Poisson offspring distribution. This model provides a combinatorial interpretation for the self-consistent mean-field model introduced previously by Derbez and Slade [9], and provides an
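A critical Poisson offspring distribution, as used in this mean-field model, is easy to sample; the sketch below (illustrative, not the authors' construction) generates one such abstract tree as a parent array:

```python
import math
import random

def poisson_galton_watson_tree(mean=1.0, max_nodes=10000, rng=random):
    """Sample a Galton-Watson tree with Poisson(mean) offspring numbers.

    mean = 1.0 is the critical case used in the mean-field lattice-tree
    model. The tree is returned as a parent array (parent[0] == -1 for the
    root); generation is truncated at max_nodes, since a critical tree is
    finite almost surely but has no uniform size bound.
    """
    def poisson(lam):
        # Knuth's multiplication method; adequate for small lam.
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            k += 1
            p *= rng.random()
            if p <= limit:
                return k - 1

    parent = [-1]
    stack = [0]
    while stack and len(parent) < max_nodes:
        v = stack.pop()
        for _ in range(poisson(mean)):
            parent.append(v)
            stack.append(len(parent) - 1)
    return parent

random.seed(3)
tree = poisson_galton_watson_tree()
print("sampled tree with", len(tree), "vertices")
```

Embedding such a tree into Z^d (mapping each vertex to a lattice site) is the further step the model takes; the sampler above only produces the abstract tree.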
Lattice quantum chromodynamics: Some topics
I will begin with a lightning-quick overview of the basic lattice gauge theory and then go on to discuss the Monte Carlo technique for evaluating C(t), or the expectation value of any other observable, in which a configuration {x} occurs with a probability proportional to its Boltzmann weight.
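The Monte Carlo technique referred to above can be illustrated with a generic Metropolis sampler: configurations x are visited with probability proportional to exp(-S(x)), and expectation values become sample averages. This toy one-dimensional example (a generic sketch, not lattice QCD itself) uses a Gaussian action, for which ⟨x²⟩ = 1 exactly:

```python
import math
import random

def metropolis_expectation(S, O, n_steps=20000, step=1.0, rng=random):
    """Estimate <O> over configurations x weighted by exp(-S(x))."""
    x = 0.0
    total, count = 0.0, 0
    for i in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        # Metropolis accept/reject with probability min(1, exp(-dS)).
        if rng.random() < math.exp(min(0.0, S(x) - S(x_new))):
            x = x_new
        if i >= n_steps // 5:      # discard early steps as thermalization
            total += O(x)
            count += 1
    return total / count

random.seed(1)
# Gaussian "action" S(x) = x^2 / 2, for which <x^2> = 1.
est = metropolis_expectation(lambda x: 0.5 * x * x, lambda x: x * x)
print(round(est, 2))
```

In lattice QCD the configuration is a whole gauge field and S is the lattice action, but the accept/reject structure is the same.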
Lattice continuum and diffusional creep.
Mesarovic, Sinisa Dj
2016-04-01
Diffusional creep is characterized by growth/disappearance of lattice planes at the crystal boundaries that serve as sources/sinks of vacancies, and by diffusion of vacancies. The lattice continuum theory developed here represents a natural and intuitive framework for the analysis of diffusion in crystals and lattice growth/loss at the boundaries. The formulation includes the definition of the Lagrangian reference configuration for the newly created lattice, the transport theorem and the definition of the creep rate tensor for a polycrystal as a piecewise uniform, discontinuous field. The values associated with each crystalline grain are related to the normal diffusional flux at grain boundaries. The governing equations for Nabarro-Herring creep are derived with coupled diffusion and elasticity with compositional eigenstrain. Both bulk diffusional dissipation and the boundary dissipation accompanying vacancy nucleation and absorption are considered, but the latter is found to be negligible. For periodic arrangements of grains, diffusion formally decouples from elasticity, but at the cost of a complicated boundary condition. The equilibrium of deviatorically stressed polycrystals is impossible without inclusion of interface energies. The secondary creep rate estimates correspond to the standard Nabarro-Herring model, and the volumetric creep is small. The initial (primary) creep rate is estimated to be much larger than the secondary creep rate.
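The secondary creep-rate estimate said to correspond to the standard Nabarro-Herring model has the familiar scaling (written here in common notation as an illustration; the paper's own symbols are not given in the abstract):

```latex
\dot{\varepsilon}_{\mathrm{NH}} \;\approx\; A\,\frac{D_v\,\sigma\,\Omega}{d^{2}\,k_B T}
```

where D_v is the lattice (vacancy) diffusivity, σ the applied stress, Ω the atomic volume, d the grain size, k_B T the thermal energy, and A a geometry-dependent constant; the d^{-2} dependence distinguishes Nabarro-Herring creep (bulk lattice diffusion) from Coble creep (grain-boundary diffusion, which scales as d^{-3}).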
Itzykson, C.
1983-10-01
We review the formulation of field theory and statistical mechanics on a Poissonian random lattice. Topics discussed include random geometry, the construction of field equations for arbitrary spin, the free field spectrum and the question of localization illustrated in the one dimensional case
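The one-dimensional case mentioned above is easy to make concrete: a 1D Poissonian random lattice is just a Poisson process of sites, with field operators discretized on the uneven spacings. A hedged sketch (the difference formula is the standard non-uniform second difference, not necessarily the construction used in the paper):

```python
import random

# Site positions from a Poisson process on [0, L]: exponential gaps.
random.seed(7)
L, density = 10.0, 5.0
pos, x = [], 0.0
while True:
    x += random.expovariate(density)
    if x >= L:
        break
    pos.append(x)

def laplacian(f, pos):
    """Second derivative on an unevenly spaced 1D lattice (interior sites)."""
    out = []
    for i in range(1, len(pos) - 1):
        hm, hp = pos[i] - pos[i - 1], pos[i + 1] - pos[i]
        out.append(2 * ((f[i + 1] - f[i]) / hp - (f[i] - f[i - 1]) / hm) / (hm + hp))
    return out

f = [p * p for p in pos]   # test field f(x) = x^2, whose Laplacian is 2
lap = laplacian(f, pos)
print(round(min(lap), 6), round(max(lap), 6))  # prints 2.0 2.0 (exact for quadratics)
```

The non-uniform stencil is exact on quadratics, which makes the x² field a convenient self-check for the random discretization.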
IRPHE/B and W-SS-LATTICE, Spectral Shift Reactor Lattice Experiments
2003-01-01
plutonium utilisation in commercial reactors. A fourth report concerns Critical Experiments Supporting Close Proximity Water Storage of Power Reactor Fuel. The fifth report concerns critical experiments supporting underwater storage of tightly packed configurations of spent fuel pins. The sixth report entitled 'Physics verification program', covers principally a series of experiments to measure the effect of lattice heterogeneities. The seventh report concerns Development and Demonstration of an Advanced Extended-Burnup Fuel- Assembly Design Incorporating Urania-Gadolinia. The eighth report concerns 'Urania-Gadolinia: Nuclear Model Development and Critical Experiment Benchmark'. The purpose was development and verification within an extended-burnup program for pressurised water reactors of advanced fuel assembly design. The ninth report concerns the Characterization and Irradiation Program: Extended-Burnup Gadolinia Lead Test Assembly (Mark GdB). The goal of the program was to extend the burnup of pressurized water reactor fuel assemblies to 50,000 MWd/mtU batch average burnup. The tenth report concerns the Hot Cell Examination of Gadolinia Lead Test Assembly Rods After One Cycle of Irradiation as described in the eighth and ninth report. The eleventh report covering April 1986 through March 1987, combines the progress report for the program entitled Development of an Advanced Extended Burnup Fuel Assembly Design Incorporating Urania-Gadolinia, and the final progress report for the program entitled Qualification of the B and W Mark B Fuel Assembly for High Burnup. The twelfth report describes five lead test assemblies designed, manufactured, characterized, and inserted for irradiation in Oconee Unit 1 cycle 8
Benchmarking – A tool for judgment or improvement?
Rasmussen, Grane Mikael Gregaard
2010-01-01
Change in construction is high on the agenda for the Danish government, and a comprehensive effort is being made to improve quality and efficiency. This has led to a governmental initiative to bring benchmarking into the Danish construction sector. This paper is an appraisal of benchmarking as it is presently carried out in the Danish construction sector. Many different perceptions of benchmarking, and of the nature of the construction sector, lead to uncertainty in how to perceive and use benchmarking, and hence to uncertainty in understanding its effects. This paper addresses two perceptions of benchmarking: public benchmarking and best-practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to examine which effects, possibilities and challenges follow in the wake of using this kind of benchmarking.
Disconnected Diagrams in Lattice QCD
Gambhir, Arjun Singh [College of William and Mary, Williamsburg, VA (United States)
2017-08-01
In this work, we present state-of-the-art numerical methods and their applications for computing a particular class of observables using lattice quantum chromodynamics (Lattice QCD), a discretized version of the fundamental theory of quarks and gluons. These observables require calculating so-called "disconnected diagrams" and are important for understanding many aspects of hadron structure, such as the strange content of the proton. We begin by introducing the reader to the key concepts of Lattice QCD and rigorously define the meaning of disconnected diagrams through an example of the Wick contractions of the nucleon. Subsequently, the calculation of observables requiring disconnected diagrams is posed as the computationally challenging problem of finding the trace of the inverse of an incredibly large, sparse matrix. This is followed by a brief primer of numerical sparse matrix techniques that overviews broadly used methods in Lattice QCD and builds the background for the novel algorithm presented in this work. We then introduce singular value deflation as a method to improve convergence of trace estimation and analyze its effects on matrices from a variety of fields, including chemical transport modeling, magnetohydrodynamics, and QCD. Finally, we apply this method to compute observables such as the strange axial charge of the proton and strange sigma terms in light nuclei. The work in this thesis is innovative for four reasons. First, we analyze the effects of deflation with a model that makes qualitative predictions about its effectiveness, taking only the singular value spectrum as input, and compare deflated variance with different types of trace estimator noise. Second, the synergy between probing methods and deflation is investigated both experimentally and theoretically. Third, we use the synergistic combination of deflation and a graph coloring algorithm known as hierarchical probing to conduct a lattice calculation of light disconnected matrix elements
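The central computational problem described above, estimating tr(A⁻¹) for a large matrix, and the variance reduction from singular value deflation, can be sketched on a small dense matrix (a toy stand-in for the Dirac operator; the real calculation uses iterative solvers and only approximate low-mode information):

```python
import numpy as np

def hutchinson_trace_inv(A, n_samples=200, seed=0):
    """Plain Hutchinson estimate of tr(A^{-1}) with Z2 noise vectors."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    acc = 0.0
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=n)
        acc += z @ np.linalg.solve(A, z)
    return acc / n_samples

def deflated_trace_inv(A, k, n_samples=200, seed=0):
    """Treat the k smallest singular triplets of A exactly, estimate the rest.

    Removing the largest contributions to A^{-1} (smallest singular values)
    from the noise estimator lowers its variance -- the idea behind the
    singular value deflation discussed in the text.
    """
    rng = np.random.default_rng(seed)
    U, s, Vt = np.linalg.svd(A)          # s sorted in descending order
    low = range(len(s) - k, len(s))      # indices of the smallest singular values
    exact = sum((U[:, i] @ Vt[i]) / s[i] for i in low)
    n = A.shape[0]
    acc = 0.0
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=n)
        x = np.linalg.solve(A, z)
        for i in low:                    # deflate: subtract v_i (u_i . z) / s_i
            x -= (U[:, i] @ z) / s[i] * Vt[i]
        acc += z @ x
    return exact + acc / n_samples

# Small well-conditioned SPD stand-in for the (huge, sparse) lattice matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((40, 40))
A = M @ M.T + 40 * np.eye(40)
exact = np.trace(np.linalg.inv(A))
print(round(exact, 3), round(hutchinson_trace_inv(A), 3),
      round(deflated_trace_inv(A, k=5), 3))
```

Hierarchical probing, mentioned in the abstract, attacks the same variance from a complementary direction by decorrelating nearby lattice sites in the noise vectors.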
The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example
Steyn, H. J.
2015-01-01
Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…
Criteria of benchmark selection for efficient flexible multibody system formalisms
Valášek M.
2007-10-01
The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized. The purposes of benchmark selection are investigated. The result of this analysis is the formulation of criteria of benchmark selection for flexible multibody formalisms. Based on these, an initial set of suitable benchmarks is described. In addition, the evaluation measures are revised and extended.
Cluster computing for lattice QCD simulations
Coddington, P.D.; Williams, A.G.
2000-01-01
The main application is lattice QCD calculations. We have a number of programs for generating and analysing lattice QCD configurations. These programs are written in a data parallel style using Fortran 90 array syntax. Initially they were run on the CM-5 by using CM Fortran compiler directives for specifying data distribution among the processors of the parallel machine. It was a simple task to convert these codes to use the equivalent directives for High Performance Fortran (HPF), which is a standard, portable data parallel language that can be used on clusters. We have used the Portland Group HPF compiler (PGHPF), which offers good support for cluster computing. We benchmarked our codes on a number of different types of machine before eventually deciding to purchase a large cluster from Sun Microsystems, which was installed at Adelaide University in June 2000. With a peak performance of 144 GFlops, it is currently the fastest computer in Australia. The machine is a new product from Sun, known as a Sun Technical Compute Farm (TCF). It consists of a cluster of Sun E420R workstations, each of which has four 450 MHz UltraSPARC II processors, with a peak speed of 3.6 GFlops per workstation. The NCFLGT cluster consists of 40 E420R workstations, giving a total of 160 processors, 160 GBytes of memory, 640 MBytes of cache memory, and 720 GBytes of disk. The standard Sun TCF product comes with either Fast or Gigabit Ethernet, with an option for a very high-bandwidth, low-latency SCI network targeted at parallel computing applications. For most parallel lattice QCD codes, Ethernet does not offer a low enough communications latency, while SCI is very expensive and is overkill for our applications. We therefore decided upon a third-party solution for the network, and will soon be installing a high-speed Myrinet 2000 network. Currently we only have very preliminary performance results for our lattice QCD codes, which look quite promising. We will present detailed performance
Toxicological benchmarks for wildlife: 1994 Revision
Opresko, D.M.; Sample, B.E.; Suter, G.W. II.
1994-09-01
The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal exposure are not considered in this report
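The screening tier described above reduces to comparing an exposure estimate against its toxicological benchmark, i.e. forming a hazard quotient. A minimal sketch with invented numbers (the report's actual benchmarks are species- and chemical-specific, and only oral ingestion is considered):

```python
def hazard_quotient(dose, benchmark):
    """Screening-tier hazard quotient: estimated exposure over the benchmark."""
    return dose / benchmark

# Hypothetical oral-ingestion dose estimates vs. benchmark values
# (mg/kg/day) for one receptor; all numbers are invented for illustration.
screening = {
    "cadmium": (0.08, 0.10),
    "mercury": (0.40, 0.10),
    "zinc":    (1.20, 10.0),
}
flagged = sorted(name for name, (dose, bench) in screening.items()
                 if hazard_quotient(dose, bench) >= 1.0)
print(flagged)  # contaminants retained for the baseline assessment tier
```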
Validation of VHTRC calculation benchmark of critical experiment using the MCB code
Stanisz Przemysław
2016-01-01
The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which allows the accuracy of neutron transport calculations to be improved and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of the available nuclear data (ENDF-B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The keff discrepancies show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we propose appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
Statistical hydrodynamics of lattice-gas automata
Grosfils, Patrick; Boon, Jean-Pierre; Brito López, Ricardo; Ernst, M. H.
1993-01-01
We investigate the space and time behavior of spontaneous thermohydrodynamic fluctuations in a simple fluid modeled by a lattice-gas automaton and develop the statistical-mechanical theory of thermal lattice gases to compute the dynamical structure factor, i.e., the power spectrum of the density correlation function. A comparative analysis of the theoretical predictions with our lattice gas simulations is presented. The main results are (i) the spectral function of the lattice-gas fluctuation...
A highly simplified 3D BWR benchmark problem
Douglass, Steven; Rahnema, Farzad
2010-01-01
The resurgent interest in reactor development associated with the nuclear renaissance has paralleled significant advancements in computer technology, and allowed for unprecedented computational power to be applied to the numerical solution of neutron transport problems. The current generation of core-level solvers relies on a variety of approximate methods (e.g. nodal diffusion theory, spatial homogenization) to efficiently solve reactor problems with limited computer power; however, in recent years, the increased availability of high-performance computer systems has created an interest in the development of new methods and codes (deterministic and Monte Carlo) to directly solve whole-core reactor problems with full heterogeneity (lattice and core level). This paper presents the development of a highly simplified heterogeneous 3D benchmark problem with physics characteristic of boiling water reactors. The aim of this work is to provide a problem for developers to use to validate new whole-core methods and codes which take advantage of the advanced computational capabilities that are now available. Additionally, eigenvalues and an overview of the pin fission density distribution are provided for the benefit of the reader. (author)
Lattice QCD. A critical status report
Jansen, Karl
2008-10-15
The substantial progress that has been achieved in lattice QCD in the last years is pointed out. I compare the simulation cost and systematic effects of several lattice QCD formulations and discuss a number of topics such as lattice spacing scaling, applications of chiral perturbation theory, non-perturbative renormalization and finite volume effects. Additionally, the importance of demonstrating universality is emphasized. (orig.)
Gauge theories on a small lattice
Robson, D.; Webber, D.M.
1980-01-01
We present exact solutions to U(1), SU(2), and SU(3) lattice gauge theories on a Kogut-Susskind lattice consisting of a single plaquette. We demonstrate precise equivalence between the U(1) theory and the harmonic oscillator on an infinite one-dimensional lattice, and between the SU(N) theory and an N-fermion Schroedinger equation. (orig.)
Spatiotemporal complexity in coupled map lattices
Kaneko, Kunihiko
1986-01-01
Some spatiotemporal patterns of coupled map lattices are presented. The chaotic kink-like motions are shown for the phase motion of coupled circle lattices. An extension of the coupled map lattice approach to Hamiltonian dynamics is briefly reported. An attempt to characterize the high-dimensional attractor by an extension of the correlation dimension is discussed. (author)
Clar sextets in square graphene antidot lattices
Petersen, Rene; Pedersen, Thomas Garm; Jauho, Antti-Pekka
2011-01-01
A periodic array of holes transforms graphene from a semimetal into a semiconductor with a band gap tuneable by varying the parameters of the lattice. In earlier work only hexagonal lattices have been treated. Using atomistic models we here investigate the size of the band gap of a square lattice...
Spatial classification with fuzzy lattice reasoning
Mavridis, Constantinos; Athanasiadis, I.N.
2017-01-01
This work extends the Fuzzy Lattice Reasoning (FLR) Classifier to manage spatial attributes, and spatial relationships. Specifically, we concentrate on spatial entities, as countries, cities, or states. Lattice Theory requires the elements of a Lattice to be partially ordered. To match such
Benchmark calculations of power distribution within assemblies
Cavarec, C.; Perron, J.F.; Verwaerde, D.; West, J.P.
1994-09-01
The main objective of this Benchmark is to compare different techniques for fine flux prediction based upon coarse mesh diffusion or transport calculations. We proposed 5 ''core'' configurations including different assembly types (17 x 17 pins, ''uranium'', ''absorber'' or ''MOX'' assemblies), with different boundary conditions. The specification required results in terms of reactivity, pin by pin fluxes and production rate distributions. The proposal for these Benchmark calculations was made by J.C. LEFEBVRE, J. MONDOT, J.P. WEST and the specification (with nuclear data, assembly types, core configurations for 2D geometry and results presentation) was distributed to correspondents of the OECD Nuclear Energy Agency. 11 countries and 19 companies answered the exercise proposed by this Benchmark. Heterogeneous calculations and homogeneous calculations were made. Various methods were used to produce the results: diffusion (finite differences, nodal...), transport (Pij, Sn, Monte Carlo). This report presents an analysis and intercomparisons of all the results received
ZZ WPPR, Pu Recycling Benchmark Results
Lutz, D.; Mattes, M.; Delpech, Marc; Juanola, Marc
2002-01-01
Description of program or function: The NEA NSC Working Party on Physics of Plutonium Recycling has commissioned a series of benchmarks covering: - Plutonium recycling in pressurized-water reactors; - Void reactivity effect in pressurized-water reactors; - Fast Plutonium-burner reactors: beginning of life; - Plutonium recycling in fast reactors; - Multiple recycling in advanced pressurized-water reactors. The results have been published (see references). ZZ-WPPR-1-A/B contains graphs and tables relative to the PWR MOX pin cell benchmark, representing typical fuel for plutonium recycling, one corresponding to a first cycle, the second to a fifth cycle. These computer readable files contain the complete set of results, while the printed report contains only a subset. ZZ-WPPR-2-CYC1 are the results from cycle 1 of the multiple recycling benchmarks
Interior beam searchlight semi-analytical benchmark
Ganapol, Barry D.; Kornreich, Drew E.
2008-01-01
Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)
The national hydrologic bench-mark network
Cobb, Ernest D.; Biesecker, J.E.
1971-01-01
The United States is undergoing a dramatic growth of population and demands on its natural resources. The effects are widespread and often produce significant alterations of the environment. The hydrologic bench-mark network was established to provide data on stream basins which are little affected by these changes. The network is made up of selected stream basins which are not expected to be significantly altered by man. Data obtained from these basins can be used to document natural changes in hydrologic characteristics with time, to provide a better understanding of the hydrologic structure of natural basins, and to provide a comparative base for studying the effects of man on the hydrologic environment. There are 57 bench-mark basins in 37 States. These basins are in areas having a wide variety of climate and topography. The bench-mark basins and the types of data collected in the basins are described.
Confidential benchmarking based on multiparty computation
Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt
We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential, the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype help Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...... debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping...
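The underlying benchmarking idea (scoring each unit against the best performers in a reference set) can be sketched without the multiparty-computation layer or the linear-programming model of the paper. The function below is a deliberately simplified ratio-based relative efficiency score; the names and data are illustrative assumptions, not the SPDZ-based system described in the abstract.

```python
# Simplified sketch of relative-efficiency benchmarking: each unit's
# output/input ratio is scored against the best ratio in the set.
# This ignores confidentiality (MPC) and the LP model of the paper.

def relative_efficiency(units):
    """units: list of (output, input) pairs; returns scores in (0, 1],
    where 1 marks the most efficient unit in the reference set."""
    ratios = [out / inp for out, inp in units]
    best = max(ratios)
    return [r / best for r in ratios]
```

In the paper's setting, the analogous computation runs on secret-shared data so that no party ever sees another's inputs.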
Benchmark referencing of neutron dosimetry measurements
Eisenhauer, C.M.; Grundl, J.A.; Gilliam, D.M.; McGarry, E.D.; Spiegel, V.
1980-01-01
The concept of benchmark referencing involves interpretation of dosimetry measurements in applied neutron fields in terms of similar measurements in benchmark fields whose neutron spectra and intensity are well known. The main advantage of benchmark referencing is that it minimizes or eliminates many types of experimental uncertainties such as those associated with absolute detection efficiencies and cross sections. In this paper we consider the cavity external to the pressure vessel of a power reactor as an example of an applied field. The pressure vessel cavity is an accessible location for exploratory dosimetry measurements aimed at understanding embrittlement of pressure vessel steel. Comparisons with calculated predictions of neutron fluence and spectra in the cavity provide a valuable check of the computational methods used to estimate pressure vessel safety margins for pressure vessel lifetimes
MIPS bacterial genomes functional annotation benchmark dataset.
Tetko, Igor V; Brauner, Barbara; Dunger-Kaltenbach, Irmtraud; Frishman, Goar; Montrone, Corinna; Fobo, Gisela; Ruepp, Andreas; Antonov, Alexey V; Surmeli, Dimitrij; Mewes, Hans-Werner
2005-05-15
Any development of new methods for automatic functional annotation of proteins according to their sequences requires high-quality data (as benchmark) as well as tedious preparatory work to generate sequence parameters required as input data for the machine learning methods. Different program settings and incompatible protocols make a comparison of the analyzed methods difficult. The MIPS Bacterial Functional Annotation Benchmark dataset (MIPS-BFAB) is a new, high-quality resource comprising four bacterial genomes manually annotated according to the MIPS functional catalogue (FunCat). These resources include precalculated sequence parameters, such as sequence similarity scores, InterPro domain composition and other parameters that could be used to develop and benchmark methods for functional annotation of bacterial protein sequences. These data are provided in XML format and can be used by scientists who are not necessarily experts in genome annotation. BFAB is available at http://mips.gsf.de/proj/bfab
Energy benchmarking of South Australian WWTPs.
Krampe, J
2013-01-01
Optimising the energy consumption and energy generation of wastewater treatment plants (WWTPs) is a topic with increasing importance for water utilities in times of rising energy costs and pressures to reduce greenhouse gas (GHG) emissions. Assessing the energy efficiency and energy optimisation of a WWTP are difficult tasks as most plants vary greatly in size, process layout and other influencing factors. To overcome these limits it is necessary to compare energy efficiency with a statistically relevant base to identify shortfalls and optimisation potential. Such energy benchmarks have been successfully developed and used in central Europe over the last two decades. This paper demonstrates how the latest available energy benchmarks from Germany have been applied to 24 WWTPs in South Australia. It shows how energy benchmarking can be used to identify shortfalls in current performance, prioritise detailed energy assessments and help inform decisions on capital investment.
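Applying such an energy benchmark reduces, at its core, to comparing a plant's specific energy consumption (kWh per population equivalent per year) against a target value. The sketch below illustrates this comparison; the default target figure is a placeholder assumption, not one of the German benchmark values used in the paper.

```python
# Illustrative sketch of applying a specific-energy benchmark to a WWTP.
# The default target (35 kWh per population equivalent per year) is a
# placeholder, not an actual German benchmark figure.

def energy_performance(kwh_per_year, population_equivalent, target_kwh_per_pe=35.0):
    """Return (specific energy in kWh/PE/a, ratio to the benchmark target).
    A ratio above 1 indicates the plant uses more energy than the target."""
    specific = kwh_per_year / population_equivalent
    return specific, specific / target_kwh_per_pe
```

A utility would compute this ratio for each plant to prioritise detailed energy assessments, as the paper does for the 24 South Australian WWTPs.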
Benchmarking criticality safety calculations with subcritical experiments
Mihalczo, J.T.
1984-06-01
Calculation of the neutron multiplication factor at delayed criticality may be necessary for benchmarking calculations but it may not be sufficient. The use of subcritical experiments to benchmark criticality safety calculations could result in substantial savings in fuel material costs for experiments. In some cases subcritical configurations could be used to benchmark calculations where sufficient fuel to achieve delayed criticality is not available. By performing a variety of measurements with subcritical configurations, much detailed information can be obtained which can be compared directly with calculations. This paper discusses several measurements that can be performed with subcritical assemblies and presents examples that include comparisons between calculation and experiment where possible. Where not, examples from critical experiments have been used but the measurement methods could also be used for subcritical experiments
A lattice Boltzmann model for solute transport in open channel flow
Wang, Hongda; Cater, John; Liu, Haifei; Ding, Xiangyi; Huang, Wei
2018-01-01
A lattice Boltzmann model of advection-dispersion problems in one-dimensional (1D) open channel flows is developed for simulation of solute transport and pollutant concentration. The hydrodynamics are calculated based on a previous lattice Boltzmann approach to solving the 1D Saint-Venant equations (LABSVE). The advection-dispersion model is coupled with the LABSVE using the lattice Boltzmann method. Our research recovers the advection-dispersion equations through the Chapman-Enskog expansion of the lattice Boltzmann equation. The model differs from the existing schemes in two points: (1) the lattice Boltzmann numerical method is adopted to solve the advection-dispersion problem by meso-scopic particle distribution; (2) and the model describes the relation between discharge, cross section area and solute concentration, which increases the applicability of the water quality model in practical engineering. The model is verified using three benchmark tests: (1) instantaneous solute transport within a short distance; (2) 1D point source pollution with constant velocity; (3) 1D point source pollution in a dam break flow. The model is then applied to a 50-year flood point source pollution accident on the Yongding River, which showed good agreement with a MIKE 11 solution and gauging data.
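The collision-and-streaming update at the heart of such a scheme can be sketched in a few lines. This is a generic 1D three-velocity (D1Q3) lattice Boltzmann advection-diffusion solver with standard textbook weights, not the coupled LABSVE model of the paper; the function name and parameters are illustrative.

```python
# Minimal D1Q3 lattice Boltzmann sketch for 1D advection-diffusion of a
# concentration field, with BGK collision and periodic streaming.
# In lattice units, the diffusion coefficient is D = (tau - 0.5) / 3.

def lbm_advect_diffuse(conc, u, tau, steps):
    """Evolve a 1D concentration profile with advection velocity u
    (lattice units) and relaxation time tau > 0.5."""
    w = [2.0 / 3.0, 1.0 / 6.0, 1.0 / 6.0]   # D1Q3 weights
    e = [0, 1, -1]                          # discrete lattice velocities
    n = len(conc)
    # initialise the particle populations at local equilibrium
    f = [[w[i] * conc[x] * (1.0 + 3.0 * e[i] * u) for x in range(n)]
         for i in range(3)]
    for _ in range(steps):
        # macroscopic concentration is the zeroth moment of f
        c = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
        # collision: BGK relaxation towards local equilibrium
        for i in range(3):
            for x in range(n):
                feq = w[i] * c[x] * (1.0 + 3.0 * e[i] * u)
                f[i][x] += (feq - f[i][x]) / tau
        # streaming with periodic boundaries
        f[1] = [f[1][(x - 1) % n] for x in range(n)]
        f[2] = [f[2][(x + 1) % n] for x in range(n)]
    return [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
```

The scheme conserves total solute mass exactly, which is one of the benchmark checks a solver of this kind must pass.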
Jeong, Chang-Joon; Okumura, Keisuke; Ishiguro, Yukio; Tanaka, Ken-ichi
1990-01-01
Validation tests were made for the accuracy of cell calculation methods used in analyses of tight lattices of a mixed-oxide (MOX) fuel core in a high conversion light water reactor (HCLWR). A series of cell calculations was carried out for the lattices referred from an international HCLWR benchmark comparison, with emphasis placed on the resonance calculation methods; the NR, IR approximations, the collision probability method with ultra-fine energy group. Verification was also performed for the geometrical modelling; a hexagonal/cylindrical cell, and the boundary condition; mirror/white reflection. In the calculations, important reactor physics parameters, such as the neutron multiplication factor, the conversion ratio and the void coefficient, were evaluated using the above methods for various HCLWR lattices with different moderator to fuel volume ratios, fuel materials and fissile plutonium enrichments. The calculated results were compared with each other, and the accuracy and applicability of each method were clarified by comparison with continuous energy Monte Carlo calculations. It was verified that the accuracy of the IR approximation became worse when the neutron spectrum became harder. It was also concluded that the cylindrical cell model with the white boundary condition was not so suitable for MOX fuelled lattices, as for UO 2 fuelled lattices. (author)
Inexpensive chirality on the lattice
Kamleh, W.; Williams, A.G.; Adams, D.
2000-01-01
Implementing lattice fermions that resemble continuum fermions as closely as possible is one of the main goals of the theoretical physics community. Aside from a lack of infinitely powerful computers, one of the main impediments to this is the Nielsen-Ninomiya no-go theorem for chirality on the lattice. One of the consequences of this theorem is that exact chiral symmetry and a lack of fermion doublers cannot be simultaneously satisfied for fermions on the lattice. In the commonly used Wilson fermion formulation, chiral symmetry is explicitly sacrificed on the lattice to avoid fermion doubling. Recently, an alternative has come forward, namely the Ginsparg-Wilson relation and one of its solutions, the Overlap fermion. The Ginsparg-Wilson relation is a statement of lattice-deformed chirality. The Overlap-Dirac operator is a member of the family of solutions of the Ginsparg-Wilson relation. In recent times, Overlap fermions have been of great interest to the community due to their excellent chiral properties. However, they are significantly more expensive to implement than Wilson fermions. This expense is primarily due to the fact that the Overlap implementation requires an evaluation of the sign function of the Wilson-Dirac operator. The sign function is approximated by a high-order rational polynomial function, but this approximation is poor close to the origin. The fewer near-zero modes the Wilson-Dirac operator possesses, the cheaper the Overlap operator will be to implement. A means of improving the eigenvalue properties of the Wilson-Dirac operator by the addition of a so-called 'Clover' term is put forward. Numerical results are given that demonstrate this improvement. The Nielsen-Ninomiya no-go theorem and chirality on the lattice are reviewed. The general form of solutions of the Ginsparg-Wilson relation is given, and the Overlap solution is discussed. Properties of the Overlap-Dirac operator are given, including locality and analytic
A Benchmarking System for Domestic Water Use
Dexter V. L. Hunt
2014-05-01
The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and including "external" gardening demands is investigated. This includes the impacts (in isolation and combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are made throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn for the robustness of the proposed system.
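A band-rating scheme of the kind described maps a household's per-capita consumption onto a discrete performance band. The sketch below illustrates the mechanism only; the band boundaries are invented placeholders, not the bands proposed in the paper.

```python
# Hypothetical band-rating benchmark for per-capita daily water use.
# The thresholds below are illustrative assumptions, not the paper's bands.

def water_use_band(litres_per_person_per_day):
    """Return a band label from 'A' (best) to 'F' (worst)."""
    bands = [(80, "A"), (100, "B"), (120, "C"), (140, "D"), (160, "E")]
    for upper, label in bands:
        if litres_per_person_per_day < upper:
            return label
    return "F"
```

Any measure that reduces consumption (efficient fittings, behaviour change, RWH or GW reuse) moves the household towards a higher band, which is the property the paper exploits.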
Toxicological benchmarks for wildlife: 1996 Revision
Sample, B.E.; Opresko, D.M.; Suter, G.W., II.
1996-06-01
The purpose of this report is to present toxicological benchmarks for assessment of effects of certain chemicals on mammalian and avian wildlife species. Publication of this document meets a milestone for the Environmental Restoration (ER) Risk Assessment Program. This document provides the ER Program with toxicological benchmarks that may be used as comparative tools in screening assessments as well as lines of evidence to support or refute the presence of ecological effects in ecological risk assessments. The chemicals considered in this report are some that occur at US DOE waste sites, and the wildlife species evaluated herein were chosen because they represent a range of body sizes and diets
Benchmarking af kommunernes førtidspensionspraksis
Gregersen, Ole
Every year, Den Sociale Ankestyrelse (the National Social Appeals Board) publishes statistics on decisions in disability pension cases. Along with the annual statistics, results are published from a benchmarking model in which the number of awards in each municipality is compared with the expected number of awards had the municipality applied the same decision practice as the "average municipality", correcting for the social structure of the municipality. The benchmarking model used so far is documented in Ole Gregersen (1994): Kommunernes Pensionspraksis, Servicerapport, Socialforskningsinstituttet. This note documents a...
Benchmark calculations for fusion blanket development
Sawan, M.E.; Cheng, E.T.
1985-01-01
Benchmark problems representing the leading fusion blanket concepts are presented. Benchmark calculations for self-cooled Li17Pb83 and helium-cooled blankets were performed. Multigroup data libraries generated from ENDF/B-IV and V files using the NJOY and AMPX processing codes with different weighting functions were used. The sensitivity of the TBR to group structure and weighting spectrum increases as the thickness and Li enrichment decrease, with up to 20% discrepancies for thin natural Li17Pb83 blankets
Benchmark calculations for fusion blanket development
Sawan, M.L.; Cheng, E.T.
1986-01-01
Benchmark problems representing the leading fusion blanket concepts are presented. Benchmark calculations for self-cooled Li 17 Pb 83 and helium-cooled blankets were performed. Multigroup data libraries generated from ENDF/B-IV and V files using the NJOY and AMPX processing codes with different weighting functions were used. The sensitivity of the tritium breeding ratio to group structure and weighting spectrum increases as the thickness and Li enrichment decrease with up to 20% discrepancies for thin natural Li 17 Pb 83 blankets. (author)
Reactor group constants and benchmark test
Takano, Hideki [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]
2001-08-01
The evaluated nuclear data files such as JENDL, ENDF/B-VI and JEF-2 are validated by analyzing critical mock-up experiments for various type reactors and assessing applicability for nuclear characteristics such as criticality, reaction rates, reactivities, etc. This is called Benchmark Testing. In the nuclear calculations, the diffusion and transport codes use the group constant library which is generated by processing the nuclear data files. In this paper, the calculation methods of the reactor group constants and benchmark test are described. Finally, a new group constants scheme is proposed. (author)
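The flux-weighted collapse at the heart of group-constant generation can be illustrated with a minimal sketch. The data and group boundaries below are illustrative placeholders, not values from JENDL, ENDF/B-VI or JEF-2, and the function is a simplification of what processing codes actually do (no self-shielding or spectrum iteration).

```python
# Minimal sketch of flux-weighted group collapsing: coarse-group cross
# sections preserve reaction rates under the assumed weighting flux.
# sigma_g = sum_i(sigma_i * phi_i) / sum_i(phi_i) over fine groups i in g.

def collapse(sigma_fine, flux_fine, groups):
    """Collapse fine-group cross sections to coarse groups.
    groups: list of (start, end) index ranges into the fine structure."""
    coarse = []
    for start, end in groups:
        rate = sum(sigma_fine[i] * flux_fine[i] for i in range(start, end))
        flux = sum(flux_fine[i] for i in range(start, end))
        coarse.append(rate / flux)
    return coarse
```

Benchmark testing then checks that diffusion or transport calculations built on such collapsed constants reproduce measured criticality, reaction rates and reactivities.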
Lattice Boltzmann methods for global linear instability analysis
Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis
2017-12-01
Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time and have been proposed previously in the literature as linearization of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flow and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point potential limitations particular to the LBM approach. The known issue of appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvements in order to make the proposed methodology competitive with established approaches for global instability analysis are discussed.
High order spectral difference lattice Boltzmann method for incompressible hydrodynamics
Li, Weidong
2017-09-01
This work presents a lattice Boltzmann equation (LBE) based high order spectral difference method for incompressible flows. In the present method, the spectral difference (SD) method is adopted to discretize the convection and collision terms of the LBE to obtain high order (≥3) accuracy. Because the SD scheme represents the solution as cell-local polynomials and the solution polynomials have a good tensor-product property, the present spectral difference lattice Boltzmann method (SD-LBM) can be implemented on arbitrary unstructured quadrilateral meshes for effective and efficient treatment of complex geometries. Because only first-order PDEs are involved in the LBE, no special techniques, such as the hybridizable discontinuous Galerkin (HDG) or local discontinuous Galerkin (LDG) methods, are needed to discretize the diffusion term, which simplifies the algorithm and implementation of the high order spectral difference method for simulating viscous flows. The proposed SD-LBM is validated with four incompressible flow benchmarks in two dimensions: (a) the Poiseuille flow driven by a constant body force; (b) the lid-driven cavity flow without singularity at the two top corners (Burggraf flow); (c) the unsteady Taylor-Green vortex flow; and (d) the Blasius boundary-layer flow past a flat plate. Computational results are compared with analytical solutions of these cases and convergence studies are also given. The designed accuracy of the proposed SD-LBM is clearly verified.
Development of a reference scheme for MOX lattice physics calculations
Finck, P.J.; Stenberg, C.G.; Roy, R.
1998-01-01
The US program to dispose of weapons-grade Pu could involve the irradiation of mixed-oxide (MOX) fuel assemblies in commercial light water reactors. This will require licensing acceptance because of the modifications to the core safety characteristics. In particular, core neutronics will be significantly modified, thus making it necessary to validate the standard suites of neutronics codes for that particular application. Validation criteria are still unclear, but it seems reasonable to expect that the same level of accuracy will be expected for MOX as that which has been achieved for UO 2 . Commercial lattice physics codes are invariably claimed to be accurate for MOX analysis but often lack independent confirmation of their performance on a representative experimental database. Argonne National Laboratory (ANL) has started implementing a public domain suite of codes to provide for a capability to perform independent assessments of MOX core analyses. The DRAGON lattice code was chosen, and fine group ENDF/B-VI.04 and JEF-2.2 libraries have been developed. The objective of this work is to validate the DRAGON algorithms with respect to continuous-energy Monte Carlo for a suite of realistic UO 2 -MOX benchmark cases, with the aim of establishing a reference DRAGON scheme with a demonstrated high level of accuracy and no computing resource constraints. Using this scheme as a reference, future work will be devoted to obtaining simpler and less costly schemes that preserve accuracy as much as possible
Chiral fermions on the lattice
Randjbar Daemi, S.; Strathdee, J.
1995-01-01
The overlap approach to chiral gauge theories on arbitrary D-dimensional lattices is studied. The doubling problem and its relation to chiral anomalies for D = 2 and 4 is examined. In each case it is shown that the doublers can be eliminated and the well known perturbative results for chiral anomalies can be recovered. We also consider the multi-flavour case and give the general criteria for the construction of anomaly free chiral gauge theories on arbitrary lattices. We calculate the second order terms in a continuum approximation to the overlap formula in D dimensions and show that they coincide with the bilinear part of the effective action of D-dimensional Weyl fermions coupled to a background gauge field. Finally, using the same formalism we reproduce the correct Lorentz, diffeomorphism and gauge anomalies in the coupling of a Weyl fermion to 2-dimensional gravitation and Maxwell fields. (author). 15 refs
Entropy favours open colloidal lattices
Mao, Xiaoming; Chen, Qian; Granick, Steve
2013-03-01
Burgeoning experimental and simulation activity seeks to understand the existence of self-assembled colloidal structures that are not close-packed. Here we describe an analytical theory based on lattice dynamics and supported by experiments that reveals the fundamental role entropy can play in stabilizing open lattices. The entropy we consider is associated with the rotational and vibrational modes unique to colloids interacting through extended attractive patches. The theory makes predictions of the implied temperature, pressure and patch-size dependence of the phase diagram of open and close-packed structures. More generally, it provides guidance for the conditions at which targeted patchy colloidal assemblies in two and three dimensions are stable, thus overcoming the difficulty in exploring by experiment or simulation the full range of conceivable parameters.
Electroweak interactions on the lattice
Kieu, T.D.
1994-07-01
It is shown that the lattice fermion doubling phenomenon is connected to the chiral anomaly which is unique to the electroweak interactions. The chiral anomaly is the breaking of chiral gauge symmetry at the quantum level due to the quantum fluctuations. Such breaking, however, is undesirable and to be avoided. The preservation of gauge symmetry imposes stringent constraints on acceptable chiral gauge theory. It is argued that the constraints are unnecessary because the conventional quantization of chiral gauge theory has missed out some crucial contributions of the chiral interactions. The corrected quantization yields a consistent theory in which there is no gauge anomaly and in which various mass terms can be introduced with neither the loss of gauge invariance nor the need for the Higgs mechanism. The new quantization also provides a solution to the difficulty of how to model the electroweak interactions on the lattice. 9 refs. 1 fig
Entanglement scaling in lattice systems
Audenaert, K M R [Institute for Mathematical Sciences, Imperial College London, 53 Prince's Gate, Exhibition Road, London SW7 2PG (United Kingdom); Cramer, M [QOLS, Blackett Laboratory, Imperial College London, Prince Consort Road, London SW7 2BW (United Kingdom); Eisert, J [Institute for Mathematical Sciences, Imperial College London, 53 Prince's Gate, Exhibition Road, London SW7 2PG (United Kingdom); Plenio, M B [Institute for Mathematical Sciences, Imperial College London, 53 Prince's Gate, Exhibition Road, London SW7 2PG (United Kingdom)
2007-05-15
We review some recent rigorous results on scaling laws of entanglement properties in quantum many body systems. More specifically, we study the entanglement of a region with its surrounding and determine its scaling behaviour with its size for systems in the ground and thermal states of bosonic and fermionic lattice systems. A theorem connecting entanglement between a region and the rest of the lattice with the surface area of the boundary between the two regions is presented for non-critical systems in arbitrary spatial dimensions. The entanglement scaling in the field limit exhibits a peculiar difference between fermionic and bosonic systems. In one-spatial dimension a logarithmic divergence is recovered for both bosonic and fermionic systems. In two spatial dimensions in the setting of half-spaces however we observe strict area scaling for bosonic systems and a multiplicative logarithmic correction to such an area scaling in fermionic systems. Similar questions may be posed and answered in classical systems.
Transitionless lattices for LAMPF II
Franczak, B.J.
1984-10-01
Some techniques are described for the design of synchrotron lattices that have zero dispersion in the straight sections and/or imaginary transition energy (negative momentum-compaction factor) but no excessive amplitudes of the dispersion function. Included as an application is a single-stage synchrotron, with variable optics, that has different ion-optical properties at injection and extraction but requires a complex way of programming the quadrupoles. In addition, a two-stage facility consisting of a 45-GeV synchrotron of 1100-m circumference and a 9-GeV booster of half that size is presented. As alternates to these separated-function lattices, some combined-function modules are given that can be used to construct a synchrotron with similar properties
Graphene antidot lattice transport measurements
Mackenzie, David; Cagliani, Alberto; Gammelgaard, Lene
2017-01-01
We investigate graphene devices patterned with a narrow band of holes perpendicular to the current flow, a few-row graphene antidot lattice (FR-GAL). Theoretical reports suggest that a FR-GAL can have a bandgap with a relatively small reduction of the transmission compared to what is typical for antidot array devices. Graphene devices were fabricated using 100 keV electron beam lithography (EBL) for nanopatterning as well as for defining electrical contacts. Patterns with hole diameters and neck widths of order 30 nm were produced, which is the highest reported pattern density of antidot lattices in graphene defined by EBL. Electrical measurements showed that devices with one and five rows exhibited field effect mobility of ∼100 cm2/Vs, while a larger number of rows, around 40, led to a significant reduction of field effect mobility.
Cellular automata in cytoskeletal lattices
Smith, S A; Watt, R C; Hameroff, S R
1984-01-01
Cellular automata (CA) activities could mediate biological regulation and information processing via nonlinear electrodynamic effects in cytoskeletal lattice arrays. Frohlich coherent oscillations and other nonlinear mechanisms may effect discrete 10^-10 to 10^-11 s interval events which result in dynamic patterns in biolattices such as cylindrical protein polymers: microtubules (MT). Structural geometry and electrostatic forces of MT subunit dipole oscillations suggest neighbor rules among the hexagonally packed protein subunits. Computer simulations using these suggested rules and MT structural geometry demonstrate CA activities including dynamical and stable self-organizing patterns, oscillators, and traveling gliders. CA activities in MT and other cytoskeletal lattices may have important biological regulatory functions. 23 references, 6 figures, 1 table.
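The behaviours listed above (self-organizing patterns, oscillators, travelling gliders) can be illustrated with the best-known two-dimensional CA, Conway's Game of Life. This is a generic square-lattice example, not the hexagonal microtubule neighbor rules the paper simulates:

```python
# Conway's Game of Life on an unbounded square lattice; `cells` is the
# set of live (x, y) sites. A dead cell with 3 live neighbours is born;
# a live cell with 2 or 3 live neighbours survives.
from collections import Counter

def life_step(cells):
    counts = Counter((x + dx, y + dy) for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in cells)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
# after 4 steps the glider reappears translated by (1, 1)
```

The glider is the canonical example of a travelling pattern in a lattice automaton: a local rule with no built-in notion of motion nonetheless propagates structure across the lattice.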
Innovations in lattice QCD algorithms
Orginos, Konstantinos
2006-01-01
Lattice QCD calculations demand a substantial amount of computing power in order to achieve the high precision results needed to better understand the nature of strong interactions, assist experiment to discover new physics, and predict the behavior of a diverse set of physical systems ranging from the proton itself to astrophysical objects such as neutron stars. However, computer power alone is clearly not enough to tackle the calculations we need to be doing today. A steady stream of recent algorithmic developments has made an important impact on the kinds of calculations we can currently perform. In this talk I am reviewing these algorithms and their impact on the nature of lattice QCD calculations performed today
Baryon structure from lattice QCD
Alexandrou, C.
2009-01-01
We present recent lattice results on the baryon spectrum, nucleon electromagnetic and axial form factors, nucleon to Δ transition form factors as well as the Δ electromagnetic form factors. The masses of the low lying baryons and the nucleon form factors are calculated using two degenerate flavors of twisted mass fermions down to pion mass of about 270 MeV. We compare to the results of other collaborations. The nucleon to Δ transition and Δ form factors are calculated in a hybrid scheme, which uses staggered sea quarks and domain wall valence quarks. The dominant magnetic dipole nucleon to Δ transition form factor is also evaluated using dynamical domain wall fermions. The transverse density distributions of the Δ in the infinite momentum frame are extracted using the form factors determined from lattice QCD. (author)
Multigrid for Staggered Lattice Fermions
Brower, Richard C. [Boston U.]; Clark, M. A. [Unlisted, US]; Strelchenko, Alexei [Fermilab]; Weinberg, Evan [Boston U.]
2018-01-23
Critical slowing down in Krylov methods for the Dirac operator presents a major obstacle to further advances in lattice field theory as it approaches the continuum solution. Here we formulate a multigrid algorithm for the Kogut-Susskind (or staggered) fermion discretization, which has proven difficult relative to Wilson multigrid due to its first-order anti-Hermitian structure. The solution is to introduce a novel spectral transformation by the Kähler-Dirac spin structure prior to the Galerkin projection. We present numerical results for the two-dimensional, two-flavor Schwinger model; however, the general formalism is agnostic to dimension and is directly applicable to four-dimensional lattice QCD.
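The shape of a multigrid cycle (pre-smooth, restrict the residual, solve a coarse-grid correction, prolongate it back, post-smooth) can be sketched in the simplest scalar setting, a 1D Poisson problem with Gauss-Seidel smoothing. This is a generic illustration only, not the staggered-fermion/Kähler-Dirac construction of the paper; grids of 2^k − 1 interior points are assumed.

```python
def smooth(u, f, sweeps=3):
    """Gauss-Seidel relaxation for -u'' = f on (0,1), u(0)=u(1)=0,
    with n interior points and h = 1/(n+1)."""
    n = len(u); h2 = 1.0 / (n + 1) ** 2
    for _ in range(sweeps):
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            u[i] = 0.5 * (left + right + h2 * f[i])
    return u

def residual(u, f):
    n = len(u); h2 = 1.0 / (n + 1) ** 2
    r = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r.append(f[i] - (2 * u[i] - left - right) / h2)
    return r

def vcycle(u, f):
    """Recursive V-cycle; len(u) must be of the form 2**k - 1."""
    n = len(u)
    if n <= 3:
        return smooth(u, f, sweeps=50)           # effectively exact solve
    smooth(u, f)                                 # pre-smoothing
    r = residual(u, f)
    rc = [0.25 * r[2*i] + 0.5 * r[2*i + 1] + 0.25 * r[2*i + 2]
          for i in range((n - 1) // 2)]          # full-weighting restriction
    ec = vcycle([0.0] * len(rc), rc)             # coarse-grid correction
    for i, e in enumerate(ec):                   # linear prolongation
        u[2*i + 1] += e
        u[2*i] += 0.5 * e
        u[2*i + 2] += 0.5 * e
    return smooth(u, f)                          # post-smoothing
```

The point of the hierarchy is that smooth error components, which stall plain relaxation (the critical slowing down of the abstract, in its simplest guise), become oscillatory on coarser grids and are eliminated cheaply there.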
Mitchell, L
1996-01-01
The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
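The arithmetic behind figures like these is simple but worth making explicit. The sketch below reproduces the quoted 18% improvement and the gap that remained to the national benchmark; the function and variable names are ours, not the study's.

```python
def summarize(baseline, achieved, benchmark):
    """Fraction of the baseline turnaround time eliminated, and the gap
    (in minutes) still remaining to the external benchmark."""
    realized = (baseline - achieved) / baseline
    return realized, achieved - benchmark

realized, gap = summarize(baseline=19.9, achieved=16.3, benchmark=13.5)
# realized ~ 0.18 (the 18% quoted); gap ~ 2.8 minutes above the benchmark
```

Keeping both numbers visible is useful: the percent improvement measures internal progress, while the remaining gap measures distance to best practice.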
Computing nucleon EDM on a lattice
Abramczyk, Michael; Aoki, Sinya; Blum, Tom; Izubuchi, Taku; Ohki, Hiroshi; Syritsyn, Sergey
2018-03-01
I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.
Heavy water critical experiments on plutonium lattice
Miyawaki, Yoshio; Shiba, Kiminori
1975-06-01
This report is the summary of physics studies on plutonium lattices made in the Heavy Water Critical Experiment Section of PNC. Using the Deuterium Critical Assembly, physics studies on plutonium lattices have been carried out since 1972. Experiments on the following items were performed in a core having a 22.5 cm square lattice pitch: (1) material buckling; (2) lattice parameters; (3) local power distribution factor; (4) gross flux distribution in a two-region core; (5) control rod worth. Experimental results were compared with theoretical ones calculated by the METHUSELAH II code. It is concluded from this study that calculation by the METHUSELAH II code has acceptable accuracy in the prediction of plutonium lattices. (author)
Computing nucleon EDM on a lattice
Abramczyk, Michael; Izubuchi, Taku
2017-06-18
I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.
Aliasing modes in the lattice Schwinger model
Campos, Rafael G.; Tututi, Eduardo S.
2007-01-01
We study the Schwinger model on a lattice consisting of zeros of the Hermite polynomials that incorporates a lattice derivative and a discrete Fourier transform with many useful properties. Such a lattice produces a Klein-Gordon equation for the boson field and the exact value of the mass in the asymptotic limit if the boundaries are not taken into account. On the contrary, if the lattice is considered with boundaries, new modes appear due to aliasing effects. In the continuum limit, however, this lattice also yields a Klein-Gordon equation with a reduced mass
Sommer, Rainer
2014-02-01
The principles of scale setting in lattice QCD as well as the advantages and disadvantages of various commonly used scales are discussed. After listing criteria for good scales, I concentrate on the main presently used ones with an emphasis on scales derived from the Yang-Mills gradient flow. For these I discuss discretisation errors, statistical precision and mass effects. A short review on numerical results also brings me to an unpleasant disagreement which remains to be explained.
Apiary B Factory lattice design
Donald, M.H.R.; Garren, A.A.
1991-04-01
The Apiary B Factory is a proposed high-intensity electron-positron collider. This paper will present the lattice design for this facility, which envisions two rings with unequal energies in the PEP tunnel. The design has many interesting optical and geometrical features due to the need to conform to the existing tunnel, and to achieve the necessary emittances, damping times and vacuum. Existing hardware is used to a maximum extent. 8 figs. 1 tab
BROOKHAVEN: Lattice gauge theory symposium
Anon.
1986-12-15
Originally introduced by Kenneth Wilson in the early 70s, the lattice formulation of a quantum gauge theory became a hot topic of investigation after Mike Creutz, Laurence Jacobs and Claudio Rebbi demonstrated in 1979 the feasibility of meaningful computer simulations. The initial enthusiasm led gradually to a mature research effort, with continual attempts to improve upon previous results, to develop better computational techniques and to find new domains of application.
Harmonic Lattice Dynamics of Germanium
Nelin, G
1974-07-01
The phonon dispersion relations of the Δ-, Λ-, and Σ-directions of germanium at 80 K are analysed in terms of current harmonic lattice dynamical models. On the basis of this experience, a new model is proposed which gives a unified account of the strong points of the previous models. The principal elements of the presented theory are quasiparticle bond charges combined with a valence force field.
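For orientation, the simplest harmonic lattice model, a 1D monatomic chain (a textbook toy, not the quasiparticle bond-charge model the paper proposes), already produces a phonon dispersion relation of the kind being fitted here:

```python
# Dispersion of a 1D monatomic harmonic chain with spring constant K,
# mass m and lattice constant a: omega(k) = sqrt(4K/m) * |sin(k a / 2)|.
# All parameter values are illustrative, in natural units.
import math

def omega(k, K=1.0, m=1.0, a=1.0):
    return math.sqrt(4 * K / m) * abs(math.sin(k * a / 2))
```

The small-k slope gives the sound speed a·sqrt(K/m), and the branch flattens at the zone boundary k = π/a where it reaches its maximum sqrt(4K/m); real dispersion curves such as germanium's require more elaborate force models, which is the point of the paper.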
Screening in graphene antidot lattices
Schultz, Marco Haller; Jauho, A. P.; Pedersen, T. G.
2011-01-01
We compute the dynamical polarization function for a graphene antidot lattice in the random-phase approximation. The computed polarization functions display a much more complicated structure than what is found for pristine graphene (even when evaluated beyond the Dirac-cone approximation). We evaluate the plasmon dispersion law and find an approximate square-root dependence with a suppressed plasmon frequency as compared to doped graphene. The plasmon dispersion is nearly isotropic and the developed approximation schemes agree well with the full calculation.
Symplectic maps for accelerator lattices
Warnock, R.L.; Ruth, R.; Gabella, W.
1988-05-01
We describe a method for numerical construction of a symplectic map for particle propagation in a general accelerator lattice. The generating function of the map is obtained by integrating the Hamilton-Jacobi equation as an initial-value problem on a finite time interval. Given the generating function, the map is put in explicit form by means of a Fourier inversion technique. We give an example which suggests that the method has promise. 9 refs., 9 figs
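The defining property of such a map, exact preservation of phase-space area (unit Jacobian determinant), can be demonstrated with a much simpler construction than the paper's Hamilton-Jacobi/Fourier scheme: an explicit leapfrog (kick-drift-kick) splitting for a pendulum-like toy Hamiltonian. This sketch illustrates symplecticity only; it is not the generating-function method described above.

```python
# Leapfrog one-turn map for H = p^2/2 - cos(q). Splitting methods of
# this kind are symplectic, so the map's Jacobian determinant is
# exactly 1 (checked numerically below by finite differences).
import math

def one_turn(q, p, dt=0.1, n=10):
    for _ in range(n):
        p -= 0.5 * dt * math.sin(q)   # half kick
        q += dt * p                   # drift
        p -= 0.5 * dt * math.sin(q)   # half kick
    return q, p

def jacobian_det(q, p, eps=1e-6):
    """Numerical Jacobian determinant of the one-turn map at (q, p)."""
    q1, p1 = one_turn(q + eps, p); q2, p2 = one_turn(q - eps, p)
    q3, p3 = one_turn(q, p + eps); q4, p4 = one_turn(q, p - eps)
    dq_dq = (q1 - q2) / (2 * eps); dp_dq = (p1 - p2) / (2 * eps)
    dq_dp = (q3 - q4) / (2 * eps); dp_dp = (p3 - p4) / (2 * eps)
    return dq_dq * dp_dp - dq_dp * dp_dq
```

For long-term particle tracking this property matters more than raw local accuracy: a non-symplectic integrator of the same order would let phase-space areas drift over millions of turns.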
Wave transmission in nonlinear lattices
Hennig, D.; Tsironis, G.P.
1999-01-01
The interplay of nonlinearity with lattice discreteness leads to phenomena and propagation properties quite distinct from those appearing in continuous nonlinear systems. For a large variety of condensed matter and optics applications the continuous wave approximation is not appropriate. In the present review we discuss wave transmission properties in one dimensional nonlinear lattices. Our paradigmatic equations are discrete nonlinear Schroedinger equations and their study is done through a dynamical systems approach. We focus on stationary wave properties and utilize well known results from the theory of dynamical systems to investigate various aspects of wave transmission and wave localization. We analyze in detail the more general dynamical system corresponding to the equation that interpolates between the non-integrable discrete nonlinear Schroedinger equation and the integrable Ablowitz-Ladik equation. We utilize this analysis in a nonlinear Kronig-Penney model and investigate transmission and band modification properties. We discuss the modifications that are effected through an electric field and the nonlinear Wannier-Stark localization effects that are induced. Several applications are described, such as polarons in one dimensional lattices, semiconductor superlattices and one dimensional nonlinear photonic band gap systems. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
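The paradigmatic equation of the review can be written down and integrated in a few lines. The sketch below time-steps the discrete nonlinear Schroedinger (DNLS) chain with classical RK4; it is a generic illustration (the review itself studies stationary solutions through a dynamical-systems map), and the coupling, nonlinearity γ, time step and lattice size are illustrative.

```python
# DNLS chain with periodic boundaries:
#   i dpsi_n/dt = -(psi_{n+1} + psi_{n-1}) + gamma |psi_n|^2 psi_n
def dnls_rhs(psi, gamma=1.0):
    N = len(psi)
    return [-1j * (-(psi[(n + 1) % N] + psi[(n - 1) % N])
                   + gamma * abs(psi[n])**2 * psi[n]) for n in range(N)]

def rk4_step(psi, dt, gamma=1.0):
    """One classical Runge-Kutta 4 step for the DNLS chain."""
    add = lambda a, b, c: [x + c * y for x, y in zip(a, b)]
    k1 = dnls_rhs(psi, gamma)
    k2 = dnls_rhs(add(psi, k1, dt / 2), gamma)
    k3 = dnls_rhs(add(psi, k2, dt / 2), gamma)
    k4 = dnls_rhs(add(psi, k3, dt), gamma)
    return [p + dt / 6 * (a + 2*b + 2*c + d)
            for p, a, b, c, d in zip(psi, k1, k2, k3, k4)]
```

The DNLS conserves the norm Σ|ψ_n|², which RK4 preserves to high accuracy at small dt; checking it is a convenient sanity test, and watching a single-site excitation spread (or self-trap at large γ) shows the discreteness effects the review is about.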
Spin lattices of walking droplets
Saenz, Pedro; Pucci, Giuseppe; Goujon, Alexis; Dunkel, Jorn; Bush, John
2017-11-01
We present the results of an experimental investigation of the spontaneous emergence of collective behavior in a spin lattice of droplets walking on a vibrating fluid bath. The bottom topography consists of relatively deep circular wells that encourage the walking droplets to follow circular trajectories centered at the lattice sites, in one direction or the other. Wave-mediated interactions between neighboring drops are enabled through a thin fluid layer between the wells. The sense of rotation of the walking droplets may thus become globally coupled. When the coupling is sufficiently strong, interactions with neighboring droplets may result in switches in spin that lead to preferred global arrangements, including correlated (all drops rotating in the same direction) or anti-correlated (neighboring drops rotating in opposite directions) states. Analogies with ferromagnetism and anti-ferromagnetism are drawn. Different spatial arrangements are presented in 1D and 2D lattices to illustrate the effects of topological frustration. This work was supported by the US National Science Foundation through Grants CMMI-1333242 and DMS-1614043.
Calculational methods for lattice cells
Askew, J.R.
1980-01-01
At the current stage of development, direct simulation of all the processes involved in the reactor to the degree of accuracy required is not an economic proposition, and tractability is achieved by progressive synthesis of models for parts of the full space/angle/energy neutron behaviour. The split between reactor and lattice calculations is one such simplification. Most reactors are constructed of repetitions of similar geometric units, the fuel elements, having broadly similar properties. Thus the provision of detailed predictions of their behaviour is an important step towards overall modelling. We shall be dealing with these lattice methods in this series of lectures, but will refer back from time to time to their relationship with the overall reactor calculation. The lattice cell is itself composed of somewhat similar sub-units, the fuel pins, and will itself often rely upon a further breakdown of modelling. Construction of a good model depends upon the identification, on physical and mathematical grounds, of the most helpful division of the calculation at this level
Statistical benchmarking in utility regulation: Role, standards and methods
Newton Lowry, Mark; Getachew, Lullit
2009-01-01
Statistical benchmarking is being used with increasing frequency around the world in utility rate regulation. We discuss how and where benchmarking is in use for this purpose and the pros and cons of regulatory benchmarking. We then discuss alternative performance standards and benchmarking methods in regulatory applications. We use these to propose guidelines for the appropriate use of benchmarking in the rate setting process. The standards, which we term the competitive market and frontier paradigms, have a bearing on method selection. These along with regulatory experience suggest that benchmarking can either be used for prudence review in regulation or to establish rates or rate setting mechanisms directly
Definition and Analysis of Heavy Water Reactor Benchmarks for Testing New Wims-D Libraries
Leszczynski, Francisco
2000-01-01
This work is part of the IAEA-WIMS Library Update Project (WLUP). A group of heavy water reactor benchmarks has been selected for testing new WIMS-D libraries, including calculations with the WIMSD5B program and the analysis of results. These benchmarks cover a wide variety of reactors and conditions, from fresh fuels to high burnup, and from natural to enriched uranium. Besides, each benchmark includes variations in lattice pitch and in coolants (normally heavy water and void). Multiplication factors with critical experimental bucklings and other parameters are calculated and compared with experimental reference values. The WIMS libraries used for the calculations were generated with basic data from JEF-2.2 Rev.3 (JEF) and ENDF/B-VI Release 5 (E6). Results obtained with the WIMS-86 (W86) library, included with the WIMSD5B package, from Winfrith, UK, with adjusted data, are included also, for showing the improvements obtained with the new, not adjusted, libraries. The calculations with WIMSD5B were made with two methods (input program options): PIJ (two-dimension collision probability method) and DSN (one-dimension Sn method, with homogenization of materials by ring). The general conclusions are: the library based on JEF data and the DSN method give the best results, which on average are acceptable
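Comparisons of calculated and experimental multiplication factors in such validation studies are commonly quoted as a reactivity difference in pcm. A minimal sketch of that conversion, with purely illustrative k values (not WLUP results):

```python
def reactivity_diff_pcm(k_calc, k_exp):
    """Reactivity difference between calculated and experimental
    multiplication factors, rho_calc - rho_exp with rho = 1 - 1/k,
    expressed in pcm (1 pcm = 1e-5)."""
    return (1.0 / k_exp - 1.0 / k_calc) * 1e5

# e.g. k_calc = 1.005 against a critical experiment (k_exp = 1.000)
# corresponds to an overprediction of roughly +500 pcm
```

Expressing the discrepancy in reactivity rather than as a raw Δk makes results comparable across lattices whose k values differ.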
Benchmarking optimization solvers for structural topology optimization
Rojas Labanda, Susana; Stolpe, Mathias
2015-01-01
solvers in IPOPT and FMINCON, and the sequential quadratic programming method in SNOPT, are benchmarked on the library using performance profiles. Whenever possible the methods are applied to both the nested and the Simultaneous Analysis and Design (SAND) formulations of the problem. The performance...
Developing Benchmarking Criteria for CO2 Emissions
Neelis, M.; Worrell, E.; Mueller, N.; Angelini, T. [Ecofys, Utrecht (Netherlands); Cremer, C.; Schleich, J.; Eichhammer, W. [The Fraunhofer Institute for Systems and Innovation research, Karlsruhe (Germany)
2009-02-15
A European Union (EU) wide greenhouse gas (GHG) allowance trading scheme (EU ETS) was implemented in the EU in 2005. In the first two trading periods of the scheme (running up to 2012), free allocation based on historical emissions was the main methodology for allocation of allowances to existing installations. For the third trading period (2013 - 2020), the European Commission proposed in January 2008 a more important role for auctioning of allowances rather than free allocation. (Transitional) free allocation of allowances to industrial sectors will be determined via harmonized allocation rules, where feasible based on benchmarking. In general terms, a benchmark based method allocates allowances based on a certain amount of emissions per unit of productive output (i.e. the benchmark). This study aims to derive criteria for an allocation methodology for the EU Emission Trading Scheme based on benchmarking for the period 2013 - 2020. To test the feasibility of the criteria, we apply them to four example product groups: iron and steel, pulp and paper, lime and glass. The basis for this study is the Commission proposal for a revised ETS directive put forward on 23 January 2008 and does not take into account any changes to this proposal in the co-decision procedure that resulted in the adoption of the Energy and Climate change package in December 2008.
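The benchmark-based allocation principle described above reduces to a product of two quantities. A minimal sketch, with hypothetical numbers that are not actual EU ETS benchmark values:

```python
def free_allocation(benchmark, activity_level):
    """Benchmark-based free allocation: allowances (tCO2) equal the
    product benchmark (tCO2 per tonne of product) times the
    installation's activity level (tonnes of product)."""
    return benchmark * activity_level

# a hypothetical 1.5 tCO2/t benchmark applied to 100000 t of product
allowances = free_allocation(1.5, 100000)
```

The policy content lies entirely in how the benchmark value and the activity level are defined per product group, which is what the derived criteria address.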
Why and How to Benchmark XML Databases
A.R. Schmidt; F. Waas; M.L. Kersten (Martin); D. Florescu; M.J. Carey; I. Manolescu; R. Busse
2001-01-01
Benchmarks belong to the very standard repertory of tools deployed in database development. Assessing the capabilities of a system, analyzing actual and potential bottlenecks, and, naturally, comparing the pros and cons of different system architectures have become indispensable tasks
Determination of Benchmarks Stability within Ahmadu Bello ...
Heights of six geodetic benchmarks over a total distance of 8.6km at the Ahmadu Bello University (ABU), Zaria, Nigeria were recomputed and analysed using least squares adjustment technique. The network computations were tied to two fix primary reference pillars situated outside the campus. The two-tail Chi-square ...