Energy Technology Data Exchange (ETDEWEB)
Uddin, M.N. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh); Sarker, M.M. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh); Khan, M.J.H. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Savar, GPO Box 3787, Dhaka 1000 (Bangladesh)], E-mail: jahirulkhan@yahoo.com; Islam, S.M.A. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh)]
2009-10-15
The aim of this paper is to validate the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through analysis of the integral parameters of the TRX and BAPL benchmark lattices of thermal reactors, in support of the neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. In this process, the 69-group cross-section library for the lattice code WIMS was generated from the basic evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 with the help of the nuclear data processing code NJOY99.0. Integral measurements on the thermal reactor lattices TRX-1, TRX-2, BAPL-UO{sub 2}-1, BAPL-UO{sub 2}-2 and BAPL-UO{sub 2}-3 have long served as standard benchmarks for testing nuclear data files and were therefore selected for this analysis. The integral parameters of these lattices were calculated using the lattice transport code WIMSD-5B with the generated 69-group cross-section library. The calculated integral parameters were compared with the measured values as well as with the results of the Monte Carlo code MCNP. In most cases, the integral parameters show good agreement with experiment and with the MCNP results. In addition, the group constants in WIMS format for the isotopes U-235 and U-238 were compared between the two data files using the WIMS library utility code WILLIE, and they were found to be nearly identical, with only insignificant differences. This analysis therefore validates the evaluated nuclear data files CENDL-2.2 and JEFF-3.1.1 through benchmarking of the integral parameters of the TRX and BAPL lattices, and provides a basis for further neutronics analysis of the TRIGA Mark-II research reactor at AERE, Dhaka, Bangladesh.
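Comparisons of calculated integral parameters with measured benchmark values, as described in this abstract, are conventionally expressed as C/E (calculation-to-experiment) ratios. A minimal Python sketch of that bookkeeping; the parameter names and all numbers below are invented for illustration and are not taken from the paper:

```python
# Hedged illustration (not from the paper): expressing calculated vs.
# measured integral parameters as C/E ratios. All values are made up.

def c_over_e(calculated, measured):
    """Return the calculation-to-experiment (C/E) ratio per parameter."""
    return {name: calculated[name] / measured[name] for name in measured}

calc = {"rho28": 1.320, "delta25": 0.0987}   # hypothetical calculated values
meas = {"rho28": 1.311, "delta25": 0.0981}   # hypothetical measured values

for name, ce in c_over_e(calc, meas).items():
    print(f"{name}: C/E = {ce:.4f}")
```

A C/E close to 1.0 indicates good agreement between the data library and the measurement.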
Validation of WIMS-CANDU using Pin-Cell Lattices
Energy Technology Data Exchange (ETDEWEB)
Kim, Won Young; Min, Byung Joo; Park, Joo Hwan [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)
2006-07-01
WIMS-CANDU is a lattice code with depletion capability for the analysis of reactor physics problems related to design and safety. The WIMS-CANDU code has been developed from WIMSD5B, a version of the WIMS code released by the OECD/NEA data bank in 1998. The lattice code POWDERPUFS-V (PPV) has been used for the physics design and analysis of natural uranium fuel for the CANDU reactor. However, since the application of PPV is limited to fresh fuel due to its empirical correlations, the WIMS-AECL code was developed by AECL to replace PPV. The WIMS-CANDU code is likewise being developed to perform the physics analysis of currently operating CANDU reactors as a replacement for PPV. As part of this development work, the U{sup 238} absorption cross-section in the nuclear data library of WIMS-CANDU was updated, and WIMS-CANDU was validated using the benchmark problems for pin-cell lattices TRX-1, TRX-2, Bapl-1, Bapl-2 and Bapl-3. The results of WIMS-CANDU and WIMS-AECL were compared with the experimental data.
Lattice Wess-Zumino model with Ginsparg-Wilson fermions: One-loop results and GPU benchmarks
Chen, Chen; Giedt, Joel
2010-01-01
We numerically evaluate the one-loop counterterms for the four-dimensional Wess-Zumino model formulated on the lattice using Ginsparg-Wilson fermions of the overlap (Neuberger) variety, such that a lattice version of U(1)_R symmetry is exactly preserved in the limit of vanishing bare mass. We confirm previous findings by other authors that at one loop there is no renormalization of the superpotential in the lattice theory. We discuss aspects of the simulation of this model that is planned for a follow-up work, and outline a strategy for nonperturbative improvement of the lattice supercurrent through measurements of supersymmetry Ward identities. Related to this, some benchmarks for our graphics processing unit code are provided. An initial simulation finds a nearly vanishing vacuum expectation value for the auxiliary field, consistent with approximate supersymmetry.
D.C. Blitz (David)
2011-01-01
textabstractBenchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.
Benchmarking DFT and semiempirical methods on structures and lattice energies for ten ice polymorphs
Brandenburg, Jan Gerit; Maas, Tilo; Grimme, Stefan
2015-03-01
Water in different phases under various external conditions is very important in biochemical systems and for material science at surfaces. Density functional theory methods and approximations thereof have to be tested system-specifically to benchmark their accuracy regarding computed structures and interaction energies. In this study, we present and test a set of ten ice polymorphs in comparison to experimental data with mass densities ranging from 0.9 to 1.5 g/cm3 and including explicit corrections for zero-point vibrational and thermal effects. London dispersion inclusive density functionals at the generalized gradient approximation (GGA), meta-GGA, and hybrid level as well as alternative low-cost molecular orbital methods are considered. The widely used functional of Perdew, Burke and Ernzerhof (PBE) systematically overbinds and overall provides inconsistent results. All other tested methods yield reasonable to very good accuracy. BLYP-D3atm gives excellent results with mean absolute errors for the lattice energy below 1 kcal/mol (7% relative deviation). The corresponding optimized structures are very accurate with mean absolute relative deviations (MARDs) from the reference unit cell volume below 1%. The impact of Axilrod-Teller-Muto (atm) type three-body dispersion and of non-local Fock exchange is small, but on average their inclusion improves the results. While the density functional tight-binding model DFTB3-D3 performs well for low density phases, it does not yield good high density structures. As a low-cost alternative for structure related problems, we recommend the recently introduced minimal basis Hartree-Fock method HF-3c with a MARD of about 3%.
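The mean absolute relative deviation (MARD) quoted in this abstract is a standard error metric. A minimal Python sketch of the computation; the unit-cell volumes below are invented for illustration, not data from the study:

```python
# The MARD formula is standard; the volumes below are made up.

def mard(computed, reference):
    """Mean absolute relative deviation, in percent."""
    deviations = [abs(c - r) / abs(r) for c, r in zip(computed, reference)]
    return 100.0 * sum(deviations) / len(deviations)

ref_volumes = [128.0, 305.0, 210.0]  # hypothetical reference unit-cell volumes
dft_volumes = [127.1, 308.2, 211.5]  # hypothetical computed volumes

print(f"MARD = {mard(dft_volumes, ref_volumes):.2f}%")
```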
Lattice Wess-Zumino model with Ginsparg-Wilson fermions: One-loop results and GPU benchmarks
Chen, Chen; Dzienkowski, Eric; Giedt, Joel
2010-10-01
We numerically evaluate the one-loop counterterms for the four-dimensional Wess-Zumino model formulated on the lattice using Ginsparg-Wilson fermions of the overlap (Neuberger) variety, together with an auxiliary fermion (plus superpartners), such that a lattice version of U(1)R symmetry is exactly preserved in the limit of vanishing bare mass. We confirm previous findings by other authors that at one loop there is no renormalization of the superpotential in the lattice theory, but that there is a mismatch in the wave-function renormalization of the auxiliary field. We study the range of the Dirac operator that results when the auxiliary fermion is integrated out, and show that localization does occur, but that it is less pronounced than the exponential localization of the overlap operator. We also present preliminary simulation results for this model, and outline a strategy for nonperturbative improvement of the lattice supercurrent through measurements of supersymmetry Ward identities. Related to this, some benchmarks for our graphics processing unit code are provided. Our simulation results find a nearly vanishing vacuum expectation value for the auxiliary field, consistent with approximate supersymmetry at weak coupling.
Energy Technology Data Exchange (ETDEWEB)
Hummel, D.W.; Langton, S.E.; Ball, M.R.; Novog, D.R.; Buijs, A., E-mail: hummeld@mcmaster.ca [McMaster Univ., Hamilton, Ontario (Canada)
2013-07-01
Discrepancies have been observed among a number of recent reactor physics studies in support of the PT-SCWR pre-conceptual design, including differences in lattice-level predictions of infinite neutron multiplication factor, coolant void reactivity, and radial power profile. As a first step to resolving these discrepancies, a lattice-level benchmark problem was designed based on the 78-element plutonium-thorium PT-SCWR fuel design under a set of prescribed local conditions. This benchmark problem was modeled with a suite of both deterministic and Monte Carlo neutron transport codes. The results of these models are presented here as the basis of a code-to-code comparison. (author)
Directory of Open Access Journals (Sweden)
Shun Zou
2015-02-01
Full Text Available An efficient IBLF-dts scheme is proposed to integrate the bounce-back LBM and the FVM scheme to solve the Navier-Stokes equations and the constitutive equation, respectively, for the simulation of viscoelastic fluid flows. In order to improve the efficiency, the bounce-back boundary treatment for LBM is introduced to improve the grid mapping between LBM and FVM, and the two processes are decoupled on different time scales according to the relaxation time of the polymer and the time scale of the solvent's Newtonian effect. Critical numerical simulations have been carried out to validate the integrated scheme in various benchmark flows at vanishingly low Reynolds number with open-source CFD toolkits. The results show that the numerical solution with the IBLF-dts scheme agrees well with the exact solution and with the numerical solution of the FVM PISO scheme, and that the efficiency and scalability can be remarkably improved under equivalent configurations.
Directory of Open Access Journals (Sweden)
Erik M Salomons
Full Text Available Propagation of sound waves in air can be considered as a special case of fluid dynamics. Consequently, the lattice Boltzmann method (LBM) for fluid flow can be used for simulating sound propagation. In this article application of the LBM to sound propagation is illustrated for various cases: free-field propagation, propagation over porous and non-porous ground, propagation over a noise barrier, and propagation in an atmosphere with wind. LBM results are compared with solutions of the equations of acoustics. It is found that the LBM works well for sound waves, but dissipation of sound waves with the LBM is generally much larger than real dissipation of sound waves in air. To circumvent this problem it is proposed here to use the LBM for assessing the excess sound level, i.e. the difference between the sound level and the free-field sound level. The effect of dissipation on the excess sound level is much smaller than the effect on the sound level, so the LBM can be used to estimate the excess sound level for a non-dissipative atmosphere, which is a useful quantity in atmospheric acoustics. To reduce dissipation in an LBM simulation two approaches are considered: (i) reduction of the kinematic viscosity and (ii) reduction of the lattice spacing.
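The excess sound level used in this abstract is simply the sound level minus the free-field sound level at the same point. A small Python sketch of that relation; the pressure amplitudes below are invented for illustration:

```python
import math

P_REF = 20e-6  # reference sound pressure in air (Pa)

def spl(p_rms):
    """Sound pressure level in dB re 20 uPa."""
    return 20.0 * math.log10(p_rms / P_REF)

def excess_level(p_rms, p_rms_free_field):
    """Excess sound level: sound level minus free-field sound level (dB)."""
    return spl(p_rms) - spl(p_rms_free_field)

# Invented amplitudes: behind a barrier the pressure is half the
# free-field pressure, so the excess level is 20*log10(0.5) = -6.02 dB.
print(f"{excess_level(0.01, 0.02):.2f} dB")  # → -6.02 dB
```

Because the reference pressure cancels in the difference, the excess level depends only on the ratio of the two amplitudes, which is why it is less sensitive to the LBM's artificial dissipation.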
2014-09-15
Lattice Boltzmann Method (LBM) has become increasingly popular as an alternative approach to traditional NS-based techniques for modeling various... Abbreviations: CAVS: Center for Advanced Vehicular Systems; CFD: computational fluid dynamics; DEM: discrete element method; FDM: finite difference method; Mach number; MRT: multiple relaxation time; NS: Navier-Stokes method; PISO: pressure implicit with splitting operator; Re: Reynolds number.
Directory of Open Access Journals (Sweden)
Wiji Suwarno
2017-02-01
Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM), in Indonesian termed holistic quality management, because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a process of systematic and continuous measurement: measuring and comparing an organization's business processes in order to obtain information that can help the organization improve its performance.
2012-01-01
This bachelor's thesis is focused on financial benchmarking of TULIPA PRAHA s.r.o. The aim of this work is to evaluate the financial situation of the company, identify its strengths and weaknesses, and find out how efficiently the company performs in comparison with top companies in the same field, using the INFA benchmarking diagnostic system of financial indicators. The theoretical part includes the characteristics of financial analysis, which financial benchmarking is based on a...
DEFF Research Database (Denmark)
Hougaard, Jens Leth; Tvede, Mich
2002-01-01
Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...
DEFF Research Database (Denmark)
Lawson, Lartey; Nielsen, Kurt
2005-01-01
We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional distance functions. The frontier is given by an explicit quantile, e.g. "the best 90 %". Using the explanatory model of the inefficiency, the user can adjust the frontiers by submitting state variables that influence the inefficiency. An efficiency study of Danish dairy farms is implemented in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.
Quantitative benchmark - Production companies
DEFF Research Database (Denmark)
Sørensen, Ole H.; Andersen, Vibeke
Report with the results of the quantitative benchmark of the production companies in the VIPS project.
Benchmarking in Student Affairs.
Mosier, Robert E.; Schwarzmueller, Gary J.
2002-01-01
Discusses the use of benchmarking in student affairs, focusing on issues related to student housing. Provides examples of how benchmarking has influenced administrative practice at many institutions. (EV)
Blecher, Jan
2009-01-01
The aim of this paper is to describe the benefits of IT benchmarking in a wider context and the scope of benchmarking in general. I specify benchmarking as a process and mention basic rules and guidelines. Further, I define IT benchmarking domains and describe possibilities for their use. The best-known type of IT benchmark is the cost benchmark, which represents only a subset of benchmarking opportunities. In this paper, the cost benchmark is treated as a first step toward benchmarking's contribution to the company. IT benchmark...
DSP Platform Benchmarking : DSP Platform Benchmarking
Xinyuan, Luo
2009-01-01
Benchmarking of DSP kernel algorithms was conducted in this thesis on a DSP processor used for teaching in the course TESA26 in the Department of Electrical Engineering. It includes benchmarking of cycle count and memory usage. The goal of the thesis is to evaluate the quality of a single-MAC DSP instruction set and provide suggestions for further improvement of the instruction set architecture accordingly. The scope of the thesis is limited to benchmarking the processor based on assembly coding only. The...
U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...
Lennartsson, Per; Nordlander, Lars
2002-01-01
This Master's thesis describes the benchmarking of a DSP processor. Benchmarking means measuring the performance in some way. In this report, we have focused on the number of instruction cycles needed to execute certain algorithms. The algorithms we have used in the benchmark are all very common in signal processing today. The results we have reached in this thesis have been compared to benchmarks for other processors, performed by Berkeley Design Technology, Inc. The algorithms were programm...
DEFF Research Database (Denmark)
Jensen, Christian Søndergaard; Tiesyte, Dalia; Tradisauskas, Nerius
2006-01-01
, and more are underway. As a result, there is an increasing need for an independent benchmark for spatio-temporal indexes. This paper characterizes the spatio-temporal indexing problem and proposes a benchmark for the performance evaluation and comparison of spatio-temporal indexes. Notably, the benchmark...
Chakrabarti, J; Bagchi, B; Chakrabarti, Jayprokas; Basu, Asis; Bagchi, Bijon
2000-01-01
Fermions on the lattice have bosonic excitations generated from the underlying periodic background. These, the lattice bosons, arise near the empty band or when the bands are nearly full. They do not depend on the nature of the interactions and exist for any fermion-fermion coupling. We discuss these lattice boson solutions for the Dirac Hamiltonian.
Benchmarking semantic web technology
García-Castro, R
2009-01-01
This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:
Benchmarking in University Toolbox
Directory of Open Access Journals (Sweden)
Katarzyna Kuźmicz
2015-06-01
Full Text Available In the face of global competition and rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking application in HEIs worldwide. The study involves indicating premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled developing a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in a higher education setting. The study was performed on the basis of the published reports from benchmarking projects, scientific literature and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.
Polarization response of RHIC electron lens lattices
Directory of Open Access Journals (Sweden)
V. H. Ranjbar
2016-10-01
Full Text Available Depolarization response for a system of two orthogonal snakes at irrational tunes is studied in depth using lattice-independent spin integration. In particular we consider the effect of overlapping spin resonances in this system, to understand the impact of phase, tune, relative location and threshold strengths of the spin resonances. These results are benchmarked and compared to two-dimensional direct tracking results for the RHIC e-lens lattice and the standard lattice. Finally we consider the effect of longitudinal motion via chromatic scans using direct six-dimensional lattice tracking.
DEFF Research Database (Denmark)
Friberg, Henrik A.
This document constitutes the technical reference manual of the Conic Benchmark Format with file extension .cbf or .CBF. It unifies linear, second-order cone (also known as conic quadratic) and semidefinite optimization with mixed-integer variables. The format has been designed with benchmark libraries in mind, and therefore focuses on compact and easily parsable representations. The problem structure is separated from the problem data, and the format moreover facilitates benchmarking of hotstart capability through sequences of changes.
DEFF Research Database (Denmark)
Bogetoft, Peter; Nielsen, Kurt
2005-01-01
We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...
Thermal Performance Benchmarking (Presentation)
Energy Technology Data Exchange (ETDEWEB)
Moreno, G.
2014-11-01
This project will benchmark the thermal characteristics of automotive power electronics and electric motor thermal management systems. Recent vehicle systems will be benchmarked to establish baseline metrics, evaluate advantages and disadvantages of different thermal management systems, and identify areas of improvement to advance the state-of-the-art.
Blank, j.l.t.
2008-01-01
Guide to the VO (secondary education) benchmark, 25 November 2008, by IPSE Studies, by J.L.T. Blank. A guide for reading the i...
Benchmark of the vocational education programmes
DEFF Research Database (Denmark)
Bogetoft, Peter; Wittrup, Jesper
In this working paper we discuss how the Danish vocational schools can be benchmarked, and we present the results of a number of calculation models. Benchmarking the vocational schools is conceptually complicated: the schools offer a wide range of different programmes, which makes it difficult...
Benchmarking of the municipalities' casework
DEFF Research Database (Denmark)
Amilon, Anna
From 2007, the National Social Appeals Board (Ankestyrelsen) is to carry out benchmarking of the quality of the municipalities' casework. The purpose of the benchmarking is to develop the design of the practice studies with a view to better follow-up, and to improve the municipalities' casework. This working paper discusses methods for benchmarking...
DEFF Research Database (Denmark)
Seabrooke, Leonard; Wigan, Duncan
2015-01-01
Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained.
Wang, Da-Wei; Zhu, Shi-Yao; Scully, Marlan O
2014-01-01
We show that the timed Dicke states of a collection of three-level atoms can form a tight-binding lattice in the momentum space. This lattice, coined the superradiance lattice (SL), can be constructed based on an electromagnetically induced transparency (EIT) system. For a one-dimensional SL, we need the coupling field of the EIT system to be a standing wave. The detuning between the two components of the standing wave introduces an effective electric field. The quantum behaviours of electrons in lattices, such as Bloch oscillations, Wannier-Stark ladders, Bloch band collapsing and dynamic localization can be observed in the SL. The SL can be extended to two, three and even higher dimensions where no analogous real space lattices exist and new physics are waiting to be explored.
van der Marck, Steven C.
2006-12-01
benchmarks deviates only 0.017% from the measured benchmark value. Moreover, no clear trends (with e.g. enrichment, lattice pitch, or spectrum) have been observed. Also for fast spectrum benchmarks, both for intermediately or highly enriched uranium and for plutonium, clear improvements are apparent from the calculations. The results for bare assemblies have improved, as well as those with a depleted or natural uranium reflector. On the other hand, the results for plutonium solutions (PU-SOL-THERM) are still high, on average (over 120 benchmarks) roughly 0.6%. Furthermore there still is a bias for a range of benchmarks based on cores in the Zero Power Reactor (ANL) with sizable amounts of tungsten in them. The results for the fusion shielding benchmarks have not changed significantly, compared to ENDF/B-VI.8, for most materials. The delayed neutron testing shows that the values for both thermal and fast spectrum cases are now well predicted, which is an improvement when compared with ENDF/B-VI.8.
Verification and validation benchmarks.
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, William Louis; Trucano, Timothy Guy
2007-02-01
Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
Toxicological Benchmarks for Wildlife
Energy Technology Data Exchange (ETDEWEB)
Sample, B.E. Opresko, D.M. Suter, G.W.
1993-01-01
Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, based toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
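The tier-1 screening rule described above (retain a chemical as a contaminant of potential concern when its concentration exceeds its NOAEL-based benchmark, otherwise exclude it) can be sketched in a few lines of Python. The chemicals, concentrations, and benchmark values below are invented purely for illustration and are not the report's values:

```python
def screen_copcs(concentrations, benchmarks):
    """Tier-1 screen: retain chemicals whose environmental concentration
    exceeds their NOAEL-based toxicological benchmark (the COPCs)."""
    return [chem for chem, conc in concentrations.items()
            if conc > benchmarks.get(chem, float("inf"))]

# Invented concentrations and benchmarks (mg/kg), for illustration only
conc = {"cadmium": 2.5, "zinc": 40.0, "lead": 12.0}
bench = {"cadmium": 1.0, "zinc": 100.0, "lead": 50.0}

print(screen_copcs(conc, bench))  # → ['cadmium']
```

Chemicals retained by this screen would then proceed to the second tier, the baseline ecological risk assessment, rather than being judged hazardous outright.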
Benchmarking expert system tools
Riley, Gary
1988-01-01
As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were the Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object-oriented, frame-based expert system tool. The benchmarks used for testing are studied.
Financial Integrity Benchmarks
City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings....
Vermont Center for Geographic Information — The GeodeticBenchmark_GEOMON data layer consists of geodetic control monuments (points) that have a known position or spatial reference. The locations of these...
Diagnostic Algorithm Benchmarking
Poll, Scott
2011-01-01
A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.
Han, Rui; Lu, Xiaoyi
2014-01-01
Big data systems address the challenges of capturing, storing, managing, analyzing, and visualizing big data. Within this context, developing benchmarks to evaluate and compare big data systems has become an active topic for both research and industry communities. To date, most of the state-of-the-art big data benchmarks are designed for specific types of systems. Based on our experience, however, we argue that considering the complexity, diversity, and rapid evolution of big data systems, fo...
Benchmarking in Foodservice Operations.
2007-11-02
...studies lasted from nine to twelve months, and could extend beyond that time for numerous reasons (49). Benchmarking was not industrial tourism, a... not simply data comparison, a fad, a means for reducing resources, a quick-fix program, or industrial tourism. Benchmarking was a complete process
Benchmarking File System Benchmarking: It *IS* Rocket Science
Seltzer, Margo I.; Tarasov, Vasily; Bhanage, Saumitra; Zadok, Erez
2011-01-01
The quality of file system benchmarking has not improved in over a decade of intense research spanning hundreds of publications. Researchers repeatedly use a wide range of poorly designed benchmarks, and in most cases, develop their own ad-hoc benchmarks. Our community lacks a definition of what we want to benchmark in a file system. We propose several dimensions of file system benchmarking and review the wide range of tools and techniques in widespread use. We experimentally show that even t...
The KMAT: Benchmarking Knowledge Management.
de Jager, Martha
Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…
Sphere Lower Bound for Rotated Lattice Constellations in Fading Channels
Fabregas, Albert Guillen i
2007-01-01
We study the error probability performance of rotated lattice constellations in frequency-flat Nakagami-$m$ block-fading channels. In particular, we use the sphere lower bound on the underlying infinite lattice as a performance benchmark. We show that the sphere lower bound has full diversity. We observe that optimally rotated lattices with largest known minimum product distance perform very close to the lower bound, while the ensemble of random rotations is shown to lack diversity and perform far from it.
Benchmarking in Mobarakeh Steel Company
Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati
2008-01-01
Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...
Benchmarking in Mobarakeh Steel Company
Directory of Open Access Journals (Sweden)
Sasan Ghasemi
2008-05-01
Full Text Available Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.
PNNL Information Technology Benchmarking
Energy Technology Data Exchange (ETDEWEB)
DD Hostetler
1999-09-08
Benchmarking is a methodology for searching out industry best practices that lead to superior performance. It is exchanging information, not just with any organization, but with organizations known to be the best within PNNL, in industry, or in dissimilar industries with equivalent functions. It is used as a continuous improvement tool for business and technical processes, products, and services. Information technology--comprising all computer and electronic communication products and services--underpins the development and/or delivery of many PNNL products and services. This document describes the Pacific Northwest National Laboratory's (PNNL's) approach to information technology (IT) benchmarking. The purpose is to engage other organizations in the collaborative process of benchmarking in order to improve the value of IT services provided to customers. The document's intended audience consists of other US Department of Energy (DOE) national laboratories and their IT staff. Although the individual participants must define the scope of collaborative benchmarking, an outline of IT service areas for possible benchmarking is described.
Donnellan, Thomas; Maxwell, E A; Plumpton, C
1968-01-01
Lattice Theory presents an elementary account of a significant branch of contemporary mathematics concerning lattice theory. This book discusses the unusual features, which include the presentation and exploitation of partitions of a finite set. Organized into six chapters, this book begins with an overview of the concept of several topics, including sets in general, the relations and operations, the relation of equivalence, and the relation of congruence. This text then defines the relation of partial order and then partially ordered sets, including chains. Other chapters examine the properti
Benchmarking Pthreads performance
Energy Technology Data Exchange (ETDEWEB)
May, J M; de Supinski, B R
1999-04-27
The importance of the performance of threads libraries is growing as clusters of shared memory machines become more popular. POSIX threads, or Pthreads, is an industry-standard threads library. We have implemented the first Pthreads benchmark suite. In addition to measuring basic thread functions, such as thread creation, we apply the LogP model to standard Pthreads communication mechanisms. We present the results of our tests for several hardware platforms. These results demonstrate that the performance of existing Pthreads implementations varies widely; parts of nearly all of these implementations could be further optimized. Since hardware differences do not fully explain these performance variations, optimizations could improve the implementations. SKaMPI is an MPI benchmark suite that provides a general framework for performance analysis [7]. SKaMPI does not exhaustively test the MPI standard. Instead, it
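A microbenchmark of the kind described above can be sketched in a few lines. The example below times thread creation and join using Python's threading module (which wraps Pthreads on POSIX systems) rather than the authors' C suite, so the absolute numbers are illustrative only.

```python
import threading
import time

# Minimal sketch of one measurement from a thread-library microbenchmark:
# the cost of creating, starting, and joining a thread that does no work.
def noop():
    pass

N = 200  # number of create+join cycles to average over
start = time.perf_counter()
for _ in range(N):
    t = threading.Thread(target=noop)
    t.start()
    t.join()
elapsed = time.perf_counter() - start
per_thread_us = elapsed / N * 1e6
print(f"create+join: {per_thread_us:.1f} us per thread")
```

A real suite would separate creation from joining, vary thread counts, and pin threads to cores; this sketch only shows the measurement pattern.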
DEFF Research Database (Denmark)
Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela
survival? The analysis is based on a matched employer-employee dataset and covers about 17,500 startups in manufacturing and services. We adopt a new procedure to estimate individual benchmarks for the quantity and quality of initial human resources, acknowledging correlations between hiring decisions...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...
Benchmarking for Best Practice
Zairi, Mohamed
1998-01-01
Benchmarking for Best Practice uses up-to-the-minute case-studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. .It is also an ideal textbook on the applications of TQM since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area l
HPCS HPCchallenge Benchmark Suite
2007-11-02
measured HPCchallenge Benchmark performance on various HPC architectures — from Cray X1s to Beowulf clusters — in the presentation and paper... using the updated results at http://icl.cs.utk.edu/hpcc/hpcc_results.cgi Even a small percentage of random
Benchmarking Danish Industries
DEFF Research Database (Denmark)
Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette
2003-01-01
compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...
Western Interstate Commission for Higher Education, 2013
2013-01-01
Benchmarks: WICHE Region 2012 presents information on the West's progress in improving access to, success in, and financing of higher education. The information is updated annually to monitor change over time and encourage its use as a tool for informed discussion in policy and education communities. To establish a general context for the…
Bers, Trudy
2012-01-01
Surveys and benchmarks continue to grow in importance for community colleges in response to several factors. One is the press for accountability, that is, for colleges to report the outcomes of their programs and services to demonstrate their quality and prudent use of resources, primarily to external constituents and governing boards at the state…
Benchmarking and Performance Management
Directory of Open Access Journals (Sweden)
Adrian TANTAU
2010-12-01
Full Text Available The relevance of the chosen topic is explained by the meaning of the firm efficiency concept: firm efficiency means the revealed performance (how well the firm performs in the actual market environment), given the basic characteristics of the firms and their markets that are expected to drive their profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality, or work organization; other factors can also play a role even if they are not directly observed by the researcher. The critical need for managers to continuously improve their firm's efficiency and effectiveness, and to know the success factors and competitiveness determinants, consequently determines what performance measures are most critical to the firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking of firm-level performance are critical interdependent activities. Firm-level variables, used to infer performance, are often interdependent due to operational reasons. Hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other types of profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark it.
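As a toy illustration of ratio-based peer benchmarking, the sketch below ranks one firm's profitability ratio within a peer group; the firm names and figures are invented, and the paper's actual econometric models are far richer than this.

```python
# Hypothetical peer group of return-on-assets ratios (all values invented).
peers = {"A": 0.12, "B": 0.08, "C": 0.15, "D": 0.05, "E": 0.10}

firm = "B"
ratio = peers[firm]

# Percentile rank of the firm within its peer group: the share of peers
# with a strictly lower ratio.
below = sum(1 for v in peers.values() if v < ratio)
percentile = below / len(peers) * 100
print(f"{firm}: return-on-assets {ratio:.2f}, {percentile:.0f}th percentile of peers")
```

Here firm B sits at the 20th percentile, flagging it as a candidate for studying what the higher-performing peers do differently.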
Benchmarking i den offentlige sektor
DEFF Research Database (Denmark)
Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels
2008-01-01
In this article, we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then describe in more detail what benchmarking is, based on four different applications of benchmarking. The regulation of utility companies is treated, after which...
Energy Technology Data Exchange (ETDEWEB)
Jaenisch, G.-R., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Deresch, A., E-mail: Gerd-Ruediger.Jaenisch@bam.de; Bellon, C., E-mail: Gerd-Ruediger.Jaenisch@bam.de [Federal Institute for Materials Research and Testing, Unter den Eichen 87, 12205 Berlin (Germany); Schumm, A.; Lucet-Sanchez, F.; Guerin, P. [EDF R and D, 1 avenue du Général de Gaulle, 92141 Clamart (France)
2015-03-31
The purpose of the 2014 WFNDEC RT benchmark study was to compare predictions of various models of radiographic techniques, in particular those that predict the contribution of scattered radiation. All calculations were carried out for homogeneous materials and a mono-energetic X-ray point source in the energy range between 100 keV and 10 MeV. The calculations were to include the best physics approach available considering electron binding effects. Secondary effects like X-ray fluorescence and bremsstrahlung production were to be taken into account if possible. The problem to be considered had two parts. Part I examined the spectrum and the spatial distribution of radiation behind a single iron plate. Part II considered two equally sized plates, made of iron and aluminum respectively, only evaluating the spatial distribution. Here we present the results of the above benchmark study, comparing them to MCNP as the assumed reference model. The possible origins of the observed deviations are discussed.
Breuel, Thomas M.
2015-01-01
LSTM (Long Short-Term Memory) recurrent neural networks have been highly successful in a number of application areas. This technical report describes the use of the MNIST and UW3 databases for benchmarking LSTM networks and explores the effect of different architectural and hyperparameter choices on performance. Significant findings include: (1) LSTM performance depends smoothly on learning rates, (2) batching and momentum has no significant effect on performance, (3) softmax training outperf...
Energy Technology Data Exchange (ETDEWEB)
Schaefer, Stefan [DESY (Germany). Neumann Inst. for Computing
2016-11-01
These configurations are currently in use in many ongoing projects carried out by researchers throughout Europe. In particular, this data will serve as an essential input into the computation of the coupling constant of QCD, where some of the simulations are still ongoing. Projects computing the masses of hadrons and investigating their structure are also underway, as are activities in the physics of heavy quarks. As this initial project of gauge field generation has been successful, it is worthwhile to extend the currently available ensembles with further points in parameter space. These will allow further study and control of systematic effects like the ones introduced by the finite volume, the non-physical quark masses and the finite lattice spacing. In particular, certain compromises have still been made in the region where pion masses and lattice spacing are both small. This is because physical pion masses require larger lattices to keep the effects of the finite volume under control. At light pion masses, a precise control of the continuum extrapolation is therefore difficult, but certainly a main goal of future simulations. To reach this goal, algorithmic developments as well as faster hardware will be needed.
Benchmarking: applications to transfusion medicine.
Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M
2012-10-01
Benchmarking is as a structured continuous collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal.
DEFF Research Database (Denmark)
Agrell, Per J.; Bogetoft, Peter
Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators....... The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques....... In this paper, we review the modern foundations for frontier-based regulation and we discuss its actual use in several jurisdictions....
Hoppszallern, S
2001-01-01
Our fifth annual guide to benchmarking under managed care presents data that is a study in market dynamics and adaptation. New this year are financial indicators on HMOs exiting the market and those remaining. Hospital financial ratios and details on department performance are included. The physician group practice numbers show why physicians are scrutinizing capitated payments. Overall, hospitals in markets with high managed care penetration are more successful in managing labor costs and show productivity gains in imaging services, physical therapy and materials management.
Benchmarking Query Execution Robustness
Wiener, Janet L.; Kuno, Harumi; Graefe, Goetz
Benchmarks that focus on running queries on a well-tuned database system ignore a long-standing problem: adverse runtime conditions can cause database system performance to vary widely and unexpectedly. When the query execution engine does not exhibit resilience to these adverse conditions, addressing the resultant performance problems can contribute significantly to the total cost of ownership for a database system in over-provisioning, lost efficiency, and increased human administrative costs. For example, focused human effort may be needed to manually invoke workload management actions or fine-tune the optimization of specific queries.
Dual Lattice of ℤ-module Lattice
Directory of Open Access Journals (Sweden)
Futa Yuichi
2017-07-01
Full Text Available In this article, we formalize in Mizar [5] the definition of the dual lattice and its properties. We formally prove that the set of all dual vectors in a rational lattice has the structure of a lattice. We show that a dual basis can be calculated from the elements of the inverse of the Gram matrix. We also formalize a summation of inner products and their properties. Lattices of ℤ-modules are necessary for lattice problems, the LLL (Lenstra, Lenstra and Lovász) basis reduction algorithm, and cryptographic systems based on lattices [20], [10] and [19].
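The Gram-matrix construction mentioned above can be illustrated numerically. The sketch below (plain Python with exact rationals, and a made-up 2-D basis) computes the dual basis as G⁻¹B and checks the defining property ⟨d_i, b_j⟩ = δ_ij.

```python
from fractions import Fraction

# Hypothetical 2-D lattice basis (rows are basis vectors), chosen for illustration.
b1 = [Fraction(2), Fraction(1)]
b2 = [Fraction(1), Fraction(3)]
basis = [b1, b2]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

# Gram matrix G[i][j] = <b_i, b_j>.
G = [[dot(b1, b1), dot(b1, b2)],
     [dot(b2, b1), dot(b2, b2)]]

# Exact inverse of the 2x2 Gram matrix via the adjugate formula.
det = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[ G[1][1] / det, -G[0][1] / det],
        [-G[1][0] / det,  G[0][0] / det]]

# Dual basis: d_i = sum_j (G^-1)[i][j] * b_j, so that <d_i, b_j> = delta_ij.
dual = [[sum(Ginv[i][j] * basis[j][k] for j in range(2)) for k in range(2)]
        for i in range(2)]

# Verify the defining property of the dual basis.
for i in range(2):
    for j in range(2):
        expected = Fraction(1) if i == j else Fraction(0)
        assert dot(dual[i], basis[j]) == expected
```

For a square, full-rank basis this agrees with taking the inverse transpose of the basis matrix; the Gram-matrix route is the one that generalizes to bases given only through inner products.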
Benchmarking concentrating photovoltaic systems
Duerr, Fabian; Muthirayan, Buvaneshwari; Meuret, Youri; Thienpont, Hugo
2010-08-01
Integral to photovoltaics is the need to provide improved economic viability. To achieve this goal, photovoltaic technology has to be able to harness more light at less cost. A large variety of concentrating photovoltaic concepts has provided cause for pursuit. To obtain a detailed profitability analysis, a flexible evaluation is crucial for benchmarking the cost-performance of this variety of concentrating photovoltaic concepts. To save time and capital, a way to estimate the cost-performance of a complete solar energy system is to use computer aided modeling. In this work a benchmark tool is introduced based on a modular programming concept. The overall implementation is done in MATLAB whereas Advanced Systems Analysis Program (ASAP) is used for ray tracing calculations. This allows for a flexible and extendable structuring of all important modules, namely an advanced source modeling including time and local dependence, and an advanced optical system analysis of various optical designs to obtain an evaluation of the figure of merit. An important figure of merit, the energy yield for a given photovoltaic system at a geographical position over a specific period, can be calculated.
Entropy-based benchmarking methods
2012-01-01
We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth preservation method of Causey and Trager (1981) may violate this principle, while its requirements are explicitly taken into account in the proposed entropy-based benchmarking methods. Our illustrati...
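The sign-preservation issue is easy to demonstrate with a toy series. The sketch below spreads a benchmark discrepancy additively in equal shares (a simplified stand-in for difference-based adjustments, not the actual Denton variant) and flips the sign of a small negative entry; all numbers are invented.

```python
# Toy sign-volatile series (e.g. quarterly net flows) with one negative value.
series = [5.0, -1.0, 4.0, 6.0]
benchmark = 22.0                       # annual total from the benchmark source

# Spread the discrepancy additively in equal shares across the quarters.
discrepancy = benchmark - sum(series)  # 22 - 14 = 8
adjusted = [v + discrepancy / len(series) for v in series]  # add 2 to each

# The adjusted series hits the benchmark exactly...
assert abs(sum(adjusted) - benchmark) < 1e-9

# ...but the -1.0 entry has become +1.0: the signs of the original
# series are no longer reproduced.
signs_preserved = all((a > 0) == (s > 0) for a, s in zip(adjusted, series))
assert not signs_preserved
```

A movement-and-sign-preserving method would instead distribute the discrepancy so that small entries keep their sign, which is the requirement the entropy-based methods build in explicitly.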
HPC Benchmark Suite NMx Project
National Aeronautics and Space Administration — Intelligent Automation Inc., (IAI) and University of Central Florida (UCF) propose to develop a comprehensive numerical test suite for benchmarking current and...
Benchmarking foreign electronics technologies
Energy Technology Data Exchange (ETDEWEB)
Bostian, C.W.; Hodges, D.A.; Leachman, R.C.; Sheridan, T.B.; Tsang, W.T.; White, R.M.
1994-12-01
This report has been drafted in response to a request from the Japanese Technology Evaluation Center's (JTEC) Panel on Benchmarking Select Technologies. Since April 1991, the Competitive Semiconductor Manufacturing (CSM) Program at the University of California at Berkeley has been engaged in a detailed study of quality, productivity, and competitiveness in semiconductor manufacturing worldwide. The program is a joint activity of the College of Engineering, the Haas School of Business, and the Berkeley Roundtable on the International Economy, under sponsorship of the Alfred P. Sloan Foundation, and with the cooperation of semiconductor producers from Asia, Europe and the United States. Professors David A. Hodges and Robert C. Leachman are the project's Co-Directors. The present report for JTEC is primarily based on data and analysis drawn from that continuing program. The CSM program is being conducted by faculty, graduate students and research staff from UC Berkeley's Schools of Engineering and Business, and Department of Economics. Many of the participating firms are represented on the program's Industry Advisory Board. The Board played an important role in defining the research agenda. A pilot study was conducted in 1991 with the cooperation of three semiconductor plants. The research plan and survey documents were thereby refined. The main phase of the CSM benchmarking study began in mid-1992 and will continue at least through 1997. Reports are presented on the manufacture of integrated circuits; data storage; wireless technology; human-machine interfaces; and optoelectronics. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.
Benchmarking monthly homogenization algorithms
Directory of Open Access Journals (Sweden)
V. K. C. Venema
2011-08-01
Full Text Available The COST (European Cooperation in Science and Technology Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative. The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide trend was added.
Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii the error in linear trend estimates and (iii traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
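The break-insertion scheme described above can be sketched directly: breakpoints drawn from a Poisson process and break sizes from a normal distribution. The break rate (one per 180 months) and size spread (0.8) below are assumptions for illustration, not the HOME benchmark's actual settings.

```python
import random

random.seed(42)  # reproducible illustration

# One synthetic station series: a homogeneous "truth" signal...
n_months = 600
true_series = [random.gauss(10.0, 2.0) for _ in range(n_months)]

# ...with breakpoints from a Poisson process (exponentially distributed gaps,
# assumed mean of one break per 180 months).
breaks = []
t = random.expovariate(1.0 / 180.0)
while t < n_months:
    breaks.append(int(t))
    t += random.expovariate(1.0 / 180.0)

# Each break shifts every later value by a normally distributed offset
# (assumed standard deviation 0.8), mimicking e.g. a station relocation.
inhomogeneous = list(true_series)
for b in breaks:
    size = random.gauss(0.0, 0.8)
    for i in range(b, n_months):
        inhomogeneous[i] += size

# A homogenization algorithm would be scored on how well it recovers
# true_series (e.g. centered RMSE) given only the inhomogeneous series.
```

The full benchmark additionally correlates breaks across stations in a network and adds outliers, missing periods, and local trends, none of which this sketch attempts.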
Staff Association
2017-01-01
On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...
Benchmark for Strategic Performance Improvement.
Gohlke, Annette
1997-01-01
Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)
Internal Benchmarking for Institutional Effectiveness
Ronco, Sharron L.
2012-01-01
Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…
Entropy-based benchmarking methods
Temurshoev, Umed
2012-01-01
We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth pre
Applications of Integral Benchmark Data
Energy Technology Data Exchange (ETDEWEB)
Giuseppe Palmiotti; Teruhiko Kugo; Fitz Trumble; Albert C. (Skip) Kahler; Dale Lancaster
2014-10-09
The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) provide evaluated integral benchmark data that may be used for validation of reactor physics / nuclear criticality safety analytical methods and data, nuclear data testing, advanced modeling and simulation, and safety analysis licensing activities. The handbooks produced by these programs are used in over 30 countries. Five example applications are presented in this paper: (1) Use of IRPhEP Data in Uncertainty Analyses and Cross Section Adjustment, (2) Uncertainty Evaluation Methods for Reactor Core Design at JAEA Using Reactor Physics Experimental Data, (3) Application of Benchmarking Data to a Broad Range of Criticality Safety Problems, (4) Cross Section Data Testing with ICSBEP Benchmarks, and (5) Use of the International Handbook of Evaluated Reactor Physics Benchmark Experiments to Support the Power Industry.
Benchmarking & European Sustainable Transport Policies
DEFF Research Database (Denmark)
Gudmundsson, H.
2003-01-01
Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts...... to support Sustainable European Transport Policies. The key message is that transport benchmarking has not yet been developed to cope with the challenges of this task. Rather than backing down completely, the paper suggests some critical conditions for applying and adopting benchmarking for this purpose. One...... way forward is to ensure a higher level of environmental integration in transport policy benchmarking. To this effect the paper will discuss the possible role of the socalled Transport and Environment Reporting Mechanism developed by the European Environment Agency. The paper provides an independent...
Benchmarking and Sustainable Transport Policy
DEFF Research Database (Denmark)
Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy
2004-01-01
...... Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for sustainable transport. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly sustainable transport evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly policies are not directly comparable across space and context. For these reasons attempting to benchmark sustainable transport policies against one another would be a highly complex task, which is generally not advised. Several other ways in which benchmarking and policy can support one another are identified in the analysis. This leads to a range of recommended initiatives to exploit the benefits of benchmarking in transport while avoiding some of the lurking pitfalls and dead ends...
DEFF Research Database (Denmark)
Santocanale, Luigi
2002-01-01
A μ-lattice is a lattice with the property that every unary polynomial has both a least and a greatest fix-point. In this paper we define the quasivariety of μ-lattices and, for a given partially ordered set P, we construct a μ-lattice JP whose elements are equivalence classes of games in a preor...
Benchmarking of energy time series
Energy Technology Data Exchange (ETDEWEB)
Williamson, M.A.
1990-04-01
Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
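The core operation surveyed in the report can be illustrated with the simplest adjustment technique: pro-rata benchmarking, which scales a monthly series so that each year's total matches the annual benchmark. The sketch below is a generic illustration with invented numbers, not the EIA procedure itself (the report recommends a first-differencing approach for the coal series):

```python
import numpy as np

def prorata_benchmark(monthly, annual_benchmarks):
    """Adjust a monthly series so that each calendar year sums to its
    annual benchmark, distributing the discrepancy proportionally."""
    monthly = np.asarray(monthly, dtype=float).reshape(-1, 12)
    yearly_totals = monthly.sum(axis=1)
    factors = np.asarray(annual_benchmarks, dtype=float) / yearly_totals
    return (monthly * factors[:, None]).ravel()

# Toy example: one year of preliminary estimates that undercount the
# benchmark by 10% (sum = 120 versus a benchmark of 132).
raw = [10.0] * 12
adjusted = prorata_benchmark(raw, [132.0])
print(adjusted.sum())  # 132.0, matches the annual benchmark
```

Pro-rata scaling can introduce step discontinuities at year boundaries, which is one motivation for the first-difference-based methods the report discusses.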
Benchmarking in academic pharmacy departments.
Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann
2010-10-11
Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.
Benchmarking biofuels; Biobrandstoffen benchmarken
Energy Technology Data Exchange (ETDEWEB)
Croezen, H.; Kampman, B.; Bergsma, G.
2012-03-15
A sustainability benchmark for transport biofuels has been developed and used to evaluate the various biofuels currently on the market. For comparison, electric vehicles, hydrogen vehicles and petrol/diesel vehicles were also included. A range of studies as well as growing insight are making it ever clearer that biomass-based transport fuels may have just as big a carbon footprint as fossil fuels like petrol or diesel, or even bigger. At the request of Greenpeace Netherlands, CE Delft has brought together current understanding on the sustainability of fossil fuels, biofuels and electric vehicles, with particular focus on the performance of the respective energy carriers on three sustainability criteria, with the first weighing the heaviest: (1) Greenhouse gas emissions; (2) Land use; and (3) Nutrient consumption.
Correlational effect size benchmarks.
Bosco, Frank A; Aguinis, Herman; Singh, Kulraj; Field, James G; Pierce, Charles A
2015-03-01
Effect size information is essential for the scientific enterprise and plays an increasingly central role in the scientific process. We extracted 147,328 correlations and developed a hierarchical taxonomy of variables reported in Journal of Applied Psychology and Personnel Psychology from 1980 to 2010 to produce empirical effect size benchmarks at the omnibus level, for 20 common research domains, and for an even finer grained level of generality. Results indicate that the usual interpretation and classification of effect sizes as small, medium, and large bear almost no resemblance to findings in the field, because distributions of effect sizes exhibit tertile partitions at values approximately one-half to one-third those intuited by Cohen (1988). Our results offer information that can be used for research planning and design purposes, such as producing better informed non-nil hypotheses and estimating statistical power and planning sample size accordingly. We also offer information useful for understanding the relative importance of the effect sizes found in a particular study in relationship to others and which research domains have advanced more or less, given that larger effect sizes indicate a better understanding of a phenomenon. Also, our study offers information about research domains for which the investigation of moderating effects may be more fruitful and provide information that is likely to facilitate the implementation of Bayesian analysis. Finally, our study offers information that practitioners can use to evaluate the relative effectiveness of various types of interventions. PsycINFO Database Record (c) 2015 APA, all rights reserved.
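The tertile-based benchmarks described above can be reproduced mechanically: pool the absolute correlations for a research domain and take the 33.3rd and 66.7th percentiles as the small/medium and medium/large cut points. A sketch on a synthetic, hypothetical distribution (the study's 147,328 published correlations are not reproduced here):

```python
import numpy as np

# Hypothetical set of absolute correlations for one research domain;
# a right-skewed toy distribution in which small effects dominate.
rng = np.random.default_rng(0)
abs_r = rng.beta(1.5, 6.0, size=10_000)

# Empirical "small / medium / large" boundaries = tertile cut points.
small_medium, medium_large = np.percentile(abs_r, [100 / 3, 200 / 3])
print(round(small_medium, 2), round(medium_large, 2))
```

With real pooled data, comparing these empirical cut points against Cohen's conventional 0.10/0.30/0.50 values makes the paper's point concrete: field distributions typically partition at much smaller values.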
Benchmarking in water project analysis
Griffin, Ronald C.
2008-11-01
The with/without principle of cost-benefit analysis is examined for the possible bias that it brings to water resource planning. Theory and examples for this question are established. Because benchmarking against the demonstrably low without-project hurdle can detract from economic welfare and can fail to promote efficient policy, improvement opportunities are investigated. In lieu of the traditional, without-project benchmark, a second-best-based "difference-making benchmark" is proposed. The project authorizations and modified review processes instituted by the U.S. Water Resources Development Act of 2007 may provide for renewed interest in these findings.
Precise determination of lattice phase shifts and mixing angles
Lu, Bing-Nan; Lähde, Timo A.; Lee, Dean; Meißner, Ulf-G.
2016-09-01
We introduce a general and accurate method for determining lattice phase shifts and mixing angles, which is applicable to arbitrary, non-cubic lattices. Our method combines angular momentum projection, spherical wall boundaries and an adjustable auxiliary potential. This allows us to construct radial lattice wave functions and to determine phase shifts at arbitrary energies. For coupled partial waves, we use a complex-valued auxiliary potential that breaks time-reversal invariance. We benchmark our method using a system of two spin-1/2 particles interacting through a finite-range potential with a strong tensor component. We are able to extract phase shifts and mixing angles for all angular momenta and energies, with precision greater than that of extant methods. We discuss a wide range of applications from nuclear lattice simulations to optical lattice experiments.
Energy Technology Data Exchange (ETDEWEB)
D' Hondt, P. [SCK.CEN, Mol (Belgium); Gehin, J. [ORNL, Oak Ridge, TN (United States); Na, B.C.; Sartori, E. [Organisation for Economic Co-Operation and Development, Nuclear Energy Agency, 92 - Issy les Moulineaux (France); Wiesenack, W. [Organisation for Economic Co-Operation and Development/HRP, Halden (Norway)
2001-07-01
One of the options envisaged for disposing of weapons-grade plutonium, declared surplus for national defence in the Russian Federation and the USA, is to burn it in nuclear power reactors. The scientific and technical know-how accumulated in the use of MOX as a fuel for electricity generation is of great relevance for the plutonium disposition programmes. An Expert Group of the OECD/NEA is carrying out a series of benchmarks with the aim of facilitating the use of this know-how for meeting this objective. This paper describes the background that led to establishing the Expert Group, and the present status of results from these benchmarks. The benchmark studies cover a theoretical reactor physics benchmark on a VVER-1000 core loaded with MOX, two experimental benchmarks on MOX lattices, and a benchmark concerned with MOX fuel behaviour for both solid and hollow pellets. First conclusions are outlined, as well as future work. (author)
Water Level Superseded Benchmark Sheets
National Oceanic and Atmospheric Administration, Department of Commerce — Images of National Coast & Geodetic Survey (now NOAA's National Geodetic Survey/NGS) tidal benchmarks which have been superseded by new markers or locations....
Benchmark simulation models, quo vadis?
DEFF Research Database (Denmark)
Jeppsson, U.; Alex, J; Batstone, D. J.
2013-01-01
As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...... and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work...... already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity. © IWA Publishing 2013.
Performance Targets and External Benchmarking
DEFF Research Database (Denmark)
Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.
Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rates systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advances. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions upon which the market mechanism is performing within organizations. This paper aims to contribute to research by providing more insight to the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...... towards the conditions for the use of the external benchmarks we provide more insights to some of the issues and challenges that are related to using this mechanism for performance management and advance competitiveness in organizations.
Research on computer systems benchmarking
Smith, Alan Jay (Principal Investigator)
1996-01-01
This grant addresses the topic of research on computer systems benchmarking and is more generally concerned with performance issues in computer systems. This report reviews work in those areas during the period of NASA support under this grant. The bulk of the work performed concerned benchmarking and analysis of CPUs, compilers, caches, and benchmark programs. The first part of this work concerned the issue of benchmark performance prediction. A new approach to benchmarking and machine characterization was reported, using a machine characterizer that measures the performance of a given system in terms of a Fortran abstract machine. Another report focused on analyzing compiler performance; the performance impact of optimization was assessed within the same abstract-machine methodology used for CPU performance characterization. Benchmark programs are analyzed in another paper. A machine-independent model of program execution was developed to characterize both machine performance and program execution. By merging these machine and program characterizations, execution time can be estimated for arbitrary machine/program combinations. The work was continued into the domain of parallel and vector machines, including the issue of caches in vector processors and multiprocessors. All of the aforementioned accomplishments are summarized in more detail in this report, as well as those smaller in magnitude supported by this grant.
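The machine-characterizer idea reduces to a linear model: characterize the machine as per-operation costs, characterize the program as abstract-machine operation counts, and predict execution time as their inner product. A minimal sketch, with all operation names and timings invented for illustration:

```python
# Machine characterization: cost (in nanoseconds) of each abstract-machine
# operation on a given system. All numbers here are invented.
machine_ns_per_op = {"fadd": 2.0, "fmul": 3.0, "load": 4.0, "branch": 1.5}

# Program characterization: how many times the program executes each
# abstract-machine operation (also invented).
program_op_counts = {"fadd": 5e8, "fmul": 3e8, "load": 9e8, "branch": 2e8}

# Predicted runtime = dot product of counts and per-operation costs.
predicted_s = sum(machine_ns_per_op[op] * program_op_counts[op]
                  for op in program_op_counts) * 1e-9
print(f"predicted runtime: {predicted_s:.2f} s")  # prints: predicted runtime: 5.80 s
```

Merging independent machine and program characterizations this way is what allows runtime estimates for arbitrary machine/program combinations, as the report describes.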
Hybrid lattice Boltzmann method on overlapping grids.
Di Ilio, G; Chiappini, D; Ubertini, S; Bella, G; Succi, S
2017-01-01
In this work, a hybrid lattice Boltzmann method (HLBM) is proposed, in which the standard lattice Boltzmann implementation based on the Bhatnagar-Gross-Krook (LBGK) approximation is combined with an unstructured finite-volume lattice Boltzmann model. The method is constructed on an overlapping grid system, which allows the coexistence of a uniform lattice node spacing and a coordinate-free lattice structure. The natural adaptivity of the hybrid grid system makes the method particularly suitable for problems involving complex geometries. Moreover, the scheme ensures a high-accuracy solution near walls, given the capability of the unstructured submodel of achieving the desired level of refinement in a very flexible way. For these reasons, the HLBM represents a prospective tool for solving multiscale problems. The proposed method is applied here to the benchmark problem of two-dimensional flow past a circular cylinder for a wide range of Reynolds numbers, and its numerical performance is measured and compared with that of the standard LBGK method.
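The LBGK submodel mentioned above relaxes the particle distribution functions toward a local equilibrium with a single relaxation time tau. A minimal sketch of the collision step on the D2Q9 lattice follows; this is the generic textbook formulation, not the paper's hybrid implementation:

```python
import numpy as np

# D2Q9 lattice weights and discrete velocities.
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]], dtype=float)

def equilibrium(rho, u):
    """Second-order Maxwell-Boltzmann equilibrium for density rho, velocity u."""
    eu = e @ u
    return w * rho * (1 + 3 * eu + 4.5 * eu**2 - 1.5 * (u @ u))

def bgk_collide(f, tau=0.8):
    """Single-relaxation-time (BGK) collision at one lattice node."""
    rho = f.sum()               # zeroth moment: density
    u = (e.T @ f) / rho         # first moment: velocity
    return f + (equilibrium(rho, u) - f) / tau

f0 = equilibrium(1.0, np.array([0.05, 0.0]))   # start at equilibrium
print(np.allclose(bgk_collide(f0), f0))        # True: equilibrium is a fixed point
```

The equilibrium shares the distribution's first two moments, so the collision conserves mass and momentum and leaves an equilibrium state unchanged, which the final check confirms.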
Campos, Rafael G.; Tututi, Eduardo S.
2002-01-01
It is shown that the nonlocal Dirac operator yielded by a lattice model that preserves chiral symmetry and uniqueness of fields approaches an ultralocal, translation-invariant operator as the lattice spacing tends to zero.
New integrable lattice hierarchies
Energy Technology Data Exchange (ETDEWEB)
Pickering, Andrew [Area de Matematica Aplicada, ESCET, Universidad Rey Juan Carlos, c/ Tulipan s/n, 28933 Mostoles, Madrid (Spain); Zhu Zuonong [Departamento de Matematicas, Universidad de Salamanca, Plaza de la Merced 1, 37008 Salamanca (Spain) and Department of Mathematics, Shanghai Jiao Tong University, Shanghai 200030 (China)]. E-mail: znzhu2@yahoo.com.cn
2006-01-23
In this Letter we give a new integrable four-field lattice hierarchy, associated to a new discrete spectral problem. We obtain our hierarchy as the compatibility condition of this spectral problem and an associated equation, constructed herein, for the time-evolution of eigenfunctions. We consider reductions of our hierarchy, which also of course admit discrete zero curvature representations, in detail. We find that our hierarchy includes many well-known integrable hierarchies as special cases, including the Toda lattice hierarchy, the modified Toda lattice hierarchy, the relativistic Toda lattice hierarchy, and the Volterra lattice hierarchy. We also obtain here a new integrable two-field lattice hierarchy, to which we give the name of Suris lattice hierarchy, since the first equation of this hierarchy has previously been given by Suris. The Hamiltonian structure of the Suris lattice hierarchy is obtained by means of a trace identity formula.
Sober Topological Molecular Lattices
Institute of Scientific and Technical Information of China (English)
张德学; 李永明
2003-01-01
A topological molecular lattice (TML) is a pair (L, τ), where L is a completely distributive lattice and τ is a subframe of L. There is an obvious forgetful functor from the category TML of TMLs to the category Loc of locales. In this note, it is shown that this forgetful functor has a right adjoint. Then, by this adjunction, a special kind of topological molecular lattices, called sober topological molecular lattices, is introduced and investigated.
Atkinson, D; van Steenwijk, F.J.
The resistance between two arbitrary nodes in an infinite square lattice of identical resistors is calculated. The method is generalized to infinite triangular and hexagonal lattices in two dimensions, and also to infinite cubic and hypercubic lattices in three and more dimensions. (C) 1999 American
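For the square lattice of unit resistors, the node-to-node resistance has a standard double-integral representation that is easy to evaluate numerically: R(m, n) = (1/4π²) ∫∫ [1 − cos(mα + nβ)] / (2 − cos α − cos β) dα dβ over [−π, π]². A quick numerical check of this generic formulation, consistent with the known values R(1,0) = 1/2 and R(1,1) = 2/π:

```python
import numpy as np

def lattice_resistance(m, n, N=400):
    """Resistance between nodes (0,0) and (m,n) in an infinite square
    lattice of 1-ohm resistors, via midpoint-rule quadrature of the
    standard double-integral representation. The integrand is bounded
    (the apparent singularity at the origin is removable)."""
    h = 2 * np.pi / N
    a = -np.pi + (np.arange(N) + 0.5) * h          # midpoints avoid (0, 0)
    A, B = np.meshgrid(a, a, indexing="ij")
    integrand = (1 - np.cos(m * A + n * B)) / (2 - np.cos(A) - np.cos(B))
    return integrand.sum() * h * h / (4 * np.pi**2)

print(round(lattice_resistance(1, 0), 4))   # ~0.5 (adjacent nodes)
print(round(lattice_resistance(1, 1), 4))   # ~2/pi, about 0.6366 (diagonal)
```

The adjacent-node value 1/2 also follows directly from symmetry: averaging the integrand over the α↔β exchange reduces the numerator to half the denominator.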
Lattice Regularization and Symmetries
Hasenfratz, Peter; Von Allmen, R; Allmen, Reto von; Hasenfratz, Peter; Niedermayer, Ferenc
2006-01-01
Finding the relation between the symmetry transformations in the continuum and on the lattice might be a nontrivial task as illustrated by the history of chiral symmetry. Lattice actions induced by a renormalization group procedure inherit all symmetries of the continuum theory. We give a general procedure which gives the corresponding symmetry transformations on the lattice.
Benchmarking of human resources management
Directory of Open Access Journals (Sweden)
David M. Akinnusi
2008-12-01
Full Text Available This paper reviews the role of human resource management (HRM which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.
Benchmark simulation models, quo vadis?
Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D
2013-01-01
As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.
A benchmark for non-covalent interactions in solids.
Otero-de-la-Roza, A; Johnson, Erin R
2012-08-01
A benchmark for non-covalent interactions in solids (C21) based on the experimental sublimation enthalpies and geometries of 21 molecular crystals is presented. Thermal and zero-point effects are carefully accounted for and reference lattice energies and thermal pressures are provided, which allow dispersion-corrected density functionals to be assessed in a straightforward way. Other thermal corrections to the sublimation enthalpy (the 2RT term) are reexamined. We compare the recently implemented exchange-hole dipole moment (XDM) model with other approaches in the literature to find that XDM roughly doubles the accuracy of DFT-D2 and non-local functionals in computed lattice energies (4.8 kJ/mol mean absolute error) while, at the same time, predicting cell geometries within less than 2% of the experimental result on average. The XDM model of dispersion interactions is confirmed as a very promising approach in solid-state applications.
Randomized benchmarking of multiqubit gates.
Gaebler, J P; Meier, A M; Tan, T R; Bowler, R; Lin, Y; Hanneke, D; Jost, J D; Home, J P; Knill, E; Leibfried, D; Wineland, D J
2012-06-29
We describe an extension of single-qubit gate randomized benchmarking that measures the error of multiqubit gates in a quantum information processor. This platform-independent protocol evaluates the performance of Clifford unitaries, which form a basis of fault-tolerant quantum computing. We implemented the benchmarking protocol with trapped ions and found an error per random two-qubit Clifford unitary of 0.162±0.008, thus setting the first benchmark for such unitaries. By implementing a second set of sequences with an extra two-qubit phase gate inserted after each step, we extracted an error per phase gate of 0.069±0.017. We conducted these experiments with transported, sympathetically cooled ions in a multizone Paul trap, a system that can in principle be scaled to larger numbers of ions.
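In randomized benchmarking, the error rate is extracted by fitting the decay of sequence fidelity with sequence length m to F(m) = A·p^m + B and converting the depolarizing parameter p into an average error per Clifford, r = (1 − p)(d − 1)/d, with d = 4 for two qubits. A sketch of that analysis on synthetic data (all numbers below are invented for illustration, not the paper's measurements; B is assumed known to keep the fit linear):

```python
import numpy as np

# Synthetic fidelity-decay data: F(m) = A * p**m + B plus small noise.
rng = np.random.default_rng(1)
A, B, p_true = 0.75, 0.25, 0.80
m = np.arange(1, 21)
F = A * p_true**m + B + rng.normal(0, 0.001, m.size)

# With B known, log(F - B) is linear in m with slope log(p).
slope, intercept = np.polyfit(m, np.log(F - B), 1)
p_fit = np.exp(slope)

# Error per Clifford for a two-qubit system (d = 4).
r = (1 - p_fit) * (4 - 1) / 4
print(round(p_fit, 3), round(r, 3))
```

In practice A, B and p are fit jointly with nonlinear least squares, and the protocol's key property is that r is insensitive to state-preparation and measurement errors, which are absorbed into A and B.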
Wilson Dslash Kernel From Lattice QCD Optimization
Energy Technology Data Exchange (ETDEWEB)
Joo, Balint [Jefferson Lab, Newport News, VA; Smelyanskiy, Mikhail [Parallel Computing Lab, Intel Corporation, California, USA; Kalamkar, Dhiraj D. [Parallel Computing Lab, Intel Corporation, India; Vaidyanathan, Karthikeyan [Parallel Computing Lab, Intel Corporation, India
2015-07-01
Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in theoretical nuclear and high energy physics. LQCD is traditionally one of the first applications ported to many new high performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g. 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal to illustrate several optimization techniques. In this chapter we detail our work in optimizing the Wilson-Dslash kernels for Intel Xeon Phi; however, as we will show, the technique gives excellent performance on regular Xeon architectures as well.
Radiation Detection Computational Benchmark Scenarios
Energy Technology Data Exchange (ETDEWEB)
Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.
2013-09-24
Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for
Perceptual hashing algorithms benchmark suite
Institute of Scientific and Technical Information of China (English)
Zhang Hui; Schmucker Martin; Niu Xiamu
2007-01-01
Numerous perceptual hashing algorithms have been developed for identification and verification of multimedia objects in recent years. Many application schemes have been adopted for various commercial objects. Developers and users are looking for a benchmark tool to compare and evaluate their current algorithms or technologies. In this paper, a novel benchmark platform is presented. PHABS provides an open framework and lets its users define their own test strategy, perform tests, collect and analyze test data. With PHABS, various performance parameters of algorithms can be tested, and different algorithms or algorithms with different parameters can be evaluated and compared easily.
Closed-loop neuromorphic benchmarks
CSIR Research Space (South Africa)
Stewart
2015-11-01
Full Text Available Closed-loop Neuromorphic Benchmarks. Terrence C. Stewart, Travis DeWolf and Chris Eliasmith (University of Waterloo, Canada); Ashley Kleinhans (Council for Scientific and Industrial Research, South Africa). Submitted to Journal: Frontiers in Neuroscience.
The contextual benchmark method: benchmarking e-government services
Jansen, Jurjen; Vries, de Sjoerd; Schaik, van Paul
2010-01-01
This paper offers a new method for benchmarking e-Government services. Government organizations no longer doubt the need to deliver their services on line. Instead, the question that is more relevant is how well the electronic services offered by a particular organization perform in comparison with
Benchmarking Internet of Things devices
CSIR Research Space (South Africa)
Kruger, CP
2014-07-01
Full Text Available International Conference on Industrial Informatics (INDIN), 27-30 July 2014. Benchmarking Internet of Things devices. C.P. Kruger and G.P. Hancke, Advanced Sensor Networks Research Group, Council for Scientific and Industrial Research, South...
Benchmarked Library Websites Comparative Study
Ramli, Rindra M.
2015-01-01
This presentation provides an analysis of services provided by the benchmarked library websites. The exploratory study includes comparison of these websites against a list of criterion and presents a list of services that are most commonly deployed by the selected websites. In addition to that, the investigators proposed a list of services that could be provided via the KAUST library website.
Engine Benchmarking - Final CRADA Report
Energy Technology Data Exchange (ETDEWEB)
Wallner, Thomas [Argonne National Lab. (ANL), Argonne, IL (United States)
2016-01-01
Detailed benchmarking of the powertrains of three light-duty vehicles was performed. Results were presented and provided to CRADA partners. The vehicles included a MY2011 Audi A4, a MY2012 Mini Cooper and a MY2014 Nissan Versa.
Benchmarking Universiteitsvastgoed: Managementinformatie bij vastgoedbeslissingen
Den Heijer, A.C.; De Vries, J.C.
2004-01-01
Before you lies the final report of the study "Benchmarking universiteitsvastgoed" (benchmarking university real estate). This report merges two partial products: the theory report (published in December 2003) and the practice report (published in January 2004). Topics in the theory part are the analysis of other
Benchmark Lisp And Ada Programs
Davis, Gloria; Galant, David; Lim, Raymond; Stutz, John; Gibson, J.; Raghavan, B.; Cheesema, P.; Taylor, W.
1992-01-01
Suite of nonparallel benchmark programs, ELAPSE, designed for three tests: comparing the efficiency of computer processing via Lisp vs. Ada; comparing the efficiencies of several computers processing via Lisp; or comparing several computers processing via Ada. Tests the efficiency with which a computer executes routines in each language. Available for any computer equipped with a validated Ada compiler and/or a Common Lisp system.
42 CFR 440.385 - Delivery of benchmark and benchmark-equivalent coverage through managed care entities
2010-10-01
Code of Federal Regulations, Title 42 (Public Health), 2010-10-01. GENERAL PROVISIONS, Benchmark Benefit and Benchmark-Equivalent Coverage, § 440.385: Delivery of benchmark and benchmark-equivalent coverage through managed care entities. In implementing benchmark or...
On Traveling Waves in Lattices: The Case of Riccati Lattices
Dimitrova, Zlatinka
2012-09-01
The method of simplest equation is applied to the analysis of a class of lattices described by differential-difference equations that admit traveling-wave solutions constructed on the basis of the solution of the Riccati equation. We denote such lattices as Riccati lattices. We search for Riccati lattices within two classes of lattices: generalized Lotka-Volterra lattices and generalized Holling lattices. We show that from the class of generalized Lotka-Volterra lattices only the Wadati lattice belongs to the class of Riccati lattices. In contrast, many lattices from the Holling class are Riccati lattices. We construct exact traveling-wave solutions on the basis of the solution of the Riccati equation for three members of the class of generalized Holling lattices.
Energy Technology Data Exchange (ETDEWEB)
Shindler, A. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC
2007-07-15
I review the theoretical foundations, properties, and simulation results obtained so far for a variant of the Wilson lattice QCD formulation: Wilson twisted mass lattice QCD. Emphasis is put on the discretization errors and on their effects on the phase structure for Wilson-like fermions in the chiral limit. The possibility of using different lattice actions for sea and valence quarks in lattice simulations, to ease the renormalization patterns of phenomenologically relevant local operators, is also discussed. (orig.)
Benchmarking clinical photography services in the NHS.
Arbon, Giles
2015-01-01
Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.
Benchmarking: Achieving the best in class
Energy Technology Data Exchange (ETDEWEB)
Kaemmerer, L
1996-05-01
Oftentimes, people find the process of organizational benchmarking an onerous task, or, because they do not fully understand the nature of the process, end up with results that are less than stellar. This paper presents the challenges of benchmarking and reasons why benchmarking can benefit an organization in today's economy.
The LDBC Social Network Benchmark: Interactive Workload
Erling, O.; Averbuch, A.; Larriba-Pey, J.; Chafi, H.; Gubichev, A.; Prat, A.; Pham, M.D.; Boncz, P.A.
2015-01-01
The Linked Data Benchmark Council (LDBC) is now two years underway and has gathered strong industrial participation for its mission to establish benchmarks, and benchmarking practices, for evaluating graph data management systems. The LDBC introduced a new choke-point driven methodology for developing
How Benchmarking and Higher Education Came Together
Levy, Gary D.; Ronco, Sharron L.
2012-01-01
This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…
Methodology for Benchmarking IPsec Gateways
Directory of Open Access Journals (Sweden)
Adam Tisovský
2012-08-01
Full Text Available The paper analyses forwarding performance of an IPsec gateway over the range of offered loads. It focuses on the forwarding rate and packet loss, particularly at the gateway’s performance peak and in the state of gateway overload. It explains possible performance degradation when the gateway is overloaded by excessive offered load. The paper further evaluates different approaches for obtaining forwarding performance parameters – the widely used throughput described in RFC 1242, the maximum forwarding rate with zero packet loss, and our proposed equilibrium throughput. According to our observations, equilibrium throughput might be the most universal parameter for benchmarking security gateways, as the others may depend on the duration of test trials. Employing equilibrium throughput would also greatly shorten the time required for benchmarking. Lastly, the paper presents a methodology and a hybrid step/binary search algorithm for obtaining the value of equilibrium throughput.
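The hybrid step/binary search the abstract describes can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's implementation: the probe `measure_loss(rate)` is a hypothetical stand-in for a trial that offers traffic at `rate` (Mbit/s) and returns the observed loss ratio.

```python
def equilibrium_throughput(measure_loss, start=100.0, step=100.0,
                           tol=1.0, max_rate=10_000.0):
    """Hybrid step/binary search for the highest offered load (Mbit/s)
    that the gateway forwards without packet loss.

    measure_loss(rate) -> observed loss ratio at that offered load.
    """
    # Phase 1: coarse linear stepping until loss is first observed.
    lo, hi = 0.0, None
    rate = start
    while rate <= max_rate:
        if measure_loss(rate) > 0.0:
            hi = rate          # first lossy rate brackets the search from above
            break
        lo = rate              # last loss-free rate brackets from below
        rate += step
    if hi is None:             # no loss observed up to max_rate
        return lo
    # Phase 2: binary search inside the bracketing interval [lo, hi].
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if measure_loss(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return lo
```

For example, against a synthetic device that starts dropping packets above 850 Mbit/s, the search converges to 850 within the chosen tolerance.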
Geothermal Heat Pump Benchmarking Report
Energy Technology Data Exchange (ETDEWEB)
None
1997-01-17
A benchmarking study was conducted on behalf of the Department of Energy to determine the critical factors in successful utility geothermal heat pump programs. A successful program is one that has achieved significant market penetration. Successfully marketing geothermal heat pumps has presented some major challenges to the utility industry. However, select utilities have developed programs that generate significant GHP sales. This benchmarking study concludes that there are three factors critical to the success of utility GHP marketing programs: (1) top management marketing commitment; (2) an understanding of the fundamentals of marketing and business development; and (3) an aggressive competitive posture. To generate significant GHP sales, competitive market forces must be used. However, because utilities have functioned only in a regulated arena, these companies and their leaders are unschooled in competitive business practices. Therefore, a lack of experience coupled with an intrinsically non-competitive culture yields an industry environment that impedes the generation of significant GHP sales in many, but not all, utilities.
Benchmarking Variable Selection in QSAR.
Eklund, Martin; Norinder, Ulf; Boyer, Scott; Carlsson, Lars
2012-02-01
Variable selection is important in QSAR modeling since it can improve model performance and transparency, as well as reduce the computational cost of model fitting and predictions. Which variable selection methods perform well in QSAR settings is largely unknown. To address this question we, in a total of 1728 benchmarking experiments, rigorously investigated how eight variable selection methods affect the predictive performance and transparency of random forest models fitted to seven QSAR datasets covering different endpoints, descriptor sets, types of response variables, and numbers of chemical compounds. The results show that univariate variable selection methods are suboptimal and that the number of variables in the benchmarked datasets can be reduced by about 60% without significant loss in model performance when using multivariate adaptive regression splines (MARS) and forward selection.
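Forward selection, one of the methods benchmarked above, can be illustrated with a small greedy sketch. The `score` callback and the toy example are our hypothetical stand-ins, not the paper's random-forest/QSAR pipeline:

```python
def forward_selection(candidates, score, max_vars=None):
    """Greedy forward selection: repeatedly add the candidate variable
    that most improves score(subset); stop when no addition helps.
    """
    selected, best = [], float("-inf")
    remaining = list(candidates)
    while remaining and (max_vars is None or len(selected) < max_vars):
        # Evaluate every one-variable extension of the current subset.
        gains = [(score(selected + [v]), v) for v in remaining]
        top_score, top_var = max(gains)
        if top_score <= best:      # no improvement: stop
            break
        selected.append(top_var)
        remaining.remove(top_var)
        best = top_score
    return selected, best
```

In a real setting `score` would be, e.g., cross-validated model performance; here a toy score that rewards the informative variables {'a', 'b'} and penalizes subset size recovers exactly that subset.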
Directory of Open Access Journals (Sweden)
Epelbaum E.
2010-04-01
Full Text Available We review recent progress on nuclear lattice simulations using chiral effective field theory. We discuss lattice results for dilute neutron matter at next-to-leading order, three-body forces at next-to-next-to-leading order, isospin-breaking and Coulomb effects, and the binding energy of light nuclei.
A Benchmark for Management Effectiveness
Zimmermann, Bill; Chanaron, Jean-Jacques; Klieb, Leslie
2007-01-01
This study presents a tool to gauge managerial effectiveness in the form of a questionnaire that is easy to administer and score. The instrument covers eight distinct areas of organisational climate and culture of management inside a company or department. Benchmark scores were determined by administering sample surveys to a wide cross-section of individuals from numerous firms in Southeast Louisiana, USA. Scores remained relatively constant over a seven-year timeframe...
Restaurant Energy Use Benchmarking Guideline
Energy Technology Data Exchange (ETDEWEB)
Hedrick, R.; Smith, V.; Field, K.
2011-07-01
A significant operational challenge for food service operators is defining energy use benchmark metrics to compare against the performance of individual stores. Without metrics, multiunit operators and managers have difficulty identifying which stores in their portfolios require extra attention to bring their energy performance in line with expectations. This report presents a method whereby multiunit operators may use their own utility data to create suitable metrics for evaluating their operations.
Thermal Performance Benchmarking: Annual Report
Energy Technology Data Exchange (ETDEWEB)
Moreno, Gilbert
2016-04-08
The goal for this project is to thoroughly characterize the performance of state-of-the-art (SOA) automotive power electronics and electric motor thermal management systems. Information obtained from these studies will be used to: Evaluate advantages and disadvantages of different thermal management strategies; establish baseline metrics for the thermal management systems; identify methods of improvement to advance the SOA; increase the publicly available information related to automotive traction-drive thermal management systems; help guide future electric drive technologies (EDT) research and development (R&D) efforts. The performance results combined with component efficiency and heat generation information obtained by Oak Ridge National Laboratory (ORNL) may then be used to determine the operating temperatures for the EDT components under drive-cycle conditions. In FY15, the 2012 Nissan LEAF power electronics and electric motor thermal management systems were benchmarked. Testing of the 2014 Honda Accord Hybrid power electronics thermal management system started in FY15; however, due to time constraints it was not possible to include results for this system in this report. The focus of this project is to benchmark the thermal aspects of the systems. ORNL's benchmarking of electric and hybrid electric vehicle technology reports provide detailed descriptions of the electrical and packaging aspects of these automotive systems.
RISKIND verification and benchmark comparisons
Energy Technology Data Exchange (ETDEWEB)
Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.
1997-08-01
This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were likewise compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.
HS06 Benchmark for an ARM Server
Kluth, Stefan
2014-06-01
We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
Argonne Code Center: Benchmark problem book.
Energy Technology Data Exchange (ETDEWEB)
None, None
1977-06-01
This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.
Classical Logic and Quantum Logic with Multiple and Common Lattice Models
Directory of Open Access Journals (Sweden)
Mladen Pavičić
2016-01-01
Full Text Available We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (the algebra underlying Hilbert (quantum) space). We give an equivalent proof for the classical logic, which turns out to have disjoint distributive and nondistributive ortholattices. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, that enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit) computer and a nondigital (say, six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.
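A six-element nonorthomodular ortholattice can be explored directly in code. The sketch below encodes the hexagon ortholattice O6, the standard textbook example in which the orthomodular law fails (whether it coincides with the paper's common lattice is our assumption, not a claim from the abstract), and checks the law x ≤ y ⇒ y = x ∨ (x′ ∧ y):

```python
# Hexagon ortholattice O6: 0 <= a <= b' <= 1 and 0 <= b <= a' <= 1.
ELEMS = ["0", "a", "b", "a'", "b'", "1"]
NEG = {"0": "1", "1": "0", "a": "a'", "a'": "a", "b": "b'", "b'": "b"}

# The order relation as a set of pairs (reflexive, with top and bottom).
LE = {(x, x) for x in ELEMS}
LE |= {("0", x) for x in ELEMS} | {(x, "1") for x in ELEMS}
LE |= {("a", "b'"), ("b", "a'")}

def _downset_size(z):
    return sum((w, z) in LE for w in ELEMS)

def meet(x, y):
    """Greatest lower bound: the lower bound with the largest down-set."""
    return max((z for z in ELEMS if (z, x) in LE and (z, y) in LE),
               key=_downset_size)

def join(x, y):
    """Least upper bound: the upper bound with the smallest down-set."""
    return min((z for z in ELEMS if (x, z) in LE and (y, z) in LE),
               key=_downset_size)

def orthomodular_holds(x, y):
    """Orthomodular law: x <= y implies y = x v (x' ^ y)."""
    return (x, y) not in LE or join(x, meet(NEG[x], y)) == y
```

Running the check on the pair a ≤ b′ shows the law failing, which is exactly what separates this lattice from the orthomodular lattices of Hilbert space.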
Earth Data Analysis Center, University of New Mexico — The National Flood Hazard Layer (NFHL) data incorporates all Digital Flood Insurance Rate Map(DFIRM) databases published by FEMA, and any Letters Of Map Revision...
Von Smekal, L; Sternbeck, A; Williams, A G
2007-01-01
We propose a modified lattice Landau gauge based on stereographically projecting the link variables on the circle S^1 -> R for compact U(1) or the 3-sphere S^3 -> R^3 for SU(2) before imposing the Landau gauge condition. This can reduce the number of Gribov copies exponentially and solves the Gribov problem in compact U(1) where it is a lattice artifact. Applied to the maximal Abelian subgroup this might be just enough to avoid the perfect cancellation amongst the Gribov copies in a lattice BRST formulation for SU(N), and thus to avoid the Neuberger 0/0 problem. The continuum limit of the Landau gauge remains unchanged.
Jammed lattice sphere packings.
Kallus, Yoav; Marcotte, Étienne; Torquato, Salvatore
2013-12-01
We generate and study an ensemble of isostatic jammed hard-sphere lattices. These lattices are obtained by compression of a periodic system with an adaptive unit cell containing a single sphere until the point of mechanical stability. We present detailed numerical data about the densities, pair correlations, force distributions, and structure factors of such lattices. We show that this model retains many of the crucial structural features of the classical hard-sphere model and propose it as a model for the jamming and glass transitions that enables exploration of much higher dimensions than are usually accessible.
PageRank Pipeline Benchmark: Proposal for a Holistic System Benchmark for Big-Data Platforms
Dreher, Patrick; Hill, Chris; Gadepally, Vijay; Kuszmaul, Bradley; Kepner, Jeremy
2016-01-01
The rise of big data systems has created a need for benchmarks to measure and compare the capabilities of these systems. Big data benchmarks present unique scalability challenges. The supercomputing community has wrestled with these challenges for decades and developed methodologies for creating rigorous scalable benchmarks (e.g., HPC Challenge). The proposed PageRank pipeline benchmark employs supercomputing benchmarking methodologies to create a scalable benchmark that is reflective of many real-world big data processing systems. The PageRank pipeline benchmark builds on existing prior scalable benchmarks (Graph500, Sort, and PageRank) to create a holistic benchmark with multiple integrated kernels that can be run together or independently. Each kernel is well defined mathematically and can be implemented in any programming environment. The linear algebraic nature of PageRank makes it well suited to being implemented using the GraphBLAS standard. The computations are simple enough that performance predictio...
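The PageRank kernel at the heart of the proposed pipeline is the classic power iteration. A minimal dictionary-based sketch (our illustration, not the benchmark's reference implementation) is:

```python
def pagerank(links, damping=0.85, tol=1e-10, max_iter=200):
    """Power-iteration PageRank on an adjacency dict {node: [out-neighbors]}."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(max_iter):
        # Teleportation term, then redistribute each node's rank along its links.
        new = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            out = links[v]
            if out:
                share = damping * rank[v] / len(out)
                for w in out:
                    new[w] += share
            else:  # dangling node: spread its rank uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        if sum(abs(new[v] - rank[v]) for v in nodes) < tol:
            return new
        rank = new
    return rank
```

Because each update is a sparse matrix-vector product, the same kernel maps naturally onto the linear-algebraic (GraphBLAS-style) formulation the abstract mentions.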
NASA Software Engineering Benchmarking Effort
Godfrey, Sally; Rarick, Heather
2012-01-01
Benchmarking was very interesting and provided a wealth of information: (1) we did see potential solutions to some of our "top 10" issues; (2) we have an assessment of where NASA stands in relation to other aerospace/defense groups. We formed new contacts and potential collaborations: (1) several organizations sent us examples of their templates and processes; (2) many of the organizations were interested in future collaboration, such as sharing of training, metrics, Capability Maturity Model Integration (CMMI) appraisers, instructors, etc. We received feedback from some of our contractors/partners: (1) desires to participate in our training and provide feedback on procedures; (2) welcomed opportunities to provide feedback on working with NASA.
Benchmarking of human resources management
David M. Akinnusi
2008-01-01
This paper reviews the role of human resource management (HRM) which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HR...
Lipstein, Arthur E
2014-01-01
We formulate the theory of a 2-form gauge field on a Euclidean spacetime lattice. In this approach, the fundamental degrees of freedom live on the faces of the lattice, and the action can be constructed from the sum over Wilson surfaces associated with each fundamental cube of the lattice. If we take the gauge group to be $U(1)$, the theory reduces to the well-known abelian gerbe theory in the continuum limit. We also propose a very simple and natural non-abelian generalization with gauge group $U(N) \\times U(N)$, which gives rise to $U(N)$ Yang-Mills theory upon dimensional reduction. Formulating the theory on a lattice has several other advantages. In particular, it is possible to compute many observables, such as the expectation value of Wilson surfaces, analytically at strong coupling and numerically for any value of the coupling.
Root lattices and quasicrystals
Baake, M.; Joseph, D.; Kramer, P.; Schlottmann, M.
1990-10-01
It is shown that root lattices and their reciprocals might serve as the right pool for the construction of quasicrystalline structure models. All noncrystallographic symmetries observed so far are covered in minimal embedding with maximal symmetry.
Energy Technology Data Exchange (ETDEWEB)
ORGINOS,K.
2003-01-07
I review the current status of hadronic structure computations on the lattice. I describe the basic lattice techniques and difficulties and present some of the latest lattice results; in particular, recent results of the RBC group using domain wall fermions are also discussed. In conclusion, lattice computations can play an important role in understanding the hadronic structure and the fundamental properties of Quantum Chromodynamics (QCD). Although some difficulties still exist, several significant steps have been made. Advances in computer technology are expected to play a significant role in pushing these computations closer to the chiral limit and in including dynamical fermions. RBC has already begun preliminary dynamical domain wall fermion computations [49], which we expect to be pushed forward with the arrival of QCDOC. In the near future, we also expect to complete the non-perturbative renormalization of the relevant derivative operators in quenched QCD.
Superalloy Lattice Block Structures
Nathal, M. V.; Whittenberger, J. D.; Hebsur, M. G.; Kantzos, P. T.; Krause, D. L.
2004-01-01
Initial investigations of investment cast superalloy lattice block suggest that this technology will yield a low cost approach to utilize the high temperature strength and environmental resistance of superalloys in lightweight, damage tolerant structural configurations. Work to date has demonstrated that relatively large superalloy lattice block panels can be successfully investment cast from both IN-718 and Mar-M247. These castings exhibited mechanical properties consistent with the strength of the same superalloys measured from more conventional castings. The lattice block structure also accommodates significant deformation without failure, and is defect tolerant in fatigue. The potential of lattice block structures opens new opportunities for the use of superalloys in future generations of aircraft applications that demand strength and environmental resistance at elevated temperatures along with low weight.
Meshless lattice Boltzmann method for the simulation of fluid flows.
Musavi, S Hossein; Ashrafizaadeh, Mahmud
2015-02-01
A meshless lattice Boltzmann numerical method is proposed. The collision and streaming operators of the lattice Boltzmann equation are separated, as in the usual lattice Boltzmann models. While the purely local collision equation remains the same, we rewrite the streaming equation as a pure advection equation and discretize the resulting partial differential equation using the Lax-Wendroff scheme in time and the meshless local Petrov-Galerkin scheme based on augmented radial basis functions in space. The meshless feature of the proposed method makes it a more powerful lattice Boltzmann solver, especially for cases in which using meshes introduces significant numerical errors into the solution, or when improving the mesh quality is a complex and time-consuming process. Three well-known benchmark fluid flow problems, namely the plane Couette flow, the circular Couette flow, and the impulsively started cylinder flow, are simulated for the validation of the proposed method. Excellent agreement with analytical solutions or with previous experimental and numerical results in the literature is observed in all the simulations. Although the computational resources required for the meshless method per node are higher compared to that of the standard lattice Boltzmann method, it is shown that for cases in which the total number of nodes is significantly reduced, the present method actually outperforms the standard lattice Boltzmann method.
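The Lax-Wendroff time discretization used above for the streaming (advection) step can be sketched in one dimension. This toy periodic finite-difference update is our illustration of the scheme itself, not the paper's meshless Petrov-Galerkin formulation:

```python
def lax_wendroff_step(f, c):
    """One Lax-Wendroff update of the periodic 1-D advection equation
    f_t + a f_x = 0, where c = a*dt/dx is the Courant number."""
    n = len(f)
    out = [0.0] * n
    for i in range(n):
        fm, fp = f[(i - 1) % n], f[(i + 1) % n]  # periodic neighbors
        out[i] = (f[i]
                  - 0.5 * c * (fp - fm)                    # first-order flux
                  + 0.5 * c * c * (fp - 2.0 * f[i] + fm))  # diffusive correction
    return out
```

A useful sanity check: at Courant number c = 1 the scheme reduces to an exact one-cell shift, which is precisely the lattice Boltzmann streaming operation on a regular grid.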
Vector Lattice Vortex Solitons
Institute of Scientific and Technical Information of China (English)
WANG Jian-Dong; YE Fang-Wei; DONG Liang-Wei; LI Yong-Ping
2005-01-01
Two-dimensional vector vortex solitons in harmonic optical lattices are investigated. The stability properties of such solitons are closely connected to the lattice depth V0. For small V0, vector vortex solitons with total zero angular momentum are more stable than those with total nonzero angular momentum, while for large V0 the situation is reversed. If V0 is large enough, both types of such solitons are stable.
Pica, C; Lucini, B; Patella, A; Rago, A
2009-01-01
Technicolor theories provide an elegant mechanism for dynamical electroweak symmetry breaking. We will discuss the use of lattice simulations to study the strongly-interacting dynamics of some of the candidate theories, with matter fields in representations other than the fundamental. To be viable candidates for phenomenology, such theories need to be different from a scaled-up version of QCD, which was ruled out by LEP precision measurements, and they represent a challenge for modern lattice computations.
Automated Lattice Perturbation Theory
Energy Technology Data Exchange (ETDEWEB)
Monahan, Christopher
2014-11-01
I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.
Kiefel, Martin; Jampani, Varun; Gehler, Peter V.
2014-01-01
This paper presents a convolutional layer that is able to process sparse input features. As an example, for image recognition problems this allows an efficient filtering of signals that do not lie on a dense grid (like pixel position), but of more general features (such as color values). The presented algorithm makes use of the permutohedral lattice data structure. The permutohedral lattice was introduced to efficiently implement a bilateral filter, a commonly used image processing operation....
[Benchmarking in health care: conclusions and recommendations].
Geraedts, Max; Selbmann, Hans-Konrad
2011-01-01
The German Health Ministry funded 10 demonstration projects and accompanying research of benchmarking in health care. The accompanying research work aimed to infer generalisable findings and recommendations. We performed a meta-evaluation of the demonstration projects and analysed national and international approaches to benchmarking in health care. It was found that the typical benchmarking sequence is hardly ever realised. Most projects lack a detailed analysis of structures and processes of the best performers as a starting point for the process of learning from and adopting best practice. To tap the full potential of benchmarking in health care, participation in voluntary benchmarking projects should be promoted that have been demonstrated to follow all the typical steps of a benchmarking process.
Solitons in spiraling Vogel lattices
Kartashov, Yaroslav V; Torner, Lluis
2012-01-01
We address light propagation in Vogel optical lattices and show that such lattices support a variety of stable soliton solutions in both self-focusing and self-defocusing media, whose propagation constants belong to domains resembling gaps in the spectrum of a truly periodic lattice. The azimuthally-rich structure of Vogel lattices allows generation of spiraling soliton motion.
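A Vogel lattice places its n-th site at radius proportional to √n and azimuth n times the golden angle, producing the spiraling, azimuthally rich structure the abstract refers to. A short sketch of the site geometry only (no optics; parameter names are ours):

```python
import math

def vogel_lattice(n_points, scale=1.0):
    """(x, y) site positions of a Vogel spiral lattice:
    r_n = scale * sqrt(n), theta_n = n * golden angle (~137.508 deg)."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # ~2.39996 rad
    pts = []
    for n in range(1, n_points + 1):
        r = scale * math.sqrt(n)
        t = n * golden_angle
        pts.append((r * math.cos(t), r * math.sin(t)))
    return pts
```

The sqrt-of-n radial law keeps the areal density of sites roughly uniform, which is why such lattices behave locally like a periodic lattice while remaining aperiodic globally.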
An Effective Approach for Benchmarking Implementation
B. M. Deros; Tan, J.; M.N.A. Rahman; N. A.Q.M. Daud
2011-01-01
Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assists companies achieve better performance in terms of quality, cost, delivery, supply chain and eventually increase their competitiveness in the market. The study begins with literature review on benchmarking definition, barriers and advantages from the implementation and the study of benchmarking framework. Approach: Thirty res...
Computational Chemistry Comparison and Benchmark Database
SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access) The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.
Benchmarking i eksternt regnskab og revision
DEFF Research Database (Denmark)
Thinggaard, Frank; Kiertzner, Lars
2001-01-01
...continuously in a benchmarking process. This chapter broadly examines to what extent the benchmarking concept can justifiably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual report, while Section 7.2 takes up the field of auditing. The final section of the chapter summarizes the considerations on benchmarking in connection with both areas.
Developing Benchmarks for Solar Radio Bursts
Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Domm, P.; Love, J. J.; Pierson, J.
2016-12-01
Solar radio bursts can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan has asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years and also a theoretical maximum intensity benchmark. The solar radio benchmark team was also asked to define the wavelength/frequency bands of interest. The benchmark team developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and Microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived based on previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima requires additional work, where it is even possible, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks and the basis used to derive them. We will also present the work that needs to be done in order to complete the final, or phase 2, benchmarks.
Benchmarking for controllers: methods, techniques and opportunities [Benchmarking for controllere: Metoder, teknikker og muligheder]
DEFF Research Database (Denmark)
Bukh, Per Nikolaj; Sandalgaard, Niels; Dietrichson, Lars
2008-01-01
The article sharpens the focus on the concept of benchmarking by presenting and discussing its various facets. Four different applications of benchmarking are described in order to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project before starting one. The difference between results benchmarking and process benchmarking is treated, after which the use of internal versus external benchmarking is discussed. Finally, the use of benchmarking in budgeting and budget follow-up is introduced...
Establishing benchmarks and metrics for utilization management.
Melanson, Stacy E F
2014-01-01
The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing.
De Rosis, Alessandro
2017-02-01
Within the framework of the central-moment-based lattice Boltzmann method, we propose a strategy to account for external forces in two and three dimensions. Its numerical properties are evaluated against consolidated benchmark problems, highlighting very high accuracy and optimal convergence. Moreover, our derivations are light and intelligible.
Hartel, P.H.; Feeley, M.; Alt, M.; Augustsson, L.
1996-01-01
Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important
The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.
2002
This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)
Benchmarking Implementations of Functional Languages with "Pseudoknot", a float-intensive benchmark
Hartel, Pieter H.; Feeley, M.; Alt, M.; Augustsson, L.
Over 25 implementations of different functional languages are benchmarked using the same program, a floating-point intensive application taken from molecular biology. The principal aspects studied are compile time and execution time for the various implementations that were benchmarked. An important
A Bijection between Lattice-Valued Filters and Lattice-Valued Congruences in Residuated Lattices
Directory of Open Access Journals (Sweden)
Wei Wei
2013-01-01
The aim of this paper is to study relations between lattice-valued filters and lattice-valued congruences in residuated lattices. We introduce a new definition of congruences which depends only on the meet ∧ and the residuum →. It is then shown that each of these congruences is automatically a universal-algebra congruence. Also, lattice-valued filters and lattice-valued congruences are studied, and it is shown that there is a one-to-one correspondence between the set of all (lattice-valued) filters and the set of all (lattice-valued) congruences.
Benchmarking: A tool to enhance performance
Energy Technology Data Exchange (ETDEWEB)
Munro, J.F. [Oak Ridge National Lab., TN (United States); Kristal, J. [USDOE Assistant Secretary for Environmental Management, Washington, DC (United States); Thompson, G.; Johnson, T. [Los Alamos National Lab., NM (United States)
1996-12-31
The Office of Environmental Management is bringing Headquarters and the Field together to implement process improvements throughout the Complex through a systematic process of organizational learning called benchmarking. Simply stated, benchmarking is a process of continuously comparing and measuring practices, processes, or methodologies with those of other private and public organizations. The EM benchmarking program, which began as the result of a recommendation from Xerox Corporation, is building trust and removing barriers to performance enhancement across the DOE organization. The EM benchmarking program is designed to be field-centered, with Headquarters providing facilitatory and integrative functions on an "as needed" basis. One of the main goals of the program is to assist Field Offices and their associated M&O/M&I contractors in developing the capabilities to do benchmarking for themselves. In this regard, a central precept is that in order to realize tangible performance benefits, program managers and staff, the ones closest to the work, must take ownership of the studies. This avoids the "check the box" mentality associated with some third-party studies. This workshop will provide participants with a basic understanding of why the EM benchmarking team was developed and of the nature and scope of its mission. Participants will also begin to understand the types of study levels and the particular methodology the EM benchmarking team is using to conduct studies. The EM benchmarking team will also encourage discussion on ways that DOE (both Headquarters and the Field) can team with its M&O/M&I contractors to conduct additional benchmarking studies. This "introduction to benchmarking" is intended to create a desire to know more and a greater appreciation of how benchmarking processes could be creatively employed to enhance performance.
Benchmarking ICRF simulations for ITER
Energy Technology Data Exchange (ETDEWEB)
R. V. Budny, L. Berry, R. Bilato, P. Bonoli, M. Brambilla, R.J. Dumont, A. Fukuyama, R. Harvey, E.F. Jaeger, E. Lerche, C.K. Phillips, V. Vdovin, J. Wright, and members of the ITPA-IOS
2010-09-28
Benchmarking of full-wave solvers for ICRF simulations is performed using plasma profiles and equilibria obtained from integrated self-consistent modeling predictions of four ITER plasmas. One is for a high performance baseline (5.3 T, 15 MA) DT H-mode plasma. The others are for half-field, half-current plasmas of interest for the pre-activation phase with bulk plasma ion species being either hydrogen or He4. The predicted profiles are used by seven groups to predict the ICRF electromagnetic fields and heating profiles. Approximate agreement is achieved for the predicted heating power partitions for the DT and He4 cases. Profiles of the heating powers and electromagnetic fields are compared.
Benchmarking Asteroid-Deflection Experiment
Remington, Tane; Bruck Syal, Megan; Owen, John Michael; Miller, Paul L.
2016-10-01
An asteroid impacting Earth could have devastating consequences. In preparation to deflect or disrupt one before it reaches Earth, it is imperative to have modeling capabilities that adequately simulate the deflection actions. Code validation is key to ensuring full confidence in simulation results used in an asteroid-mitigation plan. We are benchmarking well-known impact experiments using Spheral, an adaptive smoothed-particle hydrodynamics code, to validate our modeling of asteroid deflection. We describe our simulation results, compare them with experimental data, and discuss what we have learned from our work. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-695540
NASA Software Engineering Benchmarking Study
Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.
2013-01-01
To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R&D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths
COG validation: SINBAD Benchmark Problems
Energy Technology Data Exchange (ETDEWEB)
Lent, E M; Sale, K E; Buck, R M; Descalle, M
2004-02-23
We validated COG, a 3D Monte Carlo radiation transport code, against experimental data and MCNP4C simulations from the Shielding Integral Benchmark Archive Database (SINBAD) compiled by RSICC. We modeled three experiments: the Osaka Nickel and Aluminum sphere experiments conducted at the OKTAVIAN facility, and the liquid oxygen experiment conducted at the FNS facility. COG results are in good agreement with experimental data and generally within a few % of MCNP results. There are several possible sources of discrepancy between MCNP and COG results: (1) the cross-section database versions are different, MCNP uses ENDFB VI 1.1 while COG uses ENDFB VIR7, (2) the code implementations are different, and (3) the models may differ slightly. We also limited the use of variance reduction methods when running the COG version of the problems.
General benchmarks for quantum repeaters
Pirandola, Stefano
2015-01-01
Using a technique based on quantum teleportation, we simplify the most general adaptive protocols for key distribution, entanglement distillation and quantum communication over a wide class of quantum channels in arbitrary dimension. Thanks to this method, we bound the ultimate rates for secret key generation and quantum communication through single-mode Gaussian channels and several discrete-variable channels. In particular, we derive exact formulas for the two-way assisted capacities of the bosonic quantum-limited amplifier and the dephasing channel in arbitrary dimension, as well as the secret key capacity of the qubit erasure channel. Our results establish the limits of quantum communication with arbitrary systems and set the most general and precise benchmarks for testing quantum repeaters in both discrete- and continuous-variable settings.
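The closed-form capacities this abstract refers to are easy to evaluate numerically. The sketch below uses the commonly cited formulas (secret-key capacity 1 − p for the qubit erasure channel with erasure probability p, 1 − h₂(p) for qubit dephasing with flip probability p, and −log₂(1 − η) for the pure-loss bosonic channel with transmissivity η); these are my recollection of the published results and should be checked against the paper before use.

```python
import math

# Hedged sketch: commonly cited two-way assisted secret-key capacities (bits per use).

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def erasure_capacity(p):
    """Qubit erasure channel, erasure probability p: K = 1 - p."""
    return 1.0 - p

def dephasing_capacity(p):
    """Qubit dephasing channel, phase-flip probability p: K = 1 - h2(p)."""
    return 1.0 - h2(p)

def pure_loss_capacity(eta):
    """Pure-loss bosonic channel, transmissivity eta: K = -log2(1 - eta)."""
    return -math.log2(1.0 - eta)

print(erasure_capacity(0.25))    # 0.75
print(dephasing_capacity(0.5))   # 0.0  (maximal dephasing destroys the key rate)
print(pure_loss_capacity(0.5))   # 1.0
```

A repeater is only worthwhile when its end-to-end rate beats the corresponding point-to-point benchmark above.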
Knuth, Kevin H.
2009-12-01
Previous derivations of the sum and product rules of probability theory relied on the algebraic properties of Boolean logic. Here they are derived within a more general framework based on lattice theory. The result is a new foundation of probability theory that encompasses and generalizes both the Cox and Kolmogorov formulations. In this picture probability is a bi-valuation defined on a lattice of statements that quantifies the degree to which one statement implies another. The sum rule is a constraint equation that ensures that valuations are assigned so as to not violate associativity of the lattice join and meet. The product rule is much more interesting in that there are actually two product rules: one is a constraint equation that arises from associativity of the direct products of lattices, and the other is a constraint equation derived from associativity of changes of context. The generality of this formalism enables one to derive the traditionally assumed condition of additivity in measure theory, as well as to introduce a general notion of product. To illustrate the generic utility of this novel lattice-theoretic foundation of measure, the sum and product rules are applied to number theory. Further application of these concepts to understand the foundation of quantum mechanics is described in a joint paper in these proceedings.
HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks
Energy Technology Data Exchange (ETDEWEB)
Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-05-01
This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.
42 CFR 440.330 - Benchmark health benefits coverage.
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Benchmark health benefits coverage. 440.330 Section... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS SERVICES: GENERAL PROVISIONS Benchmark Benefit and Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is...
Chen, Yuntian
2015-01-01
We study semi-analytically the light emission and absorption properties of arbitrary stratified photonic structures with embedded two-dimensional magnetoelectric point scattering lattices, as used in recent plasmon-enhanced LEDs and solar cells. By employing the dyadic Green's function for the layered structure in combination with Ewald lattice summation to deal with the particle lattice, we develop an efficient method to study the coupling between planar 2D scattering lattices of plasmonic or metamaterial point particles coupled to layered structures. Using the `array scanning method' we deal with localized sources. Firstly, we apply our method to light emission enhancement of dipole emitters in slab waveguides, mediated by plasmonic lattices. We benchmark the array scanning method against a reciprocity-based approach to find that the calculated radiative rate enhancement in k-space below the light cone shows excellent agreement. Secondly, we apply our method to study absorption-enhancement in thin-film solar ...
Extended particle swarm optimisation method for folding protein on triangular lattice.
Guo, Yuzhen; Wu, Zikai; Wang, Ying; Wang, Yong
2016-02-01
In this study, the authors studied the protein structure prediction problem using the two-dimensional hydrophobic-polar model on a triangular lattice. In particular, the non-compact conformation was modelled by folding the amino acid sequence into a relatively larger triangular lattice, which is more biologically realistic and significant than the compact conformation. The protein structure prediction problem was then abstracted to matching amino acids to lattice points. Mathematically, the problem was formulated as an integer program, transforming the biological problem into an optimisation problem. To solve this problem, the classical particle swarm optimisation algorithm was extended by a single-point adjustment strategy. Compared with the square lattice, conformations on the triangular lattice are more flexible in several benchmark examples. The authors further compared their algorithm with a hybrid of hill climbing and a genetic algorithm. The results showed that their method was more effective in finding solutions with lower energy and less running time.
An Effective Approach for Benchmarking Implementation
Directory of Open Access Journals (Sweden)
B. M. Deros
2011-01-01
Problem statement: The purpose of this study is to present a benchmarking guideline, conceptual framework and computerized mini program to assist companies in achieving better performance in terms of quality, cost, delivery and supply chain, and eventually in increasing their competitiveness in the market. The study begins with a literature review on benchmarking definitions, barriers to and advantages of implementation, and benchmarking frameworks. Approach: Thirty respondents were involved in the case study. They comprise industrial practitioners who assessed the usability and practicability of the guideline, conceptual framework and computerized mini program. Results: A guideline and template were proposed to simplify the adoption of benchmarking techniques. A conceptual framework was proposed by integrating Deming's PDCA and the Six Sigma DMAIC theory. It provided a step-by-step method to simplify implementation and to optimize benchmarking results. A computerized mini program was suggested to assist users in adopting the technique as part of an improvement project. As a result of the assessment test, the respondents found that the implementation method gave companies an idea of how to initiate a benchmarking implementation and guided them toward the desired goal set in a benchmarking project. Conclusion: The results obtained and discussed in this study can be applied to implement benchmarking in a more systematic way and to ensure its success.
Synergetic effect of benchmarking competitive advantages
Directory of Open Access Journals (Sweden)
N.P. Tkachova
2011-12-01
The essence of synergetic competitive benchmarking is analyzed. A classification of types of synergies is developed. The sources of synergies in benchmarking of competitive advantages are identified. A methodological framework for defining synergy in the formation of competitive advantage is proposed.
Synergetic effect of benchmarking competitive advantages
N.P. Tkachova; P.G. Pererva
2011-01-01
The essence of synergetic competitive benchmarking is analyzed. A classification of types of synergies is developed. The sources of synergies in benchmarking of competitive advantages are identified. A methodological framework for defining synergy in the formation of competitive advantage is proposed.
Benchmarking set for domestic smart grid management
Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria
2010-01-01
In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used
Machines are benchmarked by code, not algorithms
Poss, R.
2013-01-01
This article highlights how small modifications to either the source code of a benchmark program or the compilation options may impact its behavior on a specific machine. It argues that for evaluating machines, benchmark providers and users be careful to ensure reproducibility of results based on th
Benchmark analysis of railway networks and undertakings
Hansen, I.A.; Wiggenraad, P.B.L.; Wolff, J.W.
2013-01-01
Benchmark analysis of railway networks and companies has been stimulated by the European policy of deregulation of transport markets, the opening of national railway networks and markets to new entrants and separation of infrastructure and train operation. Recent international railway benchmarking s
Benchmark Assessment for Improved Learning. AACC Report
Herman, Joan L.; Osmundson, Ellen; Dietel, Ronald
2010-01-01
This report describes the purposes of benchmark assessments and provides recommendations for selecting and using benchmark assessments--addressing validity, alignment, reliability, fairness and bias and accessibility, instructional sensitivity, utility, and reporting issues. We also present recommendations on building capacity to support schools'…
Benchmark Two-Good Utility Functions
de Jaegher, K.
2007-01-01
Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price elasticit
Benchmarking Learning and Teaching: Developing a Method
Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah
2006-01-01
Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…
Lattice Boltzmann Stokesian dynamics.
Ding, E J
2015-11-01
Lattice Boltzmann Stokesian dynamics (LBSD) is presented for simulation of particle suspension in Stokes flows. This method is developed from Stokesian dynamics (SD) with resistance and mobility matrices calculated using the time-independent lattice Boltzmann algorithm (TILBA). TILBA is distinguished from the traditional lattice Boltzmann method (LBM) in that a background matrix is generated prior to the calculation. The background matrix, once generated, can be reused for calculations for different scenarios, thus the computational cost for each such subsequent calculation is significantly reduced. The LBSD inherits the merits of the SD where both near- and far-field interactions are considered. It also inherits the merits of the LBM that the computational cost is almost independent of the particle shape.
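The resistance/mobility duality at the heart of Stokesian dynamics is simple to illustrate: drag forces relate linearly to particle velocities through the resistance matrix, and the mobility matrix is its inverse. The 2×2 matrix below is a made-up stand-in; in LBSD the entries would come from the lattice Boltzmann background-matrix computation described in the abstract.

```python
# Toy illustration of the resistance/mobility duality in Stokesian dynamics.
# Forces and velocities are related by F = R U, so U = M F with M = R^{-1}.

def invert_2x2(m):
    """Inverse of a 2x2 matrix given as nested lists."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

R = [[2.0, 0.5],
     [0.5, 1.0]]          # symmetric positive-definite resistance matrix (hypothetical)
M = invert_2x2(R)         # mobility matrix

# Recover velocities from a given force vector: U = M F.
F = [1.0, 0.0]
U = [M[0][0] * F[0] + M[0][1] * F[1],
     M[1][0] * F[0] + M[1][1] * F[1]]
print(U)
```

In a real suspension R couples all particle degrees of freedom, which is why reusing a precomputed background matrix, as LBSD does, pays off.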
Weisz, Peter; Majumdar, Pushan
2012-03-01
Lattice gauge theory is a formulation of quantum field theory with gauge symmetries on a space-time lattice. This formulation is particularly suitable for describing hadronic phenomena. In this article we review the present status of lattice QCD. We outline some of the computational methods, discuss some phenomenological applications and a variety of non-perturbative topics. The list of references is severely incomplete; the ones we have included are text books or reviews and a few subjectively selected papers. Kronfeld and Quigg (2010) supply a reasonably comprehensive set of QCD references. We apologize for the fact that we have not covered many important topics such as QCD at finite density and heavy quark effective theory adequately, and mention some of them only in the last section "In Brief". These topics should be considered in further Scholarpedia articles.
Improved Lattice Radial Quantization
Brower, Richard C; Fleming, George T
2014-01-01
Lattice radial quantization was proposed in a recent paper by Brower, Fleming and Neuberger [1] as a nonperturbative method especially suited to numerically solve Euclidean conformal field theories. The lessons learned from the lattice radial quantization of the 3D Ising model on a longitudinal cylinder with 2D icosahedral cross-section suggested the need for an improved discretization. We consider here the use of the Finite Element Method (FEM) to discretize the universally-equivalent $\phi^4$ Lagrangian on $\mathbb{R} \times \mathbb{S}^2$. It is argued that this lattice regularization will approach the exact conformal theory at the Wilson-Fisher fixed point in the continuum. Numerical tests are underway to support this conjecture.
Graphene antidot lattice waveguides
DEFF Research Database (Denmark)
Pedersen, Jesper Goor; Gunst, Tue; Markussen, Troels
2012-01-01
We introduce graphene antidot lattice waveguides: nanostructured graphene where a region of pristine graphene is sandwiched between regions of graphene antidot lattices. The band gaps in the surrounding antidot lattices enable localized states to emerge in the central waveguide region. We model the waveguides via a position-dependent mass term in the Dirac approximation of graphene and arrive at analytical results for the dispersion relation and spinor eigenstates of the localized waveguide modes. To include atomistic details we also use a tight-binding model, which is in excellent agreement with the analytical results. The waveguides resemble graphene nanoribbons, but without the particular properties of ribbons that emerge due to the details of the edge. We show that electrons can be guided through kinks without additional resistance and that transport through the waveguides is robust against...
Digital lattice gauge theories
Zohar, Erez; Reznik, Benni; Cirac, J Ignacio
2016-01-01
We propose a general scheme for a digital construction of lattice gauge theories with dynamical fermions. In this method, the four-body interactions arising in models with $2+1$ dimensions and higher are obtained stroboscopically, through a sequence of two-body interactions with ancillary degrees of freedom. This yields stronger interactions than the ones obtained through perturbative methods, as typically done in previous proposals, and removes an important bottleneck in the road towards experimental realizations. The scheme applies to generic gauge theories with Lie or finite symmetry groups, both Abelian and non-Abelian. As a concrete example, we present the construction of a digital quantum simulator for a $\mathbb{Z}_{3}$ lattice gauge theory with dynamical fermionic matter in $2+1$ dimensions, using ultracold atoms in optical lattices, involving three atomic species, representing the matter, gauge and auxiliary degrees of freedom, that are separated in three different layers. By moving the ancilla atoms...
Oates, Chris
2012-06-01
Since they were first proposed in 2003 [1], optical lattice clocks have become one of the leading technologies for the next generation of atomic clocks, which will be used for advanced timing applications and in tests of fundamental physics [2]. These clocks are based on stabilized lasers whose frequency is ultimately referenced to an ultra-narrow neutral atom transition (natural linewidths at the millihertz scale), with the atoms confined in an optical lattice whose wavelength is tuned to a "magic" value so as to yield a vanishing net AC Stark shift for the clock transition. As a result lattice clocks have demonstrated the capability of generating high stability clock signals with small absolute uncertainties (~1 part in 10^16). In this presentation I will first give an overview of the field, which now includes three different atomic species. I will then use experiments with Yb performed in our laboratory to illustrate the key features of a lattice clock. Our research has included the development of state-of-the-art optical cavities enabling ultra-high-resolution optical spectroscopy (1 Hz linewidth). Together with the large atom number in the optical lattice, we are able to achieve very low clock instability (< 0.3 Hz in 1 s) [3]. Furthermore, I will show results from some of our recent investigations of key shifts for the Yb lattice clock, including high precision measurements of ultracold atom-atom interactions in the lattice and the dc Stark effect for the Yb clock transition (necessary for the evaluation of blackbody radiation shifts). [1] H. Katori, M. Takamoto, V. G. Pal'chikov, and V. D. Ovsiannikov, Phys. Rev. Lett. 91, 173005 (2003). [2] Andrei Derevianko and Hidetoshi Katori, Rev. Mod. Phys. 83, 331 (2011). [3] Y. Y. Jiang, A. D. Ludlow, N. D. Lemke, R. W. Fox, J. A. Sherman, L.-S. Ma, and C. W. Oates, Nature Photonics 5, 158 (2011).
A Seafloor Benchmark for 3-dimensional Geodesy
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone
Entropic Lattice Boltzmann Method for Moving and Deforming Geometries in Three Dimensions
Dorschner, B; Karlin, I V
2016-01-01
Entropic lattice Boltzmann methods have been developed to alleviate intrinsic stability issues of lattice Boltzmann models for under-resolved simulations. Their reliability in combination with moving objects was established for various laminar benchmark flows in two dimensions in our previous work, Dorschner et al. [11], as well as for three-dimensional one-way coupled simulations of engine-type geometries in Dorschner et al. [12] for flat moving walls. The present contribution aims to fully exploit the advantages of entropic lattice Boltzmann models in terms of stability and accuracy and extends the methodology to three-dimensional cases including two-way coupling between fluid and structure, turbulence and deformable meshes. To cover this wide range of applications, the classical benchmark of a sedimenting sphere is chosen first to validate the general two-way coupling algorithm. Increasing the complexity, we subsequently consider the simulation of a plunging SD7003 airfoil at a Reynolds number of Re = 40000 an...
Energy Technology Data Exchange (ETDEWEB)
Catterall, Simon; Kaplan, David B.; Unsal, Mithat
2009-03-31
We provide an introduction to recent lattice formulations of supersymmetric theories which are invariant under one or more real supersymmetries at nonzero lattice spacing. These include the especially interesting case of N = 4 SYM in four dimensions. We discuss approaches based both on twisted supersymmetry and orbifold-deconstruction techniques and show their equivalence in the case of gauge theories. The presence of an exact supersymmetry reduces and in some cases eliminates the need for fine tuning to achieve a continuum limit invariant under the full supersymmetry of the target theory. We discuss open problems.
Grabisch, Michel
2008-01-01
We extend the notion of belief function to the case where the underlying structure is no longer the Boolean lattice of subsets of some universal set, but an arbitrary lattice, which we endow with a minimal set of properties according to our needs. We show that all classical constructions and definitions (e.g., mass allocation, commonality function, plausibility functions, necessity measures with nested focal elements, possibility distributions, Dempster's rule of combination, decomposition w.r.t. simple support functions, etc.) remain valid in this general setting. Moreover, our proof of the decomposition of belief functions into simple support functions is much simpler and more general than the original one by Shafer.
Directory of Open Access Journals (Sweden)
Futa Yuichi
2016-03-01
In this article, we formalize the definition of a lattice of ℤ-module and its properties in the Mizar system [5]. We formally prove that scalar products in lattices are bilinear forms over the field of real numbers ℝ. We also formalize the definitions of positive definite and integral lattices and their properties. Lattices of ℤ-modules are necessary for lattice problems, the LLL (Lenstra, Lenstra and Lovász) basis reduction algorithm [14], cryptographic systems with lattices [15], and coding theory [9].
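The LLL basis reduction cited in the abstract above can be sketched compactly. The following is a minimal, unoptimized Python sketch of textbook LLL with exact rational arithmetic (it is not the Mizar formalization, and the recomputation of Gram-Schmidt at each step is deliberately naive for clarity):

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(B):
    """Exact Gram-Schmidt orthogonalization; returns B* and the mu coefficients."""
    Bs, mu = [], []
    for i, b in enumerate(B):
        mu.append([dot(b, Bs[j]) / dot(Bs[j], Bs[j]) for j in range(i)])
        v = [Fraction(x) for x in b]
        for j in range(i):
            v = [vi - mu[i][j] * wi for vi, wi in zip(v, Bs[j])]
        Bs.append(v)
    return Bs, mu

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction of an integer basis (exact, unoptimized)."""
    B = [[Fraction(x) for x in row] for row in basis]
    n, k = len(B), 1
    while k < n:
        for j in range(k - 1, -1, -1):          # size reduction of b_k
            _, mu = gram_schmidt(B)
            q = round(mu[k][j])
            if q:
                B[k] = [bk - q * bj for bk, bj in zip(B[k], B[j])]
        Bs, mu = gram_schmidt(B)
        # Lovasz condition: advance if satisfied, otherwise swap and step back
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1
        else:
            B[k], B[k - 1] = B[k - 1], B[k]
            k = max(k - 1, 1)
    return [[int(x) for x in row] for row in B]
```

Reduction preserves the lattice (the determinant is unchanged up to sign) while the basis vectors become provably short.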
An Algorithm on Generating Lattice Based on Layered Concept Lattice
Directory of Open Access Journals (Sweden)
Zhang Chang-sheng
2013-08-01
Concept lattices are an effective tool for data analysis and rule extraction; a bottleneck impacting their applications is how to generate the lattice efficiently. In this paper, an algorithm, LCLG, for generating a lattice in batch processing based on a layered concept lattice is developed. The lattice is generated downward layer by layer through concept nodes and provisional nodes in the current layer; parent-child relationships between concept nodes are then found upward layer by layer, and the Hasse diagram of inter-layer connections is generated. While generating the lattice nodes in each layer, pruning operations are performed dynamically according to relevant properties, deleting unnecessary nodes so that generation speed is greatly improved. Experimental results demonstrate that the proposed algorithm has good performance.
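For orientation, the objects the abstract above manipulates can be enumerated naively: every formal concept is a closed (extent, intent) pair of a binary context. The sketch below is the brute-force exponential enumeration, useful only for tiny contexts, precisely the cost that layered algorithms such as LCLG are designed to avoid:

```python
from itertools import combinations

def concepts(context):
    """All formal concepts (extent, intent) of a binary context.

    context: dict mapping each object to its set of attributes.
    Brute force over attribute subsets -- exponential, for illustration only.
    """
    objects = set(context)
    attributes = set().union(*context.values())

    def extent(attrs):   # objects possessing every attribute in attrs
        return {g for g in objects if attrs <= context[g]}

    def intent(objs):    # attributes shared by every object in objs
        return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

    found = set()
    for r in range(len(attributes) + 1):
        for attrs in combinations(sorted(attributes), r):
            e = extent(set(attrs))
            found.add((frozenset(e), frozenset(intent(e))))
    return found
```

The concepts, ordered by extent inclusion, form the concept lattice whose Hasse diagram the paper builds layer by layer.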
Shigaki, Kenta; Noda, Fumiaki; Yamamoto, Kazami; Machida, Shinji; Molodojentsev, Alexander; Ishi, Yoshihiro
2002-12-01
The JKJ high-intensity proton accelerator facility consists of a 400-MeV linac, a 3-GeV 1-MW rapid-cycling synchrotron, and a 50-GeV 0.75-MW synchrotron. The lattice and beam dynamics designs of the two synchrotrons are reported.
de Raedt, Hans; von der Linden, W.; Binder, K
1995-01-01
In this chapter we review methods currently used to perform Monte Carlo calculations for quantum lattice models. A detailed exposition is given of the formalism underlying the construction of the simulation algorithms. We discuss the fundamental and technical difficulties that are encountered and gi
Knuth, Kevin H
2009-01-01
Previous derivations of the sum and product rules of probability theory relied on the algebraic properties of Boolean logic. Here they are derived within a more general framework based on lattice theory. The result is a new foundation of probability theory that encompasses and generalizes both the Cox and Kolmogorov formulations. In this picture, probability is a bi-valuation defined on a lattice of statements that quantifies the degree to which one statement implies another. The sum rule is a constraint equation that ensures valuations are assigned so as not to violate associativity of the lattice join and meet. The product rule is more interesting in that there are actually two product rules: one is a constraint equation that arises from associativity of the direct products of lattices, and the other is a constraint equation derived from associativity of changes of context. The generality of this formalism enables one to derive the traditionally assumed condition of additivity in measure theory, as well in...
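The sum-rule constraint described in the abstract above can be stated compactly. As a sketch (writing v for the bi-valuation, my own shorthand rather than necessarily the paper's notation), associativity of the lattice join and meet forces the familiar inclusion-exclusion form:

```latex
% Sum rule as a constraint on the bi-valuation v:
v(x \vee y) = v(x) + v(y) - v(x \wedge y)
% For mutually exclusive statements, x \wedge y = \bot with v(\bot) = 0,
% this reduces to ordinary additivity:
v(x \vee y) = v(x) + v(y)
```

This is how the traditionally assumed additivity of measure theory emerges as a derived constraint rather than an axiom.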
Williamson, S. Gill
2010-01-01
Will the cosmological multiverse, when described mathematically, have easily stated properties that are impossible to prove or disprove using mathematical physics? We explore this question by constructing lattice multiverses which exhibit such behavior even though they are much simpler mathematically than any likely cosmological multiverse.
Phenomenology from lattice QCD
Lellouch, L P
2003-01-01
After a short presentation of lattice QCD and some of its current practical limitations, I review recent progress in applications to phenomenology. Emphasis is placed on heavy-quark masses and on hadronic weak matrix elements relevant for constraining the CKM unitarity triangle. The main numerical results are highlighted in boxes.
Noetherian and Artinian Lattices
Directory of Open Access Journals (Sweden)
Derya Keskin Tütüncü
2012-01-01
It is proved that if L is a complete modular lattice which is compactly generated, then Rad(L)/0 is Artinian if and only if, for every small element a of L, the sublattice a/0 is Artinian, if and only if L satisfies DCC on small elements.
ICSBEP Benchmarks For Nuclear Data Applications
Briggs, J. Blair
2005-05-01
The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was initiated in 1992 by the United States Department of Energy. The ICSBEP became an official activity of the Organization for Economic Cooperation and Development (OECD) — Nuclear Energy Agency (NEA) in 1995. Representatives from the United States, United Kingdom, France, Japan, the Russian Federation, Hungary, Republic of Korea, Slovenia, Serbia and Montenegro (formerly Yugoslavia), Kazakhstan, Spain, Israel, Brazil, Poland, and the Czech Republic are now participating. South Africa, India, China, and Germany are considering participation. The purpose of the ICSBEP is to identify, evaluate, verify, and formally document a comprehensive and internationally peer-reviewed set of criticality safety benchmark data. The work of the ICSBEP is published as an OECD handbook entitled "International Handbook of Evaluated Criticality Safety Benchmark Experiments." The 2004 Edition of the Handbook contains benchmark specifications for 3331 critical or subcritical configurations that are intended for use in validation efforts and for testing basic nuclear data. New to the 2004 Edition of the Handbook is a draft criticality alarm / shielding type benchmark that should be finalized in 2005 along with two other similar benchmarks. The Handbook is being used extensively for nuclear data testing and is expected to be a valuable resource for code and data validation and improvement efforts for decades to come. Specific benchmarks that are useful for testing structural materials such as iron, chromium, nickel, and manganese; beryllium; lead; thorium; and 238U are highlighted.
The Isprs Benchmark on Indoor Modelling
Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.
2017-09-01
Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in the literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
Plans to update benchmarking tool.
Stokoe, Mark
2013-02-01
The use of the current AssetMark system by hospital health facilities managers and engineers (in Australia) has decreased to a point of no activity occurring. A number of reasons have been cited, including cost, the time required, the slowness of the process, and the level of information required. Based on current levels of activity, it would not be of any value to IHEA, or to its members, to continue with this form of AssetMark. For AssetMark to remain viable, it needs to be developed as a tool seen to be of value to healthcare facilities managers, and not just healthcare facility engineers. Benchmarking is still a very important requirement in the industry, and AssetMark can fulfil this need provided that it remains abreast of customer needs. The proposed future direction is to develop an online version of AssetMark with its current capabilities regarding capturing of data (12 Key Performance Indicators), reporting, and user interaction. The system would also provide end-users with access to live reporting features via a user-friendly web interface linked through the IHEA web page.
Academic Benchmarks for Otolaryngology Leaders.
Eloy, Jean Anderson; Blake, Danielle M; D'Aguillo, Christine; Svider, Peter F; Folbe, Adam J; Baredes, Soly
2015-08-01
This study aimed to characterize current benchmarks for academic otolaryngologists serving in positions of leadership and identify factors potentially associated with promotion to these positions. Information regarding chairs (or division chiefs), vice chairs, and residency program directors was obtained from faculty listings and organized by degree(s) obtained, academic rank, fellowship training status, sex, and experience. Research productivity was characterized by (a) successful procurement of active grants from the National Institutes of Health and prior grants from the American Academy of Otolaryngology-Head and Neck Surgery Foundation Centralized Otolaryngology Research Efforts program and (b) scholarly impact, as measured by the h-index. Chairs had the greatest amount of experience (32.4 years) and were the least likely to have multiple degrees, with 75.8% having an MD degree only. Program directors were the most likely to be fellowship trained (84.8%). Women represented 16% of program directors, 3% of chairs, and no vice chairs. Chairs had the highest scholarly impact (as measured by the h-index) and the greatest external grant funding. This analysis characterizes the current picture of leadership in academic otolaryngology. Chairs, when compared to their vice chair and program director counterparts, had more experience and greater research impact. Women were poorly represented among all academic leadership positions. © The Author(s) 2015.
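The h-index used above as the measure of scholarly impact is straightforward to compute from a list of per-paper citation counts; a minimal sketch:

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:       # the rank-th most-cited paper still has >= rank citations
            h = rank
        else:
            break
    return h
```

For example, citation counts [6, 5, 3, 1, 0] give h = 3: three papers have at least three citations each.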
Benchmarking Measures of Network Influence
Bramson, Aaron; Vandermarliere, Benjamin
2016-01-01
Identifying key agents for the transmission of diseases (ideas, technology, etc.) across social networks has predominantly relied on measures of centrality on a static base network or a temporally flattened graph of agent interactions. Various measures have been proposed as the best trackers of influence, such as degree centrality, betweenness, and k-shell, depending on the structure of the connectivity. We consider SIR and SIS propagation dynamics on a temporally-extruded network of observed interactions and measure the conditional marginal spread as the change in the magnitude of the infection given the removal of each agent at each time: its temporal knockout (TKO) score. We argue that this TKO score is an effective benchmark measure for evaluating the accuracy of other, often more practical, measures of influence. We find that none of the network measures applied to the induced flat graphs are accurate predictors of network propagation influence on the systems studied; however, temporal networks and the TKO measure provide the requisite targets for the search for effective predictive measures. PMID:27670635
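The temporal knockout (TKO) idea above can be illustrated with a heavily simplified sketch: a static directed graph instead of a temporally-extruded network, and deterministic reachability (transmission probability 1) instead of stochastic SIR/SIS dynamics. The function names and the toy setup are my own, not the paper's:

```python
from collections import deque

def spread(adj, seed, removed=frozenset()):
    """Deterministic proxy for epidemic size: nodes reachable from `seed`
    after the nodes in `removed` are knocked out of the graph."""
    if seed in removed:
        return 0
    seen, queue = {seed}, deque([seed])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen and v not in removed:
                seen.add(v)
                queue.append(v)
    return len(seen)

def tko_scores(adj, seed):
    """Knockout score of each node: the drop in spread when that node is removed."""
    base = spread(adj, seed)
    return {n: base - spread(adj, seed, removed={n}) for n in adj}
```

In the paper's setting the same marginal-spread logic is evaluated per time step on the temporal network and averaged over stochastic propagation runs.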
Developing integrated benchmarks for DOE performance measurement
Energy Technology Data Exchange (ETDEWEB)
Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.
1992-09-30
The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome measures in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.
INL Results for Phases I and III of the OECD/NEA MHTGR-350 Benchmark
Energy Technology Data Exchange (ETDEWEB)
Gerhard Strydom; Javier Ortensi; Sonat Sen; Hans Hammer
2013-09-01
The Idaho National Laboratory (INL) Very High Temperature Reactor (VHTR) Technology Development Office (TDO) Methods Core Simulation group led the construction of the Organization for Economic Cooperation and Development (OECD) Modular High Temperature Reactor (MHTGR) 350 MW benchmark for comparing and evaluating prismatic VHTR analysis codes. The benchmark is sponsored by the OECD's Nuclear Energy Agency (NEA), and the project will yield a set of reference steady-state, transient, and lattice depletion problems that can be used by the Department of Energy (DOE), the Nuclear Regulatory Commission (NRC), and vendors to assess their code suites. The Methods group is responsible for defining the benchmark specifications, leading the data collection and comparison activities, and chairing the annual technical workshops. This report summarizes the latest INL results for Phase I (steady state) and Phase III (lattice depletion) of the benchmark. The INSTANT, Pronghorn and RattleSnake codes were used for the standalone core neutronics modeling of Exercise 1, and the results obtained from these codes are compared in Section 4. Exercise 2 of Phase I requires the standalone steady-state thermal fluids modeling of the MHTGR-350 design, and the results for the systems code RELAP5-3D are discussed in Section 5. The coupled neutronics and thermal fluids steady-state solutions for Exercise 3 are reported in Section 6, utilizing the newly developed Parallel and Highly Innovative Simulation for INL Code System (PHISICS)/RELAP5-3D code suite. Finally, the lattice depletion models and results obtained for Phase III are compared in Section 7. The MHTGR-350 benchmark proved to be a challenging set of simulation problems to model accurately, and even with the simplifications introduced in the benchmark specification this activity is an important step in the code-to-code verification of modern prismatic VHTR codes. A final OECD/NEA comparison report will compare the Phase I and III results.
Basis reduction for layered lattices
Torreão Dassen, Erwin
2011-01-01
We develop the theory of layered Euclidean spaces and layered lattices. We present algorithms to compute both Gram-Schmidt and reduced bases in this generalized setting. A layered lattice can be seen as a lattice in which certain directions have infinite weight. It can also be interpre
Spin qubits in antidot lattices
DEFF Research Database (Denmark)
Pedersen, Jesper Goor; Flindt, Christian; Mortensen, Niels Asger;
2008-01-01
and density of states for a periodic potential modulation, referred to as an antidot lattice, and find that localized states appear, when designed defects are introduced in the lattice. Such defect states may form the building blocks for quantum computing in a large antidot lattice, allowing for coherent...
Benchmarking – A tool for judgment or improvement?
DEFF Research Database (Denmark)
Rasmussen, Grane Mikael Gregaard
2010-01-01
these issues, and describes how effects are closely connected to the perception of benchmarking, the intended users of the system and the application of the benchmarking results. The fundamental basis of this paper is taken from the development of benchmarking in the Danish construction sector. Two distinct perceptions of benchmarking will be presented: public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to examine which effects, possibilities and challenges follow in the wake of using this kind of benchmarking. In conclusion it is argued that clients and the Danish government are the intended users of the benchmarking system. The benchmarking results are primarily used by the government for monitoring and regulation of the construction sector and by clients for contractor selection. The dominating use...
Benchmarks for dynamic multi-objective optimisation
CSIR Research Space (South Africa)
Helbig, M
2013-06-01
When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...
Medicare Contracting - Redacted Benchmark Metric Reports
U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services has compiled aggregate national benchmark cost and workload metrics using data submitted to CMS by the AB MACs and the...
XWeB: The XML Warehouse Benchmark
Mahboubi, Hadj; Darmont, Jérôme
With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure the feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.
Professional Performance and Bureaucratic Benchmarking Information
DEFF Research Database (Denmark)
Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz
provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically receive bureaucratic benchmarking information from the administration. We find that more frequent bureaucratic...
Benchmarking of PR Function in Serbian Companies
National Research Council Canada - National Science Library
Nikolić, Milan; Sajfert, Zvonko; Vukonjanski, Jelena
2009-01-01
The purpose of this paper is to present methodologies for carrying out benchmarking of the PR function in Serbian companies and to test the practical application of the research results and proposed...
Energy Technology Data Exchange (ETDEWEB)
Gupta, R.
1998-12-31
The goal of the lectures on lattice QCD (LQCD) is to provide an overview of both the technical issues and the progress made so far in obtaining phenomenologically useful numbers. The lectures consist of three parts. The author's charter is to provide an introduction to LQCD and outline the scope of LQCD calculations. In the second set of lectures, Guido Martinelli will discuss the progress made so far in obtaining results and their impact on Standard Model phenomenology. Finally, Martin Luescher will discuss the topical subjects of chiral symmetry, improved formulations of lattice QCD, and the impact these improvements will have on the quality of results expected from the next generation of simulations.
Lattice Quantum Chromodynamics
Sachrajda, C. T.
2016-10-01
I review the application of the lattice formulation of QCD and large-scale numerical simulations to the evaluation of non-perturbative hadronic effects in Standard Model phenomenology. I present an introduction to the elements of the calculations and discuss the limitations both in the range of quantities which can be studied and in the precision of the results. I focus particularly on the extraction of the QCD parameters, i.e. the quark masses and the strong coupling constant, and on important quantities in flavour physics. Lattice QCD is playing a central role in quantifying the hadronic effects necessary for the development of precision flavour physics and its use in exploring the limits of the Standard Model and in searches for inconsistencies which would signal the presence of new physics.
Lattices of dielectric resonators
Trubin, Alexander
2016-01-01
This book provides the analytical theory of complex systems composed of a large number of high-Q dielectric resonators. Spherical and cylindrical dielectric resonators with inferior and also whispering-gallery oscillations allocated in various lattices are considered. A new approach to S-matrix parameter calculations based on perturbation theory of Maxwell's equations, developed for a number of high-Q dielectric bodies, is introduced. All physical relationships are obtained in analytical form and are suitable for further computations. Particular attention is given to a new unified formalism for the description of scattering processes. The general scattering task for coupled eigen-oscillations of the whole system of dielectric resonators is described. The equations for the expansion coefficients are presented in a readily applicable form. The temporal Green functions for the dielectric resonator are presented. The scattering of short pulses in dielectric filter structures, dielectric antennas and lattices of d...
Fractional lattice charge transport
Flach, Sergej; Khomeriki, Ramaz
2017-01-01
We consider the dynamics of noninteracting quantum particles on a square lattice in the presence of a magnetic flux α and a dc electric field E oriented along the lattice diagonal. In general, the adiabatic dynamics will be characterized by Bloch oscillations in the electric field direction and dispersive ballistic transport in the perpendicular direction. For rational values of α and a corresponding discrete set of values of E(α), vanishing gaps in the spectrum induce a fractionalization of the charge in the perpendicular direction: while left movers still perform dispersive ballistic transport, the complementary fraction of right movers propagates in a dispersionless relativistic manner in the opposite direction. Generalizations and the possible probing of the effect with atomic Bose-Einstein condensates and photonic networks are discussed. The Zak phase of the respective band associated with the gap-closing regime has been computed and is found to converge to the value π/2. PMID:28102302
Borsanyi, Sz; Kampert, K H; Katz, S D; Kawanai, T; Kovacs, T G; Mages, S W; Pasztor, A; Pittler, F; Redondo, J; Ringwald, A; Szabo, K K
2016-01-01
We present a full result for the equation of state (EoS) in 2+1+1 (up/down, strange and charm quarks are present) flavour lattice QCD. We extend this analysis and give the equation of state in 2+1+1+1 flavour QCD. In order to describe the evolution of the universe from temperatures of several hundred GeV down to several tens of MeV, we also include the known effects of the electroweak theory and give the effective degrees of freedom. As another application of lattice QCD we calculate the topological susceptibility (chi) up to the few-GeV temperature region. These two results, EoS and chi, can be used to predict the dark matter axion's mass in the post-inflation scenario and/or give the relationship between the axion's mass and the universal axionic angle, which acts as an initial condition of our universe.
Solitons in nonlinear lattices
Kartashov, Yaroslav V; Torner, Lluis
2010-01-01
This article offers a comprehensive survey of results obtained for solitons and complex nonlinear wave patterns supported by purely nonlinear lattices (NLs), which represent a spatially periodic modulation of the local strength and sign of the nonlinearity, and their combinations with linear lattices. A majority of the results obtained, thus far, in this field and reviewed in this article are theoretical. Nevertheless, relevant experimental settings are surveyed too, with emphasis on perspectives for implementation of the theoretical predictions in the experiment. Physical systems discussed in the review belong to the realms of nonlinear optics (including artificial optical media, such as photonic crystals, and plasmonics) and Bose-Einstein condensation (BEC). The solitons are considered in one, two, and three dimensions (1D, 2D, and 3D). Basic properties of the solitons presented in the review are their existence, stability, and mobility. Although the field is still far from completion, general conclusions c...
Parametric lattice Boltzmann method
Shim, Jae Wan
2017-06-01
The discretized equilibrium distributions of the lattice Boltzmann method are presented by using the coefficients of the Lagrange interpolating polynomials that pass through the points related to discrete velocities and using moments of the Maxwell-Boltzmann distribution. The ranges of flow velocity and temperature providing positive valued distributions vary with regulating discrete velocities as parameters. New isothermal and thermal compressible models are proposed for flows of the level of the isothermal and thermal compressible Navier-Stokes equations. Thermal compressible shock tube flows are simulated by only five on-lattice discrete velocities. Two-dimensional isothermal and thermal vortices provoked by the Kelvin-Helmholtz instability are simulated by the parametric models.
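The moment-matching idea underlying discretized equilibria like those above can be shown on the smallest possible lattice. The sketch below is not the paper's Lagrange-interpolation construction; it simply solves, in exact arithmetic, for the D1Q3 weights (velocities -1, 0, 1) that reproduce the first three moments of a rest-state Maxwell-Boltzmann distribution with lattice temperature θ = 1/3 (a standard choice, assumed here for illustration):

```python
from fractions import Fraction as F

def solve(A, b):
    """Gauss-Jordan elimination with exact fractions (square, nonsingular A)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]      # normalize pivot row
        for r in range(n):
            if r != col and M[r][col] != 0:             # eliminate the column
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

# D1Q3 lattice: match moments sum w_i c_i^k for k = 0, 1, 2 to the
# rest-state values (density 1, velocity 0, temperature theta = 1/3).
c = [F(-1), F(0), F(1)]
theta = F(1, 3)
A = [[ci ** k for ci in c] for k in range(3)]
w = solve(A, [F(1), F(0), theta])
```

Solving this system yields the familiar D1Q3 weights 1/6, 2/3, 1/6; the parametric models in the paper generalize this by letting the discrete velocities themselves vary as parameters.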
Jipsen, Peter
1992-01-01
The study of lattice varieties is a field that has experienced rapid growth in the last 30 years, but many of the interesting and deep results discovered in that period have so far only appeared in research papers. The aim of this monograph is to present the main results about modular and nonmodular varieties, equational bases and the amalgamation property in a uniform way. The first chapter covers preliminaries that make the material accessible to anyone who has had an introductory course in universal algebra. Each subsequent chapter begins with a short historical introduction which cites the original references and then presents the results with complete proofs (in nearly all cases). Numerous diagrams illustrate the beauty of lattice theory and aid in the visualization of many proofs. An extensive index and bibliography also make the monograph a useful reference work.
A framework of benchmarking land models
Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.
2012-02-01
Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
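Component (3) of the framework above, a scoring system combining data-model mismatches, can be sketched in a few lines. The normalization, weighting, and score mapping below are illustrative assumptions of mine, not the paper's prescribed metrics:

```python
import math

def nrmse(model, obs):
    """Root-mean-square error normalized by the mean of the observations."""
    mse = sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs)
    return math.sqrt(mse) / (sum(obs) / len(obs))

def benchmark_score(runs, weights):
    """Combine per-process mismatches into a single skill score in (0, 1].

    runs: dict process -> (model_series, obs_series)
    weights: dict process -> relative importance of that process
    A score of 1 means a perfect match on every weighted process.
    """
    total = sum(weights.values())
    err = sum(weights[p] * nrmse(m, o) for p, (m, o) in runs.items()) / total
    return 1.0 / (1.0 + err)
```

A real benchmarking system would add the a priori acceptability thresholds the paper describes, and compute mismatches separately per temporal and spatial scale before combining them.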
A framework for benchmarking land models
Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.
2012-10-01
Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
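The "scoring system to combine data-model mismatches" described in this abstract can be sketched in a few lines. This is an illustrative reading only: the exponential scoring form, the variable names, and the weights below are assumptions for the sketch, not the paper's actual metric.

```python
import numpy as np

def variable_score(model, obs, obs_spread):
    """Normalized mismatch for one variable: RMSE scaled by the
    observational spread, mapped to (0, 1] where 1 = perfect match."""
    rmse = np.sqrt(np.mean((np.asarray(model) - np.asarray(obs)) ** 2))
    return float(np.exp(-rmse / obs_spread))

def combined_score(scores, weights):
    """Weighted aggregate across variables/scales (hypothetical weighting)."""
    w = np.asarray(weights, dtype=float)
    return float(np.dot(scores, w / w.sum()))

# Toy example: two variables (say, GPP and ET) scored against benchmarks.
s1 = variable_score([1.0, 2.0, 3.0], [1.1, 1.9, 3.2], obs_spread=1.0)
s2 = variable_score([0.5, 0.6], [0.5, 0.6], obs_spread=0.2)
total = combined_score([s1, s2], weights=[2, 1])
```

A real framework would also weight by observational uncertainty and aggregate over temporal and spatial scales, which this sketch omits.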
Benchmarking Attosecond Physics with Atomic Hydrogen
2015-05-25
Final report covering 12 Mar 2012 to 11 Mar 2015 under contract FA2386-12-1-4025, "Benchmarking attosecond physics with atomic hydrogen." PI: David Kielpinski, dave.kielpinski@gmail.com, Griffith University.
Aerodynamic Benchmarking of the Deepwind Design
DEFF Research Database (Denmark)
Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge;
2015-01-01
The aerodynamic benchmarking for the DeepWind rotor is conducted comparing different rotor geometries and solutions and keeping the comparison as fair as possible. The objective for the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize...... NACA airfoil family. (C) 2015 Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license...
Benchmarking Danish Vocational Education and Training Programmes
DEFF Research Database (Denmark)
Bogetoft, Peter; Wittrup, Jesper
This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes...... attempt to summarise the various effects that the colleges have in two relevant figures, namely retention rates of students and employment rates among students who have completed training programmes....
Implementation of NAS Parallel Benchmarks in Java
Frumkin, Michael; Schultz, Matthew; Jin, Hao-Qiang; Yan, Jerry
2000-01-01
A number of features make Java an attractive but a debatable choice for High Performance Computing (HPC). In order to gauge the applicability of Java to the Computational Fluid Dynamics (CFD) we have implemented NAS Parallel Benchmarks in Java. The performance and scalability of the benchmarks point out the areas where improvement in Java compiler technology and in Java thread implementation would move Java closer to Fortran in the competition for CFD applications.
The MCNP6 Analytic Criticality Benchmark Suite
Energy Technology Data Exchange (ETDEWEB)
Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
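The simplest analytic criticality benchmark of the kind this suite collects is a one-group, infinite-medium problem, where the multiplication factor has a closed form that any transport code must reproduce exactly. The cross sections below are hypothetical, chosen only to illustrate the check:

```python
# One-group, infinite-medium multiplication factor: k_inf = nu * Sigma_f / Sigma_a.
def k_infinite(nu, sigma_f, sigma_a):
    """Analytic k-infinity for a homogeneous infinite medium."""
    return nu * sigma_f / sigma_a

# Hypothetical macroscopic cross sections (cm^-1); a Monte Carlo code run on
# this material should reproduce the analytic value to within statistics.
nu, sigma_f, sigma_a = 2.5, 0.05, 0.125
k_ref = k_infinite(nu, sigma_f, sigma_a)  # 2.5 * 0.05 / 0.125 = 1.0
```

Verification then amounts to comparing the code's computed eigenvalue against `k_ref` to far tighter tolerances than any experiment could provide.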
Simple Benchmark Specifications for Space Radiation Protection
Singleterry, Robert C. Jr.; Aghara, Sukesh K.
2013-01-01
This report defines space radiation benchmark specifications. This specification starts with simple, monoenergetic, mono-directional particles on slabs and progresses to human models in spacecraft. This report specifies the models and sources to be used and what the team performing the benchmark needs to produce in a report. Also included are brief descriptions of how OLTARIS, the NASA Langley website for space radiation analysis, performs its analysis.
International Lattice Data Grid
Davies, C T H; Kenway, R D; Maynard, C M
2002-01-01
We propose the co-ordination of lattice QCD grid developments in different countries to allow transparent exchange of gauge configurations in future, should participants wish to do so. We describe briefly UKQCD's XML schema for labelling and cataloguing the data. A meeting to further develop these ideas will be held in Edinburgh on 19/20 December 2002, and will be available over AccessGrid.
Weakly deformed soliton lattices
Energy Technology Data Exchange (ETDEWEB)
Dubrovin, B. (Moskovskij Gosudarstvennyj Univ., Moscow (USSR). Dept. of Mechanics and Mathematics)
1990-12-01
In this lecture the author discusses periodic and quasiperiodic solutions of nonlinear evolution equations of phi{sub t}=K (phi, phi{sub x},..., phi{sup (n)}), the so-called soliton lattices. After introducing the theory of integrable systems of hydrodynamic type he discusses their Hamiltonian formalism, i.e. the theory of Poisson brackets of hydrodynamic type. Then he describes the application of algebraic geometry to the effective integration of such equations. (HSI).
Crystallographic Lattice Boltzmann Method
Namburi, Manjusha; Krithivasan, Siddharth; Ansumali, Santosh
2016-01-01
Current approaches to Direct Numerical Simulation (DNS) are computationally quite expensive for most realistic scientific and engineering applications of Fluid Dynamics such as automobiles or atmospheric flows. The Lattice Boltzmann Method (LBM), with its simplified kinetic descriptions, has emerged as an important tool for simulating hydrodynamics. In a heterogeneous computing environment, it is often preferred due to its flexibility and better parallel scaling. However, direct simulation of realistic applications, without the use of turbulence models, remains a distant dream even with highly efficient methods such as LBM. In LBM, a fictitious lattice with suitable isotropy in the velocity space is considered to recover Navier-Stokes hydrodynamics in macroscopic limit. The same lattice is mapped onto a cartesian grid for spatial discretization of the kinetic equation. In this paper, we present an inverted argument of the LBM, by making spatial discretization as the central theme. We argue that the optimal spatial discretization for LBM is a Body Centered Cubic (BCC) arrangement of grid points. We illustrate an order-of-magnitude gain in efficiency for LBM and thus a significant progress towards feasibility of DNS for realistic flows. PMID:27251098
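The "suitable isotropy in the velocity space" this abstract mentions is a concrete, checkable property: the lattice weights and velocities must satisfy specific moment conditions for Navier-Stokes hydrodynamics to emerge. A sketch for the standard D2Q9 lattice (the usual Cartesian case, not the BCC arrangement the paper advocates):

```python
import numpy as np

# D2Q9 velocity set and weights (standard LBM lattice on a square grid).
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

cs2 = 1/3  # lattice speed of sound squared

# Moment conditions required to recover hydrodynamics in the macroscopic limit:
m0 = w.sum()                                          # zeroth moment = 1
m2 = np.einsum('q,qa,qb->ab', w, c, c)                # = cs2 * identity
m4 = np.einsum('q,qa,qb,qc,qd->abcd', w, c, c, c, c)  # isotropic 4th-order tensor
```

The fourth moment must equal cs2^2 * (delta_ab delta_cd + delta_ac delta_bd + delta_ad delta_bc); checking these identities is a quick sanity test for any candidate lattice, BCC included.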
Bietenholz, W; Pepe, M; Wiese, U -J
2010-01-01
We consider lattice field theories with topological actions, which are invariant against small deformations of the fields. Some of these actions have infinite barriers separating different topological sectors. Topological actions do not have the correct classical continuum limit and they cannot be treated using perturbation theory, but they still yield the correct quantum continuum limit. To show this, we present analytic studies of the 1-d O(2) and O(3) model, as well as Monte Carlo simulations of the 2-d O(3) model using topological lattice actions. Some topological actions obey and others violate a lattice Schwarz inequality between the action and the topological charge $Q$. Irrespective of this, in the 2-d O(3) model the topological susceptibility $\chi_t = \langle Q^2 \rangle/V$ is logarithmically divergent in the continuum limit. Still, at non-zero distance the correlator of the topological charge density has a finite continuum limit which is consistent with analytic predictions. Our study shows explicitly that some cla...
Adamatzky, Andrew
2015-01-01
The book gives a comprehensive overview of the state-of-the-art research and engineering in theory and application of Lattice Automata in design and control of autonomous Robots. Automata and robots share the same notional meaning. Automata (originated from the latinization of the Greek word “αυτόματον”) as self-operating autonomous machines invented from ancient years can be easily considered the first steps of robotic-like efforts. Automata are mathematical models of Robots and also they are integral parts of robotic control systems. A Lattice Automaton is a regular array or a collective of finite state machines, or automata. The Automata update their states by the same rules depending on states of their immediate neighbours. In the context of this book, Lattice Automata are used in developing modular reconfigurable robotic systems, path planning and map exploration for robots, as robot controllers, synchronisation of robot collectives, robot vision, parallel robotic actuators. All chapters are...
Hadroquarkonium from lattice QCD
Alberti, Maurizio; Bali, Gunnar S.; Collins, Sara; Knechtli, Francesco; Moir, Graham; Söldner, Wolfgang
2017-04-01
The hadroquarkonium picture [S. Dubynskiy and M. B. Voloshin, Phys. Lett. B 666, 344 (2008), 10.1016/j.physletb.2008.07.086] provides one possible interpretation for the pentaquark candidates with hidden charm, recently reported by the LHCb Collaboration, as well as for some of the charmoniumlike "X , Y , Z " states. In this picture, a heavy quarkonium core resides within a light hadron giving rise to four- or five-quark/antiquark bound states. We test this scenario in the heavy quark limit by investigating the modification of the potential between a static quark-antiquark pair induced by the presence of a hadron. Our lattice QCD simulations are performed on a Coordinated Lattice Simulations (CLS) ensemble with Nf=2 +1 flavors of nonperturbatively improved Wilson quarks at a pion mass of about 223 MeV and a lattice spacing of about a =0.0854 fm . We study the static potential in the presence of a variety of light mesons as well as of octet and decuplet baryons. In all these cases, the resulting configurations are favored energetically. The associated binding energies between the quarkonium in the heavy quark limit and the light hadron are found to be smaller than a few MeV, similar in strength to deuterium binding. It needs to be seen if the small attraction survives in the infinite volume limit and supports bound states or resonances.
Digital lattice gauge theories
Zohar, Erez; Farace, Alessandro; Reznik, Benni; Cirac, J. Ignacio
2017-02-01
We propose a general scheme for a digital construction of lattice gauge theories with dynamical fermions. In this method, the four-body interactions arising in models with 2 +1 dimensions and higher are obtained stroboscopically, through a sequence of two-body interactions with ancillary degrees of freedom. This yields stronger interactions than the ones obtained through perturbative methods, as typically done in previous proposals, and removes an important bottleneck in the road towards experimental realizations. The scheme applies to generic gauge theories with Lie or finite symmetry groups, both Abelian and non-Abelian. As a concrete example, we present the construction of a digital quantum simulator for a Z3 lattice gauge theory with dynamical fermionic matter in 2 +1 dimensions, using ultracold atoms in optical lattices, involving three atomic species, representing the matter, gauge, and auxiliary degrees of freedom, that are separated in three different layers. By moving the ancilla atoms with a proper sequence of steps, we show how we can obtain the desired evolution in a clean, controlled way.
Benchmarking for Cost Improvement. Final report
Energy Technology Data Exchange (ETDEWEB)
1993-09-01
The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.
Benchmarking infrastructure for mutation text mining
2014-01-01
Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600
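The performance metrics such an infrastructure computes (there via SPARQL over RDF annotations) reduce to set comparisons between gold-standard and predicted annotations. A plain-Python equivalent of that computation, with hypothetical mutation annotations:

```python
def prf(gold, predicted):
    """Precision, recall, and F1 over sets of (document, mutation) annotations."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)                      # true positives
    p = tp / len(predicted) if predicted else 0.0   # precision
    r = tp / len(gold) if gold else 0.0             # recall
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

# Hypothetical gold vs. system annotations for two documents.
gold = {("doc1", "E545K"), ("doc1", "V600E"), ("doc2", "G12D")}
pred = {("doc1", "E545K"), ("doc2", "G12D"), ("doc2", "T790M")}
p, r, f1 = prf(gold, pred)  # p = 2/3, r = 2/3, f1 = 2/3
```

The benefit of the RDF/SPARQL approach the paper describes is that these same comparisons are expressed declaratively, so no such custom code is needed per system.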
A Mechanical Lattice Aid for Crystallography Teaching.
Amezcua-Lopez, J.; Cordero-Borboa, A. E.
1988-01-01
Introduces a 3-dimensional mechanical lattice with adjustable telescoping mechanisms. Discusses the crystalline state, the 14 Bravais lattices, operational principles of the mechanical lattice, construction methods, and demonstrations in classroom. Provides lattice diagrams, schemes of the lattice, and various pictures of the lattice. (YP)
Kenneth Wilson and lattice QCD
Ukawa, Akira
2015-01-01
We discuss the physics and computation of lattice QCD, a space-time lattice formulation of quantum chromodynamics, and Kenneth Wilson's seminal role in its development. We start with the fundamental issue of confinement of quarks in the theory of the strong interactions, and discuss how lattice QCD provides a framework for understanding this phenomenon. A conceptual issue with lattice QCD is a conflict of space-time lattice with chiral symmetry of quarks. We discuss how this problem is resolved. Since lattice QCD is a non-linear quantum dynamical system with infinite degrees of freedom, quantities which are analytically calculable are limited. On the other hand, it provides an ideal case of massively parallel numerical computations. We review the long and distinguished history of parallel-architecture supercomputers designed and built for lattice QCD. We discuss algorithmic developments, in particular the difficulties posed by the fermionic nature of quarks, and their resolution. The triad of efforts toward b...
Criticality benchmark guide for light-water-reactor fuel in transportation and storage packages
Energy Technology Data Exchange (ETDEWEB)
Lichtenwalter, J.J.; Bowman, S.M.; DeHart, M.D.; Hopper, C.M.
1997-03-01
This report is designed as a guide for performing criticality benchmark calculations for light-water-reactor (LWR) fuel applications. The guide provides documentation of 180 criticality experiments with geometries, materials, and neutron interaction characteristics representative of transportation packages containing LWR fuel or uranium oxide pellets or powder. These experiments should benefit the U.S. Nuclear Regulatory Commission (NRC) staff and licensees in validation of computational methods used in LWR fuel storage and transportation concerns. The experiments are classified by key parameters such as enrichment, water/fuel volume, hydrogen-to-fissile ratio (H/X), and lattice pitch. Groups of experiments with common features such as separator plates, shielding walls, and soluble boron are also identified. In addition, a sample validation using these experiments and a statistical analysis of the results are provided. Recommendations for selecting suitable experiments and determination of calculational bias and uncertainty are presented as part of this benchmark guide.
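The "calculational bias and uncertainty" determination mentioned above is, at its core, simple statistics over calculated-minus-expected k-eff values across the benchmark set. A minimal sketch with hypothetical numbers (real validations use far larger experiment sets, trending analysis, and tolerance factors):

```python
import statistics

# Calculated k-eff for a set of critical benchmark experiments
# (hypothetical values; the experiments are critical, so expected k-eff = 1).
k_calc = [0.9978, 1.0012, 0.9991, 1.0005, 0.9969]
k_exp = [1.0000] * 5

diffs = [c - e for c, e in zip(k_calc, k_exp)]
bias = statistics.mean(diffs)       # negative bias => code underpredicts k-eff
bias_unc = statistics.stdev(diffs)  # sample spread about the bias
```

A safety analyst would then subtract the bias, its uncertainty (times a tolerance factor), and administrative margins from 1.0 to set an upper subcritical limit.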
Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies
Energy Technology Data Exchange (ETDEWEB)
Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2017-05-23
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
Directory of Open Access Journals (Sweden)
Jahn, Franziska
2015-08-01
Full Text Available Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system’s and information management’s costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme observes both general conditions and examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing costs and quality of application systems, physical data processing systems, organizational structures of information management and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.
Storage-Intensive Supercomputing Benchmark Study
Energy Technology Data Exchange (ETDEWEB)
Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A
2007-10-30
Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared performance of software-only to GPU-accelerated. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows
Lattice topology dictates photon statistics.
Kondakci, H Esat; Abouraddy, Ayman F; Saleh, Bahaa E A
2017-08-21
Propagation of coherent light through a disordered network is accompanied by randomization and possible conversion into thermal light. Here, we show that network topology plays a decisive role in determining the statistics of the emerging field if the underlying lattice is endowed with chiral symmetry. In such lattices, eigenmode pairs come in skew-symmetric pairs with oppositely signed eigenvalues. By examining one-dimensional arrays of randomly coupled waveguides arranged on linear and ring topologies, we are led to a remarkable prediction: the field circularity and the photon statistics in ring lattices are dictated by its parity while the same quantities are insensitive to the parity of a linear lattice. For a ring lattice, adding or subtracting a single lattice site can switch the photon statistics from super-thermal to sub-thermal, or vice versa. This behavior is understood by examining the real and imaginary fields on a lattice exhibiting chiral symmetry, which form two strands that interleave along the lattice sites. These strands can be fully braided around an even-sited ring lattice thereby producing super-thermal photon statistics, while an odd-sited lattice is incommensurate with such an arrangement and the statistics become sub-thermal.
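The parity effect described here can be reproduced numerically: an even-sited ring with purely off-diagonal (nearest-neighbour) couplings is bipartite, so its eigenvalues come in skew-symmetric +/- pairs, while an odd-sited ring breaks this pairing. A sketch (the random real couplings are illustrative, not the paper's waveguide model):

```python
import numpy as np

def ring_coupling(n, rng):
    """Nearest-neighbour coupling matrix of an n-site ring with random real
    couplings. Off-diagonal only, so the lattice is chiral (bipartite)
    exactly when n is even."""
    H = np.zeros((n, n))
    for i in range(n):
        j = (i + 1) % n
        H[i, j] = H[j, i] = rng.uniform(0.5, 1.5)
    return H

rng = np.random.default_rng(0)
ev_even = np.sort(np.linalg.eigvalsh(ring_coupling(6, rng)))
ev_odd = np.sort(np.linalg.eigvalsh(ring_coupling(7, rng)))

# Even ring: spectrum symmetric about zero (eigenvalues in +/- pairs).
sym_even = np.allclose(ev_even, -ev_even[::-1])
# Odd ring: chiral symmetry broken; spectrum generically not symmetric.
sym_odd = np.allclose(ev_odd, -ev_odd[::-1])
```

Adding or removing one site flips the lattice between these two spectral classes, which is the mechanism behind the super-thermal versus sub-thermal switch the abstract reports.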
Full sphere hydrodynamic and dynamo benchmarks
Marti, P.
2014-01-26
Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.
Criteria of benchmark selection for efficient flexible multibody system formalisms
Directory of Open Access Journals (Sweden)
Valášek M.
2007-10-01
Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized, and the purposes of benchmark selection are investigated. The result of this analysis is a formulation of criteria for selecting benchmarks for flexible multibody formalisms. Based on these criteria, an initial set of suitable benchmarks is described. In addition, the evaluation measures are revised and extended.
Test Nationally, Benchmark Locally: Using Local DIBELS Benchmarks to Predict Performance on the PSSA
Ferchalk, Matthew R.
2013-01-01
The Dynamic Indicators of Basic Early Literacy Skills (DIBELS) benchmarks are frequently used to make important decisions regarding student performance. More information, however, is needed to understand if the nationally derived benchmarks created by the DIBELS system provide the most accurate criterion for evaluating reading proficiency. The…
El-Saed, Aiman; Balkhy, Hanan H; Weber, David J
2013-10-01
Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restriction. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods, and differences in surveillance environments, including regulations, should also be taken into consideration. The GCC center for infection control has taken steps to unify HAI surveillance systems in the region, but GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable healthcare workers and researchers to obtain more accurate and realistic comparisons.
The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example
Steyn, H. J.
2015-01-01
Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…
Drashkovicheva, Kh; Igoshin, V I; Katrinyak, T; Kolibiar, M
1989-01-01
This book is another publication in the recent surveys of ordered sets and lattices. The papers, which might be characterized as "reviews of reviews," are based on articles reviewed in the Referativnyi Zhurnal: Matematika from 1978 to 1982. For the sake of completeness, the authors also attempted to integrate information from other relevant articles from that period. The bibliography of each paper provides references to the reviews in RZhMat and Mathematical Reviews where one can seek more detailed information. Specifically excluded from consideration in this volume were such topics as al
Lattice Vibrations in Chlorobenzenes:
DEFF Research Database (Denmark)
Reynolds, P. A.; Kjems, Jørgen; White, J. W.
1974-01-01
Lattice vibrational dispersion curves for the ``intermolecular'' modes in the triclinic, one molecule per unit cell β phase of p‐C6D4Cl2 and p‐C6H4Cl2 have been obtained by inelastic neutron scattering. The deuterated sample was investigated at 295 and at 90°K and a linear extrapolation to 0°K...... by consideration of electrostatic forces or by further anisotropy in the dispersion forces not described in the atom‐atom model. Anharmonic effects are shown to be large, but the dominant features in the temperature variation of frequencies are describable by a quasiharmonic model....
Features and technology of enterprise internal benchmarking
Directory of Open Access Journals (Sweden)
A.V. Dubodelova
2013-06-01
Full Text Available The aim of the article. The aim of the article is to generalize the characteristics, objectives, and advantages of internal benchmarking, and to formulate the sequence of stages of internal benchmarking technology, focused on continuous improvement of enterprise processes by implementing existing best practices. The results of the analysis. In a crisis business environment, domestic enterprises have to focus on the best success factors of their structural units, using standard research assessment of their performance and their innovative experience in practice. Internal benchmarking is a modern method of satisfying those needs; according to Bain & Co, internal benchmarking is one of the three most common methods of business management. The features and benefits of internal benchmarking are defined in the article, and the sequence and methodology of implementing the individual stages of benchmarking technology projects are formulated. The authors define benchmarking as a strategic orientation toward the best achievement, comparing performance and working methods against a reference standard. It covers the study of research, production and distribution organization, and management and marketing methods of reference objects in order to identify innovative practices and implement them in a particular business. Developing benchmarking at domestic enterprises requires analysis of its theoretical basis and of practical experience; choosing the best experience helps to develop recommendations for its application in practice. It is also essential to classify its types, identify their characteristics, study appropriate areas of use, and develop a methodology of implementation. The objectives of internal benchmarking include: promoting research and establishing minimum acceptable levels of efficiency for the processes and activities available at the enterprise; and identifying current problems and areas that need improvement without involving external experience
Toxicological benchmarks for wildlife: 1994 Revision
Energy Technology Data Exchange (ETDEWEB)
Opresko, D.M.; Sample, B.E.; Suter, G.W. II
1994-09-01
The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment in which concentrations of contaminants in the environment are compared to toxicological benchmarks, which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment in which toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal contact is not considered in this report.
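The first-tier screening described above reduces to comparing measured media concentrations against the benchmark values and flagging exceedances for the second tier. A minimal sketch with made-up placeholder numbers (not values from the report):

```python
# First-tier screening: flag chemicals whose measured media concentration
# exceeds its toxicological benchmark.  Values below are illustrative
# placeholders, not figures from the report.
benchmarks_mg_per_l = {"cadmium": 0.01, "zinc": 0.5, "mercury": 0.002}
measured_mg_per_l = {"cadmium": 0.004, "zinc": 1.2, "mercury": 0.0005}

flagged = [chem for chem, conc in measured_mg_per_l.items()
           if conc > benchmarks_mg_per_l[chem]]
print(flagged)  # ['zinc']
```

Only the flagged chemicals would proceed to the baseline ecological risk assessment.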
Czarnik, Piotr; Rams, Marek M.; Dziarmaga, Jacek
2016-12-01
A Gibbs operator e^{-βH} for a two-dimensional (2D) lattice system with a Hamiltonian H can be represented by a 3D tensor network, with the third dimension being the imaginary time (inverse temperature) β. Coarse graining the network along β results in a 2D projected entangled-pair operator (PEPO) with a finite bond dimension. The coarse graining is performed by a tree tensor network of isometries. They are optimized variationally to maximize the accuracy of the PEPO as a representation of the 2D thermal state e^{-βH}. The algorithm is applied to the two-dimensional Hubbard model on an infinite square lattice. Benchmark results at finite temperature are obtained that are consistent with the best cluster dynamical mean-field theory and power-series expansion in the regime of parameters where they yield mutually consistent results.
Lattice harmonics expansion revisited
Kontrym-Sznajd, G.; Holas, A.
2017-04-01
The main subject of the work is to provide the most effective way of determining the expansion of some quantities into orthogonal polynomials, when these quantities are known only along some limited number of sampling directions. By comparing the commonly used Houston method with the method based on the orthogonality relation, some relationships, which define the applicability and correctness of these methods, are demonstrated. They are verified for various sets of sampling directions applicable for expanding quantities having the full symmetry of the Brillouin zone of cubic and non-cubic lattices. All results clearly show that the Houston method is always better than the orthogonality-relation one. For the cubic symmetry we present a few sets of special directions (SDs) showing how their construction and, next, a proper application depend on the choice of various sets of lattice harmonics. SDs are important mainly for experimentalists who want to reconstruct anisotropic quantities from their measurements, performed at a limited number of sampling directions.
Extreme lattices: symmetries and decorrelation
Andreanov, A.; Scardicchio, A.; Torquato, S.
2016-11-01
We study statistical and structural properties of extreme lattices, which are the local minima in the density landscape of lattice sphere packings in d-dimensional Euclidean space R^d. Specifically, we ascertain statistics of the densities and kissing numbers as well as the numbers of distinct symmetries of the packings for dimensions 8 through 13 using the stochastic Voronoi algorithm. The extreme lattices in a fixed dimension of space d (d ≥ 8) are dominated by typical lattices that have similar packing properties, such as packing densities and kissing numbers, while the best and the worst packers are in the long tails of the distribution of the extreme lattices. We also study the validity of the recently proposed decorrelation principle, which has important implications for sphere packings in general. The degree to which extreme-lattice packings decorrelate, as well as how decorrelation is related to the packing density and symmetry of the lattices as the space dimension increases, is also investigated. We find that the extreme lattices decorrelate with increasing dimension, while the least symmetric lattices decorrelate faster.
A LATTICE BOLTZMANN SUBGRID MODEL FOR LID-DRIVEN CAVITY FLOW
Institute of Scientific and Technical Information of China (English)
YANG Fan; LIU Shu-hong; WU Yu-lin; TANG Xue-lin
2005-01-01
In recent years, the Lattice Boltzmann Method (LBM) has developed into an alternative and promising numerical scheme for simulating fluid flows and modeling physics in fluids. In order to extend the LBM to high Reynolds number fluid flow applications, a subgrid turbulence model for the LBM was introduced based on the standard Smagorinsky subgrid model and the Lattice Bhatnagar-Gross-Krook (LBGK) model. The subgrid LBGK model was subsequently used to simulate two-dimensional driven cavity flow at high Reynolds numbers. The simulation results, including the distribution of streamlines, dimensionless velocity distributions, values of the stream function, and the location of the vortex center, were compared with benchmark solutions, with satisfactory agreement.
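In such subgrid LBGK models, the Smagorinsky closure enters by inflating the relaxation time with an eddy viscosity computed from the local non-equilibrium momentum flux. A sketch of one common closed form (after Hou et al. 1996, in lattice units with unit density; the constant `csm` and the sample tensor are illustrative assumptions, not values from this paper):

```python
import numpy as np

def smagorinsky_tau(tau0, pi_neq, csm=0.16):
    """Effective LBGK relaxation time with a Smagorinsky eddy viscosity.

    pi_neq : non-equilibrium momentum-flux tensor (lattice units, unit density).
    Uses one common closed form (after Hou et al. 1996); the Smagorinsky
    constant csm is an illustrative choice.
    """
    q = np.sqrt(2.0 * np.sum(pi_neq * pi_neq))   # norm of the stress tensor
    return 0.5 * (tau0 + np.sqrt(tau0**2 + 18.0 * np.sqrt(2.0) * csm**2 * q))

# sample non-equilibrium stress (made-up numbers for illustration)
pi = np.array([[2e-4, 1e-4],
               [1e-4, -2e-4]])
tau_eff = smagorinsky_tau(0.6, pi)
print(tau_eff > 0.6)   # the eddy viscosity only ever increases tau: True
```

With zero non-equilibrium stress the formula collapses back to the bare relaxation time, so laminar regions are unaffected.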
A lattice Boltzmann coupled to finite volumes method for solving phase change problems
Directory of Open Access Journals (Sweden)
El Ganaoui Mohammed
2009-01-01
Full Text Available A numerical scheme coupling the lattice Boltzmann and finite volume approaches has been developed and qualified on test cases of phase change problems. In this work, the coupled partial differential equations of momentum conservation are solved with a non-uniform lattice Boltzmann method, while the energy equation is discretized using a finite volume method. Simulations show the ability of this hybrid method to model the effects of convection and to predict transfers. Benchmarking covers both conduction-dominated and convection-dominated solid/liquid transitions. Comparisons are made with available analytical solutions and experimental results.
Coral benchmarks in the center of biodiversity.
Licuanan, W Y; Robles, R; Dygico, M; Songco, A; van Woesik, R
2017-01-30
There is an urgent need to quantify coral reef benchmarks that assess changes and recovery rates through time and serve as goals for management. Yet, few studies have identified benchmarks for hard coral cover and diversity in the center of marine diversity. In this study, we estimated coral cover and generic diversity benchmarks on the Tubbataha reefs, the largest and best-enforced no-take marine protected area in the Philippines. The shallow (2-6m) reef slopes of Tubbataha were monitored annually, from 2012 to 2015, using hierarchical sampling. Mean coral cover was 34% (σ±1.7) and generic diversity was 18 (σ±0.9) per 75m by 25m station. The southeastern leeward slopes supported on average 56% coral cover, whereas the northeastern windward slopes supported 30%, and the western slopes supported 18% coral cover. Generic diversity was more spatially homogeneous than coral cover. Copyright © 2016 Elsevier Ltd. All rights reserved.
Professional Performance and Bureaucratic Benchmarking Information
DEFF Research Database (Denmark)
Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz
Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic...... controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based...... and professional performance but only if prior professional performance was low. Supplemental analyses support the robustness of our results. Findings indicate conditions under which bureaucratic benchmarking information may affect professional performance and advance research on professional control and social...
The national hydrologic bench-mark network
Cobb, Ernest D.; Biesecker, J.E.
1971-01-01
The United States is undergoing a dramatic growth of population and demands on its natural resources. The effects are widespread and often produce significant alterations of the environment. The hydrologic bench-mark network was established to provide data on stream basins which are little affected by these changes. The network is made up of selected stream basins which are not expected to be significantly altered by man. Data obtained from these basins can be used to document natural changes in hydrologic characteristics with time, to provide a better understanding of the hydrologic structure of natural basins, and to provide a comparative base for studying the effects of man on the hydrologic environment. There are 57 bench-mark basins in 37 States. These basins are in areas having a wide variety of climate and topography. The bench-mark basins and the types of data collected in the basins are described.
DWEB: A Data Warehouse Engineering Benchmark
Darmont, Jérôme; Boussaïd, Omar
2005-01-01
Data warehouse architectural choices and optimization techniques are critical to decision support query performance. To facilitate these choices, the performance of the designed data warehouse must be assessed. This is usually done with the help of benchmarks, which can either help system users compare the performance of different systems, or help system engineers test the effect of various design choices. While the TPC standard decision support benchmarks address the first point, they are not tuneable enough to address the second one and fail to model different data warehouse schemas. By contrast, our Data Warehouse Engineering Benchmark (DWEB) allows the generation of various ad hoc synthetic data warehouses and workloads. DWEB is fully parameterized to fulfill data warehouse design needs. However, two levels of parameterization keep it relatively easy to tune. Finally, DWEB is implemented as free Java software that can be interfaced with most existing relational database management systems. A sample usag...
Benchmarking optimization solvers for structural topology optimization
DEFF Research Database (Denmark)
Rojas Labanda, Susana; Stolpe, Mathias
2015-01-01
The purpose of this article is to benchmark different optimization solvers when applied to various finite element based structural topology optimization problems. An extensive and representative library of minimum compliance, minimum volume, and mechanism design problem instances for different...... sizes is developed for this benchmarking. The problems are based on a material interpolation scheme combined with a density filter. Different optimization solvers including Optimality Criteria (OC), the Method of Moving Asymptotes (MMA) and its globally convergent version GCMMA, the interior point...... profiles conclude that general solvers are as efficient and reliable as classical structural topology optimization solvers. Moreover, the use of exact Hessians in SAND formulations generally produces designs with better objective function values. However, with the benchmarked implementations solving...
Energy benchmarking of South Australian WWTPs.
Krampe, J
2013-01-01
Optimising the energy consumption and energy generation of wastewater treatment plants (WWTPs) is a topic with increasing importance for water utilities in times of rising energy costs and pressures to reduce greenhouse gas (GHG) emissions. Assessing the energy efficiency and energy optimisation of a WWTP are difficult tasks as most plants vary greatly in size, process layout and other influencing factors. To overcome these limits it is necessary to compare energy efficiency with a statistically relevant base to identify shortfalls and optimisation potential. Such energy benchmarks have been successfully developed and used in central Europe over the last two decades. This paper demonstrates how the latest available energy benchmarks from Germany have been applied to 24 WWTPs in South Australia. It shows how energy benchmarking can be used to identify shortfalls in current performance, prioritise detailed energy assessments and help inform decisions on capital investment.
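In practice, energy benchmarking of this kind ranks plants by their shortfall against a target specific energy use, typically expressed in kWh per population equivalent per year, to prioritise detailed assessments. A sketch with illustrative numbers (not the German benchmark figures or the South Australian data):

```python
# Compare each plant's specific energy use (kWh per population equivalent
# per year) against a benchmark target; all numbers are illustrative.
target_kwh_pe_a = 30.0
plants = {"Plant A": 26.0, "Plant B": 41.5, "Plant C": 33.0}

shortfall = {p: max(0.0, e - target_kwh_pe_a) for p, e in plants.items()}
priority = max(shortfall, key=shortfall.get)   # worst performer first
print(priority)  # 'Plant B'
```

Plants already at or below the target show zero shortfall and drop out of the priority list.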
Confidential benchmarking based on multiparty computation
DEFF Research Database (Denmark)
Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt;
We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential, the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...... debt. We propose a model based on linear programming for doing the benchmarking and implement it using the SPDZ protocol by Damgård et al., which we modify using a new idea that allows clients to supply data and get output without having to participate in the preprocessing phase and without keeping
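Stripped of the multiparty-computation layer, a linear-programming benchmarking score of the kind mentioned can be sketched in the clear as an input-oriented DEA efficiency: minimise θ such that a convex cone of peer units dominates unit k. The data below are synthetic and the actual model and SPDZ protocol details differ:

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic single-input/single-output data (e.g. debt in, earnings out);
# NOT the paper's data or its confidential protocol.
X = np.array([[2.0], [4.0], [8.0]])   # inputs
Y = np.array([[2.0], [3.0], [4.0]])   # outputs

def dea_efficiency(k):
    """Input-oriented CCR efficiency of unit k:
    min theta  s.t.  sum_j lam_j x_j <= theta * x_k,
                     sum_j lam_j y_j >= y_k,  lam >= 0."""
    n = len(X)
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    a_in = np.c_[-X[k].reshape(-1, 1), X.T]          # input constraints
    a_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]   # output constraints
    b = np.r_[np.zeros(X.shape[1]), -Y[k]]
    res = linprog(c, A_ub=np.vstack([a_in, a_out]), b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

print(round(dea_efficiency(0), 3), round(dea_efficiency(2), 3))  # 1.0 0.5
```

A score of 1.0 marks a unit on the efficient frontier; smaller scores measure how far inputs could be shrunk while matching the outputs of an efficient peer mix.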
FGK Benchmark Stars: A new metallicity scale
Jofre, Paula; Soubiran, C; Blanco-Cuaresma, S; Pancino, E; Bergemann, M; Cantat-Gaudin, T; Hernandez, J I Gonzalez; Hill, V; Lardo, C; de Laverny, P; Lind, K; Magrini, L; Masseron, T; Montes, D; Mucciarelli, A; Nordlander, T; Recio-Blanco, A; Sobeck, J; Sordo, R; Sousa, S G; Tabernero, H; Vallenari, A; Van Eck, S; Worley, C C
2013-01-01
In the era of large spectroscopic surveys of stars of the Milky Way, atmospheric parameter pipelines require reference stars to evaluate and homogenize their values. We provide a new metallicity scale for the FGK benchmark stars based on their corresponding fundamental effective temperature and surface gravity. This was done by analyzing homogeneously with up to seven different methods a spectral library of benchmark stars. Although our direct aim was to provide a reference metallicity to be used by the Gaia-ESO Survey, the fundamental effective temperatures and surface gravities of benchmark stars of Heiter et al. 2013 (in prep) and their metallicities obtained in this work can also be used as reference parameters for other ongoing surveys, such as Gaia, HERMES, RAVE, APOGEE and LAMOST.
Professional Performance and Bureaucratic Benchmarking Information
DEFF Research Database (Denmark)
Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz
Professionals are often expected to be reluctant with regard to bureaucratic controls because of assumed conflicting values and goals of the organization vis-à-vis the profession. We suggest, however, that the provision of bureaucratic benchmarking information is positively associated with professional performance...... for 191 orthopaedics departments of German hospitals matched with survey data on bureaucratic benchmarking information provision to the chief physician of the respective department. Professional performance is publicly disclosed due to regulatory requirements. At the same time, chief physicians typically
Shielding Integral Benchmark Archive and Database (SINBAD)
Energy Technology Data Exchange (ETDEWEB)
Kirk, Bernadette Lugue [ORNL; Grove, Robert E [ORNL; Kodeli, I. [International Atomic Energy Agency (IAEA); Sartori, Enrico [ORNL; Gulliford, J. [OECD Nuclear Energy Agency
2011-01-01
The Shielding Integral Benchmark Archive and Database (SINBAD) collection of benchmarks was initiated in the early 1990s. SINBAD is an international collaboration between the Organization for Economic Cooperation and Development's Nuclear Energy Agency Data Bank (OECD/NEADB) and the Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL). SINBAD is a major attempt to compile experiments and corresponding computational models with the goal of preserving institutional knowledge and expertise that need to be handed down to future scientists. SINBAD is also a learning tool for university students and scientists who need to design experiments or gain expertise in modeling and simulation. The SINBAD database is currently divided into three categories: fission, fusion, and accelerator benchmarks. Where possible, each experiment is described and analyzed using deterministic or probabilistic (Monte Carlo) radiation transport software.
A Benchmarking System for Domestic Water Use
Directory of Open Access Journals (Sweden)
Dexter V. L. Hunt
2014-05-01
Full Text Available The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. When set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the role of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting, RWH, and grey water, GW) and including "external" gardening demands is investigated. This includes the impacts (in isolation and combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are made throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn for the robustness of the proposed system.
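A band rating of the kind proposed maps per-capita consumption onto discrete bands, so any measure that reduces consumption, whether behavioural or technological, moves the household up the scale. A minimal sketch with hypothetical thresholds (not the paper's or the CSH's figures):

```python
# Hypothetical band thresholds in litres per person per day; illustrative
# only, not the paper's or the Code for Sustainable Homes' figures.
BANDS = [(80, "A"), (100, "B"), (120, "C"), (150, "D")]

def water_band(litres_per_day, occupants):
    """Assign a band rating from total household use and occupancy."""
    per_capita = litres_per_day / occupants
    for limit, band in BANDS:
        if per_capita <= limit:
            return band
    return "E"   # worse than the last threshold

print(water_band(380, 4))  # 95 l/person/day -> 'B'
```

Occupancy matters directly here: the same 380 l/day household drops a band or more as the number of occupants falls.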
Elimination of spurious lattice fermion solutions and noncompact lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Lee, T.D.
1997-09-22
It is well known that the Dirac equation on a discrete hyper-cubic lattice in D dimensions has 2^D degenerate solutions. The usual method of removing these spurious solutions encounters difficulties with chiral symmetry when the lattice spacing l ≠ 0, as exemplified by the persistent problem of the pion mass. On the other hand, we recall that in any crystal in nature, all the electrons do move in a lattice and satisfy the Dirac equation; yet there is not a single physical result that has ever been entangled with a spurious fermion solution. Therefore it should not be difficult to eliminate these unphysical elements. On a discrete lattice, particles hop from point to point, whereas in a real crystal the lattice structure is embedded in a continuum and electrons move continuously from lattice cell to lattice cell. In a discrete system, the lattice functions are defined only on individual points (or links, as in the case of gauge fields). However, in a crystal the electron state vector is represented by the Bloch wave functions, which are continuous functions of position, and herein lies one of the essential differences.
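The 2^D doubling is visible already in one dimension: the naive lattice dispersion vanishes both at the physical point k = 0 and at the Brillouin-zone edge. A quick numerical check (the standard textbook dispersion, not specific to this paper's proposal):

```python
import numpy as np

a = 1.0   # lattice spacing

def disp(k):
    """Naive 1D lattice Dirac dispersion E(k) = sin(k a) / a."""
    return np.sin(k * a) / a

# The physical massless mode sits at k = 0, but the dispersion also vanishes
# at the Brillouin-zone edge k = pi/a: a spurious doubler.  Each of the D
# axes contributes such an edge zero, giving 2^D species in D dimensions.
print(abs(disp(0.0)) < 1e-12, abs(disp(np.pi / a)) < 1e-12)   # True True
```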
Validation of VHTRC calculation benchmark of critical experiment using the MCB code
Directory of Open Access Journals (Sweden)
Stanisz Przemysław
2016-01-01
Full Text Available The calculation benchmark problem Very High Temperature Reactor Critical (VHTRC), a pin-in-block type core critical assembly, has been investigated with the Monte Carlo Burnup (MCB) code in order to validate the latest version of the Nuclear Data Library based on the ENDF format. The benchmark was executed on the basis of the VHTRC benchmark available from the International Handbook of Evaluated Reactor Physics Benchmark Experiments. This benchmark is useful for verifying the discrepancies in keff values between various libraries and experimental values, which allows improving the accuracy of neutron transport calculations and may help in designing high-performance commercial VHTRs. Almost all safety parameters depend on the accuracy of neutron transport calculation results, which in turn depend on the accuracy of the nuclear data libraries. Thus, evaluation of the libraries' applicability to VHTR modelling is one of the important subjects. We compared the numerical experiment results with experimental measurements using two versions of the available nuclear data (ENDF/B-VII.1 and JEFF-3.2) prepared for the required temperatures. Calculations have been performed with the MCB code, which allows a very precise representation of the complex VHTR geometry, including the double heterogeneity of a fuel element. In this paper, together with the impact of nuclear data, we also discuss the impact of different lattice modelling inside the fuel pins. The discrepancies in keff have been observed and show good agreement with each other and with the experimental data within the 1 σ range of the experimental uncertainty. Because some propagated discrepancies were observed, we proposed appropriate corrections to the experimental constants which can improve the reactivity coefficient dependency. The obtained results confirm the accuracy of the new Nuclear Data Libraries.
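The 1 σ agreement test reported above amounts to checking |k_calc − k_exp| ≤ σ for each library. A sketch with illustrative numbers (not the benchmark's actual keff values or uncertainty):

```python
# Check whether calculated keff values agree with experiment within one
# standard deviation of the experimental uncertainty; all numbers here are
# illustrative placeholders, not values from the VHTRC benchmark.
k_exp, sigma = 1.0000, 0.0020
k_calc = {"ENDF/B-VII.1": 1.0013, "JEFF-3.2": 0.9991}

agrees = {lib: abs(k - k_exp) <= sigma for lib, k in k_calc.items()}
print(agrees)  # {'ENDF/B-VII.1': True, 'JEFF-3.2': True}
```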
Benchmarking Danish Vocational Education and Training Programmes
DEFF Research Database (Denmark)
Bogetoft, Peter; Wittrup, Jesper
This study paper discusses methods whereby Danish vocational education and training colleges can be benchmarked, and presents results from a number of models. It is conceptually complicated to benchmark vocational colleges, as the various colleges in Denmark offer a wide range of course programmes....... This makes it difficult to compare the resources used, since some programmes by their nature require more classroom time and equipment than others. It is also far from straightforward to compare college effects with respect to grades, since the various programmes apply very different forms of assessment...
Confidential benchmarking based on multiparty computation
DEFF Research Database (Denmark)
Damgård, Ivan Bjerre; Damgård, Kasper Lyneborg; Nielsen, Kurt
We report on the design and implementation of a system that uses multiparty computation to enable banks to benchmark their customers' confidential performance data against a large representative set of confidential performance data from a consultancy house. The system ensures that both the banks......' and the consultancy house's data stays confidential, the banks as clients learn nothing but the computed benchmarking score. In the concrete business application, the developed prototype helps Danish banks to find the most efficient customers among a large and challenging group of agricultural customers with too much...
Benchmarking af kommunernes førtidspensionspraksis [Benchmarking of municipal disability pension practice]
DEFF Research Database (Denmark)
Gregersen, Ole
Every year, the National Social Appeals Board (Den Sociale Ankestyrelse) publishes statistics on decisions in disability pension cases. Alongside the annual statistics, results are published from a benchmarking model in which the number of awards in each municipality is compared with the expected number of awards had the municipality...... applied the same decision practice as the "average municipality", correcting for the social structure of the municipality. The benchmarking model used to date is documented in Ole Gregersen (1994): Kommunernes Pensionspraksis, Servicerapport, Socialforskningsinstituttet. This note documents a...
Benchmarking of Heavy Ion Transport Codes
Energy Technology Data Exchange (ETDEWEB)
Remec, Igor [ORNL; Ronningen, Reginald M. [Michigan State University, East Lansing; Heilbronn, Lawrence [University of Tennessee, Knoxville (UTK)
2011-01-01
Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in designing and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.
Toxicological benchmarks for wildlife: 1996 Revision
Energy Technology Data Exchange (ETDEWEB)
Sample, B.E.; Opresko, D.M.; Suter, G.W., II
1996-06-01
The purpose of this report is to present toxicological benchmarks for assessment of effects of certain chemicals on mammalian and avian wildlife species. Publication of this document meets a milestone for the Environmental Restoration (ER) Risk Assessment Program. This document provides the ER Program with toxicological benchmarks that may be used as comparative tools in screening assessments as well as lines of evidence to support or refute the presence of ecological effects in ecological risk assessments. The chemicals considered in this report are some that occur at US DOE waste sites, and the wildlife species evaluated herein were chosen because they represent a range of body sizes and diets.
Lattice Boltzmann Model for Compressible Fluid on a Square Lattice
Institute of Scientific and Technical Information of China (English)
SUN Cheng-Hai
2000-01-01
A two-level four-direction lattice Boltzmann model is formulated on a square lattice to simulate compressible flows with a high Mach number. The particle velocities are adaptive to the mean velocity and internal energy, so the mean flow can have a high Mach number. Due to the simple form of the equilibrium distribution, the 4th-order velocity tensors are not involved in the calculations. Unlike the standard lattice Boltzmann model, no special treatment is needed for the homogeneity of the 4th-order velocity tensors on square lattices. The Navier-Stokes equations were derived by the Chapman-Enskog method from the BGK Boltzmann equation. The model can be easily extended to three-dimensional cubic lattices. Two-dimensional shock-wave propagation was simulated.
Entangling gates in even Euclidean lattices such as Leech lattice
Planat, Michel
2010-01-01
We point out an organic relationship between real entangling n-qubit gates of quantum computation and the group of automorphisms of even Euclidean lattices of the corresponding dimension 2^n. The type of entanglement that is found in the gates/generators of Aut(Λ) depends on the lattice. In particular, we investigate Z^n lattices, the Barnes-Wall lattices D4, E8, Λ16 (associated to n = 2, 3 and 4 qubits), and the Leech lattices h24 and Λ24 (associated to a 3-qubit/qutrit system). Balanced tripartite entanglement is found to be a basic feature of Aut(Λ), a finding that bears out our recent work related to the Weyl group of E8 [1, 2].
Introduction to lattice gauge theory
Gupta, R.
The lattice formulation of Quantum Field Theory (QFT) can be exploited in many ways. We can derive the lattice Feynman rules and carry out weak coupling perturbation expansions; the lattice then serves as a manifestly gauge-invariant regularization scheme, albeit one that is more complicated than standard continuum schemes. Strong coupling expansions give us useful qualitative information, but unfortunately no hard numbers. The lattice theory is also amenable to numerical simulations, by which one calculates the long-distance properties of a strongly interacting theory from first principles. The observables are measured as a function of the bare coupling g and a gauge-invariant cut-off of approximately 1/a, where a is the lattice spacing. The continuum (physical) behavior is recovered in the limit a -> 0, at which point the lattice artifacts go to zero. This is the more powerful use of the lattice formulation, so in these lectures the author focuses on setting up the theory for the purpose of numerical simulations to get hard numbers. The numerical techniques used in lattice gauge theories have their roots in statistical mechanics, so it is important to develop an intuition for the interconnection between quantum mechanics and statistical mechanics.
Lewis, Randy
2014-01-01
Several collaborations have recently performed lattice calculations aimed specifically at dark matter, including work with SU(2), SU(3), SU(4) and SO(4) gauge theories to represent the dark sector. Highlights of these studies are presented here, after a reminder of how lattice calculations in QCD itself are helping with the hunt for dark matter.
Fast simulation of lattice systems
DEFF Research Database (Denmark)
Bohr, H.; Kaznelson, E.; Hansen, Frank;
1983-01-01
A new computer system with an entirely new processor design is described and demonstrated on a very small trial lattice. The new computer simulates systems of differential equations of the order of 10^4 times faster than present-day computers, and we describe how the machine can be applied to lattice...
Branes and integrable lattice models
Yagi, Junya
2016-01-01
This is a brief review of my work on the correspondence between four-dimensional $\\mathcal{N} = 1$ supersymmetric field theories realized by brane tilings and two-dimensional integrable lattice models. I explain how to construct integrable lattice models from extended operators in partially topological quantum field theories, and elucidate the correspondence as an application of this construction.
Charmed baryons on the lattice
Padmanath, M
2015-01-01
We discuss the significance of charm baryon spectroscopy in hadron physics and review the recent developments of the spectra of charmed baryons in lattice calculations. Special emphasis is given on the recent studies of highly excited charm baryon states. Recent precision lattice measurements of the low lying charm and bottom baryons are also reviewed.
Quantum phases in optical lattices
Dickerscheid, Dennis Brian Martin
2006-01-01
An important new development in the field of ultracold atomic gases is the study of the properties of these gases in a so-called optical lattice. An optical lattice is a periodic trapping potential for the atoms that is formed by the interference pattern of a few laser beams. A reason for the
Lattice Induced Transparency in Metasurfaces
Manjappa, Manukumara; Singh, Ranjan
2016-01-01
Lattice modes are intrinsic to periodic structures, and their occurrence can be easily tuned and controlled by changing the lattice constant of the structural array. Previous studies have revealed excitation of sharp absorption resonances due to lattice mode coupling with the plasmonic resonances. Here, we report the first experimental observation of a lattice induced transparency (LIT) by coupling the first order lattice mode (FOLM) to the structural resonance of a metamaterial resonator at terahertz frequencies. The observed sharp transparency is a result of the destructive interference between the bright mode and the FOLM mediated dark mode. As the FOLM is swept across the metamaterial resonance, the transparency band undergoes a large change in its bandwidth and resonance position. Besides controlling the transparency behaviour, LIT also shows a huge enhancement in the Q-factor and a record-high group delay of 28 ps, which could be pivotal in ultrasensitive sensing and slow-light device applications.
Lattice models of ionic systems
Kobelev, Vladimir; Kolomeisky, Anatoly B.; Fisher, Michael E.
2002-05-01
A theoretical analysis of Coulomb systems on lattices in general dimensions is presented. The thermodynamics is developed using Debye-Hückel theory with ion-pairing and dipole-ion solvation, specific calculations being performed for three-dimensional lattices. As for continuum electrolytes, low-density results for simple cubic (sc), body-centered cubic (bcc), and face-centered cubic (fcc) lattices indicate the existence of gas-liquid phase separation. The predicted critical densities have values comparable to those of continuum ionic systems, while the critical temperatures are 60%-70% higher. However, when the possibility of sublattice ordering as well as Debye screening is taken into account systematically, order-disorder transitions and a tricritical point are found on sc and bcc lattices, and gas-liquid coexistence is suppressed. Our results agree with recent Monte Carlo simulations of lattice electrolytes.
Lattice quantum chromodynamics practical essentials
Knechtli, Francesco; Peardon, Michael
2017-01-01
This book provides an overview of the techniques central to lattice quantum chromodynamics, including modern developments. The book has four chapters. The first chapter explains the formulation of quarks and gluons on a Euclidean lattice. The second chapter introduces Monte Carlo methods and details the numerical algorithms to simulate lattice gauge fields. Chapter three explains the mathematical and numerical techniques needed to study quark fields and the computation of quark propagators. The fourth chapter is devoted to the physical observables constructed from lattice fields and explains how to measure them in simulations. The book is aimed at enabling graduate students who are new to the field to carry out explicitly the first steps and prepare them for research in lattice QCD.
Algorithm and Architecture Independent Benchmarking with SEAK
Energy Technology Data Exchange (ETDEWEB)
Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.; Kang, Seung-Hwa; Kerbyson, Darren J.; Hoisie, Adolfy; Cross, Joseph
2016-05-23
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
A human benchmark for language recognition
Orr, R.; Leeuwen, D.A. van
2009-01-01
In this study, we explore a human benchmark in language recognition, for the purpose of comparing human performance to machine performance in the context of the NIST LRE 2007. Humans are categorised in terms of language proficiency, and performance is presented per proficiency. The main challenge in
Benchmarking Year Five Students' Reading Abilities
Lim, Chang Kuan; Eng, Lin Siew; Mohamed, Abdul Rashid
2014-01-01
Reading and understanding a written text is one of the most important skills in English learning. This study attempts to benchmark Year Five students' reading abilities in fifteen rural schools in a district in Malaysia. The objectives of this study are to develop a set of standardised written reading comprehension and a set of indicators to inform…
Benchmark Generation and Simulation at Extreme Scale
Energy Technology Data Exchange (ETDEWEB)
Lagadapati, Mahesh [North Carolina State University (NCSU), Raleigh; Mueller, Frank [North Carolina State University (NCSU), Raleigh; Engelmann, Christian [ORNL
2016-01-01
The path to extreme scale high-performance computing (HPC) poses several challenges related to power, performance, resilience, productivity, programmability, data movement, and data management. Investigating the performance of parallel applications at scale on future architectures and the performance impact of different architectural choices is an important component of HPC hardware/software co-design. Simulations using models of future HPC systems and communication traces from applications running on existing HPC systems can offer an insight into the performance of future architectures. This work targets technology developed for scalable application tracing of communication events. It focuses on extreme-scale simulation of HPC applications and their communication behavior via lightweight parallel discrete event simulation for performance estimation and evaluation. Instead of simply replaying a trace within a simulator, this work promotes the generation of a benchmark from traces. This benchmark is subsequently exposed to simulation using models to reflect the performance characteristics of future-generation HPC systems. This technique provides a number of benefits, such as eliminating the data intensive trace replay and enabling simulations at different scales. The presented work features novel software co-design aspects, combining the ScalaTrace tool to generate scalable trace files, the ScalaBenchGen tool to generate the benchmark, and the xSim tool to assess the benchmark characteristics within a simulator.
A Benchmark and Simulator for UAV Tracking
Mueller, Matthias
2016-09-16
In this paper, we propose a new aerial video dataset and benchmark for low altitude UAV target tracking, as well as a photorealistic UAV simulator that can be coupled with tracking methods. Our benchmark provides the first evaluation of many state-of-the-art and popular trackers on 123 new and fully annotated HD video sequences captured from a low-altitude aerial perspective. Among the compared trackers, we determine which ones are the most suitable for UAV tracking both in terms of tracking accuracy and run-time. The simulator can be used to evaluate tracking algorithms in real-time scenarios before they are deployed on a UAV "in the field", as well as to generate synthetic but photo-realistic tracking datasets with automatic ground truth annotations to easily extend existing real-world datasets. Both the benchmark and simulator are made publicly available to the vision community on our website to further research in the area of object tracking from UAVs (https://ivul.kaust.edu.sa/Pages/pub-benchmark-simulator-uav.aspx). © Springer International Publishing AG 2016.
Thermodynamic benchmark study using Biacore technology
Navratilova, I.; Papalia, G.A.; Rich, R.L.; Bedinger, D.; Brophy, S.; Condon, B.; Deng, T.; Emerick, A.W.; Guan, H.W.; Hayden, T.; Heutmekers, T.; Hoorelbeke, B.; McCroskey, M.C.; Murphy, M.M.; Nakagawa, T.; Parmeggiani, F.; Xiaochun, Q.; Rebe, S.; Nenad, T.; Tsang, T.; Waddell, M.B.; Zhang, F.F.; Leavitt, S.; Myszka, D.G.
2007-01-01
A total of 22 individuals participated in this benchmark study to characterize the thermodynamics of small-molecule inhibitor-enzyme interactions using Biacore instruments. Participants were provided with reagents (the enzyme carbonic anhydrase II, which was immobilized onto the sensor surface, and
Benchmarking European Gas Transmission System Operators
DEFF Research Database (Denmark)
Agrell, Per J.; Bogetoft, Peter; Trinkner, Urs
This is the final report for the pan-European efficiency benchmarking of gas transmission system operations commissioned by the Netherlands Authority for Consumers and Markets (ACM), Den Haag, on behalf of the Council of European Energy Regulators (CEER) under the supervision of the authors....
Alberta K-12 ESL Proficiency Benchmarks
Salmon, Kathy; Ettrich, Mike
2012-01-01
The Alberta K-12 ESL Proficiency Benchmarks are organized by division: kindergarten, grades 1-3, grades 4-6, grades 7-9, and grades 10-12. They are descriptors of language proficiency in listening, speaking, reading, and writing. The descriptors are arranged in a continuum of seven language competences across five proficiency levels. Several…
Seven Benchmarks for Information Technology Investment.
Smallen, David; Leach, Karen
2002-01-01
Offers benchmarks to help campuses evaluate their efforts in supplying information technology (IT) services. The first three help understand the IT budget, the next three provide insight into staffing levels and emphases, and the seventh relates to the pervasiveness of institutional infrastructure. (EV)
Benchmarking Peer Production Mechanisms, Processes & Practices
Fischer, Thomas; Kretschmer, Thomas
2008-01-01
This deliverable identifies key approaches for quality management in peer production by benchmarking peer production practices and processes in other areas. (Contains 29 footnotes, 13 figures and 2 tables.)[This report has been authored with contributions of: Kaisa Honkonen-Ratinen, Matti Auvinen, David Riley, Jose Pinzon, Thomas Fischer, Thomas…
Operational benchmarking of Japanese and Danish hospitals
DEFF Research Database (Denmark)
Traberg, Andreas; Itoh, Kenji; Jacobsen, Peter
2010-01-01
This benchmarking model is designed as an integration of three organizational dimensions suited to the healthcare sector. The model incorporates posterior operational indicators and evaluates performance upon aggregation. The model is tested on seven cases from Japan and Denmark. Japanese...
Simple benchmark for complex dose finding studies.
Cheung, Ying Kuen
2014-06-01
While a general goal of early phase clinical studies is to identify an acceptable dose for further investigation, modern dose finding studies and designs are highly specific to individual clinical settings. In addition, as outcome-adaptive dose finding methods often involve complex algorithms, it is crucial to have diagnostic tools to evaluate the plausibility of a method's simulated performance and the adequacy of the algorithm. In this article, we propose a simple technique that provides an upper limit, or a benchmark, of accuracy for dose finding methods for a given design objective. The proposed benchmark is nonparametric optimal in the sense of O'Quigley et al. (2002, Biostatistics 3, 51-56), and is demonstrated by examples to be a practical accuracy upper bound for model-based dose finding methods. We illustrate the implementation of the technique in the context of phase I trials that consider multiple toxicities and phase I/II trials where dosing decisions are based on both toxicity and efficacy, and apply the benchmark to several clinical examples considered in the literature. By comparing the operating characteristics of a dose finding method to that of the benchmark, we can form quick initial assessments of whether the method is adequately calibrated and evaluate its sensitivity to the dose-outcome relationships.
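The benchmark idea described above is straightforward to simulate. The following is an illustrative sketch (not taken from the paper) of the complete-information, nonparametric benchmark in the single-toxicity phase I setting: each simulated patient receives a latent uniform tolerance, toxicity outcomes at every dose are generated from it under a monotone dose-toxicity curve, and the benchmark picks the dose whose complete-information toxicity estimate is closest to the target. The dose labels and probabilities below are invented for illustration.

```python
import numpy as np

def nonparametric_benchmark(true_tox, target, n_patients, n_sims, rng):
    """Complete-information benchmark in the spirit of O'Quigley et al. (2002).

    Each simulated patient i gets a latent uniform tolerance u_i; under a
    monotone dose-toxicity curve, patient i would experience toxicity at
    dose d iff u_i < true_tox[d].  With these complete toxicity profiles,
    the benchmark estimates each dose's toxicity rate and selects the dose
    closest to the target -- an accuracy upper bound for actual designs.
    Returns the fraction of simulated trials selecting the true MTD.
    """
    true_tox = np.asarray(true_tox)
    true_mtd = int(np.argmin(np.abs(true_tox - target)))
    correct = 0
    for _ in range(n_sims):
        u = rng.random(n_patients)                    # latent tolerances
        p_hat = (u[:, None] < true_tox).mean(axis=0)  # complete-information estimates
        if int(np.argmin(np.abs(p_hat - target))) == true_mtd:
            correct += 1
    return correct / n_sims

rng = np.random.default_rng(0)
acc = nonparametric_benchmark([0.05, 0.12, 0.25, 0.40, 0.55],
                              target=0.25, n_patients=30, n_sims=2000, rng=rng)
print(f"benchmark accuracy with 30 patients: {acc:.2f}")
```

Comparing a design's selection percentage against this number gives the quick calibration check the abstract describes: a method far below the benchmark is likely poorly calibrated, while one close to it has little room left to improve.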
Benchmarking 2010: Trends in Education Philanthropy
Bearman, Jessica
2010-01-01
"Benchmarking 2010" offers insights into the current priorities, practices and concerns of education grantmakers. The report is divided into five sections: (1) Mapping the Education Grantmaking Landscape; (2) 2010 Funding Priorities; (3) Strategies for Leveraging Greater Impact; (4) Identifying Significant Trends in Education Funding; and (5)…
Benchmark Experiment for Beryllium Slab Samples
Institute of Scientific and Technical Information of China (English)
NIE Yang-bo; BAO Jie; HAN Rui; RUAN Xi-chao; REN Jie; HUANG Han-xiong; ZHOU Zu-ying
2015-01-01
In order to validate the evaluated nuclear data on beryllium, a benchmark experiment has been performed at the China Institute of Atomic Energy (CIAE). Neutron leakage spectra from pure beryllium slab samples (10 cm × 10 cm × 11 cm) were measured at 61° and 121° using time-of-
Benchmarking 2011: Trends in Education Philanthropy
Grantmakers for Education, 2011
2011-01-01
The analysis in "Benchmarking 2011" is based on data from an unduplicated sample of 184 education grantmaking organizations--approximately two-thirds of Grantmakers for Education's (GFE's) network of grantmakers--who responded to an online survey consisting of fixed-choice and open-ended questions. Because a different subset of funders elects to…
Cleanroom Energy Efficiency: Metrics and Benchmarks
Energy Technology Data Exchange (ETDEWEB)
International SEMATECH Manufacturing Initiative; Mathew, Paul A.; Tschudi, William; Sartor, Dale; Beasley, James
2010-07-07
Cleanrooms are among the most energy-intensive types of facilities. This is primarily due to the cleanliness requirements that result in high airflow rates and system static pressures, as well as process requirements that result in high cooling loads. Various studies have shown that there is a wide range of cleanroom energy efficiencies and that facility managers may not be aware of how energy efficient their cleanroom facility can be relative to other cleanroom facilities with the same cleanliness requirements. Metrics and benchmarks are an effective way to compare one facility to another and to track the performance of a given facility over time. This article presents the key metrics and benchmarks that facility managers can use to assess, track, and manage their cleanroom energy efficiency or to set energy efficiency targets for new construction. These include system-level metrics such as air change rates, air handling W/cfm, and filter pressure drops. Operational data are presented from over 20 different cleanrooms that were benchmarked with these metrics and that are part of the cleanroom benchmark dataset maintained by Lawrence Berkeley National Laboratory (LBNL). Overall production efficiency metrics for cleanrooms in 28 semiconductor manufacturing facilities in the United States and recorded in the Fabs21 database are also presented.
Issues in Benchmarking and Assessing Institutional Engagement
Furco, Andrew; Miller, William
2009-01-01
The process of assessing and benchmarking community engagement can take many forms. To date, more than two dozen assessment tools for measuring community engagement institutionalization have been published. These tools vary substantially in purpose, level of complexity, scope, process, structure, and focus. While some instruments are designed to…
Bog, Anja
2014-01-01
This book introduces a new benchmark for hybrid database systems, gauging the effect of adding OLAP to an OLTP workload and analyzing the impact of commonly used optimizations in historically separate OLTP and OLAP domains in mixed-workload scenarios.
Energy Technology Data Exchange (ETDEWEB)
NONE
2006-07-01
The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed with the aim of facilitating schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to create increased attention to the electricity consumption of each separate school by publishing benchmarks which take the schools' age and number of pupils as well as after school activities into account. Benchmarks can be used to make green accounts and work as markers in e.g. energy conservation campaigns, energy management and for educational purposes. The internet tool can be found on www.energiguiden.dk. (BA)
The ACRV Picking Benchmark (APB): A Robotic Shelf Picking Benchmark to Foster Reproducible Research
Leitner, Jürgen; Tow, Adam W.; Dean, Jake E.; Suenderhauf, Niko; Durham, Joseph W.; Cooper, Matthew; Eich, Markus; Lehnert, Christopher; Mangels, Ruben; McCool, Christopher; Kujala, Peter; Nicholson, Lachlan; Van Pham, Trung; Sergeant, James; Wu, Liao
2016-01-01
Robotic challenges like the Amazon Picking Challenge (APC) or the DARPA Challenges are an established and important way to drive scientific progress. They make research comparable on a well-defined benchmark with equal test conditions for all participants. However, such challenge events occur only occasionally, are limited to a small number of contestants, and the test conditions are very difficult to replicate after the main event. We present a new physical benchmark challenge for robotic pi...
Watson, Martin; Dick, Robert; Huang, Y. Helen; Lockley, Andrew; Cardoso, Rui; Santos, Abel
2016-08-01
This Benchmark is designed to predict the fracture of a food can after drawing, reverse redrawing and expansion. The aim is to assess different sheet metal forming difficulties such as plastic anisotropic earing and failure models (strain and stress based Forming Limit Diagrams) under complex nonlinear strain paths. To study these effects, two distinct materials, TH330 steel (unstoved) and AA5352 aluminum alloy are considered in this Benchmark. Problem description, material properties, and simulation reports with experimental data are summarized.
Revaluering benchmarking - A topical theme for the construction industry
DEFF Research Database (Denmark)
Rasmussen, Grane Mikael Gregaard
2011-01-01
Over the past decade, benchmarking has increasingly gained foothold in the construction industry. The predominant research, perceptions and uses of benchmarking are valued so strongly and uniformly that what may seem valuable is actually abstaining researchers and practitioners from studying and questioning the concept objectively. This paper addresses the underlying nature of benchmarking, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in organizations. To understand these sociological impacts, benchmarking research needs to transcend this perspective and develop more thorough knowledge about benchmarking, challenging the current dominating rationales. Hereby, it is argued that benchmarking is not a neutral practice. On the contrary, it is highly influenced by organizational ambitions and strategies, with the potential to transform organizational relations, behaviors and actions. In closing, it is briefly considered how to study the calculative practices of benchmarking.
Effects of Exposure Imprecision on Estimation of the Benchmark Dose
DEFF Research Database (Denmark)
Budtz-Jørgensen, Esben; Keiding, Niels; Grandjean, Philippe
Environmental epidemiology; exposure measurement error; effect of prenatal mercury exposure; exposure standards; benchmark dose.
Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators
Directory of Open Access Journals (Sweden)
Zaharchenko Lolita A.
2013-12-01
The article analyses the evolution and possibilities of application of benchmarking in the telecommunication sphere. It studies the essence of benchmarking by generalising the approaches of different scientists to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology, the main factors that determine the operator's success in the modern market economy, and the mechanism and component stages of carrying out benchmarking by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies of change in the composition of telecommunication operators and providers. Having generalised the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level of conduct (branch, inter-branch and international benchmarking); by relation to participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).
Irreversible stochastic processes on lattices
Energy Technology Data Exchange (ETDEWEB)
Nord, R.S.
1986-01-01
Models for irreversible random or cooperative filling of lattices are required to describe many processes in chemistry and physics. Since the filling is assumed to be irreversible, even the stationary, saturation state is not in equilibrium. The kinetics and statistics of these processes are described by recasting the master equations in infinite hierarchical form. Solutions can be obtained by implementing various techniques; refinements in these solution techniques are presented. Problems considered include random dimer, trimer, and tetramer filling of 2D lattices, random dimer filling of a cubic lattice, competitive filling by two or more species, and the effect of a random distribution of inactive sites on the filling. Also considered is monomer filling of a linear lattice with nearest-neighbor cooperative effects, for which the exact cluster-size distribution is solved for cluster sizes up to the asymptotic regime. Additionally, a technique is developed to directly determine the asymptotic properties of the cluster-size distribution. Finally, cluster growth is considered via irreversible aggregation involving random walkers. In particular, explicit results are provided for the large-lattice-size asymptotic behavior of trapping probabilities and average walk lengths for a single walker on a lattice with multiple traps. Procedures for exact calculation of these quantities on finite lattices are also developed.
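Irreversible filling processes of the kind catalogued above are also easy to probe by direct simulation. The sketch below (illustrative, not from the thesis) simulates random sequential adsorption of dimers on a one-dimensional ring; because the filling is irreversible, attempting each deposition position exactly once in a uniformly random order reproduces the jammed saturation state, whose exact coverage 1 - e^{-2} ≈ 0.8647 is Flory's classical result.

```python
import math
import random

def jam_dimers(n, rng):
    """Random sequential adsorption of dimers on a ring of n sites.

    Each left end i arrives exactly once, in uniformly random order; a
    dimer is deposited on (i, i+1) iff both sites are still empty.  Since
    filling is irreversible, a blocked pair stays blocked forever, so one
    pass over a random permutation yields the jammed (saturation) state.
    Returns the final coverage, i.e. the fraction of occupied sites.
    """
    occupied = [False] * n
    order = list(range(n))
    rng.shuffle(order)
    for i in order:
        j = (i + 1) % n
        if not occupied[i] and not occupied[j]:
            occupied[i] = occupied[j] = True
    return sum(occupied) / n

rho = jam_dimers(100_000, random.Random(0))
print(f"jamming coverage: {rho:.4f}  (exact 1 - e^-2 = {1 - math.exp(-2):.4f})")
```

The same loop generalizes to trimers or to 2D lattices by enlarging the footprint checked and marked at each attempt, though exact jamming coverages are then no longer known in closed form.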
Lattice topology dictates photon statistics
Kondakci, H Esat; Saleh, Bahaa E A
2016-01-01
Propagation of coherent light through a disordered network is accompanied by randomization and possible conversion into thermal light. Here, we show that network topology plays a decisive role in determining the statistics of the emerging field if the underlying lattice satisfies chiral symmetry. By examining one-dimensional arrays of randomly coupled waveguides arranged on linear and ring topologies, we are led to a remarkable prediction: the field circularity and the photon statistics in ring lattices are dictated by the lattice parity -- whether the number of sites is even or odd -- while the same quantities are insensitive to the parity of a linear lattice. Adding or subtracting a single lattice site can switch the photon statistics from super-thermal to sub-thermal, or vice versa. This behavior is understood by examining the real and imaginary fields on a chiral-symmetric lattice, which form two strands that interleave along the lattice sites. These strands can be fully braided around an even-sited ring lattice th...
Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators
Zaharchenko Lolita A.; Kolesnyk Oksana A.
2013-01-01
The article analyses evolution of development and possibilities of application of benchmarking in the telecommunication sphere. It studies essence of benchmarking on the basis of generalisation of approaches of different scientists to definition of this notion. In order to improve activity of telecommunication operators, the article identifies the benchmarking technology and main factors, that determine success of the operator in the modern market economy, and the mechanism of benchmarking an...
Regression Benchmarking: An Approach to Quality Assurance in Performance
2005-01-01
The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specifically, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and the methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking to a real software project and conclude with a glimpse at the challenges for the fu...
Benchmarking of corporate social responsibility: Methodological problems and robustness
2004-01-01
This paper investigates the possibilities and problems of benchmarking Corporate Social Responsibility (CSR). After a methodological analysis of the advantages and problems of benchmarking, we develop a benchmark method that includes economic, social and environmental aspects as well as national and international aspects of CSR. The overall benchmark is based on a weighted average of these aspects. The weights are based on the opinions of companies and NGOs. Using different me...
Lattice Boltzmann model for nanofluids
Energy Technology Data Exchange (ETDEWEB)
Xuan Yimin; Yao Zhengping [Nanjing University of Science and Technology, School of Power Engineering, Nanjing (China)
2005-01-01
A nanofluid is a particle suspension that consists of base liquids and nanoparticles and has great potential for heat transfer enhancement. By accounting for the external and internal forces acting on the suspended nanoparticles and interactions among the nanoparticles and fluid particles, a lattice Boltzmann model is proposed for simulating flow and energy transport processes inside the nanofluids. First, we briefly introduce the conventional lattice Boltzmann model for multicomponent systems. Then, we discuss the irregular motion of the nanoparticles and inherent dynamic behavior of nanofluids and describe a lattice Boltzmann model for simulating nanofluids. Finally, we conduct some calculations for the distribution of the suspended nanoparticles. (orig.)
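The collide-and-stream structure this abstract builds on can be illustrated with a plain single-component D2Q9 BGK step (a minimal sketch only; the paper's multicomponent nanofluid model adds forcing terms and particle-fluid interactions that are omitted here, and the relaxation time `tau` and grid size are illustrative choices):

```python
import numpy as np

# D2Q9 lattice velocities and weights (standard single-component LBM,
# not the paper's nanofluid model -- a minimal sketch only)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Second-order equilibrium distribution f_eq for D2Q9."""
    cu = np.einsum('qd,xyd->qxy', c, u)
    usq = np.einsum('xyd,xyd->xy', u, u)
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_step(f, tau):
    """One BGK collision plus periodic streaming step; returns new f."""
    rho = f.sum(axis=0)
    u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]
    f = f + (equilibrium(rho, u) - f) / tau          # collide
    for q in range(9):                               # stream along c[q]
        f[q] = np.roll(f[q], shift=tuple(c[q]), axis=(0, 1))
    return f

nx = ny = 16
rho0 = np.ones((nx, ny)) + 0.01 * np.random.rand(nx, ny)
f = equilibrium(rho0, np.zeros((nx, ny, 2)))
f = bgk_step(f, tau=0.8)
print(np.isclose(f.sum(), rho0.sum()))  # total mass is conserved
```

Mass conservation under both collision and periodic streaming is a quick sanity check for any LBM implementation of this kind.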
Localized structures in Kagome lattices
Energy Technology Data Exchange (ETDEWEB)
Saxena, Avadh B [Los Alamos National Laboratory; Bishop, Alan R [Los Alamos National Laboratory; Law, K J H [UNIV OF MASSACHUSETTS; Kevrekidis, P G [UNIV OF MASSACHUSETTS
2009-01-01
We investigate the existence and stability of gap vortices and multi-pole gap solitons in a Kagome lattice with a defocusing nonlinearity both in a discrete case and in a continuum one with periodic external modulation. In particular, predictions are made based on expansion around a simple and analytically tractable anti-continuum (zero coupling) limit. These predictions are then confirmed for a continuum model of an optically-induced Kagome lattice in a photorefractive crystal obtained by a continuous transformation of a honeycomb lattice.
Borwein, J M; McPhedran, R C
2013-01-01
The study of lattice sums began when early investigators wanted to go from mechanical properties of crystals to the properties of the atoms and ions from which they were built (the literature of Madelung's constant). A parallel literature was built around the optical properties of regular lattices of atoms (initiated by Lord Rayleigh, Lorentz and Lorenz). For over a century many famous scientists and mathematicians have delved into the properties of lattices, sometimes unwittingly duplicating the work of their predecessors. Here, at last, is a comprehensive overview of the substantial body of
Directory of Open Access Journals (Sweden)
Brian Jefferies
2014-01-01
Full Text Available A bounded linear operator T on a Hilbert space ℋ is trace class if its singular values are summable. The trace class operators on ℋ form an operator ideal and in the case that ℋ is finite-dimensional, the trace tr(T) of T is given by ∑j ajj for any matrix representation {aij} of T. In applications of trace class operators to scattering theory and representation theory, the subject is complicated by the fact that if k is an integral kernel of the operator T on the Hilbert space L2(μ) with μ a σ-finite measure, then k(x,x) may not be defined, because the diagonal {(x,x)} may be a set of (μ⊗μ)-measure zero. The present note describes a class of linear operators acting on a Banach function space X which forms a lattice ideal of operators on X, rather than an operator ideal, but coincides with the collection of hermitian positive trace class operators in the case of X=L2(μ).
Benchmarking a signpost to excellence in quality and productivity
Karlof, Bengt
1993-01-01
According to the authors, benchmarking exerts a powerful leverage effect on an organization, and they consider some of the factors which justify their claim. Describes how to implement benchmarking and exactly what to benchmark. Explains benchlearning, which integrates education, leadership development and organizational dynamics with the actual work being done, and how to make it work more efficiently in terms of quality and productivity.
Taking Stock of Corporate Benchmarking Practices: Panacea or Pandora's Box?
Fleisher, Craig S.; Burton, Sara
1995-01-01
Discusses why corporate communications/public relations (cc/pr) should be benchmarked (an approach used by cc/pr managers to demonstrate the value of their activities to skeptical organizational executives). Discusses myths about cc/pr benchmarking; types, targets, and focus of cc/pr benchmarking; a process model; and critical decisions about…
47 CFR 69.108 - Transport rate benchmark.
2010-10-01
... with this subpart, the DS3-to-DS1 benchmark ratio shall be calculated as follows: the telephone company... benchmark ratio of 9.6 to 1 or higher. (c) If a telephone company's initial transport rates are based on... 47 Telecommunication 3 2010-10-01 2010-10-01 false Transport rate benchmark. 69.108 Section...
Discovering and Implementing Best Practices to Strengthen SEAs: Collaborative Benchmarking
Building State Capacity and Productivity Center, 2013
2013-01-01
This paper is written for state educational agency (SEA) leaders who are considering the benefits of collaborative benchmarking, and it addresses the following questions: (1) What does benchmarking of best practices entail?; (2) How does "collaborative benchmarking" enhance the process?; (3) How do SEAs control the process so that "their" needs…
29 CFR 1952.323 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.323 Section 1952.323... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.343 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.343 Section 1952.343... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, Compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.213 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.213 Section 1952.213... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.373 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.373 Section 1952.373... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.163 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.163 Section 1952.163... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.203 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.203 Section 1952.203... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.293 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.293 Section 1952.293... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.223 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.223 Section 1952.223... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.233 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.233 Section 1952.233... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.113 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.113 Section 1952.113... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
29 CFR 1952.93 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.93 Section 1952.93....93 Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were...
29 CFR 1952.353 - Compliance staffing benchmarks.
2010-07-01
... 29 Labor 9 2010-07-01 2010-07-01 false Compliance staffing benchmarks. 1952.353 Section 1952.353... Compliance staffing benchmarks. Under the terms of the 1978 Court Order in AFL-CIO v. Marshall, compliance staffing levels (benchmarks) necessary for a “fully effective” enforcement program were required to...
Characterization of addressability by simultaneous randomized benchmarking
Gambetta, Jay M; Merkel, S T; Johnson, B R; Smolin, John A; Chow, Jerry M; Ryan, Colm A; Rigetti, Chad; Poletto, S; Ohki, Thomas A; Ketchen, Mark B; Steffen, M
2012-01-01
The control and handling of errors arising from cross-talk and unwanted interactions in multi-qubit systems is an important issue in quantum information processing architectures. We introduce a benchmarking protocol that provides information about the amount of addressability present in the system and implement it on coupled superconducting qubits. The protocol consists of performing randomized benchmarking on each qubit individually and then simultaneously, and the amount of addressability is related to the difference of the average gate fidelities of those experiments. We present the results on two similar samples with different amounts of cross-talk and unwanted interactions, which agree with predictions based on simple models for the amount of residual coupling.
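The fidelity-decay fitting that underlies randomized benchmarking can be sketched as follows. The standard model is F(m) = A·p^m + B for sequence length m; addressability is then assessed by comparing the decay constant p extracted from individual and simultaneous runs. The numbers below are made up for illustration, and A and B are assumed known to keep the fit log-linear (real analyses fit all three parameters to noisy data):

```python
import numpy as np

# Illustrative only: recover the RB decay constant p from the standard
# model F(m) = A * p**m + B (values below are synthetic, not from the paper).
def fit_decay(lengths, F, A=0.5, B=0.5):
    """Estimate p by a log-linear fit, assuming A and B are known."""
    y = np.log((F - B) / A)
    slope = np.polyfit(lengths, y, 1)[0]
    return np.exp(slope)

lengths = np.arange(1, 51)
p_true = 0.98
F = 0.5 * p_true**lengths + 0.5      # noiseless survival probabilities

p_est = fit_decay(lengths, F)
r = (2 - 1) / 2 * (1 - p_est)        # average error per gate, single qubit (d = 2)
print(round(p_est, 4), round(r, 4))
```

Running the same fit on individual and simultaneous sequences and differencing the resulting gate fidelities gives the addressability measure the abstract describes.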
Specification for the VERA Depletion Benchmark Suite
Energy Technology Data Exchange (ETDEWEB)
Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2015-12-17
CASL-X-2015-1014-000, Consortium for Advanced Simulation of LWRs. The CASL neutronics simulator MPACT is under development for the neutronics and T-H coupled simulation of the pressurized water reactor. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. Validating the depletion capability is a challenge because of insufficient measured data; one alternative is to perform a code-to-code comparison for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.
HPL and STREAM Benchmarks on SANAM Supercomputer
Bin Sulaiman, Riman A.
2017-03-13
SANAM supercomputer was jointly built by KACST and FIAS in 2012, ranking second that year in the Green500 list with a power efficiency of 2.3 GFLOPS/W (Rohr et al., 2014). It is a heterogeneous accelerator-based HPC system that has 300 compute nodes. Each node includes two Intel Xeon E5-2650 CPUs, two AMD FirePro S10000 dual GPUs and 128 GiB of main memory. In this work, the seven benchmarks of HPCC were installed and configured to reassess the performance of SANAM, as part of an unpublished master thesis, after it was reassembled in the Kingdom of Saudi Arabia. We present here detailed results of the HPL and STREAM benchmarks.
The PROOF benchmark suite measuring PROOF performance
Ryu, S.; Ganis, G.
2012-06-01
The PROOF benchmark suite is a new utility suite of PROOF to measure performance and scalability. The primary goal of the benchmark suite is to determine optimal configuration parameters for a set of machines to be used as a PROOF cluster. The suite measures the performance of the cluster for a set of standard tasks as a function of the number of effective processes. Cluster administrators can use the suite to measure the performance of the cluster and find optimal configuration parameters. PROOF developers can also utilize the suite to measure performance, identify problems and improve their software. In this paper, the new tool is explained in detail and use cases are presented to illustrate the new tool.
Measuring NUMA effects with the STREAM benchmark
Bergstrom, Lars
2011-01-01
Modern high-end machines feature multiple processor packages, each of which contains multiple independent cores and integrated memory controllers connected directly to dedicated physical RAM. These packages are connected via a shared bus, creating a system with a heterogeneous memory hierarchy. Since this shared bus has less bandwidth than the sum of the links to memory, aggregate memory bandwidth is higher when parallel threads all access memory local to their processor package than when they access memory attached to a remote package. But, the impact of this heterogeneous memory architecture is not easily understood from vendor benchmarks. Even where these measurements are available, they provide only best-case memory throughput. This work presents a series of modifications to the well-known STREAM benchmark to measure the effects of NUMA on both a 48-core AMD Opteron machine and a 32-core Intel Xeon machine.
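The triad kernel that STREAM times can be sketched in a few lines (a rough probe only: the real STREAM benchmark is compiled C/Fortran with OpenMP thread pinning, which is exactly what exposes the NUMA effects studied here; numpy temporaries and interpreter overhead make the absolute numbers incomparable to vendor figures):

```python
import numpy as np
import time

# Rough STREAM-triad-style bandwidth probe (sketch only; see lead-in caveats).
def triad_bandwidth(n, scalar=3.0, reps=5):
    a = np.zeros(n)
    b = np.random.rand(n)
    c = np.random.rand(n)
    best = float("inf")
    for _ in range(reps):                  # keep the best (fastest) repetition
        t0 = time.perf_counter()
        a[:] = b + scalar * c              # the triad kernel: a = b + s*c
        best = min(best, time.perf_counter() - t0)
    # STREAM counts three 8-byte arrays moved per element: read b, read c, write a
    return 3 * 8 * n / best / 1e9          # GB/s

bw = triad_bandwidth(5_000_000)
print(f"triad ~ {bw:.1f} GB/s")
```

Pinning the process to cores on one package while placing the arrays in local versus remote memory (e.g. with `numactl`) is what turns this probe into the NUMA measurement the paper performs.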
Non-judgemental Dynamic Fuel Cycle Benchmarking
Scopatz, Anthony Michael
2015-01-01
This paper presents a new fuel cycle benchmarking analysis methodology by coupling Gaussian process regression, a popular technique in Machine Learning, to dynamic time warping, a mechanism widely used in speech recognition. Together they generate figures-of-merit that are applicable to any time series metric that a benchmark may study. The figures-of-merit account for uncertainty in the metric itself, utilize information across the whole time domain, and do not require that the simulators use a common time grid. Here, a distance measure is defined that can be used to compare the performance of each simulator for a given metric. Additionally, a contribution measure is derived from the distance measure that can be used to rank order the importance of fuel cycle metrics. Lastly, this paper warns against using standard signal processing techniques for error reduction. This is because it is found that error reduction is better handled by the Gaussian process regression itself.
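The dynamic time warping half of the methodology can be sketched directly; the classic DTW recurrence aligns two series by a minimum-cost monotone path (the paper's figures-of-merit additionally fold in Gaussian-process uncertainty, which is omitted in this sketch):

```python
import numpy as np

# Classic O(n*m) dynamic time warping with absolute-difference cost --
# the alignment mechanism the paper couples with GP regression.
def dtw_distance(x, y):
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # extend the cheapest of the three admissible predecessors
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 1, 50)
print(dtw_distance(np.sin(6 * t), np.sin(6 * t)))   # identical series: 0.0
```

Because DTW only needs a pairwise alignment, the two simulators being compared do not have to share a common time grid, which is the property the paper exploits.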
Argonne Code Center: benchmark problem book
Energy Technology Data Exchange (ETDEWEB)
1977-06-01
This report is a supplement to the original report, published in 1968, as revised. The Benchmark Problem Book is intended to serve as a source book of solutions to mathematically well-defined problems for which either analytical or very accurate approximate solutions are known. This supplement contains problems in eight new areas: two-dimensional (R-z) reactor model; multidimensional (Hex-z) HTGR model; PWR thermal hydraulics--flow between two channels with different heat fluxes; multidimensional (x-y-z) LWR model; neutron transport in a cylindrical "black" rod; neutron transport in a BWR rod bundle; multidimensional (x-y-z) BWR model; and neutronic depletion benchmark problems. This supplement contains only the additional pages and those requiring modification. (RWR)
Assessing and benchmarking multiphoton microscopes for biologists.
Corbin, Kaitlin; Pinkard, Henry; Peck, Sebastian; Beemiller, Peter; Krummel, Matthew F
2014-01-01
Multiphoton microscopy has become a staple tool for tracking cells within tissues and organs due to its superior depth of penetration, low excitation volumes, and reduced phototoxicity. Many factors, ranging from laser pulse width to relay optics to detectors and electronics, contribute to the overall ability of these microscopes to excite and detect fluorescence deep within tissues. However, we have found that there are few standard ways described in the literature to distinguish between microscopes or to benchmark existing microscopes to measure the overall quality and efficiency of these instruments. Here, we discuss some simple parameters and methods that can be used either within a multiphoton facility or by a prospective purchaser to benchmark performance. This can both assist in identifying decay in microscope performance and in choosing features of a scope that are suited to experimental needs.
ASBench: benchmarking sets for allosteric discovery.
Huang, Wenkang; Wang, Guanqiao; Shen, Qiancheng; Liu, Xinyi; Lu, Shaoyong; Geng, Lv; Huang, Zhimin; Zhang, Jian
2015-08-01
Allostery allows for the fine-tuning of protein function. Targeting allosteric sites is gaining increasing recognition as a novel strategy in drug design. The key challenge in the discovery of allosteric sites has strongly motivated the development of computational methods and thus high-quality, publicly accessible standard data have become indispensable. Here, we report benchmarking data for experimentally determined allosteric sites through a complex process, including a 'Core set' with 235 unique allosteric sites and a 'Core-Diversity set' with 147 structurally diverse allosteric sites. These benchmarking sets can be exploited to develop efficient computational methods to predict unknown allosteric sites in proteins and reveal unique allosteric ligand-protein interactions to guide allosteric drug design.
Active vibration control of nonlinear benchmark buildings
Institute of Scientific and Technical Information of China (English)
ZHOU Xing-de; CHEN Dao-zheng
2007-01-01
Existing nonlinear model reduction methods are unsuitable for nonlinear benchmark buildings because their vibration equations belong to a non-affine system. Meanwhile, controllers designed directly by nonlinear control strategies have a high order and are difficult to apply in practice. Therefore, a new active vibration control method suited to nonlinear buildings is proposed. The idea of the proposed method is based on model identification and structural model linearization, exerting the control force on the built model according to the force action principle. The proposed method is more practicable because the built model can be reduced by the balanced reduction method based on the empirical Gramian matrix. A three-story benchmark structure is presented, and the simulation results illustrate that the proposed method is viable for civil engineering structures.
Direct data access protocols benchmarking on DPM
Furano, Fabrizio; Keeble, Oliver; Mancinelli, Valentina
2015-01-01
The Disk Pool Manager is an example of a multi-protocol, multi-VO system for data access on the Grid that went through a considerable technical evolution in recent years. Among other features, its architecture offers the opportunity of testing its different data access frontends under exactly the same conditions, including hardware and backend software. This characteristic inspired the idea of collecting monitoring information from various testbeds in order to benchmark the behaviour of the HTTP and Xrootd protocols for the use case of data analysis, batch or interactive. A source of information is the set of continuous tests that are run towards the worldwide endpoints belonging to the DPM Collaboration, which accumulated relevant statistics in its first year of activity. On top of that, the DPM releases are based on multiple levels of automated testing that include performance benchmarks of various kinds, executed regularly every day. At the same time, the recent releases of DPM can report monitoring infor...
Physics benchmarks of the VELO upgrade
Eklund, Lars
2017-01-01
The LHCb Experiment at the LHC is successfully performing precision measurements primarily in the area of flavour physics. The collaboration is preparing an upgrade that will start taking data in 2021 with a trigger-less readout at five times the current luminosity. The vertex locator has been crucial in the success of the experiment and will continue to be so for the upgrade. It will be replaced by a hybrid pixel detector and this paper discusses the performance benchmarks of the upgraded detector. Despite the challenging experimental environment, the vertex locator will maintain or improve upon its benchmark figures compared to the current detector. Finally the long term plans for LHCb, beyond those of the upgrade currently in preparation, are discussed.
Experiences in Benchmarking of Autonomic Systems
Etchevers, Xavier; Coupaye, Thierry; Vachet, Guy
Autonomic computing promises improvements of systems quality of service in terms of availability, reliability, performance, security, etc. However, little research and experimental results have so far demonstrated this assertion, nor provided proof of the return on investment stemming from the efforts that introducing autonomic features requires. Existing works in the area of benchmarking of autonomic systems can be characterized by their qualitative and fragmented approaches. Still a crucial need is to provide generic (i.e. independent from business, technology, architecture and implementation choices) autonomic computing benchmarking tools for evaluating and/or comparing autonomic systems from a technical and, ultimately, an economical point of view. This article introduces a methodology and a process for defining and evaluating factors, criteria and metrics in order to qualitatively and quantitatively assess autonomic features in computing systems. It also discusses associated experimental results on three different autonomic systems.
Toxicological benchmarks for wildlife. Environmental Restoration Program
Energy Technology Data Exchange (ETDEWEB)
Opresko, D.M.; Sample, B.E.; Suter, G.W.
1993-09-01
This report presents toxicological benchmarks for assessment of effects of 55 chemicals on six representative mammalian wildlife species (short-tailed shrew, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) and eight avian wildlife species (American robin, woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, Cooper's hawk, and red-tailed hawk) (scientific names are presented in Appendix C). These species were chosen because they are widely distributed and provide a representative range of body sizes and diets. The chemicals are some of those that occur at United States Department of Energy (DOE) waste sites. The benchmarks presented in this report are values believed to be nonhazardous for the listed wildlife species.
Lattice radial quantization by cubature
Neuberger, Herbert
2014-01-01
Basic aspects of a program to put field theories quantized in radial coordinates on the lattice are presented. Only scalar fields are discussed. Simple examples are solved to illustrate the strategy when applied to the 3D Ising model.
Wilby, Brian
1974-01-01
As an alternative to the usual method of counting squares to find the area of a plane shape, a method of counting lattice points (determined by vertices of a unit square) is proposed. Activities using this method are suggested. (DT)
De Soto, F; Carbonell, J; Leroy, J P; Pène, O; Roiesnel, C; Boucaud, Ph.
2007-01-01
We present the first results of a quantum field approach to nuclear models obtained by lattice techniques. Renormalization effects for fermion mass and coupling constant in case of scalar and pseudoscalar interaction lagrangian densities are discussed.
Lattice Boltzmann technique for heat transport phenomena coupled with melting process
Ibrahem, A. M.; El-Amin, M. F.; Mohammadein, A. A.; Gorla, Rama Subba Reddy
2016-04-01
In this work, the heat transport phenomena coupled with the melting process are studied by using the enthalpy-based lattice Boltzmann method (LBM). The proposed model is a modified version of the thermal LB model that avoids iteration steps and ensures high accuracy. The Bhatnagar-Gross-Krook (BGK) approximation with a D1Q2 lattice was used to determine the temperature field for one-dimensional melting by conduction, and multi-distribution functions (MDF) with a D2Q9 lattice were used to determine the density, velocity and temperature fields for two-dimensional melting by natural convection. Different boundary conditions, including Dirichlet, adiabatic and bounce-back boundary conditions, were used. The influence of increasing the Rayleigh number (from 10^3 to 10^5) on the temperature distribution and the melting process is studied. The obtained results show good agreement with the analytical solution for the melting-by-conduction case and with the benchmark solution for melting by convection.
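The D1Q2 conduction scheme mentioned above can be sketched in a few lines (a minimal sketch without the enthalpy/melting step; `tau` and the initial profile are illustrative choices, with thermal diffusivity alpha = tau - 1/2 in lattice units):

```python
import numpy as np

# D1Q2 lattice Boltzmann for 1-D heat conduction: two populations with
# velocities +1 and -1, temperature T = f[0] + f[1], f_eq = T/2 each.
def d1q2_diffusion(T0, tau, steps):
    f = np.stack([T0 / 2, T0 / 2])
    for _ in range(steps):
        T = f.sum(axis=0)
        feq = np.stack([T / 2, T / 2])
        f += (feq - f) / tau          # BGK collision
        f[0] = np.roll(f[0], 1)       # stream right
        f[1] = np.roll(f[1], -1)      # stream left
    return f.sum(axis=0)

n, tau, steps = 100, 1.0, 200
x = np.arange(n)
T0 = np.sin(2 * np.pi * x / n)        # single periodic Fourier mode
T = d1q2_diffusion(T0, tau, steps)

alpha = tau - 0.5                     # lattice-unit diffusivity
decay = np.exp(-alpha * (2 * np.pi / n) ** 2 * steps)
print(np.allclose(T, T0 * decay, atol=1e-3))
```

Comparing the simulated mode against the analytic diffusion decay factor is the same style of analytic-solution check the abstract reports for the conduction case.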
Lattice Boltzmann technique for heat transport phenomena coupled with melting process
Ibrahem, A. M.; El-Amin, M. F.; Mohammadein, A. A.; Gorla, Rama Subba Reddy
2017-01-01
In this work, the heat transport phenomena coupled with the melting process are studied by using the enthalpy-based lattice Boltzmann method (LBM). The proposed model is a modified version of the thermal LB model that avoids iteration steps and ensures high accuracy. The Bhatnagar-Gross-Krook (BGK) approximation with a D1Q2 lattice was used to determine the temperature field for one-dimensional melting by conduction, and multi-distribution functions (MDF) with a D2Q9 lattice were used to determine the density, velocity and temperature fields for two-dimensional melting by natural convection. Different boundary conditions, including Dirichlet, adiabatic and bounce-back boundary conditions, were used. The influence of increasing the Rayleigh number (from 10^3 to 10^5) on the temperature distribution and the melting process is studied. The obtained results show good agreement with the analytical solution for the melting-by-conduction case and with the benchmark solution for melting by convection.
Benchmarking Nature Tourism between Zhangjiajie and Repovesi
Wu, Zhou
2014-01-01
Since nature tourism became a booming business in modern society, more and more tourists choose nature-based tourism destinations for their holidays. Finding ways to promote Repovesi national park is therefore significant, in a bid to reinforce the park's competitiveness. The topic of this thesis is both to identify good marketing strategies used by the Zhangjiajie national park, via benchmarking, and to provide some suggestions to Repovesi national park. The method used in t...
Benchmarking Performance of Web Service Operations
Zhang, Shuai
2011-01-01
Web services are often used for retrieving data from servers providing information of different kinds. A data providing web service operation returns collections of objects for a given set of arguments without any side effects. In this project a web service benchmark (WSBENCH) is developed to simulate the performance of web service calls. Web service operations are specified as SQL statements. The function generator of WSBENCH converts user-specified SQL queries into functions and automatical...
Felix Stub Generator and Benchmarks Generator
Valenciano, Jose Jaime
2014-01-01
This report discusses two projects I have been working on during my summer studentship period in the context of the FELIX upgrade for ATLAS. The first project concerns the automated code generation needed to support and speed-up the FELIX firmware and software development cycle. The second project required the execution and analysis of benchmarks of the FELIX data-decoding software as a function of data sizes, number of threads and number of data blocks.
Benchmarking polish basic metal manufacturing companies
Directory of Open Access Journals (Sweden)
P. Pomykalski
2014-01-01
Full Text Available Basic metal manufacturing companies are undergoing substantial strategic changes resulting from global changes in demand. During such periods managers should closely monitor and benchmark the financial results of companies operating in their sector. Proper and timely identification of the consequences of changes in these areas may be crucial as managers seek to exploit opportunities and avoid threats. The paper examines changes in financial ratios of basic metal manufacturing companies operating in Poland in the period 2006-2011.
BENCHMARK AS INSTRUMENT OF CRISIS MANAGEMENT
Haievskyi, Vladyslav
2017-01-01
The article determines the essence of benchmarking through a synthesis of the concepts of "benchmark" and "crisis management". Benchmarking is considered as an instrument of crisis management: a powerful tool with which the entity carries out a comparative analysis of processes and effective activities, allowing it to reduce production costs under limited resources, to raise profit, and to achieve success in optimizing the strategy of the entity's activities.
Self-interacting Dark Matter Benchmarks
Kaplinghat, M.; Tulin, S.; Yu, H-B
2017-01-01
Dark matter self-interactions have important implications for the distributions of dark matter in the Universe, from dwarf galaxies to galaxy clusters. We present benchmark models that illustrate characteristic features of dark matter that is self-interacting through a new light mediator. These models have self-interactions large enough to change dark matter densities in the centers of galaxies in accord with observations, while remaining compatible with large-scale structur...
Aeroelasticity Benchmark Assessment: Subsonic Fixed Wing Program
Florance, Jennifer P.; Chwalowski, Pawel; Wieseman, Carol D.
2010-01-01
The fundamental technical challenge in computational aeroelasticity is the accurate prediction of unsteady aerodynamic phenomena and the effect on the aeroelastic response of a vehicle. Currently, a benchmarking standard for use in validating the accuracy of computational aeroelasticity codes does not exist. Many aeroelastic data sets have been obtained in wind-tunnel and flight testing throughout the world; however, none have been globally presented or accepted as an ideal data set. There are numerous reasons for this. One reason is that often, such aeroelastic data sets focus on the aeroelastic phenomena alone (flutter, for example) and do not contain associated information such as unsteady pressures and time-correlated structural dynamic deflections. Other available data sets focus solely on the unsteady pressures and do not address the aeroelastic phenomena. Other discrepancies can include omission of relevant data, such as flutter frequency and/or the acquisition of only qualitative deflection data. In addition to these content deficiencies, all of the available data sets present both experimental and computational technical challenges. Experimental issues include facility influences, nonlinearities beyond those being modeled, and data processing. From the computational perspective, technical challenges include modeling geometric complexities, coupling between the flow and the structure, grid issues, and boundary conditions. The Aeroelasticity Benchmark Assessment task seeks to examine the existing potential experimental data sets and ultimately choose the one that is viewed as the most suitable for computational benchmarking. An initial computational evaluation of that configuration will then be performed using the Langley-developed computational fluid dynamics (CFD) software FUN3D as part of its code validation process. In addition to the benchmarking activity, this task also includes an examination of future research directions. Researchers within the
Baryon spectroscopy in lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Derek B. Leinweber; Wolodymyr Melnitchouk; David Richards; Anthony G. Williams; James Zanotti
2004-04-01
We review recent developments in the study of excited baryon spectroscopy in lattice QCD. After introducing the basic methods used to extract masses from correlation functions, we discuss various interpolating fields and lattice actions commonly used in the literature. We present a survey of results of recent calculations of excited baryons in quenched QCD, and outline possible future directions in the study of baryon spectra.
Lattice QCD: A Brief Introduction
Meyer, H. B.
A general introduction to lattice QCD is given. The reader is assumed to have some basic familiarity with the path integral representation of quantum field theory. Emphasis is placed on showing that the lattice regularization provides a robust conceptual and computational framework within quantum field theory. The goal is to provide a useful overview, with many references pointing to the following chapters and to freely available lecture series for more in-depth treatments of specific topics.
Yamamoto, Arata
2016-01-01
We propose the lattice QCD calculation of the Berry phase which is defined by the ground state of a single fermion. We perform the ground-state projection of a single-fermion propagator, construct the Berry link variable on a momentum-space lattice, and calculate the Berry phase. As the first application, the first Chern number of the (2+1)-dimensional Wilson fermion is calculated by the Monte Carlo simulation.
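The workflow described (ground-state projection, Berry link variables on a momentum-space lattice, accumulation of plaquette phases into a Chern number) can be sketched for a simple two-band model. The Hamiltonian below is an illustrative Qi-Wu-Zhang-type model standing in for the (2+1)-dimensional Wilson fermion, and the lattice size is arbitrary; this follows the standard Fukui-Hatsugai-Suzuki construction, not the paper's Monte Carlo setup.

```python
import numpy as np

def chern_number(u, N=24):
    """First Chern number of the lower band of a two-band lattice model,
    computed from Berry link variables on an N x N momentum-space lattice."""
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    ks = 2 * np.pi * np.arange(N) / N
    # ground state (lower band) at each momentum-lattice site
    psi = np.empty((N, N, 2), dtype=complex)
    for i, kx in enumerate(ks):
        for j, ky in enumerate(ks):
            h = np.sin(kx)*sx + np.sin(ky)*sy + (u + np.cos(kx) + np.cos(ky))*sz
            _, v = np.linalg.eigh(h)
            psi[i, j] = v[:, 0]

    def link(a, b):
        # Berry link variable: normalized overlap of neighbouring ground states
        z = np.vdot(a, b)
        return z / abs(z)

    # sum the plaquette phases (lattice field strength) over the Brillouin zone
    F = 0.0
    for i in range(N):
        for j in range(N):
            u1 = link(psi[i, j], psi[(i+1) % N, j])
            u2 = link(psi[(i+1) % N, j], psi[(i+1) % N, (j+1) % N])
            u3 = link(psi[(i+1) % N, (j+1) % N], psi[i, (j+1) % N])
            u4 = link(psi[i, (j+1) % N], psi[i, j])
            F += np.angle(u1 * u2 * u3 * u4)
    return F / (2 * np.pi)
```

For 0 < |u| < 2 this model is topological (|C| = 1, with the sign depending on conventions); for |u| > 2 it is trivial (C = 0).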
Transport in Sawtooth photonic lattices
Weimann, Steffen; Real, Bastián; Cantillano, Camilo; Szameit, Alexander; Vicencio, Rodrigo A
2016-01-01
We investigate, theoretically and experimentally, a photonic realization of a Sawtooth lattice. This special lattice exhibits two spectral bands, with one of them experiencing a complete collapse to a highly degenerate flat band for a special set of inter-site coupling constants. We report the observation of different transport regimes, including strong transport inhibition due to the appearance of the non-diffractive flat band. Moreover, we excite localized Shockley surface states, residing in the gap between the two linear bands.
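The flat-band collapse can be checked directly from a two-band Bloch Hamiltonian. The parameterization below (base hopping t between neighbouring base sites, apex hopping t_ap) is a standard tight-binding form assumed for illustration, not taken from the experiment; the lower band collapses to E = -2t when t_ap = sqrt(2)*t.

```python
import numpy as np

def bands(t, t_ap, nk=200):
    """Two bands of a sawtooth lattice tight-binding model over the
    Brillouin zone; returns (lower, upper) band energies."""
    k = np.linspace(-np.pi, np.pi, nk)
    lo, hi = np.empty(nk), np.empty(nk)
    for i, kk in enumerate(k):
        # Bloch Hamiltonian: diagonal base-site hopping, off-diagonal apex coupling
        h = np.array([[2*t*np.cos(kk), t_ap*(1 + np.exp(-1j*kk))],
                      [t_ap*(1 + np.exp(1j*kk)), 0.0]])
        e = np.linalg.eigvalsh(h)
        lo[i], hi[i] = e[0], e[1]
    return lo, hi
```

At the special ratio t_ap = sqrt(2)*t the lower band is flat at E = -2t while the upper band stays dispersive; away from that ratio both bands disperse.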
Lattice Studies of Hyperon Spectroscopy
Energy Technology Data Exchange (ETDEWEB)
Richards, David G. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)
2016-04-01
I describe recent progress at studying the spectrum of hadrons containing the strange quark through lattice QCD calculations. I emphasise in particular the richness of the spectrum revealed by lattice studies, with a spectrum of states at least as rich as that of the quark model. I conclude with prospects for future calculations, including in particular the determination of the decay amplitudes for the excited states.
Energy Technology Data Exchange (ETDEWEB)
DeGrand, T. [Univ. of Colorado, Boulder, CO (United States). Dept. of Physics
1997-06-01
These lectures provide an introduction to lattice methods for nonperturbative studies of Quantum Chromodynamics. Lecture 1: basic techniques for QCD and results for hadron spectroscopy using the simplest discretizations; lecture 2: improved actions--what they are and how well they work; lecture 3: SLAC physics from the lattice: structure functions, the mass of the glueball, heavy quarks and {alpha}{sub s} (M{sub z}), and B-{anti B} mixing. 67 refs., 36 figs.
Multifractal behaviour of -simplex lattice
Indian Academy of Sciences (India)
Sanjay Kumar; Debaprasad Giri; Sujata Krishna
2000-06-01
We study the asymptotic behaviour of resistance scaling and fluctuation of resistance that give rise to flicker noise in an -simplex lattice. We propose a simple method to calculate the resistance scaling and give a closed-form formula to calculate the exponent, , associated with resistance scaling, for any . Using the current cumulant method we calculate the exact noise exponent for -simplex lattices.
Benchmarking the Multidimensional Stellar Implicit Code MUSIC
Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.
2017-04-01
We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple, scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to reproduce both the behaviour of established and widely used codes and the results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
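The implicit time integration via a Jacobian-free Newton-Krylov method can be illustrated on a much smaller problem. The sketch below takes one backward-Euler step of a 1D heat equation using SciPy's matrix-free newton_krylov solver; the grid, time step, and equation are illustrative stand-ins, not MUSIC's hydrodynamics.

```python
import numpy as np
from scipy.optimize import newton_krylov

# One backward-Euler step of the 1D heat equation u_t = u_xx, solved
# matrix-free with a Jacobian-free Newton-Krylov method.
n = 64
dx = 1.0 / (n - 1)
dt = 1e-2

def laplacian(u):
    lap = np.zeros_like(u)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    return lap                      # boundary values held fixed

def implicit_step(u_old):
    # Residual of backward Euler: F(u_new) = u_new - u_old - dt * L(u_new)
    def residual(u_new):
        return u_new - u_old - dt * laplacian(u_new)
    # newton_krylov never forms the Jacobian explicitly; it probes it with
    # finite-difference directional derivatives inside a Krylov solver.
    return newton_krylov(residual, u_old, f_tol=1e-8)

x = np.linspace(0.0, 1.0, n)
u0 = np.sin(np.pi * x)
u1 = implicit_step(u0)
```

The sine profile decays toward zero, as expected for diffusion, without ever assembling the (n x n) Jacobian matrix.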
A PWR Thorium Pin Cell Burnup Benchmark
Energy Technology Data Exchange (ETDEWEB)
Weaver, Kevan Dean; Zhao, X.; Pilat, E. E; Hejzlar, P.
2000-05-01
As part of work to evaluate the potential benefits of using thorium in LWR fuel, a thorium fueled benchmark comparison was made in this study between state-of-the-art codes, MOCUP (MCNP4B + ORIGEN2), and CASMO-4 for burnup calculations. The MOCUP runs were done individually at MIT and INEEL, using the same model but with some differences in techniques and cross section libraries. Eigenvalue and isotope concentrations were compared on a PWR pin cell model up to high burnup. The eigenvalue comparison as a function of burnup is good: the maximum difference is within 2% and the average absolute difference less than 1%. The isotope concentration comparisons are better than a set of MOX fuel benchmarks and comparable to a set of uranium fuel benchmarks reported in the literature. The actinide and fission product data sources used in the MOCUP burnup calculations for a typical thorium fuel are documented. Reasons for code vs code differences are analyzed and discussed.
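The eigenvalue comparison reported (maximum difference within 2%, average absolute difference under 1%) reduces to simple statistics over paired burnup points. The k-eff values below are hypothetical placeholders for illustration, not the benchmark's actual MOCUP or CASMO-4 results.

```python
# Hypothetical eigenvalues from two codes at matching burnup points;
# a code-to-code benchmark compares relative differences along burnup.
mocup = [1.15, 1.10, 1.05, 1.00, 0.95]
casmo = [1.14, 1.11, 1.06, 0.99, 0.96]

# percent difference of code A relative to code B at each burnup step
diffs = [100.0 * (a - b) / b for a, b in zip(mocup, casmo)]
max_diff = max(abs(d) for d in diffs)
avg_abs = sum(abs(d) for d in diffs) / len(diffs)
```

With these placeholder values the maximum difference stays within 2% and the average absolute difference below 1%, the same acceptance pattern the study reports.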
Benchmarking and accounting for the (private) cloud
Belleman, J.; Schwickerath, U.
2015-12-01
During the past two years large parts of the CERN batch farm have been moved to virtual machines running on the CERN internal cloud. During this process a large fraction of the resources, which had previously been used as physical batch worker nodes, were converted into hypervisors. Due to the large spread of the per-core performance in the farm, caused by its heterogeneous nature, it is necessary to have a good knowledge of the performance of the virtual machines. This information is used both for scheduling in the batch system and for accounting. While in the previous setup worker nodes were classified and benchmarked based on the purchase order number, for virtual batch worker nodes this is no longer possible; the information is now either hidden or hard to retrieve. Therefore we developed a new scheme to classify worker nodes according to their performance. The new scheme is flexible enough to be usable both for virtual and physical machines in the batch farm. With the new classification it is possible to have an estimation of the performance of worker nodes also in a very dynamic farm with worker nodes coming and going at a high rate, without the need to benchmark each new node again. An extension to public cloud resources is possible if all conditions under which the benchmark numbers have been obtained are fulfilled.
BENCHMARKING ON-LINE SERVICES INDUSTRIES
Institute of Scientific and Technical Information of China (English)
John HAMILTON
2006-01-01
The Web Quality Analyser (WQA) is a new benchmarking tool for industry. It has been extensively tested across services industries. Forty-five critical success features are presented as measures that capture the user's perception of services industry websites. This tool differs from previous tools in that it captures the information technology (IT) related driver sectors of website performance, along with the marketing-services related driver sectors. These driver sectors capture relevant structure, function and performance components. An 'on-off' switch measurement approach determines each component. Relevant component measures scale into a relative presence of the applicable feature, with a feature block delivering one of the sector drivers. Although it houses both measurable and a few subjective components, the WQA offers a proven and useful means to compare relevant websites. The WQA defines website strengths and weaknesses, thereby allowing for corrections to the website structure of the specific business. WQA benchmarking against services-related business competitors delivers a position on the WQA index, facilitates specific website driver rating comparisons, and demonstrates where key competitive advantage may reside. This paper reports on the marketing-services driver sectors of this new benchmarking WQA tool.
Perspective: Selected benchmarks from commercial CFD codes
Energy Technology Data Exchange (ETDEWEB)
Freitas, C.J. [Southwest Research Inst., San Antonio, TX (United States). Computational Mechanics Section
1995-06-01
This paper summarizes the results of a series of five benchmark simulations which were completed using commercial Computational Fluid Dynamics (CFD) codes. These simulations were performed by the vendors themselves, and then reported by them in ASME's CFD Triathlon Forum and CFD Biathlon Forum. The first group of benchmarks consisted of three laminar flow problems. These were the steady, two-dimensional flow over a backward-facing step, the low Reynolds number flow around a circular cylinder, and the unsteady three-dimensional flow in a shear-driven cubical cavity. The second group of benchmarks consisted of two turbulent flow problems. These were the two-dimensional flow around a square cylinder with periodic separated flow phenomena, and the steady, three-dimensional flow in a 180-degree square bend. All simulation results were evaluated against existing experimental data and thereby satisfied item 10 of the Journal's policy statement for numerical accuracy. The objective of this exercise was to provide the engineering and scientific community with a common reference point for the evaluation of commercial CFD codes.
Optimal lattice-structured materials
Messner, Mark C.
2016-11-01
This work describes a method for optimizing the mesostructure of lattice-structured materials. These materials are periodic arrays of slender members resembling efficient, lightweight macroscale structures like bridges and frame buildings. Current additive manufacturing technologies can assemble lattice structures with length scales ranging from nanometers to millimeters. Previous work demonstrates that lattice materials have excellent stiffness- and strength-to-weight scaling, outperforming natural materials. However, there are currently no methods for producing optimal mesostructures that consider the full space of possible 3D lattice topologies. The inverse homogenization approach for optimizing the periodic structure of lattice materials requires a parameterized, homogenized material model describing the response of an arbitrary structure. This work develops such a model, starting with a method for describing the long-wavelength, macroscale deformation of an arbitrary lattice. The work combines the homogenized model with a parameterized description of the total design space to generate a parameterized model. Finally, the work describes an optimization method capable of producing optimal mesostructures. Several examples demonstrate the optimization method. One of these examples produces an elastically isotropic, maximally stiff structure, here called the isotruss, that arguably outperforms the anisotropic octet truss topology.
Advances in Lattice Quantum Chromodynamics
McGlynn, Greg
In this thesis we make four contributions to the state of the art in numerical lattice simulations of quantum chromodynamics (QCD). First, we present the most detailed investigation yet of the autocorrelations of topological observables in hybrid Monte Carlo simulations of QCD and of the effects of the boundary conditions on these autocorrelations. This results in a numerical criterion for deciding when open boundary conditions are useful for reducing these autocorrelations, which are a major barrier to reliable calculations at fine lattice spacings. Second, we develop a dislocation-enhancing determinant, and demonstrate that it reduces the autocorrelation time of the topological charge. This alleviates problems with slow topological tunneling at fine lattice spacings, enabling simulations on fine lattices to be completed with much less computational effort. Third, we show how to apply the recently developed zMobius technique to hybrid Monte Carlo evolutions with domain wall fermions, achieving nearly a factor of two speedup in the light quark determinant, the single most expensive part of the calculation. The dislocation-enhancing determinant and the zMobius technique have enabled us to begin simulations of fine ensembles with four flavors of dynamical domain wall quarks. Finally, we show how to include the previously neglected G1 operator in the nonperturbative renormalization of the ΔS = 1 effective weak Hamiltonian on the lattice. This removes an important systematic error in lattice calculations of weak matrix elements, in particular the important K → ππ decay.
SPICE benchmark for global tomographic methods
Qin, Yilong; Capdeville, Yann; Maupin, Valerie; Montagner, Jean-Paul; Lebedev, Sergei; Beucler, Eric
2008-11-01
The existing global tomographic methods result in different models due to different parametrization, scale resolution and theoretical approach. To test how current imaging techniques are limited by approximations in theory and by the inadequacy of data quality and coverage, it is necessary to perform a global-scale benchmark to understand the resolving properties of each specific imaging algorithm. In the framework of the Seismic wave Propagation and Imaging in Complex media: a European network (SPICE) project, it was decided to perform a benchmark experiment of global inversion algorithms. First, a preliminary benchmark with a simple isotropic model was carried out to check the feasibility in terms of acquisition geometry and numerical accuracy. Then, to fully validate tomographic schemes with a challenging synthetic data set, we constructed a complex anisotropic global model, which is characterized by 21 elastic constants and includes 3-D heterogeneities in velocity, anisotropy (radial and azimuthal anisotropy), attenuation, density, as well as surface topography and bathymetry. The intermediate-period (>32 s), high-fidelity anisotropic modelling was performed using a state-of-the-art anisotropic anelastic modelling code, namely the coupled spectral element method (CSEM), on modern massively parallel computing resources. The benchmark data set consists of 29 events, with three-component seismograms recorded by 256 stations. Because of the limitation of the available computing power, synthetic seismograms have a minimum period of 32 s and a length of 10 500 s. The inversion of the benchmark data set demonstrates several well-known problems of classical surface wave tomography, such as the importance of crustal correction to recover the shallow structures, the loss of resolution with depth, the smearing effect, both horizontal and vertical, the inaccuracy of amplitude of isotropic S-wave velocity variation, the difficulty of retrieving the magnitude of azimuthal
State of the art: benchmarking microprocessors for embedded automotive applications
Directory of Open Access Journals (Sweden)
Adnan Shaout
2016-09-01
Benchmarking microprocessors provides a way for consumers to evaluate the performance of the processors. This is done by using either synthetic or real-world applications. A number of benchmarks exist today to assist consumers in evaluating the vast number of microprocessors that are available in the market. In this paper an investigation of the various benchmarks available for evaluating microprocessors for embedded automotive applications will be performed. We will provide an overview of the following benchmarks: Whetstone, Dhrystone, Linpack, Standard Performance Evaluation Corporation (SPEC) CPU2006, Embedded Microprocessor Benchmark Consortium (EEMBC) AutoBench and MiBench. A comparison of existing benchmarks will be given based on relevant characteristics of automotive applications, which will give the proper recommendation when benchmarking processors for automotive applications.
A lexicographic shellability characterization of geometric lattices
Davidson, Ruth
2011-01-01
Geometric lattices are characterized as those finite, atomic lattices such that every atom ordering induces a lexicographic shelling given by an edge labeling known as a minimal labeling. This new characterization fits into a similar paradigm as McNamara's characterization of supersolvable lattices as those lattices admitting a different type of lexicographic shelling, namely one in which each maximal chain is labeled with a permutation of {1,...,n}. Geometric lattices arise as the intersection lattices of central hyperplane arrangements and more generally as the lattices of flats for matroids.
Embedded Lattice and Properties of Gram Matrix
Directory of Open Access Journals (Sweden)
Futa Yuichi
2017-03-01
In this article, we formalize in Mizar [14] the definition of an embedding of a lattice and its properties. We formally define an inner product on an embedded module. We also formalize properties of the Gram matrix. We formally prove that an inverse of the Gram matrix for a rational lattice exists. Lattices of Z-modules are necessary for lattice problems, the LLL (Lenstra, Lenstra and Lovász) base reduction algorithm [16], and cryptographic systems based on lattices [17].
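The key fact formalized, that the Gram matrix of a lattice basis has an inverse, can be illustrated numerically. The basis below is an arbitrary example chosen for this sketch; G[i][j] is the inner product of basis vectors b_i and b_j, and invertibility follows from their linear independence.

```python
import numpy as np

# Rows of B are basis vectors of a (rational) lattice; this particular
# basis is an illustrative choice, not one from the formalization.
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# Gram matrix: G[i][j] = <b_i, b_j>
G = B @ B.T

# For a linearly independent basis, det(G) = det(B)^2 > 0, so G is invertible.
G_inv = np.linalg.inv(G)
```

For this basis det(B) = 4, so det(G) = 16 and the inverse exists, matching the theorem proved in the article for rational lattices.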
Nuclear Lattice Simulations using Symmetry-Sign Extrapolation
Lähde, Timo A; Lee, Dean; Meißner, Ulf-G; Epelbaum, Evgeny; Krebs, Hermann; Rupak, Gautam
2015-01-01
Projection Monte Carlo calculations of lattice Chiral Effective Field Theory suffer from sign oscillations to a varying degree dependent on the number of protons and neutrons. Hence, such studies have hitherto been concentrated on nuclei with equal numbers of protons and neutrons, and especially on the alpha nuclei where the sign oscillations are smallest. We now introduce the technique of "symmetry-sign extrapolation" which allows us to use the approximate Wigner SU(4) symmetry of the nuclear interaction to control the sign oscillations without introducing unknown systematic errors. We benchmark this method by calculating the ground-state energies of the $^{12}$C, $^6$He and $^6$Be nuclei, and discuss its potential for studies of neutron-rich halo nuclei and asymmetric nuclear matter.
Nuclear lattice simulations using symmetry-sign extrapolation
Energy Technology Data Exchange (ETDEWEB)
Laehde, Timo A.; Luu, Thomas [Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Lee, Dean [North Carolina State University, Department of Physics, Raleigh, NC (United States); Meissner, Ulf G. [Universitaet Bonn, Helmholtz-Institut fuer Strahlen- und Kernphysik and Bethe Center for Theoretical Physics, Bonn (Germany); Forschungszentrum Juelich, Institute for Advanced Simulation, Institut fuer Kernphysik, and Juelich Center for Hadron Physics, Juelich (Germany); Forschungszentrum Juelich, JARA - High Performance Computing, Juelich (Germany); Epelbaum, Evgeny; Krebs, Hermann [Ruhr-Universitaet Bochum, Institut fuer Theoretische Physik II, Bochum (Germany); Rupak, Gautam [Mississippi State University, Department of Physics and Astronomy, Mississippi State, MS (United States)
2015-07-15
Projection Monte Carlo calculations of lattice Chiral Effective Field Theory suffer from sign oscillations to a varying degree dependent on the number of protons and neutrons. Hence, such studies have hitherto been concentrated on nuclei with equal numbers of protons and neutrons, and especially on the alpha nuclei where the sign oscillations are smallest. Here, we introduce the ''symmetry-sign extrapolation'' method, which allows us to use the approximate Wigner SU(4) symmetry of the nuclear interaction to systematically extend the Projection Monte Carlo calculations to nuclear systems where the sign problem is severe. We benchmark this method by calculating the ground-state energies of the {sup 12}C, {sup 6}He and {sup 6}Be nuclei, and discuss its potential for studies of neutron-rich halo nuclei and asymmetric nuclear matter. (orig.)
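The extrapolation step of the method can be sketched: observables are computed at several values of a symmetry-mixing parameter d_h where sign oscillations remain under control, then extrapolated to the physical point d_h = 1. Both the quadratic ansatz and the energy values below are hypothetical illustrations, not results from the paper.

```python
import numpy as np

# Hypothetical ground-state energies (MeV) computed at mixing parameters
# d_h < 1, where the sign problem is mild; the physical point is d_h = 1.
d_h = np.array([0.2, 0.4, 0.6, 0.8])
E = np.array([-92.1, -91.4, -90.9, -90.6])

# Fit a smooth ansatz in d_h (here quadratic, an assumed form) and
# extrapolate to the physical point.
coeffs = np.polyfit(d_h, E, deg=2)
E_physical = np.polyval(coeffs, 1.0)
```

In the actual method the fit form is constrained by the physics of the Wigner SU(4) limit; a polynomial in d_h is used here purely to show the mechanics of the extrapolation.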
Cross section generation for LWR pin lattices simulations
Energy Technology Data Exchange (ETDEWEB)
Velasquez, Carlos E.; Macedo, Anderson A.P.; Cardoso, Fabiano S.; Pereira, Claubia; Veloso, Maria Auxiliadora F.; Costa, Antonella L. [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Instituto Nacional de Ciencia e Tecnologia de Reatores Nucleares Inovadores/CNPq, Brasilia, DF (Brazil); Barros, Graiciany de P. [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)
2015-07-01
The majority of the neutron data libraries provided with the MCNP code are given at room temperature. It is therefore important to generate continuous-energy cross-section libraries for the MCNP code at different temperatures. To evaluate the methodology used, the criticality calculations obtained using MCNP with the cross sections generated at DEN/UFMG are compared with the criticality data from the International Handbook of Evaluated Reactor Physics Benchmark Experiments for pin lattices of light water reactors. Nuclear data from ENDF/B-VII.1, JEFF-3.1 and TENDL-2014 were processed using the NJOY99 code for different energies and temperatures. This code provides the nuclear data as ACE libraries, which are then added to the MCNP libraries to perform the simulations. The results indicate the efficiency of the methodology developed at DEN/UFMG. (author)
Lattice dislocation in Si nanowires
Energy Technology Data Exchange (ETDEWEB)
Omar, M.S., E-mail: dr_m_s_omar@yahoo.co [Department of Physics, College of Science, University of Salahaddin, Arbil, Iraqi Kurdistan (Iraq); Taha, H.T. [Department of Physics, College of Science, University of Salahaddin, Arbil, Iraqi Kurdistan (Iraq)
2009-12-15
Modified formulas were used to calculate the lattice thermal expansion, specific heat and bulk modulus for Si nanowires with diameters of 115, 56, 37 and 22 nm. From these values and the Grüneisen parameter taken from the literature, mean lattice volumes were found to be 20.03 A{sup 3} for the bulk and 23.63, 29.91, 34.69 and 40.46 A{sup 3} for the Si nanowire diameters mentioned above, respectively. The mean bonding length was calculated to be 0.235 nm for the bulk and 0.248, 0.269, 0.282 and 0.297 nm for the nanowire diameters mentioned above, respectively. By dividing the nanowire diameter by the mean bonding length, the number of layers per nanowire was found to be 230, 104, 65 and 37 for the diameters mentioned above, respectively. Lattice dislocations in the 22 nm diameter wire were found to range from 0.00324 nm for the first, central lattice to 0.2579 nm for the outermost surface lattice. The dislocation was smaller for larger wire diameters. The dislocation concentration was found to change in Si nanowires according to the ratio of surface thickness to nanowire radius.
Hamiltonian tomography of photonic lattices
Ma, Ruichao; Owens, Clai; LaChapelle, Aman; Schuster, David I.; Simon, Jonathan
2017-06-01
In this paper we introduce an approach to Hamiltonian tomography of noninteracting tight-binding photonic lattices. To begin with, we prove that the matrix element of the low-energy effective Hamiltonian between sites α and β may be obtained directly from Sα β(ω ) , the (suitably normalized) two-port measurement between sites α and β at frequency ω . This general result enables complete characterization of both on-site energies and tunneling matrix elements in arbitrary lattice networks by spectroscopy, and suggests that coupling between lattice sites is a topological property of the two-port spectrum. We further provide extensions of this technique for measurement of band projectors in finite, disordered systems with good band flatness ratios, and apply the tool to direct real-space measurement of the Chern number. Our approach demonstrates the extraordinary potential of microwave quantum circuits for exploration of exotic synthetic materials, providing a clear path to characterization and control of single-particle properties of Jaynes-Cummings-Hubbard lattices. More broadly, we provide a robust, unified method of spectroscopic characterization of linear networks from photonic crystals to microwave lattices and everything in between.
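The central claim, that suitably normalized two-port measurements at a given frequency determine the effective Hamiltonian's matrix elements, can be mimicked with a Green's-function toy model. The 4-site tight-binding chain below is an assumed example; treating the full set of two-port responses as the resolvent G(w) = (wI - H)^-1 lets one recover H by matrix inversion.

```python
import numpy as np

# Illustrative 4-site tight-binding Hamiltonian: on-site energies on the
# diagonal, tunneling matrix elements on the off-diagonals. Not the
# paper's lattice; chosen only to demonstrate the recovery idea.
H = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.5, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, -0.5]])

w = 5.0                                   # probe frequency, off-resonance
G = np.linalg.inv(w * np.eye(4) - H)      # simulated two-port responses G_ab(w)

# Hamiltonian tomography: invert the measured response matrix
H_recovered = w * np.eye(4) - np.linalg.inv(G)
```

Both on-site energies and tunneling elements come back from the "measurement", which is the noninteracting analogue of the spectroscopic characterization described in the abstract.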
Hydrodynamic behaviour of Lattice Boltzmann and Lattice BGK models
Behrend, O; Warren, P
1993-01-01
We present a numerical analysis of the validity of classical and generalized hydrodynamics for Lattice Boltzmann Equation (LBE) and Lattice BGK methods in two and three dimensions, as a function of the collision parameters of these models. Our analysis is based on the wave-number dependence of the evolution operator. Good ranges of validity are found for BGK models as long as the relaxation time is chosen smaller than or equal to unity. The additional freedom in the choice of collision parameters for LBE models does not seem to give significant improvement.
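A minimal D2Q9 lattice-BGK update illustrates the stream-and-relax structure whose hydrodynamic limit the analysis examines. The implementation below assumes the standard second-order equilibrium distribution and is a generic sketch, not the specific models studied in the paper.

```python
import numpy as np

# D2Q9 lattice: weights and discrete velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def equilibrium(rho, ux, uy):
    """Standard second-order Maxwellian expansion on the D2Q9 lattice."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_step(f, tau):
    """One lattice-BGK update: stream, then relax toward equilibrium."""
    for i, (cx, cy) in enumerate(c):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)  # periodic streaming
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    return f - (f - equilibrium(rho, ux, uy)) / tau            # BGK collision
```

The single relaxation time tau is the collision parameter whose choice (here, tau <= 1 for good hydrodynamic validity, per the abstract) the paper analyzes; the step conserves mass and momentum by construction.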
Towards Systematic Benchmarking of Climate Model Performance
Gleckler, P. J.
2014-12-01
The process by which climate models are evaluated has evolved substantially over the past decade, with the Coupled Model Intercomparison Project (CMIP) serving as a centralizing activity for coordinating model experimentation and enabling research. Scientists with a broad spectrum of expertise have contributed to the CMIP model evaluation process, resulting in many hundreds of publications that have served as a key resource for the IPCC process. For several reasons, efforts are now underway to further systematize some aspects of the model evaluation process. First, some model evaluation can now be considered routine and should not require "re-inventing the wheel" or a journal publication simply to update results with newer models. Second, the benefit of CMIP research to model development has not been optimal because the publication of results generally takes several years and is usually not reproducible for benchmarking newer model versions. And third, there are now hundreds of model versions and many thousands of simulations, but there is no community-based mechanism for routinely monitoring model performance changes. An important change in the design of CMIP6 can help address these limitations. CMIP6 will include a small set of standardized experiments as an ongoing exercise (CMIP "DECK": ongoing Diagnostic, Evaluation and Characterization of Klima), so that modeling groups can submit them at any time and not be overly constrained by deadlines. In this presentation, efforts to establish routine benchmarking of existing and future CMIP simulations will be described. To date, some benchmarking tools have been made available to all CMIP modeling groups to enable them to readily compare with CMIP5 simulations during the model development process. A natural extension of this effort is to make results from all CMIP simulations widely available, including the results from newer models as soon as the simulations become available for research. Making the results from routine
A Uranium Bioremediation Reactive Transport Benchmark
Energy Technology Data Exchange (ETDEWEB)
Yabusaki, Steven B.; Sengor, Sevinc; Fang, Yilin
2015-06-01
A reactive transport benchmark problem set has been developed based on in situ uranium bio-immobilization experiments that have been performed at a former uranium mill tailings site in Rifle, Colorado, USA. Acetate-amended groundwater stimulates indigenous microorganisms to catalyze the reduction of U(VI) to a sparingly soluble U(IV) mineral. The interplay between the flow, acetate loading periods and rates, microbially-mediated and geochemical reactions leads to dynamic behavior in metal- and sulfate-reducing bacteria, pH, alkalinity, and reactive mineral surfaces. The benchmark is based on an 8.5 m long one-dimensional model domain with constant saturated flow and uniform porosity. The 159-day simulation introduces acetate and bromide through the upgradient boundary in 14-day and 85-day pulses separated by a 10 day interruption. Acetate loading is tripled during the second pulse, which is followed by a 50 day recovery period. Terminal electron accepting processes for goethite, phyllosilicate Fe(III), U(VI), and sulfate are modeled using Monod-type rate laws. Major ion geochemistry modeled includes mineral reactions, as well as aqueous and surface complexation reactions for UO2++, Fe++, and H+. In addition to the dynamics imparted by the transport of the acetate pulses, U(VI) behavior involves the interplay between bioreduction, which is dependent on acetate availability, and speciation-controlled surface complexation, which is dependent on pH, alkalinity and available surface complexation sites. The general difficulty of this benchmark is the large number of reactions (74), multiple rate law formulations, a multisite uranium surface complexation model, and the strong interdependency and sensitivity of the reaction processes. Results are presented for three simulators: HYDROGEOCHEM, PHT3D, and PHREEQC.
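The Monod-type rate laws mentioned take a simple saturating form; a dual-Monod version, limited by both the electron donor (acetate) and a terminal electron acceptor, is sketched below. The functional form is the standard one, but the parameter values in the usage are hypothetical, not those of the benchmark specification.

```python
def monod_rate(vmax, conc_donor, k_donor, conc_acceptor, k_acceptor):
    """Dual-Monod rate law: the microbially mediated rate saturates at vmax
    and is limited by whichever of donor or acceptor is scarce relative to
    its half-saturation constant."""
    return (vmax
            * conc_donor / (k_donor + conc_donor)
            * conc_acceptor / (k_acceptor + conc_acceptor))
```

When both concentrations greatly exceed their half-saturation constants the rate approaches vmax; when the donor (acetate) is exhausted between pulses, the rate drops to zero, which is what couples the simulated bioreduction to the acetate loading schedule.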
Benchmarks in Tacit Knowledge Skills Instruction
DEFF Research Database (Denmark)
Tackney, Charles T.; Strömgren, Ole; Sato, Toyoko
2006-01-01
While the knowledge management literature has addressed the explicit and tacit skills needed for successful performance in the modern enterprise, little attention has been paid to date in this particular literature as to how these wide-ranging skills may be suitably acquired during the course...... experience more empowering of essential tacit knowledge skills than that found in educational institutions in other national settings. We specify the program forms and procedures for consensus-based governance and group work (as benchmarks) that demonstrably instruct undergraduates in the tacit skill...
Directory of Open Access Journals (Sweden)
Matthias S. Müller
2003-01-01
The purpose of this benchmark is to propose several optimization techniques and to test whether they are implemented in current OpenMP compilers. Examples are the removal of redundant synchronization constructs, effective constructs for alternative code, and orphaned directives. The effectiveness of the compiler-generated code is measured by comparing different OpenMP constructs and compilers. Where possible, we also compare with the hand-coded "equivalent" solution. Six out of seven proposed optimization techniques are already implemented in different compilers. However, most compilers implement only one or two of them.
Benchmarks of Global Clean Energy Manufacturing
Energy Technology Data Exchange (ETDEWEB)
Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2017-01-01
The Clean Energy Manufacturing Analysis Center (CEMAC), sponsored by the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), provides objective analysis and up-to-date data on global supply chains and manufacturing of clean energy technologies. Benchmarks of Global Clean Energy Manufacturing sheds light on several fundamental questions about the global clean technology manufacturing enterprise: How does clean energy technology manufacturing impact national economies? What are the economic opportunities across the manufacturing supply chain? What are the global dynamics of clean energy technology manufacturing?
International Benchmarking of Electricity Transmission System Operators
DEFF Research Database (Denmark)
Agrell, Per J.; Bogetoft, Peter
2014-01-01
Electricity transmission system operators (TSO) in Europe are increasingly subject to high-powered performance-based regulation, such as revenue-cap regimes. The determination of the parameters in such regimes is challenging for national regulatory authorities (NRA), since there is normally a single...... TSO operating in each jurisdiction. The solution for European regulators has been found in international regulatory benchmarking, organized in collaboration with the Council of European Energy Regulators (CEER) in 2008 and 2012 for 22 and 23 TSOs, respectively. The frontier study provides static cost...... weight restrictions and a correction method for opening balances....
Benchmarking of methods for genomic taxonomy
DEFF Research Database (Denmark)
Larsen, Mette Voldby; Cosentino, Salvatore; Lukjancenko, Oksana
2014-01-01
. Nevertheless, the method has been found to have a number of shortcomings. In the current study, we trained and benchmarked five methods for whole-genome sequence-based prokaryotic species identification on a common data set of complete genomes: (i) SpeciesFinder, which is based on the complete 16S rRNA gene......-specific functional protein domain profiles; and finally (v) KmerFinder, which examines the number of cooccurring k-mers (substrings of k nucleotides in DNA sequence data). The performances of the methods were subsequently evaluated on three data sets of short sequence reads or draft genomes from public databases...
Robust randomized benchmarking of quantum processes
Magesan, Easwar; Emerson, Joseph
2010-01-01
We describe a simple randomized benchmarking protocol for quantum information processors and obtain a sequence of models for the observable fidelity decay as a function of a perturbative expansion of the errors. We are able to prove that the protocol provides an efficient and reliable estimate of an average error-rate for a set of operations (gates) under a general noise model that allows for both time- and gate-dependent errors. We determine the conditions under which this estimate remains valid and illustrate the protocol through numerical examples.
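The fidelity decay discussed above is commonly modeled, to zeroth order in the randomized-benchmarking literature, as F(m) = A*p**m + B, where m is the sequence length. The sketch below uses illustrative parameter values (not the paper's) to show how the decay parameter p and the average error rate per gate are recovered in a noise-free toy case:

```python
# Zeroth-order randomized-benchmarking decay model (standard form
# in the RB literature): F(m) = A * p**m + B. For a single qubit
# (d = 2) the average error rate per gate is r = (1 - p)*(d - 1)/d.
# A, B, and p below are illustrative, not taken from the paper.
A, B, p = 0.5, 0.5, 0.98

def fidelity(m):
    """Average sequence fidelity after m randomized gates."""
    return A * p**m + B

# Recover p from two sequence lengths (idealized, noiseless data);
# a real experiment would fit many lengths with uncertainties.
m1, m2 = 10, 20
p_est = ((fidelity(m2) - B) / (fidelity(m1) - B)) ** (1.0 / (m2 - m1))
r_est = (1.0 - p_est) * (2 - 1) / 2
```

The point of the exponential form is that state preparation and measurement errors are absorbed into A and B, so the per-gate error estimate r depends only on the decay rate p.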
Benchmarking result diversification in social image retrieval
DEFF Research Database (Denmark)
Ionescu, Bogdan; Popescu, Adrian; Müller, Henning
2014-01-01
This article addresses the issue of retrieval result diversification in the context of social image retrieval and discusses the results achieved during the MediaEval 2013 benchmarking. 38 runs and their results are described and analyzed in this text. A comparison of the use of expert vs....... crowdsourcing annotations shows that crowdsourcing results are slightly different and have higher inter observer differences but results are comparable at lower cost. Multimodal approaches have best results in terms of cluster recall. Manual approaches can lead to high precision but often lower diversity....... With this detailed results analysis we give future insights on this matter....
Benchmarking East Tennessee's economic capacity
Energy Technology Data Exchange (ETDEWEB)
NONE
1995-04-20
This presentation comprises viewgraphs delineating major economic factors operating in 15 counties in East Tennessee. The purpose of the information presented is to provide a benchmark analysis of economic conditions for use in guiding economic growth in the region. The emphasis of the presentation is economic infrastructure, which is classified into six categories: human resources, technology, financial resources, physical infrastructure, quality of life, and tax and regulation. Data for analysis of key indicators in each of the categories are presented. Preliminary analyses, in the form of strengths and weaknesses and comparison to reference groups, are given.
Benchmarks for multicomponent diffusion and electrochemical migration
DEFF Research Database (Denmark)
Rasouli, Pejman; Steefel, Carl I.; Mayer, K. Ulrich
2015-01-01
In multicomponent electrolyte solutions, the tendency of ions to diffuse at different rates results in a charge imbalance that is counteracted by the electrostatic coupling between charged species leading to a process called “electrochemical migration” or “electromigration.” Although not commonly...... not been published to date. This contribution provides a set of three benchmark problems that demonstrate the effect of electric coupling during multicomponent diffusion and electrochemical migration and at the same time facilitate the intercomparison of solutions from existing reactive transport codes...
ABM11 parton distributions and benchmarks
Energy Technology Data Exchange (ETDEWEB)
Alekhin, Sergey [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institut Fiziki Vysokikh Ehnergij, Protvino (Russian Federation); Bluemlein, Johannes; Moch, Sven-Olaf [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)
2012-08-15
We present a determination of the nucleon parton distribution functions (PDFs) and of the strong coupling constant {alpha}{sub s} at next-to-next-to-leading order (NNLO) in QCD based on the world data for deep-inelastic scattering and the fixed-target data for the Drell-Yan process. The analysis is performed in the fixed-flavor number scheme for n{sub f}=3,4,5 and uses the MS scheme for {alpha}{sub s} and the heavy quark masses. The fit results are compared with other PDFs and used to compute the benchmark cross sections at hadron colliders to the NNLO accuracy.
A Benchmark Construction of Positron Crystal Undulator
Tikhomirov, Victor V
2015-01-01
Optimization of a positron crystal undulator (CU) is addressed. The ways to assure both the maximum intensity and minimum spectral width of positron CU radiation are outlined. We claim that the minimum CU spectrum width of 3--4% is reached at positron energies of a few GeV and that the optimal bending radius of crystal planes in a CU ranges from 3 to 5 critical bending radii for channeled particles. Following the suggested approach, a benchmark positron CU construction is devised and its functioning is illustrated using a simulation method widely tested against experimental data.
A Benchmark for Virtual Camera Control
DEFF Research Database (Denmark)
Burelli, Paolo; Yannakakis, Georgios N.
2015-01-01
Automatically animating and placing the virtual camera in a dynamic environment is a challenging task. The camera is expected to maximise and maintain a set of properties — i.e. visual composition — while smoothly moving through the environment and avoiding obstacles. A large number of different....... For this reason, in this paper, we propose a benchmark for the problem of virtual camera control and we analyse a number of different problems in different virtual environments. Each of these scenarios is described through a set of complexity measures and, as a result of this analysis, a subset of scenarios...
Measurement Methods in the field of benchmarking
Directory of Open Access Journals (Sweden)
István Szűts
2004-05-01
In benchmarking we often come across parameters that are difficult to measure when executing comparisons or analyzing performance, yet they have to be compared and measured so as to be able to choose the best practices. The situation is similar in the case of complex, multidimensional evaluation, when the relative importance and order of the different dimensions and parameters to be evaluated have to be determined, or when the range of similar performance indicators has to be reduced for simpler comparisons. In such cases we can use the ordinal or interval scales of measurement elaborated by S. S. Stevens.
Benchmarking research of steel companies in Europe
Directory of Open Access Journals (Sweden)
M. Antošová
2013-07-01
Steelworks are at present in a state of permanent change, marked by ever stronger competitive pressure. Managers must therefore solve the questions of how to decrease production costs, how to overcome the competition and how to survive in the world market. Still more attention should be paid to modern managerial methods of market research and comparison with the competition. Benchmarking research is one of the effective tools for such research. The goal of this contribution is to compare chosen steelworks and to indicate new directions for their development, with the possibility of increasing the productivity of steel production.
Nuclear Reactions from Lattice QCD
Briceño, Raúl A; Luu, Thomas C
2014-01-01
One of the overarching goals of nuclear physics is to rigorously compute properties of hadronic systems directly from the fundamental theory of strong interactions, Quantum Chromodynamics (QCD). In particular, the hope is to perform reliable calculations of nuclear reactions which will impact our understanding of environments that occur during big bang nucleosynthesis, the evolution of stars and supernovae, and within nuclear reactors and high energy/density facilities. Such calculations, being truly ab initio, would include all two-nucleon and three-nucleon (and higher) interactions in a consistent manner. Currently, lattice QCD provides the only reliable option for performing calculations of some of the low-energy hadronic observables. With the aim of bridging the gap between lattice QCD and nuclear many-body physics, the Institute for Nuclear Theory held a workshop on Nuclear Reactions from Lattice QCD in March 2013. In this review article, we report on the topics discussed in this workshop and the path ...
Quantum Gravity on the Lattice
Hamber, Herbert W
2009-01-01
I review the lattice approach to quantum gravity, and how it relates to the non-trivial ultraviolet fixed point scenario of the continuum theory. After a brief introduction covering the general problem of ultraviolet divergences in gravity and other non-renormalizable theories, I cover the general methods and goals of the lattice approach. An underlying theme is an attempt at establishing connections between the continuum renormalization group results, which are mainly based on diagrammatic perturbation theory, and the recent lattice results, which should apply to the strong gravity regime and are inherently non-perturbative. A second theme in this review is the ever-present natural correspondence between infrared methods of strongly coupled non-abelian gauge theories on the one hand, and the low energy approach to quantum gravity based on the renormalization group and universality of critical behavior on the other. Towards the end of the review I discuss possible observational consequences of path integral q...
Algebraic Lattices in QFT Renormalization
Borinsky, Michael
2016-07-01
The structure of overlapping subdivergences, which appear in the perturbative expansions of quantum field theory, is analyzed using algebraic lattice theory. It is shown that for specific QFTs the sets of subdivergences of Feynman diagrams form algebraic lattices. This class of QFTs includes the standard model. In kinematic renormalization schemes, in which tadpole diagrams vanish, these lattices are semimodular. This implies that the Hopf algebra of Feynman diagrams is graded by the coradical degree or equivalently that every maximal forest has the same length in the scope of BPHZ renormalization. As an application of this framework, a formula for the counter terms in zero-dimensional QFT is given together with some examples of the enumeration of primitive or skeleton diagrams.
Lattice Structures For Aerospace Applications
Del Olmo, E.; Grande, E.; Samartin, C. R.; Bezdenejnykh, M.; Torres, J.; Blanco, N.; Frovel, M.; Canas, J.
2012-07-01
Reducing mass while improving performance is a constant and relevant challenge for aerospace structures in the space business. Designs, materials and manufacturing processes are permanently evolving to explore and obtain mass-optimized solutions at low cost. In the framework of the ICARO project, EADS CASA ESPACIO (ECE) has designed, manufactured and tested a technology demonstrator which shows that lattice-type grid structures are a promising weight-saving solution for replacing some traditional metallic and composite structures in space applications. A virtual testing methodology was used in order to support the design of a high-modulus CFRP cylindrical lattice technology demonstrator. The manufacturing process, based on composite Automatic Fiber Placement (AFP) technology developed by ECE, allows obtaining high-quality, low-weight lattice structures potentially applicable to a wide range of aerospace structures. Launcher payload adaptors, satellite platforms, antenna towers and instrument supports are some promising candidates.
Kaon fluctuations from lattice QCD
Noronha-Hostler, Jacquelyn; Gunther, Jana; Parotto, Paolo; Pasztor, Attila; Vazquez, Israel Portillo; Ratti, Claudia
2016-01-01
We show that it is possible to isolate a set of kaon fluctuations in lattice QCD. By means of the Hadron Resonance Gas (HRG) model, we calculate the actual kaon second-to-first fluctuation ratio, which receives contribution from primordial kaons and resonance decays, and show that it is very close to the one obtained for primordial kaons in the Boltzmann approximation. The latter only involves the strangeness and electric charge chemical potentials, which are functions of $T$ and $\\mu_B$ due to the experimental constraint on strangeness and electric charge, and can therefore be calculated on the lattice. This provides an unambiguous method to extract the kaon freeze-out temperature, by comparing the lattice results to the experimental values for the corresponding fluctuations.
Hadron Structure on the Lattice
Can, K. U.; Kusno, A.; Mastropas, E. V.; Zanotti, J. M.
The aim of these lectures will be to provide an introduction to some of the concepts needed to study the structure of hadrons on the lattice. Topics covered include the electromagnetic form factors of the nucleon and pion, the nucleon's axial charge and moments of parton and generalised parton distribution functions. These are placed in a phenomenological context by describing how they can lead to insights into the distribution of charge, spin and momentum amongst a hadron's partonic constituents. We discuss the techniques required for extracting the relevant matrix elements from lattice simulations and draw attention to potential sources of systematic error. Examples of recent lattice results are presented and are compared with results from both experiment and theoretical models.
Flavor Physics and Lattice QCD
Bouchard, C M
2013-01-01
Our ability to resolve new physics effects is, largely, limited by the precision with which we calculate. The calculation of observables in the Standard (or a new physics) Model requires knowledge of associated hadronic contributions. The precision of such calculations, and therefore our ability to leverage experiment, is typically limited by hadronic uncertainties. The only first-principles method for calculating the nonperturbative, hadronic contributions is lattice QCD. Modern lattice calculations have controlled errors, are systematically improvable, and in some cases, are pushing the sub-percent level of precision. I outline the role played by, highlight state of the art efforts in, and discuss possible future directions of lattice calculations in flavor physics.
Lattice QCD for nuclear physics
Meyer, Harvey
2015-01-01
With ever increasing computational resources and improvements in algorithms, new opportunities are emerging for lattice gauge theory to address key questions in strongly interacting systems, such as nuclear matter. Calculations today use dynamical gauge-field ensembles with degenerate light up/down quarks and the strange quark and it is possible now to consider including charm-quark degrees of freedom in the QCD vacuum. Pion masses and other sources of systematic error, such as finite-volume and discretization effects, are beginning to be quantified systematically. Altogether, an era of precision calculation has begun, and many new observables will be calculated at the new computational facilities. The aim of this set of lectures is to provide graduate students with a grounding in the application of lattice gauge theory methods to strongly interacting systems, and in particular to nuclear physics. A wide variety of topics are covered, including continuum field theory, lattice discretizations, hadron spect...
A dynamically adaptive lattice Boltzmann method for thermal convection problems
Directory of Open Access Journals (Sweden)
Feldhusen Kai
2016-12-01
Utilizing the Boussinesq approximation, a double-population incompressible thermal lattice Boltzmann method (LBM) for forced and natural convection in two and three space dimensions is developed and validated. A block-structured dynamic adaptive mesh refinement (AMR) procedure tailored for the LBM is applied to enable computationally efficient simulations of moderate to high Rayleigh number flows, which are characterized by a large scale disparity between boundary layers and free stream flow. As test cases, the analytically accessible problem of a two-dimensional (2D) forced convection flow through two porous plates and the non-Cartesian configuration of a heated rotating cylinder are considered. The objective of the latter is to advance the boundary conditions for an accurate treatment of curved boundaries and to demonstrate the effect on the solution. The effectiveness of the overall approach is demonstrated for the natural convection benchmark of a 2D cavity with differentially heated walls at Rayleigh numbers from 10^3 up to 10^8. To demonstrate the benefit of the employed AMR procedure for three-dimensional (3D) problems, results from the natural convection in a cubic cavity at Rayleigh numbers from 10^3 up to 10^5 are compared with benchmark results.
Nucleon structure from lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Dinter, Simon
2012-11-13
In this thesis we compute within lattice QCD observables related to the structure of the nucleon. One part of this thesis is concerned with moments of parton distribution functions (PDFs). Those moments are essential elements for the understanding of nucleon structure and can be extracted from a global analysis of deep inelastic scattering experiments. On the theoretical side they can be computed non-perturbatively by means of lattice QCD. However, ever since lattice calculations of moments of PDFs became available, there has been a tension between these lattice calculations and the results from a global analysis of experimental data. We examine whether systematic effects are responsible for this tension, and study the effects of excited states particularly intensively by means of a dedicated high-precision computation. Moreover, we carry out a first computation with four dynamical flavors. Another aspect of this thesis is a feasibility study of a lattice QCD computation of the scalar quark content of the nucleon, which is an important element in the cross-section of a heavy particle with the nucleon mediated by a scalar particle (e.g. a Higgs particle) and can therefore have an impact on Dark Matter searches. Existing lattice QCD calculations of this quantity usually have a large error and thus a low significance for phenomenological applications. We use a variance-reduction technique for quark-disconnected diagrams to obtain a precise result. Furthermore, we introduce a new stochastic method for the calculation of connected 3-point correlation functions, which are needed to compute nucleon structure observables, as an alternative to the usual sequential propagator method. In an explorative study we check whether this new method is competitive with the standard one. We use Wilson twisted mass fermions at maximal twist in all our calculations, such that all observables considered here have only O(a{sup 2}) discretization effects.
Nuclear Physics from Lattice QCD
Energy Technology Data Exchange (ETDEWEB)
William Detmold, Silas Beane, Konstantinos Orginos, Martin Savage
2011-01-01
We review recent progress toward establishing lattice Quantum Chromodynamics as a predictive calculational framework for nuclear physics. A survey of the current techniques that are used to extract low-energy hadronic scattering amplitudes and interactions is followed by a review of recent two-body and few-body calculations by the NPLQCD collaboration and others. An outline of the nuclear physics that is expected to be accomplished with Lattice QCD in the next decade, along with estimates of the required computational resources, is presented.
Chiral Fermions on the Lattice
Bietenholz, Wolfgang
2010-01-01
In the last century the non-perturbative regularization of chiral fermions was a long-standing problem. We review how this problem was finally overcome by the formulation of a modified but exact form of chiral symmetry on the lattice. This also provides a sound definition of the topological charge of lattice gauge configurations. We illustrate a variety of applications to QCD in the p-, the epsilon- and the delta-regime, where simulation results can now be related to Random Matrix Theory and Chiral Perturbation Theory. The latter contains Low Energy Constants as free parameters, and we comment on their evaluation from first principles of QCD.
Lattices, graphs, and Conway mutation
Greene, Joshua Evan
2011-01-01
The d-invariant of an integral, positive definite lattice L records the minimal norm of a characteristic covector in each equivalence class mod 2L. We prove that the 2-isomorphism type of a connected graph is determined by the d-invariant of its lattice of integral cuts (or flows). As an application, we prove that a reduced, alternating link diagram is determined up to mutation by the Heegaard Floer homology of the link's branched double-cover. Thus, alternating links with homeomorphic branched double-covers are mutants.
Graphene on graphene antidot lattices
DEFF Research Database (Denmark)
Gregersen, Søren Schou; Pedersen, Jesper Goor; Power, Stephen
2015-01-01
Graphene bilayer systems are known to exhibit a band gap when the layer symmetry is broken by applying a perpendicular electric field. The resulting band structure resembles that of a conventional semiconductor with a parabolic dispersion. Here, we introduce a bilayer graphene heterostructure......, where single-layer graphene is placed on top of another layer of graphene with a regular lattice of antidots. We dub this class of graphene systems GOAL: graphene on graphene antidot lattice. By varying the structure geometry, band-structure engineering can be performed to obtain linearly dispersing...
Unconventional superconductivity in honeycomb lattice
Directory of Open Access Journals (Sweden)
P Sahebsara
2013-03-01
The possibility of symmetrical s-wave superconductivity in the honeycomb lattice is studied within a strongly correlated regime, using the Hubbard model. The superconducting order parameter is defined by introducing the Green function, which is obtained by calculating the density of the electrons. This study shows that the superconducting order parameter appears in the doping interval between 0 and 0.5, and that x=0.25 is the optimum doping for s-wave superconductivity in the honeycomb lattice.
Building with Benchmarks: The Role of the District in Philadelphia's Benchmark Assessment System
Bulkley, Katrina E.; Christman, Jolley Bruce; Goertz, Margaret E.; Lawrence, Nancy R.
2010-01-01
In recent years, interim assessments have become an increasingly popular tool in districts seeking to improve student learning and achievement. Philadelphia has been at the forefront of this change, implementing a set of Benchmark assessments aligned with its Core Curriculum district-wide in 2004. In this article, we examine the overall context…
Development of a California commercial building benchmarking database
Energy Technology Data Exchange (ETDEWEB)
Kinney, Satkartar; Piette, Mary Ann
2002-05-17
Building energy benchmarking is a useful starting point for commercial building owners and operators to target energy savings opportunities. There are a number of tools and methods for benchmarking energy use. Benchmarking based on regional data can provide more relevant information for California buildings than national tools such as Energy Star. This paper discusses issues related to benchmarking commercial building energy use and the development of Cal-Arch, a building energy benchmarking database for California. Currently Cal-Arch uses existing survey data from California's Commercial End Use Survey (CEUS), a largely underutilized wealth of information collected by California's major utilities. DOE's Commercial Building Energy Consumption Survey (CBECS) is used by a similar tool, Arch, and by a number of other benchmarking tools. Future versions of Arch/Cal-Arch will utilize additional data sources, including modeled data and individual buildings, to expand the database.
Transparency benchmarking on audio watermarks and steganography
Kraetzer, Christian; Dittmann, Jana; Lang, Andreas
2006-02-01
The evaluation of transparency plays an important role in the context of watermarking and steganography algorithms. This paper introduces a general definition of the term transparency in the context of steganography, digital watermarking and attack-based evaluation of digital watermarking algorithms. For this purpose the term transparency is first considered individually for each of the three application fields (steganography, digital watermarking and watermarking algorithm evaluation). From the three results a general definition for the overall context is derived in a second step. The relevance and applicability of the definition given is evaluated in practice using existing audio watermarking and steganography algorithms (which work in the time, frequency and wavelet domains) as well as an attack-based evaluation suite for audio watermarking benchmarking - StirMark for Audio (SMBA). For this purpose selected attacks from the SMBA suite are modified by adding transparency-enhancing measures using a psychoacoustic model. The transparency and robustness of the evaluated audio watermarking algorithms under the original and modified attacks are compared. The results of this paper show that transparency benchmarking will lead to new information regarding the algorithms under observation and their usage. This information can result in concrete recommendations for modification, like the ones resulting from the tests performed here.
Simple mathematical law benchmarks human confrontations
Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto
2013-12-01
Many high-profile societal problems involve an individual or group repeatedly attacking another - from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a 'lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and, more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds.
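Laws of this kind are typically power-law distributions of event severity. The sketch below is a generic maximum-likelihood power-law fit on synthetic data, assuming an exponent of 2.5 for illustration; it is not the paper's model or datasets:

```python
import math
import random

# Generic sketch: generate severities from P(s) ~ s**(-alpha) for
# s >= s_min via inverse-CDF sampling, then recover alpha with the
# standard continuous maximum-likelihood estimator
#   alpha_hat = 1 + n / sum(ln(s_i / s_min)).
# The exponent 2.5 is an illustrative assumption.
def sample_power_law(alpha, s_min, n, rng):
    """Draw n severities from a continuous power law."""
    return [s_min * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
            for _ in range(n)]

def mle_alpha(samples, s_min):
    """Maximum-likelihood estimate of the power-law exponent."""
    n = len(samples)
    return 1.0 + n / sum(math.log(s / s_min) for s in samples)

rng = random.Random(42)  # fixed seed for reproducibility
data = sample_power_law(alpha=2.5, s_min=1.0, n=50_000, rng=rng)
alpha_hat = mle_alpha(data, s_min=1.0)
```

With 50,000 samples the estimator's standard error is roughly (alpha - 1)/sqrt(n), so the recovered exponent lands very close to the assumed 2.5.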
BENCHMARKING LEARNER EDUCATION USING ONLINE BUSINESS SIMULATION
Directory of Open Access Journals (Sweden)
Alfred H. Miller
2016-06-01
For programmatic accreditation by the Accreditation Council of Business Schools and Programs (ACBSP), business programs are required to meet STANDARD #4, Measurement and Analysis of Student Learning and Performance. Business units must demonstrate that outcome assessment systems are in place, using documented evidence that shows how the results are being used to further develop or improve the academic business program. The Higher Colleges of Technology, a 17-campus federal university in the United Arab Emirates, differentiates its applied degree programs through a 'learning by doing ethos,' which permeates the entire curricula. This paper documents benchmarking of education for managing innovation. Using a business simulation for Bachelor of Business Year 3 learners in a business strategy class, learners explored in a simulated environment the following functional areas: research and development, production, and marketing of a technology product. Student teams were required to use finite resources and compete against other student teams in the same universe. The study employed an instrument developed in a 60-sample pilot study of business simulation learners, against which subsequent learners participating in online business simulation could be benchmarked. The results showed incremental improvement in the program due to changes made in assessment strategies, including the oral defense.
Baseline and benchmark model development for hotels
Hooks, Edward T., Jr.
The hotel industry currently faces rising energy costs and requires the tools to maximize energy efficiency. In order to achieve this goal, a clear definition of the current methods used to measure and monitor energy consumption is made. The main purpose is to uncover the limitations of the most commonly practiced analysis strategies and to present methods that can potentially overcome those limitations. The techniques presented can be used for measurement and verification of energy efficiency plans and retrofits. Also, modern energy modeling tools are introduced to demonstrate how they can be utilized for benchmarking and baseline models. This provides the ability to obtain energy-saving recommendations and parametric analysis to explore energy savings potential. These same energy models can be used in design decisions for new construction. An energy model is created of a resort-style hotel that is over one million square feet and has over one thousand rooms. A simulation and detailed analysis is performed on a hotel room. The planning process for creating the model and acquiring data from the hotel room to calibrate and verify the simulation is explained, as is how this type of modeling can potentially be beneficial for future baseline and benchmarking strategies for the hotel industry. Ultimately, the conclusion addresses some common obstacles the hotel industry faces in reaching its full potential of energy efficiency and how these techniques can best serve it.
Benchmarking database performance for genomic data.
Khushi, Matloob
2015-06-01
Genomic regions represent features such as gene annotations, transcription factor binding sites and epigenetic modifications. Performing various genomic operations, such as identifying overlapping/non-overlapping regions or nearest gene annotations, is a common research need. The data can be saved in a database system for easy management; however, no database currently provides a comprehensive built-in algorithm to identify overlapping regions. Therefore I have developed a novel region-mapping (RegMap) SQL-based algorithm to perform genomic operations and have benchmarked the performance of different databases. Benchmarking identified that PostgreSQL extracts overlapping regions much faster than MySQL. Insertion and data uploads in PostgreSQL were also better, although the general searching capability of both databases was almost equivalent. In addition, using the algorithm, pair-wise overlaps of >1000 datasets of transcription factor binding sites and histone marks, collected from previous publications, were reported, and it was found that HNF4G significantly co-locates with cohesin subunit STAG1 (SA1).
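The record above does not give the RegMap algorithm itself; as a minimal illustration of the kind of SQL region-overlap join it benchmarks, the sketch below runs against SQLite from the Python standard library (the paper benchmarks PostgreSQL and MySQL). All table names, column names, and coordinates are invented for this example.

```python
import sqlite3

# Toy region-overlap query in the spirit of RegMap. Two half-open
# regions [s1, e1) and [s2, e2) overlap iff s1 < e2 AND s2 < e1.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE tf_sites (chrom TEXT, start INT, stop INT);
    CREATE TABLE genes    (chrom TEXT, start INT, stop INT, name TEXT);
""")
con.executemany("INSERT INTO tf_sites VALUES (?, ?, ?)",
                [("chr1", 100, 200), ("chr1", 950, 1000), ("chr2", 5, 50)])
con.executemany("INSERT INTO genes VALUES (?, ?, ?, ?)",
                [("chr1", 150, 400, "GENE_A"), ("chr2", 60, 90, "GENE_B")])

overlaps = con.execute("""
    SELECT t.chrom, t.start, t.stop, g.name
    FROM tf_sites AS t JOIN genes AS g
      ON t.chrom = g.chrom
     AND t.start < g.stop AND g.start < t.stop
""").fetchall()
print(overlaps)  # only the chr1 site at 100-200 overlaps GENE_A
```

Production engines add interval indexes (e.g. GiST in PostgreSQL) on top of this predicate, which is where the performance differences measured in the paper come from.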
EVA Health and Human Performance Benchmarking Study
Abercromby, A. F.; Norcross, J.; Jarvis, S. L.
2016-01-01
Multiple HRP Risks and Gaps require detailed characterization of human health and performance during exploration extravehicular activity (EVA) tasks; however, a rigorous and comprehensive methodology for characterizing and comparing the health and human performance implications of current and future EVA spacesuit designs does not exist. This study will identify and implement functional tasks and metrics, both objective and subjective, that are relevant to health and human performance, such as metabolic expenditure, suit fit, discomfort, suited postural stability, cognitive performance, and potentially biochemical responses, for humans working inside different EVA suits doing functional tasks under the appropriate simulated reduced-gravity environments. This study will provide health and human performance benchmark data for humans working in current EVA suits (EMU, Mark III, and Z2) as well as in shirtsleeves, using a standard set of tasks and metrics with quantified reliability. Results and methodologies developed during this test will provide benchmark data against which future EVA suits and different suit configurations (e.g., varied pressure, mass, CG) may be reliably compared in subsequent tests. Results will also inform fitness-for-duty standards as well as design requirements and operations concepts for future EVA suits and other exploration systems.
Multisensor benchmark data for riot control
Jäger, Uwe; Höpken, Marc; Dürr, Bernhard; Metzler, Jürgen; Willersinn, Dieter
2008-10-01
Quick and precise response is essential for riot squads when coping with escalating violence in crowds. Often it is just a single person, known as the leader of the gang, who instigates other people and thus is responsible for excesses. Putting this single person out of action in most cases leads to a de-escalation of the situation. Fostering de-escalation is one of the main tasks of crowd and riot control. To do so, extensive situation awareness is mandatory for the squads and can be promoted by technical means such as video surveillance using sensor networks. To develop software tools for situation awareness, appropriate input data of well-known quality are needed. Furthermore, the developer must be able to measure algorithm performance and ongoing improvements. Last but not least, after algorithm development has finished and marketing aspects emerge, compliance with specifications must be proven. This paper describes a multisensor benchmark which serves exactly this purpose. We first define the underlying algorithm task. Then we explain details about data acquisition and sensor setup, and finally we give some insight into quality measures of multisensor data. Currently, the multisensor benchmark described in this paper is applied to the development of basic algorithms for situational awareness, e.g. tracking of individuals in a crowd.
Remarks on a benchmark nonlinear constrained optimization problem
Institute of Scientific and Technical Information of China (English)
Luo Yazhong; Lei Yongjun; Tang Guojin
2006-01-01
Remarks on a benchmark nonlinear constrained optimization problem are made. Due to a citation error, two absolutely different results for the benchmark problem have been obtained by independent researchers. Parallel simulated annealing using the simplex method is employed in our study to solve the benchmark nonlinear constrained problem with the mistaken formula, and the best-known solution is obtained; its optimality is verified via the Kuhn-Tucker conditions.
DEVELOPMENT OF A MARKET BENCHMARK PRICE FOR AGMAS PERFORMANCE EVALUATIONS
Good, Darrel L.; Irwin, Scott H.; Jackson, Thomas E.
1998-01-01
The purpose of this research report is to identify the appropriate market benchmark price to use to evaluate the pricing performance of market advisory services that are included in the annual AgMAS pricing performance evaluations. Five desirable properties of market benchmark prices are identified. Three potential specifications of the market benchmark price are considered: the average price received by Illinois farmers, the harvest cash price, and the average cash price over a two-year crop...
42 CFR 422.258 - Calculation of benchmarks.
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Calculation of benchmarks. 422.258 Section 422.258... and Plan Approval § 422.258 Calculation of benchmarks. (a) The term “MA area-specific non-drug monthly benchmark amount” means, for a month in a year: (1) For MA local plans with service areas entirely within...
Hospital Energy Benchmarking Guidance - Version 1.0
Energy Technology Data Exchange (ETDEWEB)
Singer, Brett C.
2009-09-08
This document describes an energy benchmarking framework for hospitals. The document is organized as follows. The introduction provides a brief primer on benchmarking and its application to hospitals. The next two sections discuss special considerations including the identification of normalizing factors. The presentation of metrics is preceded by a description of the overall framework and the rationale for the grouping of metrics. Following the presentation of metrics, a high-level protocol is provided. The next section presents draft benchmarks for some metrics; benchmarks are not available for many metrics owing to a lack of data. This document ends with a list of research needs for further development.
A Benchmark for Evaluating Moving Object Indexes
DEFF Research Database (Denmark)
Chen, Su; Jensen, Christian Søndergaard; Lin, Dan
2008-01-01
This paper proposes a benchmark that targets techniques for the indexing of the current and near-future positions of moving objects. This benchmark enables the comparison of existing and future indexing techniques. It covers important aspects of such indexes that have not previously been covered by any benchmark; notable aspects covered include update efficiency, query efficiency, concurrency control, and storage requirements. Next, the paper applies the benchmark to half a dozen notable moving-object indexes, thus demonstrating the viability of the benchmark and offering new insight into the performance properties of the indexes.
Benchmarking Central Banks in Latin America, 1990-2010
National Research Council Canada - National Science Library
Germán Alarco Tosoni
2013-01-01
This benchmarking exercise analyzes the effectiveness of central banks in Latin America between 1990 and 2010, considering the monetary authority's primary and secondary functions in the countries...
Jacquin, Hugo; Shakhnovich, Eugene; Cocco, Simona; Monasson, Rémi
2016-01-01
Inverse statistical approaches to determine protein structure and function from Multiple Sequence Alignments (MSA) are emerging as powerful tools in computational biology. However, the underlying assumptions of the relationship between the inferred effective Potts Hamiltonian and real protein structure and energetics remain untested so far. Here we use the lattice protein (LP) model to benchmark those inverse statistical approaches. We build MSA of highly stable sequences in target LP structures, and infer the effective pairwise Potts Hamiltonians from those MSA. We find that inferred Potts Hamiltonians reproduce many important aspects of 'true' LP structures and energetics. Careful analysis reveals that effective pairwise couplings in inferred Potts Hamiltonians depend not only on the energetics of the native structure but also on competing folds; in particular, the coupling values reflect both positive design (stabilization of the native conformation) and negative design (destabilization of competing folds). In addi...
Hamiltonian monodromy as lattice defect
Zhilinskii, B.
2003-01-01
The analogy between monodromy in dynamical (Hamiltonian) systems and defects in crystal lattices is used in order to formulate some general conjectures about possible types of qualitative features of quantum systems which can be interpreted as a manifestation of classical monodromy in quantum finite particle (molecular) problems.
Triangles in a Lattice Parabola.
Sastry, K. R. S.
1991-01-01
Discussed are properties possessed by polygons inscribed in the lattice parabola y = x^2, including the area of a triangle, triangles of minimum area, conditions for right triangles, triangles whose area is the cube of an integer, and implications of Pick's Theorem. Further directions to pursue are suggested. (MDH)
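Two facts touched on in this summary can be checked numerically: the area of a triangle with vertices (a, a^2), (b, b^2), (c, c^2) on the lattice parabola y = x^2 is |(a-b)(b-c)(c-a)|/2, and Pick's Theorem A = I + B/2 - 1 relates that area to interior (I) and boundary (B) lattice points. The sketch below is my own illustration, not taken from the article.

```python
from math import gcd

def area2(p, q, r):
    # twice the signed area of triangle pqr (shoelace formula)
    return (q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])

# triangle with vertices on the lattice parabola y = x^2
V = [(0, 0), (1, 1), (2, 4)]
A = abs(area2(*V)) / 2          # equals |(0-1)(1-2)(2-0)|/2 = 1

# boundary lattice points: each edge contributes gcd(|dx|, |dy|)
B = sum(gcd(abs(V[i][0] - V[(i + 1) % 3][0]),
            abs(V[i][1] - V[(i + 1) % 3][1])) for i in range(3))

def strictly_inside(pt):
    # a point is strictly inside iff all three cross products share a sign
    s = [area2(V[i], V[(i + 1) % 3], pt) for i in range(3)]
    return all(x > 0 for x in s) or all(x < 0 for x in s)

# brute-force count of interior lattice points over the bounding box
I = sum(strictly_inside((x, y)) for x in range(0, 3) for y in range(0, 5))

assert A == I + B / 2 - 1       # Pick's Theorem holds
print(A, I, B)
```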
Nucleon structure using lattice QCD
Energy Technology Data Exchange (ETDEWEB)
Alexandrou, C.; Kallidonis, C. [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; The Cyprus Institute, Nicosia (Cyprus). Computational-Based Science and technology Research Center; Constantinou, M.; Hatziyiannakou, K. [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Drach, V. [DESY Zeuthen (Germany). John von Neumann-Institut fuer Computing NIC; Jansen, K. [DESY Zeuthen (Germany). John von Neumann-Institut fuer Computing NIC; Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Koutsou, G.; Vaquero, A. [The Cyprus Institute, Nicosia (Cyprus). Computational-Based Science and technology Research Center; Leontiou, T. [Frederick Univ, Nicosia (Cyprus). General Dept.
2013-03-15
A review of recent nucleon structure calculations within lattice QCD is presented. The nucleon excited states, the axial charge, the isovector momentum fraction and helicity distribution are discussed, assessing the methods applied for their study, including approaches to evaluate the disconnected contributions. Results on the spin carried by the quarks in the nucleon are also presented.
Anisotropic dissipation in lattice metamaterials
Directory of Open Access Journals (Sweden)
Dimitri Krattiger
2016-12-01
Full Text Available Plane wave propagation in an elastic lattice material follows regular patterns as dictated by the nature of the lattice symmetry and the mechanical configuration of the unit cell. A unique feature pertains to the loss of elastodynamic isotropy at frequencies where the wavelength is on the order of the lattice spacing or shorter. Anisotropy may also be realized at lower frequencies with the inclusion of local resonators, especially when designed to exhibit directionally non-uniform connectivity and/or cross-sectional geometry. In this paper, we consider free and driven waves within a plate-like lattice−with and without local resonators−and examine the effects of damping on the isofrequency dispersion curves. We also examine, for free waves, the effects of damping on the frequency-dependent anisotropy of dissipation. Furthermore, we investigate the possibility of engineering the dissipation anisotropy by tuning the directional properties of the prescribed damping. The results demonstrate that uniformly applied damping tends to reduce the intensity of anisotropy in the isofrequency dispersion curves. On the other hand, lattice crystals and metamaterials are shown to provide an excellent platform for direction-dependent dissipation engineering which may be realized by simple changes in the spatial distribution of the damping elements.
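The paper above works with 2D plate-like lattices; the simplest setting where the same ideas appear is the 1D monatomic mass-spring chain with uniform viscous damping, m u''_n + c u'_n + K(2u_n - u_{n-1} - u_{n+1}) = 0. Bloch analysis gives the undamped dispersion omega(k) = 2*sqrt(K/m)*|sin(ka/2)|, and even spatially uniform damping yields a wavenumber-dependent modal damping ratio zeta(k) = c/(2 m omega(k)). This toy model and its parameter values are assumptions for illustration, not from the article.

```python
from math import sin, sqrt, pi

# toy 1D damped monatomic chain (illustrative parameters, lattice units)
m, K, c, a = 1.0, 1.0, 0.05, 1.0

def omega(k):
    # undamped Bloch dispersion relation of the monatomic chain
    return 2.0 * sqrt(K / m) * abs(sin(k * a / 2.0))

def zeta(k):
    # modal damping ratio: dissipation varies with wavenumber even
    # though the damping elements are spatially uniform
    return c / (2.0 * m * omega(k))

for j in range(1, 11):                    # sample up to the zone edge k = pi/a
    k = pi * j / (10 * a)
    print(f"k = {k:.3f}  omega = {omega(k):.4f}  zeta = {zeta(k):.4f}")
```

Long waves (small k) are damped relatively more strongly than zone-edge waves, a 1D analogue of the frequency-dependent dissipation the paper engineers directionally in 2D.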
Lattice investigation of tetraquark candidates
Energy Technology Data Exchange (ETDEWEB)
Berlin, Joshua; Wagner, Marc [Goethe-Universitaet Frankfurt am Main, Institut fuer Theoretische Physik (Germany); Abdel-Rehim, Abdou; Alexandrou, Constantia; Gravina, Mario; Koutsou, Giannis [Department of Physics, University of Cyprus, Nicosia (Cyprus); Computation-based Science and Technology Research Center, Cyprus Institute, Nicosia (Cyprus); Dalla Brida, Mattia [School of Mathematics, Trinity College Dublin (Ireland)
2014-07-01
We present the status of an ongoing long-term lattice QCD project concerned with the study of light and heavy tetraquark candidates, using a variety of different creation operators. The computation of disconnected diagrams, which is technically challenging, is discussed in detail.
Decompressive craniectomy with lattice duraplasty.
Mitchell, P; Tseng, M; Mendelow, A D
2004-02-01
A method of opening dura for decompressive craniectomies is described. Numerous cuts intersecting in a lattice pattern allow the dura to expand in a gradual and controlled manner minimising the chances of cortical laceration or venous kinking on the craniectomy edge.
From lattice gases to polymers
Frenkel, D.
1990-01-01
The modification of a technique that was developed to study time correlations in lattice-gas cellular automata to facilitate the numerical simulation of chain molecules is described. As an example, the calculation of the excess chemical potential of an ideal polymer in a dense colloidal
Subwavelength vortical plasmonic lattice solitons.
Ye, Fangwei; Mihalache, Dumitru; Hu, Bambi; Panoiu, Nicolae C
2011-04-01
We present a theoretical study of vortical plasmonic lattice solitons, which form in two-dimensional arrays of metallic nanowires embedded into nonlinear media with both focusing and defocusing Kerr nonlinearities. Their existence, stability, and subwavelength spatial confinement are investigated in detail.
Beane, Silas
2009-01-01
Recent studies by the NPLQCD collaboration of hadronic interactions using lattice QCD are reviewed, with an emphasis on a recent calculation of meson-baryon scattering lengths. Ongoing high-statistics calculations of baryon interactions are also reviewed. In particular, new insights into the signal/noise problems that plague correlation functions involving baryons are discussed.
Hybrid Charmonium from Lattice QCD
Luo, X Q
2006-01-01
We review our recent results on the J^PC = 0^-- exotic hybrid charmonium mass and the J^PC = 0^-+, 1^-- and 1^++ nonexotic hybrid charmonium spectrum from anisotropic improved lattice QCD, and discuss the relevance to the recent discovery of the Y(4260) state and future experimental searches for other states.
Chiral fermions on the lattice
Jahn, O; Jahn, Oliver; Pawlowski, Jan M.
2002-01-01
We discuss topological obstructions to putting chiral fermions on an even dimensional lattice. The setting includes Ginsparg-Wilson fermions, but is more general. We prove a theorem which relates the total chirality to the difference of generalised winding numbers of chiral projection operators. For an odd number of Weyl fermions this implies that particles and anti-particles live in topologically different spaces.
High order spectral difference lattice Boltzmann method for incompressible hydrodynamics
Li, Weidong
2017-09-01
This work presents a lattice Boltzmann equation (LBE) based high-order spectral difference method for incompressible flows. In the present method, the spectral difference (SD) method is adopted to discretize the convection and collision terms of the LBE to obtain high-order (≥3) accuracy. Because the SD scheme represents the solution as cell-local polynomials and the solution polynomials have a good tensor-product property, the present spectral difference lattice Boltzmann method (SD-LBM) can be implemented on arbitrary unstructured quadrilateral meshes for effective and efficient treatment of complex geometries. Because only first-order PDEs are involved in the LBE, no special techniques, such as the hybridizable discontinuous Galerkin (HDG) or local discontinuous Galerkin (LDG) methods, are needed to discretize a diffusion term, which simplifies the algorithm and implementation of the high-order spectral difference method for simulating viscous flows. The proposed SD-LBM is validated with four incompressible flow benchmarks in two dimensions: (a) the Poiseuille flow driven by a constant body force; (b) the lid-driven cavity flow without singularity at the two top corners (Burggraf flow); (c) the unsteady Taylor-Green vortex flow; and (d) the Blasius boundary-layer flow past a flat plate. Computational results are compared with analytical solutions of these cases, and convergence studies are also given. The designed accuracy of the proposed SD-LBM is clearly verified.
Lattice Boltzmann methods for global linear instability analysis
Pérez, José Miguel; Aguilar, Alfonso; Theofilis, Vassilis
2016-11-01
Modal global linear instability analysis is performed using, for the first time ever, the lattice Boltzmann method (LBM) to analyze incompressible flows with two and three inhomogeneous spatial directions. Four linearization models have been implemented in order to recover the linearized Navier-Stokes equations in the incompressible limit. Two of those models employ the single relaxation time and have been proposed previously in the literature as linearization of the collision operator of the lattice Boltzmann equation. Two additional models are derived herein for the first time by linearizing the local equilibrium probability distribution function. Instability analysis results are obtained in three benchmark problems, two in closed geometries and one in open flow, namely the square and cubic lid-driven cavity flow and flow in the wake of the circular cylinder. Comparisons with results delivered by classic spectral element methods verify the accuracy of the proposed new methodologies and point potential limitations particular to the LBM approach. The known issue of appearance of numerical instabilities when the SRT model is used in direct numerical simulations employing the LBM is shown to be reflected in a spurious global eigenmode when the SRT model is used in the instability analysis. Although this mode is absent in the multiple relaxation times model, other spurious instabilities can also arise and are documented herein. Areas of potential improvements in order to make the proposed methodology competitive with established approaches for global instability analysis are discussed.
Orbital optical lattices with bosons
Kock, T.; Hippler, C.; Ewerbeck, A.; Hemmerich, A.
2016-02-01
This article provides a synopsis of our recent experimental work exploring Bose-Einstein condensation in metastable higher Bloch bands of optical lattices. Bipartite lattice geometries have allowed us to implement appropriate band structures, which meet three basic requirements: the existence of metastable excited states sufficiently protected from collisional band relaxation, a mechanism to excite the atoms initially prepared in the lowest band with moderate entropy increase, and the possibility of cross-dimensional tunneling dynamics, necessary to establish coherence along all lattice axes. A variety of bands can be selectively populated and a subsequent thermalization process leads to the formation of a condensate in the lowest energy state of the chosen band. As examples the 2nd, 4th and 7th bands in a bipartite square lattice are discussed. The geometry of the 2nd and 7th bands can be tuned such that two inequivalent energetically degenerate energy minima arise at the X ±-points at the edge of the 1st Brillouin zone. In this case even a small interaction energy is sufficient to lock the phase between the two condensation points such that a complex-valued chiral superfluid order parameter can emerge, which breaks time reversal symmetry. In the 4th band a condensate can be formed at the Γ-point in the center of the 1st Brillouin zone, which can be used to explore topologically protected band touching points. The new techniques to access orbital degrees of freedom in higher bands greatly extend the class of many-body scenarios that can be explored with bosons in optical lattices.
Hadron Structure and Spectrum from the Lattice
Lang, C B
2015-01-01
Lattice calculations for hadrons are now entering the domain of resonances and scattering, necessitating a better understanding of the observed discrete energy spectrum. This is a reviewing survey about recent lattice QCD results, with some emphasis on spectrum and scattering.
Gauge Fixing on the Lattice without Ambiguity
Vink, Jeroen C; 10.1016/0370-2693(92)91372-G
2009-01-01
A new gauge fixing condition is discussed, which is (lattice) rotation invariant, has the `smoothness' properties of the Landau gauge but can be efficiently computed and is unambiguous for almost all lattice gauge field configurations.
Turbo Lattices: Construction and Performance Analysis
Sakzad, Amin; Panario, Daniel
2011-01-01
In this paper a new class of lattices called turbo lattices is introduced and established. We use the lattice Construction D to produce turbo lattices. This method needs a set of nested linear codes as its underlying structure. We benefit from turbo codes as our basis codes. Therefore, a set of nested turbo codes based on nested interleavers and nested convolutional codes is built. To this end, we employ both tail-biting and zero-tail convolutional codes. Using these codes, along with Construction D, turbo lattices are created. Several properties of Construction D lattices and fundamental characteristics of turbo lattices, including the minimum distance, coding gain, kissing number and an upper bound on the probability of error under a maximum likelihood decoder over the AWGN channel, are investigated. Furthermore, a multi-stage turbo lattice decoding algorithm based on the iterative turbo decoding algorithm is given. Finally, simulation experiments provide strong agreement with our theoretical results. More prec...
Chiral Four-Dimensional Heterotic Covariant Lattices
Beye, Florian
2014-01-01
In the covariant lattice formalism, chiral four-dimensional heterotic string vacua are obtained from certain even self-dual lattices which completely decompose into a left-mover and a right-mover lattice. The main purpose of this work is to classify all right-mover lattices that can appear in such a chiral model, and to study the corresponding left-mover lattices using the theory of lattice genera. In particular, the Smith-Minkowski-Siegel mass formula is employed to calculate a lower bound on the number of left-mover lattices. Also, the known relationship between asymmetric orbifolds and covariant lattices is considered in the context of our classification.
Energy Technology Data Exchange (ETDEWEB)
Erradi, L.; Htet, A.; Chakir, E
2001-06-01
This paper presents the analysis of discrepancies between calculation and experiment in the prediction of the isothermal moderator temperature coefficient for a set of lattice experiments, using the WIMSD4, WIMSD5B and APOLLO1 lattice codes. In this analysis we have used the original cross-section libraries (the CEA-86 library for APOLLO1, and both the 1981 and 1986 libraries for WIMS) and updated ones based on JEF2.2 data for the APOLLO code and on both ENDF/B6 and JEF2.2 data for the WIMS code. We have also analysed the numerical benchmark proposed by Mosteller to evaluate the accuracy in predicting the Doppler coefficient in light-water-type lattices. This study of the Doppler coefficient was performed using, in addition to the APOLLO and WIMS codes, the Monte Carlo code MCNP4B, for which a new library based on the ENDF/B6 nuclear data file has been processed using the NJOY system.
Distributive lattice orderings and Priestley duality
Krebs, Michel
2007-01-01
The ordering relation of a bounded distributive lattice L is a (distributive) (0, 1)-sublattice of L × L. This construction gives rise to a functor Φ from the category of bounded distributive lattices to itself. We examine the interaction of Φ with Priestley duality and characterise those bounded distributive lattices L for which there is a bounded distributive lattice K such that Φ(K) is (isomorphic to) L.
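The opening claim of this abstract, that the order relation {(a, b) : a ≤ b} is itself a (0, 1)-sublattice of L × L, can be verified exhaustively on a small example. The sketch below uses the divisors of 12 under divisibility (meet = gcd, join = lcm), a bounded distributive lattice chosen purely for illustration.

```python
from math import gcd

# L = divisors of 12 ordered by divisibility; bounds are 1 and 12
L = [1, 2, 3, 4, 6, 12]
lcm = lambda a, b: a * b // gcd(a, b)

# Phi(L): the order relation, a <= b meaning a divides b
Phi = [(a, b) for a in L for b in L if b % a == 0]

# check closure under componentwise meet and join in L x L
closed = all((gcd(a, c), gcd(b, d)) in Phi and (lcm(a, c), lcm(b, d)) in Phi
             for (a, b) in Phi for (c, d) in Phi)

# check that the bounds (0, 0) = (1, 1) and (1, 1) = (12, 12) are present
has_bounds = (1, 1) in Phi and (12, 12) in Phi
print(closed, has_bounds)
```

Closure holds because a | b and c | d imply gcd(a, c) | gcd(b, d) and lcm(a, c) | lcm(b, d); the finite check simply confirms this instance.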
Rough Class on a Completely Distributive Lattice
Institute of Scientific and Technical Information of China (English)
陈德刚; 张文修; 宋士吉
2003-01-01
This paper generalizes the Pawlak rough set method to a completely distributive lattice. The concept of a rough set has many applications in data mining. The approximation operators on a completely distributive lattice are studied, the rough class on a completely distributive lattice is defined, and the expressional theorems of the rough class are proven. These expressional theorems are used to prove that the collection of all rough classes is an atomic completely distributive lattice.
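As a baseline for the generalization described above, the classical Pawlak lower and upper approximations over an equivalence partition can be sketched in a few lines; the universe, partition, and target set below are invented for illustration, and the lattice-valued operators of the paper are beyond this example.

```python
# Classical Pawlak rough-set approximations over a partition of a universe
U = set(range(10))
partition = [{0, 1}, {2, 3, 4}, {5}, {6, 7, 8, 9}]   # equivalence classes
X = {1, 2, 3, 4, 5, 6}                               # set to approximate

# lower approximation: union of classes entirely contained in X
lower = set().union(*[b for b in partition if b <= X])
# upper approximation: union of classes that meet X
upper = set().union(*[b for b in partition if b & X])

print(sorted(lower), sorted(upper))   # lower <= X <= upper always holds
```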
Lattice Boltzmann method for the fractional advection-diffusion equation.
Zhou, J G; Haygarth, P M; Withers, P J A; Macleod, C J A; Falloon, P D; Beven, K J; Ockenden, M C; Forber, K J; Hollaway, M J; Evans, R; Collins, A L; Hiscock, K M; Wearing, C; Kahana, R; Villamizar Velez, M L
2016-04-01
Mass transport, such as movement of phosphorus in soils and solutes in rivers, is a natural phenomenon and its study plays an important role in science and engineering. It is found that there are numerous practical diffusion phenomena that do not obey the classical advection-diffusion equation (ADE). Such diffusion is called abnormal or superdiffusion, and it is well described using a fractional advection-diffusion equation (FADE). The FADE finds a wide range of applications in various areas with great potential for studying complex mass transport in real hydrological systems. However, solution to the FADE is difficult, and the existing numerical methods are complicated and inefficient. In this study, a fresh lattice Boltzmann method is developed for solving the fractional advection-diffusion equation (LabFADE). The FADE is transformed into an equation similar to an advection-diffusion equation and solved using the lattice Boltzmann method. The LabFADE has all the advantages of the conventional lattice Boltzmann method and avoids a complex solution procedure, unlike other existing numerical methods. The method has been validated through simulations of several benchmark tests: a point-source diffusion, a boundary-value problem of steady diffusion, and an initial-boundary-value problem of unsteady diffusion with the coexistence of source and sink terms. In addition, by including the effects of the skewness β, the fractional order α, and the single relaxation time τ, the accuracy and convergence of the method have been assessed. The numerical predictions are compared with the analytical solutions, and they indicate that the method is second-order accurate. The method presented will allow the FADE to be more widely applied to complex mass transport problems in science and engineering.
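The LabFADE scheme itself is not reproduced in this record; the sketch below is only the conventional single-relaxation-time (BGK) lattice Boltzmann core that such methods build on, applied to classical (non-fractional) 1D diffusion on a D1Q2 lattice with a point-source initial condition. In lattice units (dx = dt = 1) the diffusion coefficient is D = c_s^2 (tau - 1/2) with c_s^2 = 1; all parameter values are illustrative.

```python
# Minimal BGK lattice Boltzmann solver for classical 1D diffusion (D1Q2).
N, tau, steps = 200, 1.0, 100
C = [0.0] * N
C[N // 2] = 1.0                                  # point-source initial condition
f = [[0.5 * c for c in C] for _ in range(2)]     # populations along e = +1, -1

for _ in range(steps):
    # collision: relax each population toward equilibrium f_i^eq = C / 2
    Cn = [f[0][j] + f[1][j] for j in range(N)]
    for i in range(2):
        f[i] = [fi - (fi - 0.5 * cj) / tau for fi, cj in zip(f[i], Cn)]
    # streaming: shift populations one node along their velocity (periodic)
    f[0] = f[0][-1:] + f[0][:-1]                 # e = +1
    f[1] = f[1][1:] + f[1][:1]                   # e = -1

C = [f[0][j] + f[1][j] for j in range(N)]
print(sum(C))                                    # total mass is conserved
```

Collisions conserve the nodal concentration and streaming merely permutes populations, so mass conservation holds by construction; the fractional scheme of the paper replaces the target equation while keeping this collide-and-stream structure.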
Lattice Boltzmann formulation for conjugate heat transfer in heterogeneous media.
Karani, Hamid; Huber, Christian
2015-02-01
In this paper, we propose an approach for studying conjugate heat transfer using the lattice Boltzmann method (LBM). The approach is based on reformulating the lattice Boltzmann equation for solving the conservative form of the energy equation. This leads to the appearance of a source term, which introduces the jump conditions at the interface between two phases or components with different thermal properties. The proposed source term formulation conserves conductive and advective heat flux simultaneously, which makes it suitable for modeling conjugate heat transfer in general multiphase or multicomponent systems. The simple implementation of the source term approach avoids any correction of distribution functions neighboring the interface and provides an algorithm that is independent of the topology of the interface. Moreover, our approach is independent of the choice of lattice discretization and can be easily applied to different advection-diffusion LBM solvers. The model is tested against several benchmark problems, including steady-state convection-diffusion within two fluid layers with parallel and normal interfaces with respect to the flow direction, unsteady conduction in a three-layer stratified domain, and steady conduction in a two-layer annulus. The LBM results are in excellent agreement with the analytical solutions. Error analysis shows that our model is first-order accurate in space, but an extension to a second-order scheme is straightforward. We apply our LBM model to heat transfer in a two-component heterogeneous medium with a random microstructure. This example highlights that the method we propose is independent of the topology of interfaces between the different phases and, as such, is ideally suited for complex natural heterogeneous media. We further validate the present LBM formulation with a study of natural convection in a porous enclosure. The results confirm the reliability of the model in simulating complex coupled fluid and thermal dynamics.
SCALE Modeling of Selected Neutronics Test Problems within the OECD UAM LWR’s Benchmark
Directory of Open Access Journals (Sweden)
Luigi Mercatali
2013-01-01
The OECD UAM Benchmark was launched in 2005 with the objective of determining the uncertainty in the simulation of Light Water Reactor (LWR) system calculations at all stages of the coupled reactor physics-thermal hydraulics modeling. Within the framework of the "Neutronics Phase" of the Benchmark, the solutions of some selected test cases at the cell physics and lattice physics levels are presented. The SCALE 6.1 code package has been used for the neutronics modeling of the selected exercises. Sensitivity and Uncertainty (S/U) analysis based on generalized perturbation theory has been performed in order to assess the uncertainty in the computation of selected reactor integral parameters due to the uncertainty in the basic nuclear data. As a general trend, it has been found that the main sources of uncertainty are the 238U (n,γ) capture cross section and the 239Pu nubar for the UOX- and MOX-fuelled test cases, respectively. Moreover, the reference solutions for the test cases obtained using Monte Carlo methodologies, together with a comparison between deterministic and stochastic solutions, are presented.
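S/U analyses of this kind typically propagate nuclear-data covariances through the first-order "sandwich rule": the relative variance of an integral response R is SᵀCS, with S the sensitivity profile and C the relative covariance matrix. The numbers below are made-up illustrative values, not evaluated nuclear data.

```python
import numpy as np

# Sensitivities of a response (e.g. k-eff) to three cross sections,
# in relative terms (dR/R per d(sigma)/sigma). Illustrative values only.
S = np.array([0.30, -0.10, 0.05])
# Relative covariance matrix of those cross sections (fractions squared).
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]]) * 1e-4
var_R = S @ C @ S                 # sandwich rule: relative variance of R
unc = float(np.sqrt(var_R))      # relative 1-sigma uncertainty (~0.6% here)
```

The same quadratic form underlies the generalized-perturbation-theory machinery in codes such as TSUNAMI within SCALE; only the provenance of S and C differs.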
Plasma Waves as a Benchmark Problem
Kilian, Patrick; Schreiner, Cedric; Spanier, Felix
2016-01-01
A large number of wave modes exist in a magnetized plasma. Their properties are determined by the interaction of particles and waves. In a simulation code, the correct treatment of field quantities and particle behavior is essential to correctly reproduce the wave properties. Consequently, plasma waves provide test problems that cover a large fraction of the simulation code. The large number of possible wave modes and the freedom to choose parameters make the selection of test problems time consuming and comparison between different codes difficult. This paper therefore aims to provide a selection of test problems, based on different wave modes and with well-defined parameter values, that is accessible to a large number of simulation codes to allow for easy benchmarking and cross validation. Example results are provided for a number of plasma models. For all plasma models and wave modes that are used in the test problems, a mathematical description is provided to clarify notation and avoid possible misunderstandings.
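A standard analytic reference for benchmarks of this kind is the Bohm-Gross dispersion relation for Langmuir waves, ω² = ωₚ² + 3k²v_th². The sketch below evaluates it for assumed parameter values (not ones taken from the paper).

```python
import math

# SI constants: electron charge, mass, vacuum permittivity, Boltzmann constant.
e, me, eps0, kB = 1.602e-19, 9.109e-31, 8.854e-12, 1.381e-23
n0, Te = 1e18, 1e5                          # density [m^-3], temperature [K]

wp = math.sqrt(n0 * e**2 / (eps0 * me))     # electron plasma frequency [rad/s]
vth = math.sqrt(kB * Te / me)               # electron thermal speed [m/s]
k = 1e4                                     # assumed wavenumber [1/m]
w = math.sqrt(wp**2 + 3 * k**2 * vth**2)    # Bohm-Gross Langmuir frequency
```

A kinetic code that excites a Langmuir wave at wavenumber k should reproduce w to within its discretization error, which makes this a convenient cross-validation target.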
Numerical simulation of the RAMAC benchmark test
Energy Technology Data Exchange (ETDEWEB)
Leblanc, J.E.; Sugihara, M.; Fujiwara, T. [Nagoya Univ. (Japan). Dept. of Aerospace Engineering; Nusca, M. [Nagoya Univ. (Japan). Dept. of Aerospace Engineering; U.S. Army Research Lab., Ballistics and Weapons Concepts Div., AMSRL-WM-BE, Aberdeen Proving Ground, MD (United States); Wang, X. [Nagoya Univ. (Japan). Dept. of Aerospace Engineering; School of Mechanical and Production Engineering, Nanyang Technological Univ. (Singapore); Seiler, F. [Nagoya Univ. (Japan). Dept. of Aerospace Engineering; French-German Research Inst. of Saint-Louis, ISL, Saint-Louis (France)
2000-11-01
Numerical simulations of the same RAMAC geometry and boundary conditions by different numerical and physical models highlight the variety of solutions possible and the strong effect of the chemical kinetics model on the solution. The benchmark test was defined and announced within the community of RAMAC researchers. Three laboratories undertook the project. The numerical simulations include Navier-Stokes and Euler simulations with various levels of physical models and equations of state. The non-reactive part of the simulation produced similar steady state results in the three simulations. The chemically reactive part of the simulation produced widely different outcomes. The original experimental data and experimental conditions are presented. A description of each computer code and the resulting flowfield is included. A comparison between the codes and their results is presented. The most critical choice for the simulation was the chemical kinetics model. (orig.)
Development of solutions to benchmark piping problems
Energy Technology Data Exchange (ETDEWEB)
Reich, M; Chang, T Y; Prachuktam, S; Hartzman, M
1977-12-01
Benchmark problems and their solutions are presented. The problems consist of calculating the static and dynamic response of selected piping structures subjected to a variety of loading conditions. The structures range from simple pipe geometries to a representative full scale primary nuclear piping system, which includes the various components and their supports. These structures are assumed to behave in a linear elastic fashion only, i.e., they experience small deformations and small displacements with no existing gaps, and remain elastic through their entire response. The solutions were obtained by using the program EPIPE, which is a modification of the widely available program SAP IV. A brief outline of the theoretical background of this program and its verification is also included.
FRIB driver linac vacuum model and benchmarks
Durickovic, Bojan; Kersevan, Roberto; Machicoane, Guillaume
2014-01-01
The Facility for Rare Isotope Beams (FRIB) is a superconducting heavy-ion linear accelerator that is to produce rare isotopes far from stability for low energy nuclear science. In order to achieve this, its driver linac needs to achieve a very high beam current (up to 400 kW beam power), and this requirement makes vacuum levels of critical importance. Vacuum calculations have been carried out to verify that the vacuum system design meets the requirements. The modeling procedure was benchmarked by comparing models of an existing facility against measurements. In this paper, we present an overview of the methods used for FRIB vacuum calculations and simulation results for some interesting sections of the accelerator.
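Vacuum models of this kind ultimately rest on textbook relations such as series conductance and effective pumping speed. The sketch below illustrates them with made-up values, not FRIB design numbers.

```python
# A pump of speed S behind a tube of conductance C gives an effective
# pumping speed 1/S_eff = 1/S + 1/C, and the equilibrium pressure for a
# gas load Q is P = Q / S_eff. All values are illustrative assumptions.
S = 1000.0                          # pump speed [l/s]
C = 250.0                           # tube conductance [l/s]
Q = 1e-6                            # gas load [mbar*l/s]

S_eff = 1.0 / (1.0 / S + 1.0 / C)   # effective speed at the chamber [l/s]
P = Q / S_eff                       # equilibrium pressure [mbar]
```

The series rule shows why conductance-limited sections, not raw pump speed, usually set the achievable pressure in a long linac beamline.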
NASA Indexing Benchmarks: Evaluating Text Search Engines
Esler, Sandra L.; Nelson, Michael L.
1997-01-01
The current proliferation of on-line information resources underscores the requirement for the ability to index collections of information and search and retrieve them in a convenient manner. This study develops criteria for analytically comparing the index and search engines and presents results for a number of freely available search engines. A product of this research is a toolkit capable of automatically indexing, searching, and extracting performance statistics from each of the focused search engines. This toolkit is highly configurable and has the ability to run these benchmark tests against other engines as well. Results demonstrate that the tested search engines can be grouped into two levels. Level one engines are efficient on small to medium sized data collections, but show weaknesses when used for collections 100MB or larger. Level two search engines are recommended for data collections up to and beyond 100MB.
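A benchmarking toolkit of the kind described reduces, at its core, to a harness that builds an index over a collection, runs queries, and records wall-clock statistics. The sketch below uses a toy in-memory inverted index as a stand-in for a real engine; it is an illustrative assumption, not the NASA toolkit.

```python
import time
from collections import defaultdict

def build_index(docs):
    """Toy inverted index: term -> set of document ids."""
    index = defaultdict(set)
    for doc_id, text in enumerate(docs):
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def benchmark(docs, queries):
    """Time indexing and searching separately, as the toolkit does."""
    t0 = time.perf_counter()
    index = build_index(docs)
    index_s = time.perf_counter() - t0
    t0 = time.perf_counter()
    hits = [sorted(index.get(q.lower(), set())) for q in queries]
    search_s = time.perf_counter() - t0
    return {"index_s": index_s, "search_s": search_s, "hits": hits}

stats = benchmark(["neutron flux map", "flux benchmark"], ["flux"])
```

Swapping `build_index` and the query loop for calls to an external engine turns the same harness into a configurable cross-engine benchmark.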
Shielding integral benchmark archive and database (SINBAD)
Energy Technology Data Exchange (ETDEWEB)
Kirk, B.L.; Grove, R.E. [Radiation Safety Information Computational Center RSICC, Oak Ridge National Laboratory, P.O. Box 2008, Oak Ridge, TN 37831-6171 (United States); Kodeli, I. [Josef Stefan Inst., Jamova 39, 1000 Ljubljana (Slovenia); Gulliford, J.; Sartori, E. [OECD NEA Data Bank, Bd des Iles, 92130 Issy-les-Moulineaux (France)
2011-07-01
The shielding integral benchmark archive and database (SINBAD) collection of experiment descriptions was initiated in the early 1990s. SINBAD is an international collaboration between the Organization for Economic Cooperation and Development's Nuclear Energy Agency Data Bank (OECD/NEADB) and the Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL). SINBAD was designed to compile experiments and corresponding computational models with the goal of preserving institutional knowledge and expertise that need to be handed down to future scientists. SINBAD can serve as a learning tool for university students and scientists who need to design experiments or gain expertise in modeling and simulation. The SINBAD database is currently divided into three categories - fission, fusion, and accelerator experiments. Many experiments are described and analyzed using deterministic or stochastic (Monte Carlo) radiation transport software. The nuclear cross sections also play an important role as they are necessary in performing computational analysis. (authors)
Benchmarking management practices in Australian public healthcare.
Agarwal, Renu; Green, Roy; Agarwal, Neeru; Randhawa, Krithika
2016-01-01
The purpose of this paper is to investigate the quality of management practices of public hospitals in the Australian healthcare system, specifically those in the state-managed health systems of Queensland and New South Wales (NSW). Further, the authors assess the management practices of Queensland and NSW public hospitals jointly and globally benchmark against those in the health systems of seven other countries, namely, USA, UK, Sweden, France, Germany, Italy and Canada. In this study, the authors adapt the unique and globally deployed Bloom et al. (2009) survey instrument that uses a "double blind, double scored" methodology and an interview-based scoring grid to measure and internationally benchmark the management practices in Queensland and NSW public hospitals based on 21 management dimensions across four broad areas of management - operations, performance monitoring, targets and people management. The findings reveal the areas of strength and potential areas of improvement in Queensland and NSW Health hospital management practices when compared with public hospitals in those seven countries. Together, Queensland and NSW Health hospitals perform best in operations management followed by performance monitoring. While target management presents scope for improvement, people management is the sphere where these Australian hospitals lag the most. This paper is of interest to both hospital administrators and health care policy-makers aiming to lift management quality at the hospital level as well as at the institutional level, as a vehicle to consistently deliver sustainable high-quality health services. This study provides the first internationally comparable robust measure of management capability in Australian public hospitals, where hospitals are run independently by the state-run healthcare systems. Additionally, this research study contributes to the empirical evidence base on the quality of management practices in public hospitals.
Ground truth and benchmarks for performance evaluation
Takeuchi, Ayako; Shneier, Michael; Hong, Tsai Hong; Chang, Tommy; Scrapper, Christopher; Cheok, Geraldine S.
2003-09-01
Progress in algorithm development and transfer of results to practical applications such as military robotics requires the setup of standard tasks, of standard qualitative and quantitative measurements for performance evaluation and validation. Although the evaluation and validation of algorithms have been discussed for over a decade, the research community still faces a lack of well-defined and standardized methodology. The fundamental problems include a lack of quantifiable measures of performance, a lack of data from state-of-the-art sensors in calibrated real-world environments, and a lack of facilities for conducting realistic experiments. In this research, we propose three methods for creating ground truth databases and benchmarks using multiple sensors. The databases and benchmarks will provide researchers with high quality data from suites of sensors operating in complex environments representing real problems of great relevance to the development of autonomous driving systems. At NIST, we have prototyped a High Mobility Multi-purpose Wheeled Vehicle (HMMWV) system with a suite of sensors including a Riegl ladar, GDRS ladar, stereo CCD, several color cameras, Global Position System (GPS), Inertial Navigation System (INS), pan/tilt encoders, and odometry. All sensors are calibrated with respect to each other in space and time. This allows a database of features and terrain elevation to be built. Ground truth for each sensor can then be extracted from the database. The main goal of this research is to provide ground truth databases for researchers and engineers to evaluate algorithms for effectiveness, efficiency, reliability, and robustness, thus advancing the development of algorithms.
Benchmarking analogue models of brittle thrust wedges
Schreurs, Guido; Buiter, Susanne J. H.; Boutelier, Jennifer; Burberry, Caroline; Callot, Jean-Paul; Cavozzi, Cristian; Cerca, Mariano; Chen, Jian-Hong; Cristallini, Ernesto; Cruden, Alexander R.; Cruz, Leonardo; Daniel, Jean-Marc; Da Poian, Gabriela; Garcia, Victor H.; Gomes, Caroline J. S.; Grall, Céline; Guillot, Yannick; Guzmán, Cecilia; Hidayah, Triyani Nur; Hilley, George; Klinkmüller, Matthias; Koyi, Hemin A.; Lu, Chia-Yu; Maillot, Bertrand; Meriaux, Catherine; Nilfouroushan, Faramarz; Pan, Chang-Chih; Pillot, Daniel; Portillo, Rodrigo; Rosenau, Matthias; Schellart, Wouter P.; Schlische, Roy W.; Take, Andy; Vendeville, Bruno; Vergnaud, Marine; Vettori, Matteo; Wang, Shih-Hsien; Withjack, Martha O.; Yagupsky, Daniel; Yamada, Yasuhiro
2016-11-01
We performed a quantitative comparison of brittle thrust wedge experiments to evaluate the variability among analogue models and to appraise the reproducibility and limits of model interpretation. Fifteen analogue modeling laboratories participated in this benchmark initiative. Each laboratory received a shipment of the same type of quartz and corundum sand and all laboratories adhered to a stringent model building protocol and used the same type of foil to cover base and sidewalls of the sandbox. Sieve structure, sifting height, filling rate, and details on off-scraping of excess sand followed prescribed procedures. Our analogue benchmark shows that even for simple plane-strain experiments with prescribed stringent model construction techniques, quantitative model results show variability, most notably for surface slope, thrust spacing and number of forward and backthrusts. One of the sources of the variability in model results is related to slight variations in how sand is deposited in the sandbox. Small changes in sifting height, sifting rate, and scraping will result in slightly heterogeneous material bulk densities, which will affect the mechanical properties of the sand, and will result in lateral and vertical differences in peak and boundary friction angles, as well as cohesion values once the model is constructed. Initial variations in basal friction are inferred to play the most important role in causing model variability. Our comparison shows that the human factor plays a decisive role, and even when one modeler repeats the same experiment, quantitative model results still show variability. Our observations highlight the limits of up-scaling quantitative analogue model results to nature or for making comparisons with numerical models. The frictional behavior of sand is highly sensitive to small variations in material state or experimental set-up, and hence, it will remain difficult to scale quantitative results such as the number of thrusts or thrust spacing.
Benchmarking Competitiveness: Is America's Technological Hegemony Waning?
Lubell, Michael S.
2006-03-01
For more than half a century, by almost every standard, the United States has been the world's leader in scientific discovery, innovation and technological competitiveness. To a large degree, that dominant position stemmed from the circumstances our nation inherited at the conclusion of World War Two: we were, in effect, the only major nation left standing that did not have to repair serious war damage. And we found ourselves with an extraordinary science and technology base that we had developed for military purposes. We had the laboratories -- industrial, academic and government -- as well as the scientific and engineering personnel -- many of them immigrants who had escaped from war-time Europe. What remained was to convert the wartime machinery into peacetime uses. We adopted private and public policies that accomplished the transition remarkably well, and we have prospered ever since. Our higher education system, our protection of intellectual property rights, our venture capital system, our entrepreneurial culture and our willingness to commit government funds for the support of science and engineering have been key components to our success. But recent competitiveness benchmarks suggest that our dominance is waning rapidly, in part because other nations have begun to emulate our successful model, in part because globalization has ``flattened'' the world and in part because we have been reluctant to pursue the public policies that are necessary to ensure our leadership. We will examine these benchmarks and explore the policy changes that are needed to keep our nation's science and technology enterprise vibrant and our economic growth on an upward trajectory.
Benchmarking homogenization algorithms for monthly data
Directory of Open Access Journals (Sweden)
V. K. C. Venema
2012-01-01
The COST (European Cooperation in Science and Technology) Action ES0601: advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random independent break-type inhomogeneities with normally distributed breakpoint sizes were added to the simulated datasets. To approximate real-world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.
Participants provided 25 separate homogenized contributions as part of the blind study. After the deadline at which details of the imposed inhomogeneities were revealed, 22 additional solutions were submitted. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve precipitation data.
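Metric (i), the centered root mean square error, removes differences in the mean before scoring, so only the shape of the corrected series is compared against the truth. The helper below is an illustrative sketch, not the HOME evaluation code.

```python
import numpy as np

def crmse(homogenized, truth):
    """Centered RMSE: subtract each series' mean, then take the RMS
    of the remaining differences."""
    a = np.asarray(homogenized, float)
    b = np.asarray(truth, float)
    return np.sqrt(np.mean(((a - a.mean()) - (b - b.mean())) ** 2))
```

A constant offset between the homogenized and true series therefore scores zero, while shape errors (missed or spurious breaks) contribute fully.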
Modified $U(1)$ lattice gauge theory towards realistic lattice QED
Bornyakov, V G; Müller-Preussker, M
1992-01-01
We study properties of the compact $~4D~$ $U(1)$ lattice gauge theory with monopoles {\\it removed}. Employing Monte Carlo simulations we calculate correlators of scalar, vector and tensor operators at zero and nonzero momenta $~\\vec{p}~$. We confirm that the theory without monopoles has no phase transition, at least, in the interval $~0 < \\beta \\leq 2~$. There the photon becomes massless and fits the lattice free field theory dispersion relation very well. The energies of the $~0^{++}~$, $~1^{+-}~$ and $~2^{++}~$ states show a rather weak dependence on the coupling in the interval of $~\\beta~$ investigated, and their ratios are practically constant. We show also a further modification of the theory suppressing the negative plaquettes to improve drastically the overlap with the lowest states (at least, for $~J=1$).
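The lattice free-field dispersion relation mentioned above is, for a massless boson, 4 sinh²(E/2) = Σᵢ 4 sin²(pᵢ/2), which reduces to E = |p| in the continuum limit. A small sketch of the fit function (notation is an assumption, not taken from the paper):

```python
import math

def lattice_energy(p):
    """Energy of a free massless lattice boson at spatial momentum p,
    from 4*sinh^2(E/2) = sum_i 4*sin^2(p_i/2) (lattice units)."""
    s = sum(math.sin(pi / 2.0) ** 2 for pi in p)
    return 2.0 * math.asinh(math.sqrt(s))
```

At small momenta the lattice energy approaches |p|, which is why a massless photon "fits the lattice free field theory dispersion relation very well" in such measurements.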
The Development of a Lattice Structured Database
DEFF Research Database (Denmark)
Bruun, Hans
to a given set of inserted terms, that is the smallest lattice where the inserted terms preserve their value compared to the value in the initial algebra/lattice. The database is the dual representation of this most disjoint lattice. We develop algorithms to construct and make queries to the database....
Lattice Boltzmann solver of Rossler equation
Institute of Scientific and Technical Information of China (English)
Guangwu YAN; Li RUAN
2000-01-01
We propose a lattice Boltzmann model for the Rossler equation. Using a multiscale method in the lattice Boltzmann model, we recover a reaction-diffusion equation as a special case. When the diffusion effect vanishes, we obtain the lattice Boltzmann solution of the Rossler equation on the mesoscopic scale. The numerical results show that the method can be used to simulate the Rossler equation.
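An independent reference solution for the Rossler system targeted above can be obtained with a plain RK4 integrator; the sketch below assumes the standard parameter choice a = b = 0.2, c = 5.7, which is not stated in the abstract.

```python
def rossler(state, a=0.2, b=0.2, c=5.7):
    """Right-hand side of the Rossler system dx/dt, dy/dt, dz/dt."""
    x, y, z = state
    return (-y - z, x + a * y, b + z * (x - c))

def rk4_step(f, state, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (q1 + 2 * q2 + 2 * q3 + q4)
                 for s, q1, q2, q3, q4 in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
for _ in range(1000):                # integrate to t = 10 with dt = 0.01
    state = rk4_step(rossler, state, 0.01)
```

Because the system is chaotic, pointwise agreement with a lattice Boltzmann solution degrades over long times; comparisons are usually made over short horizons or on attractor statistics.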
Clar sextets in square graphene antidot lattices
DEFF Research Database (Denmark)
Petersen, Rene; Pedersen, Thomas Garm; Jauho, Antti-Pekka
2011-01-01
A periodic array of holes transforms graphene from a semimetal into a semiconductor with a band gap tuneable by varying the parameters of the lattice. In earlier work only hexagonal lattices have been treated. Using atomistic models we here investigate the size of the band gap of a square lattice...
Modal analysis of kagome-lattice structures
Perez, H.; Blakley, S.; Zheltikov, A. M.
2015-05-01
The first few lowest order circularly symmetric electromagnetic eigenmodes of a full kagome lattice are compared to those of a kagome lattice with a hexagonal defect. This analysis offers important insights into the physics behind the waveguiding properties of hollow-core fibers with a kagome-lattice cladding.
Perfect and Quasi-Perfect Lattice Actions
Bietenholz, W
1998-01-01
Perfect lattice actions are exciting in several respects: they provide new insight into conceptual questions of the lattice regularization, and quasi-perfect actions could enable a great leap forward in the non-perturbative solution of QCD. We try to convey a flavor of them, also beyond the lattice community.
DEFF Research Database (Denmark)
Fajstrup, Lisbeth
The set of d-structures on a topological space forms a lattice and in fact a locale. There is a Galois connection between the lattice of subsets of the space and the lattice of d-structures. Variation of the d-structures induces change in the spaces of directed paths. Hence variation of d...
SIMPLE LATTICE BOLTZMANN MODEL FOR TRAFFIC FLOWS
Institute of Scientific and Technical Information of China (English)
Yan Guangwu; Hu Shouxin
2000-01-01
A lattice Boltzmann model with a 5-bit lattice for traffic flows is proposed. Using the Chapman-Enskog expansion and multi-scale technique, we obtain the higher-order moments of the equilibrium distribution function. A simple traffic light problem is simulated using the present lattice Boltzmann model, and the result agrees well with the analytical solution.
Exact Chiral Symmetry on the Lattice
Neuberger, H
2001-01-01
Developments during the last eight years have refuted the folklore that chiral symmetries cannot be preserved on the lattice. The mechanism that permits chiral symmetry to coexist with the lattice is quite general and may work in Nature as well. The reconciliation between chiral symmetry and the lattice is likely to revolutionize the field of numerical QCD.
Mayet-Godowski Hilbert Lattice Equations
Megill, Norman D.; Pavicic, Mladen
2006-01-01
Several new results in the field of Hilbert lattice equations based on states defined on the lattice as well as novel techniques used to arrive at these results are presented. An open problem of Mayet concerning Hilbert lattice equations based on Hilbert-space-valued states is answered.
Lattice gaugefixing and other optics in lattice gauge theory
Energy Technology Data Exchange (ETDEWEB)
Yee, Ken
1992-06-01
We present results from four projects. In the first, quark and gluon propagators and effective masses and {Delta}I = 1/2 Rule operator matching coefficients are computed numerically in gaugefixed lattice QCD. In the second, the same quantities are evaluated analytically in the strong coupling, N {yields} {infinity} limit. In the third project, the Schwinger model is studied in covariant gauges, where we show that the effective electron mass varies with the gauge parameter and that longitudinal gaugefixing ambiguities affect operator product expansion coefficients (analogous to {Delta}I = 1/2 Rule matching coefficients) determined by matching gauge variant matrix elements. However, we find that matching coefficients even if shifted by the unphysical modes are {xi} invariant. In the fourth project, we show that the strong coupling parallelogram lattice Schwinger model has a different thermodynamic limit than the weak coupling continuum limit. As a function of lattice skewness angle these models span the {Delta} = {minus}1 critical line of 6-vertex models which, in turn, have been identified as c = 1 conformal field theories.
Evaluation of PWR and BWR pin cell benchmark results
Energy Technology Data Exchange (ETDEWEB)
Pijlgroms, B.J.; Gruppelaar, H.; Janssen, A.J. (Netherlands Energy Research Foundation (ECN), Petten (Netherlands)); Hoogenboom, J.E.; Leege, P.F.A. de (Interuniversitair Reactor Inst., Delft (Netherlands)); Voet, J. van der (Gemeenschappelijke Kernenergiecentrale Nederland NV, Dodewaard (Netherlands)); Verhagen, F.C.M. (Keuring van Electrotechnische Materialen NV, Arnhem (Netherlands))
1991-12-01
Benchmark results of the Dutch PINK working group on the PWR and BWR pin cell calculational benchmarks as defined by EPRI are presented and evaluated. The observed discrepancies are problem dependent: a part of the results is satisfactory, some other results require further analysis. A brief overview is given of the different code packages used in this analysis. (author). 14 refs., 9 figs., 30 tabs.
BIM quickscan: benchmark of BIM performance in the Netherlands
Berlo, L.A.H.M. van; Dijkmans, T.J.A.; Hendriks, H.; Spekkink, D.; Pel, W.
2012-01-01
In 2009 a “BIM QuickScan” for benchmarking BIM performance was created in the Netherlands (Sebastian, Berlo 2010). This instrument aims to provide insight into the current BIM performance of a company. The benchmarking instrument combines quantitative and qualitative assessments of the ‘hard’ and ‘soft’ aspects of BIM.
The Snowmass Points and Slopes Benchmarks for SUSY Searches
Allanach, Benjamin C; Blair, G A; Carena, M S; de Roeck, A; Dedes, A; Djouadi, Abdelhak; Gerdes, D W; Ghodbane, N; Gunion, J F; Haber, Howard E; Han, T; Heinemeyer, S; Hewett, J L; Hinchliffe, Ian; Kalinowski, Jan; Logan, H E; Martin, S P; Martyn, H U; Matchev, K T; Moretti, S; Moortgat, F; Moortgat-Pick, G; Mrenna, S; Nauenberg, U; Okada, Y; Olive, Keith A; Porod, Werner; Schmitt, M; Su, S; Wagner, C E M; Weiglein, Georg; Wells, J; Wilson, G W; Zerwas, Peter M
2002-01-01
The ``Snowmass Points and Slopes'' (SPS) are a set of benchmark points and parameter lines in the MSSM parameter space corresponding to different scenarios in the search for Supersymmetry at present and future experiments. This set of benchmarks was agreed upon at the 2001 ``Snowmass Workshop on the Future of Particle Physics'' as a consensus based on different existing proposals.