WorldWideScience

Sample records for dynamical mass benchmark

  1. DYNAMICAL MASS OF THE SUBSTELLAR BENCHMARK BINARY HD 130948BC

    International Nuclear Information System (INIS)

    Dupuy, Trent J.; Liu, Michael C.; Ireland, Michael J.

    2009-01-01

    We present Keck adaptive optics imaging of the L4+L4 binary HD 130948BC along with archival Hubble Space Telescope and Gemini North observations, which together span ∼70% of the binary's orbital period. From the relative orbit, we determine a total dynamical mass of 0.109 ± 0.003 M sun (114 ± 3 M Jup ). The flux ratio of HD 130948BC is near unity, so both components are unambiguously substellar for any plausible mass ratio. An independent constraint on the age of the system is available from the primary HD 130948A (G2V, [M/H] = 0.0). The ensemble of available indicators suggests an age comparable to that of the Hyades, with the most precise age being 0.79 +0.22/-0.15 Gyr based on gyrochronology. Therefore, HD 130948BC is now a unique benchmark among field L and T dwarfs, with a well-determined mass, luminosity, and age. We find that substellar theoretical models disagree with our observations. (1) Both components of HD 130948BC appear to be overluminous by a factor of ∼2-3 compared to evolutionary models. The age of the system would have to be notably younger than the gyro age to ameliorate the luminosity disagreement. (2) Effective temperatures derived from evolutionary models for HD 130948B and C are inconsistent with temperatures determined from spectral synthesis for objects of similar spectral type. Overall, regardless of the adopted age, evolutionary and atmospheric models give inconsistent results, indicating systematic errors in at least one class of models, possibly both. The masses of HD 130948BC happen to be very near the theoretical mass limit for lithium burning, and thus measuring the differential lithium depletion between B and C will provide a uniquely discriminating test of theoretical models. The potential underestimate of luminosities by evolutionary models would have wide-ranging implications; therefore, a more refined age estimate for HD 130948A is critically needed.
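The total dynamical mass quoted above comes from the relative orbit via Kepler's third law. A minimal sketch, using illustrative round-number orbital elements (a semimajor axis of 2.2 AU and a 9.9-year period, which are assumptions for this example, not the paper's fitted values):

```python
# Total dynamical mass of a visual binary from its relative orbit via
# Kepler's third law: M_tot [M_sun] = a^3 [AU^3] / P^2 [yr^2].
# The semimajor axis and period below are illustrative round numbers,
# not the paper's fitted orbital elements.

M_JUP_PER_M_SUN = 1047.6  # approximate Jupiter masses per solar mass

def total_dynamical_mass(a_au: float, period_yr: float) -> float:
    """Total system mass in solar masses from Kepler's third law."""
    return a_au**3 / period_yr**2

m_sun = total_dynamical_mass(a_au=2.2, period_yr=9.9)
m_jup = m_sun * M_JUP_PER_M_SUN
print(f"M_tot ≈ {m_sun:.3f} M_sun ≈ {m_jup:.0f} M_Jup")
```

With these assumed elements the law reproduces a mass close to the quoted 0.109 M sun; the actual measurement comes from fitting the full astrometric orbit, not two scalars.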

  2. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can be a full-scope control room simulator, an engineering simulator to represent the general behavior of the plant under normal and abnormal conditions, or the modeling of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training vendor/utility personnel, etc. is how well they represent what has been learned from industrial experience, large integral experiments, and separate effects tests. Typically, simulation codes are benchmarked against some of these, with the necessary level of agreement depending on the ultimate use of the simulation tool. However, these analytical models are computer codes; as a result, their capabilities are continually enhanced, errors are corrected, new situations outside the original design basis are imposed on the code, etc. Consequently, there is a continual need to ensure that the benchmarks against important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to developing trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks is included in the source code, and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients, large integral experiments, and separate effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer
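The idea of benchmarks shipped with, and re-exercised on every release of, the code can be sketched as a small regression harness. The cases, quantities, tolerances, and the stand-in `run_case` below are all hypothetical, not MAAP4's actual mechanism:

```python
# Minimal sketch of a "dynamic benchmarking" harness of the kind described
# above: reference results for key transients travel with the code and are
# re-run on every release, flagging any case whose result drifts outside a
# tolerance. Case names, values, and tolerances here are hypothetical.

REFERENCE = {
    "plant_transient_peak_clad_temp_K": 1180.0,
    "integral_experiment_peak_pressure_MPa": 0.41,
    "separate_effects_void_fraction": 0.63,
}

def run_case(name: str) -> float:
    """Stand-in for exercising the full simulation code on one benchmark."""
    return REFERENCE[name] * 1.001  # pretend the new release drifts by 0.1%

def dynamic_benchmark(rel_tol: float = 0.01) -> list[str]:
    """Return the list of benchmark cases that no longer match their reference."""
    failures = []
    for name, ref in REFERENCE.items():
        result = run_case(name)
        if abs(result - ref) > rel_tol * abs(ref):
            failures.append(name)
    return failures

assert dynamic_benchmark() == []  # 0.1% drift is within the 1% tolerance
```

In a real code the three benchmark types (plant transients, integral experiments, separate effects tests) would each drive the solver differently, as the abstract notes; only the pass/fail bookkeeping is common.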

  3. Benchmarks for dynamic multi-objective optimisation

    CSIR Research Space (South Africa)

    Helbig, M

    2013-06-01

    Full Text Available When algorithms solve dynamic multi-objective optimisation problems (DMOOPs), benchmark functions should be used to determine whether the algorithm can overcome specific difficulties that can occur in real-world problems. However, for dynamic multi...

  4. μ-synthesis for the coupled mass benchmark problem

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, J.; Tøffner-Clausen, S.

    1997-01-01

    A robust controller design for the coupled mass benchmark problem is presented in this paper. The applied design method is based on a modified D-K iteration, i.e. μ-synthesis, which takes care of mixed real and complex perturbation sets. This μ-synthesis method for mixed perturbation sets is a str...

  5. Calculation of the 5th AER dynamic benchmark with APROS

    International Nuclear Information System (INIS)

    Puska, E.K.; Kontio, H.

    1998-01-01

    The model used for calculation of the 5th AER dynamic benchmark with the APROS code is presented. In the calculation of the 5th AER dynamic benchmark the three-dimensional neutronics model of APROS was used. The core was divided axially into 20 nodes according to the specifications of the benchmark, and each group of six identical fuel assemblies was placed into one one-dimensional thermal hydraulic channel. The five-equation thermal hydraulic model was used in the benchmark. The plant process and automation were described with a generic VVER-440 plant model created by IVO PE. (author)

  6. Calculation of the fifth atomic energy research dynamic benchmark with APROS

    International Nuclear Information System (INIS)

    Puska Eija Karita; Kontio Harii

    1998-01-01

    The hand-out presents the model used for calculation of the fifth atomic energy research dynamic benchmark with the APROS code. In the calculation of the fifth atomic energy research dynamic benchmark the three-dimensional neutronics model of APROS was used. The core was divided axially into 20 nodes according to the specifications of the benchmark, and each group of six identical fuel assemblies was placed into one one-dimensional thermal hydraulic channel. The five-equation thermal hydraulic model was used in the benchmark. The plant process and automation were described with a generic WWER-440 plant model created by IVO Power Engineering Ltd. - Finland. (Author)

  7. Pynamic: the Python Dynamic Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Lee, G L; Ahn, D H; de Supinski, B R; Gyllenhaal, J C; Miller, P J

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large-scale MPI-based applications can create significant file I/O and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large-scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we highlight some of the issues discovered in our large-scale system software and tools using Pynamic.
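The loading stress Pynamic emulates can be illustrated, in a much-simplified form, by generating many small modules and importing them all. Pynamic itself builds genuine shared libraries (DLLs); plain Python source modules stand in for them in this sketch:

```python
# A simplified analogue of the dynamic-loading stress described above:
# generate many small modules on disk and import every one, exercising
# the interpreter's module loading path. Real Pynamic compiles genuine
# shared libraries instead of .py files.

import importlib
import sys
import tempfile
from pathlib import Path

def stress_imports(n_modules: int = 50) -> int:
    tmp = Path(tempfile.mkdtemp())
    for i in range(n_modules):
        (tmp / f"pyn_mod_{i}.py").write_text(f"VALUE = {i}\n")
    sys.path.insert(0, str(tmp))
    total = 0
    for i in range(n_modules):
        mod = importlib.import_module(f"pyn_mod_{i}")
        total += mod.VALUE  # touch a symbol in every loaded module
    sys.path.remove(str(tmp))
    return total

assert stress_imports(50) == sum(range(50))
```

Scaling `n_modules` into the thousands (and timing the import loop) is the Python-level analogue of the DLL-count knob a configurable benchmark like Pynamic exposes.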

  8. Solution of the 6th dynamic AER benchmark using the coupled core DYN3D/ATHLET

    International Nuclear Information System (INIS)

    Seidel, A.; Kliem, S.

    2001-01-01

    The 6th dynamic benchmark is a logical continuation of the work, started in the 5th dynamic benchmark, to systematically validate coupled neutron kinetics/thermohydraulics code systems for estimating the transient behaviour of WWER-type nuclear power plants. This benchmark concerns a double-ended break of the main steam line (asymmetrical MSLB) in a WWER plant. The core is at the end of its first cycle in full power conditions. The asymmetric leak causes a different depressurization of all steam generators. New features in comparison to the 5th dynamic benchmark were included: asymmetric operation of the feed water system, consideration of incomplete coolant mixing in the reactor vessel, and the definition of a fixed isothermal recriticality temperature for normalising the nuclear data. (Authors)

  9. The fifth AER dynamic benchmark calculation with hextran-smabre

    International Nuclear Information System (INIS)

    Haemaelaeinen, A.; Kyrki-Rajamaeki, R.

    1998-01-01

    The first AER benchmark for coupling of thermohydraulic codes and three-dimensional reactor dynamics core models is discussed. HEXTRAN 2.7 is used for the core dynamics and SMABRE 4.6 as a thermohydraulic model for the primary and secondary loops. The plant model for SMABRE is based mainly on two input models, the Loviisa model and the standard VVER-440/213 plant model. The primary circuit includes six separate loops, with 505 nodes and 652 junctions in total. The reactor pressure vessel is divided into six parallel channels. In the HEXTRAN calculation, 1/6 symmetry is used in the core. In the calculations the nuclear data are based on the ENDF/B-IV library and have been evaluated with the CASMO-HEX code. The importance of the nuclear data was illustrated by repeating the benchmark calculation using three different data sets. Optimal extensive data valid from hot to cold conditions were not available for all types of fuel enrichments needed in this benchmark. (author)

  10. Benchmarking novel approaches for modelling species range dynamics.

    Science.gov (United States)

    Zurell, Damaris; Thuiller, Wilfried; Pagel, Jörn; Cabral, Juliano S; Münkemüller, Tamara; Gravel, Dominique; Dullinger, Stefan; Normand, Signe; Schiffers, Katja H; Moore, Kara A; Zimmermann, Niklaus E

    2016-08-01

    Increasing biodiversity loss due to climate change is one of the most pressing challenges of the 21st century. To anticipate and mitigate biodiversity loss, models are needed that reliably project species' range dynamics and extinction risks. Recently, several new approaches to model range dynamics have been developed to supplement correlative species distribution models (SDMs), but applications clearly lag behind model development. Indeed, no comparative analysis has been performed to evaluate their performance. Here, we build on process-based, simulated data for benchmarking five range (dynamic) models of varying complexity including classical SDMs, SDMs coupled with simple dispersal or more complex population dynamic models (SDM hybrids), and a hierarchical Bayesian process-based dynamic range model (DRM). We specifically test the effects of demographic and community processes on model predictive performance. Under current climate, DRMs performed best, although only marginally. Under climate change, predictive performance varied considerably, with no clear winners. Yet, all range dynamic models improved predictions under climate change substantially compared to purely correlative SDMs, and the population dynamic models also predicted reasonable extinction risks for most scenarios. When benchmarking data were simulated with more complex demographic and community processes, simple SDM hybrids including only dispersal often proved most reliable. Finally, we found that structural decisions during model building can have great impact on model accuracy, but prior system knowledge on important processes can reduce these uncertainties considerably. Our results confirm the clear merit of using dynamic approaches for modelling species' response to climate change but also emphasize several needs for further model and data improvement. We propose and discuss perspectives for improving range projections through combination of multiple models and for making these approaches

  11. The fifth Atomic Energy Research dynamic benchmark calculation with HEXTRAN-SMABRE

    International Nuclear Information System (INIS)

    Haemaelaeinen, Anitta

    1998-01-01

    The fifth Atomic Energy Research dynamic benchmark is the first Atomic Energy Research benchmark for coupling of thermohydraulic codes and three-dimensional reactor dynamics core models. At VTT, HEXTRAN 2.7 is used for the core dynamics and SMABRE 4.6 as a thermohydraulic model for the primary and secondary loops. The plant model for SMABRE is based mainly on two input models: the Loviisa model and the standard WWER-440/213 plant model. The primary circuit includes six separate loops, with 505 nodes and 652 junctions in total. The reactor pressure vessel is divided into six parallel channels. In the HEXTRAN calculation, 1/6 symmetry is used in the core. In the sequence of the main steam header break at the hot standby state, the liquid temperature decreases symmetrically at the core inlet, which leads to a return to power. In the benchmark, no isolations of the steam generators are assumed, and the maximum core power is about 38% of the nominal power at four minutes after the break opening in the HEXTRAN-SMABRE calculation. Due to boric acid in the high-pressure safety injection water, the power finally starts to decrease. The break flow is pure steam in the HEXTRAN-SMABRE calculation during the whole transient, even though the swell levels in the steam generators are very high due to flashing. Because of sudden peaks in the preliminary results of the steam generator heat transfer, the SMABRE drift-flux model was modified. The new model is a simplified version of the EPRI correlation based on test data. The modified correlation behaves smoothly. In the calculations the nuclear data are based on the ENDF/B-IV library and have been evaluated with the CASMO-HEX code. The importance of the nuclear data was illustrated by repeating the benchmark calculation using three different data sets. Optimal extensive data valid from hot to cold conditions were not available for all types of fuel enrichments needed in this benchmark. (Author)

  12. Benchmarking of small-signal dynamics of single-phase PLLs

    DEFF Research Database (Denmark)

    Zhang, Chong; Wang, Xiongfei; Blaabjerg, Frede

    2015-01-01

    Phase-locked loop (PLL) is a critical component for the control and grid synchronization of grid-connected power converters. This paper presents a benchmarking study on the small-signal dynamics of three commonly used PLLs for single-phase converters, including the enhanced PLL, the second-order generalized integrator based PLL, and the inverse-PLL. First, a unified small-signal model of those PLLs is established for comparing their dynamics. Then, a systematic design guideline for parameter tuning of the PLLs is formulated. To confirm the validity of theoretical analysis, nonlinear time...
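A unified tuning guideline of the kind the abstract mentions often reduces to a second-order linearized loop. As a hedged sketch (not necessarily the paper's model): if the small-signal tracking loop is theta_hat/theta = (kp*s + ki)/(s^2 + kp*s + ki), the gains follow directly from a chosen damping ratio and settling time; the targets below (zeta = 0.707, 2% settling in 100 ms) are illustrative assumptions.

```python
# Sketch of a second-order PLL tuning rule: for the linearized loop
# theta_hat/theta = (kp*s + ki) / (s^2 + kp*s + ki), matching the standard
# form s^2 + 2*zeta*wn*s + wn^2 gives kp = 2*zeta*wn and ki = wn^2.
# The 2% settling-time approximation t_s ≈ 4 / (zeta * wn) fixes wn.

def pll_gains(settling_time_s: float, zeta: float = 0.707):
    """Proportional/integral gains from damping ratio and 2% settling time."""
    wn = 4.0 / (zeta * settling_time_s)  # natural frequency, rad/s
    kp = 2.0 * zeta * wn
    ki = wn**2
    return kp, ki

kp, ki = pll_gains(settling_time_s=0.1)  # wn ≈ 56.6 rad/s
print(f"kp = {kp:.1f}, ki = {ki:.1f}")
```

Note that kp depends only on the settling time under this rule (kp = 8/t_s), which is one reason damping ratio and settling time make a convenient, PLL-agnostic pair of design targets.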

  13. Benchmarks and statistics of entanglement dynamics

    International Nuclear Information System (INIS)

    Tiersch, Markus

    2009-01-01

    In the present thesis we investigate how the quantum entanglement of multicomponent systems evolves under realistic conditions. More specifically, we focus on open quantum systems coupled to the (uncontrolled) degrees of freedom of an environment. We identify key quantities that describe the entanglement dynamics, and provide efficient tools for its calculation. For quantum systems of high dimension, entanglement dynamics can be characterized with high precision. In the first part of this work, we derive evolution equations for entanglement. These formulas determine the entanglement after a given time in terms of a product of two distinct quantities: the initial amount of entanglement and a factor that merely contains the parameters that characterize the dynamics. The latter is given by the entanglement evolution of an initially maximally entangled state. A maximally entangled state thus benchmarks the dynamics, and hence allows for the immediate calculation or - under more general conditions - estimation of the change in entanglement. Thereafter, a statistical analysis supports that the derived (in-)equalities describe the entanglement dynamics of the majority of weakly mixed and thus experimentally highly relevant states with high precision. The second part of this work approaches entanglement dynamics from a topological perspective. This allows for a quantitative description with a minimum amount of assumptions about Hilbert space (sub-)structure and environment coupling. In particular, we investigate the limit of increasing system size and density of states, i.e. the macroscopic limit. In this limit, a universal behaviour of entanglement emerges following a "reference trajectory", similar to the central role of the entanglement dynamics of a maximally entangled state found in the first part of the present work. (orig.)
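The "product of two distinct quantities" described above can be written compactly for the simplest case, two qubits with a channel acting on one subsystem; this is the concurrence evolution equation of Konrad et al. (2008), on which this line of work builds:

```latex
% Entanglement (concurrence C) of any pure two-qubit state |\chi\rangle,
% after one subsystem passes through a channel $, factorizes into a
% channel-only factor -- the concurrence of the evolved maximally
% entangled state |\phi^+\rangle -- times the initial entanglement:
C\!\left[(\mathbb{1}\otimes\$)\,|\chi\rangle\langle\chi|\right]
  = C\!\left[(\mathbb{1}\otimes\$)\,|\phi^{+}\rangle\langle\phi^{+}|\right]
    \cdot C\!\left(|\chi\rangle\langle\chi|\right)
```

The first factor depends only on the dynamics, which is exactly why a maximally entangled state "benchmarks" the channel: evaluating it once gives the entanglement evolution of every pure input state.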

  14. Benchmarks and statistics of entanglement dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Tiersch, Markus

    2009-09-04

    In the present thesis we investigate how the quantum entanglement of multicomponent systems evolves under realistic conditions. More specifically, we focus on open quantum systems coupled to the (uncontrolled) degrees of freedom of an environment. We identify key quantities that describe the entanglement dynamics, and provide efficient tools for its calculation. For quantum systems of high dimension, entanglement dynamics can be characterized with high precision. In the first part of this work, we derive evolution equations for entanglement. These formulas determine the entanglement after a given time in terms of a product of two distinct quantities: the initial amount of entanglement and a factor that merely contains the parameters that characterize the dynamics. The latter is given by the entanglement evolution of an initially maximally entangled state. A maximally entangled state thus benchmarks the dynamics, and hence allows for the immediate calculation or - under more general conditions - estimation of the change in entanglement. Thereafter, a statistical analysis supports that the derived (in-)equalities describe the entanglement dynamics of the majority of weakly mixed and thus experimentally highly relevant states with high precision. The second part of this work approaches entanglement dynamics from a topological perspective. This allows for a quantitative description with a minimum amount of assumptions about Hilbert space (sub-)structure and environment coupling. In particular, we investigate the limit of increasing system size and density of states, i.e. the macroscopic limit. In this limit, a universal behaviour of entanglement emerges following a "reference trajectory", similar to the central role of the entanglement dynamics of a maximally entangled state found in the first part of the present work. (orig.)

  15. Benchmarking burnup reconstruction methods for dynamically operated research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Sternat, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Charlton, William S. [Univ. of Nebraska, Lincoln, NE (United States). National Strategic Research Institute; Nichols, Theodore F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-03-01

    The burnup of an HEU-fueled, dynamically operated research reactor, the Oak Ridge Research Reactor, was experimentally reconstructed using two different analytic methodologies and a suite of signature isotopes to evaluate techniques for estimating burnup for research reactor fuel. The methods studied include using individual signature isotopes and the complete mass spectrometry spectrum to recover the sample’s burnup. The individual, or sets of, isotopes include 148Nd, 137Cs+137Ba, 139La, and 145Nd+146Nd. The storage documentation from the analyzed fuel material provided two different measures of burnup: burnup percentage and the total power generated from the assembly in MWd. When normalized to conventional units, these two references differed by 7.8% (395.42 GWd/MTHM and 426.27 GWd/MTHM) in the resulting burnup for the spent fuel element used in the benchmark. Among all methods being evaluated, the results were within 11.3% of either reference burnup. The results were mixed in closeness to both reference burnups; however, consistent results were achieved from all three experimental samples.
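The 148Nd indicator mentioned above works because 148Nd is a stable, low-capture fission product: the number of fissions is the measured 148Nd inventory divided by its effective cumulative fission yield, and burnup follows as a fraction of initial heavy-metal atoms. The yield value and the atom counts below are illustrative assumptions:

```python
# Illustrative sketch of the 148Nd burnup-indicator method: infer the
# number of fissions from the measured 148Nd atom inventory divided by
# its effective cumulative fission yield, then express burnup in %FIMA
# (fissions per initial heavy-metal atom). Numbers are made up.

Y148 = 0.0167  # assumed effective cumulative fission yield of 148Nd

def burnup_fima(n_nd148: float, n_heavy_initial: float,
                yield_148: float = Y148) -> float:
    """Burnup in %FIMA from a measured 148Nd atom count."""
    fissions = n_nd148 / yield_148
    return 100.0 * fissions / n_heavy_initial

# e.g. 6.68e19 atoms of 148Nd per 1e22 initial heavy-metal atoms
print(f"{burnup_fima(6.68e19, 1.0e22):.1f} %FIMA")
```

Converting %FIMA to GWd/MTHM (the units quoted in the abstract) requires the energy per fission and the initial heavy-metal mass; that conversion step is where the two storage-documentation references can disagree.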

  16. GPI Spectroscopy of the Mass, Age, and Metallicity Benchmark Brown Dwarf HD 4747 B

    Science.gov (United States)

    Crepp, Justin R.; Principe, David A.; Wolff, Schuyler; Giorla Godfrey, Paige A.; Rice, Emily L.; Cieza, Lucas; Pueyo, Laurent; Bechter, Eric B.; Gonzales, Erica J.

    2018-02-01

    The physical properties of brown dwarf companions found to orbit nearby, solar-type stars can be benchmarked against independent measures of their mass, age, chemical composition, and other parameters, offering insights into the evolution of substellar objects. The TRENDS high-contrast imaging survey has recently discovered a (mass/age/metallicity) benchmark brown dwarf orbiting the nearby (d = 18.69 ± 0.19 pc), G8V/K0V star HD 4747. We have acquired follow-up spectroscopic measurements of HD 4747 B using the Gemini Planet Imager to study its spectral type, effective temperature, surface gravity, and cloud properties. Observations obtained in the H-band and K1-band recover the companion and reveal that it is near the L/T transition (T1 ± 2). Fitting atmospheric models to the companion spectrum, we find strong evidence for the presence of clouds. However, spectral models cannot satisfactorily fit the complete data set: while the shape of the spectrum can be well-matched in individual filters, a joint fit across the full passband results in discrepancies that are a consequence of the inherent color of the brown dwarf. We also find a 2σ tension in the companion mass, age, and surface gravity when comparing to evolutionary models. These results highlight the importance of using benchmark objects to study “secondary effects” such as metallicity, non-equilibrium chemistry, cloud parameters, electron conduction, non-adiabatic cooling, and other subtleties affecting emergent spectra. As a new L/T transition benchmark, HD 4747 B warrants further investigation into the modeling of cloud physics using higher resolution spectroscopy across a broader range of wavelengths, polarimetric observations, and continued Doppler radial velocity and astrometric monitoring.

  17. Experimental benchmark for piping system dynamic-response analyses

    International Nuclear Information System (INIS)

    1981-01-01

    This paper describes the scope and status of a piping system dynamics test program. A 0.20 m(8 in.) nominal diameter test piping specimen is designed to be representative of main heat transport system piping of LMFBR plants. Particular attention is given to representing piping restraints. Applied loadings consider component-induced vibration as well as seismic excitation. The principal objective of the program is to provide a benchmark for verification of piping design methods by correlation of predicted and measured responses. Pre-test analysis results and correlation methods are discussed

  18. Experimental benchmark for piping system dynamic response analyses

    International Nuclear Information System (INIS)

    Schott, G.A.; Mallett, R.H.

    1981-01-01

    The scope and status of a piping system dynamics test program are described. A 0.20-m nominal diameter test piping specimen is designed to be representative of main heat transport system piping of LMFBR plants. Attention is given to representing piping restraints. Applied loadings consider component-induced vibration as well as seismic excitation. The principal objective of the program is to provide a benchmark for verification of piping design methods by correlation of predicted and measured responses. Pre-test analysis results and correlation methods are discussed. 3 refs

  19. Yucca Mountain Project thermal and mechanical codes first benchmark exercise: Part 3, Jointed rock mass analysis

    International Nuclear Information System (INIS)

    Costin, L.S.; Bauer, S.J.

    1991-10-01

    Thermal and mechanical models for intact and jointed rock mass behavior are being developed, verified, and validated at Sandia National Laboratories for the Yucca Mountain Site Characterization Project. Benchmarking is an essential part of this effort and is one of the tools used to demonstrate verification of engineering software used to solve thermomechanical problems. This report presents the results of the third (and final) phase of the first thermomechanical benchmark exercise. In the first phase of this exercise, nonlinear heat conduction codes were used to solve the thermal portion of the benchmark problem. The results from the thermal analysis were then used as input to the second and third phases of the exercise, which consisted of solving the structural portion of the benchmark problem. In the second phase of the exercise, a linear elastic rock mass model was used. In the third phase of the exercise, two different nonlinear jointed rock mass models were used to solve the thermostructural problem. Both models, the Sandia compliant joint model and the RE/SPEC joint empirical model, explicitly incorporate the effect of the joints on the response of the continuum. Three different structural codes, JAC, SANCHO, and SPECTROM-31, were used with the above models in the third phase of the study. Each model was implemented in two different codes so that direct comparisons of results from each model could be made. The results submitted by the participants showed that the finite element solutions using each model were in reasonable agreement. Some consistent differences between the solutions using the two different models were noted but are not considered important to verification of the codes. 9 refs., 18 figs., 8 tabs

  20. Benchmarking Computational Fluid Dynamics for Application to PWR Fuel

    International Nuclear Information System (INIS)

    Smith, L.D. III; Conner, M.E.; Liu, B.; Dzodzo, B.; Paramonov, D.V.; Beasley, D.E.; Langford, H.M.; Holloway, M.V.

    2002-01-01

    The present study demonstrates a process used to develop confidence in Computational Fluid Dynamics (CFD) as a tool to investigate flow and temperature distributions in a PWR fuel bundle. The velocity and temperature fields produced by a mixing spacer grid of a PWR fuel assembly are quite complex. Before using CFD to evaluate these flow fields, a rigorous benchmarking effort should be performed to ensure that reasonable results are obtained. Westinghouse has developed a method to quantitatively benchmark CFD tools against data at conditions representative of the PWR. Several measurements in a 5 x 5 rod bundle were performed. Lateral flow-field testing employed visualization techniques and Particle Image Velocimetry (PIV). Heat transfer testing involved measurements of the single-phase heat transfer coefficient downstream of the spacer grid. These test results were used to compare with CFD predictions. The parameters optimized in the CFD models based on this comparison with data include the computational mesh, turbulence model, and boundary conditions. As an outcome of this effort, a methodology was developed for CFD modeling that provides confidence in the numerical results. (authors)
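A quantitative CFD-versus-data benchmark of this kind ultimately needs a scalar comparison metric. A minimal sketch, scoring a predicted lateral velocity profile against PIV measurements with a normalized RMS error; the sample data and the 10% acceptance threshold are hypothetical, not Westinghouse's criteria:

```python
# Sketch of a quantitative CFD benchmark metric: normalized RMS error
# between predicted and measured velocity samples. A candidate model
# setup (mesh, turbulence model, boundary conditions) is accepted only
# if the error stays under a threshold. Data and threshold are made up.

import math

def nrmse(predicted: list[float], measured: list[float]) -> float:
    """RMS prediction error normalized by the RMS of the measurement."""
    err = sum((p - m) ** 2 for p, m in zip(predicted, measured))
    ref = sum(m**2 for m in measured)
    return math.sqrt(err / ref)

piv = [0.0, 0.8, 1.2, 0.9, 0.1]    # measured lateral velocities, m/s
cfd = [0.05, 0.75, 1.3, 0.85, 0.05]  # CFD prediction at the same points

score = nrmse(cfd, piv)
assert score < 0.10  # this mesh/turbulence-model combination passes
```

Sweeping mesh density, turbulence model, and boundary conditions while tracking such a score is one way the "parameters optimized... based on this comparison with data" loop can be made systematic.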

  1. Validation of flexible multibody dynamics beam formulations using benchmark problems

    Energy Technology Data Exchange (ETDEWEB)

    Bauchau, Olivier A., E-mail: obauchau@umd.edu [University of Maryland (United States); Betsch, Peter [Karlsruhe Institute of Technology (Germany); Cardona, Alberto [CIMEC (UNL/Conicet) (Argentina); Gerstmayr, Johannes [Leopold-Franzens Universität Innsbruck (Austria); Jonker, Ben [University of Twente (Netherlands); Masarati, Pierangelo [Politecnico di Milano (Italy); Sonneville, Valentin [Université de Liège (Belgium)

    2016-05-15

    As the need to model flexibility arose in multibody dynamics, the floating frame of reference formulation was developed, but this approach can yield inaccurate results when elastic displacements become large. While the use of three-dimensional finite element formulations overcomes this problem, the associated computational cost is overwhelming. Consequently, beam models, which are one-dimensional approximations of three-dimensional elasticity, have become the workhorse of many flexible multibody dynamics codes. Numerous beam formulations have been proposed, such as the geometrically exact beam formulation or the absolute nodal coordinate formulation, to name just two. New solution strategies have been investigated as well, including the intrinsic beam formulation or the DAE approach. This paper provides a systematic comparison of these various approaches, which will be assessed by comparing their predictions for four benchmark problems. The first problem is the Princeton beam experiment, a study of the static large displacement and rotation behavior of a simple cantilevered beam under a gravity tip load. The second problem, the four-bar mechanism, focuses on a flexible mechanism involving beams and revolute joints. The third problem investigates the behavior of a beam bent in its plane of greatest flexural rigidity, resulting in lateral buckling when a critical value of the transverse load is reached. The last problem investigates the dynamic stability of a rotating shaft. The predictions of eight independent codes are compared for these four benchmark problems and are found to be in close agreement with each other and with experimental measurements, when available.

  2. Sparticle mass hierarchies, simplified models from SUGRA unification, and benchmarks for LHC Run-II SUSY searches

    International Nuclear Information System (INIS)

    Francescone, David; Akula, Sujeet; Altunkaynak, Baris; Nath, Pran

    2015-01-01

    Sparticle mass hierarchies contain significant information regarding the origin and nature of supersymmetry breaking. The hierarchical patterns are severely constrained by electroweak symmetry breaking as well as by the astrophysical and particle physics data. They are further constrained by the Higgs boson mass measurement. The sparticle mass hierarchies can be used to generate simplified models consistent with the high scale models. In this work we consider supergravity models with universal boundary conditions for soft parameters at the unification scale as well as supergravity models with nonuniversalities and delineate the list of sparticle mass hierarchies for the five lightest sparticles. Simplified models can be obtained by a truncation of these, retaining a smaller set of lightest particles. The mass hierarchies and their truncated versions enlarge significantly the list of simplified models currently being used in the literature. Benchmarks for a variety of supergravity unified models appropriate for SUSY searches at future colliders are also presented. The signature analysis of two benchmark models has been carried out and a discussion of the searches needed for their discovery at LHC Run-II is given. An analysis of the spin-independent neutralino-proton cross section exhibiting the Higgs boson mass dependence and the hierarchical patterns is also carried out. It is seen that a knowledge of the spin-independent neutralino-proton cross section and the neutralino mass will narrow down the list of the allowed sparticle mass hierarchies. Thus dark matter experiments along with analyses for the LHC Run-II will provide strong clues to the nature of symmetry breaking at the unification scale.

  3. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking Benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine whether current benchmark asset pricing models adequately describe the cross-section of stock returns.

  4. Solution of the fifth dynamic Atomic Energy Research benchmark problem using the coupled code DYN3D/ATHLET

    International Nuclear Information System (INIS)

    Kliem, S.

    1998-01-01

    The fifth dynamic benchmark is the first benchmark for coupled thermohydraulic system/three-dimensional hexagonal neutron kinetic core models. In this benchmark the interaction between the components of a WWER-440 NPP and the reactor core has been investigated. The initiating event is a symmetrical break of the main steam header at the end of the first fuel cycle under hot shutdown conditions with one control rod group stuck. This break causes an overcooling of the primary circuit. During this overcooling the scram reactivity is compensated and the scrammed reactor becomes recritical. The calculation was continued until the highly borated water from the high-pressure injection system terminated the power excursion. Several aspects of this complex benchmark problem are analyzed in detail. Sensitivity studies with different hydraulic parameters were made. Their influence on the course of the transient and on the solution is discussed. (Author)

  5. Dynamic Rupture Benchmarking of the ADER-DG Method

    Science.gov (United States)

    Gabriel, Alice; Pelties, Christian

    2013-04-01

    We will verify the arbitrary high-order derivative Discontinuous Galerkin (ADER-DG) method in various test cases of the 'SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise' benchmark suite (Harris et al. 2009). The ADER-DG scheme is able to solve the spontaneous rupture problem with high-order accuracy in space and time on three-dimensional unstructured tetrahedral meshes. Strong mesh coarsening or refinement at areas of interest can be applied to keep the computational costs feasible. Moreover, the method does not generate spurious high-frequency contributions in the slip rate spectra and therefore does not require any artificial damping as demonstrated in previous presentations and publications (Pelties et al. 2010 and 2012). We will show that the mentioned features hold also for more advanced setups as e.g. a branching fault system, heterogeneous background stresses and bimaterial faults. The advanced geometrical flexibility combined with an enhanced accuracy will make the ADER-DG method a useful tool to study earthquake dynamics on complex fault systems in realistic rheologies. References: Harris, R.A., M. Barall, R. Archuleta, B. Aagaard, J.-P. Ampuero, H. Bhat, V. Cruz-Atienza, L. Dalguer, P. Dawson, S. Day, B. Duan, E. Dunham, G. Ely, Y. Kaneko, Y. Kase, N. Lapusta, Y. Liu, S. Ma, D. Oglesby, K. Olsen, A. Pitarka, S. Song, and E. Templeton, The SCEC/USGS Dynamic Earthquake Rupture Code Verification Exercise, Seismological Research Letters, vol. 80, no. 1, pages 119-126, 2009 Pelties, C., J. de la Puente, and M. Kaeser, Dynamic Rupture Modeling in Three Dimensions on Unstructured Meshes Using a Discontinuous Galerkin Method, AGU 2010 Fall Meeting, abstract #S21C-2068 Pelties, C., J. de la Puente, J.-P. Ampuero, G. Brietzke, and M. Kaeser, Three-Dimensional Dynamic Rupture Simulation with a High-order Discontinuous Galerkin Method on Unstructured Tetrahedral Meshes, JGR. - Solid Earth, VOL. 117, B02309, 2012

  6. Using chemical benchmarking to determine the persistence of chemicals in a Swedish lake.

    Science.gov (United States)

    Zou, Hongyan; Radke, Michael; Kierkegaard, Amelie; MacLeod, Matthew; McLachlan, Michael S

    2015-02-03

    It is challenging to measure the persistence of chemicals under field conditions. In this work, two approaches for measuring persistence in the field were compared: the chemical mass balance approach and a novel chemical benchmarking approach. Ten pharmaceuticals, an X-ray contrast agent, and an artificial sweetener were studied in a Swedish lake. Acesulfame K was selected as the benchmark to quantify persistence using the chemical benchmarking approach. The 95% confidence interval of the half-life for transformation in the lake system was 780-5700 days for carbamazepine. Persistence estimates from the benchmarking approach agreed well with those from the mass balance approach (1-21% difference), indicating that chemical benchmarking can be a valid and useful method to measure the persistence of chemicals under field conditions. Compared to the mass balance approach, the benchmarking approach partially or completely eliminates the need to quantify the mass flow of chemicals, so it is particularly advantageous when the quantification of mass flow of chemicals is difficult. Furthermore, the benchmarking approach allows ready comparison and ranking of the persistence of different chemicals.
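    Assuming first-order (exponential) loss over a known water residence time, the benchmarking idea can be sketched as follows. The function name and all concentration values below are illustrative, not taken from the study; the only study-specific element is the use of a persistent benchmark chemical (acesulfame K) to correct for dilution and water exchange.

    ```python
    import math

    def halflife_benchmarking(c_in, c_out, b_in, b_out, residence_days):
        """Estimate a transformation half-life (days) by chemical benchmarking.

        c_in/c_out: test-chemical concentration entering/leaving the lake.
        b_in/b_out: the same for a persistent benchmark chemical, whose
        attenuation reflects dilution and water exchange only.
        Assumes simple first-order loss over the water residence time.
        """
        # Attenuation of the test chemical relative to the benchmark:
        rel = (c_out / c_in) / (b_out / b_in)
        k = -math.log(rel) / residence_days   # first-order rate constant, 1/day
        return math.log(2) / k                # half-life in days

    # Illustrative numbers only: a chemical attenuated more than the benchmark
    # over a 200-day residence time.
    t_half = halflife_benchmarking(1.0, 0.40, 1.0, 0.50, 200.0)
    ```

    A chemical that is attenuated exactly as much as the benchmark would yield k = 0, i.e. no measurable transformation, which is why the method requires the benchmark to be more persistent than the test chemicals.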

  7. Comparison of the results of the fifth dynamic AER benchmark-a benchmark for coupled thermohydraulic system/three-dimensional hexagonal kinetic core models

    International Nuclear Information System (INIS)

    Kliem, S.

    1998-01-01

    The fifth dynamic benchmark was defined at the seventh AER Symposium, held in Hoernitz, Germany, in 1997. It is the first benchmark for coupled thermohydraulic system/three-dimensional hexagonal neutron kinetic core models. In this benchmark the interaction between the components of a WWER-440 NPP and the reactor core has been investigated. The initiating event is a symmetrical break of the main steam header at the end of the first fuel cycle under hot shutdown conditions with one control rod group stuck. This break causes an overcooling of the primary circuit. During this overcooling the scram reactivity is compensated and the scrammed reactor becomes recritical. The calculation was continued until the highly borated water from the high-pressure injection system terminated the power excursion. Each participant used their own best-estimate nuclear cross-section data; only the initial subcriticality at the beginning of the transient was given. Solutions were received from the Kurchatov Institute, Russia, with the code BIPR8/ATHLET; VTT Energy, Finland, with HEXTRAN/SMABRE; NRI Rez, Czech Republic, with DYN3D/ATHLET; KFKI Budapest, Hungary, with KIKO3D/ATHLET; and from FZR, Germany, with the code DYN3D/ATHLET. In this paper the results are compared. Besides the comparison of global results, the behaviour of several thermohydraulic and neutron kinetic parameters is presented to discuss the revealed differences between the solutions. (Authors)

  8. Final results of the fifth three-dimensional dynamic Atomic Energy Research benchmark problem calculations

    International Nuclear Information System (INIS)

    Hadek, J.

    1999-01-01

    The paper gives a brief survey of the fifth three-dimensional dynamic Atomic Energy Research benchmark calculation results received with the code DYN3D/ATHLET at NRI Rez. This benchmark was defined at the seventh Atomic Energy Research Symposium (Hoernitz near Zittau, 1997). Its initiating event is a symmetrical break of the main steam header at the end of the first fuel cycle under hot shutdown conditions with one stuck control rod group. The calculations were performed with the externally coupled codes ATHLET Mod.1.1 Cycle C and DYN3DH1.1/M3. The standard WWER-440/213 input deck of the ATHLET code was adopted for benchmark purposes and for coupling with the code DYN3D. The first part of the paper briefly characterizes the NPP input deck and the reactor core model. The second part shows the time dependencies of important global and local parameters. In comparison with the results published at the eighth Atomic Energy Research Symposium (Bystrice nad Pernstejnem, 1998), the results published in this paper are based on improved ATHLET descriptions of the control and safety systems. (Author)

  9. Dynamics of Variable Mass Systems

    Science.gov (United States)

    Eke, Fidelis O.

    1998-01-01

    This report presents the results of an investigation of the effects of mass loss on the attitude behavior of spinning bodies in flight. The principal goal is to determine whether there are circumstances under which the motion of variable mass systems can become unstable in the sense that their transverse angular velocities become unbounded. Obviously, results from a study of this kind would find immediate application in the aerospace field. The first part of this study features a complete and mathematically rigorous derivation of a set of equations that govern both the translational and rotational motions of general variable mass systems. The remainder of the study is then devoted to applying the equations obtained to a systematic investigation of the effect of various mass loss scenarios on the dynamics of increasingly complex models of variable mass systems. It is found that mass loss can have a major impact on the dynamics of mechanical systems, including a possible change in the system's stability picture. Factors such as nozzle geometry, combustion chamber geometry, the propellant's initial shape, size and relative mass, and propellant location can all have important influences on the system's dynamic behavior. The relative influence of these parameters on system motion is quantified in a way that is useful for design purposes.
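    As a far simpler illustration than the report's full equations (which include jet damping and internal-flow terms omitted here), conservation of spin angular momentum alone already shows how mass loss alters a body's spin rate. The function and all numbers below are hypothetical:

    ```python
    def spin_rate(h0, inertia_fn, t):
        """Spin rate of a torque-free axisymmetric body whose axial moment of
        inertia changes as propellant is expelled, from H = I(t) * omega(t).
        Neglects the jet damping and transverse dynamics treated in the report.
        """
        return h0 / inertia_fn(t)

    # Hypothetical rocket: axial inertia drops linearly from 100 to 60 kg m^2
    # over a 10 s burn while spin angular momentum (500 kg m^2/s) is conserved.
    inertia = lambda t: 100.0 - 4.0 * t
    omega0 = spin_rate(500.0, inertia, 0.0)      # spin rate at ignition
    omega_end = spin_rate(500.0, inertia, 10.0)  # spin rate at burnout
    ```

    Even this crude model shows spin-up during the burn; the report's contribution is the rigorous treatment of the transverse components, where instability can arise.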

  10. Benchmarking

    OpenAIRE

    Meylianti S., Brigita

    1999-01-01

    Benchmarking has different meanings to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable for a specific application. This paper will discuss those five types of benchmarking in detail, includ...

  11. Supermarket Refrigeration System - Benchmark for Hybrid System Control

    DEFF Research Database (Denmark)

    Sloth, Lars Finn; Izadi-Zamanabadi, Roozbeh; Wisniewski, Rafal

    2007-01-01

    This paper presents a supermarket refrigeration system as a benchmark for the development of new ideas and the comparison of methods for hybrid systems' modeling and control. The benchmark features switched dynamics and discrete-valued inputs, making it a hybrid system; furthermore, the outputs are subjected...

  12. Mass-shell properties of the dynamical quark mass

    International Nuclear Information System (INIS)

    Reinders, L.J.; Stam, K.

    1986-07-01

    We discuss the running dynamical quark mass in the framework of the operator product expansion. It is shown that for |p²| > m² the quark-condensate part of the quark self-energy has no contributions of order m² or higher, and is frozen to its mass-shell value for smaller |p²|. (orig.)

  13. Benchmarking a signpost to excellence in quality and productivity

    CERN Document Server

    Karlof, Bengt

    1993-01-01

    According to the authors, benchmarking exerts a powerful leverage effect on an organization, and they consider some of the factors which justify their claim. Describes how to implement benchmarking and exactly what to benchmark. Explains benchlearning, which integrates education, leadership development and organizational dynamics with the actual work being done, and how to make that work more efficient in terms of quality and productivity.

  14. Contour tracing for segmentation of mammographic masses

    International Nuclear Information System (INIS)

    Elter, Matthias; Held, Christian; Wittenberg, Thomas

    2010-01-01

    CADx systems have the potential to support radiologists in the difficult task of discriminating benign and malignant mammographic lesions. The segmentation of mammographic masses from the background tissue is an important module of CADx systems designed for the characterization of mass lesions. In this work, a novel approach to this task is presented. The segmentation is performed by automatically tracing the mass's contour between manually provided landmark points defined on the mass's margin. The performance of the proposed approach is compared to implementations of three state-of-the-art approaches based on region growing and dynamic programming. For an unbiased comparison of the different segmentation approaches, optimal parameters are selected for each approach by means of tenfold cross-validation and a genetic algorithm. Furthermore, segmentation performance is evaluated on a dataset of ROI and ground-truth pairs. The proposed method outperforms the three state-of-the-art methods. The benchmark dataset will be made available upon publication of this paper and will be the first publicly available benchmark dataset for mass segmentation.
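    The general idea behind such landmark-to-landmark contour tracing can be sketched as a minimal-cost path search on a cost image (for example, inverse gradient magnitude). This is a generic sketch of the kind of path search the dynamic-programming baselines use, not the paper's specific algorithm; the toy image and function name are made up for illustration.

    ```python
    import heapq

    def trace_contour(cost, start, goal):
        """Minimal-cost 8-connected pixel path between two landmark points on a
        cost image, found with Dijkstra's algorithm. Generic sketch only."""
        rows, cols = len(cost), len(cost[0])
        dist = {start: 0.0}
        prev = {}
        heap = [(0.0, start)]
        while heap:
            d, node = heapq.heappop(heap)
            if node == goal:
                break
            if d > dist.get(node, float("inf")):
                continue  # stale queue entry
            r, c = node
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr or dc) and 0 <= nr < rows and 0 <= nc < cols:
                        nd = d + cost[nr][nc]
                        if nd < dist.get((nr, nc), float("inf")):
                            dist[(nr, nc)] = nd
                            prev[(nr, nc)] = node
                            heapq.heappush(heap, (nd, (nr, nc)))
        # Walk back from goal to start to recover the traced contour.
        path, node = [goal], goal
        while node != start:
            node = prev[node]
            path.append(node)
        return path[::-1]

    # Toy 3x4 cost image with a cheap top row: the traced "contour" follows it.
    img = [[1, 1, 1, 1],
           [9, 9, 9, 9],
           [9, 9, 9, 9]]
    path = trace_contour(img, (0, 0), (0, 3))
    ```

    On a real mammogram the cost image would be derived from the ROI so that low cost coincides with the mass margin between the manually placed landmarks.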

  15. Definition of the seventh dynamic AER benchmark-WWER-440 pressure vessel coolant mixing by re-connection of an isolated loop

    International Nuclear Information System (INIS)

    Kotsarev, A.; Lizorkin, M.; Petrin, R.

    2010-01-01

    The seventh dynamic benchmark is a continuation of the efforts to systematically validate codes for estimating the transient behavior of VVER-type nuclear power plants, and follows on from the sixth dynamic benchmark. The transient to be simulated is the re-connection of an isolated circulating loop with low temperature or low boron concentration in a VVER-440 plant. It is proposed to extend the benchmark to other cases in which a different number of loops are in operation, leading to different symmetric and asymmetric core boundary conditions. The purposes of the proposed benchmark are: (1) best-estimate simulation of a transient with coolant flow mixing in the reactor pressure vessel of a WWER-440 plant caused by re-connection of one coolant loop to the several in operation, and (2) performing code-to-code comparisons. The core is at the end of its first cycle with a power of 1196.25 MWt. The main additional feature of the seventh benchmark is the detailed description of the downcomer and bottom part of the reactor vessel, which allows the effects of coolant mixing in the reactor pressure vessel to be described without additional conservative assumptions. The burn-up and power distributions at this reactor state have to be calculated by the participants. The thermohydraulic conditions of the core at the beginning of the transient are specified. Participants' self-generated best-estimate nuclear data are to be used. The main geometrical parameters of the plant and the characteristics of the control and safety systems are also specified. User-generated input data decks developed for a WWER-440 plant and for the applied codes should be used. The behaviour of the plant should be studied with coupled system codes, which combine a three-dimensional neutron kinetics description of the core with a pseudo or real 3D thermohydraulic system code. (Authors)

  16. Dynamics of mechanical systems with variable mass

    CERN Document Server

    Belyaev, Alexander

    2014-01-01

    The book presents up-to-date and unifying formulations for treating dynamics of different types of mechanical systems with variable mass. The starting point is overview of the continuum mechanics relations of balance and jump for open systems from which extended Lagrange and Hamiltonian formulations are derived. Corresponding approaches are stated at the level of analytical mechanics with emphasis on systems with a position-dependent mass and at the level of structural mechanics. Special emphasis is laid upon axially moving structures like belts and chains, and on pipes with an axial flow of fluid. Constitutive relations in the dynamics of systems with variable mass are studied with particular reference to modeling of multi-component mixtures. The dynamics of machines with a variable mass are treated in detail and conservation laws and the stability of motion will be analyzed. Novel finite element formulations for open systems in coupled fluid and structural dynamics are presented.

  17. Reevaluation of the Jezebel Benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-10

    Every nuclear engineering student is familiar with Jezebel, the homogeneous bare sphere of plutonium first assembled at Los Alamos in 1954-1955. The actual Jezebel assembly was neither homogeneous, nor bare, nor spherical; nor was it singular – there were hundreds of Jezebel configurations assembled. The Jezebel benchmark has been reevaluated for the International Criticality Safety Benchmark Evaluation Project (ICSBEP) Handbook. Logbooks, original drawings, mass accountability statements, internal reports, and published reports have been used to model four actual three-dimensional Jezebel assemblies with high fidelity. Because the documentation available today is often inconsistent, three major assumptions were made regarding plutonium part masses and dimensions. The first was that the assembly masses given in Los Alamos report LA-4208 (1969) were correct, and the second was that the original drawing dimension for the polar height of a certain major part was correct. The third assumption was that a change notice indicated on the original drawing was not actually implemented. This talk will describe these assumptions, the alternatives, and the implications. Since the publication of the 2013 ICSBEP Handbook, the actual masses of the major components have turned up. Our assumption regarding the assembly masses was proven correct, but we had the mass distribution incorrect. Work to incorporate the new information is ongoing, and this talk will describe the latest assessment.

  18. Developing a benchmark for emotional analysis of music.

    Science.gov (United States)

    Aljanaki, Anna; Yang, Yi-Hsuan; Soleymani, Mohammad

    2017-01-01

    The music emotion recognition (MER) field has expanded rapidly in the last decade. Many new methods and new audio features have been developed to improve the performance of MER algorithms. However, it is very difficult to compare the performance of new methods because of the diversity of data representations and the scarcity of publicly available data. In this paper, we address these problems by creating a data set and a benchmark for MER. The data set that we release, the MediaEval Database for Emotional Analysis in Music (DEAM), is the largest available data set of dynamic annotations (valence and arousal annotations for 1,802 songs and song excerpts licensed under Creative Commons, with 2 Hz time resolution). Using DEAM, we organized the 'Emotion in Music' task at the MediaEval Multimedia Evaluation Campaign from 2013 to 2015. The benchmark attracted, in total, 21 active teams to participate in the challenge. We analyze the results of the benchmark: the winning algorithms and feature sets. We also describe the design of the benchmark, the evaluation procedures, and the data cleaning and transformations that we suggest. The results from the benchmark suggest that recurrent neural network based approaches combined with large feature sets work best for dynamic MER.

  19. Boiling water reactor turbine trip (TT) benchmark

    International Nuclear Information System (INIS)

    2001-06-01

    In the field of coupled neutronics/thermal-hydraulics computation there is a need to enhance scientific knowledge in order to develop advanced modelling techniques for new nuclear technologies and concepts, as well as for current nuclear applications. Recently developed 'best-estimate' computer code systems for modelling 3-D coupled neutronics/thermal-hydraulics transients in nuclear cores and for the coupling of core phenomena and system dynamics (PWR, BWR, VVER) need to be compared against each other and validated against results from experiments. International benchmark studies have been set up for this purpose. The present volume describes the specification of such a benchmark. The transient addressed is a turbine trip (TT) in a BWR, involving pressurization events in which the coupling between core phenomena and system dynamics plays an important role. In addition, the data made available from experiments carried out at the plant make the present benchmark very valuable. The data used are from events at the Peach Bottom 2 reactor (a GE-designed BWR/4). (authors)

  20. The level 1 and 2 specification for parallel benchmark and a benchmark test of scalar-parallel computer SP2 based on the specifications

    International Nuclear Information System (INIS)

    Orii, Shigeo

    1998-06-01

    A benchmark specification for the performance evaluation of parallel computers for numerical analysis is proposed. The Level 1 benchmark, a conventional benchmark based on processing time, measures the performance of computers running a code. The Level 2 benchmark proposed in this report explains the reasons for that performance. As an example, the scalar-parallel computer SP2 is evaluated with this benchmark specification using a molecular dynamics code. As a result, the main factors suppressing parallel performance are found to be the maximum bandwidth and the start-up time of communication between nodes. In particular, the start-up time is proportional not only to the number of processors but also to the number of particles. (author)
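    The reported bottleneck, a communication start-up cost that grows with both the processor count and the particle count, can be illustrated with a simple timing model. This is a hypothetical sketch, not the report's actual Level 2 model; all parameter names and values are made up:

    ```python
    def predicted_time(n_particles, n_procs, t_flop, bandwidth, t_startup):
        """Toy parallel timing model: compute time shrinks with the processor
        count, while the communication term carries a start-up cost that is
        proportional to both the processor count and the particle count, as
        the abstract reports for SP2.
        """
        compute = t_flop * n_particles / n_procs
        communicate = t_startup * n_procs * n_particles + n_particles / bandwidth
        return compute + communicate

    # With a start-up term growing in both p and N, speedup saturates well
    # below the ideal p-fold speedup.
    t1 = predicted_time(1000, 1, 1e-3, 1e6, 1e-7)
    t64 = predicted_time(1000, 64, 1e-3, 1e6, 1e-7)
    speedup = t1 / t64
    ```

    Under this model, adding processors eventually makes the start-up term dominant, which is exactly the kind of diagnosis a Level 2 benchmark is meant to expose.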

  1. Role of centre vortices in dynamical mass generation

    International Nuclear Information System (INIS)

    Leinweber, Derek B.; Bowman, Patrick O.; Heller, Urs M.; Kusterer, Daniel-Jens; Langfeld, Kurt; Williams, Anthony G.

    2006-01-01

    The mass and renormalization functions of the nonperturbative quark propagator are studied in SU(3) gauge field theory with a Symanzik-improved gluon action and the AsqTad fermion action. Centre vortices in the gauge field are identified by fixing to maximal centre gauge. The role of centre vortices in dynamical mass generation is explored by removing centre vortices from the gauge fields and studying the associated changes in the quark propagator. We find that dynamical mass generation survives in the vortex-removed SU(3) gauge field theory despite the vanishing of the string tension and the suppression of the gluon propagator in the infrared, suggesting the possibility of decoupling dynamical mass generation from confinement.

  2. Dynamical Mass Generation.

    Science.gov (United States)

    Mendel Horwitz, Roberto Ruben

    1982-03-01

    In the framework of the Glashow-Weinberg-Salam model without elementary scalar particles, we show that masses for fermions and intermediate vector bosons can be generated dynamically. The mechanism is the formation of fermion-antifermion pseudoscalar bound states of zero total four-momentum, which form a condensate in the physical vacuum. The force responsible for the binding is the short-distance part of the net Coulomb force due to photon and Z exchange. Fermions and bosons acquire masses through their interaction with this condensate. The neutrinos remain massless because their right-handed components have no interactions. Also the charge -1/3 quarks remain massless, because the repulsive force from the Z exchange dominates over the Coulomb force. To correct this, we propose two possible modifications to the theory. One is to cut off the Z exchange at very small distances, so that all fermions except the neutrinos acquire masses, which are then purely electromagnetic in origin. The other is to introduce an additional gauge boson that couples to all quarks with a pure vector coupling. To make this vector boson unobservable at usual energies, at least two new fermions must couple to it. The vector boson squared masses receive additive contributions from all the fermion squared masses. The photon remains massless, and the masses of the Z and W± bosons are shown to be related through the Weinberg angle in the conventional way. Assuming only three families of fermions, we obtain estimates for the top quark mass.

  3. Interim results of the sixth three-dimensional AER dynamic benchmark problem calculation. Solution of problem with DYN3D and RELAP5-3D codes

    International Nuclear Information System (INIS)

    Hadek, J.; Kral, P.; Macek, J.

    2001-01-01

    The paper gives a brief survey of the sixth three-dimensional AER dynamic benchmark calculation results received with the codes DYN3D and RELAP5-3D at NRI Rez. This benchmark was defined at the 10th AER Symposium. Its initiating event is a double-ended break in the steam line of steam generator No. 1 in a WWER-440/213 plant at the end of the first fuel cycle under hot full-power conditions. Stationary and burnup calculations, as well as tuning of the initial state before the transient, were performed with the code DYN3D. Transient calculations were made with the system code RELAP5-3D. The KASSETA library was used for the generation of the reactor core neutronic parameters. The detailed six-loop model of NPP Dukovany was adopted for the sixth AER dynamic benchmark purposes. The RELAP5-3D full-core neutronic model was coupled with a seven-coolant-channel thermal-hydraulic model of the core. (Authors)

  4. Benchmarking in Identifying Priority Directions of Development of Telecommunication Operators

    Directory of Open Access Journals (Sweden)

    Zaharchenko Lolita A.

    2013-12-01

    The article analyses the evolution and possible applications of benchmarking in the telecommunication sphere. It studies the essence of benchmarking by generalizing the approaches of different scientists to the definition of this notion. In order to improve the activity of telecommunication operators, the article identifies the benchmarking technology and the main factors that determine an operator's success in the modern market economy, the mechanism of benchmarking, and the component stages of carrying out benchmarking by a telecommunication operator. It analyses the telecommunication market and identifies the dynamics of its development and tendencies in the changing composition of telecommunication operators and providers. Having generalized the existing experience of benchmarking application, the article identifies the main types of benchmarking of telecommunication operators by the following features: by the level at which it is conducted (branch, inter-branch and international benchmarking); by participation in the conduct (competitive and joint); and with respect to the enterprise environment (internal and external).

  5. Benchmarking the Cost per Person of Mass Treatment for Selected Neglected Tropical Diseases: An Approach Based on Literature Review and Meta-regression with Web-Based Software Application.

    Directory of Open Access Journals (Sweden)

    Christopher Fitzpatrick

    2016-12-01

    Advocacy around mass treatment for the elimination of selected Neglected Tropical Diseases (NTDs) has typically put the cost per person treated at less than US$ 0.50. Whilst useful for advocacy, the focus on a single number misrepresents the complexity of delivering "free" donated medicines to about a billion people across the world. We perform a literature review and meta-regression of the cost per person per round of mass treatment against NTDs. We develop a web-based software application (https://healthy.shinyapps.io/benchmark/) to calculate setting-specific unit costs against which programme budgets and expenditures or results-based pay-outs can be benchmarked. We reviewed costing studies of mass treatment for the control, elimination or eradication of lymphatic filariasis, schistosomiasis, soil-transmitted helminthiasis, onchocerciasis, trachoma and yaws, the six main NTDs for which mass treatment is recommended. We extracted financial and economic unit costs, adjusted to a standard definition and base year, and regressed unit costs on the number of people treated and other explanatory variables. Regression results were used to "predict" country-specific unit cost benchmarks. We reviewed 56 costing studies and included in the meta-regression 34 studies from 23 countries and 91 sites. Unit costs were found to be very sensitive to economies of scale, and to the decision of whether or not to use local volunteers. Financial unit costs are expected to be less than 2015 US$ 0.50 in most countries for programmes that treat 100 thousand people or more. However, for smaller programmes, including those in the "last mile", or those that cannot rely on local volunteers, both economic and financial unit costs are expected to be higher. The available evidence confirms that mass treatment offers a low-cost public health intervention on the path towards universal health coverage. However, more costing studies focussed on elimination are needed.
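    The economies-of-scale finding can be illustrated with a minimal log-log meta-regression of unit cost on the number of people treated. The data and coefficients below are synthetic, made up purely to show the mechanics; they are not the study's estimates:

    ```python
    import math
    import random

    # Synthetic data: log(unit cost) falls with log(people treated) plus noise.
    random.seed(0)
    n_treated = [10 ** random.uniform(3, 6) for _ in range(40)]
    log_cost = [math.log(5.0) - 0.4 * math.log(n) + random.gauss(0, 0.2)
                for n in n_treated]

    # Ordinary least squares on the log-log relationship.
    x = [math.log(n) for n in n_treated]
    xbar = sum(x) / len(x)
    ybar = sum(log_cost) / len(log_cost)
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, log_cost))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar

    # Benchmark unit cost "predicted" for a programme treating 100,000 people:
    benchmark = math.exp(intercept + slope * math.log(1e5))
    ```

    A negative slope on the log scale is what "very sensitive to economies of scale" means in practice: each tenfold increase in people treated multiplies the predicted unit cost by a factor below one. The web application applies the same idea with the study's actual covariates.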

  6. Infrared divergences, mass shell singularities and gauge dependence of the dynamical fermion mass

    International Nuclear Information System (INIS)

    Das, Ashok K.; Frenkel, J.; Schubert, C.

    2013-01-01

    We study the behavior of the dynamical fermion mass when infrared divergences and mass shell singularities are present in a gauge theory. In particular, in the massive Schwinger model in covariant gauges we find that the pole of the fermion propagator is divergent and gauge dependent at one loop, but the leading singularities cancel in the quenched rainbow approximation. On the other hand, in physical gauges, we find that the dynamical fermion mass is finite and gauge independent, at least up to one loop.

  7. Large mass hierarchies from strongly-coupled dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Athenodorou, Andreas [Department of Physics, University of Cyprus,B.O. Box 20537, 1678 Nicosia (Cyprus); Bennett, Ed [Department of Physics, College of Science, Swansea University,Singleton Park, Swansea SA2 8PP (United Kingdom); Kobayashi-Maskawa Institute for the Origin of Particles and the Universe (KMI),Nagoya University,Furo, Chikusa, Nagoya 464-8602 (Japan); Bergner, Georg [Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics,University of Bern,Sidlerstrasse 5, CH-3012 Bern (Switzerland); Elander, Daniel [National Institute for Theoretical Physics, School of Physics andMandelstam Institute for Theoretical Physics, University of the Witwatersrand,1 Jan Smuts Avenue, Johannesburg, Wits 2050 (South Africa); Lin, C.-J. David [Institute of Physics, National Chiao-Tung University,1001 Ta-Hsueh Road, Hsinchu 30010, Taiwan (China); CNRS, Aix Marseille Université, Université de Toulon, Centre de Physique Théorique,UMR 7332, F-13288 Marseille (France); Lucini, Biagio; Piai, Maurizio [Department of Physics, College of Science, Swansea University,Singleton Park, Swansea SA2 8PP (United Kingdom)

    2016-06-20

    Besides the Higgs particle discovered in 2012, with mass 125 GeV, recent LHC data show tentative signals for new resonances in diboson as well as diphoton searches at high center-of-mass energies (2 TeV and 750 GeV, respectively). If these signals are confirmed (or other new resonances are discovered at the TeV scale), the large hierarchies between masses of new bosons require a dynamical explanation. Motivated by these tentative signals of new physics, we investigate the theoretical possibility that large hierarchies in the masses of glueballs could arise dynamically in new strongly-coupled gauge theories extending the standard model of particle physics. We study lattice data on non-Abelian gauge theories in the (near-)conformal regime as well as a simple toy model in the context of gauge/gravity dualities. We focus our attention on the ratio R between the mass of the lightest spin-2 and spin-0 resonances, that for technical reasons is a particularly convenient and clean observable to study. For models in which (non-perturbative) large anomalous dimensions arise dynamically, we show indications that this mass ratio can be large, with R>5. Moreover, our results suggest that R might be related to universal properties of the IR fixed point. Our findings provide an interesting step towards understanding large mass ratios in the non-perturbative regime of quantum field theories with (near) IR conformal behaviour.

  8. Trickle-bed root culture bioreactor design and scale-up: growth, fluid-dynamics, and oxygen mass transfer.

    Science.gov (United States)

    Ramakrishnan, Divakar; Curtis, Wayne R

    2004-10-20

    Trickle-bed root culture reactors are shown to achieve tissue concentrations as high as 36 g DW/L (752 g FW/L) at a scale of 14 L. Root growth rate in a 1.6-L reactor configuration with improved operational conditions is shown to be indistinguishable from the laboratory-scale benchmark, the shaker flask (mu=0.33 day(-1)). These results demonstrate that trickle-bed reactor systems can sustain tissue concentrations, growth rates and volumetric biomass productivities substantially higher than other reported bioreactor configurations. Mass transfer and fluid dynamics are characterized in trickle-bed root reactors to identify appropriate operating conditions and scale-up criteria. Root tissue respiration goes through a minimum with increasing liquid flow, which is qualitatively consistent with traditional trickle-bed performance. However, liquid hold-up is much higher than traditional trickle-beds and alternative correlations based on liquid hold-up per unit tissue mass are required to account for large changes in biomass volume fraction. Bioreactor characterization is sufficient to carry out preliminary design calculations that indicate scale-up feasibility to at least 10,000 liters.

  9. Dynamical mass generation in QED with weak magnetic fields

    International Nuclear Information System (INIS)

    Ayala, A.; Rojas, E.; Bashir, A.; Raya, A.

    2006-01-01

    We study the dynamical generation of masses for fundamental fermions in quenched quantum electrodynamics in the presence of magnetic fields using Schwinger-Dyson equations. We show that, contrary to the case where the magnetic field is strong, in the weak field limit eB << m(0)^2, where m(0) is the value of the dynamically generated mass in the absence of the magnetic field, masses are generated above a critical value of the coupling, and that this value is the same as in the case with no magnetic field. We carry out a numerical analysis to study the magnetic field dependence of the mass function above critical coupling and show that in this regime the dynamically generated mass and the chiral condensate for the lowest Landau level increase proportionally to (eB)^2.

  10. Information feedback and mass media effects in cultural dynamics

    OpenAIRE

    Gonzalez-Avella, J. C.; Cosenza, M. G.; Klemm, K.; Eguiluz, V. M.; Miguel, M. San

    2007-01-01

    We study the effects of different forms of information feedback associated with mass media on an agent-based model of the dynamics of cultural dissemination. In addition to some processes previously considered, we also examine a model of local mass media influence in cultural dynamics. Two mechanisms of information feedback are investigated: (i) direct mass media influence, where local or global mass media act as an additional element in the network of interactions of each agent, and (i...

  11. Anisotropic dynamic mass density for fluid-solid composites

    KAUST Repository

    Wu, Ying; Mei, Jun; Sheng, Ping

    2012-01-01

    By taking the low frequency limit of multiple-scattering theory, we obtain the dynamic effective mass density of fluid-solid composites with a two-dimensional rectangular lattice structure. The anisotropic mass density can be described by an angle

  12. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure the performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performance and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties
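    A scoring system of the kind described, one that combines data–model mismatches across processes, can be sketched in a few lines. The normalized-RMSE mismatch, the exponential skill mapping, and the equal weights below are illustrative assumptions, not the paper's specification.

```python
import math

def nrmse(obs, sim):
    """Normalized root-mean-square error between observed and simulated series."""
    n = len(obs)
    rmse = math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)
    mean_obs = sum(obs) / n
    return rmse / abs(mean_obs)

def benchmark_score(mismatches, weights):
    """Combine per-process mismatches into a single 0-1 skill score.

    Each mismatch is mapped to exp(-NRMSE), so a perfect match scores 1
    and large errors decay toward 0; scores are then weight-averaged.
    """
    total_w = sum(weights)
    return sum(w * math.exp(-m) for m, w in zip(mismatches, weights)) / total_w

# Hypothetical example: a carbon flux that matches observations well,
# and an energy flux that matches less well.
carbon = nrmse([2.0, 2.5, 3.0], [2.1, 2.4, 3.2])
energy = nrmse([100.0, 120.0, 90.0], [130.0, 100.0, 110.0])
score = benchmark_score([carbon, energy], weights=[1.0, 1.0])
```

    A scoring system like this also accommodates the a priori thresholds mentioned in the abstract: a model could be required to exceed a minimum score per process before the aggregate is computed.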

  13. Boiling water reactor turbine trip (TT) benchmark

    International Nuclear Information System (INIS)

    2005-01-01

    In the field of coupled neutronics/thermal-hydraulics computation there is a need to enhance scientific knowledge in order to develop advanced modelling techniques for new nuclear technologies and concepts as well as for current applications. Recently developed 'best-estimate' computer code systems for modelling 3-D coupled neutronics/thermal-hydraulics transients in nuclear cores and for coupling core phenomena and system dynamics (PWR, BWR, VVER) need to be compared against each other and validated against results from experiments. International benchmark studies have been set up for this purpose. The present report is the second in a series of four and summarises the results of the first benchmark exercise, which identifies the key parameters and important issues concerning the thermal-hydraulic system modelling of the transient, with specified core average axial power distribution and fission power time transient history. The transient addressed is a turbine trip in a boiling water reactor, involving pressurization events in which the coupling between core phenomena and system dynamics plays an important role. In addition, the data made available from experiments carried out at the Peach Bottom 2 reactor (a GE-designed BWR/4) make the present benchmark particularly valuable. (author)

  14. Design and testing of corrosion damaged prestressed concrete joists: the Pescara Benchmark

    International Nuclear Information System (INIS)

    Di Evangelista, A; De Leonardis, A; Valente, C; Zuccarino, L

    2011-01-01

    An experimental campaign named the Pescara benchmark, devoted to studying the dynamic behaviour of corroded p.c. joists, has been conducted. Steel corrosion reduces the area of the reinforcement and causes cracking of the concrete, so that r/c members are subject to loss of strength and stiffness. It is of interest to evaluate the corrosion level at which the damage can be detected through signal processing procedures, and how close that level is to the r/c member safety limits. Joists of current industrial production having different steel to concrete ratios are tested in different laboratory conditions. Dynamic tests involve both free vibrations and forced vibrations due to a moving mass simulating actual traffic loads in railway bridges. The paper discusses the rationale of the tests, including the set-up of the artificial corrosion, the static characterization of the joists and the dynamic tests at the different stages of corrosion experienced.

  15. Salivary gland masses. Dynamic MR imaging and pathologic correlation

    International Nuclear Information System (INIS)

    Park, Jinho; Inoue, Shingo; Ishizuka, Yasuhito; Shindo, Hiroaki; Kawanishi, Masayuki; Kakizaki, Dai; Abe, Kimihiko; Ebihara, Yoshiro

    1997-01-01

    To evaluate the efficiency of dynamic contrast-enhanced magnetic resonance imaging (MRI) for the diagnosis of salivary gland masses, we retrospectively examined 19 salivary gland masses that were pathologically diagnosed by surgical operation or biopsy. We obtained T1- and T2-weighted images on MRI, performed dynamic studies on each mass and examined the correlation between enhancement patterns and pathological findings. Four enhancement patterns were recognized on contrast-enhanced MRI: type 1 showed marked, homogeneous enhancement; type 2, slight homogeneous enhancement; type 3, marginal enhancement; and type 4, poor enhancement of the mass. Most pleomorphic adenomas had a type 1 enhancement pattern, but two had a type 2 pattern. Pathologically, each mass enhancement pattern had different tumor cell and matrix components. Warthin's tumor generally showed the type 4 pattern. Primary malignant tumors of the salivary gland all showed the type 3 pattern, and pathological specimens showed many tumor cells along the marginal portion of the tumor. One inflammatory cyst and one Warthin's tumor also showed the type 3 pattern. Except for a metastatic renal cell carcinoma, the enhancement patterns of late phase images and dynamic study images were the same. Dynamic MRI added little diagnostic information about salivary gland masses, but the contrast-enhanced MR features correlated well with the pathological findings. (author)

  16. Full sphere hydrodynamic and dynamo benchmarks

    KAUST Repository

    Marti, P.

    2014-01-26

    Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, the approximation of a whole sphere problem by a domain that is a spherical shell (a sphere possessing an inner core) does not represent an adequate approximation to the system, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.

  17. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  18. Mass transfer dynamics in double degenerate binary systems

    International Nuclear Information System (INIS)

    Dan, M; Rosswog, S; Brueggen, M

    2009-01-01

    We present a numerical study of the mass transfer dynamics prior to the gravitational wave-driven merger of a double white dwarf system. Recently, there has been some discussion about the dynamics of these last stages, as different methods seemed to provide qualitatively different results. While earlier SPH simulations indicated a very quick disruption of the binary on roughly the orbital time scale, more recent grid-based calculations find long-lived mass transfer for many orbital periods. Here we demonstrate how sensitive the dynamics of this last stage is to the exact initial conditions. We show that, after a careful preparation of the initial conditions, the reportedly short-lived systems undergo mass transfer for many dozens of orbits. The reported numbers of orbits are resolution-biased and therefore represent only lower limits to what is realized in nature. Nevertheless, the study convincingly shows the convergence of different methods to very similar results.

  19. Benchmarking local healthcare-associated infections: Available benchmarks and interpretation challenges

    Directory of Open Access Journals (Sweden)

    Aiman El-Saed

    2013-10-01

    Summary: Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments, including regulations, should be taken into consideration when considering such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons. Keywords: Benchmarking, Comparison, Surveillance, Healthcare-associated infections
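    One of the risk-adjustment methods named above, indirect standardization, reduces to a standardized infection ratio (SIR): observed infections divided by the count expected under benchmark rates. The stratum rates and device-day counts below are hypothetical, chosen only to make the arithmetic concrete.

```python
def standardized_infection_ratio(observed, expected):
    """SIR = observed HAI count / risk-adjusted expected count.

    The expected count comes from applying benchmark (e.g. national)
    stratum-specific rates to the local facility's exposure data.
    SIR > 1 means more infections than the benchmark predicts.
    """
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return observed / expected

# Hypothetical facility: benchmark rates per 1000 device-days,
# applied stratum by stratum to local device-day totals.
benchmark_rates = {"ICU": 2.0, "ward": 0.8}      # per 1000 device-days
local_device_days = {"ICU": 4000, "ward": 10000}
expected = sum(benchmark_rates[s] * local_device_days[s] / 1000
               for s in benchmark_rates)          # 8 + 8 = 16 expected HAIs
sir = standardized_infection_ratio(observed=12, expected=expected)
```

    Here the facility observed 12 infections against 16 expected, so the SIR is below 1 even though its crude rate might exceed that of a benchmark facility with a different case mix.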

  20. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems, and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool iotrace developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows

  1. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  2. On the gauge (in)dependence of the dynamical quark mass

    International Nuclear Information System (INIS)

    Reinders, L.J.; Stam, K.

    1987-04-01

    We compute the contribution of the mixed quark-gluon condensate to the quark self-energy to all orders in the dynamical quark mass. We investigate the consistency of different expansion schemes. It is found that nonabelian interactions form an obstruction to defining a true dynamical gauge independent mass shell. (orig.)

  3. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  4. (Invited) Microreactors for Characterization and Benchmarking of Photocatalysts

    DEFF Research Database (Denmark)

    Vesborg, Peter Christian Kjærgaard; Dionigi, Fabio; Trimarco, Daniel Bøndergaard

    2015-01-01

    In the field of photocatalysis the batch-nature of the typical benchmarking experiment makes it very laborious to obtain good kinetic data as a function of parameters such as illumination wavelength, irradiance, catalyst temperature, reactant composition, etc. Microreactors with on-line mass...

  5. Dynamical limits on dark mass in the outer solar system

    International Nuclear Information System (INIS)

    Hogg, D.W.; Quinlan, G.D.; Tremaine, S.

    1991-01-01

    Simplified model solar systems with known observational errors are considered in conducting a dynamical search for dark mass and its minimum detectable amount, and in determining the significance of observed anomalies. The numerical analysis of the dynamical influence of dark mass on the orbits of outer planets and comets is presented in detail. Most conclusions presented are based on observations of the four giant planets where the observational errors in latitude and longitude are independent Gaussian variables with a standard deviation. Neptune's long orbital period cannot be predicted by modern ephemerides, and no evidence of dark mass is found in considering this planet. Studying the improvement in fit when observations are fitted to models that consider dark mass is found to be an efficient way to detect dark mass. Planet X must have a mass of more than about 10 times the minimum detectable mass to locate the hypothetical planet. It is suggested that the IRAS survey would have already located the Planet X if it is so massive and close that it dynamically influences the outer planets. Orbital residuals from comets are found to be more effective than those from planets in detecting the Kuiper belt. 35 refs
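    The "improvement in fit" criterion described above can be illustrated with a toy chi-square comparison between a model without dark mass and one with a single extra dark-mass parameter. The residual values, the error bar, and the detection threshold below are illustrative assumptions, not numbers from the study.

```python
def chi_square(residuals, sigma):
    """Chi-square of orbital residuals given a common measurement error sigma."""
    return sum((r / sigma) ** 2 for r in residuals)

# Hypothetical longitude residuals (arcsec) under the two fitted models.
sigma = 0.5
resid_no_dark   = [0.9, -1.1, 1.0, -0.8, 1.2]
resid_with_dark = [0.2, -0.3, 0.1, -0.2, 0.3]

delta_chi2 = chi_square(resid_no_dark, sigma) - chi_square(resid_with_dark, sigma)
# A large drop in chi-square for one added parameter signals a detection;
# the threshold of 9 (roughly 3-sigma for one parameter) is a rule of thumb.
significant = delta_chi2 > 9.0
```

    The minimum detectable mass in this framing is the smallest perturber mass for which the chi-square improvement reliably clears the chosen threshold given the stated observational errors.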

  6. Benchmarking of FA2D/PARCS Code Package

    International Nuclear Information System (INIS)

    Grgic, D.; Jecmenica, R.; Pevec, D.

    2006-01-01

    FA2D/PARCS code package is used at Faculty of Electrical Engineering and Computing (FER), University of Zagreb, for static and dynamic reactor core analyses. It consists of two codes: FA2D and PARCS. FA2D is a multigroup two dimensional transport theory code for burn-up calculations based on collision probability method, developed at FER. It generates homogenised cross sections both of single pins and entire fuel assemblies. PARCS is an advanced nodal code developed at Purdue University for US NRC and it is based on neutron diffusion theory for three dimensional whole core static and dynamic calculations. It is modified at FER to enable internal 3D depletion calculation and usage of neutron cross section data in a format produced by FA2D and interface codes. The FA2D/PARCS code system has been validated on NPP Krsko operational data (Cycles 1 and 21). As we intend to use this code package for development of IRIS reactor loading patterns the first logical step was to validate the FA2D/PARCS code package on a set of IRIS benchmarks, starting from simple unit fuel cell, via fuel assembly, to full core benchmark. The IRIS 17x17 fuel with erbium burnable absorber was used in last full core benchmark. The results of modelling the IRIS full core benchmark using FA2D/PARCS code package have been compared with reference data showing the adequacy of FA2D/PARCS code package model for IRIS reactor core design. (author)

  7. Dirac Mass Dynamics in Multidimensional Nonlocal Parabolic Equations

    KAUST Repository

    Lorz, Alexander

    2011-01-17

    Nonlocal Lotka-Volterra models have the property that solutions concentrate as Dirac masses in the limit of small diffusion. Is it possible to describe the dynamics of the limiting concentration points and of the weights of the Dirac masses? What is the long time asymptotics of these Dirac masses? Can several Dirac masses coexist? We will explain how these questions relate to the so-called "constrained Hamilton-Jacobi equation" and how a form of canonical equation can be established. This equation has been established assuming smoothness. Here we build a framework where smooth solutions exist and thus the full theory can be developed rigorously. We also show that our form of canonical equation comes with a kind of Lyapunov functional. Numerical simulations show that the trajectories can exhibit unexpected dynamics well explained by this equation. Our motivation comes from population adaptive evolution, a branch of mathematical ecology which models Darwinian evolution. © Taylor & Francis Group, LLC.

  8. BENCHMARKING AND CONFIGURATION OF OPEN-SOURCE MANUFACTURING EXECUTION SYSTEM (MES) APPLICATION

    Directory of Open Access Journals (Sweden)

    Ganesha Nur Laksmana

    2013-05-01

    Information is now an important element for every growing industry in the world. In order to keep up with other competitors, endless improvements in optimizing overall efficiency are needed. There still exist barriers that separate departments in PT. XYZ and cause limitations to information sharing in the system. Open-Source Manufacturing Execution System (MES) presents an IT-based application that offers a wide variety of customization to eliminate stovepipes by sharing information between departments. Benchmarking is used to choose the best Open-Source MES application, and the Dynamic System Development Method (DSDM) is adopted as this work's guideline. As a result, recommendations for the chosen Open-Source MES application are presented. Keywords: Manufacturing Execution System (MES); Open Source; Dynamic System Development Method (DSDM); Benchmarking; Configuration

  9. Comparative analysis for low-mass and low-inertia dynamic balancing of mechanisms

    NARCIS (Netherlands)

    van der Wijk, V.; Demeulenaere, B.; Gosselin, C.M.; Herder, Justus Laurens

    2012-01-01

    Dynamic balance is an important feature of high-speed mechanisms and robots that need to minimize vibrations of the base. The main disadvantage of dynamic balancing, however, is that it is accompanied by a considerable increase in mass and inertia. Aiming at low-mass and low-inertia dynamic

  10. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    Science.gov (United States)

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
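    A QC-based batch correction of the kind this dataset supports can be sketched as a simple median scaling: each batch's intensities are rescaled so its QC-sample median matches the global QC median. The scaling rule and the toy numbers are illustrative assumptions, not the authors' algorithm.

```python
from statistics import median

def qc_batch_correct(intensities, batch_ids, qc_flags):
    """Scale each batch so its QC-sample median matches the global QC median.

    intensities: peak intensities for one metabolite feature, one per sample
    batch_ids:   batch label per sample
    qc_flags:    True where the sample is a pooled QC
    """
    qc_all = [x for x, q in zip(intensities, qc_flags) if q]
    global_med = median(qc_all)
    batch_med = {}
    for b in set(batch_ids):
        batch_med[b] = median(x for x, bb, q in zip(intensities, batch_ids, qc_flags)
                              if bb == b and q)
    return [x * global_med / batch_med[b] for x, b in zip(intensities, batch_ids)]

# Toy data: two batches, one QC and one biological sample each.
vals  = [10.0, 8.0, 20.0, 22.0]
batch = [1, 1, 2, 2]
is_qc = [True, False, True, False]
corrected = qc_batch_correct(vals, batch, is_qc)
```

    After correction both QC samples sit at the global QC median, and any residual spread in the repeatedly-measured biological samples gives an independent estimate of how well the correction worked, which is exactly the assessment the dataset was designed to enable.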

  11. Creation of a simplified benchmark model for the neptunium sphere experiment

    International Nuclear Information System (INIS)

    Mosteller, Russell D.; Loaiza, David J.; Sanchez, Rene G.

    2004-01-01

    Although neptunium is produced in significant amounts by nuclear power reactors, its critical mass is not well known. In addition, sizeable uncertainties exist for its cross sections. As an important step toward resolution of these issues, a critical experiment was conducted in 2002 at the Los Alamos Critical Experiments Facility. In the experiment, a 6-kg sphere of 237 Np was surrounded by nested hemispherical shells of highly enriched uranium. The shells were required in order to reach a critical condition. Subsequently, a detailed model of the experiment was developed. This model faithfully reproduces the components of the experiment, but it is geometrically complex. Furthermore, the isotopics analysis upon which that model is based omits nearly 1% of the mass of the sphere. A simplified benchmark model has been constructed that retains all of the neutronically important aspects of the detailed model and substantially reduces the computer resources required for the calculation. The reactivity impact of each of the simplifications is quantified, including the effect of the missing mass. A complete set of specifications for the benchmark is included in the full paper. Both the detailed and simplified benchmark models underpredict k-eff by more than 1% Δk. This discrepancy supports the suspicion that better cross sections are needed for 237 Np.
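    The quoted reactivity impacts can be expressed from k-eff values with the standard definition ρ = (k − 1)/k. The k-eff values in the example below are hypothetical placeholders, not the benchmark's results; only the conversion formula is standard.

```python
def reactivity_pcm(k_eff):
    """Reactivity rho = (k - 1)/k, expressed in pcm (1e-5)."""
    return (k_eff - 1.0) / k_eff * 1e5

# Hypothetical k-eff values for a detailed and a simplified model of
# the same configuration; their difference is the reactivity impact
# of the simplifications.
detailed   = reactivity_pcm(0.98950)
simplified = reactivity_pcm(0.98975)
impact_of_simplifications = simplified - detailed  # pcm
```

    An underprediction of "more than 1% Δk" corresponds to a reactivity bias on the order of a thousand pcm, large enough to dominate the small reactivity effects of the individual model simplifications.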

  12. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    Full Text Available In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be an appropriate tool in the search for a point of reference necessary to assess an institution's competitive position and to learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking applications in HEIs worldwide. The study indicates the premises for using benchmarking in HEIs and contains a detailed examination of the types, approaches and scope of benchmarking initiatives. This thorough insight into benchmarking applications enabled the development of a classification of benchmarking undertakings in HEIs. The paper reviews the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen to exemplify different approaches to benchmarking in a higher education setting. The study was performed on the basis of published reports from benchmarking projects, the scientific literature and the author's experience of active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived from the conducted analysis.

  13. Conserved peptide fragmentation as a benchmarking tool for mass spectrometers and a discriminating feature for targeted proteomics.

    Science.gov (United States)

    Toprak, Umut H; Gillet, Ludovic C; Maiolica, Alessio; Navarro, Pedro; Leitner, Alexander; Aebersold, Ruedi

    2014-08-01

    Quantifying the similarity of spectra is an important task in various areas of spectroscopy, for example, to identify a compound by comparing sample spectra to those of reference standards. In mass spectrometry based discovery proteomics, spectral comparisons are used to infer the amino acid sequence of peptides. In targeted proteomics by selected reaction monitoring (SRM) or SWATH MS, predetermined sets of fragment ion signals integrated over chromatographic time are used to identify target peptides in complex samples. In both cases, confidence in peptide identification is directly related to the quality of spectral matches. In this study, we used sets of simulated spectra of well-controlled dissimilarity to benchmark different spectral comparison measures and to develop a robust scoring scheme that quantifies the similarity of fragment ion spectra. We applied the normalized spectral contrast angle score to quantify the similarity of spectra to objectively assess fragment ion variability of tandem mass spectrometric datasets, to evaluate portability of peptide fragment ion spectra for targeted mass spectrometry across different types of mass spectrometers and to discriminate target assays from decoys in targeted proteomics. Altogether, this study validates the use of the normalized spectral contrast angle as a sensitive spectral similarity measure for targeted proteomics, and more generally provides a methodology to assess the performance of spectral comparisons and to support the rational selection of the most appropriate similarity measure. The algorithms used in this study are made publicly available as an open source toolset with a graphical user interface. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
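    As a rough sketch of the similarity measure discussed above: the spectral contrast angle is the angle between two spectra viewed as intensity vectors, and normalizing it to [0, 1] gives a score of 1 for identical relative intensities and 0 for spectra with no shared signal. The formulation below is illustrative; details such as the intensity transformation used in the paper (e.g. square-root scaling) are not reproduced.

```python
import math

def norm_contrast_angle(a, b):
    """Normalized spectral contrast angle between two fragment-ion
    intensity vectors (same m/z ordering assumed): 1.0 = identical
    relative intensities, 0.0 = orthogonal (no shared signal)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # clamp guards against tiny floating-point excursions outside [-1, 1]
    theta = math.acos(max(-1.0, min(1.0, dot / (na * nb))))
    return 1.0 - 2.0 * theta / math.pi

print(norm_contrast_angle([3, 4], [6, 8]))   # 1.0: proportional spectra
print(norm_contrast_angle([1, 0], [0, 1]))   # 0.0: no fragments in common
```

Intermediate values quantify partial similarity, which is what allows the score to rank target assays against decoys.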

  14. Preliminary results of the seventh three-dimensional AER dynamic benchmark problem calculation. Solution with DYN3D and RELAP5-3D codes

    International Nuclear Information System (INIS)

    Bencik, M.; Hadek, J.

    2011-01-01

    The paper gives a brief survey of the seventh three-dimensional AER dynamic benchmark calculation results obtained with the codes DYN3D and RELAP5-3D at the Nuclear Research Institute Rez. This benchmark was defined at the twentieth AER Symposium in Hanasaari (Finland). It is focused on the investigation of transient behaviour in a WWER-440 nuclear power plant. Its initiating event is the opening of the main isolation valve and re-connection of the loop with its main circulation pump in operation. The WWER-440 plant is at the end of the first fuel cycle and in hot full power conditions. Stationary and burnup calculations were performed with the code DYN3D. The transient calculation was made with the system code RELAP5-3D. The two-group homogenized cross-section library HELGD05 created by the HELIOS code was used for the generation of reactor core neutronic parameters. The detailed six-loop model of NPP Dukovany was adapted for the purposes of the seventh AER dynamic benchmark. The RELAP5-3D full-core neutronic model was coupled with 49 core thermal-hydraulic channels and 8 reflector channels connected with the three-dimensional model of the reactor vessel. A detailed nodalization of the reactor downcomer and the lower and upper plenum was used, and mixing in the lower and upper plenum was simulated. The first part of the paper contains a brief characterization of the RELAP5-3D system code and a short description of the NPP input deck and reactor core model. The second part shows the time dependencies of important global and local parameters. (Authors)

  15. Piping benchmark problems for the Westinghouse AP600 Standardized Plant

    International Nuclear Information System (INIS)

    Bezler, P.; DeGrassi, G.; Braverman, J.; Wang, Y.K.

    1997-01-01

    To satisfy the need for verification of the computer programs and modeling techniques that will be used to perform the final piping analyses for the Westinghouse AP600 Standardized Plant, three benchmark problems were developed. The problems are representative piping systems subjected to representative dynamic loads with solutions developed using the methods being proposed for analysis for the AP600 standard design. It will be required that the combined license licensees demonstrate that their solutions to these problems are in agreement with the benchmark problem set

  16. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM), in Indonesian termed holistic quality management, because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a process of systematic and continuous measurement: comparing an organization's business processes against those of other organizations in order to obtain information that can help the organization improve its performance.

  17. FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark

    International Nuclear Information System (INIS)

    Sawan, M.E.

    1994-12-01

    During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)

  18. Constraining dynamical neutrino mass generation with cosmological data

    Energy Technology Data Exchange (ETDEWEB)

    Koksbang, S.M.; Hannestad, S., E-mail: koksbang@phys.au.dk, E-mail: sth@phys.au.dk [Department of Physics and Astronomy, University of Aarhus, DK-8000 Aarhus C (Denmark)

    2017-09-01

    We study models in which neutrino masses are generated dynamically at cosmologically late times. Our study is purely phenomenological and parameterized in terms of three effective parameters characterizing the redshift of mass generation, the width of the transition region, and the present-day neutrino mass. We also study the possibility that neutrinos become strongly self-interacting at the time when the mass is generated. We find that in a number of cases, models with large present-day neutrino masses are allowed by current CMB, BAO and supernova data. The increase in the allowed mass range makes it possible that a non-zero neutrino mass could be measured in direct detection experiments such as KATRIN. Intriguingly, we also find that there are allowed models in which neutrinos become strongly self-interacting around the epoch of recombination.
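    The three-parameter description can be illustrated with a simple functional form: a neutrino mass that switches on around a transition redshift z_s over a width dz, reaching the present-day value m0 at z = 0. The tanh profile and parameter values below are illustrative assumptions, not the parameterization used in the paper.

```python
import math

def m_nu(z, m0=0.3, z_s=10.0, dz=1.0):
    """Toy redshift-dependent neutrino mass in eV (hypothetical form):
    ~0 well before the transition at z_s, ~m0 well after it."""
    return 0.5 * m0 * (1.0 + math.tanh((z_s - z) / dz))

print(round(m_nu(0.0), 4))    # -> 0.3 : mass fully generated today
print(round(m_nu(100.0), 4))  # -> 0.0 : effectively massless early on
```

With such a profile the neutrinos are relativistic and nearly massless during recombination, which is what relaxes the usual cosmological mass bound.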

  19. The effect of dynamical quark mass on the calculation of a strange quark star's structure

    Institute of Scientific and Technical Information of China (English)

    Gholam Hossein Bordbar; Babak Ziaei

    2012-01-01

    We discuss the dynamical behavior of strange quark matter components, in particular the effects of a density-dependent quark mass on the equation of state of strange quark matter. The dynamical masses of quarks are computed within the Nambu-Jona-Lasinio model; we then perform strange quark matter calculations employing the MIT bag model with these dynamical masses. For the sake of comparing the dynamical mass interaction with the QCD quark-quark interaction, we consider the one-gluon-exchange term as the effective interaction between quarks for the MIT bag model. Our dynamical approach illustrates an improvement in the obtained equation of state values. We also investigate the structure of the strange quark star using the Tolman-Oppenheimer-Volkoff equations for all applied models. Our results show that the dynamical mass interaction leads to lower values for the gravitational mass.
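    The structure calculation mentioned above amounts to integrating the Tolman-Oppenheimer-Volkoff (TOV) equations with a quark-matter equation of state. The sketch below uses a linear MIT-bag-like EOS, eps = 3p + 4B, in geometrized units (G = c = 1); the bag constant, central pressure and step size are arbitrary illustrative values, not those of the paper.

```python
import math

B = 1.0e-4                           # toy bag constant (illustrative units)

def eps(p):
    """Linear MIT-bag-like equation of state: energy density vs pressure."""
    return 3.0 * p + 4.0 * B

def tov(p_c, dr=1.0e-3):
    """Euler integration of the TOV equations from the centre outward
    until the pressure drops to (numerically) zero."""
    r, m, p = dr, 0.0, p_c           # start slightly off-centre
    while p > 1.0e-10:
        e = eps(p)
        dm = 4.0 * math.pi * r**2 * e
        dp = -(e + p) * (m + 4.0 * math.pi * r**3 * p) / (r * (r - 2.0 * m))
        m += dm * dr
        p += dp * dr
        r += dr
    return r, m                      # stellar radius and gravitational mass

r_star, m_star = tov(5.0 * B)
print(r_star > 0 and m_star > 0)     # True: a finite-mass configuration
```

Repeating the integration over a range of central pressures traces out the mass-radius relation, which is how the lowered gravitational masses quoted in the abstract would be exhibited.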

  20. Benchmarking and Performance Measurement.

    Science.gov (United States)

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  1. Piping benchmark problems for the ABB/CE System 80+ Standardized Plant

    International Nuclear Information System (INIS)

    Bezler, P.; DeGrassi, G.; Braverman, J.; Wang, Y.K.

    1994-07-01

    To satisfy the need for verification of the computer programs and modeling techniques that will be used to perform the final piping analyses for the ABB/Combustion Engineering System 80+ Standardized Plant, three benchmark problems were developed. The problems are representative piping systems subjected to representative dynamic loads with solutions developed using the methods being proposed for analysis for the System 80+ standard design. It will be required that the combined license licensees demonstrate that their solutions to these problems are in agreement with the benchmark problem set. The first System 80+ piping benchmark is a uniform support motion response spectrum solution for one section of the feedwater piping subjected to safe shutdown seismic loads. The second System 80+ piping benchmark is a time history solution for the feedwater piping subjected to the transient loading induced by a water hammer. The third System 80+ piping benchmark is a time history solution of the pressurizer surge line subjected to the accelerations induced by a main steam line pipe break. The System 80+ reactor is an advanced PWR type.
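    The first benchmark above is a uniform support motion response spectrum analysis. One standard ingredient of such a solution is the combination of peak modal responses, for example by the square root of the sum of squares (SRSS) when modal frequencies are well separated. The sketch below illustrates only this combination step, with invented numbers; it is not the benchmark solution itself.

```python
import math

# Peak response of some quantity (e.g. a support reaction) per mode,
# each read off a design response spectrum. Values are invented.
modal_peaks = [12.0, 5.0, 3.0, 1.0]

# SRSS combination of the per-mode peaks into a design response
srss = math.sqrt(sum(r * r for r in modal_peaks))
print(round(srss, 3))   # 13.379
```

For closely spaced modes other combination rules (e.g. complete quadratic combination) replace SRSS, but the bookkeeping is the same.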

  2. Benchmarking in the Netherlands

    International Nuclear Information System (INIS)

    1999-01-01

    In two articles an overview is given of the benchmarking activities in the Dutch industry and energy sector. In benchmarking, the operational processes of competing businesses are compared in order to improve one's own performance. Benchmark covenants on energy efficiency between the Dutch government and industrial sectors are contributing to a growing number of benchmark surveys in the energy-intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies.

  3. LHC benchmark scenarios for the real Higgs singlet extension of the standard model

    International Nuclear Information System (INIS)

    Robens, Tania; Stefaniak, Tim

    2016-01-01

    We present benchmark scenarios for searches for an additional Higgs state in the real Higgs singlet extension of the Standard Model in Run 2 of the LHC. The scenarios are selected such that they fulfill all relevant current theoretical and experimental constraints, but can potentially be discovered at the current LHC run. We take into account the results presented in earlier work and update the experimental constraints from relevant LHC Higgs searches and signal rate measurements. The benchmark scenarios are given separately for the low-mass and high-mass region, i.e. the mass range where the additional Higgs state is lighter or heavier than the discovered Higgs state at around 125 GeV. They have also been presented in the framework of the LHC Higgs Cross Section Working Group. (orig.)

  4. Light baryon masses with dynamical twisted mass fermions

    Energy Technology Data Exchange (ETDEWEB)

    Alexandrou, C. [Cyprus Univ., Nicosia (Cyprus). Dept. of Physics; Baron, R. [CEA-Saclay, IRFU/Service de Physique Nucleaire, Gif-sur-Yvette (France); Blossier, B. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (DE). John von Neumann-Inst. fuer Computing NIC] (and others)

    2008-03-15

    We present results on the mass of the nucleon and the Δ using two dynamical degenerate twisted mass quarks. The evaluation is performed at four quark masses corresponding to a pion mass in the range of about 300-600 MeV on lattices of 2.1-2.7 fm. We check for cut-off effects by evaluating these baryon masses on lattices of spatial size 2.1 fm at β=3.9 and β=4.05 and on a lattice of 2.4 fm at β=3.8. The values we find are compatible within our statistical errors. Lattice results are extrapolated to the physical limit using continuum chiral perturbation theory. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find a nucleon mass of 964±28(stat.)±8(syst.) MeV where we used the lattice spacings determined from the pion decay constant to convert to physical units. The systematic error due to the chiral extrapolation is estimated by comparing results obtained at O(p 3 ) and O(p 4 ) heavy baryon chiral perturbation theory. The nucleon mass at the physical point provides an independent determination of the lattice spacing. Using heavy baryon chiral perturbation theory at O(p 3 ) we find a β=3.9 =0.0890±0.0039(stat.)±0.0014(syst.) fm, and a β=4.05 =0.0691±0.0034(stat.)±0.0010(syst.) fm, in good agreement with the values determined from the pion decay constant. Using results from our two smaller lattice spacings at constant r0m we estimate the continuum limit and check consistency with results from the coarser lattice. Results at the continuum limit are chirally extrapolated to the physical point. Isospin violating lattice artifacts in the Δ-system are found to be compatible with zero for the values of the lattice spacings used in this work. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find for the masses of the Δ ++,- and Δ +,0 1316±60(stat.) MeV and 1330±74(stat.) MeV respectively. We confirm that in the continuum limit they are also degenerate.

  5. Light baryon masses with dynamical twisted mass fermions

    International Nuclear Information System (INIS)

    Alexandrou, C.; Blossier, B.

    2008-03-01

    We present results on the mass of the nucleon and the Δ using two dynamical degenerate twisted mass quarks. The evaluation is performed at four quark masses corresponding to a pion mass in the range of about 300-600 MeV on lattices of 2.1-2.7 fm. We check for cut-off effects by evaluating these baryon masses on lattices of spatial size 2.1 fm at β=3.9 and β=4.05 and on a lattice of 2.4 fm at β=3.8. The values we find are compatible within our statistical errors. Lattice results are extrapolated to the physical limit using continuum chiral perturbation theory. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find a nucleon mass of 964±28(stat.)±8(syst.) MeV where we used the lattice spacings determined from the pion decay constant to convert to physical units. The systematic error due to the chiral extrapolation is estimated by comparing results obtained at O(p 3 ) and O(p 4 ) heavy baryon chiral perturbation theory. The nucleon mass at the physical point provides an independent determination of the lattice spacing. Using heavy baryon chiral perturbation theory at O(p 3 ) we find a β=3.9 =0.0890±0.0039(stat.)±0.0014(syst.) fm, and a β=4.05 =0.0691±0.0034(stat.)±0.0010(syst.) fm, in good agreement with the values determined from the pion decay constant. Using results from our two smaller lattice spacings at constant r0m we estimate the continuum limit and check consistency with results from the coarser lattice. Results at the continuum limit are chirally extrapolated to the physical point. Isospin violating lattice artifacts in the Δ-system are found to be compatible with zero for the values of the lattice spacings used in this work. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find for the masses of the Δ ++,- and Δ +,0 1316±60(stat.) MeV and 1330±74(stat.) MeV respectively. We confirm that in the continuum limit they are also degenerate. (orig.)

  6. Mass Transport Properties of LiD-U Mixtures from Orbital Free Molecular Dynamics Simulations and a Pressure-Matching Mixing Rule

    International Nuclear Information System (INIS)

    Burakovsky, Leonid; Kress, Joel D.; Collins, Lee A.

    2012-01-01

    Mass transport properties for LiD-U mixtures were calculated using a pressure-matching mixing rule applied to LiD and U properties simulated with Orbital Free Molecular Dynamics (OFMD). The mixing rule was checked against benchmark OFMD simulations for the fully interacting three-component (Li, D, U) system. Obtaining transport coefficients for LiD-U mixtures of different (LiD) x U (1-x) compositions as functions of temperature and mixture density is a tedious task. Quantum molecular dynamics (MD) simulations can be employed, as in the case of LiD or U. However, due to the presence of the heavy constituent U, such simulations proceed so slowly that only a limited number of numerical data points in the (x, ρ, T) phase space can be obtained. To finesse this difficulty, transport coefficients for a mixture can be obtained using the pressure-matching mixing rule discussed here. For both LiD and U, the corresponding transport coefficients were obtained earlier from quantum molecular dynamics simulations. In these simulations, the quantum behavior of the electrons was represented using an orbital-free (OF) version of density functional theory, and the ions were advanced in time using classical molecular dynamics. The total pressure of the system, P = nk B T/V + P e , is the sum of the ideal-gas pressure of the ions and the electron pressure. The mass self-diffusion coefficient for species α, D α , the mutual diffusion coefficient for species α and β, D αβ , and the shear viscosity, η, are computed from the appropriate autocorrelation functions. The details of similar QMD calculations on LiH are described in Ref. [1] for 0.5 eV < T < 3 eV, and in Ref. [2] for 2 eV < T < 6 eV.
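    The pressure-matching idea can be sketched with a toy equation of state: split the mixture volume between the two components so that each, at its resulting density, sits at a common pressure; transport properties are then combined over the matched states. The EOS form and coefficients below are invented for illustration and carry no physical meaning.

```python
def pressure(rho, T, a):
    """Toy equation of state P(rho, T) with an invented coefficient a."""
    return rho * T * (1.0 + a * rho)

def match(x1, rho_mix, T, a1=0.2, a2=0.5):
    """Find the volume fraction f of component 1 (mass fraction x1) at
    which both components, compressed into their share of the mixture
    volume, sit at the same pressure. Bisection on f."""
    lo, hi = 1e-9, 1.0 - 1e-9
    for _ in range(200):
        f = 0.5 * (lo + hi)
        p1 = pressure(x1 * rho_mix / f, T, a1)         # component 1 density
        p2 = pressure((1 - x1) * rho_mix / (1 - f), T, a2)
        if p1 > p2:
            lo = f            # give component 1 more volume to lower p1
        else:
            hi = f
    return f, p1, p2

f, p1, p2 = match(x1=0.4, rho_mix=2.0, T=1.5)
print(abs(p1 - p2) < 1e-6)    # True: pressures are matched
```

Given the matched component states, the pure-species diffusion coefficients and viscosities (tabulated from the earlier OFMD runs) would be combined over the resulting fractions to estimate the mixture values.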

  7. Start-up of a cold loop in a VVER-440, the 7th AER benchmark calculation with HEXTRAN-SMABRE-PORFLO

    International Nuclear Information System (INIS)

    Hovi, Ville; Taivassalo, Veikko; Haemaelaeinen, Anitta; Raety, Hanna; Syrjaelahti, Elina

    2017-01-01

    The 7th dynamic AER benchmark is the first in which three-dimensional thermal hydraulics codes are supposed to be applied. The aim is to get a more precise core inlet temperature profile than the sector temperatures typically available with system codes. The benchmark consists of a start-up of the sixth, isolated loop in a VVER-440 plant. The isolated loop initially contains cold water without boric acid and the start-up leads to a somewhat asymmetrical core power increase due to feedbacks in the core. In this study, the 7th AER benchmark is calculated with the three-dimensional nodal reactor dynamics code HEXTRAN-SMABRE coupled with the porous computational fluid dynamics code PORFLO. These three codes are developed at VTT. A novel two-way coupled simulation of the 7th AER benchmark was performed successfully, demonstrating the feasibility and advantages of the new reactor analysis framework. The modelling issues for this benchmark are reported and some evaluation against the previously reported comparisons between the system codes is provided.

  8. Study of mass consciousness and its dynamics in sociological research

    Directory of Open Access Journals (Sweden)

    S. V. Khobta

    2017-01-01

    Full Text Available The article analyzes the approaches used to study mass consciousness and the methods for researching its dynamics. Two approaches are reviewed: the aggregative and the group approach. The author shows that in modern science the dynamics of mass consciousness is studied through computer modelling with agent-oriented models and through mass surveys of public opinion. Special attention is given to B. Grushin's concept of mass consciousness, in which mass consciousness is analyzed as a phenomenon that is complex in both its structure and its formation process. The article then examines what this concept can offer for the study of mass consciousness in times of crisis, such as the situation of the military conflict in the East. It is argued that mass consciousness studies should rest on a dialectical approach, attentive to the dynamic interaction between the individual and the collective, and between the spontaneous and the institutionalized, within collective consciousness. Mass consciousness is a complex structural formation, heterogeneous and syncretic. Within this structure one must distinguish layers that differ in depth and mobility and pay attention to its various states. The layers represent different worldviews in which, depending on the situation, scientific, religious or mystical images of the world can be actualized along with their ideological, moral and aesthetic precepts. These can intersect, merge or coexist without contradicting each other and be actualized to different degrees. Mass consciousness also carries various deep, hard-to-change formations, such as "historical/collective memory" and memlexes, as well as superficial, highly actualized forms, such as memes. It has a "public/formal", socially accepted level and a "private/real" level that manifest themselves, in particular, in the forms of

  9. Internal Mass Motion for Spacecraft Dynamics and Control

    National Research Council Canada - National Science Library

    Hall, Christopher D

    2008-01-01

    We present a detailed description of the application of a noncanonical Hamiltonian formulation to the modeling, analysis, and simulation of the dynamics of gyrostat spacecraft with internal mass motion...

  10. WLUP benchmarks

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2002-01-01

    The IAEA-WIMS Library Update Project (WLUP) is in its final stage; the final library will be released in 2002. It is the result of research and development by more than ten investigators over 10 years. The organization of the benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the objectives and types of the benchmarks are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for the analysis and plotting of results is described, with some examples. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)
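    Benchmark comparisons of this kind typically reduce to tabulating the deviation of each library's k eff from a reference value, often quoted in pcm (1 pcm = 1e-5 in k). A trivial sketch, with library names and values invented for illustration:

```python
# Hypothetical k_eff results from three candidate libraries for one
# benchmark case, compared against a reference (e.g. experiment = 1.0).
reference = 1.00000
keff = {"lib_A": 1.00240, "lib_B": 0.99855, "lib_C": 1.00032}

deviation_pcm = {name: round((k - reference) * 1e5)
                 for name, k in keff.items()}
print(deviation_pcm)   # {'lib_A': 240, 'lib_B': -145, 'lib_C': 32}
```

Collecting such tables over all benchmark cases is what allows the "best set of data" to be chosen objectively.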

  11. Benchmarking in pathology: development of a benchmarking complexity unit and associated key performance indicators.

    Science.gov (United States)

    Neil, Amanda; Pfeffer, Sally; Burnett, Leslie

    2013-01-01

    This paper details the development of a new type of pathology laboratory productivity unit, the benchmarking complexity unit (BCU). The BCU provides a comparative index of laboratory efficiency, regardless of test mix. It also enables estimation of a measure of how much complex pathology a laboratory performs, and the identification of peer organisations for the purposes of comparison and benchmarking. The BCU is based on the theory that wage rates reflect productivity at the margin. A weighting factor for the ratio of medical to technical staff time was dynamically calculated based on actual participant site data. Given this weighting, a complexity value for each test, at each site, was calculated. The median complexity value (number of BCUs) for that test across all participating sites was taken as its complexity value for the Benchmarking in Pathology Program. The BCU allowed implementation of an unbiased comparison unit and test listing that was found to be a robust indicator of the relative complexity for each test. Employing the BCU data, a number of Key Performance Indicators (KPIs) were developed, including three that address comparative organisational complexity, analytical depth and performance efficiency, respectively. Peer groups were also established using the BCU combined with simple organisational and environmental metrics. The BCU has enabled productivity statistics to be compared between organisations. The BCU corrects for differences in test mix and workload complexity of different organisations and also allows for objective stratification into peer groups.
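    The BCU construction described above can be sketched in a few lines: medical staff minutes are weighted by the medical-to-technical wage-rate ratio, added to technical minutes to give a per-site complexity value for a test, and the median across sites becomes the test's benchmark complexity. All numbers below are invented; in the actual program the weighting factor was derived dynamically from participant site data.

```python
from statistics import median

wage_ratio = 4.0   # hypothetical medical/technical wage-rate ratio

# per-site (technical minutes, medical minutes) for one pathology test
sites = {"site1": (10.0, 1.0), "site2": (12.0, 0.5), "site3": (9.0, 2.0)}

# per-site complexity: technical time plus wage-weighted medical time
complexity = {s: tech + wage_ratio * med for s, (tech, med) in sites.items()}

bcu = median(complexity.values())   # the test's benchmark complexity value
print(bcu)   # 14.0
```

Summing BCUs over a laboratory's test mix then yields a workload measure that is comparable across organisations regardless of test mix, which is what the KPIs and peer groups are built on.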

  12. Benchmarking electricity distribution

    Energy Technology Data Exchange (ETDEWEB)

    Watts, K. [Department of Justice and Attorney-General, QLD (Australia)

    1995-12-31

    Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.

  13. Nutrient cycle benchmarks for earth system land model

    Science.gov (United States)

    Zhu, Q.; Riley, W. J.; Tang, J.; Zhao, L.

    2017-12-01

    Projecting future biosphere-climate feedbacks using Earth system models (ESMs) relies heavily on robust modeling of land surface carbon dynamics. More importantly, soil nutrient (particularly, nitrogen (N) and phosphorus (P)) dynamics strongly modulate carbon dynamics, such as plant sequestration of atmospheric CO2. Prevailing ESM land models all consider nitrogen as a potentially limiting nutrient, and several consider phosphorus. However, including nutrient cycle processes in ESM land models potentially introduces large uncertainties that could be identified and addressed by improved observational constraints. We describe the development of two nutrient cycle benchmarks for ESM land models: (1) nutrient partitioning between plants and soil microbes inferred from 15N and 33P tracers studies and (2) nutrient limitation effects on carbon cycle informed by long-term fertilization experiments. We used these benchmarks to evaluate critical hypotheses regarding nutrient cycling and their representation in ESMs. We found that a mechanistic representation of plant-microbe nutrient competition based on relevant functional traits best reproduced observed plant-microbe nutrient partitioning. We also found that for multiple-nutrient models (i.e., N and P), application of Liebig's law of the minimum is often inaccurate. Rather, the Multiple Nutrient Limitation (MNL) concept better reproduces observed carbon-nutrient interactions.
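    The contrast drawn above between Liebig's law of the minimum and multiple nutrient limitation can be made concrete with two toy downregulation rules applied to dimensionless N and P availability factors; the functional forms are illustrative, not the ESM implementations.

```python
def liebig(f_n, f_p):
    """Liebig's law of the minimum: growth is set entirely by the
    scarcest nutrient; the other has no marginal effect."""
    return min(f_n, f_p)

def mnl(f_n, f_p):
    """A simple multiple-nutrient-limitation (MNL) rule: both nutrients
    limit simultaneously, so either factor affects the outcome."""
    return f_n * f_p

f_n, f_p = 0.8, 0.5
print(liebig(f_n, f_p))  # 0.5 -> N availability is invisible here
print(mnl(f_n, f_p))     # 0.4 -> co-limitation: both factors matter
```

Under Liebig's rule, adding N to this P-limited system changes nothing, whereas under MNL it raises uptake, which is the behavioural difference the fertilization-experiment benchmark can discriminate.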

  14. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  15. Technicolor and the asymptotic behavior of dynamically generated masses

    International Nuclear Information System (INIS)

    Natale, A.A.

    1984-01-01

    Arguments are given in favor of a hard asymptotic behavior of dynamically generated masses, its consequences for technicolor models are analyzed, and a model is proposed where effects of flavor-changing neutral currents are highly suppressed and pseudo-Goldstone bosons get masses of O(30-90) GeV. (Author) [pt

  16. RUNE benchmarks

    DEFF Research Database (Denmark)

    Peña, Alfredo

    This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...

  17. MCNP neutron benchmarks

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.

    1991-01-01

    Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems

  18. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are statistical, comparing against a sample of existing plants. This paper presents a model-based approach for benchmarking energy-intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, heat loss equations for the different zones, and empirical equations based on operating practices. The model is checked with field data from end-fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design preferences on specific energy consumption. A case study for a 100 TPD end-fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by the glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%.
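    The kind of balance the model performs can be sketched at its crudest: the heat that must be delivered to the glass (sensible heat plus batch reaction heat) divided by the fraction of fuel heat reaching the glass gives the specific energy consumption. The property values below are rough assumptions chosen to be consistent with the figures quoted in the abstract, not the model's actual inputs.

```python
# Rough, assumed property values (not from the paper's model):
cp_glass = 1.2          # kJ/(kg K), effective mean heat capacity of glass
dT = 1400.0             # K, temperature rise from batch to molten glass
h_reaction = 350.0      # kJ/kg, batch reaction / fusion heat

heat_to_glass = cp_glass * dT + h_reaction   # kJ carried per kg of glass
efficiency = 0.53                            # fraction of fuel heat to glass

sec = heat_to_glass / efficiency             # specific energy consumption
print(round(heat_to_glass), round(sec))      # 2030 3830
```

With these assumed inputs the sketch reproduces the abstract's figures: ~2030 kJ/kg carried by the glass is 53% of a ~3830 kJ/kg specific energy consumption. The full model refines each term with zone-wise heat-loss equations.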

  19. Piping benchmark problems for the General Electric Advanced Boiling Water Reactor

    International Nuclear Information System (INIS)

    Bezler, P.; DeGrassi, G.; Braverman, J.; Wang, Y.K.

    1993-08-01

    To satisfy the need for verification of the computer programs and modeling techniques that will be used to perform the final piping analyses for an advanced boiling water reactor standard design, three benchmark problems were developed. The problems are representative piping systems subjected to representative dynamic loads, with solutions developed using the methods proposed for analysis of the advanced reactor standard design. Combined license holders will be required to demonstrate that their solutions to these problems agree with the benchmark problem set.

  20. Start-up of a cold loop in a VVER-440, the 7th AER benchmark calculation with HEXTRAN-SMABRE-PORFLO

    Energy Technology Data Exchange (ETDEWEB)

    Hovi, Ville; Taivassalo, Veikko; Haemaelaeinen, Anitta; Raety, Hanna; Syrjaelahti, Elina [VTT Technical Research Centre of Finland Ltd, VTT (Finland)

    2017-09-15

    The 7th dynamic AER benchmark is the first in which three-dimensional thermal hydraulics codes are to be applied. The aim is to obtain a more precise core inlet temperature profile than the sector temperatures typically available from system codes. The benchmark consists of a start-up of the sixth, isolated loop in a VVER-440 plant. The isolated loop initially contains cold water without boric acid, and the start-up leads to a somewhat asymmetrical core power increase due to feedbacks in the core. In this study, the 7th AER benchmark is calculated with the three-dimensional nodal reactor dynamics code HEXTRAN-SMABRE coupled with the porous computational fluid dynamics code PORFLO, all three developed at VTT. A novel two-way coupled simulation of the benchmark was performed successfully, demonstrating the feasibility and advantages of the new reactor analysis framework. The modelling issues for this benchmark are reported, and some evaluation against the previously reported comparisons between the system codes is provided.

  1. Dynamic analysis of CO₂ labeling and cell respiration using membrane-inlet mass spectrometry.

    Science.gov (United States)

    Yang, Tae Hoon

    2014-01-01

    Here, we introduce a mass spectrometry-based analytical method and relevant technical details for dynamic cell respiration and CO₂ labeling analysis. Such measurements can be utilized as additional information and constraints for model-based ¹³C metabolic flux analysis. Dissolved dynamics of oxygen consumption and CO₂ mass isotopomer evolution from ¹³C-labeled tracer substrates through different cellular processes can be precisely measured on-line using a miniaturized reactor system equipped with a membrane-inlet mass spectrometer. The corresponding specific rates of physiologically relevant gases and CO₂ mass isotopomers can be quantified within a short-term range based on the liquid-phase dynamics of dissolved fermentation gases.

  2. Benchmark ultra-cool dwarfs in widely separated binary systems

    Directory of Open Access Journals (Sweden)

    Jones H.R.A.

    2011-07-01

    Ultra-cool dwarfs as wide companions to subgiants, giants, white dwarfs and main sequence stars can be very good benchmark objects, for which we can infer physical properties with minimal reference to theoretical models, through association with the primary stars. We have searched for benchmark ultra-cool dwarfs in widely separated binary systems using SDSS, UKIDSS, and 2MASS. We then estimate spectral types using SDSS spectroscopy and multi-band colors, place constraints on distance, and perform proper motion calculations for all candidates with sufficient epoch baseline coverage. Analysis of the proper motion and distance constraints shows that eight of our ultra-cool dwarfs are members of widely separated binary systems. In addition, the L3.5 dwarf SDSS 0832 is shown to be a companion to the bright K3 giant η Cancri. Such primaries can provide age and metallicity constraints for any companion objects, yielding excellent benchmark objects. This is the first wide ultra-cool dwarf + giant binary system identified.

  3. An automated protocol for performance benchmarking a widefield fluorescence microscope.

    Science.gov (United States)

    Halter, Michael; Bier, Elianna; DeRose, Paul C; Cooksey, Gregory A; Choquette, Steven J; Plant, Anne L; Elliott, John T

    2014-11-01

    Widefield fluorescence microscopy is a widely used tool for visually assessing biological samples and for quantifying cell responses. Despite its widespread use in high content analysis and other imaging applications, few published methods exist for evaluating and benchmarking the analytical performance of a microscope. Easy-to-use benchmarking methods would facilitate the use of fluorescence imaging as a quantitative analytical tool in research applications, and would aid instrumental method validation for commercial product development applications. We describe and evaluate an automated method to characterize a fluorescence imaging system's performance by benchmarking the detection threshold, saturation, and linear dynamic range against a reference material. The benchmarking procedure is demonstrated using two different reference materials, uranyl-ion-doped glass and Schott 475 GG filter glass. Both are suitable candidate reference materials that are homogeneously fluorescent and highly photostable, and the Schott 475 GG filter glass is currently commercially available. In addition to benchmarking the analytical performance, we also demonstrate that the reference materials provide accurate day-to-day intensity calibration. Published 2014 Wiley Periodicals Inc. This article is a US government work and, as such, is in the public domain in the United States of America.
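
    The three benchmarked quantities can be illustrated with a toy calculation. A hedged sketch, with made-up data and assumed conventions (threshold at blank mean + 3 sigma, dynamic range as the ratio of the highest to lowest usable intensity), not the protocol from the paper:

```python
# Illustrative extraction of detection threshold, saturation point, and
# linear dynamic range from intensity readings of a stable reference
# material at increasing exposure times. All numbers are invented.

def benchmark(exposures, intensities, blank_mean, blank_sd, full_scale):
    threshold = blank_mean + 3.0 * blank_sd           # assumed detection rule
    saturated = [x for x, y in zip(exposures, intensities) if y >= full_scale]
    usable = [y for y in intensities if threshold < y < full_scale]
    return threshold, (saturated[0] if saturated else None), usable[-1] / usable[0]

exposures   = [1, 2, 4, 8, 16, 32, 64]
intensities = [12, 22, 42, 82, 162, 255, 255]         # 8-bit camera clips at 255
t, sat_exposure, dyn_range = benchmark(exposures, intensities,
                                       blank_mean=2.0, blank_sd=1.0,
                                       full_scale=255)
print(t, sat_exposure, round(dyn_range, 1))  # 5.0 32 13.5
```

    Here the response saturates at the 32-unit exposure, and the usable intensity span covers a factor of about 13.5.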

  4. Light baryon masses with dynamical twisted mass fermions

    International Nuclear Information System (INIS)

    Alexandrou, C.; Korzec, T.; Koutsou, G.; Baron, R.; Guichon, P.; Blossier, B.; Herdoiza, G.; Jansen, K.; Brinet, M.; Carbonell, J.; Drach, V.; Dimopoulos, P.; Frezzotti, R.; Farchioni, F.; Liu, Z.; Pene, O.; Michael, C.; Shindler, A.; Urbach, C.; Wenger, U.

    2008-01-01

    We present results on the mass of the nucleon and the Δ using two dynamical degenerate twisted mass quarks and the tree-level Symanzik improved gauge action. The evaluation is performed at four quark masses corresponding to a pion mass in the range of about 300-600 MeV on lattices of 2.1-2.7 fm at three lattice spacings less than 0.1 fm. We check for cutoff effects by evaluating these baryon masses on lattices of spatial size 2.1 fm at β=3.9 and β=4.05 and on a lattice of 2.4 fm at β=3.8. The values we find are compatible within our statistical errors. Lattice results are extrapolated to the physical limit using continuum chiral perturbation theory. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find a nucleon mass of 963±12(stat)±8(syst) MeV, where we used the lattice spacings determined from the pion decay constant to convert to physical units. The systematic error due to the chiral extrapolation is estimated by comparing results obtained at O(p³) and O(p⁴) heavy baryon chiral perturbation theory. The nucleon mass at the physical point provides an independent determination of the lattice spacing. Using heavy baryon chiral perturbation theory at O(p³) we find a(β=3.9) = 0.0889±0.0012(stat)±0.0014(syst) fm and a(β=4.05) = 0.0691±0.0010(stat)±0.0010(syst) fm, in good agreement with the values determined from the pion decay constant. Using results from our two smaller lattice spacings at constant r₀m_π we estimate the continuum limit and check consistency with results from the coarser lattice. Results at the continuum limit are chirally extrapolated to the physical point. Isospin violating lattice artifacts in the Δ system are found to be compatible with zero for the values of the lattice spacings used in this work. Performing a combined fit to our lattice data at β=3.9 and β=4.05 we find for the masses of the Δ++,- and Δ+,0 1315±24(stat) MeV and 1329±30(stat) MeV, respectively. We confirm that in the continuum limit
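
    For reference, the standard O(p³) heavy-baryon ChPT expression used for such chiral extrapolations of the nucleon mass is, schematically (conventional symbols, not copied from the paper):

```latex
% O(p^3) HBChPT nucleon mass as a function of the pion mass:
m_N(m_\pi) \;=\; m_N^{(0)} \;-\; 4\,c_1\,m_\pi^2 \;-\; \frac{3\,g_A^2}{32\pi f_\pi^2}\,m_\pi^3
```

    where m_N^(0) is the chiral-limit nucleon mass, c₁ a low-energy constant, g_A the axial coupling and f_π the pion decay constant; fitting the lattice data to this form and evaluating at the physical m_π is what fixes the lattice spacing independently.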

  5. Dynamical gluon mass in the instanton vacuum model

    Science.gov (United States)

    Musakhanov, M.; Egamberdiev, O.

    2018-04-01

    We consider the modifications of gluon properties in the instanton liquid model (ILM) for the QCD vacuum. Rescattering of gluons on instantons generates a dynamical momentum-dependent gluon mass M_g(q). First, we consider the case of a scalar gluon, for which no zero-mode problem occurs and whose dynamical mass M_s(q) can be found. Using the typical phenomenological values of the average instanton size ρ = 1/3 fm and average inter-instanton distance R = 1 fm, we get M_s(0) = 256 MeV. We then extend this approach to the real vector gluon with zero modes carefully considered and obtain M_g²(q) = 2 M_s²(q). This modification of the gluon in the instanton medium will shed light on nonperturbative aspects of heavy quarkonium physics.

  6. Electron induced break-up of helium. Benchmark experiments on a dynamical four-body Coulomb system

    International Nuclear Information System (INIS)

    Duerr, M.

    2006-01-01

    This work presents an experimental study of the fragmentation of helium by electron impact, in which absolute fully differential cross sections for single ionization, ionization-excitation and double ionization were determined. By applying a charged-particle imaging technique, the so-called 'reaction microscope', a large fraction of the final-state momentum space is covered, and the major limitations of previous experimental methods applied in this field could be overcome. Decisive modifications of the previous reaction microscope were undertaken, the most important being the arrangement of the projectile beam parallel to the imaging fields. For single ionization of helium, enhanced electron emission outside the projectile scattering plane is observed at both impact energies considered (102 eV and 1 keV), similar to the result found for ion impact (M. Schulz et al., Nature (London) 422, 48 (2003)). The angle-resolved cross sections obtained for double ionization at 105 eV impact energy reveal that the process is dominated by the mutual repulsion of the three final-state continuum electrons. However, signatures of more complex dynamics are also observed. The data provide an ultimate benchmark for recently developed theories treating the dynamical three- and four-body Coulomb problem. (orig.)

  7. Electron induced break-up of helium. Benchmark experiments on a dynamical four-body Coulomb system

    Energy Technology Data Exchange (ETDEWEB)

    Duerr, M.

    2006-07-05

    This work presents an experimental study of the fragmentation of helium by electron impact, in which absolute fully differential cross sections for single ionization, ionization-excitation and double ionization were determined. By applying a charged-particle imaging technique, the so-called 'reaction microscope', a large fraction of the final-state momentum space is covered, and the major limitations of previous experimental methods applied in this field could be overcome. Decisive modifications of the previous reaction microscope were undertaken, the most important being the arrangement of the projectile beam parallel to the imaging fields. For single ionization of helium, enhanced electron emission outside the projectile scattering plane is observed at both impact energies considered (102 eV and 1 keV), similar to the result found for ion impact (M. Schulz et al., Nature (London) 422, 48 (2003)). The angle-resolved cross sections obtained for double ionization at 105 eV impact energy reveal that the process is dominated by the mutual repulsion of the three final-state continuum electrons. However, signatures of more complex dynamics are also observed. The data provide an ultimate benchmark for recently developed theories treating the dynamical three- and four-body Coulomb problem. (orig.)

  8. Numerical benchmarking of SPEEDUP trademark against point kinetics solutions

    International Nuclear Information System (INIS)

    Gregory, M.V.

    1993-02-01

    SPEEDUP™ is a state-of-the-art dynamic chemical process modeling package offered by Aspen Technology. In anticipation of customers' needs for new analytical tools to support the site's waste management activities, SRTC has secured a multiple-user license to SPEEDUP. In order to verify both the installation and the mathematical correctness of the algorithms in SPEEDUP, we have performed several numerical benchmarking calculations. These calculations are the first steps in establishing an on-site quality assurance pedigree for SPEEDUP. The benchmark calculations consisted of SPEEDUP Version 5.3L representations of five neutron kinetics benchmarks (each a mathematically stiff system of seven coupled ordinary differential equations), whose exact solutions are documented in the open literature. In all cases, the SPEEDUP solutions were found to be in excellent agreement with the reference solutions. A minor peculiarity in dealing with a non-existent discontinuity in the OPERATION section of the model made itself evident.
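
    The "seven coupled ODEs" are the classic point-kinetics equations: one neutron-density equation plus six delayed-precursor groups. A minimal sketch of such a system, with illustrative thermal-reactor-like parameters (not those of the documented benchmarks) and a simple fixed-step Euler integrator standing in for a proper stiff solver:

```python
# Point kinetics with six delayed-neutron groups: the stiff 7-ODE system
# used as the reference benchmarks above. Parameters are illustrative.

BETA = [0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273]
LAM = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]   # decay constants, 1/s
B = sum(BETA)                                       # total delayed fraction
GEN_TIME = 1.0e-3                                   # generation time, s (assumed)

def integrate(rho, t_end=1.0, dt=1.0e-4):
    """Integrate from equilibrium under a constant reactivity rho."""
    n = 1.0
    c = [BETA[i] / (GEN_TIME * LAM[i]) for i in range(6)]  # equilibrium precursors
    for _ in range(int(t_end / dt)):
        dn = ((rho - B) / GEN_TIME) * n + sum(LAM[i] * c[i] for i in range(6))
        dc = [(BETA[i] / GEN_TIME) * n - LAM[i] * c[i] for i in range(6)]
        n += dt * dn
        c = [c[i] + dt * dc[i] for i in range(6)]
    return n

print(integrate(0.0))     # critical: neutron density stays at 1.0
print(integrate(0.001))   # positive step: prompt jump, then slow rise
```

    At zero reactivity the equilibrium initial condition is an exact steady state, which gives an analytical check of the integration, much as the documented exact solutions do for the SPEEDUP benchmarks.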

  9. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added in order to obtain a unique selection.

  10. Variation of galactic cold gas reservoirs with stellar mass

    NARCIS (Netherlands)

    Maddox, Natasha; Hess, Kelley M.; Obreschkow, Danail; Blyth, S.-L.; Jarvis, Matt J.

    The stellar and neutral hydrogen (H I) mass functions at z ˜ 0 are fundamental benchmarks for current models of galaxy evolution. A natural extension of these benchmarks is the two-dimensional distribution of galaxies in the plane spanned by stellar and H I mass, which provides a more stringent test.

  11. Light hadrons from Nf=2+1+1 dynamical twisted mass fermions

    NARCIS (Netherlands)

    Baron, R.; Blossier, B.; Boucaud, P.; Carbonell, J.; Deuzeman, A.; Drach, V.; Farchioni, F.; Gimenez, V.; Herdoiza, G.; Jansen, K.; Michael, C.; Montvay, I.; Pallante, E.; Pène, O.; Reker, S.; Urbach, C.; Wagner, M.; Wenger, U.; Collaboration, for the ETM

    2011-01-01

    We present results of lattice QCD simulations with mass-degenerate up and down and mass-split strange and charm (Nf=2+1+1) dynamical quarks using Wilson twisted mass fermions at maximal twist. The tuning of the strange and charm quark masses is performed at three values of the lattice spacing a~0.06

  12. Benchmarking NWP Kernels on Multi- and Many-core Processors

    Science.gov (United States)

    Michalakes, J.; Vachharajani, M.

    2008-12-01

    Increased computing power for weather, climate, and atmospheric science has provided direct benefits for defense, agriculture, the economy, the environment, and public welfare and convenience. Today, very large clusters with many thousands of processors are allowing scientists to move forward with simulations of unprecedented size. But time-critical applications such as real-time forecasting or climate prediction need strong scaling: faster nodes and processors, not more of them. Moreover, the need for good cost-performance has never been greater, both in terms of performance per watt and per dollar. For these reasons, the new generations of multi- and many-core processors being mass produced for commercial IT and "graphical computing" (video games) are being scrutinized for their ability to exploit the abundant fine-grain parallelism in atmospheric models. We present results of our work to date identifying key computational kernels within the dynamics and physics of a large community NWP model, the Weather Research and Forecast (WRF) model. We benchmark and optimize these kernels on several different multi- and many-core processors. The goals are to (1) characterize and model performance of the kernels in terms of computational intensity, data parallelism, memory bandwidth pressure, memory footprint, etc. (2) enumerate and classify effective strategies for coding and optimizing for these new processors, (3) assess difficulties and opportunities for tool or higher-level language support, and (4) establish a continuing set of kernel benchmarks that can be used to measure and compare effectiveness of current and future designs of multi- and many-core processors for weather and climate applications.
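
    Goal (1), characterizing kernels by computational intensity and memory-bandwidth pressure, is often summarized with a roofline-type bound. A hypothetical sketch with made-up machine parameters, not measurements from the study:

```python
# Roofline-style estimate: a kernel's attainable performance is capped either
# by the processor's compute peak or by arithmetic intensity times memory
# bandwidth, whichever is smaller. Peak/bandwidth numbers are invented.

def attainable_gflops(flops, bytes_moved, peak_gflops, bandwidth_gbs):
    """min(compute roof, intensity * bandwidth roof)."""
    intensity = flops / bytes_moved        # flops per byte of memory traffic
    return min(peak_gflops, intensity * bandwidth_gbs)

# e.g. a stencil-like kernel doing 8 flops per 32 bytes of traffic
perf = attainable_gflops(flops=8.0, bytes_moved=32.0,
                         peak_gflops=100.0, bandwidth_gbs=40.0)
print(perf)  # bandwidth-bound: 0.25 flop/byte * 40 GB/s = 10 GFLOP/s
```

    Low-intensity NWP kernels land on the bandwidth roof, which is why memory bandwidth pressure, not raw flops, usually decides which multi- or many-core design wins.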

  13. Benchmarking school nursing practice: the North West Regional Benchmarking Group

    OpenAIRE

    Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn

    2016-01-01

    It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...

  14. Emergent Newtonian dynamics and the geometric origin of mass

    International Nuclear Information System (INIS)

    D’Alessio, Luca; Polkovnikov, Anatoli

    2014-01-01

    We consider a set of macroscopic (classical) degrees of freedom coupled to an arbitrary many-particle Hamiltonian system, quantum or classical. These degrees of freedom can represent positions of objects in space, their angles, shape distortions, magnetization, currents and so on. Expanding their dynamics near the adiabatic limit we find the emergent Newton’s second law (force is equal to the mass times acceleration) with an extra dissipative term. In systems with broken time reversal symmetry there is an additional Coriolis type force proportional to the Berry curvature. We give the microscopic definition of the mass tensor. The mass tensor is related to the non-equal time correlation functions in equilibrium and describes the dressing of the slow degree of freedom by virtual excitations in the system. In the classical (high-temperature) limit the mass tensor is given by the product of the inverse temperature and the Fubini–Study metric tensor determining the natural distance between the eigenstates of the Hamiltonian. For free particles this result reduces to the conventional definition of mass. This finding shows that any mass, at least in the classical limit, emerges from the distortions of the Hilbert space highlighting deep connections between any motion (not necessarily in space) and geometry. We illustrate our findings with four simple examples. -- Highlights: • Derive the macroscopic Newton’s equation from the microscopic many-particle Schrödinger’s equation. • Deep connection between geometry and dynamics. • Geometrical interpretation of the mass of a macroscopic object as deformation of Hilbert space. • Microscopic expression for mass and friction tensors.

  15. A benchmark study of 2D and 3D finite element calculations simulating dynamic pulse buckling tests of cylindrical shells under axial impact

    International Nuclear Information System (INIS)

    Hoffman, E.L.; Ammerman, D.J.

    1993-01-01

    A series of tests investigating dynamic pulse buckling of a cylindrical shell under axial impact is compared to several finite element simulations of the event. The purpose of the study is to compare the performance of the various analysis codes and element types with respect to a problem which is applicable to radioactive material transport packages, and ultimately to develop a benchmark problem to qualify finite element analysis codes for the transport package design industry.

  16. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional... in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence the technical efficiency.

  17. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  18. A Mass Loss Penetration Model to Investigate the Dynamic Response of a Projectile Penetrating Concrete considering Mass Abrasion

    Directory of Open Access Journals (Sweden)

    NianSong Zhang

    2015-01-01

    A study on the dynamic response of a projectile penetrating concrete is conducted. The evolution of projectile mass loss and the effect of mass loss on penetration resistance are investigated using theoretical methods. A projectile penetration model considering projectile mass loss is established in three stages, namely a cratering phase, a mass loss penetration phase, and a remaining rigid-projectile penetration phase.

  19. Results from the IAEA benchmark of spallation models

    International Nuclear Information System (INIS)

    Leray, S.; David, J.C.; Khandaker, M.; Mank, G.; Mengoni, A.; Otsuka, N.; Filges, D.; Gallmeier, F.; Konobeyev, A.; Michel, R.

    2011-01-01

    Spallation reactions play an important role in a wide domain of applications. In the simulation codes used in this field, the nuclear interaction cross-sections and characteristics are computed by spallation models. The International Atomic Energy Agency (IAEA) has recently organised a benchmark of the spallation models used or that could be used in the future into high-energy transport codes. The objectives were, first, to assess the prediction capabilities of the different spallation models for the different mass and energy regions and the different exit channels and, second, to understand the reason for the success or deficiency of the models. Results of the benchmark concerning both the analysis of the prediction capabilities of the models and the first conclusions on the physics of spallation models are presented. (authors)

  20. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  1. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  2. Benchmarking computational fluid dynamics models of lava flow simulation for hazard assessment, forecasting, and risk management

    Science.gov (United States)

    Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.

    2017-01-01

    Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.

  3. The physics benchmark processes for the detector performance studies used in CLIC CDR Volume 3

    CERN Document Server

    Allanach, B.J.; Desch, K.; Ellis, J.; Giudice, G.; Grefe, C.; Kraml, S.; Lastovicka, T.; Linssen, L.; Marschall, J.; Martin, S.P.; Muennich, A.; Poss, S.; Roloff, P.; Simon, F.; Strube, J.; Thomson, M.; Wells, J.D.

    2012-01-01

    This note describes the detector benchmark processes used in volume 3 of the CLIC conceptual design report (CDR), which explores a staged construction and operation of the CLIC accelerator. The goal of the detector benchmark studies is to assess the performance of the CLIC ILD and CLIC SiD detector concepts for different physics processes and at a few CLIC centre-of-mass energies.

  4. Effect of mass variation on dynamics of tethered system in orbital maneuvering

    Science.gov (United States)

    Sun, Liang; Zhao, Guowei; Huang, Hai

    2018-05-01

    In orbital maneuvering, the mass variation due to fuel consumption has an obvious impact on the dynamics of a tethered system and cannot be neglected. The contributions of this work are twofold: 1) improvement of the model; 2) analysis of the dynamic characteristics. Because the mass is variable and its derivative enters the traditional Lagrange equation directly, the expression for the generalized force is complicated. To solve this problem, the coagulated derivative is adopted in this paper; in addition, the attitude dynamics equations derived here account for the effect of mass variation and the drift of the orbital trajectory at the same time. The bifurcation phenomenon, the pendular motion angular frequency, and the amplitudes of tether vibration revealed in this paper can provide a reference for parameter and controller design in practical engineering. A dumbbell model is adopted to analyze the dynamics of the tethered system, in which the mass variation of the base satellite is fully considered. Considering practical applications, the case of orbital transfer under a transversal thrust is mainly studied. In addition, compared with the analytical solutions for the librational angles, the effects of mass variation on stability and librational characteristics are studied. Finally, to analyze the effect on vibrational characteristics, a lumped-mass model is introduced, which reveals a strong coupling of librational and vibrational characteristics.

  5. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views... are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested... interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained...

  6. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Benchmarking is considered one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan’s Mobarakeh Steel Company, the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project’s systematic implementation led to success.

  7. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  8. Personalized recommendation based on preferential bidirectional mass diffusion

    Science.gov (United States)

    Chen, Guilin; Gao, Tianrun; Zhu, Xuzhen; Tian, Hui; Yang, Zhao

    2017-03-01

    Recommendation system provides a promising way to alleviate the dilemma of information overload. In physical dynamics, mass diffusion has been used to design effective recommendation algorithms on bipartite network. However, most of the previous studies focus overwhelmingly on unidirectional mass diffusion from collected objects to uncollected objects, while overlooking the opposite direction, leading to the risk of similarity estimation deviation and performance degradation. In addition, they are biased towards recommending popular objects which will not necessarily promote the accuracy but make the recommendation lack diversity and novelty that indeed contribute to the vitality of the system. To overcome the aforementioned disadvantages, we propose a preferential bidirectional mass diffusion (PBMD) algorithm by penalizing the weight of popular objects in bidirectional diffusion. Experiments are evaluated on three benchmark datasets (Movielens, Netflix and Amazon) by 10-fold cross validation, and results indicate that PBMD remarkably outperforms the mainstream methods in accuracy, diversity and novelty.
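The two-step diffusion this abstract builds on can be sketched compactly. Below is a minimal sketch of plain mass diffusion on a user-object bipartite network, with a popularity-penalty exponent `beta` standing in for the preferential weighting the authors describe; the exact PBMD weighting scheme and the toy adjacency matrix are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Adjacency matrix of a toy bipartite network: rows = users,
# columns = objects, 1 = "user has collected this object".
A = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
], dtype=float)

def mass_diffusion_scores(A, user, beta=0.0):
    """Recommendation scores for one user via two-step mass diffusion.

    beta > 0 penalizes popular objects (an assumed stand-in for the
    preferential weighting described in the abstract).
    """
    k_obj = A.sum(axis=0)            # object degrees (popularity)
    k_usr = A.sum(axis=1)            # user degrees
    f0 = A[user].copy()              # unit resource on collected objects
    # Step 1: objects -> users (each object splits its resource by degree).
    u = A @ (f0 / k_obj)
    # Step 2: users -> objects, dividing by k_obj**beta to damp popular items.
    f1 = (A.T @ (u / k_usr)) / k_obj**beta
    f1[A[user] > 0] = 0.0            # never re-recommend collected objects
    return f1
```

For user 0 in the toy network the highest score lands on the object shared with the most similar user, while already-collected objects are zeroed out.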

  9. Galaxy dynamics and the mass density of the universe.

    Science.gov (United States)

    Rubin, V C

    1993-06-01

    Dynamical evidence accumulated over the past 20 years has convinced astronomers that luminous matter in a spiral galaxy constitutes no more than 10% of the mass of a galaxy. An additional 90% is inferred by its gravitational effect on luminous material. Here I review recent observations concerning the distribution of luminous and nonluminous matter in the Milky Way, in galaxies, and in galaxy clusters. Observations of neutral hydrogen disks, some extending in radius several times the optical disk, confirm that a massive dark halo is a major component of virtually every spiral. A recent surprise has been the discovery that stellar and gas motions in ellipticals are enormously complex. To date, only for a few spheroidal galaxies do the velocities extend far enough to probe the outer mass distribution. But the diverse kinematics of inner cores, peripheral to deducing the overall mass distribution, offer additional evidence that ellipticals have acquired gas-rich systems after initial formation. Dynamical results are consistent with a low-density universe, in which the required dark matter could be baryonic. On the smallest scales of galaxies [10 kiloparsec (kpc); H₀ = 50 km s⁻¹ Mpc⁻¹] the luminous matter constitutes only 1% of the closure density. On scales greater than binary galaxies (i.e., ≥100 kpc) all systems indicate a density approximately 10% of the closure density, a density consistent with the low baryon density in the universe. If large-scale motions in the universe require a higher mass density, these motions would constitute the first dynamical evidence for nonbaryonic matter in a universe of higher density.
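The closure-density fractions quoted in this abstract can be checked with a back-of-the-envelope calculation of the critical density ρ_c = 3H₀²/(8πG) for the H₀ = 50 km s⁻¹ Mpc⁻¹ value the author adopts. The constants below are standard values and the script is illustrative only:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
MPC = 3.0857e22        # one megaparsec in metres

H0 = 50e3 / MPC                          # 50 km/s/Mpc in s^-1
rho_c = 3 * H0**2 / (8 * math.pi * G)    # closure density, kg m^-3 (~4.7e-27)

luminous = 0.01 * rho_c    # ~1% of closure density (luminous matter, <10 kpc)
dynamical = 0.10 * rho_c   # ~10% of closure density (systems >= 100 kpc)
```

With H₀ = 50 km s⁻¹ Mpc⁻¹ this gives ρ_c of order a few × 10⁻²⁷ kg m⁻³, so the quoted 1% and 10% fractions correspond to mean densities of roughly 10⁻²⁹ and 10⁻²⁸ kg m⁻³.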

  10. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques...
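For intuition about how DEA-style efficiency scores feed into revenue caps, here is a deliberately minimal single-input, single-output sketch under constant returns to scale: each unit's output/input ratio is measured against the best ratio, which defines the efficient frontier. Real regulatory DEA solves linear programs over multiple inputs and outputs, and the operator data below are hypothetical:

```python
def dea_efficiency(units):
    """units: dict name -> (input, output); returns name -> efficiency in (0, 1].

    Single-input, single-output DEA under constant returns to scale reduces
    to comparing each unit's productivity ratio to the frontier ratio.
    """
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    frontier = max(ratios.values())
    return {name: r / frontier for name, r in ratios.items()}

# Hypothetical distribution system operators: (cost, energy delivered).
operators = {
    "A": (100.0, 50.0),
    "B": (120.0, 72.0),   # best ratio (0.6 per unit cost) -> efficiency 1.0
    "C": (90.0, 36.0),
}
eff = dea_efficiency(operators)
```

A regulator would then, roughly speaking, allow fully efficient units their observed costs while tightening the revenue cap of units with efficiency below 1.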

  12. Benchmarking and the laboratory

    Science.gov (United States)

    Galloway, M; Nadin, L

    2001-01-01

    This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost effective and high quality service. Key Words: benchmarking • pathology PMID:11477112

  13. Benchmarking for Higher Education.

    Science.gov (United States)

    Jackson, Norman, Ed.; Lund, Helen, Ed.

    The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…

  14. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  15. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  16. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...... applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking...... can enable learning in public settings but that this requires actors to invest in ensuring that benchmark data are directed towards improvement....

  17. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  18. Dynamical interplay between pairing and quadrupole correlations in odd-mass nuclei

    International Nuclear Information System (INIS)

    Kaneko, Kazunari; Takada, Kenjiro; Sakata, Fumihiko; Tazaki, Shigeru.

    1982-01-01

    A study of the dynamical interplay between pairing and quadrupole correlations in odd-mass nuclei is developed. One purpose of this paper is to predict that new collective excited states may exist systematically in odd-mass nuclei; the other is to discuss a new collective band structure on top of a unique-parity one-quasiparticle state. Through numerical calculations, it is clarified that the dynamical mutual interplay between the pairing and quadrupole degrees of freedom plays an important role in odd-mass transitional nuclei in bringing about the new type of collective states. The results of the calculation are compared with experimental data. (Kato, T.)

  19. Statistical benchmark for BosonSampling

    International Nuclear Information System (INIS)

    Walschaers, Mattia; Mayer, Klaus; Buchleitner, Andreas; Kuipers, Jack; Urbina, Juan-Diego; Richter, Klaus; Tichy, Malte Christopher

    2016-01-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church–Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects. (fast track communication)

  20. Benchmarking reference services: an introduction.

    Science.gov (United States)

    Marshall, J G; Buchanan, H S

    1995-01-01

    Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.

  1. A Dynamical Origin of the Mass Hierarchy among Neutrinos, Charged Leptons, and Quarks

    OpenAIRE

    Akama, Keiichi; Katsuura, Kazuo

    1998-01-01

    We propose a dynamical mass-generation scenario which naturally realizes the mass hierarchy among the neutrinos, charged leptons and quarks, where the mass is dominated by the self-mass induced through the anomalous (i.e. non-minimal) gauge interactions.

  2. Computational fluid dynamics (CFD) round robin benchmark for a pressurized water reactor (PWR) rod bundle

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shin K., E-mail: paengki1@tamu.edu; Hassan, Yassin A.

    2016-05-15

    Highlights: • The capabilities of steady RANS models were directly assessed for full axial scale experiment. • The importance of mesh and conjugate heat transfer was reaffirmed. • The rod inner-surface temperature was directly compared. • The steady RANS calculations showed a limitation in the prediction of circumferential distribution of the rod surface temperature. - Abstract: This study examined the capabilities and limitations of steady Reynolds-Averaged Navier–Stokes (RANS) approach for pressurized water reactor (PWR) rod bundle problems, based on the round robin benchmark of computational fluid dynamics (CFD) codes against the NESTOR experiment for a 5 × 5 rod bundle with typical split-type mixing vane grids (MVGs). The round robin exercise against the high-fidelity, broad-range (covering multi-spans and entire lateral domain) NESTOR experimental data for both the flow field and the rod temperatures enabled us to obtain important insights into CFD prediction and validation for the split-type MVG PWR rod bundle problem. It was found that the steady RANS turbulence models with wall function could reasonably predict two key variables for a rod bundle problem – grid span pressure loss and the rod surface temperature – once mesh (type, resolution, and configuration) was suitable and conjugate heat transfer was properly considered. However, they over-predicted the magnitude of the circumferential variation of the rod surface temperature and could not capture its peak azimuthal locations for a central rod in the wake of the MVG. These discrepancies in the rod surface temperature were probably because the steady RANS approach could not capture unsteady, large-scale cross-flow fluctuations and qualitative cross-flow pattern change due to the laterally confined test section. Based on this benchmarking study, lessons and recommendations about experimental methods as well as CFD methods were also provided for the future research.

  3. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and hereby to explore...

  4. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, based toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red
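The first-tier screening logic described above reduces to a simple comparison: retain any contaminant whose measured concentration exceeds its NOAEL-based benchmark as a contaminant of potential concern (COPC), and drop the rest from further consideration. In the sketch below the benchmark values and site data are made-up placeholders, not figures from the report:

```python
# Hypothetical NOAEL-based benchmarks for a water medium (mg/L).
benchmarks_mg_per_L = {"cadmium": 0.01, "zinc": 0.5, "lead": 0.05}

def screen_copcs(measured, benchmarks):
    """Return, sorted, the contaminants whose measured concentration
    exceeds the corresponding benchmark (first-tier screening)."""
    return sorted(
        name for name, conc in measured.items()
        if name in benchmarks and conc > benchmarks[name]
    )

# Hypothetical measured concentrations at a site (mg/L).
site_water = {"cadmium": 0.004, "zinc": 0.9, "lead": 0.06}
copcs = screen_copcs(site_water, benchmarks_mg_per_L)  # ['lead', 'zinc']
```

Cadmium falls below its benchmark and is excluded; zinc and lead exceed theirs and would be carried forward into the second-tier, weight-of-evidence assessment.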

  5. Dynamic pulse buckling of cylindrical shells under axial impact: A benchmark study of 2D and 3D finite element calculations

    International Nuclear Information System (INIS)

    Hoffman, E.L.; Ammerman, D.J.

    1995-01-01

    A series of tests investigating dynamic pulse buckling of a cylindrical shell under axial impact is compared to several 2D and 3D finite element simulations of the event. The purpose of the work is to investigate the performance of various analysis codes and element types on a problem which is applicable to radioactive material transport packages, and ultimately to develop a benchmark problem to qualify finite element analysis codes for the transport package design industry. During the pulse buckling tests, a buckle formed at each end of the cylinder, and one of the two buckles became unstable and collapsed. Numerical simulations of the test were performed using PRONTO, a Sandia developed transient dynamics analysis code, and ABAQUS/Explicit with both shell and continuum elements. The calculations are compared to the tests with respect to deformed shape and impact load history

  6. Sea quark contribution to the dynamical mass and light quark content of a nucleon

    International Nuclear Information System (INIS)

    Singh, J.P.

    1995-01-01

    We calculate the flavor mixing in the wave function of a light valence quark. For this, we use the idea of dynamical symmetry breaking. A sea quark of a different flavor may appear through the vacuum polarization of a gluon propagator which appears in the gap equation for the dynamical mass. We have also used the fact that any one of these quark lines may undergo condensation. The dependence of the dynamical mass, generated in this way, on the sea quark mass up to quadratic terms has been retained. The momentum dependence is like 1/p⁴, in contrast with the 1/p² dependence which occurs for the leading term of the dynamical mass in the subasymptotic region. The extension of the result to the ''mass shell'' yields σ_πN = 53–54 MeV for the pion-nucleon σ term and m_s⟨p|s̄s|p⟩ = 122–264 MeV for the strange quark contribution to the proton mass, for different values of parameters. These are in reasonable agreement with current phenomenological estimates of these quantities

  7. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    In KINS (Korea Institute of Nuclear Safety), to prepare audit calculation of PGSFR licensing review, the project has been started to develop the regulatory technology for SFR system including a fuel area. To evaluate the fuel integrity and safety during an irradiation, the fuel performance code must be used for audit calculation. In this study, to verify the new code system, the benchmark analysis is performed. In the benchmark, X447 EBR-II experiment data are used. Additionally, the sensitivity analysis according to mass flux change of coolant is performed. In case of LWR fuel performance modeling, various and advanced models have been proposed and validated based on sufficient in-reactor test results. However, due to the lack of experience of SFR operation, the current understanding of SFR fuel behavior is limited. In this study, X447 EBR-II Experiment data are used for benchmark. The fuel composition of X447 assembly is U-10Zr and PGSFR also uses this composition in initial phase. So we select X447 EBR-II experiment for benchmark analysis. Due to the lack of experience of SFR operation and data, the current understanding of SFR fuel behavior is limited. However, in order to prepare the licensing of PGSFR, regulatory audit technologies of SFR must be secured. So, in this study, to verify the new audit fuel performance analysis code, the benchmark analysis is performed using X447 EBR-II experiment data. Also, the sensitivity analysis with mass flux change of coolant is performed. In terms of verification, it is considered that the results of benchmark and sensitivity analysis are reasonable.

  8. X447 EBR-II Experiment Benchmark for Verification of Audit Code of SFR Metal Fuel

    International Nuclear Information System (INIS)

    Choi, Yong Won; Bae, Moo-Hoon; Shin, Andong; Suh, Namduk

    2016-01-01

    In KINS (Korea Institute of Nuclear Safety), to prepare audit calculation of PGSFR licensing review, the project has been started to develop the regulatory technology for SFR system including a fuel area. To evaluate the fuel integrity and safety during an irradiation, the fuel performance code must be used for audit calculation. In this study, to verify the new code system, the benchmark analysis is performed. In the benchmark, X447 EBR-II experiment data are used. Additionally, the sensitivity analysis according to mass flux change of coolant is performed. In case of LWR fuel performance modeling, various and advanced models have been proposed and validated based on sufficient in-reactor test results. However, due to the lack of experience of SFR operation, the current understanding of SFR fuel behavior is limited. In this study, X447 EBR-II Experiment data are used for benchmark. The fuel composition of X447 assembly is U-10Zr and PGSFR also uses this composition in initial phase. So we select X447 EBR-II experiment for benchmark analysis. Due to the lack of experience of SFR operation and data, the current understanding of SFR fuel behavior is limited. However, in order to prepare the licensing of PGSFR, regulatory audit technologies of SFR must be secured. So, in this study, to verify the new audit fuel performance analysis code, the benchmark analysis is performed using X447 EBR-II experiment data. Also, the sensitivity analysis with mass flux change of coolant is performed. In terms of verification, it is considered that the results of benchmark and sensitivity analysis are reasonable

  9. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy and recommendations for its potential uses in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  10. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is as a structured continuous collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institutional-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Benchmarks of Global Clean Energy Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-01-01

    The Clean Energy Manufacturing Analysis Center (CEMAC), sponsored by the U.S. Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE), provides objective analysis and up-to-date data on global supply chains and manufacturing of clean energy technologies. Benchmarks of Global Clean Energy Manufacturing sheds light on several fundamental questions about the global clean technology manufacturing enterprise: How does clean energy technology manufacturing impact national economies? What are the economic opportunities across the manufacturing supply chain? What are the global dynamics of clean energy technology manufacturing?

  12. Adventure Tourism Benchmark – Analyzing the Case of Suesca, Cundinamarca

    Directory of Open Access Journals (Sweden)

    Juan Felipe Tsao Borrero

    2012-11-01

    Full Text Available Adventure tourism is a growing sector within the tourism industry, and understanding its dynamics is fundamental for adventure tourism destinations and their local authorities. Destination benchmarking is a strong tool to identify the performance of tourism services offered at the destination in order to design appropriate policies to improve its competitiveness. The benchmarking study of Suesca, an adventure tourism destination in Colombia, helps to identify the gaps compared with successful adventure tourism destinations around the world, and provides valuable information to local policy-makers on the features to be improved. The lack of information available to tourists and of financial facilities hinders Suesca's capability to improve its competitiveness.

  13. Evidence for dynamic SU(5) symmetry breaking in meson mass multiplets

    International Nuclear Information System (INIS)

    Frikkee, E.

    1994-07-01

    It is shown that the mass differences and multiplet pattern for pseudoscalar and vector mesons correspond to a chain of dynamic symmetry reductions SU(n) ⊃ SU(n-1)×U(1). In this symmetry-reduction model, the differences between the masses of the quark flavours are the result of intra-hadronic interactions. Quark confinement is explained as a consequence of the fact that this symmetry breaking chain only occurs in hadrons. The results of a quantitative analysis of mass splittings in meson multiplets indicate that SU(5) is probably the highest symmetry for hadron states. In the proposed dynamic symmetry breaking scheme with five quark flavours there is no one-to-one correspondence between lepton and quark generations. (orig.)

  14. Anisotropic dynamic mass density for fluid–solid composites

    KAUST Repository

    Wu, Ying

    2012-10-01

    By taking the low frequency limit of multiple-scattering theory, we obtain the dynamic effective mass density of fluid–solid composites with a two-dimensional rectangular lattice structure. The anisotropic mass density can be described by an angle-dependent dipole solution, to leading order in the solid concentration. The angular dependence vanishes for the square lattice, but at high solid concentrations there is a structure-dependent factor that contributes to the leading-order solution. In all cases, Wood's formula is found to be accurately valid for the effective bulk modulus, independent of the structures. Numerical evaluations from the solutions are shown to be in excellent agreement with finite-element simulations. © 2012 Elsevier B.V.
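
    Wood's formula, referenced above for the effective bulk modulus, is the volume-weighted harmonic (Reuss) average of the constituent moduli. A minimal sketch, with illustrative material values (water plus a generic stiff solid; the numbers are assumptions, not taken from the paper):

```python
def wood_bulk_modulus(phi_solid, k_solid, k_fluid):
    """Wood's formula: 1/K_eff = phi/K_s + (1 - phi)/K_f (harmonic average)."""
    return 1.0 / (phi_solid / k_solid + (1.0 - phi_solid) / k_fluid)

# Illustrative (assumed) values: water as the fluid, a stiff solid inclusion
k_fluid = 2.2e9   # Pa, bulk modulus of water
k_solid = 36.0e9  # Pa, bulk modulus of the solid
k_eff = wood_bulk_modulus(0.3, k_solid, k_fluid)  # 30% solid concentration
print(f"Effective bulk modulus: {k_eff / 1e9:.2f} GPa")  # ~3.06 GPa
```

    The effective modulus stays close to that of the fluid, since the compliant phase dominates a harmonic average.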

  15. Testing substellar models with dynamical mass measurements

    Directory of Open Access Journals (Sweden)

    Liu M.C.

    2011-07-01

    Full Text Available We have been using Keck laser guide star adaptive optics to monitor the orbits of ultracool binaries, providing dynamical masses at lower luminosities and temperatures than previously available and enabling strong tests of theoretical models. We have identified three specific problems with theory: (1) We find that model color–magnitude diagrams cannot be reliably used to infer masses as they do not accurately reproduce the colors of ultracool dwarfs of known mass. (2) Effective temperatures inferred from evolutionary model radii are typically inconsistent with temperatures derived from fitting atmospheric models to observed spectra by 100–300 K. (3) For the only known pair of field brown dwarfs with a precise mass (3%) and age determination (≈25%), the measured luminosities are ~2–3× higher than predicted by model cooling rates (i.e., masses inferred from Lbol and age are 20–30% larger than measured). To make progress in understanding the observed discrepancies, more mass measurements spanning a wide range of luminosity, temperature, and age are needed, along with more accurate age determinations (e.g., via asteroseismology) for primary stars with brown dwarf binary companions. Also, resolved optical and infrared spectroscopy are needed to measure lithium depletion and to characterize the atmospheres of binary components in order to better assess model deficiencies.
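
    The dynamical masses above come from relative orbits via Kepler's third law: in solar units, M_tot = a³/P² with the semimajor axis a in AU and the period P in years. A minimal sketch (the orbit values are illustrative assumptions, not measurements from the survey):

```python
def total_dynamical_mass(a_au, period_yr):
    """Kepler's third law for a binary: total mass in solar masses,
    with semimajor axis in AU and orbital period in years."""
    return a_au**3 / period_yr**2

# Illustrative (assumed) orbit for a substellar binary
m_tot = total_dynamical_mass(2.2, 9.9)
print(f"Total mass: {m_tot:.3f} Msun")  # 0.109 Msun, i.e. ~114 Mjup
```

    Note this yields only the total mass; splitting it into component masses requires the flux ratio or an absolute astrometric orbit.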

  16. International Benchmark on Pressurised Water Reactor Sub-channel and Bundle Tests. Volume II: Benchmark Results of Phase I: Void Distribution

    International Nuclear Information System (INIS)

    Rubin, Adam; Avramova, Maria; Velazquez-Lozada, Alexander

    2016-03-01

    This report summarised the first phase of the Nuclear Energy Agency (NEA) and the US Nuclear Regulatory Commission Benchmark based on NUPEC PWR Sub-channel and Bundle Tests (PSBT), which was intended to provide data for the verification of void distribution models in participants' codes. This phase was composed of four exercises. Exercise 1: steady-state single sub-channel benchmark, Exercise 2: steady-state rod bundle benchmark, Exercise 3: transient rod bundle benchmark and Exercise 4: a pressure drop benchmark. The experimental data provided to the participants of this benchmark is from a series of void measurement tests using full-size mock-up tests for both Boiling Water Reactors (BWRs) and Pressurised Water Reactors (PWRs). These tests were performed from 1987 to 1995 by the Nuclear Power Engineering Corporation (NUPEC) in Japan and made available by the Japan Nuclear Energy Safety Organisation (JNES) for the purposes of this benchmark, which was organised by Pennsylvania State University. Twenty-one institutions from nine countries participated in this benchmark. Seventeen different computer codes were used in Exercises 1, 2, 3 and 4. Among the computer codes were porous-media, sub-channel and systems thermal-hydraulic codes, as well as Computational Fluid Dynamics (CFD) codes. It was observed that the codes tended to overpredict the thermal equilibrium quality at lower elevations and underpredict it at higher elevations. There was also a tendency to overpredict void fraction at lower elevations and underpredict it at high elevations for the bundle test cases. The overprediction of void fraction at low elevations is likely caused by the x-ray densitometer measurement method used. Under sub-cooled boiling conditions, the voids accumulate at heated surfaces (and are therefore not seen in the centre of the sub-channel, where the measurements are being taken), so the experimentally-determined void fractions will be lower than the actual void fraction. Some of the best

  17. Benchmark testing and independent verification of the VS2DT computer code

    International Nuclear Information System (INIS)

    McCord, J.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation

  18. Fundamental Parameters of Low-Mass Stars, Brown Dwarfs, and Planets

    Science.gov (United States)

    Montet, Benjamin; Johnson, John A.; Bowler, Brendan; Shkolnik, Evgenya

    2016-01-01

    Despite advances in evolutionary models of low-mass stars and brown dwarfs, these models remain poorly constrained by observations. In order to test these predictions directly, masses of individual stars must be measured and combined with broadband photometry and medium-resolution spectroscopy to probe stellar atmospheres. I will present results from an astrometric and spectroscopic survey of low-mass pre-main sequence binary stars to measure individual dynamical masses and compare to model predictions. This is the first systematic test of a large number of stellar systems of intermediate age between young star-forming regions and old field stars. Stars in our sample are members of the Tuc-Hor, AB Doradus, and beta Pictoris moving groups, the last of which includes GJ 3305 AB, the wide binary companion to the imaged exoplanet host 51 Eri. I will also present results of Spitzer observations of secondary eclipses of LHS 6343 C, a T dwarf transiting one member of an M+M binary in the Kepler field. By combining these data with Kepler photometry and radial velocity observations, we can measure the luminosity, mass, and radius of the brown dwarf. This is the first non-inflated brown dwarf for which all three of these parameters have been measured, providing the first benchmark to test model predictions of the masses and radii of field T dwarfs. I will discuss these results in the context of K2 and TESS, which will find additional benchmark transiting brown dwarfs over the course of their missions, including a description of the first planet catalog developed from K2 data and a program to search for transiting planets around mid-M dwarfs.

  19. Determining Optimal Crude Oil Price Benchmark in Nigeria: An Empirical Approach

    Directory of Open Access Journals (Sweden)

    Saibu Olufemi Muibi

    2015-12-01

    Full Text Available This paper contributes to the ongoing empirical search for an appropriate crude oil price benchmark that ensures greater financial stability and efficient fiscal management in Nigeria. It adopted seasonally adjusted ARIMA forecasting models using monthly data series from 2000m01 to 2012m12 to predict future movement in Nigerian crude oil prices. The paper derived a more robust and dynamic framework that accommodates fluctuation in crude oil prices and also in government spending. The result shows that if the incessant withdrawals from the ECA fund and the increasing debt profile of government in recent times are factored into the benchmark, the real crude oil numerical fiscal rule is US$82.3 for 2013, which is higher than the official benchmark of $75 used for the 2013 and 2014 budget proposals. The paper argues that the current long-run price rule, based on the 5-10 year moving average approach adopted by government, is rigid and inflexible as a rule for managing Nigerian oil funds. The unrealistic assumptions of the extant benchmark account for excessive depletion and lack of accountability of the excess crude oil account. The paper concludes that unless the federal government curtails its spending profligacy and adopts more stringent fiscal discipline rules, the current benchmark is unrealistic and unsuitable for fiscal management of oil revenue in the context of the Nigerian economic spending profile.
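
    The moving-average price rule criticized above can be sketched in a few lines; the annual prices below are hypothetical, purely to show how such a backward-looking benchmark is computed:

```python
def moving_average_benchmark(prices, window):
    """Backward-looking price benchmark: mean of the last `window` observations."""
    if len(prices) < window:
        raise ValueError("not enough price history for the chosen window")
    return sum(prices[-window:]) / window

# Hypothetical annual average crude prices (US$/barrel), oldest first
history = [65, 72, 97, 62, 79, 95, 94, 98]
budget_benchmark = moving_average_benchmark(history, 5)  # 5-year rule
print(budget_benchmark)  # 85.6
```

    Because the window averages over old troughs and peaks alike, the rule reacts slowly to sustained price shifts, which is the inflexibility the paper objects to.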

  20. Mass spectrometry in structural biology and biophysics architecture, dynamics, and interaction of biomolecules

    CERN Document Server

    Kaltashov, Igor A; Desiderio, Dominic M; Nibbering, Nico M

    2012-01-01

    The definitive guide to mass spectrometry techniques in biology and biophysics The use of mass spectrometry (MS) to study the architecture and dynamics of proteins is increasingly common within the biophysical community, and Mass Spectrometry in Structural Biology and Biophysics: Architecture, Dynamics, and Interaction of Biomolecules, Second Edition provides readers with detailed, systematic coverage of the current state of the art. Offering an unrivalled overview of modern MS-based armamentarium that can be used to solve the most challenging problems in biophysics, structural biol

  1. Variation in body mass dynamics among sites in Black Brant Branta bernicla nigricans supports adaptivity of mass loss during moult

    Science.gov (United States)

    Fondell, Thomas F.; Flint, Paul L.; Schmutz, Joel A.; Schamber, Jason L.; Nicolai, Christopher A.

    2013-01-01

    Birds employ varying strategies to accommodate the energetic demands of moult, one important example being changes in body mass. To understand better their physiological and ecological significance, we tested three hypotheses concerning body mass dynamics during moult. We studied Black Brant in 2006 and 2007 moulting at three sites in Alaska which varied in food availability, breeding status and whether geese undertook a moult migration. First we predicted that if mass loss during moult were simply the result of inadequate food resources then mass loss would be highest where food was least available. Secondly, we predicted that if mass loss during moult were adaptive, allowing birds to reduce activity during moult, then birds would gain mass prior to moult where feeding conditions allowed and mass loss would be positively related to mass at moult initiation. Thirdly, we predicted that if mass loss during moult were adaptive, allowing birds to regain flight sooner, then across sites and groups, mass at the end of the flightless period would converge on a theoretical optimum, i.e. the mass that permits the earliest possible return to flight. Mass loss was greatest where food was most available and thus our results did not support the prediction that mass loss resulted from inadequate food availability. Mass at moult initiation was positively related to both food availability and mass loss. In addition, among sites and years, variation in mass was high at moult initiation but greatly reduced at the end of the flightless period, appearing to converge. Thus, our results supported multiple predictions that mass loss during moult was adaptive and that the optimal moulting strategy was to gain mass prior to the flightless period, then through behavioural modifications use these body reserves to reduce activity and in so doing also reduce wing loading. Geese that undertook a moult migration initiated moult at the highest mass, indicating that they were more than able to

  2. Molecular theory of mass transfer kinetics and dynamics at gas-water interface

    International Nuclear Information System (INIS)

    Morita, Akihiro; Garrett, Bruce C

    2008-01-01

    The mass transfer mechanism across the gas-water interface is studied with molecular dynamics (MD) simulation. The MD results provide a robust picture of the microscopic aspects of mass transfer, qualitatively consistent with previous studies, including interface structure, free energy profiles for the uptake, scattering dynamics and energy relaxation of impinging molecules. These MD results are quantitatively compared with experimental uptake measurements, and we find that the apparent inconsistency between MD and experiment could be partly resolved by precise decomposition of the observed kinetics into elemental steps. Remaining issues and future perspectives toward constructing a comprehensive multi-scale description of interfacial mass transfer are summarized.

  3. EGS4 benchmark program

    International Nuclear Information System (INIS)

    Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.

    1995-01-01

    This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. It also evaluates optimization methods of recent RISC/UNIX systems, such as those from IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler options and math libraries were included in the evaluation process, systems performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of some so-called mainframes from IBM, Hitachi or Fujitsu. The performance of the EGS4 Code System on an HP9000/735 (99 MHz) was defined to be one EGS4 Unit. The EGS4 Benchmark Suite was also run on various PCs, such as Pentium, i486 and DEC Alpha machines. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have also been evaluated for correlation with an industry benchmark, namely SPECmark. (author)
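
    The "EGS4 Unit" described above is a simple normalization: a system's score is the reference machine's run time divided by that system's run time for the same benchmark. A sketch with made-up timings (the numbers are illustrative assumptions):

```python
def egs4_units(ref_time_s, sys_time_s):
    """Performance in EGS4 Units: >1 means faster than the reference machine."""
    return ref_time_s / sys_time_s

# Hypothetical wall-clock times for one benchmark run (seconds);
# the HP9000/735 (99 MHz) reference defines 1.0 EGS4 Unit.
ref = 120.0
print(egs4_units(ref, 60.0))   # a machine twice as fast  -> 2.0
print(egs4_units(ref, 240.0))  # a machine half as fast   -> 0.5
```

    Time-ratio scoring of this kind is also how industry suites such as SPECmark normalized results, which is why the paper can correlate the two.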

  4. Comparison of 250 MHz R10K Origin 2000 and 400 MHz Origin 2000 Using NAS Parallel Benchmarks

    Science.gov (United States)

    Turney, Raymond D.; Thigpen, William W. (Technical Monitor)

    2001-01-01

    This report describes results of benchmark tests on Steger, a 250 MHz Origin 2000 system with R10K processors, currently installed at the NASA Ames National Advanced Supercomputing (NAS) facility. For comparison purposes, the tests were also run on Lomax, a 400 MHz Origin 2000 with R12K processors. The BT, LU, and SP application benchmarks in the NAS Parallel Benchmark Suite and the kernel benchmark FT were chosen to measure system performance. Having been written to measure performance on Computational Fluid Dynamics applications, these benchmarks are assumed appropriate to represent the NAS workload. Since NAS runs both message-passing (MPI) codes and shared-memory, compiler-directive codes, both MPI and OpenMP versions of the benchmarks were used. The MPI versions used were the latest official release of the NAS Parallel Benchmarks, version 2.3. The OpenMP versions used were PBN3b2, a beta version that is in the process of being released. NPB 2.3 and PBN3b2 are technically different benchmarks, and NPB results are not directly comparable to PBN results.

  5. AER Benchmark Specification Sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In the WWER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers and they have an important role in reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational fluid dynamics codes and experiments can help to better understand these mixing processes, and they can provide information to support a more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D computational fluid dynamics modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the twenty-third cycle of Paks NPP Unit 3 are investigated. One of them has a symmetrical pin power profile and the other an inclined profile. (Authors)

  6. CFD validation in OECD/NEA t-junction benchmark.

    Energy Technology Data Exchange (ETDEWEB)

    Obabko, A. V.; Fischer, P. F.; Tautges, T. J.; Karabasov, S.; Goloviznin, V. M.; Zaytsev, M. A.; Chudanov, V. V.; Pervichko, V. A.; Aksenova, A. E. (Mathematics and Computer Science); (Cambridge Univ.); (Moscow Institute of Nuclear Energy Safety)

    2011-08-23

    When streams of rapidly moving flow merge in a T-junction, the potential arises for large oscillations at the scale of the diameter, D, with a period scaling as O(D/U), where U is the characteristic flow velocity. If the streams are of different temperatures, the oscillations result in temperature fluctuations (thermal striping) at the pipe wall in the outlet branch that can accelerate thermal-mechanical fatigue and ultimately cause pipe failure. The importance of this phenomenon has prompted the nuclear energy modeling and simulation community to establish a benchmark to test the ability of computational fluid dynamics (CFD) codes to predict thermal striping. The benchmark is based on thermal and velocity data measured in an experiment designed specifically for this purpose. Thermal striping is intrinsically unsteady and hence not accessible to steady-state simulation approaches such as steady-state Reynolds-averaged Navier-Stokes (RANS) models. Consequently, one must consider either unsteady RANS or large eddy simulation (LES). This report compares the results for three LES codes: Nek5000, developed at Argonne National Laboratory (USA), and Cabaret and Conv3D, developed at the Moscow Institute of Nuclear Energy Safety (IBRAE) in Russia. Nek5000 is based on the spectral element method (SEM), which is a high-order weighted residual technique that combines the geometric flexibility of the finite element method (FEM) with the tensor-product efficiencies of spectral methods. Cabaret is a 'compact accurately boundary-adjusting high-resolution technique' for fluid dynamics simulation. The method is second-order accurate on nonuniform grids in space and time, and has a small dispersion error and computational stencil defined within one space-time cell. The scheme is equipped with a conservative nonlinear correction procedure based on the maximum principle. CONV3D is based on the immersed boundary method and is validated on a wide set of the experimental

  7. Sliding Mode Control for Mass Moment Aerospace Vehicles Using Dynamic Inversion Approach

    Directory of Open Access Journals (Sweden)

    Xiao-Yu Zhang

    2013-01-01

    Full Text Available The moving mass actuation technique offers significant advantages over conventional aerodynamic control surfaces and reaction control systems, because the actuators are contained entirely within the airframe geometrical envelope. Modeling, control, and simulation of Mass Moment Aerospace Vehicles (MMAV) utilizing moving mass actuators are discussed. Dynamics of the MMAV are separated into two parts on the basis of the two-time-scale separation theory: the dynamics of the fast state and the dynamics of the slow state. Then, in order to restrain system chattering and maintain the tracking performance of the system in the presence of aerodynamic parameter perturbations, the flight control system is designed for the two subsystems, respectively, using a fuzzy sliding mode control approach. The simulation results demonstrate the effectiveness of the proposed autopilot design approach. Meanwhile, the chattering phenomenon that frequently appears in conventional variable structure systems is also eliminated without deteriorating the system robustness.

  8. Benchmarking ENDF/B-VII.0

    International Nuclear Information System (INIS)

    Marck, Steven C. van der

    2006-01-01

    The new major release VII.0 of the ENDF/B nuclear data library has been tested extensively using benchmark calculations. These were based upon MCNP-4C3 continuous-energy Monte Carlo neutronics simulations, together with nuclear data processed using the code NJOY. Three types of benchmarks were used, viz., criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 700 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM), to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for 6 Li, 7 Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D 2 O, H 2 O, concrete, polyethylene and teflon). For testing delayed neutron data more than thirty measurements in widely varying systems were used. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, and two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. In criticality safety, many benchmarks were chosen from the category with a thermal spectrum, low-enriched uranium, compound fuel (LEU-COMP-THERM), because this is typical of most current-day reactors, and because these benchmarks were previously underpredicted by as much as 0.5% by most nuclear data libraries (such as ENDF/B-VI.8, JEFF-3.0). The calculated results presented here show that this underprediction is no longer there for ENDF/B-VII.0. The average over 257
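
    Criticality benchmark results of this kind are conventionally summarized as calculated-over-experimental (C/E) ratios of k_eff; an average C/E below 1 corresponds to the ~0.5% underprediction mentioned above. A minimal sketch with invented values (not the actual benchmark results):

```python
def average_ce(calculated, experimental):
    """Mean C/E ratio of k_eff across a set of criticality benchmarks."""
    return sum(c / e for c, e in zip(calculated, experimental)) / len(calculated)

# Invented k_eff values for three benchmark cases (experimental ~1.0)
calc = [0.9952, 0.9948, 0.9960]
expt = [1.0000, 1.0000, 1.0000]
print(f"average C/E = {average_ce(calc, expt):.4f}")  # 0.9953, i.e. ~0.5% low
```

    A library update that removes the bias would move the average C/E back toward 1.0000, which is the improvement claimed for ENDF/B-VII.0.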

  9. California commercial building energy benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, Satkartar; Piette, Mary Ann

    2003-07-01

    Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with "typical" and "best-practice" benchmarks, while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none that were based solely on California data. Most available benchmarking information, including the Energy Star performance rating, was developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the

  10. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...

  11. Benchmarking, benchmarks, or best practices? Applying quality improvement principles to decrease surgical turnaround time.

    Science.gov (United States)

    Mitchell, L

    1996-01-01

    The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
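
    The 18% figure reported above is straightforward to verify from the quoted turnaround times:

```python
def percent_improvement(before, after):
    """Relative reduction, as a percentage of the starting value."""
    return 100.0 * (before - after) / before

# Average turnaround times between surgical cases from the study (minutes)
print(f"{percent_improvement(19.9, 16.3):.0f}%")  # 18%
```

    The improved average of 16.3 minutes still sits above the 13.5-minute national benchmark, which is consistent with the article's framing of benchmarks as targets rather than achieved states.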

  12. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.
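
    The outdating indicator benchmarked above is simply outdated units as a percentage of units disposed over a reporting period. A sketch with hypothetical site counts (the numbers are illustrative, chosen to bracket the reported 2.82% and 1.02% provincial rates):

```python
def outdating_rate(outdated_units, total_units):
    """RBC outdating rate (%) for one site over a reporting period."""
    return 100.0 * outdated_units / total_units

# Hypothetical annual counts for two sites
print(f"{outdating_rate(28, 1000):.2f}%")  # 2.80%
print(f"{outdating_rate(10, 980):.2f}%")   # 1.02%
```

    Comparing each site's rate against a dynamic target, rather than a fixed one, is the key design element the conceptual framework highlights.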

  13. Dynamic Stability of Pipe Conveying Fluid with Crack and Attached Masses

    International Nuclear Information System (INIS)

    Ahn, Tae Soo; Yoon, Han Ik; Son, In Soo; Ahn, Sung Jin

    2007-01-01

In this paper, the dynamic stability of a cracked simply supported pipe conveying fluid with an attached mass is investigated. The effect of attached masses on the dynamic stability of a simply supported pipe conveying fluid is also presented for different positions and depths of the crack. Based on the Euler-Bernoulli beam theory, the equation of motion is constructed from the energy expressions using the extended Hamilton's principle. The crack section is represented by a local flexibility matrix connecting two undamaged pipe segments. The crack is assumed to be a first-mode (opening) fracture that remains open during vibration. Finally, the critical flow velocities and stability maps of the pipe conveying fluid are obtained by varying the attached masses and the crack severity.
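For context, the uncracked baseline that the crack and attached masses perturb can be sketched as follows; the formula is the classical divergence-type critical velocity for a simply supported pipe conveying fluid, not the paper's cracked-pipe result:

```python
import math

def critical_flow_velocity(EI, m_fluid, L, mode=1):
    """Divergence-type critical flow velocity [m/s] of a simply supported,
    uncracked pipe conveying fluid: U_c = n*pi/L * sqrt(EI / m_fluid).
    The paper's crack and attached masses reduce this baseline value.
    EI      : flexural rigidity of the pipe [N*m^2]
    m_fluid : fluid mass per unit length [kg/m]
    L       : span between the supports [m]
    """
    return mode * math.pi / L * math.sqrt(EI / m_fluid)
```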

  14. Monte Carlo burnup simulation of the TAKAHAMA-3 benchmark experiment

    International Nuclear Information System (INIS)

    Dalle, Hugo M.

    2009-01-01

High burnup PWR fuel is currently being studied at CDTN/CNEN-MG. The Monte Carlo burnup code system MONTEBURNS is used to characterize the neutronic behavior of the fuel. In order to validate the code system and calculation methodology to be used in this study, the Japanese Takahama-3 benchmark was chosen, as it is the only freely available experimental burnup benchmark data set that partially reproduces the conditions of the fuel under evaluation. The burnup of the three PWR fuel rods of the Takahama-3 benchmark was calculated with MONTEBURNS using both the simplest infinite fuel pin cell model and a more complex representation of an infinite lattice of heterogeneous fuel pin cells. Calculated masses of most isotopes of uranium, neptunium, plutonium, americium and curium, and of some fission products commonly used as burnup monitors, were compared with the Post Irradiation Examination (PIE) values for all three fuel rods. The results show some sensitivity to the MCNP neutron cross-section data libraries, which are particularly affected by the temperature at which the evaluated nuclear data files were processed. (author)
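The usual figure of merit for such code-versus-PIE comparisons is the calculated-to-experimental (C/E) ratio per isotope. A minimal sketch (the function name and sample values are hypothetical; the paper reports its own comparisons):

```python
def ce_ratios(calculated, measured):
    """Calculated-to-experimental (C/E) ratios for isotope masses, the
    standard figure of merit when validating burnup calculations against
    post-irradiation examination (PIE) data. A ratio of 1.0 is perfect
    agreement; only isotopes present in both dicts are compared."""
    return {iso: calculated[iso] / measured[iso]
            for iso in calculated if iso in measured}
```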

  15. Development of parallel benchmark code by sheet metal forming simulator 'ITAS'

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Suzuki, Shintaro; Minami, Kazuo

    1999-03-01

This report describes the development of a parallel benchmark code based on the sheet metal forming simulator 'ITAS'. ITAS is a nonlinear elasto-plastic analysis program using the finite element method for the simulation of sheet metal forming. ITAS adopts a dynamic analysis method that computes the displacement of the sheet metal at every time step and uses the implicit method with a direct linear equation solver. The simulator is therefore very robust, but it requires a large amount of computational time and memory. In developing the parallel benchmark code, we parallelized the code with MPI to reduce the computational time. In numerical experiments on five parallel supercomputers at CCSE JAERI (SP2, SR2201, SX-4, T94 and VPP300), good performance was observed. The results will be made public on the WWW so that they may serve as a guideline for the research and development of parallel programs. (author)
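The standard way to summarize parallel benchmark runs like these is speedup and efficiency versus processor count. A minimal sketch (the names and timings are hypothetical, not ITAS measurements):

```python
def speedup_and_efficiency(t_serial, timings):
    """Speedup S(p) = T(1)/T(p) and parallel efficiency E(p) = S(p)/p
    from measured wall-clock times.
    t_serial : single-processor wall time [s]
    timings  : {processor_count: wall_time_s}
    Returns {processor_count: (speedup, efficiency)}."""
    out = {}
    for p, t_p in sorted(timings.items()):
        s = t_serial / t_p
        out[p] = (s, s / p)
    return out
```

Efficiency near 1.0 indicates near-ideal scaling; the drop at higher processor counts exposes communication and load-imbalance overheads.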

  16. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, based on four different applications of benchmarking. The regulation of utility companies is then addressed, after which...

  17. Regional Competitive Intelligence: Benchmarking and Policymaking

    OpenAIRE

    Huggins , Robert

    2010-01-01

    Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...

  18. ABM11 parton distributions and benchmarks

    International Nuclear Information System (INIS)

    Alekhin, Sergey; Bluemlein, Johannes; Moch, Sven-Olaf

    2012-08-01

We present a determination of the nucleon parton distribution functions (PDFs) and of the strong coupling constant α_s at next-to-next-to-leading order (NNLO) in QCD, based on the world data for deep-inelastic scattering and the fixed-target data for the Drell-Yan process. The analysis is performed in the fixed-flavor number scheme for n_f = 3, 4, 5 and uses the MS-bar scheme for α_s and the heavy quark masses. The fit results are compared with other PDFs and used to compute benchmark cross sections at hadron colliders to NNLO accuracy.

  19. ABM11 parton distributions and benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Alekhin, Sergey [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Institut Fiziki Vysokikh Ehnergij, Protvino (Russian Federation); Bluemlein, Johannes; Moch, Sven-Olaf [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)

    2012-08-15

We present a determination of the nucleon parton distribution functions (PDFs) and of the strong coupling constant α_s at next-to-next-to-leading order (NNLO) in QCD, based on the world data for deep-inelastic scattering and the fixed-target data for the Drell-Yan process. The analysis is performed in the fixed-flavor number scheme for n_f = 3, 4, 5 and uses the MS-bar scheme for α_s and the heavy quark masses. The fit results are compared with other PDFs and used to compute benchmark cross sections at hadron colliders to NNLO accuracy.

  20. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, the TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
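The flavor of such basic-operation measurements can be sketched with a toy workload; the schema, queries and use of SQLite here are illustrative assumptions, not XMarq's actual 25-query suite:

```python
import sqlite3
import time

# Hypothetical miniature workload in the spirit of XMarq's basic operations:
# a full scan, an aggregation, a join, and an index lookup, timed one by one.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE customers(id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders(id INTEGER PRIMARY KEY, cust INTEGER, total REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(i, f"cust{i}") for i in range(100)])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 100, float(i)) for i in range(10000)])
cur.execute("CREATE INDEX idx_orders_cust ON orders(cust)")

queries = {
    "scan":      "SELECT COUNT(*) FROM orders WHERE total >= 0",
    "aggregate": "SELECT cust, SUM(total) FROM orders GROUP BY cust",
    "join":      "SELECT COUNT(*) FROM orders o JOIN customers c ON o.cust = c.id",
    "index":     "SELECT COUNT(*) FROM orders WHERE cust = 42",
}
for name, sql in queries.items():
    t0 = time.perf_counter()
    rows = cur.execute(sql).fetchall()
    print(f"{name}: {len(rows)} row(s) in {time.perf_counter() - t0:.6f}s")
```

Timing each operation in isolation, rather than a composite query mix, is what lets this style of benchmark attribute cost to individual engine primitives.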

  1. High Energy Physics (HEP) benchmark program

    International Nuclear Information System (INIS)

    Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.

    1993-01-01

High Energy Physics (HEP) benchmark programs are indispensable tools for selecting suitable computers for HEP application systems. Industry-standard benchmark programs cannot be used for this particular kind of selection. The CERN and SSC benchmark suites are well-known HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC suite includes event generators. In this paper, we find that the results from these two suites are not consistent, and that the results from industry benchmarks do not agree with either of them. We also compare benchmark results obtained with the EGS4 Monte Carlo simulation program against those from the two HEP benchmark suites, and find that the EGS4 results are consistent with neither. The industry-standard SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of standardizing HEP benchmark suites. An EGS4 benchmark suite should also be developed for users of applications in fields such as medical science, nuclear power plants, nuclear physics and high energy physics. (author)

  2. Influence of tip mass on dynamic behavior of cracked cantilever pipe conveying fluid with moving mass

    International Nuclear Information System (INIS)

    Yoon, Han Ik; Son, In Soo

    2005-01-01

In this paper, we study the effects of an open crack and a tip mass on the dynamic behavior of a cantilever pipe conveying fluid with a moving mass. The equation of motion is derived using Lagrange's equation and analyzed numerically. The cantilever pipe is modelled with the Euler-Bernoulli beam theory. The crack section is represented by a local flexibility matrix connecting two undamaged pipe segments. The influences of the crack, the moving mass, the tip mass and its moment of inertia, the velocity of the fluid, and the coupling of these factors on the vibration mode, the frequency, and the tip displacement of the cantilever pipe are analytically clarified.

  3. Dynamical Formation of Low-mass Merging Black Hole Binaries like GW151226

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Sourav; Rodriguez, Carl L.; Kalogera, Vicky; Rasio, Frederic A., E-mail: sourav.chatterjee@northwestern.edu [Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA) Physics and Astronomy, Northwestern University, Evanston, IL 60202 (United States)

    2017-02-20

Using numerical models for star clusters spanning a wide range in ages and metallicities (Z), we study the masses of binary black holes (BBHs) produced dynamically and merging in the local universe (z ≲ 0.2). After taking into account cosmological constraints on star formation rate and metallicity evolution, which realistically relate merger delay times obtained from models with merger redshifts, we show here for the first time that while old, metal-poor globular clusters can naturally produce merging BBHs with heavier components, as observed in GW150914, lower-mass BBHs like GW151226 are easily formed dynamically in younger, higher-metallicity clusters. More specifically, we show that the mass of GW151226 is well within 1σ of the mass distribution obtained from our models for clusters with Z/Z_⊙ ≳ 0.5. Indeed, dynamical formation of a system like GW151226 likely requires a cluster that is younger and has a higher metallicity than typical Galactic globular clusters. The LVT151012 system, if real, could have been created in any cluster with Z/Z_⊙ ≲ 0.25. On the other hand, GW150914 is more massive (beyond 1σ) than typical BBHs from even the lowest-metallicity (Z/Z_⊙ = 0.005) clusters we consider, but is within 2σ of the intrinsic mass distribution from our cluster models with Z/Z_⊙ ≲ 0.05; of course, detection biases also push the observed distributions toward higher masses.

  4. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  5. A Benchmark for Virtual Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Automatically animating and placing the virtual camera in a dynamic environment is a challenging task. The camera is expected to maximise and maintain a set of properties — i.e. visual composition — while smoothly moving through the environment and avoiding obstacles. A large number of different....... For this reason, in this paper, we propose a benchmark for the problem of virtual camera control and we analyse a number of different problems in different virtual environments. Each of these scenarios is described through a set of complexity measures and, as a result of this analysis, a subset of scenarios...

  6. MOx Depletion Calculation Benchmark

    International Nuclear Information System (INIS)

    San Felice, Laurence; Eschbach, Romain; Dewi Syarifah, Ratna; Maryam, Seif-Eddine; Hesketh, Kevin

    2016-01-01

Under the auspices of the NEA Nuclear Science Committee (NSC), the Working Party on Scientific Issues of Reactor Systems (WPRS) has been established to study the reactor physics, fuel performance, radiation transport and shielding, and the uncertainties associated with modelling of these phenomena in present and future nuclear power systems. The WPRS has different expert groups to cover a wide range of scientific issues in these fields. The Expert Group on Reactor Physics and Advanced Nuclear Systems (EGRPANS) was created in 2011 to perform specific tasks associated with reactor physics aspects of present and future nuclear power systems. EGRPANS provides expert advice to the WPRS and the nuclear community on the development needs (data and methods, validation experiments, scenario studies) for different reactor systems, and also provides specific technical information regarding core reactivity characteristics (including fuel depletion effects), core power/flux distributions, and core dynamics and reactivity control. In 2013 EGRPANS published a report that investigated fuel depletion effects in a Pressurised Water Reactor (PWR), entitled 'International Comparison of a Depletion Calculation Benchmark on Fuel Cycle Issues', NEA/NSC/DOC(2013), which documented a benchmark exercise for UO2 fuel rods. This report documents a complementary benchmark exercise that focused on PuO2/UO2 Mixed Oxide (MOX) fuel rods. The results are especially relevant to the back-end of the fuel cycle, including irradiated fuel transport, reprocessing, interim storage and waste repository. Saint-Laurent B1 (SLB1) was the first French reactor to use MOX assemblies. SLB1 is a 900 MWe PWR with a 30% MOX fuel loading. The standard MOX assemblies used in the Saint-Laurent B1 reactor include three zones with different plutonium enrichments: high Pu content (5.64%) in the center zone, medium Pu content (4.42%) in the intermediate zone and low Pu content (2.91%) in the peripheral zone.

  7. A Global Vision over Benchmarking Process: Benchmarking Based Enterprises

    OpenAIRE

    Sitnikov, Catalina; Giurca Vasilescu, Laura

    2008-01-01

Benchmarking uses the knowledge and the experience of others to improve the enterprise. Starting from an analysis of performance that underlines the strengths and weaknesses of the enterprise, it should be assessed what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision "from the whole towards the parts" (a fragmented image of the enterprise's value chain) redu...

  8. Implementing ADM1 for plant-wide benchmark simulations in Matlab/Simulink

    DEFF Research Database (Denmark)

    Rosen, Christian; Vrecko, Darko; Gernaey, Krist

    2006-01-01

    , in particular if the ADM1 is to be included in dynamic simulations of plant-wide or even integrated systems. In this paper, the experiences gained from a Matlab/Simulink implementation of ADM1 into the extended COST/IWA Benchmark Simulation Model (BSM2) are presented. Aspects related to system stiffness, model...

  9. The Atacama Cosmology Telescope: Dynamical Masses for 44 SZ-Selected Galaxy Clusters over 755 Square Degrees

    Science.gov (United States)

    Sifon, Cristobal; Battaglia, Nick; Hasselfield, Matthew; Menanteau, Felipe; Barrientos, L. Felipe; Bond, J. Richard; Crichton, Devin; Devlin, Mark J.; Dunner, Rolando; Hilton, Matt

    2016-01-01

We present galaxy velocity dispersions and dynamical mass estimates for 44 galaxy clusters selected via the Sunyaev-Zeldovich (SZ) effect by the Atacama Cosmology Telescope. Dynamical masses for 18 clusters are reported here for the first time. Using N-body simulations, we model the different observing strategies used to measure the velocity dispersions and account for systematic effects resulting from these strategies. We find that the galaxy velocity distributions may be treated as isotropic, and that an aperture correction of up to 7 per cent in the velocity dispersion is required if the spectroscopic galaxy sample is sufficiently concentrated towards the cluster centre. Accounting for the radial profile of the velocity dispersion in simulations enables consistent dynamical mass estimates regardless of the observing strategy. Cluster masses M200 are in the range (1-15) × 10^14 solar masses. Comparing with masses estimated from the SZ distortion assuming a gas pressure profile derived from X-ray observations gives a mean SZ-to-dynamical mass ratio of 1.10 ± 0.13, but there is an additional 0.14 systematic uncertainty due to the unknown velocity bias; the statistical uncertainty is dominated by the scatter in the mass-velocity dispersion scaling relation. This ratio is consistent with previous determinations at these mass scales.
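Converting a measured velocity dispersion into a dynamical mass typically goes through a simulation-calibrated scaling relation. A sketch using the Evrard et al. (2008) dark-matter calibration; the constants are illustrative, and the paper applies its own aperture and velocity-bias corrections on top:

```python
def m200_from_sigma(sigma_kms, hz=1.0, sigma15=1083.0, alpha=0.336):
    """Dynamical mass M200 [Msun] from a galaxy velocity dispersion via the
    scaling relation sigma = sigma15 * (h(z) * M200 / 1e15 Msun)**alpha,
    inverted for M200. sigma15 and alpha are the (approximate) Evrard et al.
    2008 dark-matter calibration; hz is the dimensionless Hubble parameter
    at the cluster redshift."""
    return 1e15 / hz * (sigma_kms / sigma15) ** (1.0 / alpha)
```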

  10. Polynomial friction pendulum isolators (PFPIs) for seismic performance control of benchmark highway bridge

    Science.gov (United States)

    Saha, Arijit; Saha, Purnachandra; Patro, Sanjaya Kumar

    2017-10-01

    The seismic response of a benchmark highway bridge isolated with passive polynomial friction pendulum isolators (PFPIs) is investigated and subjected to six bidirectional ground motion records. The benchmark study is based on a lumped mass finite-element model of the 91/5 highway overcrossing located in Southern California. The PFPI system possesses two important parameters; one is horizontal flexibility and the other is energy absorbing capacity through friction. The evaluation criteria of the benchmark bridge are analyzed considering two parameters, time period of the isolator and coefficient of friction of the isolation surface. The results of the numerical study are compared with those obtained from the traditional friction pendulum system (FPS). Dual design performance of the PFPI system suppressed the displacement and acceleration response of the benchmark highway bridge. The dual design hysteresis loop of the PFPI system is the main advantage over the linear hysteresis loop of the FPS. The numerical result indicates that the seismic performance of the PFPI system is better than that of the traditional FPS isolated system. Further, it is observed that variations of the isolation time period and coefficient of friction of the FPS and PFPI systems have a significant effect on the peak responses of the benchmark highway bridge.
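The contrast between the two isolators can be sketched through their lateral force laws: the traditional FPS combines a linear restoring term (W/R)·u with Coulomb friction, while the PFPI replaces the linear term with a polynomial in the sliding displacement. The coefficients below are hypothetical illustrations, not the paper's design values:

```python
import math

def fps_force(u, du, W, R, mu):
    """Lateral resisting force of a traditional friction pendulum system (FPS):
    linear restoring term (W/R)*u plus Coulomb friction mu*W*sign(du).
    u, du : sliding displacement [m] and velocity [m/s]
    W     : supported weight [N];  R : radius of curvature [m];  mu : friction."""
    return W / R * u + mu * W * math.copysign(1.0, du)

def pfpi_force(u, du, W, coeffs, mu):
    """PFPI variant: the restoring term is a polynomial sum(c_k * u**k) in the
    sliding displacement (coefficients here are hypothetical; the paper's
    dual-design surface has its own polynomial), plus the same friction term."""
    restoring = sum(c * u**k for k, c in enumerate(coeffs))
    return restoring + mu * W * math.copysign(1.0, du)
```

With coeffs = [0, W/R] the PFPI force law reduces to the FPS one, which is why the FPS hysteresis loop is the linear special case of the PFPI's dual-design loop.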

  11. Experimental Benchmarking of Fire Modeling Simulations. Final Report

    International Nuclear Information System (INIS)

    Greiner, Miles; Lopez, Carlos

    2003-01-01

    A series of large-scale fire tests were performed at Sandia National Laboratories to simulate a nuclear waste transport package under severe accident conditions. The test data were used to benchmark and adjust the Container Analysis Fire Environment (CAFE) computer code. CAFE is a computational fluid dynamics fire model that accurately calculates the heat transfer from a large fire to a massive engulfed transport package. CAFE will be used in transport package design studies and risk analyses

  12. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.

  13. Comparison of the updated solutions of the 6th dynamic AER Benchmark - main steam line break in a NPP with WWER-440

    International Nuclear Information System (INIS)

    Kliem, S.

    2003-01-01

The 6th dynamic AER benchmark is used for the systematic validation of coupled 3D neutron kinetics/thermal hydraulics system codes. It was defined at the 10th AER Symposium. In this benchmark, a hypothetical double-ended break of one main steam line at full power in a WWER-440 plant is investigated. The main thermal hydraulic features are the consideration of incomplete coolant mixing in the lower and upper plenum of the reactor pressure vessel and an asymmetric operation of the feed water system. For the tuning of the different nuclear cross-section data used by the participants, an isothermal re-criticality temperature was defined. The paper gives an overview of the behaviour of the main thermal hydraulic and neutron kinetic parameters in the provided solutions. The differences between the updated solutions and the previous ones are described. Improvements in the modelling of the transient led to better agreement for a part of the results, while for another part the deviations increased. The sensitivity of the core power behaviour to the secondary side modelling is discussed in detail (Authors)

  14. Multilaboratory particle image velocimetry analysis of the FDA benchmark nozzle model to support validation of computational fluid dynamics simulations.

    Science.gov (United States)

    Hariharan, Prasanna; Giarra, Matthew; Reddy, Varun; Day, Steven W; Manning, Keefe B; Deutsch, Steven; Stewart, Sandy F C; Myers, Matthew R; Berman, Michael R; Burgreen, Greg W; Paterson, Eric G; Malinauskas, Richard A

    2011-04-01

This study is part of an FDA-sponsored project to evaluate the use and limitations of computational fluid dynamics (CFD) in assessing blood flow parameters related to medical device safety. In an interlaboratory study, fluid velocities and pressures were measured in a nozzle model to provide experimental validation for a companion round-robin CFD study. The simple benchmark nozzle model, which mimicked the flow fields in several medical devices, consisted of a gradual flow constriction, a narrow throat region, and a sudden expansion region where a fluid jet exited the center of the nozzle with recirculation zones near the model walls. Measurements of mean velocity and turbulent flow quantities were made in the benchmark device at three independent laboratories using particle image velocimetry (PIV). Flow measurements were performed over a range of nozzle throat Reynolds numbers (Re_throat) from 500 to 6500, covering the laminar, transitional, and turbulent flow regimes. A standard operating procedure was developed for performing experiments under controlled temperature and flow conditions and for minimizing systematic errors during PIV image acquisition and processing. For laminar (Re_throat = 500) and turbulent flow conditions (Re_throat ≥ 3500), the velocities measured by the three laboratories were similar, with an interlaboratory uncertainty of ∼10% at most of the locations. However, for the transitional flow case (Re_throat = 2000), the uncertainty in the size and the velocity of the jet at the nozzle exit increased to ∼60% and was very sensitive to the flow conditions. An error analysis showed that by minimizing the variability in experimental parameters such as flow rate and fluid viscosity to less than 5% and by matching the inlet turbulence level between the laboratories, the uncertainties in the velocities of the transitional flow case could be reduced to ∼15%. The experimental procedure and flow results from this interlaboratory study (available
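The throat Reynolds number that organizes the study is a one-line computation; the regime labels below simply mirror the tested conditions (Re_throat = 500 laminar, 2000 transitional, ≥ 3500 turbulent) rather than universal thresholds:

```python
def throat_reynolds(rho, velocity, diameter, mu):
    """Throat Reynolds number Re = rho * V * d / mu.
    rho      : fluid density [kg/m^3]
    velocity : mean throat velocity [m/s]
    diameter : throat diameter [m]
    mu       : dynamic viscosity [Pa*s]"""
    return rho * velocity * diameter / mu

def flow_regime(re):
    """Regime bands as exercised in the study; the boundaries are the
    tested conditions, not sharp physical transition points."""
    if re <= 500:
        return "laminar"
    if re < 3500:
        return "transitional"
    return "turbulent"
```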

  15. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  16. Benchmarking in Czech Higher Education

    Directory of Open Access Journals (Sweden)

    Plaček Michal

    2015-12-01

The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made, which lie in the design of a model of collaborative benchmarking for Czech economics and management higher-education programs. Because the fully complex model cannot be implemented immediately (which is also confirmed by structured interviews with academics who have practical experience with benchmarking), the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.

  17. Towards a benchmarking tool for minimizing wastewater utility greenhouse gas footprints.

    Science.gov (United States)

    Guo, L; Porro, J; Sharma, K R; Amerlinck, Y; Benedetti, L; Nopens, I; Shaw, A; Van Hulle, S W H; Yuan, Z; Vanrolleghem, P A

    2012-01-01

A benchmark simulation model, which includes a wastewater treatment plant (WWTP)-wide model and a rising main sewer model, is proposed for testing mitigation strategies to reduce the system's greenhouse gas (GHG) emissions. The sewer model was run to predict methane emissions, and its output was used as the WWTP model input. An activated sludge model for GHG (ASMG) was used to describe nitrous oxide (N2O) generation and release in the activated sludge process. N2O production through both heterotrophic and autotrophic pathways was included. Other GHG emissions were estimated using empirical relationships. Different scenarios were evaluated comparing GHG emissions, effluent quality and energy consumption. Aeration control played a clear role in N2O emissions, through concentrations and distributions of dissolved oxygen (DO) along the length of the bioreactor. The average value of N2O emission under dynamic influent cannot be simulated by a steady-state model subjected to a similar influent quality, stressing the importance of dynamic simulation and control. As the GHG models have yet to be validated, these results carry a degree of uncertainty; however, they fulfilled the objective of this study, i.e. to demonstrate the potential of a dynamic system-wide modelling and benchmarking approach for balancing water quality, operational costs and GHG emissions.
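Aggregating the different gases into a single footprint requires global-warming-potential (GWP) weights. A minimal sketch; the IPCC AR5 100-year factors used here are an assumption of this illustration, and the benchmark may use different conversion factors:

```python
# Assumed 100-year global-warming-potential factors (approximate IPCC AR5
# values, without climate-carbon feedbacks); not the benchmark's own numbers.
GWP_100YR = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def co2_equivalent(emissions_kg, gwp=GWP_100YR):
    """Aggregate per-gas emissions [kg] into a single kg CO2-equivalent
    figure, the kind of scalar a GHG benchmarking score would track."""
    return sum(gwp[gas] * kg for gas, kg in emissions_kg.items())
```

Because the N2O weight is so large, even small shifts in the aeration-driven N2O pathway dominate the aggregate footprint, which is why the DO profile matters so much in the scenarios above.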

  18. ANN-Benchmarks: A Benchmarking Tool for Approximate Nearest Neighbor Algorithms

    DEFF Research Database (Denmark)

    Aumüller, Martin; Bernhardsson, Erik; Faithfull, Alexander

    2017-01-01

This paper describes ANN-Benchmarks, a tool for evaluating the performance of in-memory approximate nearest neighbor algorithms. It provides a standard interface for measuring the performance and quality achieved by nearest neighbor algorithms on different standard data sets. It supports several...... visualise these as images, plots, and websites with interactive plots. ANN-Benchmarks aims to provide a constantly updated overview of the current state of the art of k-NN algorithms. In the short term, this overview allows users to choose the correct k-NN algorithm and parameters...... for their similarity search task; in the longer term, algorithm designers will be able to use this overview to test and refine automatic parameter tuning. The paper gives an overview of the system, evaluates the results of the benchmark, and points out directions for future work. Interestingly, very different....
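The quality side of the speed/quality trade-off reported by such tools is usually recall against exact ground truth. A minimal sketch (the function name is mine, not the tool's API):

```python
def recall_at_k(approx_ids, true_ids, k):
    """Fraction of the true k nearest neighbours that the approximate
    answer recovers -- the quality axis an ANN benchmark trades off
    against queries per second.
    approx_ids : ids returned by the approximate algorithm, best first
    true_ids   : exact k-NN ids from a brute-force ground truth"""
    return len(set(approx_ids[:k]) & set(true_ids[:k])) / k
```

Plotting recall_at_k against throughput for each parameter setting yields the Pareto frontiers such benchmarks use to compare algorithms.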

  19. Extraction of left ventricular myocardial mass from dynamic 11C-acetate PET

    DEFF Research Database (Denmark)

    Harms, Hans; Tolbod, Lars Poulsen; Hansson, Nils Henrik

Background: Dynamic 11C-acetate PET is used to quantify oxygen metabolism, which is used to calculate left ventricular (LV) myocardial efficiency, an early marker of heart failure. This requires estimation of LV myocardial mass, which is typically derived from a separate cardiovascular magnetic resonance (CMR) scan. The aim of this study was to explore the feasibility of estimating myocardial mass directly from a dynamic 11C-acetate PET scan. Methods: 21 subjects underwent a 27-min 11C-acetate PET scan on a Siemens Biograph TruePoint 64 PET/CT scanner. In addition, 10 subjects underwent a dynamic 11C-acetate 27-min PET scan on a GE Discovery ST PET/CT scanner. Parametric images of uptake rate K1 and both arterial (VA) and venous (VV) spillover fractions were generated using a basis function implementation of the standard single tissue compartment model using non-gated dynamic data. The LV
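The single-tissue-compartment model with spillover that underlies those parametric images can be sketched numerically: the tissue curve is K1 times the input function convolved with exp(-k2·t), and the measured PET signal mixes tissue and blood-pool contributions via the spillover fractions. A toy discretization (the function names and the rectangular-rule convolution are mine, not the paper's basis-function implementation):

```python
import math

def tissue_curve(ca, dt, K1, k2):
    """Single-tissue-compartment tissue curve C_T(t) = K1 * (C_A conv exp(-k2*t)),
    evaluated with a simple rectangular-rule convolution on a uniform grid.
    ca : arterial input samples; dt : grid spacing [min]; K1, k2 : rate constants."""
    ct = []
    for i in range(len(ca)):
        acc = 0.0
        for j in range(i + 1):
            acc += ca[j] * math.exp(-k2 * (i - j) * dt) * dt
        ct.append(K1 * acc)
    return ct

def pet_signal(ct, ca, cv, va, vv):
    """Measured PET value mixing tissue with arterial (va) and venous (vv)
    spillover: (1 - va - vv)*C_T + va*C_A + vv*C_V."""
    return [(1.0 - va - vv) * t + va * a + vv * v
            for t, a, v in zip(ct, ca, cv)]
```

For a constant input the tissue curve approaches the steady-state value K1/k2, a quick sanity check on any implementation of the model.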

  20. Dynamic mass generation and renormalizations in quantum field theories

    International Nuclear Information System (INIS)

    Miransky, V.A.

    1979-01-01

It is shown that dynamic mass generation can destroy the multiplicative renormalization relations and lead to new types of divergences in the massive phase. To remove these divergences, the values of the bare coupling constants must be fixed. The phase diagrams of gauge theories are discussed.

  1. Sigma terms and strangeness content of the nucleon with N_f = 2+1+1 twisted mass fermions

    Energy Technology Data Exchange (ETDEWEB)

    Dinter, Simon; Drach, Vincent [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Frezzotti, Roberto; Rossi, Giancarlo [Roma Tor Vergata Univ. (Italy). Dipt. di Fisica; INFN Sezione di Roma Tor Vergata, Roma (Italy); Herdoiza, Gregorio [Univ. Autonoma de Madrid (Spain). Dept. de Fisica Teorica y Inst. de Fisica Teorica UAM/CSIC; Jansen, Karl [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC; Roma Tor Vergata Univ. (Italy). Dipt. di Fisica; INFN Sezione di Roma Tor Vergata, Roma (Italy)

    2012-02-15

    We study the nucleon matrix elements of the quark scalar-density operator using maximally twisted mass fermions with dynamical light (u,d), strange and charm degrees of freedom. We demonstrate that in this setup the nucleon matrix elements of the light and strange quark densities can be obtained with good statistical accuracy, while for the charm quark counterpart only a bound can be provided. The present calculation, which is performed at only one value of the lattice spacing and pion mass, serves as a benchmark for a future more systematic computation of the scalar quark content of the nucleon. (orig.)

  2. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented, which are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article.

  3. Maximally twisted mass lattice QCD at the physical pion mass

    International Nuclear Information System (INIS)

    Kostrzewa, Bartosz

    2016-01-01

    ...introduced which may become very useful on very large lattices. The pion mass splitting is studied as a function of the Sheikholeslami-Wohlert coefficient in simulations with four flavours, and it is found to be approximately halved compared to twisted mass quarks without this term. However, a dependence on the precise value of the coefficient cannot be identified within the large uncertainties and within the range of values studied. To optimise the Hybrid Monte Carlo algorithm, mass preconditioning is explored empirically through simple fits to the magnitude of molecular dynamics forces generated by quark determinants and determinant ratios with a wide range of parameter values. Based on the functional form of these fits, mass preconditioning and integration schemes are proposed in which the relationships between all parameters are tuned simultaneously and which may allow more efficient simulations with predictable relative force magnitudes. As a complement to this work, a tentative study of the oscillation frequencies of these forces is performed, with the finding that mass preconditioning seems to suppress large-amplitude, high-frequency oscillations in addition to reducing force magnitudes. Crucial optimisations of the simulation software for twisted mass quarks are introduced. A multithreading strategy based on OpenMP is devised, and kernels which overlap communication and computation are developed and benchmarked on various architectures. Testing methodologies for the simulation code are presented, and it is shown how they complement each other based on specific examples, providing a rather general set of integration tests.

  4. Benchmark calculations on residue production within the EURISOL DS project; Part II: thick targets

    CERN Document Server

    David, J.-C; Boudard, A; Doré, D; Leray, S; Rapp, B; Ridikas, D; Thiollière, N

    Benchmark calculations on residue production using MCNPX 2.5.0. Calculations were compared to mass-distribution data for 5 different elements measured at ISOLDE, and to specific activities of 28 radionuclides in different places along the thick target measured in Dubna.

  5. DYNAMICS OF METAPHORIC MODELLING OF THE CONCEPT OF TERRORISM IN AMERICAN MASS MEDIA DISCOURSE

    Directory of Open Access Journals (Sweden)

    Rykova, O.V.

    2017-09-01

    The topicality of the research in modern linguistics is defined by the importance of studying the problem of the dynamic nature of concept content, the need to define the type of connection between concept and discourse, and the need to reveal the dependence of concept content and verbalization means on the type of discourse. The subject of the research is the dynamic properties of the verbalization of a socially marked concept in American mass media discourse. The aim is to define the dynamics of structuring and explicating knowledge about terrorism in mass media discourse. To reach this aim, the following tasks are set: to determine the corpus of linguistic units which serve as verbalizers of the concept of terrorism in American mass media discourse; and to define the dynamics of the verbal representation of the concept of terrorism in American mass media discourse as exemplified by metaphoric modelling. The practical applicability of the research consists in the possibility of using its main points and results in such academic courses as general linguistics, stylistics, cultural linguistics, special courses in cognitive linguistics and the theory of conceptual metaphor, discourse studies, and in lexicographic practice.

  6. Comparison of Various Dynamic Balancing Principles Regarding Additional Mass and Additional Inertia

    NARCIS (Netherlands)

    van der Wijk, V.; Demeulenaere, Bram; Herder, Justus Laurens

    2009-01-01

    The major disadvantage of existing dynamic balancing principles is that a considerable amount of mass and inertia is added to the system. The objectives of this article are to summarize, to compare, and to evaluate existing complete balancing principles regarding the addition of mass and the

  7. Dynamical mechanism of symmetry breaking and particle mass generation in gauge field theories

    International Nuclear Information System (INIS)

    Miranskij, V.A.; Fomin, P.I.

    1985-01-01

    The dynamics of the spontaneous symmetry breaking and the particle mass generation in gauge theories with no fundamental scalar fields is considered. The emphasis is on the symmetry breaking mechanism connected with the dynamics of the supercritical Coulomb-like forces caused by gauge boson exchange between fermions. This mechanism is applied to different gauge theories, in particular to the description of the spontaneous chiral symmetry breaking in quantum chromodynamics. The mass relations for the pseudoscalar meson nonet are obtained, and it is shown that this mechanism results in the dynamical realisation of the hypothesis of the partial conservation of the axial-vector currents. A qualitative description of scalar mesons is given. The nature of the ultraviolet divergences in quantum electrodynamics (QED) is investigated from the viewpoint of the dynamics of fermion mass generation. The mechanism of the appearance of additional (in comparison with perturbation theory) ultraviolet divergences in QED with large bare coupling constant is indicated. The physical phenomenon underlying this mechanism is identified as the field theory analogue of the quantum mechanical ''fall into the centre'' (collapse) phenomenon. A similar phenomenon is shown to take place in some two-dimensional quantum field models. The dynamics of the formation of bifermion condensates in tumbling gauge theories is briefly discussed.

  8. Characterization of the dynamic friction of woven fabrics: Experimental methods and benchmark results

    NARCIS (Netherlands)

    Sachs, Ulrich; Akkerman, Remko; Fetfatsidis, K.; Vidal-Sallé, E.; Schumacher, J.; Ziegmann, G.; Allaoui, S.; Hivet, G.; Maron, B.; Vanclooster, K.; Lomov, S.V.

    2014-01-01

    A benchmark exercise was conducted to compare various friction test set-ups with respect to the measured coefficients of friction. The friction was determined between Twintex®PP, a fabric of commingled yarns of glass and polypropylene filaments, and a metal surface. The same material was supplied to

  9. Benchmarking of municipal case processing (Benchmarking af kommunernes sagsbehandling)

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, the National Social Appeals Board (Ankestyrelsen) is to carry out benchmarking of the quality of the municipalities' case processing. The purpose of the benchmarking is to develop the design of the practice reviews with a view to better follow-up, and to improve the municipalities' case processing. This working paper discusses methods for benchmarking...

  10. Connection between Dynamically Derived Initial Mass Function Normalization and Stellar Population Parameters

    NARCIS (Netherlands)

    McDermid, Richard M.; Cappellari, Michele; Alatalo, Katherine; Bayet, Estelle; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, Martin; Crocker, Alison F.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Emsellem, Eric; Khochfar, Sadegh; Krajnović, Davor; Kuntschner, Harald; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Scott, Nicholas; Serra, Paolo; Weijmans, Anne-Marie; Young, Lisa M.

    We report on empirical trends between the dynamically determined stellar initial mass function (IMF) and stellar population properties for a complete, volume-limited sample of 260 early-type galaxies from the ATLAS3D project. We study trends between our dynamically derived IMF normalization αdyn ≡

  11. Dynamic Responses of Flexible Cylinders with Low Mass Ratio

    Science.gov (United States)

    Olaoye, Abiodun; Wang, Zhicheng; Triantafyllou, Michael

    2017-11-01

    Flexible cylinders with low mass ratios such as composite risers are attractive in the offshore industry because they require lower top tension and are less likely to buckle under self-weight compared to steel risers. However, their relatively low stiffness characteristics make them more vulnerable to vortex induced vibrations. Additionally, numerical investigation of the dynamic responses of such structures based on realistic conditions is limited by high Reynolds number, complex sheared flow profile, large aspect ratio and low mass ratio challenges. In the framework of Fourier spectral/hp element method, the current technique employs entropy-viscosity method (EVM) based large-eddy simulation approach for flow solver and fictitious added mass method for structure solver. The combination of both methods can handle fluid-structure interaction problems at high Reynolds number with low mass ratio. A validation of the numerical approach is provided by comparison with experiments.

  12. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base

  13. The Drill Down Benchmark

    NARCIS (Netherlands)

    P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel

    1998-01-01

    textabstractData Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It

  14. Reanalysis of the radii of the Benchmark eclipsing binary V578 Mon

    International Nuclear Information System (INIS)

    Garcia, E. V.; Stassun, Keivan G.; Torres, Guillermo

    2013-01-01

    V578 Mon is an eclipsing binary system in which both stars have masses above 10 M☉, determined with an accuracy better than 3%. It is one of only five such massive eclipsing binaries known that also possess eccentric orbits and measured apsidal motions, thus making it an important benchmark for theoretical stellar evolution models. However, recently reported determinations of the radii of V578 Mon differ significantly from previously reported values. We reanalyze the published data for V578 Mon and trace the discrepancy to the use of an incorrect formulation for the stellar potentials in the most recent analysis. Here we report corrected radii for this important benchmark eclipsing binary.

  15. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...

  16. LHC benchmarks from flavored gauge mediation

    Energy Technology Data Exchange (ETDEWEB)

    Ierushalmi, N.; Iwamoto, S.; Lee, G.; Nepomnyashy, V.; Shadmi, Y. [Physics Department, Technion - Israel Institute of Technology,Haifa 32000 (Israel)

    2016-07-12

    We present benchmark points for LHC searches from flavored gauge mediation models, in which messenger-matter couplings give flavor-dependent squark masses. Our examples include spectra in which a single squark — stop, scharm, or sup — is much lighter than all other colored superpartners, motivating improved quark flavor tagging at the LHC. Many examples feature flavor mixing; in particular, large stop-scharm mixing is possible. The correct Higgs mass is obtained in some examples by virtue of the large stop A-term. We also revisit the general flavor and CP structure of the models. Even though the A-terms can be substantial, their contributions to EDM’s are very suppressed, because of the particular dependence of the A-terms on the messenger coupling. This holds regardless of the messenger-coupling texture. More generally, the special structure of the soft terms often leads to stronger suppression of flavor- and CP-violating processes, compared to naive estimates.

  17. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in future efforts, drawing on the discussions within the EU-sponsored BEST Thematic Network (Benchmarking European Sustainable Transport), which ran from 2000 to 2003.

  18. Benchmarking – A tool for judgment or improvement?

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2010-01-01

    Change in construction is high on the agenda for the Danish government, and a comprehensive effort is being made to improve quality and efficiency. This has led to a governmental effort to bring benchmarking into the Danish construction sector. This paper is an appraisal of benchmarking as it is presently carried out in the Danish construction sector. Many different perceptions of benchmarking and of the nature of the construction sector lead to uncertainty in how to perceive and use benchmarking, and hence to uncertainty in understanding its effects. This paper addresses two perceptions of benchmarking: public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system, and to highlight the effects, possibilities and challenges that follow in the wake of using this kind of benchmarking.

  19. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  20. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html

  1. Seismic structural response analysis using consistent mass matrices having dynamic coupling

    International Nuclear Information System (INIS)

    Shaw, D.E.

    1977-01-01

    The basis for the theoretical development of this paper is the linear matrix equations of motion for an unconstrained structure subject to support excitation. The equations are formulated in terms of absolute displacement, velocity and acceleration vectors. By means of a transformation of the absolute response vectors into displacements, velocities and accelerations relative to the support motions, the homogeneous equations become non-homogeneous and the non-homogeneous boundary conditions become homogeneous, with relative displacements, velocities and accelerations being zero at support points. The forcing function or inertial loading vector is shown to consist of two parts. The first part is comprised of the mass matrix times the support acceleration function times a vector of structural displacements resulting from a unit vector of support displacements in the direction of excitation. This inertial loading corresponds to the classical seismic loading vector and is indeed the only loading vector for lumped-mass systems. The second part of the inertial loading vector consists of the mass matrix times the support acceleration function times a vector of structural accelerations resulting from unit support accelerations in the direction of excitation. This term is not present in classical seismic analysis formulations and results from the presence of off-diagonal terms in the mass matrices, which give rise to dynamic coupling through the mass matrix. Thus, for lumped-mass models, the classical formulation of the inertial loading vector is correct. However, if dynamic coupling terms are included through off-diagonal terms in the mass matrix, an additional inertia loading vector must be considered.
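
    The effect described above can be made concrete with a toy two-degree-of-freedom numeric sketch (all values below are hypothetical, chosen only for illustration): with a diagonal (lumped) mass matrix the classical loading vector M r a_g is the whole story, while off-diagonal mass terms change the inertial load.

```python
import numpy as np

# Influence vector r: structural displacements produced by a unit
# support displacement in the direction of excitation.
r = np.array([1.0, 1.0])
a_g = 2.0  # support acceleration at some instant (arbitrary units)

M_lumped = np.diag([2.0, 1.0])         # lumped masses, no coupling
M_consistent = np.array([[2.0, 0.5],
                         [0.5, 1.0]])  # off-diagonal mass coupling

# Classical seismic inertial loading vector (sign convention aside):
# p = M r a_g.  The off-diagonal terms alter every component of p.
p_lumped = M_lumped @ r * a_g
p_consistent = M_consistent @ r * a_g
print(p_lumped.tolist())      # [4.0, 2.0]
print(p_consistent.tolist())  # [5.0, 3.0]
```

    The difference between the two load vectors is exactly the extra inertia loading that the paper attributes to dynamic coupling through the mass matrix.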

  2. Overview of Ice-Sheet Mass Balance and Dynamics from ICESat Measurements

    Science.gov (United States)

    Zwally, H. Jay

    2010-01-01

    The primary purpose of the ICESat mission was to determine the present-day mass balance of the Greenland and Antarctic ice sheets, identify changes that may be occurring in the surface-mass flux and ice dynamics, and estimate their contributions to global sea-level rise. Although ICESat's three lasers were planned to make continuous measurements for 3 to 5 years, the mission was re-planned to operate in 33-day campaigns 2 to 3 times each year following failure of the first laser after 36 days. Seventeen campaigns were conducted, with the last one in the Fall of 2009. Mass balance maps derived from measured ice-sheet elevation changes show that the mass loss from Greenland has increased significantly, to about 170 Gt/yr for 2003 to 2007, from a state of near balance in the 1990s. Increased losses (189 Gt/yr) from melting and dynamic thinning are over seven times larger than increased gains (25 Gt/yr) from precipitation. Parts of the West Antarctic ice sheet and the Antarctic Peninsula are losing mass at an increasing rate, but other parts of West Antarctica and the East Antarctic ice sheet are gaining mass at an increasing rate. Increased losses of 35 Gt/yr in Pine Island, Thwaites-Smith, and Marie Byrd Coast are more than balanced by gains at the base of the Peninsula and in the ice stream C, D, and E systems. From the 1992-2002 to 2003-2007 period, the overall mass balance for Antarctica changed from a loss of about 60 Gt/yr to near balance or slightly positive.

  3. Mass Distribution in Rotating Thin-Disk Galaxies According to Newtonian Dynamics

    Directory of Open Access Journals (Sweden)

    James Q. Feng

    2014-04-01

    An accurate computational method is presented for determining the mass distribution in a mature spiral galaxy from a given rotation curve by applying Newtonian dynamics for an axisymmetrically rotating thin disk of finite size with or without a central spherical bulge. The governing integral equation for mass distribution is transformed via a boundary-element method into a linear algebra matrix equation that can be solved numerically for rotation curves with a wide range of shapes. To illustrate the effectiveness of this computational method, mass distributions in several mature spiral galaxies are determined from their measured rotation curves. All the surface mass density profiles predicted by our model exhibit approximately a common exponential law of decay, quantitatively consistent with the observed surface brightness distributions. When a central spherical bulge is present, the mass distribution in the galaxy is altered in such a way that the periphery mass density is reduced, while more mass appears toward the galactic center. By extending the computational domain beyond the galactic edge, we can determine the rotation velocity outside the cut-off radius, which appears to continuously decrease and to gradually approach the Keplerian rotation velocity out over twice the cut-off radius. An examination of circular orbit stability suggests that galaxies with flat or rising rotation velocities are more stable than those with declining rotation velocities especially in the region near the galactic edge. Our results demonstrate the fact that Newtonian dynamics can be adequate for describing the observed rotation behavior of mature spiral galaxies.
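
    The boundary-element idea in the abstract, turning the governing integral equation into a linear matrix equation, can be sketched in a few lines. The kernel and numbers below are placeholders for illustration only; the paper's actual thin-disk kernel involves elliptic integrals.

```python
import numpy as np

# Discretize an integral equation  ∫ K(r, r') σ(r') dr' = v²(r)
# on n radial rings so it becomes a linear system A σ = b, then
# solve for the ring surface densities σ.
n = 8
r = np.linspace(0.1, 1.0, n)      # ring radii (arbitrary units)
dr = r[1] - r[0]

def kernel(ri, rj):               # smooth PLACEHOLDER kernel, not the
    return np.exp(-abs(ri - rj) / 0.2)  # real elliptic-integral kernel

A = np.array([[kernel(ri, rj) * dr for rj in r] for ri in r])
b = np.full(n, 0.04)              # toy flat rotation curve, v² = 0.04

sigma = np.linalg.solve(A, b)     # recovered ring densities
print(np.linalg.norm(A @ sigma - b) < 1e-9)  # residual check: True
```

    Real rotation-curve inversions of this type are often ill-conditioned, so regularized or least-squares solvers may be preferable to a direct solve.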

  4. Benchmarking the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William; Hui, Y.V.; Lam, Y. Miu

    2006-01-01

    Benchmarking energy-efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed in a simple benchmark table (percentile table) of energy use, which is normalized with floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method
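
    The three steps described (regression, normalization, percentile benchmark) can be illustrated with a minimal sketch. The data are invented and only one explanatory factor is used, whereas the paper's model uses several.

```python
import numpy as np

# Toy data: EUI (kWh per m² per year) and one explanatory factor.
hours = np.array([60., 70., 80., 90., 100., 110.])   # operating hours
eui   = np.array([250., 270., 300., 310., 340., 360.])

# Step 1: regress EUI on the explanatory factor.
X = np.column_stack([np.ones_like(hours), hours])
beta, *_ = np.linalg.lstsq(X, eui, rcond=None)

# Step 2: normalize each EUI by removing the effect of deviation
# from the mean operating hours (the significant explanatory factor).
eui_norm = eui - beta[1] * (hours - hours.mean())
print(np.round(eui_norm).astype(int).tolist())  # [305, 303, 311, 299, 307, 305]

# Step 3: the empirical distribution of normalized EUI acts as the
# benchmark table; percentile-rank building 0 against it.
target = eui_norm[0]
pct = 100.0 * np.mean(eui_norm <= target + 1e-9)  # tolerance for fp ties
print(round(pct))  # 67
```

    After normalization, a building's percentile reflects efficiency rather than, say, long opening hours, which is the stated advantage of this benchmarking approach.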

  5. Numisheet2005 Benchmark Analysis on Forming of an Automotive Underbody Cross Member: Benchmark 2

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao Jian

    2005-01-01

    This report presents an international cooperation benchmark effort focusing on simulations of a sheet metal stamping process. A forming process of an automotive underbody cross member using steel and aluminum blanks is used as a benchmark. Simulation predictions from each submission are analyzed via comparison with the experimental results. A brief summary of various models submitted for this benchmark study is discussed. Prediction accuracy of each parameter of interest is discussed through the evaluation of cumulative errors from each submission

  6. SKaMPI: A Comprehensive Benchmark for Public Benchmarking of MPI

    Directory of Open Access Journals (Sweden)

    Ralf Reussner

    2002-01-01

    The main objective of the MPI communication library is to enable portable parallel programming with high performance within the message-passing paradigm. Since the MPI standard has no associated performance model, and makes no performance guarantees, comprehensive, detailed and accurate performance figures for different hardware platforms and MPI implementations are important for the application programmer, both for understanding and possibly improving the behavior of a given program on a given platform, as well as for assuring a degree of predictable behavior when switching to another hardware platform and/or MPI implementation. We term this latter goal performance portability, and address the problem of attaining performance portability by benchmarking. We describe the SKaMPI benchmark which covers a large fraction of MPI, and incorporates well-accepted mechanisms for ensuring accuracy and reliability. SKaMPI is distinguished among other MPI benchmarks by an effort to maintain a public performance database with performance data from different hardware platforms and MPI implementations.

  7. Complexified quantum field theory and 'mass without mass' from multidimensional fractional actionlike variational approach with dynamical fractional exponents

    International Nuclear Information System (INIS)

    El-Nabulsi, Ahmad Rami

    2009-01-01

    A multidimensional fractional actionlike variational problem with time-dependent dynamical fractional exponents is constructed. Fractional Euler-Lagrange equations are derived and discussed in some detail. The results obtained are used to explore some novel aspects of fractional quantum field theory, where many interesting consequences are revealed, in particular the complexification of quantum field theory, notably of Dirac operators, and the novel notion of 'mass without mass'.

  8. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of the Denton (1971) method and the growth...
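
    The movement-preservation idea the abstract refers to can be made concrete with the additive first-difference variant of the Denton (1971) method, sketched below with invented quarterly data and annual benchmarks.

```python
import numpy as np

# Preliminary quarterly series and the annual totals it must hit.
z = np.array([10., 11., 12., 13., 12., 11., 10., 9.])
B = np.array([50., 46.])                  # annual benchmarks
C = np.kron(np.eye(2), np.ones(4))        # sums quarters into years

# Additive first-difference Denton: choose adjustments d = x - z whose
# quarter-to-quarter changes are as small as possible, subject to the
# annual constraints C x = B, so the movement of z is preserved.
n = len(z)
D = np.eye(n - 1, n, 1) - np.eye(n - 1, n)    # first-difference operator
K = np.block([[2 * D.T @ D, C.T],
              [C, np.zeros((2, 2))]])         # KKT system of the QP
rhs = np.concatenate([np.zeros(n), B - C @ z])
d = np.linalg.solve(K, rhs)[:n]
x = z + d

print(np.round(C @ x, 6).tolist())  # [50.0, 46.0] -> benchmarks hit
```

    For strictly positive series, the proportional variant (working with ratios x/z rather than differences) is often preferred; the KKT construction is analogous.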

  9. Power reactor pressure vessel benchmarks

    International Nuclear Information System (INIS)

    Rahn, F.J.

    1978-01-01

    A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the ranges of importance to embrittlement studies are summarized and recommendations for improvements in the benchmark are made. (author)

  10. Dynamic mass exchange in doubly degenerate binaries. I - 0.9 and 1.2 solar mass stars

    International Nuclear Information System (INIS)

    Benz, W.; Cameron, A.G.W.; Press, W.H.; Bowers, R.L.

    1990-01-01

    The dynamic mass exchange process in doubly degenerate binaries was investigated using a three-dimensional numerical simulation of the evolution of a doubly degenerate binary system in which the primary is a 1.2-solar-mass white dwarf and the Roche lobe filling secondary is a 0.9-solar-mass dwarf. The results show that, in a little more than two orbital periods, the secondary is completely destroyed and transformed into a thick disk orbiting about the primary. Since only a very small fraction of the mass (0.0063 solar mass) escapes the system, the evolution of the binary results in the formation of a massive object. This object is composed of three parts, the initial white dwarf primary, a very hot pressure-supported spherical envelope, and a rotationally supported outer disk. The evolution of the system can be understood in terms of a simple analytical model where it is shown that the angular momentum carried by the mass during the transfer and stored in the disk determines the evolution of the system. 34 refs

  11. Ansys Benchmark of the Single Heater Test

    International Nuclear Information System (INIS)

    H.M. Wade; H. Marr; M.J. Anderson

    2006-01-01

    The Single Heater Test (SHT) is the first of three in-situ thermal tests included in the site characterization program for the potential nuclear waste monitored geologic repository at Yucca Mountain. The heating phase of the SHT started in August 1996 and was concluded in May 1997 after 9 months of heating. Cooling continued until January 1998, at which time post-test characterization of the test block commenced. Numerous thermal, hydrological, mechanical, and chemical sensors monitored the coupled processes in the unsaturated fractured rock mass around the heater (CRWMS M and O 1999). The objective of this calculation is to benchmark a numerical simulation of the rock mass thermal behavior against the extensive data set that is available from the thermal test. The scope is limited to three-dimensional (3-D) numerical simulations of the computational domain of the Single Heater Test and surrounding rock mass. This calculation supports the waste package thermal design methodology, and is developed by Waste Package Department (WPD) under Office of Civilian Radioactive Waste Management (OCRWM) procedure AP-3.12Q, Revision 0, ICN 3, BSCN 1, Calculations

  12. VisGraB: A Benchmark for Vision-Based Grasping. Paladyn Journal of Behavioral Robotics

    DEFF Research Database (Denmark)

    Kootstra, Gert; Popovic, Mila; Jørgensen, Jimmy Alison

    2012-01-01

    We present a database and a software tool, VisGraB, for benchmarking methods for vision-based grasping of unknown objects with no prior object knowledge. The benchmark is a combined real-world and simulated experimental setup. Stereo images of real scenes containing several objects in different ... so that a large number of grasps can be executed and evaluated while dealing with dynamics and the noise and uncertainty present in real-world images. VisGraB enables a fair comparison among different grasping methods. The user furthermore does not need to deal with robot hardware, and can focus on the vision...

  13. Shielding benchmark problems, (2)

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.

    1980-02-01

    Shielding benchmark problems prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design in the Atomic Energy Society of Japan were compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Fourteen new shielding benchmark problems are presented, in addition to the twenty-one problems proposed previously, for evaluating the calculational algorithms and accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method, and for evaluating the nuclear data used in the codes. The present benchmark problems principally address the backscattering and the streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)

  14. Electricity consumption in school buildings - benchmark and web tools; Elforbrug i skoler - benchmark og webvaerktoej

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed with the aim of facilitating schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to create increased attention to the electricity consumption of each separate school by publishing benchmarks which take the schools' age and number of pupils as well as after school activities into account. Benchmarks can be used to make green accounts and work as markers in e.g. energy conservation campaigns, energy management and for educational purposes. The internet tool can be found on www.energiguiden.dk. (BA)
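The benchmark described above normalizes electricity consumption by school characteristics. A minimal sketch of such an indicator, using hypothetical figures; the published benchmarks additionally adjust for building age and after-school activities, which is omitted here:

```python
def kwh_per_pupil(annual_kwh, pupils):
    """Simple benchmark indicator: annual electricity use per pupil (kWh)."""
    return annual_kwh / pupils

# Hypothetical figures for two schools, for illustration only.
print(kwh_per_pupil(250000, 500))  # 500.0 kWh/pupil
print(kwh_per_pupil(180000, 450))  # 400.0 kWh/pupil
```

Publishing such per-pupil figures lets each school compare itself against peers of similar size, which is the attention-raising mechanism the project relies on.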

  15. A SURPRISING DYNAMICAL MASS FOR V773 Tau B

    Energy Technology Data Exchange (ETDEWEB)

    Boden, Andrew F. [Division of Physics, Mathematics, and Astronomy, California Institute of Technology, MS 11-17, Pasadena, CA 91125 (United States); Torres, Guillermo [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Duchene, Gaspard [Division of Astronomy and Astrophysics, University of California, Berkeley, CA 94720 (United States); Konopacky, Quinn [Lawrence Livermore National Laboratory, 7000 East Avenue, Livermore, CA 94550 (United States); Ghez, A. M. [Department of Physics and Astronomy, UCLA, Los Angeles, CA 90095-1562 (United States); Torres, Rosa M. [Argelander-Institut fuer Astronomie, Universitaet Bonn, Auf dem Huegel 71, D-53121 Bonn (Germany); Loinard, Laurent [Centro de Radiostronomia y Astrofisica, Universidad Nacional Autonoma de Mexico, Apartado Postal 72-3 (Xangari), 58089 Morelia, Michoacan (Mexico)

    2012-03-01

    We report on new high-resolution imaging and spectroscopy of the multiple T Tauri star system V773 Tau over the 2003-2009 period. With these data we derive relative astrometry and photometry between the A and B components, and radial velocities (RV) of the A-subsystem components. Combining these new data with previously published astrometry and RVs, we update the relative A-B orbit model. This updated orbit model, the known system distance, and the A-subsystem parameters yield a dynamical mass for the B component for the first time. Remarkably, the derived B dynamical mass is in the range 1.7-3.0 M☉. This is much higher than previous estimates and suggests that, like A, B is also a multiple stellar system. Among these data, spatially resolved spectroscopy provides new insight into the nature of the B component. Similar to A, these near-IR spectra indicate that the dominant source in B is of mid-K spectral type. If B is in fact a multiple star system as suggested by the dynamical mass estimate, the simplest assumption is that B is composed of similar ∼1.2 M☉ pre-main-sequence stars in a close (<1 AU) binary system. This inference is supported by line-shape changes in near-IR spectroscopy of B, tentatively interpreted as changing RV among components in V773 Tau B. Relative photometry indicates that B is highly variable in the near-IR. The most likely explanation for this variability is circum-B material resulting in variable line-of-sight extinction. The distribution of this material must be significantly affected by both the putative B multiplicity and the A-B orbit.

  16. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM Cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
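The HS06 result above is derived from the seven C++ (all_cpp) benchmarks of SPEC CPU2006, run with one copy per core. A hedged sketch of a rate-style aggregation (geometric mean across benchmarks, with each benchmark's ratio scaling with the number of parallel copies); the per-copy ratios below are hypothetical, chosen only so a 4-core run lands near the reported 10.4:

```python
import math

def geometric_mean(values):
    """Geometric mean, as used by SPEC to aggregate per-benchmark ratios."""
    return math.prod(values) ** (1.0 / len(values))

def hs06_score(per_copy_ratios, copies):
    """Sketch of a rate-style HS06 aggregation: each benchmark's ratio
    scales with the number of parallel copies (one per core), and the
    score is the geometric mean across the 7 all_cpp benchmarks."""
    return geometric_mean([r * copies for r in per_copy_ratios])

# Hypothetical per-copy ratios for the 7 benchmarks (illustrative only).
ratios = [2.4, 2.7, 2.6, 2.5, 2.8, 2.6, 2.6]
print(round(hs06_score(ratios, copies=4), 1))  # 10.4
```

With identical cores this is equivalent to multiplying the per-copy geometric mean by the core count, which is why HS06 results scale roughly linearly with the number of cores.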

  17. HS06 benchmark for an ARM server

    International Nuclear Information System (INIS)

    Kluth, Stefan

    2014-01-01

    We benchmarked an ARM Cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  18. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    Research on relative performance measures, transfer pricing, beyond-budgeting initiatives, target costing, piece-rate systems and value-based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advantages. However, whereas extant research primarily has focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions under which the market mechanism performs within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element in performance management in organizations. Our study explores a particular type of external...

  19. EVA Health and Human Performance Benchmarking Study

    Science.gov (United States)

    Abercromby, A. F.; Norcross, J.; Jarvis, S. L.

    2016-01-01

    Multiple HRP Risks and Gaps require detailed characterization of human health and performance during exploration extravehicular activity (EVA) tasks; however, a rigorous and comprehensive methodology for characterizing and comparing the health and human performance implications of current and future EVA spacesuit designs does not exist. This study will identify and implement functional tasks and metrics, both objective and subjective, that are relevant to health and human performance, such as metabolic expenditure, suit fit, discomfort, suited postural stability, cognitive performance, and potentially biochemical responses for humans working inside different EVA suits doing functional tasks under the appropriate simulated reduced gravity environments. This study will provide health and human performance benchmark data for humans working in current EVA suits (EMU, Mark III, and Z2) as well as shirtsleeves using a standard set of tasks and metrics with quantified reliability. Results and methodologies developed during this test will provide benchmark data against which future EVA suits, and different suit configurations (e.g., varied pressure, mass, CG) may be reliably compared in subsequent tests. Results will also inform fitness-for-duty standards as well as design requirements and operations concepts for future EVA suits and other exploration systems.

  20. Benchmarking in Czech Higher Education

    OpenAIRE

    Plaček Michal; Ochrana František; Půček Milan

    2015-01-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expression of the need for it and the importance of benchmarking as a very suitable performance-management tool in less developed countries are the impetus for the second part of our article. Base...

  1. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  2. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark, allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark, a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently, models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of EarthScope. More long-lived seafloor geodetic measurements are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia Subduction Zone.

  3. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  4. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted by comparing different rotor geometries and solutions while keeping the comparison as fair as possible. The objective of the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize the blade solicitation and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts; this shape is considered as a fixed parameter in the benchmarking...

  5. Towards dynamic reference information models: Readiness for ICT mass customisation

    NARCIS (Netherlands)

    Verdouw, C.N.; Beulens, A.J.M.; Trienekens, J.H.; Verwaart, D.

    2010-01-01

    Current dynamic demand-driven networks make great demands on, in particular, the interoperability and agility of information systems. This paper investigates how reference information models can be used to meet these demands by enhancing ICT mass customisation. It was found that reference models for

  6. Depollution benchmarks for capacitors, batteries and printed wiring boards from waste electrical and electronic equipment (WEEE)

    International Nuclear Information System (INIS)

    Savi, Daniel; Kasser, Ueli; Ott, Thomas

    2013-01-01

    Highlights: • We have analysed data on the dismantling of electronic and electrical appliances. • Ten years of mass balance data from more than 30 recycling companies have been considered. • Percentage shares of dismantled batteries, capacitors and PWBs have been studied. • Threshold values and benchmarks for batteries and capacitors have been identified. • No benchmark for the dismantling of printed wiring boards should be set. - Abstract: The article compiles and analyses sample data for toxic components removed from waste electrical and electronic equipment (WEEE) by more than 30 recycling companies in Switzerland over the past ten years. According to European and Swiss legislation, toxic components such as batteries, capacitors and printed wiring boards have to be removed from WEEE. The control bodies of the Swiss take-back schemes have been monitoring the activities of WEEE recyclers in Switzerland for about 15 years. All recyclers have to provide annual mass balance data for every year of operation. From these data, percentage shares of removed batteries and capacitors are calculated in relation to the amount of each respective WEEE category treated. A rationale is developed for why such an indicator should not be calculated for printed wiring boards. The distributions of these de-pollution indicators are analysed, and their suitability for defining lower threshold values and benchmarks for the depollution of WEEE is discussed. Recommendations for benchmarks and threshold values for the removal of capacitors and batteries are given.
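The de-pollution indicator described in the abstract (mass of a removed toxic component as a share of the mass of the WEEE category treated) can be sketched as follows. The figures and the threshold value are hypothetical illustrations, not values from the study:

```python
def depollution_indicator(removed_kg, category_input_kg):
    """Share (in %) of a toxic component removed, relative to the mass
    of the WEEE category treated in the same reporting period."""
    return 100.0 * removed_kg / category_input_kg

# Hypothetical annual mass-balance figures for one recycler and one
# WEEE category; the 1.5% threshold is illustrative only.
batteries_pct = depollution_indicator(removed_kg=1800, category_input_kg=100000)
threshold_pct = 1.5
status = "ok" if batteries_pct >= threshold_pct else "below threshold"
print(f"{batteries_pct:.1f}% removed -> {status}")  # 1.8% removed -> ok
```

A recycler falling below the threshold would warrant closer inspection by the control bodies, which is how such benchmarks are typically used in practice.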

  7. Depollution benchmarks for capacitors, batteries and printed wiring boards from waste electrical and electronic equipment (WEEE)

    Energy Technology Data Exchange (ETDEWEB)

    Savi, Daniel, E-mail: d.savi@umweltchemie.ch [Dipl. Environmental Sci. ETH, büro für umweltchemie, Zurich (Switzerland); Kasser, Ueli [Lic. Phil. Nat. (Chemist), büro für umweltchemie, Zurich (Switzerland); Ott, Thomas [Dipl. Phys. ETH, Institute of Applied Simulation, Zurich University of Applied Sciences, Wädenswil (Switzerland)

    2013-12-15

    Highlights: • We have analysed data on the dismantling of electronic and electrical appliances. • Ten years of mass balance data from more than 30 recycling companies have been considered. • Percentage shares of dismantled batteries, capacitors and PWBs have been studied. • Threshold values and benchmarks for batteries and capacitors have been identified. • No benchmark for the dismantling of printed wiring boards should be set. - Abstract: The article compiles and analyses sample data for toxic components removed from waste electrical and electronic equipment (WEEE) by more than 30 recycling companies in Switzerland over the past ten years. According to European and Swiss legislation, toxic components such as batteries, capacitors and printed wiring boards have to be removed from WEEE. The control bodies of the Swiss take-back schemes have been monitoring the activities of WEEE recyclers in Switzerland for about 15 years. All recyclers have to provide annual mass balance data for every year of operation. From these data, percentage shares of removed batteries and capacitors are calculated in relation to the amount of each respective WEEE category treated. A rationale is developed for why such an indicator should not be calculated for printed wiring boards. The distributions of these de-pollution indicators are analysed, and their suitability for defining lower threshold values and benchmarks for the depollution of WEEE is discussed. Recommendations for benchmarks and threshold values for the removal of capacitors and batteries are given.

  8. VVER-1000 MOX core computational benchmark

    International Nuclear Information System (INIS)

    2006-01-01

    The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities includes benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the

  9. Adaptive unified continuum FEM modeling of a 3D FSI benchmark problem.

    Science.gov (United States)

    Jansson, Johan; Degirmenci, Niyazi Cem; Hoffman, Johan

    2017-09-01

    In this paper, we address a 3D fluid-structure interaction benchmark problem that represents important characteristics of biomedical modeling. We present a goal-oriented adaptive finite element methodology for incompressible fluid-structure interaction based on a streamline diffusion-type stabilization of the balance equations for mass and momentum for the entire continuum in the domain, which is implemented in the Unicorn/FEniCS software framework. A phase marker function and its corresponding transport equation are introduced to select the constitutive law, where the mesh tracks the discontinuous fluid-structure interface. This results in a unified simulation method for fluids and structures. We present detailed results for the benchmark problem compared with experiments, together with a mesh convergence study. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Shielding benchmark problems

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.

    1978-09-01

    Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithms and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method, and for evaluating the nuclear data used in the codes. (author)

  11. Gas-cooled fast reactor benchmarks for JNC and CEA neutronic tools assessment

    International Nuclear Information System (INIS)

    Rimpault, G.; Sugino, K.; Hayashi, H.

    2005-01-01

    In order to verify the adequacy of JNC and CEA computational tools for the definition of GCFR (gas-cooled fast reactor) core characteristics, GCFR neutronic benchmarks have been performed. The benchmarks have been carried out on two different cores: 1) a conventional Gas-Cooled fast Reactor (EGCR) core with pin-type fuel, and 2) an innovative He-cooled Coated-Particle Fuel (CPF) core. Core characteristics being studied include: -) Criticality (effective multiplication factor, or K-effective), -) Instantaneous breeding gain (BG), -) Core Doppler effect, and -) Coolant depressurization reactivity. K-effective and coolant depressurization reactivity at the EOEC (End Of Equilibrium Cycle) state were calculated, since these values are the most critical characteristics in the core design. In order to check the influence of differences between the depletion calculation systems, a simple depletion calculation benchmark was performed. Values such as: -) burnup reactivity loss, -) mass balance of heavy metals and fission products (FP) were calculated. Results for the core design characteristics calculated by the JNC and CEA sides agree quite satisfactorily in terms of core conceptual design study. Potential features for improving the GCFR computational tools were discovered during the course of this benchmark, such as the way to accurately calculate the breeding gain. Different ways to improve the accuracy of the calculations have also been identified. In particular, investigation of nuclear data for steel is important for the EGCR and of lumped fission products in both cores. The outcome of this benchmark is already satisfactory and will help to design GCFR cores more precisely. (authors)

  12. Resonance Ionization Mass Spectrometry (RIMS): applications in spectroscopy and chemical dynamics

    International Nuclear Information System (INIS)

    Naik, P.D.; Kumar, Awadhesh; Upadhyaya, Hari; Bajaj, P.N.

    2009-01-01

    Resonance ionization is a photophysical process wherein electromagnetic radiation is used to ionize atoms, molecules, transient species, etc., by exciting them through their quantum states. The number of photons required to ionize depends on the species being investigated and energy of the photon. Once a charged particle is produced, it is easy to detect it with high efficiency. With the advent of narrow band high power pulsed and cw tunable dye lasers, it has blossomed into a powerful spectroscopic and analytical technique, commonly known as resonance ionization spectroscopy (RIS)/resonance enhanced multiphoton ionization (REMPI). The alliance of resonance ionization with mass spectrometry has grown into a still more powerful technique, known as resonance ionization mass spectrometry (RIMS), which has made significant contributions in a variety of frontier areas of research and development, such as spectroscopy, chemical dynamics, analytical chemistry, cluster science, surface science, radiochemistry, nuclear physics, biology, environmental science, material science, etc. In this article, we shall describe the application of resonance ionization mass spectrometry to spectroscopy of uranium and chemical dynamics of polyatomic molecules

  13. Exploring effects of strong interactions in enhancing masses of dynamical origin

    International Nuclear Information System (INIS)

    Cabo Montes de Oca, Alejandro

    2011-01-01

    A previous study of the dynamical generation of masses in massless QCD is considered from another viewpoint. The quark mass is assumed to have a dynamical origin and is substituted for by a scalar field without self-interaction. The potential for the new field background is evaluated up to two loops. Expressing the running coupling in terms of the scale parameter μ, the potential minimum is chosen to fix m_top = 175 GeV when μ₀ = 498 MeV. The second derivative of the potential predicts a scalar field mass of 126.76 GeV. This number is close to the value 114 GeV, which preliminary data taken at CERN suggested to be associated with the Higgs particle. However, the simplifying assumptions limit the validity of the calculations done, as indicated by the large value of α = g²/(4π) = 1.077 obtained. Nevertheless, supporting statements about the possibility of improving the scheme come from the necessary inclusion of weak and scalar field couplings and mass counterterms in the renormalization procedure, in common with the seemingly needed consideration of the massive W and Z fields, if the real conditions of the SM model are intended to be approached. (orig.)

  14. Dynamical Masses of Cool White Dwarfs in Double-Degenerate Visual Binaries

    Science.gov (United States)

    Bond, Howard E.; Nelan, E. P.; Schaefer, G.

    2014-01-01

    The cool white dwarfs (WDs) WD 1639+153 and WD 1818+126 were originally resolved into close visual binaries containing two WDs each during a survey with the Hubble Space Telescope (HST) and its Fine Guidance Sensors (FGS). Follow-up FGS observations of these two double-degenerate (DD) systems, along with the previously known DD G 107-70, have yielded the orbital elements of all three visual binaries. We find orbital periods of 3.88 yr, 12.19 yr, and 18.84 yr for WD 1639+153, WD 1818+126, and G 107-70, respectively. Moreover, for each of the systems we have been observing nearby field stars with FGS1r in POS mode to determine the local inertial reference frame, from which we obtain the parallax and proper motion of the DD, along with the motion of each WD about its system barycenter. This leads directly to a dynamical mass for each WD. We have also used HST STIS observations to obtain individual spectra of each of the six WDs, which provide the effective temperature and subclass of each WD. This provides insight into the cooling age of each star. From the cooling ages and dynamical masses, we obtain constraints on the initial-mass/final-mass relation for WD stars.
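Dynamical masses from visual-binary orbits ultimately rest on Kepler's third law. A minimal sketch in solar units; the record quotes only the orbital periods, so the semi-major axis used below is a hypothetical value for illustration:

```python
def kepler_total_mass(a_au, period_yr):
    """Kepler's third law in solar units: M1 + M2 = a^3 / P^2,
    with a in AU, P in years, and mass in solar masses."""
    return a_au**3 / period_yr**2

# The 3.88 yr period of WD 1639+153 is from the record; the 3 AU
# semi-major axis is assumed here purely for illustration.
total = kepler_total_mass(a_au=3.0, period_yr=3.88)
print(f"M1 + M2 ~ {total:.2f} Msun")
```

Measuring each WD's motion about the barycenter, as the FGS observations do, further splits this total into the two individual masses via the mass ratio.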

  15. Benchmarking von Krankenhausinformationssystemen – eine vergleichende Analyse deutschsprachiger Benchmarkingcluster

    Directory of Open Access Journals (Sweden)

    Jahn, Franziska

    2015-08-01

    Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information system's and information management's costs, performance and efficiency against other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme covers both the general conditions and the examined contents of the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing the costs and quality of application systems, physical data processing systems, organizational structures of information management, and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and the quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.

  16. Medical school benchmarking - from tools to programmes.

    Science.gov (United States)

    Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T

    2015-02-01

    Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.

  17. CONFIRMATION OF SMALL DYNAMICAL AND STELLAR MASSES FOR EXTREME EMISSION LINE GALAXIES AT z ∼ 2

    Energy Technology Data Exchange (ETDEWEB)

    Maseda, Michael V.; Van der Wel, Arjen; Da Cunha, Elisabete; Rix, Hans-Walter [Max-Planck-Institut für Astronomie, Königstuhl 17, D-69117 Heidelberg (Germany); Pacifici, Camilla [Yonsei University Observatory, Yonsei University, Seoul 120-749 (Korea, Republic of); Momcheva, Ivelina; Van Dokkum, Pieter; Nelson, Erica J. [Department of Astronomy, Yale University, New Haven, CT 06520 (United States); Brammer, Gabriel B.; Grogin, Norman A.; Koekemoer, Anton M. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Franx, Marijn; Fumagalli, Mattia; Patel, Shannon G. [Leiden Observatory, Leiden University, Leiden (Netherlands); Bell, Eric F. [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48109 (United States); Kocevski, Dale D. [Department of Physics and Astronomy, University of Kentucky, Lexington, KY 40506 (United States); Lundgren, Britt F. [Department of Astronomy, University of Wisconsin, 475 N Charter Street, Madison, WI 53706 (United States); Marchesini, Danilo [Physics and Astronomy Department, Tufts University, Robinson Hall, Room 257, Medford, MA 02155 (United States); Skelton, Rosalind E. [South African Astronomical Observatory, P.O. Box 9, Observatory 7935 (South Africa); Straughn, Amber N., E-mail: maseda@mpia.de [Astrophysics Science Division, Goddard Space Flight Center, Code 665, Greenbelt, MD 20771 (United States); and others

    2013-11-20

    Spectroscopic observations from the Large Binocular Telescope and the Very Large Telescope reveal kinematically narrow lines (∼50 km s⁻¹) for a sample of 14 extreme emission line galaxies at redshifts 1.4 < z < 2.3. These measurements imply that the total dynamical masses of these systems are low (≲ 3 × 10⁹ M☉). Their large [O III] λ5007 equivalent widths (500-1100 Å) and faint blue continuum emission imply young ages of 10-100 Myr and stellar masses of 10⁸-10⁹ M☉, confirming the presence of a violent starburst. The dynamical masses represent the first such determinations for low-mass galaxies at z > 1. The stellar mass formed in this vigorous starburst phase represents a large fraction of the total (dynamical) mass, without a significantly massive underlying population of older stars. The occurrence of such intense events in shallow potentials strongly suggests that supernova-driven winds must be of critical importance in the subsequent evolution of these systems.
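The mass limit quoted above follows from a standard virial estimate, M_dyn ∼ C σ² R / G. The sketch below reproduces the order of magnitude; the ∼1 kpc radius and the virial coefficient are illustrative assumptions, not values taken from the paper.

```python
# Order-of-magnitude dynamical mass from an emission-line width,
# M_dyn ~ C * sigma^2 * R / G (virial estimate). The radius and the
# virial coefficient below are illustrative assumptions only.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

def dynamical_mass(sigma_kms, radius_kpc, virial_coeff=5.0):
    """Virial mass estimate in solar masses."""
    sigma = sigma_kms * 1e3          # km/s -> m/s
    radius = radius_kpc * KPC        # kpc -> m
    return virial_coeff * sigma**2 * radius / G / M_SUN

# A ~50 km/s line width and an assumed ~1 kpc radius give a mass of
# order 10^9 solar masses, consistent with the quoted upper limit.
m_dyn = dynamical_mass(50.0, 1.0)
```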

  18. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for developing more accurate cross-section libraries, improving radiation transport modeling in computer codes, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling the more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper, benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC).

  19. A CFD benchmarking exercise based on flow mixing in a T-junction

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B.L., E-mail: brian.smith@psi.ch [Thermal Hydraulics Laboratory, Nuclear Energy and Safety Department, Paul Scherrer Institut, CH-5232 Villigen PSI (Switzerland); Mahaffy, J.H. [Wheelsmith Farm, Spring Mill, PA (United States); Angele, K. [Vattenfall R and D, Älvkarleby (Sweden)

    2013-11-15

    The paper describes an international benchmarking exercise, sponsored by the OECD Nuclear Energy Agency (NEA), aimed at testing the ability of state-of-the-art computational fluid dynamics (CFD) codes to predict the important fluid flow parameters affecting high-cycle thermal fatigue induced by turbulent mixing in T-junctions. The results from numerical simulations are compared to measured data from an experiment performed at 1:2 scale by Vattenfall Research and Development, Älvkarleby, Sweden. The test data were released only at the end of the exercise making this a truly blind CFD-validation benchmark. Details of the organizational procedures, the experimental set-up and instrumentation, the different modeling approaches adopted, synthesis of results, and overall conclusions and perspectives are presented.

  20. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
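The sensitivity difference between the two means is easy to demonstrate numerically. The query times below are hypothetical, chosen only to show why a geometric mean rewards optimizing the fastest queries in a way an arithmetic mean does not.

```python
import math

# Hypothetical per-query times (seconds) for a decision-support run.
times = [1.0, 2.0, 4.0, 800.0]

arith = sum(times) / len(times)
geo = math.exp(sum(math.log(t) for t in times) / len(times))

# Halving only the fastest query barely moves the arithmetic mean
# (dominated by the 800 s query) but cuts the geometric mean by
# a factor of 0.5**(1/4), about 16%.
tweaked = [0.5, 2.0, 4.0, 800.0]
arith2 = sum(tweaked) / len(tweaked)
geo2 = math.exp(sum(math.log(t) for t in tweaked) / len(tweaked))
```

This asymmetry is the crux of the TPC-D debate the paper describes: under a geometric mean, a vendor gains more by speeding up already-fast queries than by attacking the slow ones that dominate total elapsed time.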

  1. Landscape seasons and air mass dynamics in Latvia

    International Nuclear Information System (INIS)

    Krauklis, A.; Draveniece, A.

    2004-01-01

    Latvia is located in the middle of an area where the boreal and nemoral zones and the regions of oceanic and continental climate meet, and it was studied as a model territory of the most typical variation of the boreo-nemoral ecotone. The subject of this study was the seasonal dynamics of the state of landscapes and the diachronous links between seasons. It was found that landscapes undergo 12 seasonal states or seasons during the annual cycle of insolation and air mass occurrence. Each season may be distinguished by a definite amount of solar radiation, a distinctive state of heat and water balance, the phenological state of vegetation, and a distinctive occurrence of different air mass types and their particular 'association'. During each season these variables show a particular combination of numerical values and a distinctive landscape pattern.

  2. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used in services across the National Health Service (NHS) using various benchmarking programs. Clinical photography services do not have a program in place and services have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  3. Analysis of CSNI benchmark test on containment using the code CONTRAN

    International Nuclear Information System (INIS)

    Haware, S.K.; Ghosh, A.K.; Raj, V.V.; Kakodkar, A.

    1994-01-01

    A programme of experimental as well as analytical studies on the behaviour of nuclear reactor containment is being actively pursued. A large number of experiments on pressure and temperature transients have been carried out on a one-tenth scale model vapour suppression pool containment experimental facility, simulating the 220 MWe Indian Pressurised Heavy Water Reactors. A programme of development of computer codes is underway to enable prediction of containment behaviour under accident conditions. This includes codes for pressure and temperature transients, hydrogen behaviour, aerosol behaviour, etc. As a part of this ongoing work, the code CONTRAN (CONtainment TRansient ANalysis) has been developed for predicting the thermal hydraulic transients in a multicompartment containment. For the assessment of the hydrogen behaviour, the models for hydrogen transportation in a multicompartment configuration and hydrogen combustion have been incorporated in the code CONTRAN. The code also has models for the heat and mass transfer due to condensation and convection heat transfer. The structural heat transfer is modeled using the one-dimensional transient heat conduction equation. Extensive validation exercises have been carried out with the code CONTRAN. The code CONTRAN has been successfully used for the analysis of the benchmark test devised by the Committee on the Safety of Nuclear Installations (CSNI) of the Organisation for Economic Cooperation and Development (OECD), to test the numerical accuracy and convergence errors in the computation of mass and energy conservation for the fluid and in the computation of heat conduction in structural walls. The salient features of the code CONTRAN, a description of the CSNI benchmark test and a comparison of the CONTRAN predictions with the benchmark test results are presented and discussed in the paper. (author)

  4. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless...... not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortia partners. The project has been funded by the Danish Agency for Trade and Industry...

  5. OECD/NRC BWR Turbine Trip Transient Benchmark as a Basis for Comprehensive Qualification and Studying Best-Estimate Coupled Codes

    International Nuclear Information System (INIS)

    Ivanov, Kostadin; Olson, Andy; Sartori, Enrico

    2004-01-01

    An Organisation for Economic Co-operation and Development (OECD)/U.S. Nuclear Regulatory Commission (NRC)-sponsored coupled-code benchmark has been initiated for a boiling water reactor (BWR) turbine trip (TT) transient. Turbine trip transients in a BWR are pressurization events in which the coupling between core space-dependent neutronic phenomena and system dynamics plays an important role. In addition, the available real plant experimental data make this benchmark problem very valuable. Over the course of defining and coordinating the BWR TT benchmark, a systematic approach has been established to validate best-estimate coupled codes. This approach employs a multilevel methodology that not only allows for a consistent and comprehensive validation process but also contributes to the study of different numerical and computational aspects of coupled best-estimate simulations. This paper provides an overview of the OECD/NRC BWR TT benchmark activities with emphasis on the discussion of the numerical and computational aspects of the benchmark

  6. SUMMARY OF GENERAL WORKING GROUP A+B+D: CODES BENCHMARKING.

    Energy Technology Data Exchange (ETDEWEB)

    WEI, J.; SHAPOSHNIKOVA, E.; ZIMMERMANN, F.; HOFMANN, I.

    2006-05-29

    Computer simulation is an indispensable tool in assisting the design, construction, and operation of accelerators. In particular, computer simulation complements analytical theories and experimental observations in understanding beam dynamics in accelerators. The ultimate function of computer simulation is to study mechanisms that limit the performance of frontier accelerators. There are four goals for the benchmarking of computer simulation codes, namely debugging, validation, comparison and verification: (1) Debugging--codes should calculate what they are supposed to calculate; (2) Validation--results generated by the codes should agree with established analytical results for specific cases; (3) Comparison--results from two sets of codes should agree with each other if the models used are the same; and (4) Verification--results from the codes should agree with experimental measurements. This is the summary of the joint session among working groups A, B, and D of the HI32006 Workshop on computer codes benchmarking.

  7. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-11-01

    This paper reviews the role of human resource management (HRM) which, today, plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking in HRM. It concludes with some suggestions for a plan of action. The process of identifying “best” practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  8. Integrating Best Practice and Performance Indicators To Benchmark the Performance of a School System. Benchmarking Paper 940317.

    Science.gov (United States)

    Cuttance, Peter

    This paper provides a synthesis of the literature on the role of benchmarking, with a focus on its use in the public sector. Benchmarking is discussed in the context of quality systems, of which it is an important component. The paper describes the basic types of benchmarking, pertinent research about its application in the public sector, the…

  9. Dynamical mass generation in the continuum Thirring model

    International Nuclear Information System (INIS)

    Girardello, L.; Immirzi, G.; Rossi, P.; Massachusetts Inst. of Tech., Cambridge; Massachusetts Inst. of Tech., Cambridge

    1982-01-01

    We study the renormalization of the Thirring model in the neighbourhood of μ = 0, g = -π/2, and find that on the trajectory which tends to this point when the scale goes to infinity, the behaviour of the model reproduces what one obtains by decomposing the N = 2 Gross-Neveu model. The existence of this trajectory is consistent with the dynamical mass generation found by McCoy and Wu in the discrete version of the massless model. (orig.)

  10. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable...... tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly ‘sustainable transport......’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly policies are not directly comparable across space and context. For these reasons attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which...

  11. Combining Rosetta with molecular dynamics (MD): A benchmark of the MD-based ensemble protein design.

    Science.gov (United States)

    Ludwiczak, Jan; Jarmula, Adam; Dunin-Horkawicz, Stanislaw

    2018-07-01

    Computational protein design is a set of procedures for computing amino acid sequences that will fold into a specified structure. Rosetta Design, a commonly used software for protein design, allows for the effective identification of sequences compatible with a given backbone structure, while molecular dynamics (MD) simulations can thoroughly sample near-native conformations. We benchmarked a procedure in which Rosetta design is started on MD-derived structural ensembles and showed that such a combined approach generates 20-30% more diverse sequences than currently available methods with only a slight increase in computation time. Importantly, the increase in diversity is achieved without a loss in the quality of the designed sequences assessed by their resemblance to natural sequences. We demonstrate that the MD-based procedure is also applicable to de novo design tasks started from backbone structures without any sequence information. In addition, we implemented a protocol that can be used to assess the stability of designed models and to select the best candidates for experimental validation. In sum, our results demonstrate that the MD ensemble-based flexible backbone design can be a viable method for protein design, especially for tasks that require a large pool of diverse sequences. Copyright © 2018 Elsevier Inc. All rights reserved.
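The diversity gain reported above presupposes some way of scoring how different a set of designed sequences is. One simple metric of this kind (a toy illustration, not the measure used in the paper) is one minus the mean pairwise sequence identity over all designs, assuming the designs share one backbone and are therefore the same length:

```python
from itertools import combinations

# Toy diversity metric for a set of designed sequences: 1 minus the
# mean pairwise sequence identity. Sequences are assumed aligned
# (equal length), as designs for a single backbone would be.
def diversity(seqs):
    pairs = list(combinations(seqs, 2))
    if not pairs:
        return 0.0
    def identity(a, b):
        return sum(x == y for x, y in zip(a, b)) / len(a)
    return 1.0 - sum(identity(a, b) for a, b in pairs) / len(pairs)

# Hypothetical designs: a fixed backbone tends to yield near-identical
# sequences, while an MD ensemble admits more varied ones.
fixed_backbone = ["MKTAYIA", "MKTAYIA", "MKTLYIA"]
md_ensemble    = ["MKTAYIA", "MRTLFIA", "AKSAYLA"]
```

A larger value means a more diverse pool; here `diversity(md_ensemble)` exceeds `diversity(fixed_backbone)`.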

  12. RELAP5/MOD2 benchmarking study: Critical heat flux under low-flow conditions

    International Nuclear Information System (INIS)

    Ruggles, E.; Williams, P.T.

    1990-01-01

    Experimental studies by Mishima and Ishii performed at Argonne National Laboratory and subsequent experimental studies performed by Mishima and Nishihara have investigated the critical heat flux (CHF) for low-pressure low-mass flux situations where low-quality burnout may occur. These flow situations are relevant to long-term decay heat removal after a loss of forced flow. The transition from burnout at high quality to burnout at low quality causes very low burnout heat flux values. Mishima and Ishii postulated a model for the low-quality burnout based on flow regime transition from churn turbulent to annular flow. This model was validated by both flow visualization and burnout measurements. Griffith et al. also studied CHF in low mass flux, low-pressure situations and correlated data for upflows, counter-current flows, and downflows with the local fluid conditions. A RELAP5/MOD2 CHF benchmarking study was carried out investigating the performance of the code for low-flow conditions. Data from the experimental study by Mishima and Ishii were the basis for the benchmark comparisons

  13. Prospects for detecting supersymmetric dark matter at Post-LEP benchmark points

    International Nuclear Information System (INIS)

    Ellis, J.; Matchev, K.T.; Feng, J.L.; Ferstl, A.; Olive, K.A.

    2002-01-01

    A new set of supersymmetric benchmark scenarios has recently been proposed in the context of the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking masses, taking into account the constraints from LEP, b→sγ and g μ -2. These points have previously been used to discuss the physics reaches of different accelerators. In this paper, we discuss the prospects for discovering supersymmetric dark matter in these scenarios. We consider direct detection through spin-independent and spin-dependent nuclear scattering, as well as indirect detection through relic annihilations to neutrinos, photons, and positrons. We find that several of the benchmark scenarios offer good prospects for direct detection via spin-independent nuclear scattering and indirect detection via muons produced by neutrinos from relic annihilations inside the Sun, and some models offer good prospects for detecting photons from relic annihilations in the galactic centre. (orig.)

  14. Influence of foundation mass and surface roughness on dynamic response of beam on dynamic foundation subjected to the moving load

    Science.gov (United States)

    Tran Quoc, Tinh; Khong Trong, Toan; Luong Van, Hai

    2018-04-01

    In this paper, the Improved Moving Element Method (IMEM) is used to analyze the dynamic response of Euler-Bernoulli beam structures on a dynamic foundation model subjected to a moving load. The effects of the characteristic foundation model parameters are considered, such as the Winkler stiffness, the shear layer of the Pasternak model, the viscoelastic dashpot, and the characteristic foundation mass parameter. Beams are modeled by moving elements while the load is fixed. Based on the principle of virtual work and the theory of the moving element method, the differential equation of motion of the system is established and solved by numerical integration based on the Newmark algorithm. The influence of the foundation mass and of the roughness of the beam surface on the dynamic response of the beam is examined in detail.
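The Newmark time-stepping mentioned above can be sketched on a single-degree-of-freedom oscillator. This is only a minimal illustration of the scheme (average acceleration, β = 1/4, γ = 1/2); the paper applies it to the assembled beam-foundation system, not to this toy problem.

```python
import math

# Newmark integration (average acceleration, beta=1/4, gamma=1/2)
# for a single-DOF linear oscillator m*x'' + c*x' + k*x = f(t).
def newmark(m, c, k, f, x0, v0, dt, nsteps, beta=0.25, gamma=0.5):
    x, v = x0, v0
    a = (f(0.0) - c * v - k * x) / m               # initial acceleration
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    for n in range(nsteps):
        t1 = (n + 1) * dt
        # Effective load from the previous state plus the new force.
        rhs = (f(t1)
               + m * (x / (beta * dt**2) + v / (beta * dt)
                      + (1 / (2 * beta) - 1) * a)
               + c * (gamma * x / (beta * dt) + (gamma / beta - 1) * v
                      + dt * (gamma / (2 * beta) - 1) * a))
        x_new = rhs / keff
        a_new = (x_new - x - dt * v - dt**2 * (0.5 - beta) * a) / (beta * dt**2)
        v = v + dt * ((1 - gamma) * a + gamma * a_new)
        x, a = x_new, a_new
    return x, v

# Undamped free vibration with a 1 s period: after one full period the
# displacement should return very close to its initial value, since the
# average-acceleration variant is unconditionally stable and
# non-dissipative.
x_end, v_end = newmark(m=1.0, c=0.0, k=(2 * math.pi)**2,
                       f=lambda t: 0.0, x0=1.0, v0=0.0,
                       dt=1e-3, nsteps=1000)
```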

  15. Dynamics of laser mass-limited foil interaction at ultra-high laser intensities

    Energy Technology Data Exchange (ETDEWEB)

    Yu, T. P., E-mail: tongpu@nudt.edu.cn [College of Science, National University of Defense Technology, Changsha 410073 (China); State Key Laboratory of High Performance Computing, National University of Defense Technology, Changsha 410073 (China); Sheng, Z. M. [Key Laboratory for Laser Plasmas (MoE) and Department of Physics, Shanghai Jiao Tong University, Shanghai 200240 (China); SUPA, Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); Yin, Y.; Zhuo, H. B.; Ma, Y. Y.; Shao, F. Q. [College of Science, National University of Defense Technology, Changsha 410073 (China); Pukhov, A. [Institut für Theoretische Physik I, Heinrich-Heine-Universität Düsseldorf, 40225 Düsseldorf (Germany)

    2014-05-15

    By using three-dimensional particle-in-cell simulations with synchrotron radiation damping incorporated, the dynamics of ultra-intense laser driven mass-limited foils is presented. When a circularly polarized laser pulse with a peak intensity of ∼10²² W/cm² irradiates a mass-limited nanofoil, electrons are pushed forward collectively and a strong charge separation field forms which acts as a “light sail” and accelerates the protons. When the laser wing parts overtake the foil from the foil boundaries, electrons undergo a betatron-like oscillation around the central proton bunch. Under some conditions, betatron-like resonance takes place, resulting in energetic circulating electrons. Finally, bright femtosecond x rays are emitted in a small cone. It is also shown that the radiation damping does not alter the foil dynamics radically at the considered laser intensities. The effects of the transverse foil size and laser polarization on x-ray emission and foil dynamics are also discussed.

  16. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  17. Benchmarking for controllers: methods, techniques and possibilities

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Sandalgaard, Niels Erik; Dietrichson, Lars Grubbe

    2008-01-01

    Benchmarking enters into the management practice of both private and public organizations in many ways. In management accounting, benchmark-based indicators (or key ratios) are used, for example when setting targets in performance contracts or to specify the desired level of certain key ratios in a Balanced Scorecard or similar performance management models. The article explains the concept of benchmarking by presenting and discussing its different facets, and describes four different applications of benchmarking in order to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project. It then treats the difference between results benchmarking and process benchmarking, followed by the use of internal versus external benchmarking, and the use of benchmarking in budgeting and budget follow-up.

  18. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic...... controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based...... on archival, publicly disclosed, professional performance data for 191 German orthopedics departments, matched with survey data on bureaucratic benchmarking information given to chief orthopedists by the administration. We find a positive association between bureaucratic benchmarking information provision...

  19. SSI and structural benchmarks

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1987-01-01

    This paper presents the latest results of the ongoing program entitled, Standard Problems for Structural Computer Codes, currently being worked on at BNL for the USNRC, Office of Nuclear Regulatory Research. During FY 1986, efforts were focussed on three tasks, namely, (1) an investigation of ground water effects on the response of Category I structures, (2) the Soil-Structure Interaction Workshop and (3) studies on structural benchmarks associated with Category I structures. The objective of the studies on ground water effects is to verify the applicability and the limitations of the SSI methods currently used by the industry in performing seismic evaluations of nuclear plants which are located at sites with high water tables. In a previous study by BNL (NUREG/CR-4588), it has been concluded that the pore water can influence significantly the soil-structure interaction process. This result, however, is based on the assumption of fully saturated soil profiles. Consequently, the work was further extended to include cases associated with variable water table depths. In this paper, results related to cut-off depths beyond which the pore water effects can be ignored in seismic calculations, are addressed. Comprehensive numerical data are given for soil configurations typical to those encountered in nuclear plant sites. These data were generated by using a modified version of the SLAM code which is capable of handling problems related to the dynamic response of saturated soils. Further, the paper presents some key aspects of the Soil-Structure Interaction Workshop (NUREG/CP-0054) which was held in Bethesda, MD on June 1, 1986. Finally, recent efforts related to the task on the structural benchmarks are described

  20. Investigation of dynamic characteristics of shells with holes and added mass

    Directory of Open Access Journals (Sweden)

    Seregin Sergey Valer’evich

    2014-04-01

    Thin cylindrical shells are widely used in construction, engineering and other industries. When designing a reservoir for the isothermal storage of liquefied gases, cases are inevitable in which the housing requires various technical holes. A pointwise added mass appears in practice in the form of suspended spotlights, radar, and architectural inclusions in buildings and structures of various purposes. It is known that dynamic asymmetry, such as an initially irregular geometric shape, including holes, and the added mass, leads to specific effects in shells. In the paper, the impact of a cutout on the frequencies and modes of natural vibrations of thin circular cylindrical shells is examined theoretically with the help of the equations of linear shallow shell theory. For the modal equations with Navier boundary conditions, the Bubnov-Galerkin method was used. The authors derive a formula for finding the lowest of the split frequencies of vibration of a shell with a cutout. It is shown that, with an appropriate choice of the added mass value, the lower frequencies are comparable with those of a shell with a hole. Using numerical and experimental modeling and the finite element method in the MSC Nastran environment, the oscillation frequencies of a shell carrying a concentrated mass and of a shell with a cutout were compared. It is shown that the results of the dynamic analysis of shells with holes, with a suitable choice of the attached mass value, are comparable with the results of the analysis of shells carrying a point mass. It was concluded that the edges of the holes significantly affect the reduction in the lowest frequency and need to be strengthened.
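The mass/hole equivalence described above can be illustrated with a generic single-mode Rayleigh estimate: a point mass m attached where the mode shape has unit amplitude lowers a natural frequency f0 (with modal mass M) to roughly f0 / sqrt(1 + m/M). This is a textbook approximation, not the paper's shallow-shell derivation, but it shows how an attached mass can be tuned to reproduce a given frequency reduction.

```python
import math

# Single-mode Rayleigh estimate for a point mass attached at a point
# of unit modal amplitude. Generic illustration only; the paper works
# with the full shallow-shell equations.
def freq_with_point_mass(f0, modal_mass, attached_mass):
    return f0 / math.sqrt(1.0 + attached_mass / modal_mass)

def mass_matching_frequency(f0, modal_mass, f_target):
    """Attached mass that reproduces a given reduced frequency f_target."""
    return modal_mass * ((f0 / f_target)**2 - 1.0)

# Hypothetical numbers: if a cutout lowers a 100 Hz mode to 95 Hz, the
# equivalent attached mass is about 0.108 times the modal mass.
m_eq = mass_matching_frequency(100.0, 1.0, 95.0)
```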

  1. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods in EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...
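The core of the BMD method is fitting a dose-response model to study data and inverting it at a chosen benchmark response (BMR), often 10% extra risk. The sketch below uses a simple quantal-linear dichotomous model, for which the BMD has a closed form; the parameter values are made up for illustration, whereas BMDS estimates them from data across several candidate models.

```python
import math

# Quantal-linear dose-response model: P(d) = g + (1 - g)*(1 - exp(-b*d)),
# where g is the background response rate. The benchmark dose (BMD) is
# the dose at which the extra risk (P(d) - P(0)) / (1 - P(0)) equals the
# benchmark response (BMR). Parameter values here are illustrative.
def response(d, g, b):
    return g + (1.0 - g) * (1.0 - math.exp(-b * d))

def benchmark_dose(bmr, b):
    # Extra risk for this model is 1 - exp(-b*d), so the BMD inverts
    # in closed form; more complex models require numerical root-finding.
    return -math.log(1.0 - bmr) / b

g, b = 0.05, 0.02            # background rate and slope (made up)
bmd = benchmark_dose(0.10, b)   # dose at 10% extra risk
extra = (response(bmd, g, b) - g) / (1.0 - g)
```

In practice the quantity reported for risk assessment is usually the BMDL, a lower confidence limit on the BMD, which requires the fitted model's uncertainty and is outside this sketch.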

  2. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  3. The Dynamical Evolution of Stellar-Mass Black Holes in Dense Star Clusters

    Science.gov (United States)

    Morscher, Maggie

    Globular clusters are gravitationally bound systems containing up to millions of stars, and are found ubiquitously in massive galaxies, including the Milky Way. With densities as high as a million stars per cubic parsec, they are one of the few places in the Universe where stars interact with one another. They therefore provide us with a unique laboratory for studying how gravitational interactions can facilitate the formation of exotic systems, such as X-ray binaries containing black holes, and merging double black hole binaries, which are produced much less efficiently in isolation. While telescopes can provide us with a snapshot of what these dense clusters look like at present, we must rely on detailed numerical simulations to learn about their evolution. These simulations are quite challenging, however, since dense star clusters are described by a complicated set of physical processes occurring on many different length and time scales, including stellar and binary evolution, weak gravitational scattering encounters, strong resonant binary interactions, and tidal stripping by the host galaxy. Until very recently, it was not possible to model the evolution of systems with millions of stars, the actual number contained in the largest clusters, including all the relevant physics required describe these systems accurately. The Northwestern Group's Henon Monte Carlo code, CMC, which has been in development for over a decade, is a powerful tool that can be used to construct detailed evolutionary models of large star clusters. With its recent parallelization, CMC is now capable of addressing a particularly interesting unsolved problem in astrophysics: the dynamical evolution of stellar black holes in dense star clusters. 
Our current understanding of the stellar initial mass function and massive star evolution suggests that young globular clusters may have formed hundreds to thousands of stellar-mass black holes, the remnants of stars with initial masses from 20 - 100

  4. MSSM fine tuning problem and dynamical suppression of the Higgs mass parameters

    International Nuclear Information System (INIS)

    Kobayashi, Tatsuo; Terao, Haruhiko

    2004-01-01

    There have been several proposals for extending the MSSM so as to ameliorate the fine tuning problem, which may be classified roughly into two categories: scenarios with enhanced quartic Higgs couplings and scenarios with radiatively stable Higgs soft masses. After a brief remark on some generic aspects of these approaches, we show a scenario that uses superconformal dynamics to suppress the Higgs mass parameters. (author)

  5. MSSM fine tuning problem and dynamical suppression of the Higgs mass parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kobayashi, Tatsuo [Kyoto Univ., Dept. of Physics, Kyoto (Japan); Terao, Haruhiko [Kanazawa Univ., Institute for Theoretical Physics, Kanazawa, Ishikawa (Japan)

    2004-12-01

    There have been several proposals for extending the MSSM so as to ameliorate the fine tuning problem, which may be classified roughly into two categories: scenarios with enhanced quartic Higgs couplings and scenarios with radiatively stable Higgs soft masses. After a brief remark on some generic aspects of these approaches, we show a scenario that uses superconformal dynamics to suppress the Higgs mass parameters. (author)

  6. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    This paper studies three related questions: To what extent do otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm......, founders' human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...

  7. Comparison of linear intrascan and interscan dynamic ranges of Orbitrap and ion-mobility time-of-flight mass spectrometers.

    Science.gov (United States)

    Kaufmann, Anton; Walker, Stephan

    2017-11-30

    The linear intrascan and interscan dynamic ranges of mass spectrometers are important in metabolome and residue analysis. A large linear dynamic range is mandatory if both low- and high-abundance ions have to be detected and quantitated in heavy matrix samples. These performance criteria, as provided by modern high-resolution mass spectrometry (HRMS), were systematically investigated. The comparison included two generations of Orbitraps, and an ion-mobility quadrupole time-of-flight (QTOF) system. In addition, different scan modes, as provided by the utilized instruments, were investigated. Calibration curves of different compounds covering a concentration range of five orders of magnitude were measured to evaluate the linear interscan dynamic range. The linear intrascan dynamic range and the resulting mass accuracy were evaluated by repeating these measurements in the presence of a very intense background. Modern HRMS instruments can show linear dynamic ranges of five orders of magnitude. Often, however, the linear dynamic range is limited by the detection capability (sensitivity and selectivity) and by the electrospray ionization. Orbitraps, as opposed to TOF instruments, show a reduced intrascan dynamic range. This is due to the limited C-trap and Orbitrap capacity. The tested TOF instrument shows poorer mass accuracies than the Orbitraps. In contrast, hyphenation with an ion-mobility device seems not to affect the linear dynamic range. The linear dynamic range of modern HRMS instrumentation has been significantly improved. This also refers to the virtual absence of systematic mass shifts at high ion abundances. The intrascan dynamic range of the current Orbitrap technology may still be a limitation when analyzing complex matrix extracts. On the other hand, the linear dynamic range is not only limited by the detector technology, but can also be shortened by peripheral devices, where the ionization and transfer of ions take place. Copyright © 2017 John Wiley
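As a rough illustration of how a linear dynamic range can be estimated from calibration data, the sketch below fits a response factor through the central calibration points and reports the widest contiguous concentration span that stays within a chosen tolerance. The function name, the 20% tolerance, and the synthetic saturation model are all illustrative assumptions, not taken from the study:

```python
import numpy as np

def linear_range(conc, resp, tol=0.2):
    """Return (low, high) concentration bounds of the linear range.

    A calibration point counts as 'linear' when its response deviates by
    less than `tol` (fractional) from a straight line through the origin
    whose slope is the concentration-weighted mean response factor of the
    central calibration points. The widest contiguous run is reported.
    """
    conc = np.asarray(conc, dtype=float)
    resp = np.asarray(resp, dtype=float)
    mid = slice(len(conc) // 3, 2 * len(conc) // 3 + 1)
    slope = resp[mid].sum() / conc[mid].sum()
    dev = np.abs(resp - slope * conc) / (slope * conc)
    ok = dev < tol
    runs, start = [], None
    for i, flag in enumerate(np.append(ok, False)):
        if flag and start is None:
            start = i
        if not flag and start is not None:
            runs.append((start, i - 1))
            start = None
    lo, hi = max(runs, key=lambda r: r[1] - r[0])
    return conc[lo], conc[hi]

# Synthetic calibration over six decades in half-decade steps; the
# detector is modeled as saturating hyperbolically at high concentration.
c = np.logspace(0, 6, 13)
r = 50.0 * c / (1.0 + c / 2e5)
print(linear_range(c, r))
```

With this saturation model the reported range spans roughly 4.5 decades, mimicking how saturation at the top end shortens the nominal five-decade range.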

  8. SHOCK, Nonlinear Dynamic Structure Analysis, Spring and Mass Model, Runge-Kutta-Gill Method

    International Nuclear Information System (INIS)

    Gabrielson, V. K.

    1981-01-01

    1 - Description of problem or function: SHOCK calculates the dynamic response of a structure modeled as a spring-mass system having one or two degrees of freedom for each mass when subjected to specified environments. The code determines the behavior of each lumped mass (displacement, velocity, and acceleration for each degree of freedom) and the behavior of each spring or coupling (force, shear, moment, and displacement) as a function of time. Two types of models, axial, having one degree of freedom, and lateral, having two degrees of freedom at each mass, can be processed. Damping can be included in all models and shock spectra of responses can be obtained. 2 - Method of solution: Two methods of numerical integration of the second-order dynamic equations are provided: the Runge-Kutta-Gill method with variable step-size is recommended for highly nonlinear problems, and a variation of the Newmark-Beta method is available for use with large linear problems. 3 - Restrictions on the complexity of the problem: Maxima of: 100 masses, 200 springs or couplings. Complex arrangements of nonlinear options must be carefully checked by the user.
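The integration approach can be sketched generically: reduce the second-order spring-mass equation to a first-order system and step it with a Runge-Kutta scheme. The minimal Python sketch below uses a fixed-step classical Runge-Kutta step (SHOCK itself uses a variable-step Runge-Kutta-Gill variant) on an illustrative one-degree-of-freedom spring-mass-damper, and checks the result against the exact underdamped solution:

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, y + h/2 * k1)
    k3 = f(t + h/2, y + h/2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h/6 * (k1 + 2*k2 + 2*k3 + k4)

# One-degree-of-freedom spring-mass-damper, m*x'' + c*x' + k*x = 0,
# rewritten as the first-order system y = [x, v]. Parameter values are
# illustrative only.
m, c, k = 1.0, 0.4, 100.0
deriv = lambda t, y: np.array([y[1], -(c*y[1] + k*y[0]) / m])

h = 0.001                               # fixed step for the sketch
y = np.array([0.01, 0.0])               # 10 mm initial displacement, at rest
for i in range(1000):                   # integrate 1 s of free response
    y = rk4_step(deriv, i*h, y, h)

# Exact underdamped free response at t = 1 s, for comparison.
wn = np.sqrt(k/m)
zeta = c / (2*np.sqrt(k*m))
wd = wn * np.sqrt(1 - zeta**2)
x_exact = 0.01*np.exp(-zeta*wn)*(np.cos(wd) + zeta*wn/wd*np.sin(wd))
print(abs(y[0] - x_exact))              # small at this step size
```

The same loop structure extends to many masses by stacking the displacements and velocities into one state vector.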

  9. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own price

  10. Post LHC8 SUSY benchmark points for ILC physics

    International Nuclear Information System (INIS)

    Baer, Howard; List, Jenny

    2013-07-01

    We re-evaluate prospects for supersymmetry at the proposed International Linear e⁺e⁻ Collider (ILC) in light of the first two years of serious data taking at LHC: LHC7 with ∼5 fb⁻¹ of pp collisions at √s = 7 TeV and LHC8 with ∼20 fb⁻¹ at √s = 8 TeV. Strong new limits from LHC8 SUSY searches, along with the discovery of a Higgs boson with m_h ≅ 125 GeV, suggest a paradigm shift from previously popular models to ones with new and compelling signatures. After a review of the current status of supersymmetry, we present a variety of new ILC benchmark models, including: natural SUSY, radiatively-driven natural SUSY (RNS), NUHM2 with low m_A, a focus point case from mSUGRA/CMSSM, non-universal gaugino mass (NUGM) model, τ-coannihilation, Kallosh-Linde/spread SUSY model, mixed gauge-gravity mediation, normal scalar mass hierarchy (NMH), and one example with the recently discovered Higgs boson being the heavy CP-even state H. While all these models at present elude the latest LHC8 limits, they do offer intriguing case study possibilities for ILC operating at √s ≅ 0.25-1 TeV. The benchmark points also present a view of the widely diverse SUSY phenomena which might still be expected in the post LHC8 era at both LHC and ILC.

  11. Post LHC8 SUSY benchmark points for ILC physics

    Energy Technology Data Exchange (ETDEWEB)

    Baer, Howard [Oklahoma Univ., Norman, OK (United States); List, Jenny [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2013-07-15

    We re-evaluate prospects for supersymmetry at the proposed International Linear e⁺e⁻ Collider (ILC) in light of the first two years of serious data taking at LHC: LHC7 with ∼5 fb⁻¹ of pp collisions at √s = 7 TeV and LHC8 with ∼20 fb⁻¹ at √s = 8 TeV. Strong new limits from LHC8 SUSY searches, along with the discovery of a Higgs boson with m_h ≅ 125 GeV, suggest a paradigm shift from previously popular models to ones with new and compelling signatures. After a review of the current status of supersymmetry, we present a variety of new ILC benchmark models, including: natural SUSY, radiatively-driven natural SUSY (RNS), NUHM2 with low m_A, a focus point case from mSUGRA/CMSSM, non-universal gaugino mass (NUGM) model, τ-coannihilation, Kallosh-Linde/spread SUSY model, mixed gauge-gravity mediation, normal scalar mass hierarchy (NMH), and one example with the recently discovered Higgs boson being the heavy CP-even state H. While all these models at present elude the latest LHC8 limits, they do offer intriguing case study possibilities for ILC operating at √s ≅ 0.25-1 TeV. The benchmark points also present a view of the widely diverse SUSY phenomena which might still be expected in the post LHC8 era at both LHC and ILC.

  12. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health, with emphasis on hazard and exposure assessment, abatement, training, reporting, and control, identifying exposure and outcome factors in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories. A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is a need to develop an occupational safety and health information and data system in DOE, which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with, the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  13. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
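The idea behind identity-circuit benchmarks can be sketched without hardware: a circuit that ideally composes to the identity amplifies small systematic gate errors as it deepens. The minimal NumPy simulation below (a single qubit with a hypothetical over-rotation error on its π-pulses; it is not the specific circuit set used on the IBM Quantum Experience) shows the return probability to |0⟩ falling with circuit depth:

```python
import numpy as np

def rx(theta):
    """Single-qubit rotation about the X axis by angle theta."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])

def identity_benchmark(depth, over_rotation):
    """Apply `depth` pairs of Rx(pi) pulses (ideally the identity) with a
    systematic over-rotation error, and return P(|0>)."""
    state = np.array([1.0, 0.0], dtype=complex)
    gate = rx(np.pi + over_rotation)
    for _ in range(2 * depth):          # each pair should cancel exactly
        state = gate @ state
    return abs(state[0]) ** 2

# A perfect device returns P(|0>) = 1 at any depth; with a 0.01 rad
# over-rotation the error accumulates coherently as cos^2(depth * error).
for d in (1, 5, 25):
    print(d, identity_benchmark(d, over_rotation=0.01))
```

The sensitivity to depth is exactly what makes such circuits useful benchmarks: the deeper the nominal identity, the more visible a small calibration error becomes.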

  14. A Heterogeneous Medium Analytical Benchmark

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1999-01-01

    A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions, which are kernels to coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate benchmark-quality results (4 to 5 places of accuracy).

  15. Dependence of Dynamic Modeling Accuracy on Sensor Measurements, Mass Properties, and Aircraft Geometry

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2013-01-01

    The NASA Generic Transport Model (GTM) nonlinear simulation was used to investigate the effects of errors in sensor measurements, mass properties, and aircraft geometry on the accuracy of identified parameters in mathematical models describing the flight dynamics and determined from flight data. Measurements from a typical flight condition and system identification maneuver were systematically and progressively deteriorated by introducing noise, resolution errors, and bias errors. The data were then used to estimate nondimensional stability and control derivatives within a Monte Carlo simulation. Based on these results, recommendations are provided for maximum allowable errors in sensor measurements, mass properties, and aircraft geometry to achieve desired levels of dynamic modeling accuracy. Results using additional flight conditions and parameter estimation methods, as well as a nonlinear flight simulation of the General Dynamics F-16 aircraft, were compared with these recommendations.
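The Monte Carlo approach described above can be sketched with a toy linear model: simulate responses from known parameters, corrupt the measurements with noise, and repeat least-squares estimation to see how the parameter scatter grows with the noise level. The model structure and every coefficient value below are invented for illustration; they are not GTM quantities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear pitching-moment model: Cm = Cm_alpha*alpha + Cm_q*qhat + Cm_de*de.
theta_true = np.array([-1.2, -15.0, -1.5])   # illustrative values only

N = 2000
tgrid = np.linspace(0, 20, N)
X = np.column_stack([
    0.1 * np.sin(tgrid),                     # angle of attack (rad)
    0.02 * np.cos(tgrid),                    # nondimensional pitch rate
    0.05 * np.sign(np.sin(3 * tgrid)),       # elevator doublet sequence
])
cm_clean = X @ theta_true

def estimate(noise_std, trials=200):
    """Repeated least-squares estimates of the derivatives under noise."""
    est = np.empty((trials, 3))
    for i in range(trials):
        cm = cm_clean + rng.normal(0.0, noise_std, N)
        est[i] = np.linalg.lstsq(X, cm, rcond=None)[0]
    return est

# Scatter of the estimates grows with the sensor noise level; the mean
# stays near the true parameters for unbiased noise.
for s in (0.001, 0.01):
    e = estimate(s)
    print(s, e.mean(axis=0).round(3), e.std(axis=0).round(4))
```

Bias errors (a constant offset added to `cm`) or resolution errors (quantizing `cm`) can be injected the same way to study their separate effects on the identified derivatives.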

  16. Benchmarking in external financial reporting and auditing

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    continuously in a benchmarking process. This chapter will broadly examine where, with some justification, the concept of benchmarking can be linked to external financial reporting and auditing. Section 7.1 deals with the external annual accounts, while Section 7.2 takes up the auditing area. The last section of the chapter summarizes...... the considerations on benchmarking in connection with both areas....

  17. EPRI depletion benchmark calculations using PARAGON

    International Nuclear Information System (INIS)

    Kucukboyaci, Vefa N.

    2015-01-01

    Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII based data reduces the excess conservatism and brings the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for burnups of 10, 20, 30, 40, 50, and 60 GWd/MTU and three cooling times (100 h, 5 years, and 15 years). These benchmark cases are analyzed with PARAGON and the SCALE package and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty

  18. OECD/NRC BWR Turbine Trip Benchmark: Simulation by POLCA-T Code

    International Nuclear Information System (INIS)

    Panayotov, Dobromir

    2004-01-01

    Westinghouse transient code POLCA-T brings together the system thermal-hydraulics plant models and three-dimensional (3-D) neutron kinetics core models. Participation in the OECD/NRC BWR Turbine Trip (TT) Benchmark is a part of our efforts toward the code's validation. The paper describes the objectives for TT analyses and gives a brief overview of the developed plant system input deck and 3-D core model.The results of exercise 1, system model without neutronics, are presented. Sensitivity studies performed cover the maximal time step, turbine stop valve position and mass flow, feedwater temperature, and steam bypass mass flow. Results of exercise 2, 3-D core neutronic and thermal-hydraulic model with boundary conditions, are also presented. Sensitivity studies include the core inlet temperature, cladding properties, and direct heating to core coolant and bypass.The entire plant model was validated in the framework of the benchmark's phase 3. Sensitivity studies include the effect of SCRAM initialization and carry-under. The results obtained - transient fission power and its initial axial distribution and steam dome, core exit, lower and upper plenum, main steam line, and turbine inlet pressures - showed good agreement with measured data. Thus, the POLCA-T code capabilities for correct simulation of pressurizing transients with very fast power changes were proved

  19. Benchmark dynamics in the environmental performance of ports.

    Science.gov (United States)

    Puig, Martí; Michail, Antonis; Wooldridge, Chris; Darbra, Rosa Mari

    2017-08-15

    This paper analyses the 2016 environmental benchmark performance of the port sector, based on a wide representation of EcoPorts members. This is the fifth time that this study has been conducted as an initiative of the European Sea Ports Organisation (ESPO). The data and results are derived from the Self-Diagnosis Method (SDM), a concise checklist against which port managers can self-assess the environmental management of their port in relation to the performance of the EcoPorts membership. The SDM tool was developed in the framework of the ECOPORTS project (2002-2005) and it is managed by ESPO. A total number of 91 ports from 20 different European Maritime States contributed to this evaluation. The main results are that air quality remains the top environmental priority of the respondent ports, followed by energy consumption and noise. In terms of environmental management, the study confirms that key components are commonly implemented in the majority of European ports. 94% of contributing ports have a designated environmental manager, 92% have an environmental policy and 82% implement an environmental monitoring program. Waste is identified as the most monitored issue in ports (80%), followed by energy consumption (73%) and water quality (70%). Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. AER benchmark specification sheet

    International Nuclear Information System (INIS)

    Aszodi, A.; Toth, S.

    2009-01-01

    In the VVER-440/213 type reactors, the core outlet temperature field is monitored with in-core thermocouples, which are installed above 210 fuel assemblies. These measured temperatures are used in the determination of the fuel assembly powers and they have an important role in the reactor power limitation. For these reasons, correct interpretation of the thermocouple signals is an important question. In order to interpret the signals correctly, knowledge of the coolant mixing in the assembly heads is necessary. Computational Fluid Dynamics (CFD) codes and experiments can help to understand these mixing processes better and they can provide information which can support the more adequate interpretation of the thermocouple signals. This benchmark deals with the 3D CFD modeling of the coolant mixing in the heads of the profiled fuel assemblies with 12.2 mm rod pitch. Two assemblies of the 23rd cycle of the Paks NPP's Unit 3 are investigated. One of them has a symmetrical pin power profile and the other possesses an inclined profile. (authors)

  1. A GFR benchmark comparison of transient analysis codes based on the ETDR concept

    International Nuclear Information System (INIS)

    Bubelis, E.; Coddington, P.; Castelliti, D.; Dor, I.; Fouillet, C.; Geus, E. de; Marshall, T.D.; Van Rooijen, W.; Schikorr, M.; Stainsby, R.

    2007-01-01

    A GFR (Gas-cooled Fast Reactor) transient benchmark study was performed to investigate the ability of different code systems to calculate the transition in the core heat removal from the main circuit forced flow to natural circulation cooling using the Decay Heat Removal (DHR) system. This benchmark is based on a main blower failure in the Experimental Technology Demonstration Reactor (ETDR) with reactor scram. The codes taking part in the benchmark are: RELAP5, TRAC/AAA, CATHARE, SIM-ADS, MANTA and SPECTRA. For comparison purposes the benchmark was divided into several stages: the initial steady-state solution, the main blower flow run-down, the opening of the DHR loop and the transition to natural circulation and finally the 'quasi' steady heat removal from the core by the DHR system. The results submitted by the participants showed that all the codes gave consistent results for all four stages of the benchmark. In the steady-state the calculations revealed some differences in the clad and fuel temperatures, the core and main loop pressure drops and in the total Helium mass inventory. Also some disagreements were observed in the Helium and water flow rates in the DHR loop during the final natural circulation stage. Good agreement was observed for the total main blower flow rate and Helium temperature rise in the core, as well as for the Helium inlet temperature into the core. In order to understand the reason for the differences in the initial 'blind' calculations, a second round of calculations was performed using a more precise set of boundary conditions

  2. Ad hoc committee on reactor physics benchmarks

    International Nuclear Information System (INIS)

    Diamond, D.J.; Mosteller, R.D.; Gehin, J.C.

    1996-01-01

    In the spring of 1994, an ad hoc committee on reactor physics benchmarks was formed under the leadership of two American Nuclear Society (ANS) organizations. The ANS-19 Standards Subcommittee of the Reactor Physics Division and the Computational Benchmark Problem Committee of the Mathematics and Computation Division had both seen a need for additional benchmarks to help validate computer codes used for light water reactor (LWR) neutronics calculations. Although individual organizations had employed various means to validate the reactor physics methods that they used for fuel management, operations, and safety, additional work in code development and refinement is under way, and to increase accuracy, there is a need for a corresponding increase in validation. Both organizations thought that there was a need to promulgate benchmarks based on measured data to supplement the LWR computational benchmarks that have been published in the past. By having an organized benchmark activity, the participants also gain by being able to discuss their problems and achievements with others traveling the same route

  3. Recent advances in applying mass spectrometry and systems biology to determine brain dynamics.

    Science.gov (United States)

    Scifo, Enzo; Calza, Giulio; Fuhrmann, Martin; Soliymani, Rabah; Baumann, Marc; Lalowski, Maciej

    2017-06-01

    Neurological disorders encompass various pathologies which disrupt normal brain physiology and function. Poor understanding of their underlying molecular mechanisms and their societal burden argues for the necessity of novel prevention strategies, early diagnostic techniques and alternative treatment options to reduce the scale of their expected increase. Areas covered: This review scrutinizes mass spectrometry based approaches used to investigate brain dynamics in various conditions, including neurodegenerative and neuropsychiatric disorders. Different proteomics workflows for isolation/enrichment of specific cell populations or brain regions and for sample processing; mass spectrometry technologies for differential proteome quantitation; analysis of post-translational modifications; and imaging approaches in the brain are critically deliberated. Future directions, including analysis of cellular sub-compartments, targeted MS platforms (selected/parallel reaction monitoring) and use of mass cytometry are also discussed. Expert commentary: Here, we summarize and evaluate current mass spectrometry based approaches for determining brain dynamics in health and disease states, with a focus on neurological disorders. Furthermore, we provide insight on current trends and new MS technologies with potential to improve this analysis.

  4. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.
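The kind of method-versus-experiment comparison such a database supports can be sketched with a few invented records. Every number below is a placeholder, not CCCBDB data; the sketch just computes mean absolute and root-mean-square errors of two hypothetical ab initio methods against experiment:

```python
import math

# Hypothetical benchmark set: experimental atomization enthalpies
# (kJ/mol) alongside values from two ab initio methods. All values are
# invented for illustration; the NIST CCCBDB holds the real data.
records = [
    # molecule, experiment, method_A, method_B
    ("H2O", 917.8, 912.1, 921.0),
    ("CH4", 1642.2, 1637.5, 1650.3),
    ("NH3", 1158.0, 1150.9, 1163.8),
    ("CO2", 1598.2, 1589.0, 1604.1),
]

def error_stats(col):
    """Mean absolute and root-mean-square error of method column `col`
    against the experimental values in column 1."""
    errs = [row[col] - row[1] for row in records]
    mae = sum(abs(e) for e in errs) / len(errs)
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    return mae, rmse

for name, col in (("A", 2), ("B", 3)):
    mae, rmse = error_stats(col)
    print(f"method {name}: MAE={mae:.2f}  RMSE={rmse:.2f} kJ/mol")
```

Comparing MAE against RMSE also hints at whether a method's error is dominated by a few outliers (RMSE much larger than MAE) or spread evenly.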

  5. Shape memory alloys applied to improve rotor-bearing system dynamics - an experimental investigation

    DEFF Research Database (Denmark)

    Enemark, Søren; Santos, Ilmar; Savi, Marcelo A.

    2015-01-01

    passing through critical speeds. In this work, the feasibility of applying shape memory alloys to a rotating system is experimentally investigated. Shape memory alloys can change their stiffness with temperature variations and thus they may change system dynamics. Shape memory alloys also exhibit...... perturbations and mass imbalance responses of the rotor-bearing system at different temperatures and excitation frequencies are carried out to determine the dynamic behaviour of the system. The behaviour and the performance in terms of vibration reduction and system adaptability are compared against a benchmark...... configuration comprised by the same system having steel springs instead of shape memory alloy springs. The experimental results clearly show that the stiffness changes and hysteretic behaviour of the shape memory alloys springs alter system dynamics both in terms of critical speeds and mode shapes. Vibration...

  6. Diagnostic Accuracy of Dynamic Contrast Enhanced Magnetic Resonance Imaging in Characterizing Lung Masses

    Science.gov (United States)

    Inan, Nagihan; Arslan, Arzu; Donmez, Muhammed; Sarisoy, Hasan Tahsin

    2016-01-01

    Background Imaging plays a critical role not only in the detection, but also in the characterization of lung masses as benign or malignant. Objectives To determine the diagnostic accuracy of dynamic magnetic resonance imaging (MRI) in the differential diagnosis of benign and malignant lung masses. Patients and Methods Ninety-four masses were included in this prospective study. Five dynamic series of T1-weighted spoiled gradient echo (FFE) images were obtained, followed by a T1-weighted FFE sequence in the late phase (5th minute). Contrast enhancement patterns in the early (25th second) and late (5th minute) phase images were evaluated. For the quantitative evaluation, signal intensity (SI)-time curves were obtained and the maximum relative enhancement, wash-in rate, and time-to-peak enhancement of masses in both groups were calculated. Results The early phase contrast enhancement patterns were homogeneous in 78.2% of the benign masses, while heterogeneous in 74.4% of the malignant tumors. On the late phase images, 70.8% of the benign masses showed homogeneous enhancement, while most of the malignant masses showed heterogeneous enhancement (82.4%). During the first pass, the maximum relative enhancement and wash-in rate values of malignant masses were significantly higher than those of the benign masses (P = 0.03 and 0.04, respectively). The cutoff value at 15% yielded a sensitivity of 85.4%, specificity of 61.2%, and positive predictive value of 68.7% for the maximum relative enhancement. Conclusion Contrast enhancement patterns and SI-time curve analysis of MRI are helpful in the differential diagnosis of benign and malignant lung masses. PMID:27703654
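The quantitative descriptors above can be computed directly from an SI-time curve. The sketch below uses common DCE-MRI definitions (maximum relative enhancement as the percentage rise over baseline, time to peak, and wash-in rate as the steepest relative upslope); the study's exact operational definitions may differ, and the sample curve is invented:

```python
import numpy as np

def si_curve_metrics(t, si, baseline_n=1):
    """Quantitative descriptors of a dynamic-contrast SI-time curve.

    Returns (maximum relative enhancement in %, time to peak in the
    units of t, wash-in rate in %/time-unit). The first `baseline_n`
    samples are averaged as the pre-contrast baseline.
    """
    t = np.asarray(t, dtype=float)
    si = np.asarray(si, dtype=float)
    si0 = si[:baseline_n].mean()                    # pre-contrast baseline
    rel = 100.0 * (si - si0) / si0                  # relative enhancement, %
    peak = int(np.argmax(rel))
    max_rel_enh = rel[peak]
    time_to_peak = t[peak] - t[0]
    # Steepest relative rise per unit time on the upslope.
    wash_in = np.max(np.diff(rel[:peak + 1]) / np.diff(t[:peak + 1]))
    return max_rel_enh, time_to_peak, wash_in

# Illustrative curve sampled every 25 s, with one late-phase point.
t = np.array([0.0, 25.0, 50.0, 75.0, 100.0, 300.0])
si = np.array([100.0, 145.0, 160.0, 158.0, 150.0, 140.0])
print(si_curve_metrics(t, si))
```

Against a 15% cutoff on maximum relative enhancement (as in the reported ROC analysis), this illustrative curve, peaking at 60%, would classify as above-threshold.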

  7. BENCHMARKING FOR THE ROMANIAN HEAVY COMMERCIAL VEHICLES MARKET

    Directory of Open Access Journals (Sweden)

    Pop Nicolae Alexandru

    2014-07-01

    Full Text Available Globalization has led to a better integration of the international markets for goods, services and capital, which in turn leads to a significant increase of investment in regions with low labor costs and access to commercial routes. The development of international trade has imposed a continuous growth of the volumes of transported goods and the development of a transport system able to stand up to the new pressures exercised by cost, time and space. The solution for efficient transport is intermodal transportation relying on state-of-the-art technological platforms, which integrates the advantages specific to each means of transportation: flexibility for road transportation, high capacity for railway, low costs for sea, and speed for air transportation. Romania's integration in the pan-European transport system, alongside the EU's enlargement towards the east, will change Romania's positioning into a central one. The integrated governmental program of improving the intermodal infrastructure will ensure fast railway, road and air connections. For the Danube harbors and for the sea ports, EU grants and allowances will be used, thus increasing Romania's importance in its capacity as one of Europe's logistical hubs. The present paper intends to use benchmarking, the management and strategic marketing tool, in order to carry out an evaluation of the Romanian heavy commercial vehicles market within the European context. Benchmarking encourages change in a complex and dynamic context where a permanent solution cannot be found. The different results stimulate the use of benchmarking as a solution to reduce gaps. MAN's case study shows the dynamics of the players on the Romanian market for heavy commercial vehicles, when considering the strong growth of Romanian exported goods but with a modest internal demand, a limited but developing road infrastructure, and an unfavorable international economic context together with

  8. MTCB: A Multi-Tenant Customizable database Benchmark

    NARCIS (Netherlands)

    van der Zijden, WIm; Hiemstra, Djoerd; van Keulen, Maurice

    2017-01-01

    We argue that there is a need for Multi-Tenant Customizable OLTP systems. Such systems need a Multi-Tenant Customizable Database (MTC-DB) as a backing. To stimulate the development of such databases, we propose the benchmark MTCB. Benchmarks for OLTP exist and multi-tenant benchmarks exist, but no

  9. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multicampuses or a…

  10. Benchmarking the Netherlands. Benchmarking for growth

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy, in other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs; prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc.) sense. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades, the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aimed at higher productivity growth. 
Throughout

  11. Benchmarking the Netherlands. Benchmarking for growth

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy, in other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs; prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc.) sense. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades, the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aimed at higher productivity

  12. Dynamical shake-up and the low mass of Mars

    Science.gov (United States)

    Bromley, Benjamin C.; Kenyon, Scott

    2017-10-01

    The low mass of Mars and the lack of planets in the asteroid belt are important constraints on theories of planet formation. We revisit the idea that sweeping secular resonances involving the gas giants and the Sun's dissipating protoplanetary disk can explain these features of our Solar System. To test this "dynamical shake-up" scenario, we perform an extensive suite of simulations to track terrestrial planet formation from planetesimals. We find that if the Sun's gas disk depletes in roughly a million years, then a sweeping resonance with Jupiter inhibits planet formation in the asteroid belt and substantially limits the mass of Mars. We explore how this phenomenon might lead to asteroid belt analogs around other stars with long-period, massive planets.

  13. Extension of the pseudo dynamic method to test structures with distributed mass

    International Nuclear Information System (INIS)

    Renda, V.; Papa, L.; Bellorini, S.

    1993-01-01

    The PsD method is a mixed numerical and experimental procedure. At each time step the dynamic deformation of the structure, computed by solving the equation of motion for a given input signal, is reproduced in the laboratory by means of actuators attached to the sample at specific points. The reaction forces at those points are measured and used to compute the deformation for the next time step. Because the reaction forces are measured, knowledge of the stiffness of the structure is not needed, so the method remains effective even for deformations leading to strongly nonlinear behaviour of the structure. By contrast, the mass matrix and the applied forces must be well known. For this reason the PsD method can be applied without approximations when the masses can be considered as lumped at the testing points of the sample. The present work investigates the possibility of extending the PsD method to test structures with distributed mass. A standard procedure is proposed to provide an equivalent mass matrix and force vector reduced to the testing points and to verify the reliability of the model. The verification is obtained by comparing the results of a multi-degree-of-freedom dynamic analysis, performed with a Finite Element (FE) numerical program, with a simulation of the PsD method based on the reduced-degree-of-freedom mass matrix and external forces, assuming in place of the experimental reactions those computed with the general FE model. The method has been applied to a numerical simulation of the behaviour of a realistic and complex structure with distributed mass: a two-storey masonry building. The FE model has about two thousand degrees of freedom, and the condensation has been made for four testing points. A dynamic analysis has been performed with the general FE model and the reactions of the structure have been recorded in a file and used as input for the PsD simulation with the four-degree-of-freedom model. The comparison between
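
The time-stepping loop described above can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the condensed mass and stiffness values are invented, and the "measured" reaction is simulated with a linear stiffness matrix, playing the same role the FE-computed reactions play in the paper's verification.

```python
import numpy as np

# Hypothetical 2-DOF sketch of the pseudo-dynamic (PsD) central-difference loop.
# In a real test r would be *measured* from the specimen; here it is simulated
# with an assumed linear stiffness matrix K.
M = np.diag([1000.0, 800.0])            # condensed mass matrix [kg] (invented)
K = np.array([[4.0e6, -1.5e6],
              [-1.5e6, 1.5e6]])         # stiffness used only to fake measurements [N/m]
dt = 0.001                              # time step [s]
n_steps = 2000

d = np.zeros((n_steps + 1, 2))          # displacements at the testing points
f = np.zeros(2)                         # external force (free vibration here)
d[0] = [0.01, 0.02]                     # initial imposed displacement
d_prev = d[0].copy()                    # d_{-1} = d_0 (started from rest)

Minv = np.linalg.inv(M)
for i in range(n_steps):
    r = K @ d[i]                        # "measured" reaction at the actuators
    # central-difference update: only M, f and the measured r are needed,
    # never the stiffness of the structure itself
    d_next = 2.0 * d[i] - d_prev + dt**2 * (Minv @ (f - r))
    d_prev = d[i].copy()
    d[i + 1] = d_next

print(round(float(np.abs(d).max()), 4))
```

In an actual PsD experiment the line computing r would be replaced by imposing d[i] with the actuators and reading the load cells; the update itself needs only the condensed mass matrix and the external force, which is the point of the method.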

  14. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...

  15. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  16. A high dynamic range pulse counting detection system for mass spectrometry.

    Science.gov (United States)

    Collings, Bruce A; Dima, Martian D; Ivosev, Gordana; Zhong, Feng

    2014-01-30

    A high dynamic range pulse counting system has been developed that demonstrates an ability to operate at up to 2e8 counts per second (cps) on a triple quadrupole mass spectrometer. Previous pulse counting detection systems have typically been limited to about 1e7 cps at the upper end of the system's dynamic range. Modifications to the detection electronics and dead time correction algorithm are described in this paper. A high gain transimpedance amplifier is employed that allows a multi-channel electron multiplier to be operated at a significantly lower bias potential than in previous pulse counting systems. The system utilises a high-energy conversion dynode, a multi-channel electron multiplier, a high gain transimpedance amplifier, non-paralysing detection electronics and a modified dead time correction algorithm. Modification of the dead time correction algorithm is necessary due to a characteristic of the pulse counting electronics. A pulse counting detection system with the capability to count at ion arrival rates of up to 2e8 cps is described. This is shown to provide a linear dynamic range of nearly five orders of magnitude for a sample of alprazolam with concentrations ranging from 0.0006970 ng/mL to 3333 ng/mL while monitoring the m/z 309.1 → m/z 205.2 transition. This represents an upward extension of the detector's linear dynamic range of about two orders of magnitude. A new high dynamic range pulse counting system has been developed demonstrating the ability to operate at up to 2e8 cps on a triple quadrupole mass spectrometer. This provides an upward extension of the detector's linear dynamic range by about two orders of magnitude over previous pulse counting systems. Copyright © 2013 John Wiley & Sons, Ltd.
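
The abstract does not disclose the modified algorithm, but the standard non-paralysable dead-time correction it builds on can be sketched as follows; the 5 ns dead time is an assumed figure for illustration, not a value from the paper.

```python
# Standard non-paralysable dead-time correction (a textbook sketch, not the
# paper's modified algorithm): each counted pulse blinds the counter for tau
# seconds, so the observed rate m underestimates the true ion arrival rate n.
def correct_dead_time(m_cps: float, tau_s: float) -> float:
    """Return the estimated true count rate n = m / (1 - m*tau)."""
    loss = m_cps * tau_s
    if loss >= 1.0:
        raise ValueError("observed rate saturates the counter for this dead time")
    return m_cps / (1.0 - loss)

# With an assumed 5 ns dead time, an observed 1e8 cps corresponds to 2e8 cps:
print(correct_dead_time(1.0e8, 5.0e-9))  # → 200000000.0
```

The correction diverges as m·tau approaches 1, which is why raising the usable observed rate (and hence the linear range) requires both faster electronics and a revised correction, as the paper describes.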

  17. Revaluering benchmarking - A topical theme for the construction industry

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2011-01-01

    and questioning the concept objectively. This paper addresses the underlying nature of benchmarking, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in organizations. To understand these sociological impacts, benchmarking research needs to transcend the perception of benchmarking systems as secondary and derivative and instead study benchmarking as constitutive of social relations and as irredeemably social phenomena. I have attempted to do so in this paper by treating benchmarking using a calculative practice perspective, and describing how...

  18. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking of laboratory operations is well established for comparing organizational performance to other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  19. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  20. CONNECTION BETWEEN DYNAMICALLY DERIVED INITIAL MASS FUNCTION NORMALIZATION AND STELLAR POPULATION PARAMETERS

    International Nuclear Information System (INIS)

    McDermid, Richard M.; Cappellari, Michele; Bayet, Estelle; Bureau, Martin; Davies, Roger L.; Alatalo, Katherine; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Duc, Pierre-Alain; Crocker, Alison F.; Davis, Timothy A.; De Zeeuw, P. T.; Emsellem, Eric; Kuntschner, Harald; Khochfar, Sadegh; Krajnović, Davor; Morganti, Raffaella; Oosterloo, Tom; Naab, Thorsten

    2014-01-01

    We report on empirical trends between the dynamically determined stellar initial mass function (IMF) and stellar population properties for a complete, volume-limited sample of 260 early-type galaxies from the ATLAS3D project. We study trends between our dynamically derived IMF normalization α_dyn ≡ (M/L)_stars/(M/L)_Salp and absorption line strengths, and interpret these via single stellar population-equivalent ages, abundance ratios (measured as [α/Fe]), and total metallicity, [Z/H]. We find that old and alpha-enhanced galaxies tend to have on average heavier (Salpeter-like) mass normalization of the IMF, but stellar population does not appear to be a good predictor of the IMF, with a large range of α_dyn at a given population parameter. As a result, we find weak α_dyn-[α/Fe] and α_dyn-Age correlations and no significant α_dyn-[Z/H] correlation. The observed trends appear significantly weaker than those reported in studies that measure the IMF normalization via the low-mass star demographics inferred through stellar spectral analysis

  1. Connection between Dynamically Derived Initial Mass Function Normalization and Stellar Population Parameters

    Science.gov (United States)

    McDermid, Richard M.; Cappellari, Michele; Alatalo, Katherine; Bayet, Estelle; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, Martin; Crocker, Alison F.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Emsellem, Eric; Khochfar, Sadegh; Krajnović, Davor; Kuntschner, Harald; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Scott, Nicholas; Serra, Paolo; Weijmans, Anne-Marie; Young, Lisa M.

    2014-09-01

    We report on empirical trends between the dynamically determined stellar initial mass function (IMF) and stellar population properties for a complete, volume-limited sample of 260 early-type galaxies from the ATLAS3D project. We study trends between our dynamically derived IMF normalization α_dyn ≡ (M/L)_stars/(M/L)_Salp and absorption line strengths, and interpret these via single stellar population-equivalent ages, abundance ratios (measured as [α/Fe]), and total metallicity, [Z/H]. We find that old and alpha-enhanced galaxies tend to have on average heavier (Salpeter-like) mass normalization of the IMF, but stellar population does not appear to be a good predictor of the IMF, with a large range of α_dyn at a given population parameter. As a result, we find weak α_dyn-[α/Fe] and α_dyn-Age correlations and no significant α_dyn-[Z/H] correlation. The observed trends appear significantly weaker than those reported in studies that measure the IMF normalization via the low-mass star demographics inferred through stellar spectral analysis.
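
The IMF mismatch parameter in this abstract is a simple ratio, which a two-line sketch makes concrete. The (M/L) values below are invented illustrative numbers, not ATLAS3D measurements.

```python
# alpha_dyn = (M/L)_stars / (M/L)_Salpeter, as defined in the abstract.
def alpha_dyn(ml_stars: float, ml_salpeter: float) -> float:
    """Dynamical stellar (M/L) divided by the (M/L) predicted for a Salpeter IMF."""
    return ml_stars / ml_salpeter

# ~0.6 corresponds to a lighter Kroupa/Chabrier-like normalization,
# ~1.0 to the heavier Salpeter-like normalization discussed above.
print(round(alpha_dyn(4.2, 6.0), 2))  # → 0.7
```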

  2. Plutonium Critical Mass Curve Comparison to Mass at Upper Subcritical Limit (USL) Using Whisper

    International Nuclear Information System (INIS)

    Alwin, Jennifer Louise; Zhang, Ning

    2016-01-01

    Whisper is computational software designed to assist the nuclear criticality safety analyst with validation studies with the MCNP® Monte Carlo radiation transport package. Standard approaches to validation rely on the selection of benchmarks based upon expert judgment. Whisper uses sensitivity/uncertainty (S/U) methods to select benchmarks relevant to a particular application or set of applications being analyzed. Using these benchmarks, Whisper computes a calculational margin. Whisper attempts to quantify the margin of subcriticality (MOS) arising from errors in software and uncertainties in nuclear data. The combination of the Whisper-derived calculational margin and MOS comprises the baseline upper subcritical limit (USL), to which an additional margin may be applied by the nuclear criticality safety analyst as appropriate to ensure subcriticality. A series of critical mass curves for plutonium, similar to those found in Figure 31 of LA-10860-MS, have been generated using MCNP6.1.1 and the iterative parameter study software WORM Solver. The baseline USL for each of the data points of the curves was then computed using Whisper 1.1. The USL was then used to determine the equivalent mass for the plutonium metal-water system. ANSI/ANS-8.1 states that it is acceptable to use handbook data, such as the data directly from LA-10860-MS, as it is already considered validated (Section 4.3.4: ''Use of subcritical limit data provided in ANSI/ANS standards or accepted reference publications does not require further validation.''). This paper attempts to take a novel approach to visualize traditional critical mass curves and allows comparison with the amount of mass for which the k_eff is equal to the USL (calculational margin + margin of subcriticality). However, the intent is to plot the critical mass data along with the USL, not to suggest that already accepted handbook data should have new and more rigorous requirements for validation.
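
The USL bookkeeping described above can be sketched in a few lines. This is a hedged illustration: the margins below are invented round numbers, not Whisper output, and the pass criterion (k_eff plus two standard deviations below the USL) is one common convention rather than a prescription from the paper.

```python
# Baseline USL = 1.0 - calculational margin - margin of subcriticality (MOS),
# optionally reduced further by an analyst-applied margin.
def upper_subcritical_limit(calc_margin: float, mos: float, extra_margin: float = 0.0) -> float:
    return 1.0 - calc_margin - mos - extra_margin

def is_acceptable(keff: float, keff_sigma: float, usl: float, n_sigma: float = 2.0) -> bool:
    """A configuration passes if keff plus n_sigma standard deviations stays below the USL."""
    return keff + n_sigma * keff_sigma < usl

usl = upper_subcritical_limit(calc_margin=0.02, mos=0.03)
print(round(usl, 2), is_acceptable(0.92, 0.001, usl))  # → 0.95 True
```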

  3. Benchmark for Evaluating Moving Object Indexes

    DEFF Research Database (Denmark)

    Chen, Su; Jensen, Christian Søndergaard; Lin, Dan

    2008-01-01

    Progress in science and engineering relies on the ability to measure, reliably and in detail, pertinent properties of artifacts under design. Progress in the area of database-index design thus relies on empirical studies based on prototype implementations of indexes. This paper proposes a benchmark that targets techniques for the indexing of the current and near-future positions of moving objects. This benchmark enables the comparison of existing and future indexing techniques. It covers important aspects of such indexes that have not previously been covered by any benchmark. Notable aspects covered include update efficiency, query efficiency, concurrency control, and storage requirements. Next, the paper applies the benchmark to half a dozen notable moving-object indexes, thus demonstrating the viability of the benchmark and offering new insight into the performance properties of the indexes.

  4. Benchmarking infrastructure for mutation text mining.

    Science.gov (United States)

    Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo

    2014-02-25

    Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.
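
The performance metrics the infrastructure computes can be illustrated in plain Python; the real system derives them with SPARQL over RDF annotations, and the mutation mentions below are invented examples, not corpus data.

```python
# Precision / recall / F1 over a gold-standard annotation set and a text-mining
# system's predictions, the kind of metric the SPARQL queries compute.
def prf1(gold: set, predicted: set):
    tp = len(gold & predicted)                     # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

gold = {"E545K", "H1047R", "V600E"}   # hypothetical curated annotations
pred = {"E545K", "V600E", "T790M"}    # hypothetical text-mining output
p, r, f = prf1(gold, pred)
print(round(p, 3), round(r, 3), round(f, 3))  # → 0.667 0.667 0.667
```

Expressing the same computation declaratively in SPARQL is what lets users of the infrastructure analyze results "with no programming", as the abstract notes.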

  5. Benchmarking infrastructure for mutation text mining

    Science.gov (United States)

    2014-01-01

    Background Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. Results We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents, that can support mutation grounding and mutation impact extraction experiments. Conclusion We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption. PMID:24568600

  6. Benchmarking: A Process for Improvement.

    Science.gov (United States)

    Peischl, Thomas M.

    One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, is partially solving that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…

  7. Dynamics of Dwarf Galaxies Disfavor Stellar-Mass Black Holes as Dark Matter.

    Science.gov (United States)

    Koushiappas, Savvas M; Loeb, Abraham

    2017-07-28

    We study the effects of black hole dark matter on the dynamical evolution of stars in dwarf galaxies. We find that mass segregation leads to a depletion of stars in the center of dwarf galaxies and the appearance of a ring in the projected stellar surface density profile. Using Segue 1 as an example we show that current observations of the projected surface stellar density rule out at the 99.9% confidence level the possibility that more than 6% of the dark matter is composed of black holes with a mass of a few tens of solar masses.

  8. Comprehensive Benchmark Suite for Simulation of Particle Laden Flows Using the Discrete Element Method with Performance Profiles from the Multiphase Flow with Interface eXchanges (MFiX) Code

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Peiyuan [Univ. of Colorado, Boulder, CO (United States); Brown, Timothy [Univ. of Colorado, Boulder, CO (United States); Fullmer, William D. [Univ. of Colorado, Boulder, CO (United States); Hauser, Thomas [Univ. of Colorado, Boulder, CO (United States); Hrenya, Christine [Univ. of Colorado, Boulder, CO (United States); Grout, Ray [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sitaraman, Hariswaran [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-01-29

    Five benchmark problems are developed and simulated with the computational fluid dynamics and discrete element model code MFiX. The benchmark problems span dilute and dense regimes, consider statistically homogeneous and inhomogeneous (both clusters and bubbles) particle concentrations and a range of particle and fluid dynamic computational loads. Several variations of the benchmark problems are also discussed to extend the computational phase space to cover granular (particles only), bidisperse and heat transfer cases. A weak scaling analysis is performed for each benchmark problem and, in most cases, the scalability of the code appears reasonable up to approximately 10^3 cores. Profiling of the benchmark problems indicates that the most substantial computational time is spent on particle-particle force calculations, drag force calculations and interpolation between discrete particle and continuum fields. Hardware performance analysis was also carried out, showing significant Level 2 cache miss ratios and a rather low degree of vectorization. These results are intended to serve as a baseline for future developments to the code as well as a preliminary indicator of where to best focus performance optimizations.
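
The weak-scaling efficiency used in analyses like this one is a simple ratio of wall times at fixed per-core problem size. The timings below are invented placeholders, not MFiX measurements.

```python
# Weak scaling: the problem size per core is held fixed, so ideal behavior is
# constant wall time; efficiency at N cores is t(1 core) / t(N cores).
def weak_scaling_efficiency(t_one_core: float, t_n_cores: float) -> float:
    return t_one_core / t_n_cores

timings = {1: 100.0, 64: 108.0, 1024: 131.0}  # cores -> wall time [s], hypothetical
for cores, t in timings.items():
    print(cores, round(weak_scaling_efficiency(timings[1], t), 3))
```

An efficiency that stays near 1.0 out to ~10^3 cores is what the abstract means by "reasonable" scalability.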

  9. An analytical model for the study of a small LFR core dynamics: development and benchmark

    International Nuclear Information System (INIS)

    Bortot, S.; Cammi, A.; Lorenzi, S.; Moisseytsev, A.

    2011-01-01

    An analytical model for the study of the control-oriented dynamics of a small Lead-cooled Fast Reactor (LFR) has been developed, aimed at providing a useful, flexible and straightforward, yet accurate, tool allowing relatively quick transient design-basis and stability analyses. A simplified lumped-parameter approach has been adopted to couple neutronics and thermal-hydraulics: the point-kinetics approximation has been employed and an average-temperature heat-exchange model has been implemented. The reactor transient responses following postulated accident initiators such as Unprotected Control Rod Withdrawal (UTOP), Loss of Heat Sink (ULOHS) and Loss of Flow (ULOF) have been studied for a MOX and a metal-fuelled core at the Beginning of Cycle (BoC) and End of Cycle (EoC) configurations. A benchmark analysis has then been performed by means of the SAS4A/SASSYS-1 Liquid Metal Reactor Code System, in which a core model based on three representative channels has been built with the purpose of providing verification for the analytical outcomes and indicating how the latter relate to more realistic one-dimensional calculations. As a general result, responses concerning the main core characteristics (namely, power, reactivity, etc.) have turned out to be mutually consistent in terms of both steady-state absolute figures and transient developments, showing discrepancies of the order of only a few percent, thus confirming a very satisfactory agreement. (author)
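
The point-kinetics approximation at the heart of such lumped-parameter models can be sketched with one delayed-neutron group. The parameters below are generic fast-reactor-style values chosen for illustration, not the paper's LFR data, and a single delayed group is a simplification of the usual six.

```python
# One-delayed-group point kinetics:
#   dn/dt = ((rho - beta) / Lam) * n + lam * C
#   dC/dt = (beta / Lam) * n - lam * C
beta, Lam, lam = 0.0035, 4.0e-7, 0.08   # delayed fraction, generation time [s], decay const [1/s]

def step(n, C, rho, dt):
    dn = ((rho - beta) / Lam) * n + lam * C
    dC = (beta / Lam) * n - lam * C
    return n + dt * dn, C + dt * dC

n, C = 1.0, beta / (Lam * lam)          # critical steady state (power normalized to 1)
dt = 1.0e-8                             # the prompt time scale forces a tiny explicit step
for _ in range(200_000):                # 2 ms of transient
    n, C = step(n, C, rho=0.0005, dt=dt)  # +50 pcm step, well below prompt critical

print(n > 1.0)  # → True (power rises toward the prompt-jump level beta/(beta-rho))
```

A UTOP-style analysis couples an equation like this to thermal feedback on rho; the forward-Euler integrator here is only for illustration, where a stiff solver would normally be used.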

  10. Hospital benchmarking: are U.S. eye hospitals ready?

    Science.gov (United States)

    de Korne, Dirk F; van Wijngaarden, Jeroen D H; Sol, Kees J C A; Betz, Robert; Thomas, Richard C; Schein, Oliver D; Klazinga, Niek S

    2012-01-01

    Benchmarking is increasingly considered a useful management instrument to improve quality in health care, but little is known about its applicability in hospital settings. The aims of this study were to assess the applicability of a benchmarking project in U.S. eye hospitals and compare the results with an international initiative. We evaluated multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis entailed 46 semistructured face-to-face interviews with stakeholders, document analyses, and questionnaires. The case studies only partially met the conditions of the evaluation frame. Although learning and quality improvement were stated as overall purposes, the benchmarking initiative was at first focused on efficiency only. No ophthalmic outcomes were included, and clinicians were skeptical about their reporting relevance and disclosure. However, in contrast with earlier findings in international eye hospitals, all U.S. hospitals worked with internal indicators that were integrated in their performance management systems and supported benchmarking. Benchmarking can support performance management in individual hospitals. Having a certain number of comparable institutes provide similar services in a noncompetitive milieu seems to lay fertile ground for benchmarking. International benchmarking is useful only when these conditions are not met nationally. Although the literature focuses on static conditions for effective benchmarking, our case studies show that it is a highly iterative and learning process. The journey of benchmarking seems to be more important than the destination. Improving patient value (health outcomes per unit of cost) requires, however, an integrative perspective where clinicians and administrators closely cooperate on both quality and efficiency issues. If these worlds do not share such a relationship, the added

  11. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operational performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations, with a preference for scenarios which include experimental data or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty, to include gamma transport, neutron transport, or both, and to represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations was assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  12. WWER-1000 Burnup Credit Benchmark (CB5)

    International Nuclear Information System (INIS)

    Manolova, M.A.

    2002-01-01

    In this paper, the specification of the first phase (depletion calculations) of the WWER-1000 Burnup Credit Benchmark is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be specified after evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of WWER benchmark activities in this field (Author)

  13. The role of benchmarking for yardstick competition

    International Nuclear Information System (INIS)

    Burns, Phil; Jenkins, Cloda; Riechmann, Christoph

    2005-01-01

    With the increasing interest in yardstick regulation, there is a need to understand the most appropriate method for realigning tariffs at the outset. Benchmarking is the tool used for such realignment and is therefore a necessary first step in the implementation of yardstick competition. A number of concerns have been raised about the application of benchmarking, making some practitioners reluctant to move towards yardstick-based regimes. We assess five of the key concerns often discussed and find that, in general, these are not as great as perceived. The assessment is based on economic principles and on experience with applying benchmarking to regulated sectors, e.g. the electricity and water industries in the UK, The Netherlands, Austria and Germany in recent years. The aim is to demonstrate that clarity on the role of benchmarking reduces the concern about its application in different regulatory regimes. We find that benchmarking can be used in regulatory settlements, although the range of appropriate benchmarking approaches will be small for any individual regulatory question. Benchmarking is feasible, as total cost measures and environmental factors are better defined in practice than is commonly appreciated, and collusion is unlikely to occur in environments with more than 2 or 3 firms (where shareholders have a role in monitoring and rewarding performance). Furthermore, any concern about companies under-recovering costs is a matter to be determined through the regulatory settlement and does not affect the case for using benchmarking as part of that settlement. (author)

  14. Dynamic modeling of fixed-bed adsorption of flue gas using a variable mass transfer model

    International Nuclear Information System (INIS)

    Park, Jehun; Lee, Jae W.

    2016-01-01

    This study introduces a dynamic mass transfer model for the fixed-bed adsorption of a flue gas. The derivation of the variable mass transfer coefficient is based on pore diffusion theory and it is a function of effective porosity, temperature, and pressure as well as the adsorbate composition. Adsorption experiments were done at four different pressures (1.8, 5, 10 and 20 bar) and three different temperatures (30, 50 and 70 °C) with zeolite 13X as the adsorbent. To explain the equilibrium adsorption capacity, the Langmuir-Freundlich isotherm model was adopted, and the parameters of the isotherm equation were fitted to the experimental data for a wide range of pressures and temperatures. Then, dynamic simulations were performed using the system equations for material and energy balance with the equilibrium adsorption isotherm data. The optimal mass transfer and heat transfer coefficients were determined after iterative calculations. As a result, the dynamic variable mass transfer model can estimate the adsorption rate for a wide range of concentrations and precisely simulate the fixed-bed adsorption process of a flue gas mixture of carbon dioxide and nitrogen.
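    The temperature- and pressure-dependent equilibrium capacity described above can be sketched with a Langmuir-Freundlich (Sips) isotherm. The parameter values below are hypothetical placeholders, not the fitted values from the study:

```python
import math

def sips_loading(p_bar, T_K, q_sat=5.0, b0=1.0e-4, Q=2.0e4, n=0.7, R=8.314):
    """Langmuir-Freundlich (Sips) equilibrium loading q.

    b(T) = b0 * exp(Q / (R*T)) is a van 't Hoff-type affinity term;
    q = q_sat * (b*p)^n / (1 + (b*p)^n).
    All parameter values here are illustrative, not fitted.
    """
    b = b0 * math.exp(Q / (R * T_K))
    x = (b * p_bar) ** n
    return q_sat * x / (1.0 + x)
```

    With this form, loading rises with pressure and falls with temperature, matching the qualitative trends one expects for CO2 adsorption on zeolite 13X over the pressure and temperature ranges quoted above.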

  15. Performance Benchmarking of Fast Multipole Methods

    KAUST Repository

    Al-Harthi, Noha A.

    2013-06-01

    The current trends in computer architecture are shifting towards smaller byte/flop ratios, while available parallelism is increasing at all levels of granularity – vector length, core count, and MPI process. Intel’s Xeon Phi coprocessor, NVIDIA’s Kepler GPU, and IBM’s BlueGene/Q all have a byte/flop ratio close to 0.2, which makes it very difficult for most algorithms to extract a high percentage of the theoretical peak flop/s from these architectures. Popular algorithms in scientific computing such as FFT are continuously evolving to keep up with this trend in hardware. In the meantime it is also necessary to invest in novel algorithms that are more suitable for computer architectures of the future. The fast multipole method (FMM) was originally developed as a fast algorithm for approximating the N-body interactions that appear in astrophysics, molecular dynamics, and vortex based fluid dynamics simulations. The FMM possesses a unique combination of being an efficient O(N) algorithm while having an operational intensity higher than that of a matrix-matrix multiplication. In fact, the FMM can reduce the required byte/flop ratio to around 0.01, which means that it will remain compute bound until 2020 even if the current trend in microprocessors continues. Despite these advantages, there have not been any benchmarks of FMM codes on modern architectures such as Xeon Phi, Kepler, and BlueGene/Q. This study aims to provide a comprehensive benchmark of a state-of-the-art FMM code “exaFMM” on the latest architectures, in hopes of providing a useful reference for deciding when the FMM will become useful as the computational engine in a given application code. It may also serve as a warning about certain problem-size domains where the FMM will exhibit insignificant performance improvements. Such issues depend strongly on the asymptotic constants rather than the asymptotics themselves, and therefore are strongly implementation and hardware
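    For reference, the dense O(N²) pairwise sum that the FMM approximates in O(N) can be written down directly. This is a naive baseline sketch, not the exaFMM kernel:

```python
import math

def direct_potential(points, charges):
    """Naive O(N^2) evaluation of phi_i = sum_{j != i} q_j / |r_i - r_j|.

    The FMM approximates exactly this sum hierarchically in O(N) work
    by expanding far-field contributions in multipoles.
    """
    n = len(points)
    phi = [0.0] * n
    for i in range(n):
        for j in range(n):
            if i != j:
                phi[i] += charges[j] / math.dist(points[i], points[j])
    return phi
```

    The inner loop performs many flops per particle loaded, which is why the kernel's operational intensity is high and why it remains compute bound on low byte/flop architectures.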

  16. SP2Bench: A SPARQL Performance Benchmark

    Science.gov (United States)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  17. Helium generation reaction rates for 6Li and 10B in benchmark facilities

    International Nuclear Information System (INIS)

    Farrar, Harry IV; Oliver, B.M.; Lippincott, E.P.

    1980-01-01

    The helium generation rates for 10 B and 6 Li have been measured in two benchmark reactor facilities having neutron spectra similar to those found in a breeder reactor. The irradiations took place in the Coupled Fast Reactivity Measurements Facility (CFRMF) and in the 10% enriched 235 U critical assembly, BIG-10. The helium reaction rates were obtained by precise high-sensitivity gas mass spectrometric analyses of the helium content of numerous small samples. Comparison of these reaction rates with other reaction rates measured in the same facilities, and with rates calculated from published cross sections and from best estimates of the neutron spectral shapes, indicates significant discrepancies in the calculated values. Additional irradiations in other benchmark facilities have been undertaken to better determine the energy ranges where the discrepancies lie

  18. A mass-conserving lattice Boltzmann method with dynamic grid refinement for immiscible two-phase flows

    Energy Technology Data Exchange (ETDEWEB)

    Fakhari, Abbas, E-mail: afakhari@nd.edu [Department of Civil and Environmental Engineering and Earth Sciences, University of Notre Dame, Notre Dame, IN 46556 (United States); Geier, Martin [TU Braunschweig, Institute for Computational Modeling in Civil Engineering (iRMB), TU-Braunschweig, Pockelsstr. 3, 38106 Braunschweig (Germany); Lee, Taehun [Department of Mechanical Engineering, The City College of the City University of New York, New York, NY 10031 (United States)

    2016-06-15

    A mass-conserving lattice Boltzmann method (LBM) for multiphase flows is presented in this paper. The proposed LBM improves a previous model (Lee and Liu, 2010 [21]) in terms of mass conservation, speed-up, and efficiency, and also extends its capabilities for implementation on non-uniform grids. The presented model consists of a phase-field lattice Boltzmann equation (LBE) for tracking the interface between different fluids and a pressure-evolution LBM for recovering the hydrodynamic properties. In addition to the mass conservation property and the simplicity of the algorithm, the advantages of the current phase-field LBE are that it is an order of magnitude faster than the previous interface tracking LBE proposed by Lee and Liu (2010) [21] and it requires less memory for data storage. Meanwhile, the pressure-evolution LBM is equipped with a multi-relaxation-time (MRT) collision operator to facilitate the attainment of small relaxation rates, thereby allowing simulation of multiphase flows at higher Reynolds numbers. Additionally, we reformulate the presented MRT-LBM on nonuniform grids within an adaptive mesh refinement (AMR) framework. Various benchmark studies such as a rising bubble and a falling drop under buoyancy, droplet splashing on a wet surface, and droplet coalescence onto a fluid interface are conducted to examine the accuracy and versatility of the proposed AMR-LBM. The proposed model is further validated by comparing the results with other LB models on uniform grids. A factor of about 20 in savings of computational resources is achieved by using the proposed AMR-LBM. As a more demanding application, the Kelvin–Helmholtz instability (KHI) of a shear-layer flow is investigated for both density-matched and density-stratified binary fluids. The KHI results of the density-matched fluids are shown to be in good agreement with the benchmark AMR results based on the sharp-interface approach.
When a density contrast between the two fluids exists, a

  19. A mass-conserving lattice Boltzmann method with dynamic grid refinement for immiscible two-phase flows

    International Nuclear Information System (INIS)

    Fakhari, Abbas; Geier, Martin; Lee, Taehun

    2016-01-01

    A mass-conserving lattice Boltzmann method (LBM) for multiphase flows is presented in this paper. The proposed LBM improves a previous model (Lee and Liu, 2010 [21]) in terms of mass conservation, speed-up, and efficiency, and also extends its capabilities for implementation on non-uniform grids. The presented model consists of a phase-field lattice Boltzmann equation (LBE) for tracking the interface between different fluids and a pressure-evolution LBM for recovering the hydrodynamic properties. In addition to the mass conservation property and the simplicity of the algorithm, the advantages of the current phase-field LBE are that it is an order of magnitude faster than the previous interface tracking LBE proposed by Lee and Liu (2010) [21] and it requires less memory for data storage. Meanwhile, the pressure-evolution LBM is equipped with a multi-relaxation-time (MRT) collision operator to facilitate the attainment of small relaxation rates, thereby allowing simulation of multiphase flows at higher Reynolds numbers. Additionally, we reformulate the presented MRT-LBM on nonuniform grids within an adaptive mesh refinement (AMR) framework. Various benchmark studies such as a rising bubble and a falling drop under buoyancy, droplet splashing on a wet surface, and droplet coalescence onto a fluid interface are conducted to examine the accuracy and versatility of the proposed AMR-LBM. The proposed model is further validated by comparing the results with other LB models on uniform grids. A factor of about 20 in savings of computational resources is achieved by using the proposed AMR-LBM. As a more demanding application, the Kelvin–Helmholtz instability (KHI) of a shear-layer flow is investigated for both density-matched and density-stratified binary fluids. The KHI results of the density-matched fluids are shown to be in good agreement with the benchmark AMR results based on the sharp-interface approach.
When a density contrast between the two fluids exists, a

  20. The Influence of Slowly Varying Mass on Severity of Dynamics Nonlinearity of Bearing-Rotor Systems with Pedestal Looseness

    Directory of Open Access Journals (Sweden)

    Mian Jiang

    2018-01-01

    A nonlinearity measure is proposed to investigate the influence of slowly varying mass on the severity of dynamics nonlinearity in bearing-rotor systems with pedestal looseness. A nonlinear mathematical model including the effect of a slowly varying disk mass is developed for a bearing-rotor system with pedestal looseness. The variation of the equivalent disk mass is described by a cosine function, and the amplitude coefficient is used as a control parameter. The nonlinearity measure is then employed to quantify the severity of dynamics nonlinearity of the bearing-rotor system. As the looseness clearance increases, curves showing the trend of the nonlinearity degree are plotted for each amplitude coefficient of the mass variation. It can be concluded that larger amplitude coefficients of the disk mass variation have a greater influence on the severity of dynamics nonlinearity and on the generation of chaotic behavior in rotor systems with pedestal looseness.
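    One way to read "nonlinearity measure" is as the normalized gap between a system's nonlinear response and its best linear counterpart. The sketch below uses a cubic stiffness term and made-up parameter values as a stand-in for the looseness nonlinearity; it is not the paper's bearing-rotor model:

```python
import math

def simulate(alpha, k3, m0=1.0, k=1.0, Omega=0.05, dt=1e-3, T=50.0):
    """Semi-implicit Euler for m(t)*x'' + k*x + k3*x^3 = 0,
    with slowly varying mass m(t) = m0*(1 + alpha*cos(Omega*t))."""
    x, v, t, out = 1.0, 0.0, 0.0, []
    while t < T:
        m = m0 * (1.0 + alpha * math.cos(Omega * t))
        a = -(k * x + k3 * x ** 3) / m
        v += a * dt
        x += v * dt
        t += dt
        out.append(x)
    return out

def nonlinearity_measure(alpha, k3):
    """Normalized L2 gap between the nonlinear response and the
    corresponding linear (k3 = 0) response -- zero iff purely linear."""
    y_nl = simulate(alpha, k3)
    y_lin = simulate(alpha, 0.0)
    gap = math.sqrt(sum((a - b) ** 2 for a, b in zip(y_nl, y_lin)))
    norm = math.sqrt(sum(a * a for a in y_nl))
    return gap / norm
```

    Sweeping `alpha` (the amplitude coefficient of the mass variation) and plotting `nonlinearity_measure` reproduces the kind of trend curves the abstract describes.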

  1. General squark flavour mixing: constraints, phenomenology and benchmarks

    CERN Document Server

    De Causmaecker, Karen; Herrmann, Bjoern; Mahmoudi, Farvah; O'Leary, Ben; Porod, Werner; Sekmen, Sezen; Strobbe, Nadja

    2015-11-19

    We present an extensive study of non-minimal flavour violation in the squark sector in the framework of the Minimal Supersymmetric Standard Model. We investigate the effects of multiple non-vanishing flavour-violating elements in the squark mass matrices by means of a Markov Chain Monte Carlo scanning technique and identify parameter combinations that are favoured by both current data and theoretical constraints. We then detail the resulting distributions of the flavour-conserving and flavour-violating model parameters. Based on this analysis, we propose a set of benchmark scenarios relevant for future studies of non-minimal flavour violation in the Minimal Supersymmetric Standard Model.

  2. RB reactor benchmark cores

    International Nuclear Information System (INIS)

    Pesic, M.

    1998-01-01

    A selected set of the RB reactor benchmark cores is presented in this paper. The first results of the validation of the well-known Monte Carlo code MCNP and its accompanying neutron cross-section libraries are given. They confirm the idea behind the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in a future edition of the recent OECD/NEA project: International Handbook of Evaluated Criticality Safety Benchmark Experiments. (author)

  3. The influence of global benchmark oil prices on the regional oil spot market in multi-period evolution

    International Nuclear Information System (INIS)

    Jiang, Meihui; An, Haizhong; Jia, Xiaoliang; Sun, Xiaoqi

    2017-01-01

    Crude benchmark oil prices play a crucial role in energy policy and investment management. Previous research confined itself to studying the static, uncertain, short- or long-term relationship between global benchmark oil prices, ignoring the time-varying, quantitative, dynamic nature of the relationship during various stages of oil price volatility. This paper proposes a novel approach combining grey relation analysis, optimization wavelet analysis, and Bayesian network modeling to explore the multi-period evolution of the dynamic relationship between global benchmark oil prices and regional oil spot price. We analyze the evolution of the most significant decision-making risk periods, as well as the combined strategy-making reference oil prices and the corresponding periods during various stages of volatility. Furthermore, we determine that the network evolution of the quantitative lead/lag relationship between different influences of global benchmark oil prices shows a multi-period evolution phenomenon. For policy makers and market investors, our combined model can provide decision-making periods with the lowest expected risk and decision-making target reference oil prices and corresponding weights for strategy adjustment and market arbitrage. This study provides further information regarding period weights of target reference oil prices, facilitating efforts to perform multi-agent energy policy and intertemporal market arbitrage. - Highlights: • Multi-period evolution of the influence of different oil prices is discovered. • We combined grey relation analysis, optimization wavelet and Bayesian network. • The intensity of volatility, synchronization, and lead/lag effects are analyzed. • The target reference oil prices and corresponding period weights are determined.
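    The grey relation analysis step mentioned above can be sketched with Deng's classic formulation. The sequences and the distinguishing coefficient ζ = 0.5 below are illustrative, not the paper's oil-price data:

```python
def grey_relational_grades(reference, series, zeta=0.5):
    """Deng's grey relational analysis (a sketch).

    Sequences are normalized by their first element (assumed nonzero),
    per-point grey relational coefficients are computed, and then
    averaged into a grade in (0, 1]; a grade of 1.0 means the series
    tracks the reference exactly. Assumes not all series are identical.
    """
    def norm(s):
        return [v / s[0] for v in s]

    r = norm(reference)
    deltas = [[abs(a - b) for a, b in zip(r, norm(s))] for s in series]
    dmin = min(min(d) for d in deltas)
    dmax = max(max(d) for d in deltas)  # > 0 by the assumption above
    return [sum((dmin + zeta * dmax) / (dk + zeta * dmax) for dk in d) / len(d)
            for d in deltas]
```

    Ranking regional spot-price series by their grade against a global benchmark series is the kind of screening step that grey relation analysis provides before the wavelet and Bayesian-network stages.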

  4. Benchmarking specialty hospitals, a scoping review on theory and practice.

    Science.gov (United States)

    Wind, A; van Harten, W H

    2017-04-04

    Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics. We searched PubMed and EMBASE for articles published in English in the last 10 years. Eligible articles described a project stating benchmarking as its objective and involving a specialty hospital or specific patient category; or those dealing with the methodology or evaluation of benchmarking. Of 1,817 articles identified in total, 24 were included in the study. Articles were categorized into: pathway benchmarking, institutional benchmarking, articles on benchmarking methodology or evaluation, and benchmarking using a patient registry. There was a large degree of variability: (1) study designs were mostly descriptive and retrospective; (2) not all studies generated and showed data in sufficient detail; and (3) studies varied in whether a benchmarking model was merely described or whether quality improvement as a consequence of the benchmark was reported upon. Most of the studies that described a benchmark model described the use of benchmarking partners from the same industry category, sometimes from all over the world. Benchmarking seems to be more developed in eye hospitals, emergency departments and oncology specialty hospitals. Some studies showed promising improvement effects. However, the majority of the articles lacked a structured design, and did not report on benchmark outcomes. In order to evaluate the effectiveness of benchmarking to improve quality in specialty hospitals, robust and structured designs are needed, including a follow-up to check whether the benchmark study has led to improvements.

  5. Development of a California commercial building benchmarking database

    International Nuclear Information System (INIS)

    Kinney, Satkartar; Piette, Mary Ann

    2002-01-01

    Building energy benchmarking is a useful starting point for commercial building owners and operators to target energy savings opportunities. There are a number of tools and methods for benchmarking energy use. Benchmarking based on regional data can provide more relevant information for California buildings than national tools such as Energy Star. This paper discusses issues related to benchmarking commercial building energy use and the development of Cal-Arch, a building energy benchmarking database for California. Currently Cal-Arch uses existing survey data from California's Commercial End Use Survey (CEUS), a largely underutilized wealth of information collected by California's major utilities. DOE's Commercial Building Energy Consumption Survey (CBECS) is used by a similar tool, Arch, and by a number of other benchmarking tools. Future versions of Arch/Cal-Arch will utilize additional data sources, including modeled data and individual buildings, to expand the database.

  6. Third post-Newtonian dynamics of compact binaries: equations of motion in the centre-of-mass frame

    CERN Document Server

    Blanchet, L

    2003-01-01

    The equations of motion of compact binary systems and their associated Lagrangian formulation have been derived in previous works at the third post-Newtonian (3PN) approximation of general relativity in harmonic coordinates. In the present work, we investigate the binary's relative dynamics in the centre-of-mass frame (centre of mass located at the origin of the coordinates). We obtain the 3PN-accurate expressions of the centre-of-mass positions and equations of the relative binary motion. We show that the equations derive from a Lagrangian (neglecting the radiation reaction), from which we deduce the conserved centre-of-mass energy and angular momentum at the 3PN order. The harmonic-coordinates centre-of-mass Lagrangian is equivalent, via a contact transformation of the particles' variables, to the centre-of-mass Hamiltonian in ADM coordinates that is known from the post-Newtonian ADM-Hamiltonian formalism. As an application we investigate the dynamical stability of circular binary orbits at the 3PN order.
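    Schematically, the centre-of-mass equations of motion at this order take the standard post-Newtonian form (a structural sketch only; the explicit coefficients occupy several pages of the paper):

```latex
\frac{dv^i}{dt} = -\frac{Gm}{r^2}\Big[(1+\mathcal{A})\,n^i + \mathcal{B}\,v^i\Big]
+ \mathcal{O}\!\left(\frac{1}{c^8}\right),
\qquad
\mathcal{A} = \frac{\mathcal{A}_{1\mathrm{PN}}}{c^2}
+ \frac{\mathcal{A}_{2\mathrm{PN}}}{c^4}
+ \frac{\mathcal{A}_{2.5\mathrm{PN}}}{c^5}
+ \frac{\mathcal{A}_{3\mathrm{PN}}}{c^6},
```

    and similarly for $\mathcal{B}$; here $m$ is the total mass, $r$ the separation, $n^i = x^i/r$ the unit separation vector, and $v^i$ the relative velocity. The odd 2.5PN terms encode radiation reaction, which is why they are dropped when constructing the conservative Lagrangian mentioned above.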

  7. A Method for Analyzing the Dynamic Response of a Structural System with Variable Mass, Damping and Stiffness

    Directory of Open Access Journals (Sweden)

    Mike D.R. Zhang

    2001-01-01

    In this paper, a method for analyzing the dynamic response of a structural system with variable mass, damping, and stiffness is presented. The dynamic equations of the structural system with variable mass and stiffness are derived according to the whole working process of a bridge bucket unloader. At the end of the paper, an engineering numerical example is given.
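    A minimal numerical sketch of such a system: a single degree of freedom with time-varying m(t), c(t), and k(t), integrated with semi-implicit Euler. The bucket-unloader-style coefficients in the example run are hypothetical placeholders, not the paper's model:

```python
def simulate_mck(m_fun, c_fun, k_fun, f_fun, x0=0.0, v0=0.0, dt=1e-3, T=10.0):
    """Semi-implicit Euler for m(t)*x'' + c(t)*x' + k(t)*x = f(t)."""
    x, v, t, xs = x0, v0, 0.0, []
    while t < T:
        a = (f_fun(t) - c_fun(t) * v - k_fun(t) * x) / m_fun(t)
        v += a * dt
        x += v * dt
        t += dt
        xs.append(x)
    return xs

# Hypothetical run: mass drains linearly while damping and stiffness
# stay constant (all numbers made up for illustration).
response = simulate_mck(lambda t: 100.0 - 5.0 * t,  # m(t), positive on [0, 10]
                        lambda t: 20.0,             # c(t)
                        lambda t: 4.0e3,            # k(t)
                        lambda t: 0.0,              # f(t), unforced
                        x0=0.05)
```

    Because the coefficients enter the update at each step through `m_fun`, `c_fun`, and `k_fun`, the same loop covers constant-coefficient and fully time-varying cases.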

  8. How benchmarking can improve patient nutrition.

    Science.gov (United States)

    Ellis, Jane

    Benchmarking is a tool that originated in business to enable organisations to compare their services with industry-wide best practice. Early last year the Department of Health published The Essence of Care, a benchmarking toolkit adapted for use in health care. It focuses on eight elements of care that are crucial to patients' experiences. Nurses and other health care professionals at a London NHS trust have begun a trust-wide benchmarking project. The aim is to improve patients' experiences of health care by sharing and comparing information, and by identifying examples of good practice and areas for improvement. The project began with two of the eight elements of The Essence of Care, with the intention of covering the rest later. This article describes the benchmarking process for nutrition and some of the consequent improvements in care.

  9. XWeB: The XML Warehouse Benchmark

    Science.gov (United States)

    Mahboubi, Hadj; Darmont, Jérôme

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision-support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  10. IAEA sodium void reactivity benchmark calculations

    International Nuclear Information System (INIS)

    Hill, R.N.; Finck, P.J.

    1992-01-01

    In this paper, the IAEA 1992 "Benchmark Calculation of Sodium Void Reactivity Effect in Fast Reactor Core" problem is evaluated. The proposed design is a large, axially heterogeneous, oxide-fueled fast reactor, as described in Section 2; the core utilizes a sodium plenum above the core to enhance leakage effects. The calculation methods used in this benchmark evaluation are described in Section 3. In Section 4, the calculated core performance results for the benchmark reactor model are presented; and in Section 5, the influence of steel and interstitial sodium heterogeneity effects is estimated

  11. Benchmark Imagery FY11 Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pope, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-14

    This report details the work performed in FY11 under project LL11-GS-PD06, “Benchmark Imagery for Assessing Geospatial Semantic Extraction Algorithms.” The original LCP for the Benchmark Imagery project called for creating a set of benchmark imagery for verifying and validating algorithms that extract semantic content from imagery. More specifically, the first year was slated to deliver real imagery that had been annotated, the second year to deliver real imagery that had composited features, and the final year was to deliver synthetic imagery modeled after the real imagery.

  12. Creating a benchmark of vertical axis wind turbines in dynamic stall for validating numerical models

    DEFF Research Database (Denmark)

    Castelein, D.; Ragni, D.; Tescione, G.

    2015-01-01

    An experimental campaign using the two-component Particle Image Velocimetry (2C-PIV) technique has been conducted on an H-type Vertical Axis Wind Turbine (VAWT) to create a benchmark for validating and comparing numerical models. The turbine is operated at tip speed ratios (TSR) of 4.5 and 2, at an average chord

  13. Light hadrons from N_f = 2+1+1 dynamical twisted mass fermions

    Energy Technology Data Exchange (ETDEWEB)

    Baron, R. [CEA, Centre de Saclay, Gif-sur-Yvette (France). IRFU/Service de Physique Nucleaire; Blossier, B.; Boucaud, P. [Paris 11 Univ., Orsay (FR). Lab. de Physique Theorique] (and others)

    2011-01-15

    We present results of lattice QCD simulations with mass-degenerate up and down and mass-split strange and charm (N_f = 2+1+1) dynamical quarks using Wilson twisted mass fermions at maximal twist. The tuning of the strange and charm quark masses is performed at three values of the lattice spacing, a ≈ 0.06 fm, a ≈ 0.08 fm and a ≈ 0.09 fm, with lattice sizes ranging from L ≈ 1.9 fm to L ≈ 3.9 fm. We perform a preliminary study of SU(2) chiral perturbation theory by combining our lattice data from these three values of the lattice spacing. (orig.)

  14. Benchmarking and Hardware-In-The-Loop Operation of a ...

    Science.gov (United States)

    Engine performance evaluation in support of LD MTE. EPA used elements of its ALPHA model to apply hardware-in-the-loop (HIL) controls to the SKYACTIV engine test setup to better understand how the engine would operate in a chassis test when combined with future leading-edge technologies: an advanced high-efficiency transmission, reduced mass, and reduced roadload. Predict future vehicle performance with an Atkinson engine. As part of its technology assessment for the upcoming midterm evaluation of the 2017-2025 LD vehicle GHG emissions regulation, EPA has been benchmarking engines and transmissions to generate inputs for use in its ALPHA model.

  15. Review for session K - benchmarks

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1980-01-01

    Eight of the papers to be considered in Session K are directly concerned, at least in part, with the Pool Critical Assembly (P.C.A.) benchmark at Oak Ridge. The remaining seven papers in this session, the subject of this review, are concerned with a variety of topics related to the general theme of Benchmarks and will be considered individually

  16. Topological mass of magnetic Skyrmions probed by ultrafast dynamic imaging

    International Nuclear Information System (INIS)

    Buettner, Felix

    2013-01-01

    In this thesis, we investigate the GHz dynamics of skyrmionic spin structures by means of pump-probe dynamic imaging to determine the equation of motion that governs the behavior of these technologically relevant spin structures. To achieve this goal, we first designed and optimized a perpendicular magnetic anisotropy CoB/Pt multilayer material for low magnetic pinning, as required for ultrafast pump-probe imaging experiments. Second, we developed an integrated sample design for X-ray holography capable of tracking relative magnetic positional changes down to 3 nm spatial resolution. These advances enabled us to image the trajectory of a single magnetic Skyrmion. We find that the motion is comprised of two gyrotropic modes, one clockwise and one counterclockwise. The existence of two modes shows that Skyrmions are massive quasiparticles. From their derived frequencies we find an inertial mass for the Skyrmion which is a factor of five larger than expected based on existing models for inertia in magnetism. Our results demonstrate that the mass of Skyrmions is based on a novel mechanism emerging from their confined nature, which is a direct consequence of their topology.
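    The inference from two gyrotropic modes to an inertial mass can be seen in a minimal "massive Thiele" model (a textbook-style sketch; sign conventions vary and damping is omitted here):

```latex
m\,\ddot{\mathbf{R}} + G\,\hat{\mathbf{z}}\times\dot{\mathbf{R}} + k\,\mathbf{R} = \mathbf{0}
\;\;\Longrightarrow\;\;
m\,\omega^2 + G\,\omega - k = 0,
\qquad
\omega_{\pm} = \frac{-G \pm \sqrt{G^2 + 4mk}}{2m},
```

    where $\mathbf{R}$ is the Skyrmion core position, $G$ the gyrocoupling constant set by the topological charge, and $k$ the confining stiffness. The quadratic has two roots of opposite sign, i.e. one clockwise and one counterclockwise gyration, and since $\omega_+ + \omega_- = -G/m$, measuring both frequencies fixes the inertial mass $m$, which is the logic behind the measurement described above.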

  17. Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners

    Directory of Open Access Journals (Sweden)

    Luštický Martin

    2012-03-01

    Tourism development has an irreplaceable role in the regional policy of almost all countries, owing to its undeniable benefits for the local population in the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and consequently enter a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and of the final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field where benchmarking methods are widely used, such approaches can be successfully applied. The paper focuses on a key phase of the benchmarking process: the search for suitable benchmarking partners. The partners are selected to meet general requirements that ensure the quality of strategies. Following from this, specific characteristics are developed according to the SMART approach. The paper tests this procedure with an expert evaluation of eight selected regional tourism strategies from regions in the Czech Republic, Slovakia and Great Britain. In this way it validates the selected criteria in an international setting, making it possible to identify strengths and weaknesses of the selected strategies and, at the same time, to discover suitable benchmarking partners.

  18. Statistical benchmarking in utility regulation: Role, standards and methods

    International Nuclear Information System (INIS)

    Newton Lowry, Mark; Getachew, Lullit

    2009-01-01

    Statistical benchmarking is being used with increasing frequency around the world in utility rate regulation. We discuss how and where benchmarking is in use for this purpose, along with the pros and cons of regulatory benchmarking. We then discuss alternative performance standards and benchmarking methods in regulatory applications, and use these to propose guidelines for the appropriate use of benchmarking in the rate setting process. The standards, which we term the competitive market and frontier paradigms, have a bearing on method selection. These, along with regulatory experience, suggest that benchmarking can be used either for prudence review in regulation or to establish rates or rate setting mechanisms directly.

  19. Development of a California commercial building benchmarking database

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, Satkartar; Piette, Mary Ann

    2002-05-17

    Building energy benchmarking is a useful starting point for commercial building owners and operators to target energy savings opportunities. There are a number of tools and methods for benchmarking energy use. Benchmarking based on regional data can provide more relevant information for California buildings than national tools such as Energy Star. This paper discusses issues related to benchmarking commercial building energy use and the development of Cal-Arch, a building energy benchmarking database for California. Currently Cal-Arch uses existing survey data from California's Commercial End Use Survey (CEUS), a largely underutilized wealth of information collected by California's major utilities. DOE's Commercial Building Energy Consumption Survey (CBECS) is used by a similar tool, Arch, and by a number of other benchmarking tools. Future versions of Arch/Cal-Arch will incorporate additional data sources, including modeled data and individual buildings, to expand the database.

  20. Dynamic Modeling Accuracy Dependence on Errors in Sensor Measurements, Mass Properties, and Aircraft Geometry

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2013-01-01

    A nonlinear simulation of the NASA Generic Transport Model was used to investigate the effects of errors in sensor measurements, mass properties, and aircraft geometry on the accuracy of dynamic models identified from flight data. Measurements from a typical system identification maneuver were systematically and progressively deteriorated and then used to estimate stability and control derivatives within a Monte Carlo analysis. Based on the results, recommendations were provided for maximum allowable errors in sensor measurements, mass properties, and aircraft geometry to achieve desired levels of dynamic modeling accuracy. Results using other flight conditions, parameter estimation methods, and a full-scale F-16 nonlinear aircraft simulation were compared with these recommendations.

  1. Sfermion mass degeneracy, superconformal dynamics, and supersymmetric grand unified theories

    International Nuclear Information System (INIS)

    Kobayashi, Tatsuo; Noguchi, Tatsuya; Nakano, Hiroaki; Terao, Haruhiko

    2002-01-01

    We discuss issues in a scenario where hierarchical Yukawa couplings are generated through the strong dynamics of superconformal field theories (SCFTs). Independently of the mediation mechanism of supersymmetry breaking, the infrared convergence property of SCFTs can provide an interesting solution to the supersymmetric flavor problem; sfermion masses are suppressed around the decoupling scale of SCFTs and eventually become degenerate to some degree, thanks to family-independent radiative corrections governed by the gaugino masses of the minimal supersymmetric standard model (MSSM). We discuss under what conditions the degeneracy of the sfermion mass can be estimated in a simple manner. We also discuss the constraints from lepton flavor violations. We then explicitly study sfermion mass degeneracy within the framework of grand unified theories coupled to SCFTs. It is found that the degeneracy for right-handed sleptons becomes worse in the conventional SU(5) model than in the MSSM. On the other hand, in the flipped SU(5)xU(1) model, each right-handed lepton is still an SU(5) singlet, whereas the B-ino mass M_1 is determined by two independent gaugino masses of SU(5)xU(1). These two properties enable us to have an improved degeneracy for the right-handed sleptons. We also speculate on how further improvement can be obtained in the SCFT approach.

  2. CONNECTION BETWEEN DYNAMICALLY DERIVED INITIAL MASS FUNCTION NORMALIZATION AND STELLAR POPULATION PARAMETERS

    Energy Technology Data Exchange (ETDEWEB)

    McDermid, Richard M. [Department of Physics and Astronomy, Macquarie University, Sydney NSW 2109 (Australia); Cappellari, Michele; Bayet, Estelle; Bureau, Martin; Davies, Roger L. [Sub-Department of Astrophysics, Department of Physics, University of Oxford, Denys Wilkinson Building, Keble Road, Oxford, OX1 3RH (United Kingdom); Alatalo, Katherine [Infrared Processing and Analysis Center, California Institute of Technology, Pasadena, CA 91125 (United States); Blitz, Leo [Department of Astronomy, Campbell Hall, University of California, Berkeley, CA 94720 (United States); Bois, Maxime [Observatoire de Paris, LERMA and CNRS, 61 Av. de l' Observatoire, F-75014 Paris (France); Bournaud, Frédéric; Duc, Pierre-Alain [Laboratoire AIM Paris-Saclay, CEA/IRFU/SAp- CNRS-Université Paris Diderot, F-91191 Gif-sur-Yvette Cedex (France); Crocker, Alison F. [Ritter Astrophysical Observatory, University of Toledo, Toledo, OH 43606 (United States); Davis, Timothy A.; De Zeeuw, P. T.; Emsellem, Eric; Kuntschner, Harald [European Southern Observatory, Karl-Schwarzschild-Str. 2, D-85748 Garching (Germany); Khochfar, Sadegh [Institute for Astronomy, University of Edinburgh, Royal Observatory, Edinburgh, EH9 3HJ (United Kingdom); Krajnović, Davor [Leibniz-Institut für Astrophysik Potsdam (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany); Morganti, Raffaella; Oosterloo, Tom [Netherlands Institute for Radio Astronomy (ASTRON), Postbus 2, 7990 AA Dwingeloo (Netherlands); Naab, Thorsten, E-mail: richard.mcdermid@mq.edu.au [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, D-85741 Garching (Germany); and others

    2014-09-10

    We report on empirical trends between the dynamically determined stellar initial mass function (IMF) and stellar population properties for a complete, volume-limited sample of 260 early-type galaxies from the ATLAS3D project. We study trends between our dynamically derived IMF normalization α_dyn ≡ (M/L)_stars/(M/L)_Salp and absorption line strengths, and interpret these via single stellar population-equivalent ages, abundance ratios (measured as [α/Fe]), and total metallicity, [Z/H]. We find that old and alpha-enhanced galaxies tend to have on average heavier (Salpeter-like) mass normalization of the IMF, but stellar population does not appear to be a good predictor of the IMF, with a large range of α_dyn at a given population parameter. As a result, we find weak α_dyn–[α/Fe] and α_dyn–Age correlations and no significant α_dyn–[Z/H] correlation. The observed trends appear significantly weaker than those reported in studies that measure the IMF normalization via the low-mass star demographics inferred through stellar spectral analysis.

  3. Evidence of ghost suppression in gluon mass scale dynamics

    Science.gov (United States)

    Aguilar, A. C.; Binosi, D.; Figueiredo, C. T.; Papavassiliou, J.

    2018-03-01

    In this work we study the impact that the ghost sector of pure Yang-Mills theories may have on the generation of a dynamical gauge boson mass scale, which hinges on the appearance of massless poles in the fundamental vertices of the theory, and the subsequent realization of the well-known Schwinger mechanism. The process responsible for the formation of such structures is itself dynamical in nature, and is governed by a set of Bethe-Salpeter type integral equations. While in previous studies the presence of massless poles was assumed to be exclusively associated with the background-gauge three-gluon vertex, in the present analysis we allow them to appear also in the corresponding ghost-gluon vertex. The full analysis of the resulting Bethe-Salpeter system reveals that the contributions of the poles associated with the ghost-gluon vertex are particularly suppressed, their sole discernible effect being a slight modification in the running of the gluon mass scale, for momenta larger than a few GeV. In addition, we examine the behavior of the (background-gauge) ghost-gluon vertex in the limit of vanishing ghost momentum, and derive the corresponding version of Taylor's theorem. These considerations, together with a suitable Ansatz, permit the full reconstruction of the pole sector of the two vertices involved.

  4. 40 CFR 141.172 - Disinfection profiling and benchmarking.

    Science.gov (United States)

    2010-07-01

    ... benchmarking. 141.172 Section 141.172 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... Disinfection-Systems Serving 10,000 or More People § 141.172 Disinfection profiling and benchmarking. (a... sanitary surveys conducted by the State. (c) Disinfection benchmarking. (1) Any system required to develop...

  5. Raising Quality and Achievement. A College Guide to Benchmarking.

    Science.gov (United States)

    Owen, Jane

    This booklet introduces the principles and practices of benchmarking as a way of raising quality and achievement at further education colleges in Britain. Section 1 defines the concept of benchmarking. Section 2 explains what benchmarking is not and the steps that should be taken before benchmarking is initiated. The following aspects and…

  6. Prismatic Core Coupled Transient Benchmark

    International Nuclear Information System (INIS)

    Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.

    2011-01-01

    The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated in the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art compared to LWR reactor technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluations of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, to compare methods and tools in core simulation and thermal hydraulics analysis with a specific focus on transient events. The benchmark working group is currently seeking OECD/NEA sponsorship. This benchmark is being pursued and is heavily based on the success of the PBMR-400 exercise.

  7. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic follows from the meaning of the firm efficiency concept: firm efficiency is revealed performance, i.e. how well the firm performs in its actual market environment, given the basic characteristics of the firm and its market that are expected to drive profitability (firm size, market power, etc.). This complex and relative performance could be due to such things as product innovation, management quality and work organization; other factors can also play a role even if they are not directly observed by the researcher. The critical need for management to continuously improve their firm's efficiency and effectiveness, and the need for managers to know the success factors and competitiveness determinants, determine which performance measures are most critical to their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking of firm-level performance are critical, interdependent activities. Firm-level variables used to infer performance are often interdependent for operational reasons; hence, managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other profitability measures. It uses econometric models to describe performance and then proposes a method to forecast and benchmark it.

  8. Benchmarking of refinery emissions performance : Executive summary

    International Nuclear Information System (INIS)

    2003-07-01

    This study was undertaken to collect emissions performance data for Canadian and comparable American refineries. The objective was to examine parameters that affect refinery air emissions performance and develop methods or correlations to normalize emissions performance. Another objective was to correlate and compare the performance of Canadian refineries to comparable American refineries. For the purpose of this study, benchmarking involved the determination of levels of emission performance that are being achieved for generic groups of facilities. A total of 20 facilities were included in the benchmarking analysis, and 74 American refinery emission correlations were developed. The recommended benchmarks, and the application of those correlations for comparison between Canadian and American refinery performance, were discussed. The benchmarks were: sulfur oxides, nitrogen oxides, carbon monoxide, particulate, volatile organic compounds, ammonia and benzene. For each refinery in Canada, benchmark emissions were developed. Several factors can explain differences in Canadian and American refinery emission performance. 4 tabs., 7 figs

  9. How to Advance TPC Benchmarks with Dependability Aspects

    Science.gov (United States)

    Almeida, Raquel; Poess, Meikel; Nambiar, Raghunath; Patil, Indira; Vieira, Marco

    Transactional systems are the core of the information systems of most organizations. Although there is general acknowledgement that failures in these systems often entail significant impact both on the proceeds and reputation of companies, the benchmarks developed and managed by the Transaction Processing Performance Council (TPC) still maintain their focus on reporting bare performance. Each TPC benchmark has to pass a list of dependability-related tests (to verify ACID properties), but not all benchmarks require measuring their performance. While TPC-E measures the recovery time after some system failures, TPC-H and TPC-C only require functional correctness of such recovery. Consequently, systems used in TPC benchmarks are tuned mostly for performance. In this paper we argue that today's systems should be tuned for a more comprehensive suite of dependability tests, and that a dependability metric should be part of TPC benchmark publications. The paper discusses WHY and HOW this can be achieved. Two approaches are introduced and discussed: augmenting each TPC benchmark in a customized way, by extending each specification individually; and pursuing a more unified approach, defining a generic specification that could be adjoined to any TPC benchmark.

  10. ZZ ECN-BUBEBO, ECN-Petten Burnup Benchmark Book, Inventories, Afterheat

    International Nuclear Information System (INIS)

    Kloosterman, Jan Leen

    1999-01-01

    Description of program or function: Contains experimental benchmarks which can be used for the validation of burnup code systems and accompanying data libraries. Although the benchmarks presented here are thoroughly described in the literature, it is in many cases not straightforward to retrieve unambiguously the correct input data and corresponding results from the benchmark descriptions. Furthermore, results which can easily be measured are sometimes difficult to calculate because of conversions that must be made. Therefore, emphasis has been placed on clarifying the input of the benchmarks and on presenting the benchmark results in such a way that they can easily be calculated and compared. For more thorough descriptions of the benchmarks themselves, the literature referred to here should be consulted. This benchmark book is divided into 11 chapters/files containing the following in text and tabular form: chapter 1: Introduction; chapter 2: Burnup Credit Criticality Benchmark Phase 1-B; chapter 3: Yankee-Rowe Core V Fuel Inventory Study; chapter 4: H.B. Robinson Unit 2 Fuel Inventory Study; chapter 5: Turkey Point Unit 3 Fuel Inventory Study; chapter 6: Turkey Point Unit 3 Afterheat Power Study; chapter 7: Dickens Benchmark on Fission Product Energy Release of U-235; chapter 8: Dickens Benchmark on Fission Product Energy Release of Pu-239; chapter 9: Yarnell Benchmark on Decay Heat Measurements of U-233; chapter 10: Yarnell Benchmark on Decay Heat Measurements of U-235; chapter 11: Yarnell Benchmark on Decay Heat Measurements of Pu-239

  11. Analysis of Benchmark 2 results

    International Nuclear Information System (INIS)

    Bacha, F.; Lefievre, B.; Maillard, J.; Silva, J.

    1994-01-01

    The code GEANT315 has been compared to different codes in two benchmarks. We analyze its performance through our results, especially in the thick-target case. In spite of gaps in nucleus-nucleus interaction theories at intermediate energies, benchmarks make possible improvements of the physical models used in our codes. Thereafter, a scheme for a radioactive waste burning system is studied. (authors). 4 refs., 7 figs., 1 tab

  12. Benchmarking for Best Practice

    CERN Document Server

    Zairi, Mohamed

    1998-01-01

    Benchmarking for Best Practice uses up-to-the-minute case studies of individual companies and industry-wide quality schemes to show how and why implementation has succeeded. For any practitioner wanting to establish best practice in a wide variety of business areas, this book makes essential reading. It is also an ideal textbook on the applications of TQM, since it describes concepts, covers definitions and illustrates the applications with first-hand examples. Professor Mohamed Zairi is an international expert and leading figure in the field of benchmarking. His pioneering work in this area…

  13. HPCG Benchmark Technical Specification

    Energy Technology Data Exchange (ETDEWEB)

    Heroux, Michael Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dongarra, Jack [Univ. of Tennessee, Knoxville, TN (United States); Luszczek, Piotr [Univ. of Tennessee, Knoxville, TN (United States)

    2013-10-01

    The High Performance Conjugate Gradient (HPCG) benchmark [cite SNL, UTK reports] is a tool for ranking computer systems based on a simple additive Schwarz, symmetric Gauss-Seidel preconditioned conjugate gradient solver. HPCG is similar to the High Performance Linpack (HPL), or Top 500, benchmark [1] in its purpose, but HPCG is intended to better represent how today’s applications perform. In this paper we describe the technical details of HPCG: how it is designed and implemented, what code transformations are permitted and how to interpret and report results.
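    The preconditioned conjugate gradient iteration that HPCG times can be sketched in miniature. The following is a simplified illustration, not the HPCG reference code: it runs PCG on a small dense SPD system with one symmetric Gauss-Seidel sweep as the preconditioner, whereas the actual benchmark works on a sparse 27-point stencil problem and partitions it with additive Schwarz across processes.

```python
import numpy as np

def sym_gauss_seidel(A, r):
    """One symmetric Gauss-Seidel sweep as preconditioner: z = M^{-1} r,
    with M = (D+L) D^{-1} (D+U) for A = L + D + U."""
    Lo = np.tril(A)                  # lower triangle incl. diagonal
    Up = np.triu(A)                  # upper triangle incl. diagonal
    D = np.diag(np.diag(A))
    y = np.linalg.solve(Lo, r)       # forward sweep
    return np.linalg.solve(Up, D @ y)  # backward sweep

def pcg(A, b, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradient for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = sym_gauss_seidel(A, r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = sym_gauss_seidel(A, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Toy SPD system: 1D Laplacian (tridiagonal), a crude stand-in for HPCG's stencil
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = pcg(A, b)
assert np.linalg.norm(A @ x - b) < 1e-8
```

For symmetric A the sweep above yields a symmetric positive definite preconditioner, which is what CG requires for guaranteed convergence.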

  14. [Do you mean benchmarking?].

    Science.gov (United States)

    Bonnet, F; Solignac, S; Marty, J

    2008-03-01

    The purpose of benchmarking is to establish improvement processes by comparing activities to quality standards. The proposed methodology is illustrated by benchmarking case studies performed inside healthcare facilities, on items such as nosocomial infections or the organization of surgical facilities. Moreover, the authors have built a specific graphic tool, enhanced with balanced-scorecard figures and mappings, so that comparison between different anesthesia and intensive care services willing to start an improvement program is easy and relevant. This ready-made application becomes even more accurate once detailed activity tariffs are implemented.

  15. Dynamics calculation with variable mass of mountain self-propelled chassis

    Directory of Open Access Journals (Sweden)

    R.M. Makharoblidze

    2016-12-01

    Full Text Available Many technological processes in agricultural production mechanization, such as grain sowing, planting root-tuber crops, fertilizing, spraying and dusting, pressing feed materials, and harvesting of various crops, are performed by machine-tractor units whose links, or the media and materials they process, have variable mass. In recent years, systems for automatic control, adjustment and monitoring of technological processes and working members in agricultural production have also been developed. We study the dynamics of the transition processes of a mountain self-propelled chassis with variable mass, where the disconnected or joined mass most often changes linearly in time, m(t) = ct. Formulas are derived for the change of the velocity of movement as a function of the displacement of the unit, and the dependence of this velocity on the tractor and technological machine performance is defined, taking into account the gradual addition or removal of agricultural material mass. From these equations the linear displacement of the machine-tractor unit can be determined, and the basic operating parameters of a machine-tractor unit with variable mass can be defined. The results of this research can be applied to the definition of unit characteristics and to the development of new agricultural tractors.
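    The velocity change under a linearly growing mass can be illustrated with a short numerical sketch. It assumes (our assumption, not stated in the abstract) that material is gathered at zero relative velocity under a constant net force F, so the momentum balance d(mv)/dt = F with m(t) = m0 + ct reduces to m dv/dt = F − cv; all parameter values are hypothetical.

```python
def simulate(m0, c, F, v0=0.0, t_end=10.0, dt=1e-3):
    """Forward-Euler integration of m(t) dv/dt = F - c*v with m(t) = m0 + c*t.

    Closed form for comparison (v0 = 0): v(t) = F*t / (m0 + c*t).
    """
    v, x, t = v0, 0.0, 0.0
    while t < t_end:
        m = m0 + c * t
        a = (F - c * v) / m      # deceleration term c*v models mass pickup at rest
        v += a * dt
        x += v * dt
        t += dt
    return v, x

# Hypothetical unit: 3000 kg tractor gathering 5 kg/s under a 1500 N net force
v_end, x_end = simulate(m0=3000.0, c=5.0, F=1500.0)
```

The numerical result can be checked against the closed form above; the growing mass makes the unit approach a lower velocity than a constant-mass unit would under the same force.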

  16. Normal dynamic deformation characteristics of non-consecutive jointed rock masses under impact loads

    Science.gov (United States)

    Zeng, Sheng; Jiang, Bowei; Sun, Bing

    2017-08-01

    In order to study the deformation characteristics of non-consecutive single-jointed rock masses under impact loads, we used cement mortar materials to make simulated jointed rock mass samples and tested them under impact loads with a drop hammer. From the time-history signals of force and displacement we draw three findings. First, the dynamic compression displacement of the jointed rock mass is significantly larger than that of the intact, jointless rock mass, and the compression displacement is positively correlated with the joint length and the impact height. Second, the vertical compressive displacement of the jointed rock mass under small impact loads is mainly due to the closure of open joints. Finally, under the same impact energy, the peak strength of the intact rock mass is larger than that of the non-consecutive jointed rock mass and is negatively correlated with the joint length.

  17. Benchmarking in digital circuit design automation

    NARCIS (Netherlands)

    Jozwiak, L.; Gawlowski, D.M.; Slusarczyk, A.S.

    2008-01-01

    This paper focuses on benchmarking, which is the main experimental approach to the design method and EDA-tool analysis, characterization and evaluation. We discuss the importance and difficulties of benchmarking, as well as the recent research effort related to it. To resolve several serious

  18. Benchmarking, Total Quality Management, and Libraries.

    Science.gov (United States)

    Shaughnessy, Thomas W.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) in higher education and academic libraries focuses on the identification, collection, and use of reliable data. Methods for measuring quality, including benchmarking, are described; performance measures are considered; and benchmarking techniques are examined. (11 references) (MES)

  19. GeneNetWeaver: in silico benchmark generation and performance profiling of network inference methods.

    Science.gov (United States)

    Schaffter, Thomas; Marbach, Daniel; Floreano, Dario

    2011-08-15

    Over the last decade, numerous methods have been developed for the inference of regulatory networks from gene expression data. However, accurate and systematic evaluation of these methods is hampered by the difficulty of constructing adequate benchmarks and the lack of tools for a differentiated analysis of network predictions on such benchmarks. Here, we describe a novel and comprehensive method for in silico benchmark generation and performance profiling of network inference methods, available to the community as open-source software called GeneNetWeaver (GNW). In addition to generating detailed dynamical models of gene regulatory networks to be used as benchmarks, GNW provides a network motif analysis that reveals systematic prediction errors, thereby indicating potential ways of improving inference methods. The accuracy of network inference methods is evaluated using standard metrics such as precision-recall and receiver operating characteristic curves. We show how GNW can be used to assess the performance and identify the strengths and weaknesses of six inference methods. Furthermore, we used GNW to provide the international Dialogue for Reverse Engineering Assessments and Methods (DREAM) competition with three network inference challenges (DREAM3, DREAM4 and DREAM5). GNW is available at http://gnw.sourceforge.net along with its Java source code, user manual and supporting data. Supplementary data are available at Bioinformatics online. dario.floreano@epfl.ch.
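    The standard-metric evaluation mentioned above reduces to scoring a ranked list of predicted edges against a gold-standard network with precision-recall points; a minimal sketch follows, where the gene names, edges, and ranking are hypothetical and do not reflect GNW's actual file formats.

```python
def precision_recall(ranked_edges, true_edges):
    """Precision and recall after each prediction in a ranked edge list."""
    tp = 0
    points = []
    for k, edge in enumerate(ranked_edges, start=1):
        if edge in true_edges:
            tp += 1
        points.append((tp / k, tp / len(true_edges)))  # (precision, recall)
    return points

# Hypothetical gold standard and a ranked prediction list
gold = {("G1", "G2"), ("G2", "G3"), ("G1", "G4")}
predicted = [("G1", "G2"), ("G2", "G4"), ("G2", "G3"), ("G3", "G4")]
pts = precision_recall(predicted, gold)
# pts[0] == (1.0, 1/3); pts[2] == (2/3, 2/3)
```

The area under the resulting curve (AUPR), together with the analogous ROC curve, is the kind of summary statistic used to rank inference methods in the DREAM challenges.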

  20. Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system

    Science.gov (United States)

    Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-05-01

    We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.

  1. On the dynamical mass generation in gauge-invariant non-linear σ-models

    International Nuclear Information System (INIS)

    Diaz, A.; Helayel-Neto, J.A.; Smith, A.W.

    1987-12-01

    We argue that external gauge fields coupled in a gauge-invariant way to both the bosonic and supersymmetric two-dimensional non-linear σ-models acquire a dynamical mass term whenever the target space is restricted to be a group manifold. (author). 11 refs

  2. Benchmark studies of BOUT++ code and TPSMBI code on neutral transport during SMBI

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Y.H. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); University of Science and Technology of China, Hefei 230026 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Wang, Z.H., E-mail: zhwang@swip.ac.cn [Southwestern Institute of Physics, Chengdu 610041 (China); Guo, W., E-mail: wfguo@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China); Ren, Q.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, A.P.; Xu, M.; Wang, A.K. [Southwestern Institute of Physics, Chengdu 610041 (China); Xiang, N. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei 230031 (China); Center for Magnetic Fusion Theory, Chinese Academy of Sciences, Hefei 230031 (China)

    2017-06-09

    SMBI (supersonic molecular beam injection) plays an important role in tokamak plasma fuelling, density control and ELM mitigation in magnetic confinement plasma physics, and has been widely used in many tokamaks. The trans-neut module of the BOUT++ code is the only large-scale parallel 3D fluid code used to simulate the SMBI fueling process, while the TPSMBI (transport of supersonic molecule beam injection) code is a recently developed 1D fluid code for SMBI. In order to find a way to increase SMBI fueling efficiency in H-mode plasma, especially for ITER, it is important first to verify the codes. A benchmark between the trans-neut module of BOUT++ and the TPSMBI code on the radial transport dynamics of neutrals during SMBI has been successfully carried out for the first time, in both slab and cylindrical coordinates. The simulation results from the trans-neut module of BOUT++ and from TPSMBI agree very well with each other. Different upwind schemes have been compared for handling the sharp gradient front during the inward propagation of SMBI, for the sake of code stability. The influence of the WENO3 (third-order weighted essentially non-oscillatory) and third-order upwind schemes on the benchmark results is also discussed. - Highlights: • A 1D model of SMBI has been developed. • Benchmarks of the BOUT++ and TPSMBI codes have been completed for the first time. • The influence of the WENO3 and third-order upwind schemes on the benchmark results is discussed.

  3. Study of LBS for characterization and analysis of big data benchmarks

    International Nuclear Information System (INIS)

    Chandio, A.A.; Zhang, F.; Memon, T.D.

    2014-01-01

    In the past few years, most organizations have been gradually migrating their applications and services to the Cloud, because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users anywhere in the world. The rapid growth of urbanization in developed and developing countries has led to a new emerging concept called Urban Computing, one of the application domains that is rapidly being deployed to the Cloud. More precisely, in Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics. Their data representations are widely available, including GPS traces of vehicles. However, their applications are data-processing and storage hungry, since their data volumes grow from a few dozen TB (terabytes) to thousands of PB (petabytes) (i.e. Big Data). To support the development and assessment of applications such as LBS (Location Based Services), a Big Data benchmark is urgently needed. This research is a novel study of LBS to characterize and analyze Big Data benchmarks. We focus on map-matching, which is used as a pre-processing step in many LBS applications. This preliminary paper also describes the current status of Big Data benchmarks and our future direction. (author)

  4. Study on LBS for Characterization and Analysis of Big Data Benchmarks

    Directory of Open Access Journals (Sweden)

    Aftab Ahmed Chandio

    2014-10-01

    Full Text Available In the past few years, most organizations have been gradually migrating their applications and services to the Cloud, because the Cloud paradigm enables (a) on-demand access and (b) large-scale data processing for their applications and users anywhere in the world. The rapid growth of urbanization in developed and developing countries has led to an emerging concept called Urban Computing, one of the application domains now being rapidly deployed to the Cloud. More precisely, in Urban Computing, sensors, vehicles, devices, buildings, and roads are used as components to probe city dynamics, and their data, including GPS traces of vehicles, are widely available. However, such applications are data-processing and storage hungry, as their data volumes grow from a few dozen TB (terabytes) to thousands of PB (petabytes), i.e. Big Data. To advance the development and assessment of applications such as LBS (Location Based Services), a Big Data benchmark is urgently needed. This research is a novel study of LBS for characterizing and analyzing Big Data benchmarks. We focus on map-matching, which is used as a pre-processing step in many LBS applications. In this preliminary work, the paper also describes the current status of Big Data benchmarks and our future direction.

  5. Toxicological benchmarks for wildlife: 1994 Revision

    International Nuclear Information System (INIS)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II.

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal exposure are not considered in this report

  6. Toxicological benchmarks for wildlife: 1994 Revision

    Energy Technology Data Exchange (ETDEWEB)

    Opresko, D.M.; Sample, B.E.; Suter, G.W. II

    1994-09-01

    The process by which ecological risks of environmental contaminants are evaluated is two-tiered. The first tier is a screening assessment where concentrations of contaminants in the environment are compared to toxicological benchmarks which represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) that are presumed to be nonhazardous to the surrounding biota. The second tier is a baseline ecological risk assessment where toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. The report presents toxicological benchmarks for assessment of effects of 76 chemicals on 8 representative mammalian wildlife species and 31 chemicals on 9 avian wildlife species. The chemicals are some of those that occur at United States Department of Energy waste sites; the wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. Further descriptions of the chosen wildlife species and chemicals are provided in the report. The benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species. These benchmarks only consider contaminant exposure through oral ingestion of contaminated media; exposure through inhalation or direct dermal exposure are not considered in this report.

  7. Modification of the Newtonian dynamics as a possible alternative to the hidden mass hypothesis

    International Nuclear Information System (INIS)

    Milgrom, M. (The Institute for Advanced Study)

    1983-01-01

    I consider the possibility that there is not, in fact, much hidden mass in galaxies and galaxy systems. If a certain modified version of the Newtonian dynamics is used to describe the motion of bodies in a gravitational field (of a galaxy, say), the observational results are reproduced with no need to assume hidden mass in appreciable quantities. Various characteristics of galaxies result with no further assumptions. At the basis of the modification is the assumption that in the limit of small accelerations a << a_0, the acceleration of a particle at distance r from a mass M satisfies approximately a^2/a_0 ≈ MGr^-2, where a_0 is a constant with the dimensions of an acceleration. A success of this modified dynamics in explaining the data may be interpreted as implying a need to change the law of inertia in the limit of small accelerations, or a more limited change of gravity alone. I discuss various observational constraints on possible theories for the modified dynamics from existing data, and suggest other systems which may provide useful constraints
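    The deep-MOND relation above implies an asymptotically flat rotation curve, since setting the circular acceleration v^2/r equal to a = sqrt(G M a_0)/r gives v^4 = G M a_0, independent of radius. A minimal numerical sketch of this consequence (not from the paper itself; the value a_0 ≈ 1.2 × 10^-10 m s^-2 and the example galaxy mass are assumed illustrative inputs):

```python
import math

# Assumed constants (SI); A0 is the commonly quoted MOND acceleration scale.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10         # MOND acceleration scale, m s^-2 (assumed value)
M_SUN = 1.989e30     # solar mass, kg

def mond_acceleration(mass_kg, r_m):
    """Deep-MOND acceleration: a^2 / a0 = G M / r^2  =>  a = sqrt(G M a0) / r."""
    return math.sqrt(G * mass_kg * A0) / r_m

def flat_rotation_velocity(mass_kg):
    """Asymptotic circular velocity: v^4 = G M a0, independent of radius."""
    return (G * mass_kg * A0) ** 0.25

# An illustrative 1e11 solar-mass galaxy yields a flat curve near ~200 km/s.
v = flat_rotation_velocity(1e11 * M_SUN)
print(f"{v / 1e3:.0f} km/s")
```

    Because the acceleration falls off as 1/r rather than 1/r^2, the implied circular velocity stops depending on r, which is the sense in which the modification reproduces flat rotation curves without hidden mass.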

  8. INTEGRAL BENCHMARKS AVAILABLE THROUGH THE INTERNATIONAL REACTOR PHYSICS EXPERIMENT EVALUATION PROJECT AND THE INTERNATIONAL CRITICALITY SAFETY BENCHMARK EVALUATION PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    J. Blair Briggs; Lori Scott; Enrico Sartori; Yolanda Rugama

    2008-09-01

    Interest in high-quality integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next generation reactor and advanced fuel cycle concepts. The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) continue to expand their efforts and broaden their scope to identify, evaluate, and provide integral benchmark data for method and data validation. Benchmark model specifications provided by these two projects are used heavily by the international reactor physics, nuclear data, and criticality safety communities. Thus far, 14 countries have contributed to the IRPhEP, and 20 have contributed to the ICSBEP. The status of the IRPhEP and ICSBEP is discussed in this paper, and the future of the two projects is outlined and discussed. Selected benchmarks that have been added to the IRPhEP and ICSBEP handbooks since PHYSOR’06 are highlighted, and the future of the two projects is discussed.

  9. INTEGRAL BENCHMARKS AVAILABLE THROUGH THE INTERNATIONAL REACTOR PHYSICS EXPERIMENT EVALUATION PROJECT AND THE INTERNATIONAL CRITICALITY SAFETY BENCHMARK EVALUATION PROJECT

    International Nuclear Information System (INIS)

    J. Blair Briggs; Lori Scott; Enrico Sartori; Yolanda Rugama

    2008-01-01

    Interest in high-quality integral benchmark data is increasing as efforts to quantify and reduce calculational uncertainties accelerate to meet the demands of next generation reactor and advanced fuel cycle concepts. The International Reactor Physics Experiment Evaluation Project (IRPhEP) and the International Criticality Safety Benchmark Evaluation Project (ICSBEP) continue to expand their efforts and broaden their scope to identify, evaluate, and provide integral benchmark data for method and data validation. Benchmark model specifications provided by these two projects are used heavily by the international reactor physics, nuclear data, and criticality safety communities. Thus far, 14 countries have contributed to the IRPhEP, and 20 have contributed to the ICSBEP. The status of the IRPhEP and ICSBEP is discussed in this paper, and the future of the two projects is outlined and discussed. Selected benchmarks that have been added to the IRPhEP and ICSBEP handbooks since PHYSOR-06 are highlighted, and the future of the two projects is discussed

  10. Development of solutions to benchmark piping problems

    Energy Technology Data Exchange (ETDEWEB)

    Reich, M; Chang, T Y; Prachuktam, S; Hartzman, M

    1977-12-01

    Benchmark problems and their solutions are presented. The problems consist in calculating the static and dynamic response of selected piping structures subjected to a variety of loading conditions. The structures range from simple pipe geometries to a representative full scale primary nuclear piping system, which includes the various components and their supports. These structures are assumed to behave in a linear elastic fashion only, i.e., they experience small deformations and small displacements with no existing gaps, and remain elastic through their entire response. The solutions were obtained by using the program EPIPE, which is a modification of the widely available program SAP IV. A brief outline of the theoretical background of this program and its verification is also included.

  11. Automatic extraction of myocardial mass and volumes using parametric images from dynamic nongated PET

    DEFF Research Database (Denmark)

    Harms, Hendrik Johannes; Hansson, Nils Henrik Stubkjær; Tolbod, Lars Poulsen

    2016-01-01

    Dynamic cardiac positron emission tomography (PET) is used to quantify molecular processes in vivo. However, measurements of left-ventricular (LV) mass and volumes require electrocardiogram (ECG)-gated PET data. The aim of this study was to explore the feasibility of measuring LV geometry using non-gated dynamic cardiac PET. METHODS: Thirty-five patients with aortic-valve stenosis and 10 healthy controls (HC) underwent a 27-min 11C-acetate PET/CT scan and cardiac magnetic resonance imaging (CMR). HC were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were … LV and WT only and an overestimation for LVEF at lower values. Intra- and inter-observer correlations were >0.95 for all PET measurements. PET repeatability accuracy in HC was comparable to CMR. CONCLUSION: LV mass and volumes are accurately and automatically generated from dynamic 11C-acetate PET without …

  12. Financial Integrity Benchmarks

    Data.gov (United States)

    City of Jackson, Mississippi — This data compiles standard financial integrity benchmarks that allow the City to measure its financial standing. It measures the City's debt ratio and bond ratings.

  13. Dynamics of Symmetric Conserved Mass Aggregation Model on Complex Networks

    Institute of Scientific and Technical Information of China (English)

    HUA Da-Yin

    2009-01-01

    We investigate the dynamical behaviour of the aggregation process in the symmetric conserved mass aggregation model under three different topological structures. The dispersion σ(t, L) = (∑_i (m_i - ρ_0)^2 / L)^(1/2) is defined to describe the dynamical behaviour, where ρ_0 is the particle density and m_i is the particle number on site i. It is found numerically that, for a regular lattice and a scale-free network, σ(t, L) follows a power-law scaling σ(t, L) ~ t^δ1 and σ(t, L) ~ t^δ4, respectively, from a random initial condition to the stationary state. However, for a small-world network there are two power-law scaling regimes: σ(t, L) ~ t^δ2 when t < T and σ(t, L) ~ t^δ3 when t > T. Moreover, it is found numerically that δ2 is close to δ1 for small rewiring probability q, while δ3 hardly changes with varying q and is almost the same as δ4. We speculate that the aggregation of the connection degree accelerates the mass aggregation in the initial relaxation stage, and that the existence of long-distance interactions in the complex networks accelerates the mass aggregation when t > T for the small-world networks. We also show that the relaxation time T follows a power-law scaling T ~ L^z, and that σ(t, L) in the stationary state follows a power-law σ_s(L) ~ L^α for all three structures.
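    A minimal sketch of the dispersion defined above, evaluated for an illustrative random initial condition (the lattice setup and parameter values here are assumed for illustration, not taken from the paper):

```python
import random

def dispersion(masses, rho0):
    """sigma(t, L) = ( sum_i (m_i - rho0)^2 / L )^(1/2) over L sites."""
    L = len(masses)
    return (sum((m - rho0) ** 2 for m in masses) / L) ** 0.5

# Illustrative random initial condition: rho0 * L unit masses thrown
# uniformly onto L sites (total mass is conserved by construction).
random.seed(1)
L, rho0 = 100, 2.0
sites = [0] * L
for _ in range(int(rho0 * L)):
    sites[random.randrange(L)] += 1

print(round(dispersion(sites, rho0), 3))
```

    A uniform configuration (every site holding exactly ρ_0 particles) gives σ = 0; as aggregation concentrates mass on fewer sites, σ grows, which is why its time evolution tracks the coarsening dynamics.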

  14. Analysis of a molten salt reactor benchmark

    International Nuclear Information System (INIS)

    Ghosh, Biplab; Bajpai, Anil; Degweker, S.B.

    2013-01-01

    This paper discusses results of our studies of an IAEA molten salt reactor (MSR) benchmark. The benchmark, proposed by Japan, involves burnup calculations of a single lattice cell of a MSR for burning plutonium and other minor actinides. We have analyzed this cell with in-house developed burnup codes BURNTRAN and McBURN. This paper also presents a comparison of the results of our codes and those obtained by the proposers of the benchmark. (author)

  15. Dynamical gluon masses in perturbative calculations at the loop level

    International Nuclear Information System (INIS)

    Machado, Fatima A.; Natale, Adriano A.

    2013-01-01

    Full text: In the phenomenology of strong interactions one always has to deal to some extent with the interplay between perturbative and non-perturbative QCD. On one hand, the former has quite well-developed tools, afforded by asymptotic freedom. On the other hand, concerning the latter, we nowadays envisage the following scenario: 1) there is strong evidence for a dynamically massive gluon propagator and an infrared-finite coupling constant; 2) there is extensive and successful use of an infrared-finite coupling constant in phenomenological calculations at tree level; 3) the infrared-finite coupling improves the convergence of the perturbative series; 4) the dynamical gluon mass provides a natural infrared cutoff in physical processes at tree level. Considering this scenario, it is natural to ask how these non-perturbative results can be used in perturbative calculations of physical observables at the loop level. Recent papers discuss how off-shell gauge- and renormalization-group-invariant Green functions can be computed with the use of the Pinch Technique (PT), with IR divergences removed by the dynamical gluon mass, and using a well-defined effective charge. In this work we improve upon the authors' earlier results, which evaluated 1-loop corrections to some two- and three-point functions of SU(3) pure Yang-Mills theory, investigating the dressing of quantities that could account for an extension of loop calculations to the infrared domain of the theory, in a way applicable to phenomenological calculations. One of these improvements is that the gluon propagator is kept transverse in such a scheme. (author)

  16. Reactor fuel depletion benchmark of TINDER

    International Nuclear Information System (INIS)

    Martin, W.J.; Oliveira, C.R.E. de; Hecht, A.A.

    2014-01-01

    Highlights: • A reactor burnup benchmark of TINDER, coupling MCNP6 to CINDER2008, was performed. • TINDER is a poor candidate for fuel depletion calculations using its current libraries. • Data library modification is necessary if fuel depletion is desired from TINDER. - Abstract: Accurate burnup calculations are key to proper nuclear reactor design, fuel cycle modeling, and disposal estimations. The TINDER code, originally designed for activation analyses, has been modified to handle full burnup calculations, including the widely used predictor–corrector feature. In order to properly characterize the performance of TINDER for this application, a benchmark calculation was performed. Although the results followed the trends of past benchmarked codes for a UO2 PWR fuel sample from the Takahama-3 reactor, there were obvious deficiencies in the final result, likely in the nuclear data library that was used. Isotopic comparisons versus experiment and past code benchmarks are given, as well as hypothesized areas of deficiency and future work.

  17. Automatic Extraction of Myocardial Mass and Volume Using Parametric Images from Dynamic Nongated PET.

    Science.gov (United States)

    Harms, Hendrik Johannes; Stubkjær Hansson, Nils Henrik; Tolbod, Lars Poulsen; Kim, Won Yong; Jakobsen, Steen; Bouchelouche, Kirsten; Wiggers, Henrik; Frøkiaer, Jørgen; Sörensen, Jens

    2016-09-01

    Dynamic cardiac PET is used to quantify molecular processes in vivo. However, measurements of left ventricular (LV) mass and volume require electrocardiogram-gated PET data. The aim of this study was to explore the feasibility of measuring LV geometry using nongated dynamic cardiac PET. Thirty-five patients with aortic-valve stenosis and 10 healthy controls underwent a 27-min (11)C-acetate PET/CT scan and cardiac MRI (CMR). The controls were scanned twice to assess repeatability. Parametric images of uptake rate K1 and the blood pool were generated from nongated dynamic data. Using software-based structure recognition, the LV wall was automatically segmented from K1 images to derive functional assessments of LV mass (mLV) and wall thickness. End-systolic and end-diastolic volumes were calculated using blood pool images and applied to obtain stroke volume and LV ejection fraction (LVEF). PET measurements were compared with CMR. High, linear correlations were found for LV mass (r = 0.95), end-systolic volume (r = 0.93), and end-diastolic volume (r = 0.90), and slightly lower correlations were found for stroke volume (r = 0.74), LVEF (r = 0.81), and thickness (r = 0.78). Bland-Altman analyses showed significant differences for mLV and thickness only and an overestimation for LVEF at lower values. Intra- and interobserver correlations were greater than 0.95 for all PET measurements. PET repeatability accuracy in the controls was comparable to CMR. LV mass and volume are accurately and automatically generated from dynamic (11)C-acetate PET without electrocardiogram gating. This method can be incorporated in a standard routine without any additional workload and can, in theory, be extended to other PET tracers. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  18. Benchmarking: contexts and details matter.

    Science.gov (United States)

    Zheng, Siyuan

    2017-07-05

    Benchmarking is an essential step in the development of computational tools. We take this opportunity to pitch in our opinions on tool benchmarking, in light of two correspondence articles published in Genome Biology. Please see related Li et al. and Newman et al. correspondence articles: www.dx.doi.org/10.1186/s13059-017-1256-5 and www.dx.doi.org/10.1186/s13059-017-1257-4.

  19. The need for speed: escape velocity and dynamical mass measurements of the Andromeda galaxy

    Science.gov (United States)

    Kafle, Prajwal R.; Sharma, Sanjib; Lewis, Geraint F.; Robotham, Aaron S. G.; Driver, Simon P.

    2018-04-01

    Our nearest large cosmological neighbour, the Andromeda galaxy (M31), is a dynamical system, and an accurate measurement of its total mass is central to our understanding of its assembly history, the life-cycles of its satellite galaxies, and its role in shaping the Local Group environment. Here, we apply a novel approach to determine the dynamical mass of M31 using high-velocity Planetary Nebulae, establishing a hierarchical Bayesian model united with a scheme to capture potential outliers and marginalize over the tracers' unknown distances. With this, we derive the escape velocity run of M31 as a function of galactocentric distance, with both parametric and non-parametric approaches. We determine the escape velocity of M31 to be 470 ± 40 km s^-1 at a galactocentric distance of 15 kpc, and also derive the total potential of M31, estimating the virial mass and radius of the galaxy to be 0.8 ± 0.1 × 10^12 M⊙ and 240 ± 10 kpc, respectively. Our M31 mass is on the low side of the measured range, which supports the lower expected mass of the M31-Milky Way system from the timing and momentum arguments, satisfies the H I constraint on the circular velocity between 10 ≲ R/kpc < 35, and agrees with the stellar-mass Tully-Fisher relation. To place these results in a broader context, we compare them to the key predictions of the ΛCDM cosmological paradigm, including the stellar-mass-halo-mass relation and the dark matter halo concentration-virial mass correlation, finding M31 to be an outlier to the latter.
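    As a rough consistency check on the quoted numbers, a simple point-mass potential gives v_esc(r)^2 = 2GM/r, so an escape velocity at a given radius implies an enclosed mass. The sketch below is a back-of-envelope illustration only (the point-mass approximation is assumed here; it is not the paper's hierarchical Bayesian model):

```python
# Point-mass approximation: v_esc(r)^2 = 2 G M / r  =>  M = v_esc^2 * r / (2 G).
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19     # metres per kiloparsec
M_SUN = 1.989e30   # solar mass, kg

def enclosed_mass(v_esc_kms, r_kpc):
    """Mass (kg) implied by escape velocity v_esc (km/s) at radius r (kpc)."""
    v = v_esc_kms * 1e3
    r = r_kpc * KPC
    return v ** 2 * r / (2 * G)

# Quoted values: v_esc = 470 km/s at 15 kpc. This gives a few times 1e11 Msun
# enclosed within 15 kpc, sensibly below the quoted virial mass of ~0.8e12 Msun.
m = enclosed_mass(470.0, 15.0) / M_SUN
print(f"{m:.2e} Msun")
```

    The point-mass form ignores the extended halo mass outside 15 kpc, so it should (and does) land below the virial mass quoted in the abstract.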

  20. Benchmark analysis of MCNP ENDF/B-VI iron

    International Nuclear Information System (INIS)

    Court, J.D.; Hendricks, J.S.

    1994-12-01

    The MCNP ENDF/B-VI iron cross-section data was subjected to four benchmark studies as part of the Hiroshima/Nagasaki dose re-evaluation for the National Academy of Science and the Defense Nuclear Agency. The four benchmark studies were: (1) the iron sphere benchmarks from the Lawrence Livermore Pulsed Spheres; (2) the Oak Ridge National Laboratory Fusion Reactor Shielding Benchmark; (3) a 76-cm diameter iron sphere benchmark done at the University of Illinois; (4) the Oak Ridge National Laboratory Benchmark for Neutron Transport through Iron. MCNP4A was used to model each benchmark and computational results from the ENDF/B-VI iron evaluations were compared to ENDF/B-IV, ENDF/B-V, the MCNP Recommended Data Set (which includes Los Alamos National Laboratory Group T-2 evaluations), and experimental data. The results show that the ENDF/B-VI iron evaluations are as good as, or better than, previous data sets

  1. Analysis of an OECD/NEA high-temperature reactor benchmark

    International Nuclear Information System (INIS)

    Hosking, J. G.; Newton, T. D.; Koeberl, O.; Morris, P.; Goluoglu, S.; Tombakoglu, T.; Colak, U.; Sartori, E.

    2006-01-01

    This paper describes analyses of the OECD/NEA HTR benchmark organized by the 'Working Party on the Scientific Issues of Reactor Systems (WPRS)', formerly the 'Working Party on the Physics of Plutonium Fuels and Innovative Fuel Cycles'. The benchmark was specifically designed to provide inter-comparisons for plutonium and thorium fuels when used in HTR systems. Calculations considering uranium fuel have also been included in the benchmark, in order to identify any increased uncertainties when using plutonium or thorium fuels. The benchmark consists of five phases, which include cell and whole-core calculations. Analysis of the benchmark has been performed by a number of international participants, who have used a range of deterministic and Monte Carlo code schemes. For each of the benchmark phases, neutronics parameters have been evaluated. Comparisons are made between the results of the benchmark participants, as well as comparisons between the predictions of the deterministic calculations and those from detailed Monte Carlo calculations. (authors)

  2. Mass exchange and angular distribution in a dynamical treatment of heavy ion collisions

    International Nuclear Information System (INIS)

    Ngo, C.; Hofmann, H.

    1977-01-01

    We present a first numerical computation of the absolute value of the double differential cross section as a function of mass asymmetry and detection angle, including a dynamical coupling between the relative motion and the mass asymmetry. We apply it to the 63Cu + 197Au experiment at two different energies. The equation of motion used is a Fokker-Planck equation for the distribution function in classical phase space. The coefficients needed are those known from classical model calculations, apart from a friction coefficient introduced for the mass asymmetry degree of freedom. Encouraging agreement between the calculated and experimental curves is found

  3. Diagnostic Algorithm Benchmarking

    Science.gov (United States)

    Poll, Scott

    2011-01-01

    A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.

  4. Boiling water reactor turbine trip (TT) benchmark. Volume II: Summary Results of Exercise 1

    International Nuclear Information System (INIS)

    Akdeniz, Bedirhan; Ivanov, Kostadin N.; Olson, Andy M.

    2005-06-01

    The OECD Nuclear Energy Agency (NEA) completed under US Nuclear Regulatory Commission (NRC) sponsorship a PWR main steam line break (MSLB) benchmark against coupled system three-dimensional (3-D) neutron kinetics and thermal-hydraulic codes. Another OECD/NRC coupled-code benchmark was recently completed for a BWR turbine trip (TT) transient and is the object of the present report. Turbine trip transients in a BWR are pressurisation events in which the coupling between core space-dependent neutronic phenomena and system dynamics plays an important role. The data made available from actual experiments carried out at the Peach Bottom 2 plant make the present benchmark particularly valuable. While defining and coordinating the BWR TT benchmark, a systematic approach and multi-level methodology not only allowed for a consistent and comprehensive validation process, but also contributed to the study of key parameters of pressurisation transients. The benchmark consists of three separate exercises, two initial states and five transient scenarios. The BWR TT Benchmark will be published in four volumes as NEA reports. CD-ROMs will also be prepared and will include the four reports and the transient boundary conditions, decay heat values as a function of time, cross-section libraries and supplementary tables and graphs not published in the paper version. BWR TT Benchmark - Volume I: Final Specifications was issued in 2001 [NEA/NSC/DOC(2001)]. The benchmark team [Pennsylvania State University (PSU) in co-operation with Exelon Nuclear and the NEA] has been responsible for coordinating benchmark activities, answering participant questions and assisting them in developing their models, as well as analysing submitted solutions and providing reports summarising the results for each phase. The benchmark team has also been involved in the technical aspects of the benchmark, including sensitivity studies for the different exercises. Volume II summarises the results for Exercise 1 of the benchmark.

  5. Computation of the chiral condensate using Nf=2 and Nf=2+1+1 dynamical flavors of twisted mass fermions

    International Nuclear Information System (INIS)

    Cichy, K.; Jansen, K.; Shindler, A.; Forschungszentrum Juelich; Forschungszentrum Juelich

    2013-12-01

    We apply the spectral projector method, recently introduced by Giusti and Luescher, to compute the chiral condensate using N_f = 2 and N_f = 2+1+1 dynamical flavors of maximally twisted mass fermions. We present our results for several quark masses at three different lattice spacings, which allows us to perform the chiral and continuum extrapolations. In addition, we report our analysis of the O(a) improvement of the chiral condensate for twisted mass fermions. We also study the effect of the dynamical strange and charm quarks by comparing our results for N_f = 2 and N_f = 2+1+1 dynamical flavors.

  6. MoMaS reactive transport benchmark using PFLOTRAN

    Science.gov (United States)

    Park, H.

    2017-12-01

    The MoMaS benchmark was developed to enhance numerical simulation capability for reactive transport modeling in porous media. The benchmark was published in late September 2009; it is not taken from a real chemical system, but consists of realistic and numerically challenging tests. PFLOTRAN is a state-of-the-art massively parallel subsurface flow and reactive transport code that is being used in multiple nuclear waste repository projects at Sandia National Laboratories, including the Waste Isolation Pilot Plant and Used Fuel Disposition. The MoMaS benchmark comprises three independent tests of easy, medium, and hard chemical complexity. This paper demonstrates how PFLOTRAN is applied to this benchmark exercise and shows results for the easy test case, which includes mixing of aqueous components and surface complexation. The surface complexation consists of monodentate and bidentate reactions, which introduce difficulty in defining the selectivity coefficient if the reaction applies to a bulk reference volume: the selectivity coefficient becomes porosity-dependent for the bidentate reaction in heterogeneous porous media. The benchmark is solved by PFLOTRAN with minimal modification to address this issue, and unit conversions were made properly to suit PFLOTRAN.

  7. A comparative study on effective dynamic modeling methods for flexible pipe

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Ho; Hong, Sup; Kim, Hyung Woo [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of); Kim, Sung Soo [Chungnam National University, Daejeon (Korea, Republic of)

    2015-07-15

    In this paper, in order to select a suitable method for the large-deflection, small-strain problem of pipe systems in the deep-seabed mining system, the finite difference method with lumped mass from the field of cable dynamics and the substructure method from the field of flexible multibody dynamics were compared. Due to the difficulty of obtaining experimental results from an actual pipe system in the deep-seabed mining system, a thin cantilever beam model with experimental results was employed for the comparative study. The accuracy of the methods was investigated by comparing the experimental results with simulation results from the cantilever beam model with different numbers of elements. The efficiency of the methods was also examined by comparing the operation counts required for solving the equations of motion. Finally, this cantilever beam model, together with the comparative study results, can serve as a benchmark problem for flexible multibody dynamics.

  8. Criteria of benchmark selection for efficient flexible multibody system formalisms

    Directory of Open Access Journals (Sweden)

    Valášek M.

    2007-10-01

    Full Text Available The paper deals with the selection process of benchmarks for testing and comparing efficient flexible multibody formalisms. The existing benchmarks are briefly summarized, and the purposes of benchmark selection are investigated. The result of this analysis is the formulation of criteria for selecting benchmarks for flexible multibody formalisms. Based on these criteria, an initial set of suitable benchmarks is described. In addition, the evaluation measures are revised and extended.

  9. On the dynamics of tapered vibro-impacting cantilever with tip mass

    Energy Technology Data Exchange (ETDEWEB)

    Gandhi, P. S.; Vyas, Vishal [Suman Mashruwala Advanced Microengineering Laboratory, Dept. of Mechanical Engineering, Indian Institute of Technology - Bombay, Mumai (India)

    2017-01-15

    This paper explores the nonlinear dynamic behavior of a vibro-impacting tapered cantilever with tip mass, with regard to frequency response analysis. A typical frequency response curve of vibro-impacting beams displays the well-known resonance frequency shift along with hysteretic jump and drop phenomena. We carried out a comprehensive parametric analysis capturing the effects of taper, tip mass, stop location, and gap on the non-smooth frequency response. The analysis is presented in a non-dimensional form useful for other similar cases. Simulation results are further validated against corresponding experimental results for a few cases. An illustrative comparison of simulation results for varying parameters brings out several interesting aspects of the variation in the nonlinear behavior.

  10. Criticality safety benchmarking of PASC-3 and ECNJEF1.1

    International Nuclear Information System (INIS)

    Li, J.

    1992-09-01

    To validate the code system PASC-3 and the multigroup cross-section library ECNJEF1.1 for various applications, many benchmarks are required. This report presents the results of criticality safety benchmarking for five calculational and four experimental benchmarks. These benchmarks are related to transport packages for fissile materials such as spent fuel. The fissile nuclides in these benchmarks are 235U and 239Pu. The modules of PASC-3 used for the calculations are BONAMI, NITAWL and KENO.5A. The final results for the experimental benchmarks agree well with the experimental data. For the calculational benchmarks, the results presented here are in reasonable agreement with those from other investigations. (author). 8 refs.; 20 figs.; 5 tabs

  11. Kinematic scaling relations of CALIFA galaxies: A dynamical mass proxy for galaxies across the Hubble sequence.

    Science.gov (United States)

    Aquino-Ortíz, E.; Valenzuela, O.; Sánchez, S. F.; Hernández-Toledo, H.; Ávila-Reese, V.; van de Ven, G.; Rodríguez-Puebla, A.; Zhu, L.; Mancillas, B.; Cano-Díaz, M.; García-Benito, R.

    2018-06-01

    We used ionized gas and stellar kinematics for 667 spatially resolved galaxies publicly available from the Calar Alto Legacy Integral Field Area survey (CALIFA) 3rd Data Release with the aim of studying kinematic scaling relations: the Tully-Fisher (TF) relation using the rotation velocity, V_rot; the Faber-Jackson (FJ) relation using the velocity dispersion, σ; and a combination of V_rot and σ through the S_K parameter, defined by S_K^2 = K V_rot^2 + σ^2 with constant K. Late-type and early-type galaxies reproduce the TF and FJ relations, respectively. Some early-type galaxies also follow the TF relation and some late-type galaxies the FJ relation, but always with larger scatter. By contrast, when we use the S_K parameter, all galaxies, regardless of morphological type, lie on the same scaling relation, showing a tight correlation with the total stellar mass, M⋆. Indeed, we find that the scatter in this relation is smaller than or equal to that of the TF and FJ relations. We explored different values of the K parameter and find no significant differences (in slope or scatter) with respect to the case K = 0.5, apart from a small change in the zero point. We calibrate the kinematic S_K^2 dynamical-mass proxy so that it is consistent with sophisticated published dynamical models to within 0.15 dex. We show that the S_K proxy reproduces the relation between dynamical mass and stellar mass in the inner regions of galaxies. Our result may be useful for fast estimates of the central dynamical mass in galaxies and for studying correlations in large galaxy surveys.
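
A minimal sketch of the combined kinematic parameter defined above; the galaxy values are purely illustrative, not CALIFA measurements:

```python
import math

def s_k(v_rot, sigma, k=0.5):
    """Combined kinematic parameter: S_K^2 = K*V_rot^2 + sigma^2 (km/s)."""
    return math.sqrt(k * v_rot**2 + sigma**2)

# Illustrative values only: a rotation-dominated disk and a
# dispersion-dominated early-type galaxy end up with comparable S_K.
late_type = s_k(200.0, 30.0)
early_type = s_k(30.0, 150.0)
print(late_type, early_type)
```

With K = 0.5 (the fiducial case above), S_K interpolates smoothly between V_rot-dominated and σ-dominated systems, which is why both morphological families can land on one relation.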

  12. Revaluering benchmarking - A topical theme for the construction industry

    OpenAIRE

    Rasmussen, Grane Mikael Gregaard

    2011-01-01

    Over the past decade, benchmarking has increasingly gained a foothold in the construction industry. The predominant research on, perceptions of, and uses of benchmarking are valued so strongly and uniformly that what may seem valuable actually deters researchers and practitioners from studying and questioning the concept objectively. This paper addresses the underlying nature of benchmarking, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in...

  13. Numerical methods: Analytical benchmarking in transport theory

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1988-01-01

    Numerical methods applied to reactor technology have reached a high degree of maturity. Certainly one- and two-dimensional neutron transport calculations have become routine, with several programs available on personal computer and the most widely used programs adapted to workstation and minicomputer computational environments. With the introduction of massive parallelism and as experience with multitasking increases, even more improvement in the development of transport algorithms can be expected. Benchmarking an algorithm is usually not a very pleasant experience for the code developer. Proper algorithmic verification by benchmarking involves the following considerations: (1) conservation of particles, (2) confirmation of intuitive physical behavior, and (3) reproduction of analytical benchmark results. By using today's computational advantages, new basic numerical methods have been developed that allow a wider class of benchmark problems to be considered

  14. Benchmarking and validation activities within JEFF project

    Directory of Open Access Journals (Sweden)

    Cabellos O.

    2017-01-01

    Full Text Available The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus, requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper will also review the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.

  15. Benchmarking and validation activities within JEFF project

    Science.gov (United States)

    Cabellos, O.; Alvarez-Velarde, F.; Angelone, M.; Diez, C. J.; Dyrda, J.; Fiorito, L.; Fischer, U.; Fleming, M.; Haeck, W.; Hill, I.; Ichou, R.; Kim, D. H.; Klix, A.; Kodeli, I.; Leconte, P.; Michel-Sendis, F.; Nunnenmann, E.; Pecchia, M.; Peneliau, Y.; Plompen, A.; Rochman, D.; Romojaro, P.; Stankovskiy, A.; Sublet, J. Ch.; Tamagno, P.; Marck, S. van der

    2017-09-01

    The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus, requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper will also review the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.

  16. Benchmarking

    OpenAIRE

    Beretta Sergio; Dossi Andrea; Grove Hugh

    2000-01-01

    Due to their particular nature, the benchmarking methodologies tend to exceed the boundaries of management techniques, and to enter the territories of managerial culture. A culture that is also destined to break into the accounting area not only strongly supporting the possibility of fixing targets, and measuring and comparing the performance (an aspect that is already innovative and that is worthy of attention), but also questioning one of the principles (or taboos) of the accounting or...

  17. Topological susceptibility and chiral condensate with Nf=2+1+1 dynamical flavors of maximally twisted mass fermions

    International Nuclear Information System (INIS)

    Cichy, K.

    2012-03-01

    We study the 'spectral projector' method for the computation of the chiral condensate and the topological susceptibility, using N_f = 2+1+1 dynamical flavors of maximally twisted mass Wilson fermions. In particular, we perform a study of the quark-mass dependence of the chiral condensate Σ and of the topological susceptibility χ_top in the range 270 MeV ≲ m_π ≲ 500 MeV. We also compute χ_top in the quenched approximation, where we match the lattice spacing to the N_f = 2+1+1 dynamical simulations. Using the kaon, η, and η' meson masses computed on the N_f = 2+1+1 ensembles, we then perform a preliminary test of the Witten-Veneziano relation.
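
The Witten-Veneziano relation mentioned above can be sketched numerically. The normalization assumed below (the f_π²/(2N_f) form with f_π ≈ 93 MeV) is one common convention, and the physical meson masses are used, so this is an order-of-magnitude illustration rather than the paper's lattice determination:

```python
# Assumed convention: chi_top = f_pi^2 * (m_eta'^2 + m_eta^2 - 2*m_K^2) / (2*N_f)
f_pi, n_f = 93.0, 3                         # MeV; assumed normalization
m_etap, m_eta, m_k = 957.8, 547.9, 495.0    # approximate physical meson masses, MeV

chi_top = f_pi**2 * (m_etap**2 + m_eta**2 - 2 * m_k**2) / (2 * n_f)
print(chi_top**0.25)                        # roughly the often-quoted ~180 MeV scale
```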

  18. Three-dimensional RAMA fluence methodology benchmarking

    International Nuclear Information System (INIS)

    Baker, S. P.; Carter, R. G.; Watkins, K. E.; Jones, D. B.

    2004-01-01

    This paper describes the benchmarking of the RAMA Fluence Methodology software, that has been performed in accordance with U. S. Nuclear Regulatory Commission Regulatory Guide 1.190. The RAMA Fluence Methodology has been developed by TransWare Enterprises Inc. through funding provided by the Electric Power Research Inst., Inc. (EPRI) and the Boiling Water Reactor Vessel and Internals Project (BWRVIP). The purpose of the software is to provide an accurate method for calculating neutron fluence in BWR pressure vessels and internal components. The Methodology incorporates a three-dimensional deterministic transport solution with flexible arbitrary geometry representation of reactor system components, previously available only with Monte Carlo solution techniques. Benchmarking was performed on measurements obtained from three standard benchmark problems which include the Pool Criticality Assembly (PCA), VENUS-3, and H. B. Robinson Unit 2 benchmarks, and on flux wire measurements obtained from two BWR nuclear plants. The calculated to measured (C/M) ratios range from 0.93 to 1.04 demonstrating the accuracy of the RAMA Fluence Methodology in predicting neutron flux, fluence, and dosimetry activation. (authors)

  19. Handbook of critical experiments benchmarks

    International Nuclear Information System (INIS)

    Durst, B.M.; Bierman, S.R.; Clayton, E.D.

    1978-03-01

    Data from critical experiments have been collected together for use as benchmarks in evaluating calculational techniques and nuclear data. These benchmarks have been selected from the numerous experiments performed on homogeneous plutonium systems. No attempt has been made to reproduce all of the data that exists. The primary objective in the collection of these data is to present representative experimental data defined in a concise, standardized format that can easily be translated into computer code input

  20. On the nucleon effective mass role to the high energy proton spallation reactions

    Energy Technology Data Exchange (ETDEWEB)

    Santos, B.M., E-mail: biank_ce@if.uff.br [Instituto de Física, Universidade Federal Fluminense, Av. Gal. Milton Tavares de Souza, 24210-346 Niterói, RJ (Brazil); Instituto Militar de Engenharia, Praça General Tibúrcio 80, 22290-270 Rio de Janeiro, RJ (Brazil); Pinheiro, A.R.C. [Centro Brasileiro de Pesquisas Físicas, Rua Dr. Xavier Sigaud 150, 22290-180 Rio de Janeiro, RJ (Brazil); Universidade Federal do Acre, BR 364 km 04, 69920-900 Rio Branco, AC (Brazil); Gonçalves, M. [Comissão Nacional de Energia Nuclear, Rua General Severiano 90, 22290-901 Rio de Janeiro, RJ (Brazil); Duarte, S.B. [Centro Brasileiro de Pesquisas Físicas, Rua Dr. Xavier Sigaud 150, 22290-180 Rio de Janeiro, RJ (Brazil); Cabral, R.G. [Instituto Militar de Engenharia, Praça General Tibúrcio 80, 22290-270 Rio de Janeiro, RJ (Brazil)

    2016-04-15

    We explore the effect of the nucleon effective mass on the dynamic evolution of the rapid phase of proton-nucleus spallation reactions. The relaxation time of the non-equilibrium phase is studied through variations in the effective mass parameter. We determine the final excitation energy of the hot residual nucleus at the end of the cascade phase, and the de-excitation of the nuclear system is then treated considering the competition between particle evaporation and fission. It is shown that the excitation energy of the hot compound residual nucleus at the end of the rapid phase depends on the changed effective mass. The particle multiplicities in the cascade and evaporation phases of the reaction were also analyzed. The use of a nucleon effective mass during the cascade phase can be regarded as accounting for many-body nuclear interactions not included explicitly in the treatment of the nucleon-nucleon interaction inside the nucleus. This procedure provides a more realistic scenario for obtaining the neutron multiplicity generated in this reaction, which is a benchmark for neutronics calculations in ADS reactors.

  1. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E; Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC

  2. H.B. Robinson-2 pressure vessel benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Remec, I.; Kam, F.B.K.

    1998-02-01

    The H. B. Robinson Unit 2 Pressure Vessel Benchmark (HBR-2 benchmark) is described and analyzed in this report. Analysis of the HBR-2 benchmark can be used as partial fulfillment of the requirements for the qualification of the methodology for calculating neutron fluence in pressure vessels, as required by the U.S. Nuclear Regulatory Commission Regulatory Guide DG-1053, Calculational and Dosimetry Methods for Determining Pressure Vessel Neutron Fluence. Section 1 of this report describes the HBR-2 benchmark and provides all the dimensions, material compositions, and neutron source data necessary for the analysis. The measured quantities, to be compared with the calculated values, are the specific activities at the end of fuel cycle 9. The characteristic feature of the HBR-2 benchmark is that it provides measurements on both sides of the pressure vessel: in the surveillance capsule attached to the thermal shield and in the reactor cavity. In section 2, the analysis of the HBR-2 benchmark is described. Calculations with the computer code DORT, based on the discrete-ordinates method, were performed with three multigroup libraries based on ENDF/B-VI: BUGLE-93, SAILOR-95 and BUGLE-96. The average ratio of the calculated-to-measured specific activities (C/M) for the six dosimeters in the surveillance capsule was 0.90 ± 0.04 for all three libraries. The average C/Ms for the cavity dosimeters (without the neptunium dosimeter) were 0.89 ± 0.10, 0.91 ± 0.10, and 0.90 ± 0.09 for the BUGLE-93, SAILOR-95 and BUGLE-96 libraries, respectively. It is expected that agreement of calculations with measurements similar to that obtained in this research should typically be observed when the discrete-ordinates method and ENDF/B-VI libraries are used for the HBR-2 benchmark analysis.

  3. Pool critical assembly pressure vessel facility benchmark

    International Nuclear Information System (INIS)

    Remec, I.; Kam, F.B.K.

    1997-07-01

    This pool critical assembly (PCA) pressure vessel wall facility benchmark (PCA benchmark) is described and analyzed in this report. Analysis of the PCA benchmark can be used for partial fulfillment of the requirements for the qualification of the methodology for pressure vessel neutron fluence calculations, as required by the US Nuclear Regulatory Commission regulatory guide DG-1053. Section 1 of this report describes the PCA benchmark and provides all data necessary for the benchmark analysis. The measured quantities, to be compared with the calculated values, are the equivalent fission fluxes. In Section 2 the analysis of the PCA benchmark is described. Calculations with the computer code DORT, based on the discrete-ordinates method, were performed for three ENDF/B-VI-based multigroup libraries: BUGLE-93, SAILOR-95, and BUGLE-96. Excellent agreement of the calculated (C) and measured (M) equivalent fission fluxes was obtained. The arithmetic average C/M for all the dosimeters (total of 31) was 0.93 ± 0.03 and 0.92 ± 0.03 for the SAILOR-95 and BUGLE-96 libraries, respectively. The average C/M ratio, obtained with the BUGLE-93 library, for the 28 measurements was 0.93 ± 0.03 (the neptunium measurements in the water and air regions were overpredicted and excluded from the average). No systematic decrease in the C/M ratios with increasing distance from the core was observed for any of the libraries used
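
The C/M figures quoted in these fluence benchmark reports are plain sample means and standard deviations over the dosimeter set; a minimal sketch (the ratios below are made up for illustration, not the PCA or HBR-2 data):

```python
# Illustrative C/M (calculated-to-measured) ratios for a set of dosimeters.
cm = [0.91, 0.95, 0.89, 0.93, 0.96, 0.92, 0.90, 0.94]

n = len(cm)
mean = sum(cm) / n
std = (sum((x - mean) ** 2 for x in cm) / (n - 1)) ** 0.5  # sample std dev
print(f"C/M = {mean:.2f} +/- {std:.2f}")
```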

  4. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.

    2003-09-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)

  5. A simplified 2D HTTR benchmark problem

    International Nuclear Information System (INIS)

    Zhang, Z.; Rahnema, F.; Pounders, J. M.; Zhang, D.; Ougouag, A.

    2009-01-01

    To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of relevant whole core configurations. In this paper we have created a numerical benchmark problem in 2D configuration typical of a high temperature gas cooled prismatic core. This problem was derived from the HTTR start-up experiment. For code-to-code verification, complex details of geometry and material specification of the physical experiments are not necessary. To this end, the benchmark problem presented here is derived by simplifications that remove the unnecessary details while retaining the heterogeneity and major physics properties from the neutronics viewpoint. Also included here is a six-group material (macroscopic) cross section library for the benchmark problem. This library was generated using the lattice depletion code HELIOS. Using this library, benchmark quality Monte Carlo solutions are provided for three different configurations (all-rods-in, partially-controlled and all-rods-out). The reference solutions include the core eigenvalue, block (assembly) averaged fuel pin fission density distributions, and absorption rate in absorbers (burnable poison and control rods). (authors)

  6. Regression Benchmarking: An Approach to Quality Assurance in Performance

    OpenAIRE

    Bulej, Lubomír

    2005-01-01

    The paper presents a short summary of our work in the area of regression benchmarking and its application to software development. Specially, we explain the concept of regression benchmarking, the requirements for employing regression testing in a software project, and methods used for analyzing the vast amounts of data resulting from repeated benchmarking. We present the application of regression benchmarking on a real software project and conclude with a glimpse at the challenges for the fu...

  7. Molecular Line Emission from Multifluid Shock Waves. I. Numerical Methods and Benchmark Tests

    Science.gov (United States)

    Ciolek, Glenn E.; Roberge, Wayne G.

    2013-05-01

    We describe a numerical scheme for studying time-dependent, multifluid, magnetohydrodynamic shock waves in weakly ionized interstellar clouds and cores. Shocks are modeled as propagating perpendicular to the magnetic field and consist of a neutral molecular fluid plus a fluid of ions and electrons. The scheme is based on operator splitting, wherein time integration of the governing equations is split into separate parts. In one part, independent homogeneous Riemann problems for the two fluids are solved using Godunov's method. In the other, equations containing the source terms for transfer of mass, momentum, and energy between the fluids are integrated using standard numerical techniques. We show that, for the frequent case where the thermal pressures of the ions and electrons are << the magnetic pressure, the Riemann problems for the neutral and ion-electron fluids have a similar mathematical structure which facilitates numerical coding. Implementation of the scheme is discussed and several benchmark tests confirming its accuracy are presented, including (1) MHD wave packets ranging over orders of magnitude in length- and timescales, (2) early evolution of multifluid shocks caused by two colliding clouds, and (3) a multifluid shock with mass transfer between the fluids by cosmic-ray ionization and ion-electron recombination, demonstrating the effect of ion mass loading on magnetic precursors of MHD shocks. An exact solution to an MHD Riemann problem forming the basis for an approximate numerical solver used in the homogeneous part of our scheme is presented, along with derivations of the analytic benchmark solutions and tests showing the convergence of the numerical algorithm.

  8. MOLECULAR LINE EMISSION FROM MULTIFLUID SHOCK WAVES. I. NUMERICAL METHODS AND BENCHMARK TESTS

    International Nuclear Information System (INIS)

    Ciolek, Glenn E.; Roberge, Wayne G.

    2013-01-01

    We describe a numerical scheme for studying time-dependent, multifluid, magnetohydrodynamic shock waves in weakly ionized interstellar clouds and cores. Shocks are modeled as propagating perpendicular to the magnetic field and consist of a neutral molecular fluid plus a fluid of ions and electrons. The scheme is based on operator splitting, wherein time integration of the governing equations is split into separate parts. In one part, independent homogeneous Riemann problems for the two fluids are solved using Godunov's method. In the other, equations containing the source terms for transfer of mass, momentum, and energy between the fluids are integrated using standard numerical techniques. We show that, for the frequent case where the thermal pressures of the ions and electrons are << magnetic pressure, the Riemann problems for the neutral and ion-electron fluids have a similar mathematical structure which facilitates numerical coding. Implementation of the scheme is discussed and several benchmark tests confirming its accuracy are presented, including (1) MHD wave packets ranging over orders of magnitude in length- and timescales, (2) early evolution of multifluid shocks caused by two colliding clouds, and (3) a multifluid shock with mass transfer between the fluids by cosmic-ray ionization and ion-electron recombination, demonstrating the effect of ion mass loading on magnetic precursors of MHD shocks. An exact solution to an MHD Riemann problem forming the basis for an approximate numerical solver used in the homogeneous part of our scheme is presented, along with derivations of the analytic benchmark solutions and tests showing the convergence of the numerical algorithm.
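
The operator-splitting idea described in these two records can be illustrated on a toy problem: linear advection with a decay source, split into a homogeneous first-order upwind transport step (the simplest Godunov-type update) and an exactly integrated source step. The grid, speed, and decay rate below are arbitrary choices for illustration, not the paper's multifluid scheme:

```python
import math

# Toy Lie splitting for u_t + a*u_x = -k*u on a periodic grid.
a, k = 1.0, 0.5
nx, L = 200, 1.0
dx = L / nx
dt = 0.5 * dx / a                     # CFL-limited time step
u = [math.exp(-((i * dx - 0.5) / 0.1) ** 2) for i in range(nx)]
s0 = sum(u)                           # initial grid "mass"

t, t_end = 0.0, 0.2
while t < t_end - 1e-12:
    # (1) homogeneous transport: upwind for a > 0 (u[-1] wraps periodically)
    u = [u[i] - a * dt / dx * (u[i] - u[i - 1]) for i in range(nx)]
    # (2) source step: exact solution of du/dt = -k*u over dt
    decay = math.exp(-k * dt)
    u = [ui * decay for ui in u]
    t += dt

# Upwind transport conserves the grid sum, so total mass decays as exp(-k*t).
print(sum(u) / s0)
```

Symmetric (Strang) splitting would restore second-order accuracy in time; the paper applies the same decomposition with Riemann solvers for each fluid and interfluid coupling terms in the source step.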

  9. Second benchmark problem for WIPP structural computations

    International Nuclear Information System (INIS)

    Krieg, R.D.; Morgan, H.S.; Hunter, T.O.

    1980-12-01

    This report describes the second benchmark problem for comparison of the structural codes used in the WIPP project. The first benchmark problem consisted of heated and unheated drifts at a depth of 790 m, whereas this problem considers a shallower level (650 m) more typical of the repository horizon. But more important, the first problem considered a homogeneous salt configuration, whereas this problem considers a configuration with 27 distinct geologic layers, including 10 clay layers - 4 of which are to be modeled as possible slip planes. The inclusion of layering introduces complications in structural and thermal calculations that were not present in the first benchmark problem. These additional complications will be handled differently by the various codes used to compute drift closure rates. This second benchmark problem will assess these codes by evaluating the treatment of these complications

  10. Physical conditions, dynamics and mass distribution in the center of the galaxy

    Science.gov (United States)

    Genzel, R.; Townes, C. H.

    1987-01-01

    Investigations of the central 10 pc of the Galaxy, and conclusions on energetics, dynamics, and mass distribution derived from X and gamma ray measurements and from infrared and microwave studies, especially from spectroscopy, high resolution imaging, and interferometry are reviewed. Evidence for and against a massive black hole is analyzed.

  11. Benchmark calculations on nuclear characteristics of JRR-4 HEU core by SRAC code system

    International Nuclear Information System (INIS)

    Arigane, Kenji

    1987-04-01

    The reduced enrichment program for the JRR-4 has been progressing based on JAERI's RERTR (Reduced Enrichment Research and Test Reactor) program. The SRAC (JAERI Thermal Reactor Standard Code System for Reactor Design and Analysis) is used for the neutronic design of the JRR-4 LEU core. This report describes benchmark calculations of the neutronic characteristics of the JRR-4 HEU core performed in order to validate the calculation method. The benchmark calculations covered various neutronic characteristics such as excess reactivity, criticality, control rod worth, thermal neutron flux distribution, void coefficient, temperature coefficient, mass coefficient, kinetic parameters, and the poisoning effect of Xe-135 buildup. As a result, it was confirmed that the calculated values are in satisfactory agreement with the measured values, and the calculational method using the SRAC was thus validated. (author)

  12. Benchmarks: The Development of a New Approach to Student Evaluation.

    Science.gov (United States)

    Larter, Sylvia

    The Toronto Board of Education Benchmarks are libraries of reference materials that demonstrate student achievement at various levels. Each library contains video benchmarks, print benchmarks, a staff handbook, and summary and introductory documents. This book is about the development and the history of the benchmark program. It has taken over 3…

  13. Systemic Liquidity Crisis with Dynamic Haircuts

    OpenAIRE

    Sever, Can

    2014-01-01

    In this paper, using network tools, I analyse the systemic impact of liquidity shocks in the interbank market in the case of endogenous haircuts. Gai, Haldane and Kapadia (2011) introduce a benchmark for liquidity crises following haircut shocks, and Gorton and Metrick (2010) present evidence from the 2007-09 crisis of haircuts increasing with banking panic. In the benchmark model, I endogenize and update haircuts dynamically during the period of stress. The results significantly differ from...

  14. Aerodynamic benchmarking of the DeepWind design

    DEFF Research Database (Denmark)

    Bedon, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    The aerodynamic benchmarking for the DeepWind rotor is conducted by comparing different rotor geometries and solutions, keeping the comparison as fair as possible. The objective of the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize the blade solicitation and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts. The blade shape is considered as a fixed parameter...

  15. Integral benchmarks with reference to thorium fuel cycle

    International Nuclear Information System (INIS)

    Ganesan, S.

    2003-01-01

    This is a power point presentation about the Indian participation in the CRP 'Evaluated Data for the Thorium-Uranium fuel cycle'. The plans and scope of the Indian participation are to provide selected integral experimental benchmarks for nuclear data validation, including Indian Thorium burn up benchmarks, post-irradiation examination studies, comparison of basic evaluated data files and analysis of selected benchmarks for Th-U fuel cycle

  16. Benchmark Evaluation of HTR-PROTEUS Pebble Bed Experimental Program

    International Nuclear Information System (INIS)

    Bess, John D.; Montierth, Leland; Köberl, Oliver

    2014-01-01

    Benchmark models were developed to evaluate 11 critical core configurations of the HTR-PROTEUS pebble bed experimental program. Various additional reactor physics measurements were performed as part of this program; currently only a total of 37 absorber rod worth measurements have been evaluated as acceptable benchmark experiments for Cores 4, 9, and 10. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the 235U enrichment of the fuel, impurities in the moderator pebbles, and the density and impurity content of the radial reflector. Calculations of keff with MCNP5 and ENDF/B-VII.0 neutron nuclear data are greater than the benchmark values but within 1% and also within the 3σ uncertainty, except for Core 4, which is the only randomly packed pebble configuration. Repeated calculations of keff with MCNP6.1 and ENDF/B-VII.1 are lower than the benchmark values and within 1% (~3σ), except for Cores 5 and 9, which calculate lower than the benchmark eigenvalues within 4σ. The primary difference between the two nuclear data libraries is the adjustment of the absorption cross section of graphite. Simulations of the absorber rod worth measurements are within 3σ of the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments

  17. Compression dynamics of quasi-spherical wire arrays with different linear mass profiles

    International Nuclear Information System (INIS)

    Mitrofanov, K. N.; Aleksandrov, V. V.; Gritsuk, A. N.; Grabovski, E. V.; Frolov, I. N.; Laukhin, Ya. N.; Oleinik, G. M.; Ol’khovskaya, O. G.

    2016-01-01

    Results of experimental studies of the implosion of quasi-spherical wire (or metalized fiber) arrays are presented. The goal of the experiments was to achieve synchronous three-dimensional compression of the plasma produced in different regions of a quasi-spherical array into its geometrical center. To search for optimal synchronization conditions, quasi-spherical arrays with different initial profiles of the linear mass were used. The following dependences of the linear mass on the poloidal angle were used: m_l(θ) ∝ sin^(-1)θ and m_l(θ) ∝ sin^(-2)θ. The compression dynamics of such arrays was compared with that of quasi-spherical arrays without linear mass profiling, m_l(θ) = const. To verify the experimental data, the spatiotemporal dynamics of plasma compression in quasi-spherical arrays was studied using various diagnostics. The experiments on three-dimensional implosion of quasi-spherical arrays made it possible to study how the frozen-in magnetic field of the discharge current penetrates into the array. By measuring the magnetic field in the plasma of a quasi-spherical array, information is obtained on the processes of plasma production and the formation of plasma flows from the wire/fiber regions with and without an additionally deposited mass. It is found that penetration of the magnetic flux depends on the initial linear mass profile m_l(θ) of the quasi-spherical array. From space-resolved spectral measurements and frame imaging of plasma X-ray emission, information is obtained on the dimensions and shape of the X-ray source formed during the implosion of a quasi-spherical array. The intensity of this source is estimated and compared with that of the Z-pinch formed during the implosion of a cylindrical array.

  18. Modeling of Phenix End-of-Life control rod withdrawal benchmark with DYN3D SFR version

    Energy Technology Data Exchange (ETDEWEB)

    Nikitin, Evgeny; Fridman, Emil [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Reactor Safety

    2017-06-01

    The reactor dynamics code DYN3D is currently under extension for Sodium cooled Fast Reactor applications. The control rod withdrawal benchmark from the Phenix End-of-Life experiments was selected for verification and validation purposes. This report presents some selected results to demonstrate the feasibility of using DYN3D for steady-state Sodium cooled Fast Reactor analyses.

  19. The extent of benchmarking in the South African financial sector

    Directory of Open Access Journals (Sweden)

    W Vermeulen

    2014-06-01

    Benchmarking is the process of identifying, understanding and adapting outstanding practices from within the organisation or from other businesses, to help improve performance. The importance of benchmarking as an enabler of business excellence has necessitated an in-depth investigation into the current state of benchmarking in South Africa. This research project highlights the fact that respondents realise the importance of benchmarking, but that various problems hinder its effective implementation. Based on the research findings, recommendations for achieving success are suggested.

  20. Reliability and mass analysis of dynamic power conversion systems with parallel or standby redundancy

    Science.gov (United States)

    Juhasz, Albert J.; Bloomfield, Harvey S.

    1987-01-01

    A combinatorial reliability approach was used to identify potential dynamic power conversion systems for space mission applications. A reliability and mass analysis was also performed, specifically for a 100-kWe nuclear Brayton power conversion system with parallel redundancy. Although this study was done for a reactor outlet temperature of 1100 K, preliminary system mass estimates are also included for reactor outlet temperatures ranging up to 1500 K.
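
The combinatorial reliability of a redundant converter set can be sketched with a simple k-of-n binomial model; the unit reliability below is an illustrative assumption, not a value from the study:

```python
from math import comb

def k_of_n_reliability(k: int, n: int, r: float) -> float:
    """Probability that at least k of n identical, independent units survive."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# Illustrative unit reliability (assumed, not from the study)
r_unit = 0.90
# 1-of-2 parallel redundancy: either converter alone can carry the load
r_parallel = k_of_n_reliability(1, 2, r_unit)   # 1 - (1 - 0.9)**2 = 0.99
```

The same function covers standby-style "at least k must work" configurations by varying k, which is the kind of trade between redundancy level, reliability, and added mass the study quantifies.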

  1. Reliability and mass analysis of dynamic power conversion systems with parallel or standby redundancy

    Science.gov (United States)

    Juhasz, A. J.; Bloomfield, H. S.

    1985-01-01

    A combinatorial reliability approach is used to identify potential dynamic power conversion systems for space mission applications. A reliability and mass analysis is also performed, specifically for a 100 kWe nuclear Brayton power conversion system with parallel redundancy. Although this study is done for a reactor outlet temperature of 1100 K, preliminary system mass estimates are also included for reactor outlet temperatures ranging up to 1500 K.

  2. Predictive uncertainty reduction in coupled neutron-kinetics/thermal hydraulics modeling of the BWR-TT2 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Badea, Aurelian F., E-mail: aurelian.badea@kit.edu [Karlsruhe Institute of Technology, Vincenz-Prießnitz-Str. 3, 76131 Karlsruhe (Germany); Cacuci, Dan G. [Center for Nuclear Science and Energy/Dept. of ME, University of South Carolina, 300 Main Street, Columbia, SC 29208 (United States)

    2017-03-15

    Highlights: • BWR Turbine Trip 2 (BWR-TT2) benchmark. • Substantial (up to 50%) reduction of uncertainties in the predicted transient power. • 6660 uncertain model parameters were calibrated. - Abstract: By applying a comprehensive predictive modeling methodology, this work demonstrates a substantial (up to 50%) reduction of uncertainties in the predicted total transient power in the BWR Turbine Trip 2 (BWR-TT2) benchmark while calibrating the numerical simulation of this benchmark, comprising 6090 macroscopic cross sections, and 570 thermal-hydraulics parameters involved in modeling the phase-slip correlation, transient outlet pressure, and total mass flow. The BWR-TT2 benchmark is based on an experiment that was carried out in 1977 in the NPP Peach Bottom 2, involving the closure of the turbine stop valve which caused a pressure wave that propagated with attenuation into the reactor core. The condensation of the steam in the reactor core caused by the pressure increase led to a positive reactivity insertion. The subsequent rise of power was limited by the feedback and the insertion of the control rods. The BWR-TT2 benchmark was modeled with the three-dimensional reactor physics code system DYN3D, by coupling neutron kinetics with two-phase thermal-hydraulics. All 6660 DYN3D model parameters were calibrated by applying a predictive modeling methodology that combines experimental and computational information to produce optimally predicted best-estimate results with reduced predicted uncertainties. Simultaneously, the predictive modeling methodology yields optimally predicted values for the BWR total transient power while reducing significantly the accompanying predicted standard deviations.
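
The variance-reduction mechanism behind such calibration can be illustrated with a scalar best-estimate update that combines a computed prior with a measurement (a generic data-assimilation sketch, not the benchmark's actual multi-parameter methodology):

```python
def best_estimate(x_prior: float, var_prior: float,
                  y_meas: float, var_meas: float):
    """Combine a computed prior and a measurement into a best estimate.

    The minimum-variance weight is the usual Kalman-type gain; the
    posterior variance is always smaller than the prior variance.
    """
    gain = var_prior / (var_prior + var_meas)
    x_post = x_prior + gain * (y_meas - x_prior)
    var_post = (1.0 - gain) * var_prior
    return x_post, var_post

# Illustrative numbers (assumed): computed power 100 +/- 10, measured 90 +/- 10
x, v = best_estimate(100.0, 100.0, 90.0, 100.0)   # -> (95.0, 50.0)
```

The benchmark's methodology does this simultaneously for thousands of correlated parameters, but the scalar case already shows why calibrated predictions carry smaller standard deviations than either input alone.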

  3. Predictive uncertainty reduction in coupled neutron-kinetics/thermal hydraulics modeling of the BWR-TT2 benchmark

    International Nuclear Information System (INIS)

    Badea, Aurelian F.; Cacuci, Dan G.

    2017-01-01

    Highlights: • BWR Turbine Trip 2 (BWR-TT2) benchmark. • Substantial (up to 50%) reduction of uncertainties in the predicted transient power. • 6660 uncertain model parameters were calibrated. - Abstract: By applying a comprehensive predictive modeling methodology, this work demonstrates a substantial (up to 50%) reduction of uncertainties in the predicted total transient power in the BWR Turbine Trip 2 (BWR-TT2) benchmark while calibrating the numerical simulation of this benchmark, comprising 6090 macroscopic cross sections, and 570 thermal-hydraulics parameters involved in modeling the phase-slip correlation, transient outlet pressure, and total mass flow. The BWR-TT2 benchmark is based on an experiment that was carried out in 1977 in the NPP Peach Bottom 2, involving the closure of the turbine stop valve which caused a pressure wave that propagated with attenuation into the reactor core. The condensation of the steam in the reactor core caused by the pressure increase led to a positive reactivity insertion. The subsequent rise of power was limited by the feedback and the insertion of the control rods. The BWR-TT2 benchmark was modeled with the three-dimensional reactor physics code system DYN3D, by coupling neutron kinetics with two-phase thermal-hydraulics. All 6660 DYN3D model parameters were calibrated by applying a predictive modeling methodology that combines experimental and computational information to produce optimally predicted best-estimate results with reduced predicted uncertainties. Simultaneously, the predictive modeling methodology yields optimally predicted values for the BWR total transient power while reducing significantly the accompanying predicted standard deviations.

  4. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  5. Benchmarking to improve the quality of cystic fibrosis care.

    Science.gov (United States)

    Schechter, Michael S

    2012-11-01

    Benchmarking involves the ascertainment of healthcare programs with most favorable outcomes as a means to identify and spread effective strategies for delivery of care. The recent interest in the development of patient registries for patients with cystic fibrosis (CF) has been fueled in part by an interest in using them to facilitate benchmarking. This review summarizes reports of how benchmarking has been operationalized in attempts to improve CF care. Although certain goals of benchmarking can be accomplished with an exclusive focus on registry data analysis, benchmarking programs in Germany and the United States have supplemented these data analyses with exploratory interactions and discussions to better understand successful approaches to care and encourage their spread throughout the care network. Benchmarking allows the discovery and facilitates the spread of effective approaches to care. It provides a pragmatic alternative to traditional research methods such as randomized controlled trials, providing insights into methods that optimize delivery of care and allowing judgments about the relative effectiveness of different therapeutic approaches.

  6. Dynamical and topological considerations in low and high mass diffractive dissociation

    International Nuclear Information System (INIS)

    Bishari, M.

    1978-01-01

    The topological structure of a given process completely specifies the 1/N dependence. However dynamics seems to be crucial in characterizing strongly interacting reactions, as illustrated in the study of elastic scattering, low mass diffraction and the triple pomeron mechanism. The ''1/N dual unitarization'' scheme is a viable framework for Gribov's Reggeon field theory, since it clarifies and determines the bare parameters of Gribov's Lagrangian. (author)

  7. Expanding the linear dynamic range for quantitative liquid chromatography-high resolution mass spectrometry utilizing natural isotopologue signals

    International Nuclear Information System (INIS)

    Liu, Hanghui; Lam, Lily; Yan, Lin; Chi, Bert; Dasgupta, Purnendu K.

    2014-01-01

    Highlights: • Less abundant isotopologue ions were utilized to decrease detector saturation. • A 25–50 fold increase in the upper limit of dynamic range was demonstrated. • Linear dynamic range was expanded without compromising mass resolution. - Abstract: The linear dynamic range (LDR) for quantitative liquid chromatography–mass spectrometry can be extended until ionization saturation is reached by using a number of target isotopologue ions in addition to the normally used target ion that provides the highest sensitivity. Less abundant isotopologue ions extend the LDR: the lower ion abundance decreases the probability of ion detector saturation. Effectively the sensitivity decreases and the upper limit of the LDR increases. We show in this paper that the technique is particularly powerful with a high resolution time of flight mass spectrometer because the data for all ions are automatically acquired, and we demonstrated this for four small organic molecules; the upper limits of LDRs increased by 25–50 times.
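
The relative intensity of the less abundant isotopologue signals can be estimated with a binomial model; the sketch below considers only ¹³C substitution (an assumption for illustration; real spectra also include H, N, O and S isotopes):

```python
from math import comb

P_C13 = 0.0107  # natural abundance of 13C

def c13_fraction(n_carbons: int, k: int, p: float = P_C13) -> float:
    """Fraction of molecules carrying exactly k 13C atoms (carbon-only binomial model)."""
    return comb(n_carbons, k) * p**k * (1 - p)**(n_carbons - k)

# For a 20-carbon molecule the M+1 isotopologue carries roughly 17% of the
# molecules, so quantifying on M+1 instead of M trades sensitivity for a
# correspondingly higher detector-saturation ceiling.
m0 = c13_fraction(20, 0)
m1 = c13_fraction(20, 1)
```

Smaller fractions (M+2, M+3, ...) extend the ceiling further, which is consistent with the 25–50 fold LDR gains the paper reports.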

  8. Benchmarking and Regulation

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    … The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques …

  9. Calculation of the Thermal Radiation Benchmark Problems for a CANDU Fuel Channel Analysis Using the CFX-10 Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook

    2006-07-15

    To justify the use of a commercial Computational Fluid Dynamics (CFD) code for a CANDU fuel channel analysis, especially for the radiation heat transfer dominant conditions, the CFX-10 code is tested against three benchmark problems which were used for the validation of a radiation heat transfer in the CANDU analysis code, a CATHENA. These three benchmark problems are representative of the CANDU fuel channel configurations from a simple geometry to whole fuel channel geometry. With assumptions of a non-participating medium completely enclosed with the diffuse, gray and opaque surfaces, the solutions of the benchmark problems are obtained by the concept of surface resistance to radiation accounting for the view factors and the emissivities. The view factors are calculated by the program MATRIX version 1.0 avoiding the difficulty of hand calculation for the complex geometries. For the solutions of the benchmark problems, the temperature or the net radiation heat flux boundary conditions are prescribed for each radiating surface to determine the radiation heat transfer rate or the surface temperature, respectively by using the network method. The Discrete Transfer Model (DTM) is used for the CFX-10 radiation model and its calculation results are compared with the solutions of the benchmark problems. The CFX-10 results for the three benchmark problems are in close agreement with these solutions, so it is concluded that the CFX-10 with a DTM radiation model can be applied to the CANDU fuel channel analysis where a surface radiation heat transfer is a dominant mode of the heat transfer.
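
The surface-resistance (network) method that these benchmark solutions rely on reduces, for a two-surface gray enclosure, to a single series resistance; below is a minimal sketch with illustrative geometry, not one of the CATHENA benchmark cases:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def two_surface_exchange(t1, t2, a1, a2, eps1, eps2, f12):
    """Net radiative heat flow [W] from surface 1 to surface 2 in a gray,
    diffuse, opaque two-surface enclosure (network method): two surface
    resistances (1-eps)/(eps*A) in series with one space resistance 1/(A1*F12)."""
    r_total = (1 - eps1) / (eps1 * a1) + 1.0 / (a1 * f12) + (1 - eps2) / (eps2 * a2)
    return SIGMA * (t1**4 - t2**4) / r_total

# Illustrative case: two 1 m^2 black surfaces facing each other with F12 = 1
q = two_surface_exchange(600.0, 500.0, 1.0, 1.0, 1.0, 1.0, 1.0)
```

For black surfaces the surface resistances vanish and the flow collapses to σ·A·F₁₂·(T₁⁴ − T₂⁴), which is a convenient hand check; enclosures with more surfaces require the full view-factor matrix, as the record notes.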

  10. Calculation of the Thermal Radiation Benchmark Problems for a CANDU Fuel Channel Analysis Using the CFX-10 Code

    International Nuclear Information System (INIS)

    Kim, Hyoung Tae; Park, Joo Hwan; Rhee, Bo Wook

    2006-07-01

    To justify the use of a commercial Computational Fluid Dynamics (CFD) code for a CANDU fuel channel analysis, especially for the radiation heat transfer dominant conditions, the CFX-10 code is tested against three benchmark problems which were used for the validation of a radiation heat transfer in the CANDU analysis code, a CATHENA. These three benchmark problems are representative of the CANDU fuel channel configurations from a simple geometry to whole fuel channel geometry. With assumptions of a non-participating medium completely enclosed with the diffuse, gray and opaque surfaces, the solutions of the benchmark problems are obtained by the concept of surface resistance to radiation accounting for the view factors and the emissivities. The view factors are calculated by the program MATRIX version 1.0 avoiding the difficulty of hand calculation for the complex geometries. For the solutions of the benchmark problems, the temperature or the net radiation heat flux boundary conditions are prescribed for each radiating surface to determine the radiation heat transfer rate or the surface temperature, respectively by using the network method. The Discrete Transfer Model (DTM) is used for the CFX-10 radiation model and its calculation results are compared with the solutions of the benchmark problems. The CFX-10 results for the three benchmark problems are in close agreement with these solutions, so it is concluded that the CFX-10 with a DTM radiation model can be applied to the CANDU fuel channel analysis where a surface radiation heat transfer is a dominant mode of the heat transfer

  11. Benchmarking multi-dimensional large strain consolidation analyses

    International Nuclear Information System (INIS)

    Priestley, D.; Fredlund, M.D.; Van Zyl, D.

    2010-01-01

    Analyzing the consolidation of tailings slurries and dredged fills requires a more extensive formulation than is used for common (small strain) consolidation problems. Large strain consolidation theories have traditionally been limited to 1-D formulations. SoilVision Systems has developed the capacity to analyze large strain consolidation problems in 2 and 3-D. The benchmarking of such formulations is not a trivial task. This paper presents several examples of modeling large strain consolidation in the beta versions of the new software. These examples were taken from the literature and were used to benchmark the large strain formulation used by the new software. The benchmarks reported here are: a comparison to the consolidation software application CONDES0, Townsend's Scenario B and a multi-dimensional analysis of long-term column tests performed on oil sands tailings. All three of these benchmarks were attained using the SVOffice suite. (author)

  12. A Study of Fixed-Order Mixed Norm Designs for a Benchmark Problem in Structural Control

    Science.gov (United States)

    Whorton, Mark S.; Calise, Anthony J.; Hsu, C. C.

    1998-01-01

    This study investigates the use of H2, μ-synthesis, and mixed H2/μ methods to construct full-order controllers and optimized controllers of fixed dimensions. The benchmark problem definition is first extended to include uncertainty within the controller bandwidth in the form of parametric uncertainty representative of uncertainty in the natural frequencies of the design model. The sensitivity of H2 designs to unmodelled dynamics and parametric uncertainty is evaluated for a range of controller levels of authority. Next, μ-synthesis methods are applied to design full-order compensators that are robust to both unmodelled dynamics and to parametric uncertainty. Finally, a set of mixed H2/μ compensators are designed which are optimized for a fixed compensator dimension. These mixed norm designs recover the H2 design performance levels while providing the same levels of robust stability as the μ designs. It is shown that designing with the mixed norm approach permits higher levels of controller authority for which the H2 designs are destabilizing. The benchmark problem is that of an active tendon system. The controller designs are all based on the use of acceleration feedback.

  13. JENDL-4.0 benchmarking for fission reactor applications

    International Nuclear Information System (INIS)

    Chiba, Go; Okumura, Keisuke; Sugino, Kazuteru; Nagaya, Yasunobu; Yokoyama, Kenji; Kugo, Teruhiko; Ishikawa, Makoto; Okajima, Shigeaki

    2011-01-01

    Benchmark testing for the newly developed Japanese evaluated nuclear data library JENDL-4.0 is carried out by using a huge amount of integral data. Benchmark calculations are performed with a continuous-energy Monte Carlo code and with the deterministic procedure, which has been developed for fast reactor analyses in Japan. Through the present benchmark testing using a wide range of benchmark data, significant improvement in the performance of JENDL-4.0 for fission reactor applications is clearly demonstrated in comparison with the former library JENDL-3.3. Much more accurate and reliable prediction for neutronic parameters for both thermal and fast reactors becomes possible by using the library JENDL-4.0. (author)

  14. Dynamical twisted mass fermions and baryon spectroscopy

    International Nuclear Information System (INIS)

    Drach, V.

    2010-06-01

    The aim of this work is an ab initio computation of the baryon masses starting from quantum chromodynamics (QCD). This theory describes the interaction between quarks and gluons and has been established at high energy thanks to one of its fundamental properties: the asymptotic freedom. This property predicts that the running coupling constant tends to zero at high energy and thus that perturbative expansions in the coupling constant are justified in this regime. On the contrary the low energy dynamics can only be understood in terms of a non perturbative approach. To date, the only known method that allows the computation of observables in this regime together with a control of its systematic effects is called lattice QCD. It consists in formulating the theory on an Euclidean space-time and to evaluating numerically suitable functional integrals. First chapter is an introduction to the QCD in the continuum and on a discrete space time. The chapter 2 describes the formalism of maximally twisted fermions used in the European Twisted Mass (ETM) collaboration. The chapter 3 deals with the techniques needed to build hadronic correlator starting from gauge configuration. We then discuss how we determine hadron masses and their statistical errors. The numerical estimation of functional integral is explained in chapter 4. It is stressed that it requires sophisticated algorithm and massive parallel computing on Blue-Gene type architecture. Gauge configuration production is an important part of the work realized during my Ph.D. Chapter 5 is a critical review on chiral perturbation theory in the baryon sector. The two last chapter are devoted to the analysis in the light and strange baryon sector. Systematics and chiral extrapolation are extensively discussed. (author)

  15. Improving the accuracy of dynamic mass calculation

    Directory of Open Access Journals (Sweden)

    Oleksandr F. Dashchenko

    2015-06-01

    With the acceleration of goods transportation, cargo accounting plays an important role in today's global and complex environment. Weight is the most reliable indicator for materials control. Unlike many other variables that can be measured indirectly, weight can be measured directly and accurately. Using strain-gauge transducers, a weight value can be obtained within a few milliseconds; such values correspond to the momentary load acting on the sensor. Determination of the weight of moving transport is only possible by appropriate processing of the sensor signal. The aim of the research is to develop a methodology for weighing freight rolling stock that increases the accuracy of dynamic mass measurement, in particular for a wagon in motion. Apart from time-series methods, preliminary filtering is used to improve the accuracy of the calculation. The results of the simulation are presented.
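
A minimal sketch of the kind of preliminary filtering the record refers to: a moving average that suppresses the oscillatory component of a strain-gauge signal before the static mass is estimated. The window length and the signal model below are illustrative assumptions, not the article's methodology:

```python
def moving_average(samples, window):
    """Smooth a load-cell sample stream with a simple moving average."""
    if window < 1 or window > len(samples):
        raise ValueError("window must be between 1 and len(samples)")
    out = []
    running = sum(samples[:window])
    out.append(running / window)
    for i in range(window, len(samples)):
        running += samples[i] - samples[i - window]   # slide the window
        out.append(running / window)
    return out

# A 25 t wagon signal with a +/-0.5 t oscillation of period 2 samples:
signal = [25.0 + (0.5 if i % 2 == 0 else -0.5) for i in range(100)]
smoothed = moving_average(signal, 2)   # a window of one full period cancels it
```

Matching the window to the dominant oscillation period removes that component exactly; in practice the period must itself be estimated from the signal, which is where the time-series methods come in.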

  16. Benchmark problems for numerical implementations of phase field models

    International Nuclear Information System (INIS)

    Jokisaari, A. M.; Voorhees, P. W.; Guyer, J. E.; Warren, J.; Heinonen, O. G.

    2016-01-01

    Here, we present the first set of benchmark problems for phase field models that are being developed by the Center for Hierarchical Materials Design (CHiMaD) and the National Institute of Standards and Technology (NIST). While many scientific research areas use a limited set of well-established software, the growing phase field community continues to develop a wide variety of codes and lacks benchmark problems to consistently evaluate the numerical performance of new implementations. Phase field modeling has become significantly more popular as computational power has increased and is now becoming mainstream, driving the need for benchmark problems to validate and verify new implementations. We follow the example set by the micromagnetics community to develop an evolving set of benchmark problems that test the usability, computational resources, numerical capabilities and physical scope of phase field simulation codes. In this paper, we propose two benchmark problems that cover the physics of solute diffusion and growth and coarsening of a second phase via a simple spinodal decomposition model and a more complex Ostwald ripening model. We demonstrate the utility of benchmark problems by comparing the results of simulations performed with two different adaptive time stepping techniques, and we discuss the needs of future benchmark problems. The development of benchmark problems will enable the results of quantitative phase field models to be confidently incorporated into integrated computational materials science and engineering (ICME), an important goal of the Materials Genome Initiative.
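
The spirit of such benchmark problems, verifying a numerical implementation against a known solution, can be sketched on a much simpler PDE: explicit-Euler diffusion of a sine mode, whose amplitude must decay as exp(−π²t). This is a generic verification sketch, not one of the CHiMaD/NIST problems:

```python
import math

def diffuse_sine(nx: int = 51, t_end: float = 0.05) -> float:
    """Solve u_t = u_xx on [0,1] with u=0 at both ends and u0 = sin(pi x)
    by explicit Euler; return the amplitude max|u| at t_end."""
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx * dx                      # stability requires dt <= 0.5 dx^2
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    t = 0.0
    while t < t_end - 1e-12:
        step = min(dt, t_end - t)           # land exactly on t_end
        u = ([0.0]
             + [u[i] + step * (u[i-1] - 2*u[i] + u[i+1]) / dx**2
                for i in range(1, nx - 1)]
             + [0.0])
        t += step
    return max(abs(v) for v in u)

amp = diffuse_sine()
exact = math.exp(-math.pi**2 * 0.05)        # analytic amplitude at t = 0.05
```

A phase field benchmark plays the same role for far richer physics: a fixed problem statement whose trusted solution exposes errors in discretization, time stepping, or implementation.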

  17. A 3D stylized half-core CANDU benchmark problem

    International Nuclear Information System (INIS)

    Pounders, Justin M.; Rahnema, Farzad; Serghiuta, Dumitru; Tholammakkil, John

    2011-01-01

    A 3D stylized half-core Canadian deuterium uranium (CANDU) reactor benchmark problem is presented. The benchmark problem is comprised of a heterogeneous lattice of 37-element natural uranium fuel bundles, heavy water moderated, heavy water cooled, with adjuster rods included as reactivity control devices. Furthermore, a 2-group macroscopic cross section library has been developed for the problem to increase the utility of this benchmark for full-core deterministic transport methods development. Monte Carlo results are presented for the benchmark problem in cooled, checkerboard void, and full coolant void configurations.
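
A 2-group macroscopic library of this kind supports quick sanity checks such as the infinite-medium multiplication factor; the cross sections below are illustrative numbers, not values from the benchmark library:

```python
def k_infinity(nu_sigf1, nu_sigf2, siga1, siga2, sigs12):
    """Two-group infinite-medium multiplication factor.

    Assumes all fission neutrons are born fast, no up-scatter and no
    leakage: the thermal flux ratio is sigs12/siga2, so
    k_inf = (nu*Sigf1 + nu*Sigf2 * sigs12/siga2) / (siga1 + sigs12).
    """
    return (nu_sigf1 + nu_sigf2 * sigs12 / siga2) / (siga1 + sigs12)

# Illustrative macroscopic cross sections [cm^-1] (assumed, not from the library);
# chosen so the result is exactly critical, k_inf = 1.0
k = k_infinity(nu_sigf1=0.005, nu_sigf2=0.100, siga1=0.010, siga2=0.080, sigs12=0.020)
```

Full-core transport solutions of the benchmark of course require the spatially heterogeneous library and a leakage treatment, but this zero-dimensional check is a useful first test of any new 2-group data set.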

  18. International handbook of evaluated criticality safety benchmark experiments

    International Nuclear Information System (INIS)

    2010-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Organization for Economic Cooperation and Development - Nuclear Energy Agency (OECD-NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirement and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span over 55,000 pages and contain 516 evaluations with benchmark specifications for 4,405 critical, near critical, or subcritical configurations, 24 criticality alarm placement / shielding configurations with multiple dose points for each, and 200 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications. Experiments that are found unacceptable for use as criticality safety benchmark experiments are discussed in these evaluations; however, benchmark specifications are not derived for such experiments (in some cases models are provided in an appendix). Approximately 770 experimental configurations are categorized as unacceptable for use as criticality safety benchmark experiments. Additional evaluations are in progress and will be

  19. Repeated Results Analysis for Middleware Regression Benchmarking

    Czech Academy of Sciences Publication Activity Database

    Bulej, Lubomír; Kalibera, T.; Tůma, P.

    2005-01-01

    Roč. 60, - (2005), s. 345-358 ISSN 0166-5316 R&D Projects: GA ČR GA102/03/0672 Institutional research plan: CEZ:AV0Z10300504 Keywords : middleware benchmarking * regression benchmarking * regression testing Subject RIV: JD - Computer Applications, Robotics Impact factor: 0.756, year: 2005

  20. ZZ WPPR, Pu Recycling Benchmark Results

    International Nuclear Information System (INIS)

    Lutz, D.; Mattes, M.; Delpech, Marc; Juanola, Marc

    2002-01-01

    Description of program or function: The NEA NSC Working Party on Physics of Plutonium Recycling has commissioned a series of benchmarks covering: - Plutonium recycling in pressurized-water reactors; - Void reactivity effect in pressurized-water reactors; - Fast Plutonium-burner reactors: beginning of life; - Plutonium recycling in fast reactors; - Multiple recycling in advanced pressurized-water reactors. The results have been published (see references). ZZ-WPPR-1-A/B contains graphs and tables relative to the PWR Mox pin cell benchmark, representing typical fuel for plutonium recycling, one corresponding to a first cycle, the second to a fifth cycle. These computer-readable files contain the complete set of results, while the printed report contains only a subset. ZZ-WPPR-2-CYC1 contains the results from cycle 1 of the multiple recycling benchmarks.

  1. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  2. Benchmark referencing of neutron dosimetry measurements

    International Nuclear Information System (INIS)

    Eisenhauer, C.M.; Grundl, J.A.; Gilliam, D.M.; McGarry, E.D.; Spiegel, V.

    1980-01-01

    The concept of benchmark referencing involves interpretation of dosimetry measurements in applied neutron fields in terms of similar measurements in benchmark fields whose neutron spectra and intensity are well known. The main advantage of benchmark referencing is that it minimizes or eliminates many types of experimental uncertainties such as those associated with absolute detection efficiencies and cross sections. In this paper we consider the cavity external to the pressure vessel of a power reactor as an example of an applied field. The pressure vessel cavity is an accessible location for exploratory dosimetry measurements aimed at understanding embrittlement of pressure vessel steel. Comparisons with calculated predictions of neutron fluence and spectra in the cavity provide a valuable check of the computational methods used to estimate pressure vessel safety margins for pressure vessel lifetimes
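
In its simplest form, benchmark referencing reduces to a ratio measurement in which the detector's absolute efficiency cancels; the sketch below uses made-up count rates and a made-up benchmark fluence purely for illustration:

```python
def referenced_fluence(count_applied: float, count_benchmark: float,
                       fluence_benchmark: float) -> float:
    """Infer the applied-field fluence from the ratio of detector responses.

    Because the same detector is used in both fields, its absolute
    efficiency cancels in the ratio; to first order the detection cross
    section cancels as well, provided the two spectra are similar.
    """
    return fluence_benchmark * count_applied / count_benchmark

# Made-up numbers for illustration:
phi = referenced_fluence(count_applied=2.0e3, count_benchmark=1.0e4,
                         fluence_benchmark=5.0e12)   # -> 1.0e12
```

When the applied spectrum differs appreciably from the benchmark spectrum, a calculated spectral-mismatch correction must multiply the ratio, which is exactly where the computed cavity spectra mentioned in the record enter.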

  3. Atomic Energy Research benchmark activity

    International Nuclear Information System (INIS)

    Makai, M.

    1998-01-01

    The test problems utilized in the validation and verification process of computer programs in Atomic Energy Research are collected into a single set. This is the first step towards issuing a volume in which tests for VVER are collected, along with reference solutions and a number of solutions. The benchmarks do not include the ZR-6 experiments because they have been published, along with a number of comparisons, in the final reports of TIC. The present collection focuses on operational and mathematical benchmarks, which cover almost the entire range of reactor calculations. (Author)

  4. Comparison of three-dimensional ocean general circulation models on a benchmark problem

    International Nuclear Information System (INIS)

    Chartier, M.

    1990-12-01

    A French and an American ocean general circulation model for deep-sea disposal of radioactive wastes are compared on a benchmark test problem. Both models are three-dimensional. They solve the hydrostatic primitive equations of the ocean with two different finite-difference techniques. Results show that the dynamics simulated by both models are consistent. Several methods for starting a model run from a known state are tested in the French model: the diagnostic method, the prognostic method, acceleration of convergence and the robust-diagnostic method.

  5. The CMSSW benchmarking suite: Using HEP code to measure CPU performance

    International Nuclear Information System (INIS)

    Benelli, G

    2010-01-01

    The demanding computing needs of the CMS experiment require thoughtful planning and management of its computing infrastructure. A key factor in this process is the use of realistic benchmarks when assessing the computing power of the different architectures available. In recent years a discrepancy has been observed between the CPU performance estimates given by the reference benchmark for HEP computing (SPECint) and actual performances of HEP code. Making use of the CPU performance tools from the CMSSW performance suite, comparative CPU performance studies have been carried out on several architectures. A benchmarking suite has been developed and integrated in the CMSSW framework, to allow computing centers and interested third parties to benchmark architectures directly with CMSSW. The CMSSW benchmarking suite can be used out of the box, to test and compare several machines in terms of CPU performance and to report the different benchmarking scores (e.g. by processing step) and results at the desired level of detail. In this talk we briefly describe the CMSSW software performance suite, and in detail the CMSSW benchmarking suite client/server design, the performance data analysis and the available CMSSW benchmark scores. The experience in the use of HEP code for benchmarking will be discussed and CMSSW benchmark results presented.
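
A toy version of the underlying idea, scoring a machine by the throughput of a representative workload rather than by an abstract synthetic metric, might look as follows; the workload and score here are purely illustrative stand-ins, while the real suite runs actual CMSSW processing steps:

```python
import time

def score_workload(workload, events: int, repeats: int = 3) -> float:
    """Run a per-event workload repeatedly; report the best events/second."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        for e in range(events):
            workload(e)
        best = min(best, time.perf_counter() - start)
    return events / best

def toy_event(seed: int) -> float:
    """Stand-in for one event's worth of reconstruction arithmetic."""
    x = float(seed % 97) + 1.0
    for _ in range(200):
        x = (x * 1.000001) % 1000.0 + 1.0
    return x

throughput = score_workload(toy_event, events=2000)   # events per second
```

Taking the best of several repeats reduces sensitivity to background load; reporting scores per processing step, as the suite does, then localizes which part of the code an architecture runs poorly.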

  6. PMLB: a large benchmark suite for machine learning evaluation and comparison.

    Science.gov (United States)

    Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H

    2017-01-01

    The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.

  7. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used

  8. Mass accretion and nested array dynamics from Ni-Clad Ti-Al wire array Z pinches

    International Nuclear Information System (INIS)

    Jones, Brent Manley; Jennings, Christopher A.; Coverdale, Christine Anne; Cuneo, Michael Edward; Maron, Yitzhak; LePell, Paul David; Deeney, Christopher

    2010-01-01

    Analysis of 50 mm diameter wire arrays at the Z Accelerator has shown experimentally the accretion of mass in a stagnating z pinch and provided insight into details of the radiating plasma species and plasma conditions. This analysis focused on nested wire arrays with a 2:1 (outer:inner) mass, radius, and wire number ratio where Al wires were fielded on the outer array and Ni-clad Ti wires were fielded on the inner array. In this presentation, we will present analysis of data from other mixed Al/Ni-clad Ti configurations to further evaluate nested wire array dynamics and mass accretion. These additional configurations include the opposite configuration to that described above (Ni-clad Ti wires on the outer array, with Al wires on the inner array) as well as higher wire number Al configurations fielded to vary the interaction of the two arrays. These same variations were also assessed for a smaller diameter nested array configuration (40 mm). Variations in the emitted radiation and plasma conditions will be presented, along with a discussion of what the results indicate about the nested array dynamics. Additional evidence for mass accretion will also be presented.

  9. The extent of benchmarking in the South African financial sector

    OpenAIRE

    W Vermeulen

    2014-01-01

    Benchmarking is the process of identifying, understanding and adapting outstanding practices from within the organisation or from other businesses, to help improve performance. The importance of benchmarking as an enabler of business excellence has necessitated an in-depth investigation into the current state of benchmarking in South Africa. This research project highlights the fact that respondents realise the importance of benchmarking, but that various problems hinder the effective implementation...

  10. Attitude dynamics and control of a spacecraft using shifting mass distribution

    Science.gov (United States)

    Ahn, Young Tae

    Spacecraft require attitude control methods suited to their mission type and special tasks. The dynamics and attitude control of a spacecraft with a shifting mass distribution within the system are examined. Conventional attitude control actuators are well developed and widely used; however, a shifting mass distribution can complement spacecraft attitude control, save mass, and extend a satellite's life. In practice, this can be done by moving mass from one tank to another, much as an airplane shifts fuel to balance weight. Used in conjunction with other attitude control devices, the shifting mass distribution concept can augment the three-axis attitude control process. Shifting mass changes the center of mass and/or the moments of inertia of the system, which in turn changes the attitude behavior of the system. This dissertation consists of two parts. First, the equations of motion for the shifting mass concept (also known as morphing) are developed and tested for their effects on attitude control by showing how shifting the mass changes the spacecraft's attitude behavior. Second, a method for optimal mass redistribution is presented using combinatorial optimization theory under constraints, closing with a simple example demonstrating an optimal reconfiguration. The procedure of optimal reconfiguration from one mass distribution to another to accomplish attitude control has been demonstrated for several simple examples. Mass shifting could work as an attitude controller for fine-tuning attitude behavior in small satellites. Various constraints can be applied for different situations, such as prohibiting mass transfer between two tanks connected by a failed pipe, or fixing the total amount of shifted mass per pipe for the time-optimal solution. Euler angle changes influenced by the mass reconfiguration are accomplished while stability ...
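    The core geometric effect — moving propellant between tanks shifts the system center of mass — can be sketched as follows. The tank positions and masses are illustrative, not taken from the dissertation:

    ```python
    def center_of_mass(masses, positions):
        """Center of mass of point masses at 3-D positions."""
        total = sum(masses)
        return tuple(sum(m * p[i] for m, p in zip(masses, positions)) / total
                     for i in range(3))

    def shift_mass(masses, src, dst, dm):
        """Move dm units of mass from tank src to tank dst."""
        out = list(masses)
        out[src] -= dm
        out[dst] += dm
        return out

    tanks = [(-1.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # two tanks on the x-axis
    masses = [5.0, 5.0]
    before = center_of_mass(masses, tanks)                        # balanced: COM at origin
    after = center_of_mass(shift_mass(masses, 0, 1, 2.0), tanks)  # COM moves toward tank 1
    ```

    The same bookkeeping extends to the inertia tensor via the parallel-axis theorem, which is what ultimately alters the attitude dynamics.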

  11. Mass profile and dynamical status of the z ~ 0.8 galaxy cluster LCDCS 0504

    Science.gov (United States)

    Guennou, L.; Biviano, A.; Adami, C.; Limousin, M.; Lima Neto, G. B.; Mamon, G. A.; Ulmer, M. P.; Gavazzi, R.; Cypriano, E. S.; Durret, F.; Clowe, D.; LeBrun, V.; Allam, S.; Basa, S.; Benoist, C.; Cappi, A.; Halliday, C.; Ilbert, O.; Johnston, D.; Jullo, E.; Just, D.; Kubo, J. M.; Márquez, I.; Marshall, P.; Martinet, N.; Maurogordato, S.; Mazure, A.; Murphy, K. J.; Plana, H.; Rostagni, F.; Russeil, D.; Schirmer, M.; Schrabback, T.; Slezak, E.; Tucker, D.; Zaritsky, D.; Ziegler, B.

    2014-06-01

    Context. Constraints on the mass distribution in high-redshift clusters of galaxies are currently not very strong. Aims: We aim to constrain the mass profile, M(r), and dynamical status of the z ~ 0.8 LCDCS 0504 cluster of galaxies that is characterized by prominent giant gravitational arcs near its center. Methods: Our analysis is based on deep X-ray, optical, and infrared imaging as well as optical spectroscopy, collected with various instruments, which we complemented with archival data. We modeled the mass distribution of the cluster with three different mass density profiles, whose parameters were constrained by the strong lensing features of the inner cluster region, by the X-ray emission from the intracluster medium, and by the kinematics of 71 cluster members. Results: We obtain consistent M(r) determinations from three methods based on kinematics (dispersion-kurtosis, caustics, and MAMPOSSt), out to the cluster virial radius, ≃1.3 Mpc and beyond. The mass profile inferred by the strong lensing analysis in the central cluster region is slightly higher than, but still consistent with, the kinematics estimate. On the other hand, the X-ray based M(r) is significantly lower than the kinematics and strong lensing estimates. Theoretical predictions from ΛCDM cosmology for the concentration-mass relation agree with our observational results, when taking into account the uncertainties in the observational and theoretical estimates. There appears to be a central deficit in the intracluster gas mass fraction compared with nearby clusters. Conclusions: Despite the relaxed appearance of this cluster, the determinations of its mass profile by different probes show substantial discrepancies, the origin of which remains to be determined. The extension of a dynamical analysis similar to that of other clusters of the DAFT/FADA survey with multiwavelength data of sufficient quality will allow shedding light on the possible systematics that affect the determination of mass
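    As an illustration of the kind of parametric mass profile M(r) fitted in such analyses, the widely used NFW profile gives a closed-form enclosed mass. This is the standard NFW formula, not the specific parameters fitted for LCDCS 0504:

    ```python
    import math

    def nfw_enclosed_mass(r, r_s, rho0):
        """Mass enclosed within radius r for an NFW density profile
        rho(r) = rho0 / ((r/r_s) * (1 + r/r_s)**2)."""
        x = r / r_s
        return 4.0 * math.pi * rho0 * r_s**3 * (math.log(1.0 + x) - x / (1.0 + x))
    ```

    The concentration c = r_vir / r_s of such a profile is what the concentration-mass relation mentioned above constrains.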

  12. A Benchmarking System for Domestic Water Use

    Directory of Open Access Journals (Sweden)

    Dexter V. L. Hunt

    2014-05-01

    The national demand for water in the UK is predicted to increase, exacerbated by a growing UK population and home-grown demands for energy and food. Set against the context of overstretched existing supply sources vulnerable to droughts, particularly in increasingly dense city centres, the delicate balance of matching minimal demands with resource-secure supplies becomes critical. When making changes to "internal" demands, the roles of technological efficiency and user behaviour cannot be ignored, yet existing benchmarking systems traditionally do not consider the latter. This paper investigates the practicalities of adopting a domestic benchmarking system (using a band rating) that allows individual users to assess their current water use performance against what is possible. The benchmarking system allows users to achieve higher benchmarks through any approach that reduces water consumption. The sensitivity of water use benchmarks is investigated by making changes to user behaviour and technology. The impact of adopting localised supplies (i.e., rainwater harvesting (RWH) and grey water (GW)) and including "external" gardening demands is investigated. This includes the impacts (in isolation and in combination) of the following: occupancy rates (1 to 4); roof size (12.5 m2 to 100 m2); garden size (25 m2 to 100 m2); and geographical location (North West, Midlands and South East, UK) with yearly temporal effects (i.e., rainfall and temperature). Lessons learnt from analysis of the proposed benchmarking system are presented throughout this paper, in particular its compatibility with the existing Code for Sustainable Homes (CSH) accreditation system. Conclusions are subsequently drawn on the robustness of the proposed system.
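    A band-rating scheme of this kind can be sketched as a simple threshold lookup. The band boundaries below are hypothetical placeholders for illustration, not the values proposed in the paper:

    ```python
    # Hypothetical per-capita thresholds (litres/person/day) mapped to bands.
    BANDS = [(80, "A"), (100, "B"), (120, "C"), (140, "D"), (160, "E")]

    def water_use_band(litres_per_person_per_day):
        """Return the first band whose upper threshold exceeds the usage;
        anything above the last threshold falls into the worst band."""
        for threshold, band in BANDS:
            if litres_per_person_per_day < threshold:
                return band
        return "F"
    ```

    Any measure that lowers consumption — behavioural or technological — moves a household up the bands, which is the property the paper's benchmarking system relies on.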

  13. Supply network configuration—A benchmarking problem

    Science.gov (United States)

    Brandenburg, Marcus

    2018-03-01

    Managing supply networks is a highly relevant task that strongly influences the competitiveness of firms from various industries. Designing supply networks is a strategic process that considerably affects the structure of the whole network. In contrast, supply networks for new products are configured without major adaptations of the existing structure, but the network has to be configured before the new product is actually launched in the marketplace. Due to dynamics and uncertainties, the resulting planning problem is highly complex. However, formal models and solution approaches that support supply network configuration decisions for new products are scant. The paper at hand aims at stimulating related model-based research. To formulate mathematical models and solution procedures, a benchmarking problem is introduced which is derived from a case study of a cosmetics manufacturer. Tasks, objectives, and constraints of the problem are described in great detail and numerical values and ranges of all problem parameters are given. In addition, several directions for future research are suggested.

  14. Numisheet2005 Benchmark Analysis on Forming of an Automotive Deck Lid Inner Panel: Benchmark 1

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao Jian

    2005-01-01

    Numerical simulations of sheet metal forming processes have been a very challenging topic in industry. Many computer codes and modeling techniques exist today; however, many unknowns affect prediction accuracy. Systematic benchmark tests are needed to accelerate future implementations and to serve as a reference. This report presents an international cooperative benchmark effort for an automotive deck lid inner panel. Predictions from simulations are analyzed and discussed against the corresponding experimental results, and correlations between the accuracies of the parameters of interest are discussed.

  15. Dynamical Mass Measurements of Contaminated Galaxy Clusters Using Support Distribution Machines

    Science.gov (United States)

    Ntampaka, Michelle; Trac, Hy; Sutherland, Dougal; Fromenteau, Sebastien; Poczos, Barnabas; Schneider, Jeff

    2018-01-01

    We study dynamical mass measurements of galaxy clusters contaminated by interlopers and show that a modern machine learning (ML) algorithm can predict masses by better than a factor of two compared to a standard scaling relation approach. We create two mock catalogs from Multidark’s publicly available N-body MDPL1 simulation, one with perfect galaxy cluster membership information and the other where a simple cylindrical cut around the cluster center allows interlopers to contaminate the clusters. In the standard approach, we use a power-law scaling relation to infer cluster mass from galaxy line-of-sight (LOS) velocity dispersion. Assuming perfect membership knowledge, this unrealistic case produces a wide fractional mass error distribution, with a width E=0.87. Interlopers introduce additional scatter, significantly widening the error distribution further (E=2.13). We employ the support distribution machine (SDM) class of algorithms to learn from distributions of data to predict single values. Applied to distributions of galaxy observables such as LOS velocity and projected distance from the cluster center, SDM yields better than a factor-of-two improvement (E=0.67) for the contaminated case. Remarkably, SDM applied to contaminated clusters is better able to recover masses than even the scaling relation approach applied to uncontaminated clusters. We show that the SDM method more accurately reproduces the cluster mass function, making it a valuable tool for employing cluster observations to evaluate cosmological models.
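    The standard approach referred to here fits a power law M ∝ σ^α in log-log space. A minimal sketch follows; the synthetic normalization and slope are illustrative, not the paper's fitted values:

    ```python
    import math

    def fit_power_law(sigmas, masses):
        """Least-squares fit of log10(M) = slope * log10(sigma) + intercept,
        i.e. the power-law scaling relation M = 10**intercept * sigma**slope."""
        xs = [math.log10(s) for s in sigmas]
        ys = [math.log10(m) for m in masses]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    # Synthetic clusters drawn exactly from M = 1e14 * (sigma / 1000)**3.
    sigmas = [300.0, 500.0, 800.0, 1000.0]
    masses = [1e14 * (s / 1000.0) ** 3 for s in sigmas]
    slope, intercept = fit_power_law(sigmas, masses)
    ```

    Interloper contamination biases the measured dispersion, which is why a fixed relation of this form scatters so badly in the contaminated case.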

  16. Instantons: Dynamical mass generation, chiral ward identities and the topological charge correlation function

    International Nuclear Information System (INIS)

    McDougall, N.A.

    1983-01-01

    When dynamical mass generation resulting from the breakdown of chiral symmetry is taken into account, instanton dynamics treated within the dilute gas approximation may satisfy the constraints on the quark condensates and the topological charge correlation function derived by Crewther from an analysis of the chiral Ward identities, assuming the absence of a physical axial U(1) Goldstone boson. From a consideration of the contribution of the eta' to the topological charge correlation function, a relationship is derived in which m_eta'^2 f_eta'^2 is proportional to the vacuum energy density. (orig.)

  17. Performance Evaluation of Supercomputers using HPCC and IMB Benchmarks

    Science.gov (United States)

    Saini, Subhash; Ciotti, Robert; Gunney, Brian T. N.; Spelce, Thomas E.; Koniges, Alice; Dossa, Don; Adamidis, Panagiotis; Rabenseifner, Rolf; Tiyyagura, Sunil R.; Mueller, Matthias

    2006-01-01

    The HPC Challenge (HPCC) benchmark suite and the Intel MPI Benchmark (IMB) are used to compare and evaluate the combined performance of processor, memory subsystem and interconnect fabric of five leading supercomputers - SGI Altix BX2, Cray X1, Cray Opteron Cluster, Dell Xeon cluster, and NEC SX-8. These five systems use five different networks (SGI NUMALINK4, Cray network, Myrinet, InfiniBand, and NEC IXS). The complete set of HPCC benchmarks is run on each of these systems. Additionally, we present Intel MPI Benchmark (IMB) results to study the performance of 11 MPI communication functions on these systems.

  18. Bench-marking beam-beam simulations using coherent quadrupole effects

    International Nuclear Information System (INIS)

    Krishnagopal, S.; Chin, Y.H.

    1992-06-01

    Computer simulations are used extensively in the study of the beam-beam interaction. The proliferation of such codes raises the important question of their reliability, and motivates the development of a dependable set of bench-marks. We argue that rather than detailed quantitative comparisons, the ability of different codes to predict the same qualitative physics should be used as a criterion for such bench-marks. We use the striking phenomenon of coherent quadrupole oscillations as one such bench-mark, and demonstrate that our codes do indeed observe this behaviour. We also suggest some other tests that could be used as bench-marks

  19. Bench-marking beam-beam simulations using coherent quadrupole effects

    International Nuclear Information System (INIS)

    Krishnagopal, S.; Chin, Y.H.

    1992-01-01

    Computer simulations are used extensively in the study of the beam-beam interaction. The proliferation of such codes raises the important question of their reliability, and motivates the development of a dependable set of bench-marks. We argue that rather than detailed quantitative comparisons, the ability of different codes to predict the same qualitative physics should be used as a criterion for such bench-marks. We use the striking phenomenon of coherent quadrupole oscillations as one such bench-mark, and demonstrate that our codes do indeed observe this behavior. We also suggest some other tests that could be used as bench-marks

  20. RISKIND verification and benchmark comparisons

    Energy Technology Data Exchange (ETDEWEB)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  1. RISKIND verification and benchmark comparisons

    International Nuclear Information System (INIS)

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and dose estimates were compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  2. An Arbitrary Benchmark CAPM: One Additional Frontier Portfolio is Sufficient

    OpenAIRE

    Ekern, Steinar

    2008-01-01

    First draft: July 16, 2008 This version: October 7, 2008 The benchmark CAPM linearly relates the expected returns on an arbitrary asset, an arbitrary benchmark portfolio, and an arbitrary MV frontier portfolio. The benchmark is not required to be on the frontier and may be non-perfectly correlated with the frontier portfolio. The benchmark CAPM extends and generalizes previous CAPM formulations, including the zero beta, two correlated frontier portfolios, riskless augmented frontier, an...

  3. Benchmarking the implementation of E-Commerce A Case Study Approach

    OpenAIRE

    von Ettingshausen, C. R. D. Freiherr

    2009-01-01

    The purpose of this thesis was to develop a guideline to support the implementation of E-Commerce with E-Commerce benchmarking. Because of its importance as an interface with the customer, web-site benchmarking has been a widely researched topic. However, limited research has been conducted on benchmarking E-Commerce across other areas of the value chain. Consequently this thesis aims to extend benchmarking into E-Commerce related subjects. The literature review examined ...

  4. Baryogenesis, neutrino masses, and dynamical dark energy

    International Nuclear Information System (INIS)

    Eisele, M.T.

    2007-01-01

    This thesis considers several models that connect different areas of particle physics and cosmology. Our first discussion in this context concerns a baryogenesis scenario, in which the baryon asymmetry of our universe is created through the dynamics of a dark energy field, thereby illustrating that these two topics might be related. Subsequently, several neutrino mass models are analyzed, which make use of an extra-dimensional setting to overcome certain problems of their four-dimensional counterparts. The central discussion of this thesis concerns a leptogenesis model with many standard model singlets. Amongst other things, we show that the presence of these states can lower the standard bound for the necessary reheating temperature of the universe by at least one and a half orders of magnitude. To further motivate this approach, we also discuss an explicit, extra-dimensional leptogenesis scenario that naturally yields many of the ingredients required in this context. (orig.)

  5. Baryogenesis, neutrino masses, and dynamical dark energy

    Energy Technology Data Exchange (ETDEWEB)

    Eisele, M.T.

    2007-10-09

    This thesis considers several models that connect different areas of particle physics and cosmology. Our first discussion in this context concerns a baryogenesis scenario, in which the baryon asymmetry of our universe is created through the dynamics of a dark energy field, thereby illustrating that these two topics might be related. Subsequently, several neutrino mass models are analyzed, which make use of an extra-dimensional setting to overcome certain problems of their four-dimensional counterparts. The central discussion of this thesis concerns a leptogenesis model with many standard model singlets. Amongst other things, we show that the presence of these states can lower the standard bound for the necessary reheating temperature of the universe by at least one and a half orders of magnitude. To further motivate this approach, we also discuss an explicit, extra-dimensional leptogenesis scenario that naturally yields many of the ingredients required in this context. (orig.)

  6. Thermo-mechanical analysis of FG nanobeam with attached tip mass: an exact solution

    Science.gov (United States)

    Ghadiri, Majid; Jafari, Ali

    2016-12-01

    This study proposes, for the first time, an analytical solution method for exploring the vibration characteristics of a cantilever functionally graded (FG) nanobeam with a concentrated mass under thermal loading. The thermo-mechanical properties of the FGM nanobeam are assumed to vary through the beam thickness according to a power law (P-FGM). The small-scale effect is taken into consideration via Eringen's nonlocal elasticity theory, and a linear temperature rise (LTR) through the thickness direction is studied. The concentrated mass at the free end of the nanobeam influences its mechanical and physical properties. Timoshenko beam theory is employed to derive, via Hamilton's principle, the nonlocal governing equations and boundary conditions of the FGM beam with a tip mass in a temperature field. An exact solution procedure yields the non-dimensional frequency of the FG nanobeam exposed to the temperature field with a tip mass. A parametric study assesses the effects of temperature change, tip mass, small scale, beam thickness, power-law exponent, slenderness, and thermal loading on the natural frequencies of the FG cantilever nanobeam with a point mass at the free end. It is concluded that these parameters play remarkable roles in the dynamic behavior of the FG nanobeam subjected to LTR with a tip mass. Results for simpler cases are confirmed against known data in the literature. The presented numerical results can serve as benchmarks for future thermo-mechanical analyses of FG nanobeams with tip mass.
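    For orientation, the classical local, macro-scale limit that such results are typically checked against is the Rayleigh estimate for a uniform cantilever carrying a tip mass. This is a textbook approximation, not the paper's nonlocal FG solution:

    ```python
    import math

    def cantilever_tip_mass_freq(EI, L, tip_mass, beam_mass):
        """Rayleigh estimate of the first natural frequency (rad/s) of a
        uniform cantilever with an end mass: omega = sqrt(k_eff / m_eff),
        with tip stiffness k_eff = 3*EI/L**3 and an effective beam mass of
        33/140 of the beam's total mass lumped at the tip."""
        k_eff = 3.0 * EI / L**3
        m_eff = tip_mass + (33.0 / 140.0) * beam_mass
        return math.sqrt(k_eff / m_eff)
    ```

    Even in this simple limit, a heavier tip mass lowers the natural frequency, consistent with the trend the parametric study reports.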

  7. Simulation of the OECD Main-Steam-Line-Break Benchmark Exercise 3 Using the Coupled RELAP5/PANTHER Codes

    International Nuclear Information System (INIS)

    Schneidesch, Christophe R.; Zhang Jinzhao

    2004-01-01

    The RELAP5 best-estimate thermal-hydraulic system code has been coupled with the PANTHER three-dimensional neutron kinetics code via the TALINK dynamic data exchange control and processing tool. The coupled RELAP5/PANTHER code package has been qualified and will be used at Tractebel Engineering (TE) for analyzing asymmetric pressurized water reactor (PWR) accidents with strong core-system interactions. The Organization for Economic Cooperation and Development/U.S. Nuclear Regulatory Commission PWR main-steam-line-break benchmark problem was analyzed as part of the qualification efforts to demonstrate the capability of the coupled code package of simulating such transients. This paper reports the main results of TE's contribution to the benchmark Exercise 3

  8. WWER in-core fuel management benchmark definition

    International Nuclear Information System (INIS)

    Apostolov, T.; Alekova, G.; Prodanova, R.; Petrova, T.; Ivanov, K.

    1994-01-01

    Two benchmark problems for WWER-440, including design parameters, operating conditions and measured quantities, are discussed in this paper. Some benchmark results for the effective multiplication factor K_eff, natural boron concentration C_B and relative power distribution K_q, obtained by use of the code package, are presented. (authors). 5 refs., 3 tabs

  9. Dynamic contrast enhanced MRI in the differential diagnosis of adrenal adenomas and malignant adrenal masses

    International Nuclear Information System (INIS)

    Inan, Nagihan; Arslan, Arzu; Akansel, Gur; Anik, Yonca; Balci, N. Cem; Demirci, Ali

    2008-01-01

    Objective: To evaluate the value of dynamic MR imaging in the differential diagnosis of adrenal adenomas and malignant tumors, especially in cases of atypical adenomas. Materials and methods: Sixty-four masses (48 adenomas, 16 malignant tumors) were included in this prospective study. Signal loss of the masses was evaluated using chemical shift MR imaging. Five dynamic series of T1-weighted spoiled gradient echo (FFE) images were obtained, with the acquisition starting simultaneously with i.v. contrast administration (0-100 s), followed by a T1-weighted FFE sequence in the late phase (5th minute). Contrast enhancement patterns in the early (25th second) and late (5th minute) phase images were evaluated. For the quantitative evaluation, signal intensity (SI)-time curves were obtained from the SIs at the 0th, 25th, 50th, 75th and 100th second. The wash-in rate, maximum relative enhancement, time-to-peak, and wash-out of contrast at 100 s were also calculated for the masses in both groups. Statistical significance was determined by the Mann-Whitney U test. To evaluate the diagnostic performance of the quantitative tests, receiver operating characteristic (ROC) analysis was performed. Results: Chemical shift MR imaging differentiated 44 of the 48 adenomas (91.7%) from non-adenomas. The 4 adenomas (8.3%) that could not be differentiated from non-adenomas by this technique did not exhibit signal loss on out-of-phase images. With a cut-off value of 30, the SI indices of adenomas had a sensitivity of 93.8%, a specificity of 100% and a positive predictive value of 100%. On visual evaluation of dynamic MR imaging, early phase contrast enhancement patterns were homogeneous in 75% and punctate in 20.83% of the adenomas, while patchy in 56.25% and peripheral in 25% of the malignant tumors. On the late phase images, 58.33% of the adenomas showed peripheral ring-shaped enhancement and 10.41% showed heterogeneous enhancement. All of the malignant masses showed heterogeneous enhancement.
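    The signal-intensity index compared against the cut-off of 30 is conventionally computed from in-phase and out-of-phase intensities in chemical shift imaging. A sketch follows; the formula is the commonly used SI index, and the numeric values in the comments are illustrative:

    ```python
    def si_index(si_in_phase, si_out_of_phase):
        """Percentage signal drop from in-phase to out-of-phase imaging,
        reflecting the intracellular lipid content typical of adenomas."""
        return (si_in_phase - si_out_of_phase) / si_in_phase * 100.0

    def classify_adrenal_mass(si_in_phase, si_out_of_phase, cutoff=30.0):
        """Label a mass 'adenoma' when the SI index reaches the cut-off;
        below it, the mass remains indeterminate and needs further workup."""
        if si_index(si_in_phase, si_out_of_phase) >= cutoff:
            return "adenoma"
        return "indeterminate"
    ```

    The 4 atypical adenomas described above are exactly the cases where this index fails (no out-of-phase signal loss), which motivates the dynamic enhancement criteria studied in the paper.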

  10. Dynamic characterization, monitoring and control of rotating flexible beam-mass structures via piezo-embedded techniques

    Science.gov (United States)

    Lai, Steven H.-Y.

    1992-01-01

    A variational principle and a finite element discretization technique were used to derive the dynamic equations for a high speed rotating flexible beam-mass system embedded with piezo-electric materials. The dynamic equation thus obtained allows the development of finite element models which accommodate both the original structural element and the piezoelectric element. The solutions of finite element models provide system dynamics needed to design a sensing system. The characterization of gyroscopic effect and damping capacity of smart rotating devices are addressed. Several simulation examples are presented to validate the analytical solution.

  11. Dynamical Mass Generation and Confinement in Maxwell-Chern-Simons Planar Quantum Electrodynamics

    International Nuclear Information System (INIS)

    Sanchez Madrigal, S; Raya, A; Hofmann, C P

    2011-01-01

    We study the non-perturbative phenomena of Dynamical Mass Generation and Confinement by truncating at the non-perturbative level the Schwinger-Dyson equations in Maxwell-Chern-Simons planar quantum electrodynamics. We obtain numerical solutions for the fermion propagator in Landau gauge within the so-called rainbow approximation. A comparison with the ordinary theory without the Chern-Simons term is presented.

  12. A suite of exercises for verifying dynamic earthquake rupture codes

    Science.gov (United States)

    Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis

    2018-01-01

    We describe a set of benchmark exercises that are designed to test whether computer codes that simulate dynamic earthquake rupture are working as intended. These types of computer codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations. These include implementations of simple or complex fault geometry, off-fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefront of earthquake physics and strong ground motion research. The exercises are freely available on our website for use by the scientific community.

  13. Introducing a Generic Concept for an Online IT-Benchmarking System

    OpenAIRE

    Ziaie, Pujan; Ziller, Markus; Wollersheim, Jan; Krcmar, Helmut

    2014-01-01

    While IT benchmarking has grown considerably in the last few years, conventional benchmarking tools have not been able to adequately respond to the rapid changes in technology and paradigm shifts in IT-related domains. This paper aims to review benchmarking methods and leverage design science methodology to present design elements for a novel software solution in the field of IT benchmarking. The solution, which introduces a concept for generic (service-independent) indicators, is based on and...

  14. Strategic behaviour under regulatory benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Jamasb, T. [Cambridge Univ. (United Kingdom). Dept. of Applied Economics; Nillesen, P. [NUON NV (Netherlands); Pollitt, M. [Cambridge Univ. (United Kingdom). Judge Inst. of Management

    2004-09-01

    In order to improve the efficiency of electricity distribution networks, some regulators have adopted incentive regulation schemes that rely on performance benchmarking. Although regulatory benchmarking can influence the "regulation game," the subject has received limited attention. This paper discusses how strategic behaviour by firms can result in inefficient outcomes. We then use the Data Envelopment Analysis (DEA) method with US utility data to examine the implications of illustrative cases of strategic behaviour reported by regulators. The results show that gaming can have significant effects on the measured performance and profitability of firms. (author)
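    The DEA method referred to above can be sketched as one small linear program per firm. Below is a minimal input-oriented CCR efficiency calculation in Python using `scipy.optimize.linprog`; the two-firm data set is hypothetical and the function is an illustration, not the authors' actual implementation:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.

    X: (m, n) array of inputs, Y: (s, n) array of outputs,
    one column per decision-making unit (DMU).
    """
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for o in range(n):
        # Variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # sum_j lambda_j * x_ij <= theta * x_io   (input constraints)
        A_in = np.hstack([-X[:, [o]], X])
        # sum_j lambda_j * y_rj >= y_ro           (output constraints)
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical data: DMU A turns 2 units of input into 2 units of output,
# DMU B needs 4 units for the same output, so B's efficiency is 0.5.
X = np.array([[2.0, 4.0]])
Y = np.array([[2.0, 2.0]])
eff = dea_ccr_input(X, Y)   # eff ≈ [1.0, 0.5]
```

    Strategic "gaming" of the kind the paper studies shows up here directly: misreported inputs or outputs shift the efficient frontier and hence every other firm's score.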

  15. 3-D neutron transport benchmarks

    International Nuclear Information System (INIS)

    Takeda, T.; Ikeda, H.

    1991-03-01

    A set of 3-D neutron transport benchmark problems proposed by Osaka University to the NEACRP in 1988 has been calculated by many participants, and the corresponding results are summarized in this report. The results for k_eff, control rod worth, and region-averaged fluxes for the four proposed core models, calculated using various 3-D transport codes, are compared and discussed. The calculational methods used were: Monte Carlo, Discrete Ordinates (Sn), Spherical Harmonics (Pn), Nodal Transport, and others. The solutions of the four core models are quite useful as benchmarks for checking the validity of 3-D neutron transport codes.
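    Codes of this kind ultimately solve the k-eigenvalue problem M·phi = (1/k)·F·phi for the loss matrix M and fission matrix F. As a toy illustration (not any of the benchmark codes), here is a two-group infinite-medium power iteration in Python, with made-up group constants chosen so that k_inf works out to exactly 1:

```python
import numpy as np

def keff_power_iteration(M, F, tol=1e-10, max_iter=500):
    """Power iteration for the k-eigenvalue problem M*phi = (1/k)*F*phi."""
    phi = np.ones(M.shape[0])
    k = 1.0
    for _ in range(max_iter):
        src = F @ phi                          # fission source
        phi_new = np.linalg.solve(M, src / k)  # flux update
        k_new = k * (F @ phi_new).sum() / src.sum()
        if abs(k_new - k) < tol * k:
            break
        k, phi = k_new, phi_new
    return k_new, phi_new / phi_new.sum()

# Hypothetical two-group constants: loss matrix M and fission matrix F.
M = np.array([[0.03, 0.00],     # fast removal (absorption + down-scatter)
              [-0.02, 0.08]])   # down-scatter into thermal; thermal absorption
F = np.array([[0.005, 0.10],    # nu*Sigma_f; all fission neutrons born fast
              [0.000, 0.00]])
k, phi = keff_power_iteration(M, F)
# analytically k_inf = (0.005 + 0.10 * 0.02/0.08) / 0.03 = 1.0
```

    The benchmark codes in the report solve the same eigenvalue problem, but with the spatial and angular transport operator in place of this zero-dimensional two-group matrix.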

  16. The EDGE-CALIFA survey: validating stellar dynamical mass models with CO kinematics

    Science.gov (United States)

    Leung, Gigi Y. C.; Leaman, Ryan; van de Ven, Glenn; Lyubenova, Mariya; Zhu, Ling; Bolatto, Alberto D.; Falcón-Barroso, Jesus; Blitz, Leo; Dannerbauer, Helmut; Fisher, David B.; Levy, Rebecca C.; Sanchez, Sebastian F.; Utomo, Dyas; Vogel, Stuart; Wong, Tony; Ziegler, Bodo

    2018-06-01

    Deriving circular velocities of galaxies from stellar kinematics can provide an estimate of their total dynamical mass, provided a contribution from the velocity dispersion of the stars is taken into account. Molecular gas (e.g. CO), on the other hand, is a dynamically cold tracer and hence acts as an independent circular velocity estimate without needing such a correction. In this paper, we test the underlying assumptions of three commonly used dynamical models, deriving circular velocities from the stellar kinematics of 54 galaxies (S0-Sd) that have observations of both stellar kinematics from the Calar Alto Legacy Integral Field Area (CALIFA) survey and CO kinematics from the Extragalactic Database for Galaxy Evolution (EDGE) survey. We test the asymmetric drift correction (ADC) method, as well as Jeans and Schwarzschild models. The three methods each reproduce the CO circular velocity at 1 Re to within 10 per cent. All three methods show larger scatter (up to 20 per cent) in the inner regions (R < 0.4 Re), which may be due to an increasingly spherical mass distribution (not captured by the thin-disc assumption in ADC) or non-constant stellar M/L ratios (for both the JAM and Schwarzschild models). This homogeneous analysis of stellar and gaseous kinematics validates that all three models can recover Mdyn at 1 Re to better than 20 per cent, but users should be mindful of scatter in the inner regions, where some assumptions may break down.
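    One common form of the ADC mentioned above, under the thin-disc assumption noted in the abstract and neglecting the velocity-ellipsoid cross term, relates the circular velocity to the observed mean rotation and dispersions (the symbols follow standard Jeans-equation treatments and are not necessarily this paper's notation):

```latex
v_c^{2} \;=\; \bar{v}_{\phi}^{2} \;+\; \sigma_R^{2}\left[
\frac{\sigma_{\phi}^{2}}{\sigma_R^{2}} \;-\; 1
\;-\; \frac{\partial \ln \nu}{\partial \ln R}
\;-\; \frac{\partial \ln \sigma_R^{2}}{\partial \ln R}
\right],
```

    where $\bar{v}_{\phi}$ is the mean azimuthal streaming velocity, $\sigma_R$ and $\sigma_{\phi}$ are the radial and azimuthal velocity dispersions, and $\nu$ is the tracer number density. The bracketed term is the "drift" correction that a cold tracer such as CO does not need.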

  17. Introduction to 'International Handbook of Criticality Safety Benchmark Experiments'

    International Nuclear Information System (INIS)

    Komuro, Yuichi

    1998-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) is now an official activity of the Organization for Economic Cooperation and Development-Nuclear Energy Agency (OECD-NEA). The 'International Handbook of Criticality Safety Benchmark Experiments' was prepared by the project's working group and is updated every year. The handbook contains criticality safety benchmark specifications derived from experiments performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate the calculation techniques they use. The author briefly introduces this informative handbook and encourages Japanese engineers in charge of nuclear criticality safety to use it. (author)

  18. Benchmark calculation of subchannel analysis codes

    International Nuclear Information System (INIS)

    1996-02-01

    In order to evaluate the analysis capabilities of various subchannel codes used in the thermal-hydraulic design of light water reactors, benchmark calculations were performed. The selected benchmark problems and the major findings were as follows: (1) For single-phase flow mixing experiments between two channels, the calculated water temperature distributions along the flow direction agreed with the experimental results once the turbulent mixing coefficients were tuned properly; however, the effect of gap width observed in the experiments could not be predicted by the subchannel codes. (2) For two-phase flow mixing experiments between two channels, at high water flow rates the calculated distributions of air and water flows in each channel agreed well with the experimental results, while at low water flow rates the air mixing rates were underestimated. (3) For two-phase flow mixing experiments among multiple channels, the calculated mass velocities at the channel exit under steady-state conditions agreed with the experimental values to within about 10%, but the predictive errors in exit quality were as high as 30%. (4) For critical heat flux (CHF) experiments, two different results were obtained: one code found that CHF values calculated with the KfK or EPRI correlations agreed well with the experiments, while another code found that CHF was well predicted using the WSC-2 correlation or the Weisman-Pei mechanistic model. (5) For droplet entrainment and deposition experiments, the predictive capability improved significantly when the correlations were refined; however, a remarkable discrepancy between codes was observed: one code underestimated the droplet flow rate and overestimated the liquid film flow rate at high qualities, while another overestimated the droplet flow rate and underestimated the liquid film flow rate at low qualities. (J.P.N.)

  19. A Benchmark and Simulator for UAV Tracking

    KAUST Repository

    Mueller, Matthias; Smith, Neil; Ghanem, Bernard

    2016-01-01

    In this paper, we propose a new aerial video dataset and benchmark for low-altitude UAV target tracking, as well as a photorealistic UAV simulator that can be coupled with tracking methods. Our benchmark provides the first evaluation of many state-of-the-art and popular trackers on 123 new and fully annotated HD video sequences captured from a low-altitude aerial perspective. Among the compared trackers, we determine which ones are the most suitable for UAV tracking in terms of both tracking accuracy and run-time. The simulator can be used to evaluate tracking algorithms in real-time scenarios before they are deployed on a UAV "in the field", as well as to generate synthetic but photorealistic tracking datasets with automatic ground-truth annotations that easily extend existing real-world datasets. Both the benchmark and the simulator are made publicly available to the vision community on our website (https://ivul.kaust.edu.sa/Pages/pub-benchmark-simulator-uav.aspx) to further research in the area of object tracking from UAVs. © Springer International Publishing AG 2016.
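    Tracking benchmarks of this kind typically score accuracy by bounding-box overlap per frame. A minimal sketch in Python of intersection-over-union and a success rate at a given overlap threshold (a generic illustration; the benchmark's exact evaluation protocol may differ):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = box_a
    bx, by, bw, bh = box_b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))  # overlap width
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))  # overlap height
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def success_rate(preds, gts, threshold=0.5):
    """Fraction of frames whose predicted/ground-truth overlap meets the threshold."""
    overlaps = [iou(p, g) for p, g in zip(preds, gts)]
    return sum(o >= threshold for o in overlaps) / len(overlaps)
```

    Sweeping the threshold from 0 to 1 and plotting the success rate yields the standard success curve used to rank trackers against each other.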
