WorldWideScience

Sample records for benchmarking derivative-free optimization

  1. Derivative-free and blackbox optimization

    CERN Document Server

    Audet, Charles

    2017-01-01

This book is designed as a textbook, suitable for self-learning or for teaching an upper-year university course on derivative-free and blackbox optimization. The book is split into 5 parts and is designed to be modular; any individual part depends only on the material in Part I. Part I of the book discusses what is meant by Derivative-Free and Blackbox Optimization, provides background material and early basics, while Part II focuses on heuristic methods (Genetic Algorithms and Nelder-Mead). Part III presents direct search methods (Generalized Pattern Search and Mesh Adaptive Direct Search) and Part IV focuses on model-based methods (Simplex Gradient and Trust Region). Part V discusses dealing with constraints, using surrogates, and bi-objective optimization. End-of-chapter exercises are included throughout, as well as 15 end-of-chapter projects and over 40 figures. Benchmarking techniques are also presented in the appendix.

  2. RF cavity design exploiting a new derivative-free trust region optimization approach

    Directory of Open Access Journals (Sweden)

    Abdel-Karim S.O. Hassan

    2015-11-01

In this article, a novel derivative-free (DF) surrogate-based trust region optimization approach is proposed. In the proposed approach, quadratic surrogate models are constructed and successively updated. The generated surrogate model is then optimized instead of the underlying objective function over trust regions. Truncated conjugate gradients are employed to find the optimal point within each trust region. The approach constructs the initial quadratic surrogate model using few data points of order O(n), where n is the number of design variables. The proposed approach adopts weighted least squares fitting for updating the surrogate model instead of interpolation, which is commonly used in DF optimization. This makes the approach more suitable for stochastic optimization and for functions subject to numerical error. The weights are assigned to give more emphasis to points close to the current center point. The accuracy and efficiency of the proposed approach are demonstrated by applying it to a set of classical benchmark test problems. It is also employed to find the optimal design of an RF cavity linear accelerator, with a comparative analysis against a recent optimization technique.
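
    A minimal sketch of one step of the kind of method this abstract describes: fit a quadratic surrogate by weighted least squares (weights favouring points near the current centre), then minimize the surrogate within the trust region. The paper applies truncated conjugate gradients on a ball; this sketch substitutes L-BFGS-B on a box trust region for brevity, and all names and the toy objective are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def quad_features(x):
    # monomial basis of a quadratic model: [1, x_i ..., x_i * x_j (i <= j)]
    n = len(x)
    cross = [x[i] * x[j] for i in range(n) for j in range(i, n)]
    return np.concatenate(([1.0], x, cross))

def fit_quadratic_wls(X, f, center, radius):
    # Gaussian weights give more emphasis to points close to the centre,
    # echoing the weighted least squares fitting described in the abstract
    w = np.exp(-np.linalg.norm(X - center, axis=1) ** 2 / (2 * radius ** 2))
    A = np.vstack([quad_features(x) for x in X])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], f * sw, rcond=None)
    return coef

def trust_region_step(center, radius, X, f):
    coef = fit_quadratic_wls(X, f, center, radius)
    model = lambda x: coef @ quad_features(x)
    bounds = [(c - radius, c + radius) for c in center]   # box trust region
    return minimize(model, center, method="L-BFGS-B", bounds=bounds).x

rng = np.random.default_rng(0)
obj = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2   # toy objective
center, radius = np.zeros(2), 1.0
X = center + radius * rng.uniform(-1.0, 1.0, size=(12, 2))     # sampled points
f = np.array([obj(x) for x in X])
print(trust_region_step(center, radius, X, f))   # step toward (1, -0.5)
```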

  3. Derivative-free optimization under uncertainty applied to costly simulators

    International Nuclear Information System (INIS)

    Pauwels, Benoit

    2016-01-01

The modeling of complex phenomena encountered in industrial issues can lead to the study of numerical simulation codes. These simulators may require extensive execution time (from hours to days), involve uncertain parameters and even be intrinsically stochastic. Importantly, within the context of simulation-based optimization, the derivatives of the outputs with respect to the inputs may be nonexistent, inaccessible or too costly to approximate reasonably. This thesis is organized in four chapters. The first chapter discusses the state of the art in derivative-free optimization and uncertainty modeling. The next three chapters introduce three independent - although connected - contributions to the field of derivative-free optimization in the presence of uncertainty. The second chapter addresses the emulation of costly stochastic simulation codes - stochastic in the sense that simulations run with the same input parameters may lead to distinct outputs. This was the subject of the CODESTOCH project carried out at the Summer mathematical research center on scientific computing and its applications (CEMRACS) during the summer of 2013, together with two Ph.D. students from Electricity of France (EDF) and the Atomic Energy and Alternative Energies Commission (CEA). We designed four methods to build emulators for functions whose values are probability density functions. These methods were tested on two toy functions and applied to industrial simulation codes concerned with three complex phenomena: the spatial distribution of molecules in a hydrocarbon system (IFPEN), the life cycle of large electric transformers (EDF) and the repercussions of a hypothetical accident in a nuclear plant (CEA). Emulation was a preliminary process towards optimization in the first two cases. In the third chapter we consider the influence of inaccurate objective function evaluations on direct search - a classical derivative-free optimization method. In real settings inaccuracy may never vanish ...
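
    The third chapter's object of study, direct search under inexact evaluations, can be illustrated with a textbook compass search; this is a generic sketch with a toy noisy oracle, not the thesis's algorithm. With noise, descent comparisons become unreliable once the step size shrinks to the noise scale, which is exactly the regime the chapter examines.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = len(x)
    for _ in range(max_iter):
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):   # poll 2n compass directions
            trial = x + step * d
            ft = f(trial)
            if ft < fx:                # accept the first improving poll point
                x, fx, improved = trial, ft, True
                break
        if not improved:
            step *= 0.5                # unsuccessful poll: refine the step size
            if step < tol:
                break
    return x, fx

rng = np.random.default_rng(1)
noisy = lambda x: (x[0] - 2.0) ** 2 + x[1] ** 2 + 1e-3 * rng.normal()  # inexact oracle
print(compass_search(noisy, [0.0, 0.0]))
```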

  4. Benchmarking Parameter-free AMaLGaM on Functions With and Without Noise

    NARCIS (Netherlands)

    P.A.N. Bosman (Peter); J. Grahl; D. Thierens (Dirk)

    2013-01-01

We describe a parameter-free estimation-of-distribution algorithm (EDA) called the adapted maximum-likelihood Gaussian model iterated density-estimation evolutionary algorithm (AMaLGaM-IDEA, or AMaLGaM for short) for numerical optimization. AMaLGaM is benchmarked within the 2009 black...
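
    A stripped-down Gaussian estimation-of-distribution loop in the spirit of the abstract (select an elite fraction, refit a maximum-likelihood Gaussian, resample); the mechanisms that make AMaLGaM parameter-free, such as adaptive variance scaling and the anticipated mean shift, are deliberately omitted, so this is only an illustrative skeleton.

```python
import numpy as np

def gaussian_eda(f, dim, pop=50, elite_frac=0.3, gens=200, seed=1):
    rng = np.random.default_rng(seed)
    mean, cov = np.zeros(dim), 4.0 * np.eye(dim)
    for _ in range(gens):
        X = rng.multivariate_normal(mean, cov, size=pop)   # sample population
        fit = np.apply_along_axis(f, 1, X)
        elite = X[np.argsort(fit)[: int(elite_frac * pop)]]
        mean = elite.mean(axis=0)                  # maximum-likelihood estimates
        cov = np.cov(elite, rowvar=False) + 1e-9 * np.eye(dim)  # jitter for stability
    return mean, f(mean)

sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
print(gaussian_eda(sphere, dim=5))
```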

  5. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    Science.gov (United States)

    Hoffmann, Aswin L.; Siem, Alex Y. D.; den Hertog, Dick; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2006-12-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.
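
    The convexity property the paper exploits can be sketched on a toy bi-objective problem: every weighted-sum minimum is Pareto optimal, and chords between computed points form a piecewise-linear upper bound on the convex frontier (supporting lines at those points would give the matching lower bound). This is only an illustration of the bounding idea, not the paper's Sandwich algorithm or the IMRT setting.

```python
import numpy as np
from scipy.optimize import minimize_scalar

f1 = lambda x: x ** 2             # two convex objectives of a single variable
f2 = lambda x: (x - 2.0) ** 2

def pareto_point(w):
    # the minimizer of w*f1 + (1-w)*f2 lies on the Pareto efficient frontier
    res = minimize_scalar(lambda x: w * f1(x) + (1.0 - w) * f2(x))
    return f1(res.x), f2(res.x)

points = sorted(pareto_point(w) for w in np.linspace(0.05, 0.95, 7))
for (a1, a2), (b1, b2) in zip(points, points[1:]):
    # each chord over-estimates the convex frontier between its end points
    print(f"upper-bound segment: ({a1:.2f}, {a2:.2f}) -> ({b1:.2f}, {b2:.2f})")
```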

  6. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    International Nuclear Information System (INIS)

    Hoffmann, Aswin L; Siem, Alex Y D; Hertog, Dick den; Kaanders, Johannes H A M; Huizenga, Henk

    2006-01-01

In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.

  7. Designing molecular complexes using free-energy derivatives from liquid-state integral equation theory

    International Nuclear Information System (INIS)

    Mrugalla, Florian; Kast, Stefan M

    2016-01-01

    Complex formation between molecules in solution is the key process by which molecular interactions are translated into functional systems. These processes are governed by the binding or free energy of association which depends on both direct molecular interactions and the solvation contribution. A design goal frequently addressed in pharmaceutical sciences is the optimization of chemical properties of the complex partners in the sense of minimizing their binding free energy with respect to a change in chemical structure. Here, we demonstrate that liquid-state theory in the form of the solute–solute equation of the reference interaction site model provides all necessary information for such a task with high efficiency. In particular, computing derivatives of the potential of mean force (PMF), which defines the free-energy surface of complex formation, with respect to potential parameters can be viewed as a means to define a direction in chemical space toward better binders. We illustrate the methodology in the benchmark case of alkali ion binding to the crown ether 18-crown-6 in aqueous solution. In order to examine the validity of the underlying solute–solute theory, we first compare PMFs computed by different approaches, including explicit free-energy molecular dynamics simulations as a reference. Predictions of an optimally binding ion radius based on free-energy derivatives are then shown to yield consistent results for different ion parameter sets and to compare well with earlier, orders-of-magnitude more costly explicit simulation results. This proof-of-principle study, therefore, demonstrates the potential of liquid-state theory for molecular design problems. (paper)

  8. Suggested benchmarks for shape optimization for minimum stress concentration

    DEFF Research Database (Denmark)

    Pedersen, Pauli

    2008-01-01

    Shape optimization for minimum stress concentration is vital, important, and difficult. New formulations and numerical procedures imply the need for good benchmarks. The available analytical shape solutions rely on assumptions that are seldom satisfied, so here, we suggest alternative benchmarks...

  9. Benchmarking of radiological departments. Starting point for successful process optimization

    International Nuclear Information System (INIS)

    Busch, Hans-Peter

    2010-01-01

Continuous optimization of the process of organization and medical treatment is part of the successful management of radiological departments. The focus of this optimization can be cost units such as CT and MRI or the radiological parts of total patient treatment. Key performance indicators for process optimization are cost-effectiveness, service quality and quality of medical treatment. The potential for improvements can be seen by comparison (benchmark) with other hospitals and radiological departments. Clear definitions of key data and criteria are absolutely necessary for comparability. There is currently little information in the literature regarding the methodology and application of benchmarks, especially from the perspective of radiological departments and case-based lump sums, even though benchmarking has frequently been applied to radiological departments by hospital management. The aim of this article is to describe and discuss systematic benchmarking as an effective starting point for successful process optimization. This includes the description of the methodology, recommendation of key parameters and discussion of the potential for cost-effectiveness analysis. The main focus of this article is cost-effectiveness (efficiency and effectiveness) with respect to cost units and treatment processes. (orig.)

  10. A Benchmark Estimate for the Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2001-01-01

    There are alternative methods to estimate a capital stock for a benchmark year. These methods, however, do not allow for an independent check, which could establish whether the estimated benchmark level is too high or too low. I propose here an optimal consistency method (OCM), which may allow estimating a capital stock level for a benchmark year and/or checking the consistency of alternative estimates of a benchmark capital stock.

  11. Implementation and verification of global optimization benchmark problems

    Science.gov (United States)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point and the interval estimates of a function and its gradient on a given box using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
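
    The record's C++ library produces, from one expression description, both point values and interval enclosures. A minimal Python analogue of that "single description, two evaluation modes" idea for the function value alone (the real library also handles gradients):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

def expr(x, y):
    # one description works for plain floats *and* intervals
    return (x + y) * (x + y) + x * y

print(expr(1.0, 2.0))                            # point evaluation: 11.0
print(expr(Interval(0, 1), Interval(1, 2)))      # enclosure over a box
```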

  12. A method for stochastic constrained optimization using derivative-free surrogate pattern search and collocation

    International Nuclear Information System (INIS)

    Sankaran, Sethuraman; Audet, Charles; Marsden, Alison L.

    2010-01-01

Recent advances in coupling novel optimization methods to large-scale computing problems have opened the door to tackling a diverse set of physically realistic engineering design problems. A large computational overhead is associated with computing the cost function for most practical problems involving complex physical phenomena. Such problems are also plagued with uncertainties in a diverse set of parameters. We present a novel stochastic derivative-free optimization approach for tackling such problems. Our method extends the previously developed surrogate management framework (SMF) to allow for uncertainties in both simulation parameters and design variables. The stochastic collocation scheme is employed for stochastic variables whereas Kriging-based surrogate functions are employed for the cost function. This approach is tested on four numerical optimization problems and is shown to have significant improvement in efficiency over traditional Monte Carlo schemes. Problems with multiple probabilistic constraints are also discussed.
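
    A sketch of the collocation idea the abstract pairs with surrogate-based search: the expected cost over a Gaussian uncertain parameter is computed with a Gauss-Hermite quadrature rule rather than Monte Carlo sampling. Nelder-Mead stands in for the surrogate management framework, and the toy cost function is invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# probabilists' Gauss-Hermite rule: nodes/weights for xi ~ N(0, 1)
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / weights.sum()

def cost(x, xi):
    # toy design cost with an uncertain shift in the optimum
    return (x[0] - 1.0 + 0.5 * xi) ** 2 + (x[1] + 0.5 * xi) ** 2

def expected_cost(x):
    # 7-node stochastic collocation: exact for low-degree polynomial costs,
    # versus thousands of samples for a Monte Carlo estimate of similar quality
    return sum(w * cost(x, xi) for w, xi in zip(weights, nodes))

res = minimize(expected_cost, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x, res.fun)    # approx. (1, 0) plus the variance-induced residual cost
```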

  13. The physics of an optimal basketball free throw

    OpenAIRE

    Barzykina, Irina

    2017-01-01

A physical model is developed, which suggests a pathway to determining the optimal release conditions for a basketball free throw. The theoretical framework is supported by Monte Carlo simulations and a series of free throws performed and analysed at Southbank International School. The model defines a smile-shaped success region in angle-velocity space where a free throw will score. A formula for the minimum throwing angle is derived analytically. The optimal throwing conditions are determined nu...

  14. Implementation and verification of global optimization benchmark problems

    Directory of Open Access Journals (Sweden)

    Posypkin Mikhail

    2017-12-01

The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point and the interval estimates of a function and its gradient on a given box using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that literary sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.

  15. Photoinjector optimization using a derivative-free, model-based trust-region algorithm for the Argonne Wakefield Accelerator

    Science.gov (United States)

    Neveu, N.; Larson, J.; Power, J. G.; Spentzouris, L.

    2017-07-01

Model-based, derivative-free, trust-region algorithms are increasingly popular for optimizing computationally expensive numerical simulations. A strength of such methods is their efficient use of function evaluations. In this paper, we use one such algorithm to optimize the beam dynamics in two cases of interest at the Argonne Wakefield Accelerator (AWA) facility. First, we minimize the emittance of a 1 nC electron bunch produced by the AWA rf photocathode gun by adjusting three parameters: rf gun phase, solenoid strength, and laser radius. The algorithm converges to a set of parameters that yield an emittance of 1.08 μm. Second, we expand the number of optimization parameters to model the complete AWA rf photoinjector (the gun and six accelerating cavities) at 40 nC. The optimization algorithm is used in a Pareto study that compares the trade-off between emittance and bunch length for the AWA 70 MeV photoinjector.

  16. Analytic Approximations to the Free Boundary and Multi-dimensional Problems in Financial Derivatives Pricing

    Science.gov (United States)

    Lau, Chun Sing

This thesis studies two types of problems in financial derivatives pricing. The first type is the free boundary problem, which can be formulated as a partial differential equation (PDE) subject to a set of free boundary conditions. Although the functional form of the free boundary condition is given explicitly, the location of the free boundary is unknown and can only be determined implicitly by imposing continuity conditions on the solution. Two specific problems are studied in detail, namely the valuation of fixed-rate mortgages and CEV American options. The second type is the multi-dimensional problem, which involves multiple correlated stochastic variables and their governing PDE. One typical problem we focus on is the valuation of basket-spread options, whose underlying asset prices are driven by correlated geometric Brownian motions (GBMs). Analytic approximate solutions are derived for each of these three problems. For each of the two free boundary problems, we propose a parametric moving boundary to approximate the unknown free boundary, so that the original problem transforms into a moving boundary problem which can be solved analytically. The governing parameter of the moving boundary is determined by imposing the first-derivative continuity condition on the solution. The analytic form of the solution allows the price and the hedging parameters to be computed very efficiently. When compared against the benchmark finite-difference method, the computational time is significantly reduced without compromising the accuracy. The multi-stage scheme further allows the approximate results to systematically converge to the benchmark results as one recasts the moving boundary into a piecewise smooth continuous function. For the multi-dimensional problem, we generalize the Kirk (1995) approximate two-asset spread option formula to the case of multi-asset basket-spread options. Since the final formula is in closed form, all the hedging parameters can also be derived in ...

  17. Neutron Reference Benchmark Field Specification: ACRR Free-Field Environment (ACRR-FF-CC-32-CL).

    Energy Technology Data Exchange (ETDEWEB)

    Vega, Richard Manuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parma, Edward J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Griffin, Patrick J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vehar, David W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

This report was put together to support the International Atomic Energy Agency (IAEA) REAL-2016 activity to validate the dosimetry community’s ability to use a consistent set of activation data and to derive consistent spectral characterizations. The report captures details of integral measurements taken in the Annular Core Research Reactor (ACRR) central cavity free-field reference neutron benchmark field. The field is described and an “a priori” calculated neutron spectrum is reported, based on MCNP6 calculations, and a subject matter expert (SME) based covariance matrix is given for this “a priori” spectrum. The results of 31 integral dosimetry measurements in the neutron field are reported.

  18. A comparative evaluation of risk-adjustment models for benchmarking amputation-free survival after lower extremity bypass.

    Science.gov (United States)

    Simons, Jessica P; Goodney, Philip P; Flahive, Julie; Hoel, Andrew W; Hallett, John W; Kraiss, Larry W; Schanzer, Andres

    2016-04-01

    Providing patients and payers with publicly reported risk-adjusted quality metrics for the purpose of benchmarking physicians and institutions has become a national priority. Several prediction models have been developed to estimate outcomes after lower extremity revascularization for critical limb ischemia, but the optimal model to use in contemporary practice has not been defined. We sought to identify the highest-performing risk-adjustment model for amputation-free survival (AFS) at 1 year after lower extremity bypass (LEB). We used the national Society for Vascular Surgery Vascular Quality Initiative (VQI) database (2003-2012) to assess the performance of three previously validated risk-adjustment models for AFS. The Bypass versus Angioplasty in Severe Ischaemia of the Leg (BASIL), Finland National Vascular (FINNVASC) registry, and the modified Project of Ex-vivo vein graft Engineering via Transfection III (PREVENT III [mPIII]) risk scores were applied to the VQI cohort. A novel model for 1-year AFS was also derived using the VQI data set and externally validated using the PIII data set. The relative discrimination (Harrell c-index) and calibration (Hosmer-May goodness-of-fit test) of each model were compared. Among 7754 patients in the VQI who underwent LEB for critical limb ischemia, the AFS was 74% at 1 year. Each of the previously published models for AFS demonstrated similar discriminative performance: c-indices for BASIL, FINNVASC, mPIII were 0.66, 0.60, and 0.64, respectively. The novel VQI-derived model had improved discriminative ability with a c-index of 0.71 and appropriate generalizability on external validation with a c-index of 0.68. The model was well calibrated in both the VQI and PIII data sets (goodness of fit P = not significant). Currently available prediction models for AFS after LEB perform modestly when applied to national contemporary VQI data. Moreover, the performance of each model was inferior to that of the novel VQI-derived model

  19. An Economical Approach to Estimate a Benchmark Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2003-01-01

    There are alternative methods of estimating capital stock for a benchmark year. However, these methods are costly and time-consuming, requiring the gathering of much basic information as well as the use of some convenient assumptions and guesses. In addition, a way is needed of checking whether the estimated benchmark is at the correct level. This paper proposes an optimal consistency method (OCM), which enables a capital stock to be estimated for a benchmark year, and which can also be used ...

  20. Optimal orientation in flows : Providing a benchmark for animal movement strategies

    NARCIS (Netherlands)

    McLaren, James D.; Shamoun-Baranes, Judy; Dokter, Adriaan M.; Klaassen, Raymond H. G.; Bouten, Willem

    2014-01-01

    Animal movements in air and water can be strongly affected by experienced flow. While various flow-orientation strategies have been proposed and observed, their performance in variable flow conditions remains unclear. We apply control theory to establish a benchmark for time-minimizing (optimal)

  1. Network synchronization: optimal and pessimal scale-free topologies

    International Nuclear Information System (INIS)

    Donetti, Luca; Hurtado, Pablo I; Munoz, Miguel A

    2008-01-01

By employing a recently introduced optimization algorithm we construct optimally synchronizable (unweighted) networks for any given scale-free degree distribution. We explore how the optimization process affects degree-degree correlations and observe a generic tendency toward disassortativity. Still, we show that there is not a one-to-one correspondence between synchronizability and disassortativity. On the other hand, we study the nature of optimally un-synchronizable networks, that is, networks whose topology minimizes the range of stability of the synchronous state. The resulting 'pessimal networks' turn out to have a highly assortative string-like structure. We also derive a rigorous lower bound for the Laplacian eigenvalue ratio controlling synchronizability, which helps in understanding the impact of degree correlations on network synchronizability.
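
    The synchronizability measure referred to here is the ratio of extreme nonzero Laplacian eigenvalues, lambda_N / lambda_2: the smaller the ratio, the wider the stability range of the synchronous state. A minimal check on a toy graph (not one of the paper's optimized networks):

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small undirected test graph
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A          # combinatorial graph Laplacian

eig = np.sort(np.linalg.eigvalsh(L))    # 0 = eig[0] <= eig[1] <= ... <= eig[-1]
print("eigenratio lambda_N / lambda_2 =", eig[-1] / eig[1])
```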

  2. Developing Benchmarking Criteria for CO2 Emissions

    Energy Technology Data Exchange (ETDEWEB)

    Neelis, M.; Worrell, E.; Mueller, N.; Angelini, T. [Ecofys, Utrecht (Netherlands); Cremer, C.; Schleich, J.; Eichhammer, W. [The Fraunhofer Institute for Systems and Innovation research, Karlsruhe (Germany)

    2009-02-15

A European Union (EU) wide greenhouse gas (GHG) allowance trading scheme (EU ETS) was implemented in the EU in 2005. In the first two trading periods of the scheme (running up to 2012), free allocation based on historical emissions was the main methodology for allocation of allowances to existing installations. For the third trading period (2013 - 2020), the European Commission proposed in January 2008 a more important role for auctioning of allowances rather than free allocation. (Transitional) free allocation of allowances to industrial sectors will be determined via harmonized allocation rules, where feasible based on benchmarking. In general terms, a benchmark based method allocates allowances based on a certain amount of emissions per unit of productive output (i.e. the benchmark). This study aims to derive criteria for an allocation methodology for the EU Emission Trading Scheme based on benchmarking for the period 2013 - 2020. To test the feasibility of the criteria, we apply them to four example product groups: iron and steel, pulp and paper, lime and glass. The basis for this study is the Commission proposal for a revised ETS directive put forward on 23 January 2008 and does not take into account any changes to this proposal in the co-decision procedure that resulted in the adoption of the Energy and Climate change package in December 2008.

  3. Tendances Carbone no. 79 'Free allocations under Phase 3 benchmarks: early evidence of what has changed'

    International Nuclear Information System (INIS)

    Sartor, Oliver

    2013-01-01

    Among the publications of CDC Climat Research, 'Tendances Carbone' bulletin specifically studies the developments of the European market for CO 2 allowances. This issue addresses the following points: One of the most controversial changes to the EU ETS in Phase 3 (2013-2020) has been the introduction of emissions-performance benchmarks for determining free allocations to non-electricity producers. Phases 1 and 2 used National Allocation Plans (NAPs). For practical reasons NAPs were drawn up by each Member State, but this led to problems, including over-generous allowance allocation, insufficiently harmonised allocations across countries and distorted incentives to reduce emissions. Benchmarking tries to fix things by allocating the equivalent of 100% of allowances needed if every installation used the best available technology. But this is not universally popular and industries say that they might lose international competitiveness. So a new study by CDC Climat and the Climate Economics Chair examined the data from the preliminary Phase 3 free allocations of 20 EU Member States and asked: how much are free allocations actually going to change with benchmarking?

  4. Network synchronization: optimal and pessimal scale-free topologies

    Energy Technology Data Exchange (ETDEWEB)

    Donetti, Luca [Departamento de Electronica y Tecnologia de Computadores and Instituto de Fisica Teorica y Computacional Carlos I, Facultad de Ciencias, Universidad de Granada, 18071 Granada (Spain); Hurtado, Pablo I; Munoz, Miguel A [Departamento de Electromagnetismo y Fisica de la Materia and Instituto Carlos I de Fisica Teorica y Computacional Facultad de Ciencias, Universidad de Granada, 18071 Granada (Spain)], E-mail: mamunoz@onsager.ugr.es

    2008-06-06

By employing a recently introduced optimization algorithm we construct optimally synchronizable (unweighted) networks for any given scale-free degree distribution. We explore how the optimization process affects degree-degree correlations and observe a generic tendency toward disassortativity. Still, we show that there is not a one-to-one correspondence between synchronizability and disassortativity. On the other hand, we study the nature of optimally un-synchronizable networks, that is, networks whose topology minimizes the range of stability of the synchronous state. The resulting 'pessimal networks' turn out to have a highly assortative string-like structure. We also derive a rigorous lower bound for the Laplacian eigenvalue ratio controlling synchronizability, which helps in understanding the impact of degree correlations on network synchronizability.

  5. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    Energy Technology Data Exchange (ETDEWEB)

    Arnis Judzis

    2003-01-01

This document details the progress to date on the "OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE -- A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING" contract for the quarter starting October 2002 through December 2002. Even though we are awaiting the optimization portion of the testing program, accomplishments included the following: (1) Smith International participated in the DOE Mud Hammer program through full-scale benchmarking testing during the week of 4 November 2003. (2) TerraTek acknowledges Smith International, BP America, PDVSA, and ConocoPhillips for cost-sharing the Smith benchmarking tests, allowing extension of the contract to add to the benchmarking testing program. (3) Following the benchmark testing of the Smith International hammer, representatives from DOE/NETL, TerraTek, Smith International and PDVSA met at TerraTek in Salt Lake City to review observations, performance and views on the optimization step for 2003. (4) The December 2002 issue of the Journal of Petroleum Technology (Society of Petroleum Engineers) highlighted the DOE fluid hammer testing program and reviewed last year's paper on the benchmark performance of the SDS Digger and Novatek hammers. (5) TerraTek's Sid Green presented a technical review for DOE/NETL personnel in Morgantown on "Impact Rock Breakage" and its importance for improving fluid hammer performance. Much discussion has taken place on the issues surrounding mud hammer performance at depth conditions.

  6. A derivative-free approach for the estimation of porosity and permeability using time-lapse seismic and production data

    International Nuclear Information System (INIS)

    Dadashpour, Mohsen; Kleppe, Jon; Landrø, Martin; Echeverria Ciaurri, David; Mukerji, Tapan

    2010-01-01

    In this study, we apply a derivative-free optimization algorithm to estimate porosity and permeability from time-lapse seismic data and production data from a real reservoir (Norne field). In some circumstances, obtaining gradient information (exact and/or approximate) can be problematic e.g. derivatives are not available from a commercial simulator, or results are needed within a very short time frame. Derivative-free optimization approaches can be very time consuming because they often require many simulations. Typically, one iteration roughly needs as many simulations as the number of optimization variables. In this work, we propose two ways to significantly increase the efficiency of an optimization methodology in model inversion problems. First, by principal component analysis we decrease the number of optimization variables while keeping geostatistical consistency, and second, noticing that some optimization methods are very amenable to being parallelized, we apply them within a distributed computing framework. If we combine all this, the model inversion approach can be robust, fairly efficient and very simple to implement. In this paper, we apply the methodology to two cases: a semi-synthetic model with noisy data, and a case based entirely on field data. The results show that the derivative-free approach presented is robust against noise in the data
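
    The paper's first efficiency device, reparameterizing the property fields with principal components of a geostatistical ensemble so that only a few coefficients are optimized, can be sketched as follows. The synthetic 1-D "fields" and all names are invented for illustration; the actual study works with porosity and permeability grids of the Norne field.

```python
import numpy as np

rng = np.random.default_rng(0)
cells, realizations, k = 100, 60, 5

# synthetic ensemble of smooth random fields (stand-in for geostatistical realizations)
grid = np.linspace(0.0, 1.0, cells)
ensemble = np.array([
    sum(rng.normal() * np.sin((m + 1) * np.pi * grid) / (m + 1) for m in range(8))
    for _ in range(realizations)
])

mean = ensemble.mean(axis=0)
U, s, Vt = np.linalg.svd(ensemble - mean, full_matrices=False)
basis = Vt[:k]                    # k leading principal components of the ensemble

def field_from_coeffs(z):
    # any coefficient vector maps to a field consistent with the ensemble statistics
    return mean + z @ basis

truth = field_from_coeffs(np.array([1.0, -0.5, 0.3, 0.0, 0.2]))
mismatch = lambda z: float(np.sum((field_from_coeffs(z) - truth) ** 2))
print("optimization variables:", k, "instead of", cells)
print("mismatch at zero:", round(mismatch(np.zeros(k)), 3),
      "| at truth:", mismatch(np.array([1.0, -0.5, 0.3, 0.0, 0.2])))
```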

  7. Determining Optimal Crude Oil Price Benchmark in Nigeria: An Empirical Approach

    Directory of Open Access Journals (Sweden)

    Saibu Olufemi Muibi

    2015-12-01

This paper contributes to the on-going empirical search for an appropriate crude oil price benchmark that ensures greater financial stability and efficient fiscal management in Nigeria. It adopted seasonally adjusted ARIMA forecasting models using monthly data series from 2000m01 to 2012m12 to predict future movement in Nigerian crude oil prices. The paper derived a more robust and dynamic framework that accommodates fluctuation in crude oil prices and also in government spending. The result shows that if the incessant withdrawal from the ECA fund and the increasing debt profile of government in recent times are factored into the benchmark, the real crude oil numerical fiscal rule is US$82.3 for 2013, which is higher than the official benchmark of US$75 used for the 2013 and 2014 budget proposals. The paper argues that the current long-run price rule based on a 5-10 year moving average approach adopted by government is rigid and inflexible as a rule for managing Nigerian oil funds. The unrealistic assumption of the extant benchmark accounted for excessive depletion and lack of accountability of the excess crude oil account. The paper concludes that unless the federal government can curtail its spending profligacy and adopt more stringent fiscal discipline rules, the current benchmark is unrealistic and unsuitable for fiscal management of oil revenue in the context of the Nigerian economic spending profile.

  8. Optimization based tuning approach for offset free MPC

    DEFF Research Database (Denmark)

    Olesen, Daniel Haugård; Huusom, Jakob Kjøbsted; Jørgensen, John Bagterp

    2012-01-01

We present an optimization-based tuning procedure with certain robustness properties for an offset-free Model Predictive Controller (MPC). The MPC is designed for multivariate processes that can be represented by an ARX model. The advantage of ARX model representations is that standard system identification techniques using convex optimization can be used for identification of such models from input-output data. The stochastic model of the ARX model identified from input-output data is modified with an ARMA model designed as part of the MPC-design procedure to ensure offset-free control. The ARMAX model description resulting from the extension can be realized as a state space model in innovation form. The MPC is designed and implemented based on this state space model in innovation form. Expressions for the closed-loop dynamics of the unconstrained system are used to derive the sensitivity ...

  9. Multiscale benchmarking of drug delivery vectors.

    Science.gov (United States)

    Summers, Huw D; Ware, Matthew J; Majithia, Ravish; Meissner, Kenith E; Godin, Biana; Rees, Paul

    2016-10-01

Cross-system comparisons of drug delivery vectors are essential to ensure optimal design. An in-vitro experimental protocol is presented that separates the role of the delivery vector from that of its cargo in determining the cell response, thus allowing quantitative comparison of different systems. The technique is validated through benchmarking of the dose-response of human fibroblast cells exposed to the cationic molecule polyethylene imine (PEI), delivered as a free molecule and as a cargo on the surface of CdSe nanoparticles and silica microparticles. The exposure metrics are converted to a delivered dose with the transport properties of the different scale systems characterized by a delivery time, τ. The benchmarking highlights an agglomeration of the free PEI molecules into micron-sized clusters and identifies the metric determining cell death as the total number of PEI molecules presented to cells, determined by the delivery vector dose and the surface density of the cargo. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Truss topology optimization with discrete design variables — Guaranteed global optimality and benchmark examples

    DEFF Research Database (Denmark)

    Achtziger, Wolfgang; Stolpe, Mathias

    2007-01-01

... this problem is well-studied for continuous bar areas, we consider in this study the case of discrete areas. This problem is of major practical relevance if the truss must be built from pre-produced bars with given areas. As a special case, we consider the design problem for a single available bar area, i.e., a 0/1 problem. In contrast to the heuristic methods considered in many other approaches, our goal is to compute guaranteed globally optimal structures. This is done by a branch-and-bound method for which convergence can be proven. In this branch-and-bound framework, lower bounds of the optimal ... -integer problems. The main intention of this paper is to provide optimal solutions for single and multiple load benchmark examples, which can be used for testing and validating other methods or heuristics for the treatment of this discrete topology design problem.

  11. Analyzing the BBOB results by means of benchmarking concepts.

    Science.gov (United States)

    Mersmann, O; Preuss, M; Trautmann, H; Bischl, B; Weihs, C

    2015-01-01

    We present methods to answer two basic questions that arise when benchmarking optimization algorithms. The first one is: which algorithm is the "best" one? and the second one is: which algorithm should I use for my real-world problem? Both are connected and neither is easy to answer. We present a theoretical framework for designing and analyzing the raw data of such benchmark experiments. This represents a first step in answering the aforementioned questions. The 2009 and 2010 BBOB benchmark results are analyzed by means of this framework and we derive insight regarding the answers to the two questions. Furthermore, we discuss how to properly aggregate rankings from algorithm evaluations on individual problems into a consensus, its theoretical background and which common pitfalls should be avoided. Finally, we address the grouping of test problems into sets with similar optimizer rankings and investigate whether these are reflected by already proposed test problem characteristics, finding that this is not always the case.

  12. A Comparative Study of Differential Evolution, Particle Swarm Optimization, and Evolutionary Algorithms on Numerical Benchmark Problems

    DEFF Research Database (Denmark)

    Vesterstrøm, Jacob Svaneborg; Thomsen, Rene

    2004-01-01

Several extensions to evolutionary algorithms (EAs) and particle swarm optimization (PSO) have been suggested during the last decades offering improved performance on selected benchmark problems. Recently, another search heuristic termed differential evolution (DE) has shown superior performance in several real-world applications. In this paper, we evaluate the performance of DE, PSO, and EAs regarding their general applicability as numerical optimization techniques. The comparison is performed on a suite of 34 widely used benchmark problems. The results from our study show that DE generally outperforms the other algorithms. However, on two noisy functions, both DE and PSO were outperformed by the EA.
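
    For reference, a minimal DE/rand/1/bin loop of the kind benchmarked in the paper; the control parameters below are common defaults, not necessarily the paper's settings.

```python
import numpy as np

def de(f, bounds, pop=30, F=0.5, CR=0.9, gens=300, seed=2):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(gens):
        for i in range(pop):
            idx = [j for j in range(pop) if j != i]
            a, b, c = X[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)      # differential mutation
            cross = rng.random(len(lo)) < CR
            cross[rng.integers(len(lo))] = True            # keep >= 1 mutant gene
            trial = np.where(cross, mutant, X[i])          # binomial crossover
            ft = f(trial)
            if ft <= fit[i]:                               # greedy 1-to-1 selection
                X[i], fit[i] = trial, ft
    best = int(np.argmin(fit))
    return X[best], fit[best]

rastrigin = lambda x: 10.0 * len(x) + float(np.sum(x**2 - 10.0 * np.cos(2 * np.pi * x)))
print(de(rastrigin, [(-5.12, 5.12)] * 5))
```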

  13. PID controller tuning using metaheuristic optimization algorithms for benchmark problems

    Science.gov (United States)

    Gholap, Vishal; Naik Dessai, Chaitali; Bagyaveereswaran, V.

    2017-11-01

This paper contributes to finding the optimal PID controller parameters using particle swarm optimization (PSO), the Genetic Algorithm (GA) and the Simulated Annealing (SA) algorithm. The algorithms were developed through simulation of a chemical process and an electrical system, and the PID controller is tuned. Here, two different fitness functions, Integral Time Absolute Error and time-domain specifications, were chosen and applied to PSO, GA and SA while tuning the controller. The proposed algorithms are implemented on two benchmark problems: a coupled tank system and a DC motor. Finally, a comparative study has been done with the different algorithms based on best cost, number of iterations and different objective functions. The closed-loop process response for each set of tuned parameters is plotted for each system with each fitness function.
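
    The core of such studies is the fitness evaluation: simulate the closed loop and score the step response. Below is a sketch with a toy first-order plant integrated by forward Euler and an ITAE score; scipy's differential_evolution stands in here for the PSO/GA/SA metaheuristics named in the abstract, and all plant parameters are invented.

```python
import numpy as np
from scipy.optimize import differential_evolution

def itae(gains, T=10.0, dt=0.01, tau=1.0):
    Kp, Ki, Kd = gains
    y, integ, e_prev, score = 0.0, 0.0, 1.0, 0.0
    for k in range(int(T / dt)):
        e = 1.0 - y                        # unit step reference
        integ += e * dt
        deriv = (e - e_prev) / dt
        u = Kp * e + Ki * integ + Kd * deriv
        y += dt * (u - y) / tau            # first-order plant dy/dt = (u - y)/tau
        e_prev = e
        score += (k * dt) * abs(e) * dt    # Integral of Time-weighted Absolute Error
    return score

res = differential_evolution(itae, bounds=[(0, 20), (0, 10), (0, 2)],
                             maxiter=30, seed=3)
print("tuned (Kp, Ki, Kd):", res.x, "  ITAE:", res.fun)
```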

  14. Optimal model-free prediction from multivariate time series

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V.; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  15. Estimation of an optimal chemotherapy utilisation rate for cancer: setting an evidence-based benchmark for quality cancer care.

    Science.gov (United States)

    Jacob, S A; Ng, W L; Do, V

    2015-02-01

There is wide variation in the proportion of newly diagnosed cancer patients who receive chemotherapy, indicating the need for a benchmark rate of chemotherapy utilisation. This study describes an evidence-based model that estimates the proportion of new cancer patients in whom chemotherapy is indicated at least once (defined as the optimal chemotherapy utilisation rate). The optimal chemotherapy utilisation rate can act as a benchmark for measuring and improving the quality of care. Models of optimal chemotherapy utilisation were constructed for each cancer site based on indications for chemotherapy identified from evidence-based treatment guidelines. Data on the proportion of patient- and tumour-related attributes for which chemotherapy was indicated were obtained, using population-based data where possible. Treatment indications and epidemiological data were merged to calculate the optimal chemotherapy utilisation rate. Monte Carlo simulations and sensitivity analyses were used to assess the effect of controversial chemotherapy indications and variations in epidemiological data on our model. Chemotherapy is indicated at least once in 49.1% (95% confidence interval 48.8-49.6%) of all new cancer patients in Australia. The optimal chemotherapy utilisation rates for individual tumour sites ranged from a low of 13% in thyroid cancers to a high of 94% in myeloma. The optimal chemotherapy utilisation rate can serve as a benchmark for planning chemotherapy services on a population basis. The model can be used to evaluate service delivery by comparing the benchmark rate with patterns of care data. The overall estimate for other countries can be obtained by substituting the relevant distribution of cancer types. It can also be used to predict future chemotherapy workload and can be easily modified to take into account future changes in cancer incidence, presentation stage or chemotherapy indications. Copyright © 2014 The Royal College of Radiologists.

  16. Benchmarking optimization solvers for structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

... solvers in IPOPT and FMINCON, and the sequential quadratic programming method in SNOPT, are benchmarked on the library using performance profiles. Whenever possible the methods are applied to both the nested and the Simultaneous Analysis and Design (SAND) formulations of the problem. The performance ...

  17. Optimal reflection-free complex absorbing potentials for quantum propagation of wave packets

    International Nuclear Information System (INIS)

    Shemer, Oded; Brisker, Daria; Moiseyev, Nimrod

    2005-01-01

    The conditions for optimal reflection-free complex-absorbing potentials (CAPs) are discussed. It is shown that the CAPs as derived from the smooth-exterior-scaling transformation of the Hamiltonian [J. Phys. B 31, 1431 (1998)] serve as optimal reflection-free CAPs (RF CAPs) in wave-packet propagation calculations of open systems. The initial wave packet, Φ(t=0), can be located in the interaction region (as in half collision experiments) where the CAPs have vanished or in the asymptote where V CAP ≠0. As we show, the optimal CAPs can be introduced also in the region where the physical potential has not vanished. The unavoided reflections due to the use of a finite number of grid points (or basis functions) are discussed. A simple way to reduce the 'edge-grid' reflection effect is described

  18. Implementing of the multi-objective particle swarm optimizer and fuzzy decision-maker in exergetic, exergoeconomic and environmental optimization of a benchmark cogeneration system

    International Nuclear Information System (INIS)

    Sayyaadi, Hoseyn; Babaie, Meisam; Farmani, Mohammad Reza

    2011-01-01

Multi-objective optimization for the design of a benchmark cogeneration system, the CGAM cogeneration system, is performed. In the optimization approach, exergetic, exergoeconomic and environmental objectives are considered simultaneously. In this regard, the set of Pareto optimal solutions known as the Pareto frontier is obtained using the MOPSO (multi-objective particle swarm optimizer). The exergetic efficiency as an exergetic objective is maximized while the unit cost of the system product and the cost of the environmental impact, respectively as exergoeconomic and environmental objectives, are minimized. The economic model utilized in the exergoeconomic analysis is built on both the simple model (used in the original research on the CGAM system) and comprehensive modeling known as the TTR (total revenue requirement) method (used in sophisticated exergoeconomic analysis). Finally, a final optimal solution from the optimal set of the Pareto frontier is selected using a fuzzy decision-making process based on the Bellman-Zadeh approach, and the results are compared with corresponding results obtained in a traditional decision-making process. Further, results are compared with the corresponding performance of the base case CGAM system and optimal designs of previous works, and discussed. -- Highlights: → A multi-objective optimization approach has been implemented in the optimization of a benchmark cogeneration system. → Objective functions based on environmental impact evaluation and thermodynamic and economic analysis are obtained and optimized. → A particle swarm optimizer is implemented and its robustness is compared with NSGA-II. → A final optimal configuration is found using various decision-making approaches. → Results are compared with previous works in the field.

  19. Welding Robot Collision-Free Path Optimization

    Directory of Open Access Journals (Sweden)

    Xuewu Wang

    2017-02-01

A reasonable welding path has a significant impact on welding efficiency, and a collision-free path should be considered first in the process of welding robot path planning. The shortest path length is considered as the optimization objective, and obstacle avoidance is considered as the constraint condition in this paper. First, a grid method is used as the modeling method after the optimization objective is analyzed. For local collision-free path planning, an ant colony algorithm is selected as the search strategy. Then, to overcome the shortcomings of the ant colony algorithm, a secondary optimization is presented to improve the optimization performance. Finally, the particle swarm optimization algorithm is used to realize global path planning. Simulation results show that the desired welding path can be obtained based on the optimization strategy.

  20. Parameter-free method for the shape optimization of stiffeners on thin-walled structures to minimize stress concentration

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yang; Shibutan, Yoji [Osaka University, Osaka (Japan); Shimoda, Masatoshi [Toyota Technological Institute, Nagoya (Japan)

    2015-04-15

This paper presents a parameter-free shape optimization method for the strength design of stiffeners on thin-walled structures. The maximum von Mises stress is minimized subject to a volume constraint. The optimum design problem is formulated as a distributed-parameter shape optimization problem under the assumptions that a stiffener is varied in the in-plane direction and that the thickness is constant. The issue of nondifferentiability, which is inherent in this min-max problem, is avoided by transforming the local measure to a smooth differentiable integral functional by using the Kreisselmeier-Steinhauser function. The shape gradient functions are derived by using the material derivative method and adjoint variable method and are applied to the H¹ gradient method for shells to determine the optimal free-boundary shapes. By using this method, the smooth optimal stiffener shape can be obtained without any shape design parameterization while minimizing the maximum stress. The validity of this method is verified through two practical design examples.
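
    The Kreisselmeier-Steinhauser transformation mentioned here replaces the nondifferentiable maximum by the smooth aggregate KS_rho(g) = (1/rho) ln sum_i exp(rho g_i), which upper-bounds max_i g_i and tightens as rho grows. A small numerical check (using a log-sum-exp shift for stability; the stress values are invented):

```python
import numpy as np

def ks(g, rho=50.0):
    # Kreisselmeier-Steinhauser aggregate: smooth upper bound on max(g)
    g = np.asarray(g, dtype=float)
    m = g.max()                                  # shift for numerical stability
    return m + np.log(np.sum(np.exp(rho * (g - m)))) / rho

stresses = np.array([0.92, 0.97, 1.00, 0.85])    # normalised von Mises values
print("max:", stresses.max())
print("KS(rho=50): ", ks(stresses))              # slightly above the true max
print("KS(rho=500):", ks(stresses, rho=500.0))   # tighter as rho increases
```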

  1. Comparing, optimizing, and benchmarking quantum-control algorithms in a unifying programming framework

    International Nuclear Information System (INIS)

    Machnes, S.; Sander, U.; Glaser, S. J.; Schulte-Herbrueggen, T.; Fouquieres, P. de; Gruslys, A.; Schirmer, S.

    2011-01-01

For paving the way to novel applications in quantum simulation, computation, and technology, increasingly large quantum systems have to be steered with high precision. It is a typical task amenable to numerical optimal control to turn the time course of pulses, i.e., piecewise constant control amplitudes, iteratively into an optimized shape. Here, we present a comparative study of optimal-control algorithms for a wide range of finite-dimensional applications. We focus on the most commonly used algorithms: GRAPE methods which update all controls concurrently, and Krotov-type methods which do so sequentially. Guidelines for their use are given and open research questions are pointed out. Moreover, we introduce a unifying algorithmic framework, DYNAMO (dynamic optimization platform), designed to provide the quantum-technology community with a convenient MATLAB-based tool set for optimal control. In addition, it gives researchers in optimal-control techniques a framework for benchmarking and comparing newly proposed algorithms with the state of the art. It allows a mix-and-match approach with various types of gradients, update and step-size methods as well as subspace choices. Open-source code including examples is made available at http://qlib.info.

  2. Multi-objective approach in thermoenvironomic optimization of a benchmark cogeneration system

    International Nuclear Information System (INIS)

    Sayyaadi, Hoseyn

    2009-01-01

Multi-objective optimization for the design of a benchmark cogeneration system known as the CGAM cogeneration system has been performed. In the optimization approach, the exergetic, economic and environmental aspects have been considered simultaneously. The thermodynamic modeling has been implemented comprehensively, while the economic analysis was conducted in accordance with the total revenue requirement (TRR) method. The results for the single-objective thermoeconomic optimization have been compared with previous studies in optimization of the CGAM problem. In the multi-objective optimization of the CGAM problem, three objective functions including the exergetic efficiency, the total levelized cost rate of the system product and the cost rate of environmental impact have been considered. The environmental impact objective function has been defined and expressed in cost terms. This objective has been integrated with the thermoeconomic objective to form a new unique objective function known as a thermoenvironomic objective function. The thermoenvironomic objective has been minimized while the exergetic objective has been maximized. One of the most suitable optimization techniques, developed using a particular class of search algorithms known as multi-objective evolutionary algorithms (MOEAs), has been considered here. This approach, which is developed based on the genetic algorithm, has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of decision-making has been presented and a final optimal solution has been introduced. The sensitivity of the solutions to the interest rate and the fuel cost has been studied.

  3. Optimally Stopped Optimization

    Science.gov (United States)

    Vinci, Walter; Lidar, Daniel

We combine the fields of heuristic optimization and optimal stopping. We propose a strategy for benchmarking randomized optimization algorithms that minimizes the expected total cost for obtaining a good solution with an optimal number of calls to the solver. To do so, rather than letting the objective function alone define a cost to be minimized, we introduce a further cost-per-call of the algorithm. We show that this problem can be formulated using optimal stopping theory. The expected cost is a flexible figure of merit for benchmarking probabilistic solvers that can be computed when the optimal solution is not known, and that avoids the biases and arbitrariness that affect other measures. The optimal stopping formulation of benchmarking directly leads to a real-time, optimal-utilization strategy for probabilistic optimizers with practical impact. We apply our formulation to benchmark the performance of a D-Wave 2X quantum annealer and the HFS solver, a specialized classical heuristic algorithm designed for low tree-width graphs. On a set of frustrated-loop instances with planted solutions defined on up to N = 1098 variables, the D-Wave device is between one and two orders of magnitude faster than the HFS solver.
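
    The flavour of this figure of merit can be conveyed with a toy calculation: charge a cost per solver call, estimate the per-run success probability empirically, and price a run-until-success strategy by the geometric law. This is a simplification of the paper's optimal-stopping formulation, with all numbers invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def solver_run():
    # stand-in probabilistic solver: "succeeds" on any given run w.p. ~0.2
    return rng.random() < 0.2

runs = np.array([solver_run() for _ in range(10_000)])
p = runs.mean()          # empirical per-run success probability
c = 1.0                  # cost charged per solver call

print("estimated success probability:", p)
print("expected cost of run-until-success:", c / p)       # geometric law: c/p
# number of runs needed to succeed at least once with 99% confidence
print("calls for 99% confidence:", int(np.ceil(np.log(0.01) / np.log(1.0 - p))))
```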

  4. Microscopically derived free energy of dislocations

    NARCIS (Netherlands)

    Kooiman, M.; Hütter, M.; Geers, M.G.D.

    2015-01-01

    The dynamics of large amounts of dislocations is the governing mechanism in metal plasticity. The free energy of a continuous dislocation density profile plays a crucial role in the description of the dynamics of dislocations, as free energy derivatives act as the driving forces of dislocation...

  5. A trust region approach with multivariate Padé model for optimal circuit design

    Science.gov (United States)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using O(n) data points, where n is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models, which are also built from O(n) points, and has features of quadratic models, which need O(n^2) interpolation data points. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. The minimax solution leads to a suitable initial point from which to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.
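
    The generic loop behind such surrogate-based trust-region methods is easy to sketch. The toy implementation below is illustrative only: it fits a diagonal quadratic by least squares instead of the multivariate Padé model of the article, but it follows the same pattern of sampling 2n+1 points, building a cheap model, stepping within the trust region, and accepting or rejecting the step by comparing actual against predicted reduction.

```python
import numpy as np

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def fit_diag_quadratic(X, y):
    # Least-squares fit of m(s) = c + g.s + 0.5 * s.diag(h).s around the center.
    n = X.shape[1]
    A = np.hstack([np.ones((len(X), 1)), X, 0.5 * X**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0], coef[1:1 + n], coef[1 + n:]

def tr_step(g, h, delta):
    # Minimize the separable model over the box ||s||_inf <= delta.
    s = np.where(h > 1e-12, -g / np.maximum(h, 1e-12), -np.sign(g) * delta)
    return np.clip(s, -delta, delta)

x, delta = np.array([-1.2, 1.0]), 0.5
for _ in range(100):
    # Sample 2n+1 points around x to build the surrogate.
    P = x + delta * np.vstack([np.zeros(2), np.eye(2), -np.eye(2)])
    y = np.array([rosenbrock(p) for p in P])
    c, g, h = fit_diag_quadratic(P - x, y)
    s = tr_step(g, h, delta)
    # Accept/reject the step by comparing actual vs predicted reduction.
    pred = -(g @ s + 0.5 * h @ s**2)
    act = rosenbrock(x) - rosenbrock(x + s)
    if pred > 0 and act / pred > 0.1:
        x, delta = x + s, min(delta * 1.5, 1.0)   # good step: expand the region
    else:
        delta *= 0.5                              # poor step: shrink the region
print(x, rosenbrock(x))
```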

  6. Free allocations in EU ETS Phase 3: The impact of emissions performance benchmarking for carbon-intensive industry - Working Paper No. 2013-14

    International Nuclear Information System (INIS)

    Lecourt, S.; Palliere, C.; Sartor, O.

    2013-02-01

    From Phase 3 (2013-20) of the European Union Emissions Trading Scheme, carbon-intensive industrial emitters will receive free allocations based on harmonised, EU-wide benchmarks. This paper analyses the impacts of these new rules on allocations to key energy-intensive sectors across Europe. It explores an original dataset that combines recent data from the National Implementing Measures of 20 EU Member States with the Community Independent Transaction Log and other EU documents. The analysis reveals that free allocations to benchmarked sectors will be reduced significantly compared to Phase 2 (2008-12). This reduction should both increase public revenues from carbon auctions and enhance the economic efficiency of the carbon market. The analysis also shows that changes in allocation vary mostly across installations within countries, raising the possibility that the carbon-cost competitiveness impacts may be more intense within countries rather than across them. Lastly, the analysis finds evidence that the new benchmarking rules will, as intended, reward installations with better emissions performance and will improve harmonisation of free allocations in the EU ETS by reducing differences in allocation levels across countries with similar carbon intensities of production. (authors)

  7. Study on the mechanism and efficiency of simulated annealing using an LP optimization benchmark problem - 113

    International Nuclear Information System (INIS)

    Qianqian, Li; Xiaofeng, Jiang; Shaohong, Zhang

    2010-01-01

    The Simulated Annealing Algorithm (SAA) for solving combinatorial optimization problems is a popular method for loading pattern optimization. The main purpose of this paper is to understand the underlying search mechanism of SAA and to study its efficiency. In this study, a general SAA that employs random pair exchanges of fuel assemblies to search for the optimum fuel Loading Pattern (LP) is applied to an exhaustively searched LP optimization benchmark problem. All possible LPs of the benchmark problem have been enumerated and evaluated via the very fast and accurate Hybrid Harmonics and Linear Perturbation (HHLP) method, such that the mechanism of SAA for LP optimization can be explicitly analyzed and its search efficiency evaluated. The generic core geometry itself dictates that only a small number of LPs can be generated by performing random single pair exchanges and that these LPs are necessarily mostly similar to the initial LP. This phase space effect turns out to be the basic mechanism in SAA that explains its efficiency and good local search ability. A measure of search efficiency is introduced which shows that the stochastic nature of SAA greatly influences the variability of its search efficiency. It is also found that using the fuel assembly k-infinity distribution as a technique to filter the LPs can significantly enhance the SAA search efficiency. (authors)
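
    The random pair-exchange neighborhood at the heart of the SAA described above is compact to state in code. The sketch below is a generic simulated annealing loop with a toy objective standing in for the HHLP evaluation of a loading pattern; the cooling schedule and parameters are assumptions, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def anneal(score, pattern, t0=1.0, cooling=0.995, steps=20000):
    """Simulated annealing with a random pair-exchange neighborhood.

    `pattern` is a permutation (e.g., fuel assemblies assigned to core
    positions); `score` is the objective to minimize.
    """
    best = cur = pattern.copy()
    f_cur = f_best = score(cur)
    t = t0
    for _ in range(steps):
        i, j = rng.choice(len(cur), size=2, replace=False)
        cand = cur.copy()
        cand[i], cand[j] = cand[j], cand[i]        # exchange one random pair
        f_cand = score(cand)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if f_cand < f_cur or rng.random() < np.exp((f_cur - f_cand) / t):
            cur, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = cur.copy(), f_cur
        t *= cooling                               # geometric cooling schedule
    return best, f_best

# Toy stand-in objective: sort a shuffled sequence by pair exchanges.
target = np.arange(20)
score = lambda p: np.abs(p - target).sum()
best, f = anneal(score, rng.permutation(20))
print(best, f)
```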

  8. Effects of benchmarking on the quality of type 2 diabetes care: results of the OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study in Greece

    Science.gov (United States)

    Tsimihodimos, Vasilis; Kostapanos, Michael S.; Moulis, Alexandros; Nikas, Nikos; Elisaf, Moses S.

    2015-01-01

    Objectives: To investigate the effect of benchmarking on the quality of type 2 diabetes (T2DM) care in Greece. Methods: The OPTIMISE (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment) study [ClinicalTrials.gov identifier: NCT00681850] was an international multicenter, prospective cohort study. It included physicians randomized 3:1 to either receive benchmarking for glycated hemoglobin (HbA1c), systolic blood pressure (SBP) and low-density lipoprotein cholesterol (LDL-C) treatment targets (benchmarking group) or not (control group). The proportions of patients achieving the targets of the above-mentioned parameters were compared between groups after 12 months of treatment. Also, the proportions of patients achieving those targets at 12 months were compared with baseline in the benchmarking group. Results: In the Greek region, the OPTIMISE study included 797 adults with T2DM (570 in the benchmarking group). At month 12 the proportion of patients within the predefined targets for SBP and LDL-C was greater in the benchmarking than in the control group (50.6 versus 35.8%, and 45.3 versus 36.1%, respectively). However, these differences were not statistically significant. No difference between groups was noted in the percentage of patients achieving the predefined target for HbA1c. At month 12 the increase in the percentage of patients achieving all three targets was greater in the benchmarking (5.9-15.0%) than in the control group (2.7-8.1%). In the benchmarking group more patients were on target regarding SBP (50.6% versus 29.8%), LDL-C (45.3% versus 31.3%) and HbA1c (63.8% versus 51.2%) at 12 months compared with baseline. Conclusions: Benchmarking may comprise a promising tool for improving the quality of T2DM care. Nevertheless, target achievement rates of each, and of all three, quality indicators were suboptimal, indicating there are still unmet needs in the management of T2DM. PMID:26445642

  9. Benchmarking in University Toolbox

    Directory of Open Access Journals (Sweden)

    Katarzyna Kuźmicz

    2015-06-01

    In the face of global competition and the rising challenges that higher education institutions (HEIs) meet, it is imperative to increase the innovativeness and efficiency of their management. Benchmarking can be the appropriate tool to search for a point of reference necessary to assess an institution's competitive position and learn from the best in order to improve. The primary purpose of the paper is to present an in-depth analysis of benchmarking applications in HEIs worldwide. The study involves indicating the premises of using benchmarking in HEIs. It also contains a detailed examination of the types, approaches and scope of benchmarking initiatives. The thorough insight into benchmarking applications enabled developing a classification of benchmarking undertakings in HEIs. The paper includes a review of the most recent benchmarking projects and relates them to the classification according to the elaborated criteria (geographical range, scope, type of data, subject, support and continuity). The presented examples were chosen in order to exemplify different approaches to benchmarking in a higher education setting. The study was performed on the basis of published reports from benchmarking projects, scientific literature and the experience of the author from active participation in benchmarking projects. The paper concludes with recommendations for university managers undertaking benchmarking, derived on the basis of the conducted analysis.

  10. Free material optimization for laminated plates and shells

    DEFF Research Database (Denmark)

    Weldeyesus, Alemseged Gebrehiwot; Stolpe, Mathias

    2016-01-01

    Free Material Optimization (FMO) is a powerful approach for conceptual optimal design of composite structures. The design variable in FMO is the entire elastic material tensor, which is allowed to vary almost freely over the design domain. The imposed requirements on the tensor are that it is symmetric and positive semidefinite.

  11. Benchmark Two-Good Utility Functions

    NARCIS (Netherlands)

    de Jaegher, K.

    Benchmark two-good utility functions involving a good with zero income elasticity and unit income elasticity are well known. This paper derives utility functions for the additional benchmark cases where one good has zero cross-price elasticity, unit own-price elasticity, and zero own-price elasticity.

  12. Development and optimization of the synthesis of new thiazolidin-4-one derivatives of ibuprofen.

    Science.gov (United States)

    Vasincu, Ioana; Apotrosoaei, Maria; Panzariu, Andreea; Buron, F; Routier, S; Profire, Lenuta

    2014-01-01

    Ibuprofen, an important nonsteroidal anti-inflammatory agent, is one of the most prescribed drugs for the treatment of pain and inflammation from various rheumatic diseases, but some side effects can occur on long-term use. The aim was to optimize the method of synthesis of new derivatives of ibuprofen bearing a thiazolidin-4-one moiety, with an improved pharmacological and toxicological profile. To optimize the derivatization method for the free carboxyl group of ibuprofen (2-(4-isobutylphenyl)propionic acid), the reaction conditions were varied (reagent ratio, catalyst, reaction medium). The most favorable method proved to be the reaction between ibuprofen hydrazone and mercaptoacetic acid, in excess, at 80-85 degrees C, for 6 h, with a 96% conversion rate. The synthesis of the 2-phenyl-3-[2-(4-(isobutyl)phenyl)-2-methyl]acetamido-thiazolidin-4-one derivative was optimized in view of applying it as a general procedure for the synthesis of other derivatives with related structures. The chemical structure and molecular weight of the synthesized compound were confirmed by spectral methods (IR, 1H NMR, 13C NMR, HR-MS).

  13. Complementary numerical–experimental benchmarking for shape optimization and validation of structures subjected to wave and current forces

    DEFF Research Database (Denmark)

    Markus, D.; Ferri, Francesco; Wüchner, R.

    2015-01-01

    A new benchmark problem is proposed and evaluated targeting fluid related shape optimization problems, motivated by design related ocean engineering tasks. The analyzed test geometry is a bottom mounted, polygonal structure in a channel flow. The aim of the study is to analyze the effect of shape...

  14. Computational Chemistry Comparison and Benchmark Database

    Science.gov (United States)

    SRD 101 NIST Computational Chemistry Comparison and Benchmark Database (Web, free access)   The NIST Computational Chemistry Comparison and Benchmark Database is a collection of experimental and ab initio thermochemical properties for a selected set of molecules. The goals are to provide a benchmark set of molecules for the evaluation of ab initio computational methods and allow the comparison between different ab initio computational methods for the prediction of thermochemical properties.

  15. Benchmarking Benchmarks

    NARCIS (Netherlands)

    D.C. Blitz (David)

    2011-01-01

    Benchmarking benchmarks is a bundle of six studies that are inspired by the prevalence of benchmarking in academic finance research as well as in investment practice. Three studies examine if current benchmark asset pricing models adequately describe the cross-section of stock returns.

  16. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    Science.gov (United States)

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for other species. Objectives of this work included: a) Creating an automated, standardized pipeline for SV prediction. b) Identifying the best tool(s) for SV prediction through benchmarking. c) Providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically-sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and the benchmarking of tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge...

  17. EGS4 benchmark program

    International Nuclear Information System (INIS)

    Yasu, Y.; Hirayama, H.; Namito, Y.; Yashiro, S.

    1995-01-01

    This paper proposes the EGS4 Benchmark Suite, which consists of three programs called UCSAMPL4, UCSAMPL4I and XYZDOS. This paper also evaluates optimization methods of recent RISC/UNIX systems, such as IBM, HP, DEC, Hitachi and Fujitsu, for the benchmark suite. When particular compiler options and math libraries were included in the evaluation process, systems performed significantly better. The observed performance of some of the RISC/UNIX systems was beyond that of some so-called mainframes of IBM, Hitachi or Fujitsu. The computer performance of the EGS4 Code System on an HP9000/735 (99MHz) was defined to be one EGS4 Unit. The EGS4 Benchmark Suite was also run on various PCs such as Pentiums, i486 and DEC Alpha and so forth. The performance of recent fast PCs reaches that of recent RISC/UNIX systems. The benchmark programs have also been evaluated for correlation with industry benchmark programs, namely, SPECmark. (author)

  18. ELB-trees an efficient and lock-free B-tree derivative

    DEFF Research Database (Denmark)

    Bonnichsen, Lars Frydendal; Karlsson, Sven; Probst, Christian W.

    2013-01-01

    As computer systems scale in the number of processors, scalable data structures with good parallel performance become increasingly important. Lock-free data structures promise such improved parallel performance at the expense of higher algorithmic complexity and higher sequential execution time overhead. All lock-free data structures are based on simple atomic operations that, though supported by modern processors, are expensive in execution time. We present a lock-free data structure, ELB-trees, which under certain assumptions can be used as multimaps as well as priority queues. Specifically, it cannot store duplicate key-value pairs, and it is not linearizable. Compared to existing data structures, ELB-trees require fewer atomic operations, leading to improved performance. We measure the parallel performance of ELB-trees using a set of benchmarks and observe that ELB-trees are up to almost 30 times faster.

  19. Derivative free Davidon-Fletcher-Powell (DFP) for solving symmetric systems of nonlinear equations

    Science.gov (United States)

    Mamat, M.; Dauda, M. K.; Mohamed, M. A. bin; Waziri, M. Y.; Mohamad, F. S.; Abdullah, H.

    2018-03-01

    Problems arising in engineering, economics, modelling, industry and scientific computing are mostly nonlinear in nature, and the numerical solution of such systems is widely applied in those areas. Over the years, there has been significant theoretical study to develop methods for solving such systems; despite these efforts, the methods developed still have deficiencies. As a contribution to solving systems of the form F(x) = 0, x ∈ R^n, a derivative-free method via the classical Davidon-Fletcher-Powell (DFP) update is presented. This is achieved by simply approximating the inverse Hessian matrix Q_{k+1}^{-1} by θ_k I. The modified method satisfies the descent condition and possesses local superlinear convergence properties. Interestingly, without computing any derivative, the proposed method never failed to converge throughout the numerical experiments. The output is based on the number of iterations and CPU time; different initial starting points were used to solve 40 benchmark test problems. With the aid of the squared-norm merit function and a derivative-free line search technique, the approach yields a method for solving symmetric systems of nonlinear equations that is capable of significantly reducing the CPU time and number of iterations compared to its counterparts. A comparison between the proposed method and the classical DFP update shows that the proposed method is the top performer, outperforming the existing method in almost all cases. In terms of the number of iterations, out of the 40 problems, the proposed method solved 38 successfully (95%) while the classical DFP solved 2 problems (5%). In terms of CPU time, the proposed method solved 29 out of the 40 problems (72.5%) successfully, whereas the classical DFP solved 11 (27.5%). The method is valid in terms of derivation, reliable in terms of the number of iterations and accurate in terms of CPU time. Thus, it is suitable and achieves the objective.
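
    The core idea, collapsing the inverse Hessian to a scalar multiple of the identity and safeguarding the step with a derivative-free line search on the squared-norm merit function, can be sketched as follows. This is a schematic reconstruction under stated assumptions (a Barzilai-Borwein-like scalar update and simple backtracking), not the authors' exact method.

```python
import numpy as np

def df_solve(F, x0, theta0=1.0, tol=1e-8, max_iter=500):
    """Schematic derivative-free iteration for F(x) = 0.

    The inverse Hessian/Jacobian information is collapsed to a scalar
    multiple of the identity, theta_k * I, so the step is just a scaled
    residual; backtracking on the merit ||F||^2 enforces decrease. This
    mirrors the spirit, not the exact updates, of the paper.
    """
    x, theta = np.asarray(x0, float), theta0
    merit = lambda z: np.dot(F(z), F(z))
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -theta * Fx                        # step along the scaled residual
        alpha = 1.0
        while merit(x + alpha * d) > (1 - 1e-4 * alpha) * merit(x) and alpha > 1e-10:
            alpha *= 0.5                       # derivative-free backtracking
        s, y = alpha * d, F(x + alpha * d) - Fx
        # Spectral (Barzilai-Borwein-like) update of the scalar theta.
        theta = abs(np.dot(s, s) / np.dot(s, y)) if np.dot(s, y) != 0 else theta
        x = x + s
    return x

# Toy symmetric nonlinear system with solution x = (0, 0).
F = lambda x: np.array([2*x[0] + np.sin(x[0]) - x[1],
                        -x[0] + 2*x[1] + np.sin(x[1])])
print(df_solve(F, [1.0, 1.0]))
```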

  20. Models and Methods for Free Material Optimization

    DEFF Research Database (Denmark)

    Weldeyesus, Alemseged Gebrehiwot

    Free Material Optimization (FMO) is a powerful approach for structural optimization in which the design parametrization allows the entire elastic stiffness tensor to vary freely at each point of the design domain. The only requirements imposed on the stiffness tensor are mild necessary conditions, such as symmetry and positive semidefiniteness.

  1. Qinshan CANDU NPP outage performance improvement through benchmarking

    International Nuclear Information System (INIS)

    Jiang Fuming

    2005-01-01

    With the increasingly fierce competition in the deregulated energy market, the optimization of outage duration has become one of the focal points for nuclear power plant owners around the world. People are seeking various ways to shorten the outage duration of NPPs. Great efforts have been made in the Light Water Reactor (LWR) family with the concepts of benchmarking and evaluation, which greatly reduced outage durations and improved outage performance. The average capacity factor of LWRs has been greatly improved over the last three decades and is now close to 90%. CANDU (Pressurized Heavy Water Reactor) stations, with their unique feature of on-power refueling and with nuclear fuel remaining in the reactor throughout the planned outage, are subject to more stringent safety requirements during planned outages. In addition, this feature introduces more variation into the critical path of planned outages at different stations. In order to benchmark against the best practices in CANDU stations, the Third Qinshan Nuclear Power Company (TQNPC) initiated a benchmarking program among the CANDU stations aiming to standardize the outage maintenance windows and optimize the outage duration. The initial benchmarking has resulted in the optimization of outage duration in the Qinshan CANDU NPP and the formulation of its first long-term outage plan. This paper describes the benchmarking work that has proven useful for optimizing outage duration in the Qinshan CANDU NPP, and the vision of further optimizing the duration with a joint effort from the CANDU community. (authors)

  2. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    Science.gov (United States)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
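
    The implicit defect constraints that give collocation its sparsity can be illustrated on a toy ODE. The sketch below uses trapezoidal defects as a simple stand-in for the Legendre-Gauss-Lobatto scheme of the paper, solving all defect equations simultaneously with a generic nonlinear solver.

```python
import numpy as np
from scipy.optimize import root

# Implicit collocation for x' = f(x): solve all defect equations at once.
# Trapezoidal defects are used here as a simple stand-in for the
# Legendre-Gauss-Lobatto scheme described in the paper.
f = lambda x: -x          # toy dynamics: exponential decay
N, h, x0 = 50, 0.1, 1.0   # steps, step size, initial state

def defects(X):
    X = np.concatenate([[x0], X])            # prepend the known initial state
    # Defect at each interval: x_{k+1} - x_k - h/2 * (f(x_k) + f(x_{k+1})) = 0
    return X[1:] - X[:-1] - 0.5 * h * (f(X[1:]) + f(X[:-1]))

sol = root(defects, np.ones(N))              # Jacobian is sparse in general
X = np.concatenate([[x0], sol.x])
print(np.max(np.abs(X - np.exp(-h * np.arange(N + 1)))))  # error vs exact
```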

  3. Revaluering benchmarking - A topical theme for the construction industry

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2011-01-01

    This paper addresses the underlying nature of benchmarking, questioning the concept objectively, and accounts for the importance of focusing attention on the sociological impacts benchmarking has in organizations. To understand these sociological impacts, benchmarking research needs to transcend the perception of benchmarking systems as secondary and derivative and instead study benchmarking as constitutive of social relations and as an irredeemably social phenomenon. I have attempted to do so in this paper by treating benchmarking using a calculative practice perspective, and describing how...

  4. Imidazole derivatives as angiotensin II AT1 receptor blockers: Benchmarks, drug-like calculations and quantitative structure-activity relationships modeling

    Science.gov (United States)

    Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi

    2018-03-01

    We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post-Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. We then treated the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were performed for these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to model the relationships between the molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.

  5. Free Energy-Based Virtual Screening and Optimization of RNase H Inhibitors of HIV-1 Reverse Transcriptase.

    Science.gov (United States)

    Zhang, Baofeng; D'Erasmo, Michael P; Murelli, Ryan P; Gallicchio, Emilio

    2016-09-30

    We report the results of a binding free energy-based virtual screening campaign of a library of 77 α-hydroxytropolone derivatives against the challenging RNase H active site of the reverse transcriptase (RT) enzyme of human immunodeficiency virus-1. Multiple protonation states, rotamer states, and binding modalities of each compound were individually evaluated. The work involved more than 300 individual absolute alchemical binding free energy parallel molecular dynamics calculations and over 1 million CPU hours on national computing clusters and a local campus computational grid. The thermodynamic and structural measures obtained in this work rationalize a series of characteristics of this system useful for guiding future synthetic and biochemical efforts. The free energy model identified key ligand-dependent entropic and conformational reorganization processes difficult to capture using standard docking and scoring approaches. Binding free energy-based optimization of the lead compounds emerging from the virtual screen has yielded four compounds with very favorable binding properties, which will be the subject of further experimental investigations. This work is one of the few reported applications of advanced-binding free energy models to large-scale virtual screening and optimization projects. It further demonstrates that, with suitable algorithms and automation, advanced-binding free energy models can have a useful role in early-stage drug-discovery programs.

  6. Benchmarking set for domestic smart grid management

    NARCIS (Netherlands)

    Bosman, M.G.C.; Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2010-01-01

    In this paper we propose a benchmark for domestic smart grid management. It consists of an in-depth description of a domestic smart grid, in which local energy consumers, producers and buffers can be controlled. First, from this description a general benchmark framework is derived, which can be used...

  7. A simplified 2D HTTR benchmark problem

    International Nuclear Information System (INIS)

    Zhang, Z.; Rahnema, F.; Pounders, J. M.; Zhang, D.; Ougouag, A.

    2009-01-01

    To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of relevant whole core configurations. In this paper we have created a numerical benchmark problem in a 2D configuration typical of a high-temperature gas-cooled prismatic core. This problem was derived from the HTTR start-up experiment. For code-to-code verification, complex details of geometry and material specification of the physical experiments are not necessary. To this end, the benchmark problem presented here is derived by simplifications that remove the unnecessary details while retaining the heterogeneity and major physics properties from the neutronics viewpoint. Also included here is a six-group material (macroscopic) cross section library for the benchmark problem. This library was generated using the lattice depletion code HELIOS. Using this library, benchmark quality Monte Carlo solutions are provided for three different configurations (all-rods-in, partially-controlled and all-rods-out). The reference solutions include the core eigenvalue, block (assembly) averaged fuel pin fission density distributions, and absorption rate in absorbers (burnable poison and control rods). (authors)

  8. Benchmarking

    OpenAIRE

    Meylianti S., Brigita

    1999-01-01

    Benchmarking has different meanings to different people. There are five types of benchmarking, namely internal benchmarking, competitive benchmarking, industry/functional benchmarking, process/generic benchmarking and collaborative benchmarking. Each type of benchmarking has its own advantages as well as disadvantages. Therefore it is important to know what kind of benchmarking is suitable for a specific application. This paper will discuss those five types of benchmarking in detail, including...

  9. Lifecycle-Based Swarm Optimization Method for Numerical Optimization

    Directory of Open Access Journals (Sweden)

    Hai Shen

    2014-01-01

    Bioinspired optimization algorithms have been widely used to solve various scientific and engineering problems. Inspired by the biological lifecycle, this paper presents a novel optimization algorithm called lifecycle-based swarm optimization (LSO). The biological lifecycle includes four stages: birth, growth, reproduction, and death. Through this process, even though an individual organism dies, the species does not perish; furthermore, the species develops a stronger ability to adapt to the environment and achieves better evolution. LSO simulates the biological lifecycle process through six optimization operators: chemotactic, assimilation, transposition, crossover, selection, and mutation. In addition, the spatial distribution of the initial population follows a clumped distribution. Experiments were conducted on unconstrained benchmark optimization problems and mechanical design optimization problems. The unconstrained benchmark problems, including both unimodal and multimodal cases, demonstrate the algorithm's optimal performance and stability, and the mechanical design problem tests its practicability. The results demonstrate remarkable performance of the LSO algorithm on all chosen benchmark functions when compared to several successful optimization techniques.

  10. Optimization of free ammonia concentration for nitrite accumulation in shortcut biological nitrogen removal process.

    Science.gov (United States)

    Chung, Jinwook; Shim, Hojae; Park, Seong-Jun; Kim, Seung-Jin; Bae, Wookeun

    2006-03-01

    A shortcut biological nitrogen removal (SBNR) process utilizes the concept of direct conversion of ammonium to nitrite and then to nitrogen gas. A successful SBNR requires accumulation of nitrite in the system and inhibition of the activity of nitrite oxidizers. A high concentration of free ammonia (FA) inhibits nitrite oxidizers, but unfortunately decreases the ammonium removal rate as well. Therefore, an optimal range of FA concentration is necessary not only to stabilize nitrite accumulation but also to achieve maximum ammonium removal. In order to derive such optimal FA concentrations, the specific substrate utilization rates of ammonium and nitrite oxidizers were measured. The optimal FA concentration range appeared to be 5-10 mg/L for the adapted sludge. The simulated results from the modified inhibition model, expressed in terms of FA and ammonium/nitrite concentrations, were very similar to the experimental results.
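
    Substrate-inhibition kinetics of the Andrews/Haldane form are a common way to express the trade-off described above, where higher FA suppresses nitrite oxidizers but also slows ammonium removal. A small illustrative sketch follows; the parameter values are assumptions, not the paper's estimates.

```python
import numpy as np

def andrews(S, mu_max=1.0, Ks=1.0, Ki=20.0):
    """Andrews/Haldane substrate-inhibition kinetics:
    mu = mu_max * S / (Ks + S + S**2 / Ki).
    Parameter values are illustrative assumptions, not the paper's estimates."""
    return mu_max * S / (Ks + S + S**2 / Ki)

S = np.linspace(0.01, 50, 5000)           # free ammonia concentration, mg/L
rate = andrews(S)
# The rate peaks at S = sqrt(Ks * Ki); beyond it, inhibition dominates.
print(f"rate peaks near S = {S[rate.argmax()]:.2f} mg/L")
```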

  11. Optimal defense resource allocation in scale-free networks

    Science.gov (United States)

    Zhang, Xuejun; Xu, Guoqiang; Xia, Yongxiang

    2018-02-01

    The robustness research of networked systems has drawn widespread attention in the past decade, and one of the central topics is to protect the network from external attacks through allocating appropriate defense resource to different nodes. In this paper, we apply a specific particle swarm optimization (PSO) algorithm to optimize the defense resource allocation in scale-free networks. Results reveal that PSO based resource allocation shows a higher robustness than other resource allocation strategies such as uniform, degree-proportional, and betweenness-proportional allocation strategies. Furthermore, we find that assigning less resource to middle-degree nodes under small-scale attack while more resource to low-degree nodes under large-scale attack is conducive to improving the network robustness. Our work provides an insight into the optimal defense resource allocation pattern in scale-free networks and is helpful for designing a more robust network.

  12. Optimizing the method for generation of integration-free induced pluripotent stem cells from human peripheral blood.

    Science.gov (United States)

    Gu, Haihui; Huang, Xia; Xu, Jing; Song, Lili; Liu, Shuping; Zhang, Xiao-Bing; Yuan, Weiping; Li, Yanxin

    2018-06-15

    Generation of induced pluripotent stem cells (iPSCs) from human peripheral blood provides a convenient and low-invasive way to obtain patient-specific iPSCs. The episomal vector is one of the best approaches for reprogramming somatic cells to pluripotent status because of its simplicity and affordability. However, the efficiency of episomal vector reprogramming of adult peripheral blood cells is relatively low compared with cord blood and bone marrow cells. In the present study, integration-free human iPSCs derived from peripheral blood were established via episomal technology. We optimized mononuclear cell isolation and cultivation, episomal vector promoters, and a combination of transcriptional factors to improve reprogramming efficiency. Here, we improved the generation efficiency of integration-free iPSCs from human peripheral blood mononuclear cells by optimizing the method of isolating mononuclear cells from peripheral blood, by modifying the integration of culture medium, and by adjusting the duration of culture time and the combination of different episomal vectors. With this optimized protocol, a valuable asset for banking patient-specific iPSCs has been established.

  13. Particle swarm optimization with scale-free interactions.

    Directory of Open Access Journals (Sweden)

    Chen Liu

    The particle swarm optimization (PSO) algorithm, in which individuals collaborate with their interacting neighbors like birds flocking to search for the optima, has been successfully applied in a wide range of fields pertaining to searching and convergence. Here we employ a scale-free network to represent the inter-individual interactions in the population, named SF-PSO. In contrast to the traditional PSO with fully-connected or regular topology, the scale-free topology used in SF-PSO incorporates the diversity of individuals in searching and information dissemination ability, leading to a quite different optimization process. Systematic results with respect to several standard test functions demonstrate that SF-PSO gives rise to a better balance between convergence speed and optimum quality, accounting for its much better performance than that of the traditional PSO algorithms. We further explore the dynamical searching process microscopically, finding that the cooperation of hub nodes and non-hub nodes plays a crucial role in optimizing the convergence process. Our work may have implications in computational intelligence and complex networks.
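
    The only structural change SF-PSO makes relative to standard PSO is the interaction topology: each particle learns from its best neighbor on a scale-free graph instead of from the global best. A hedged sketch of this idea, not the authors' code, using a Barabasi-Albert graph from networkx:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)

def sf_pso(f, dim, n=50, iters=200, w=0.7, c1=1.5, c2=1.5):
    """PSO where each particle learns from its neighbors on a Barabasi-Albert
    (scale-free) interaction graph rather than from the whole swarm; this is
    a sketch of the SF-PSO idea, not the authors' implementation."""
    g = nx.barabasi_albert_graph(n, m=2, seed=2)
    x = rng.uniform(-5, 5, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    for _ in range(iters):
        for i in range(n):
            nbrs = list(g[i]) + [i]
            lbest = pbest[nbrs[np.argmin(pval[nbrs])]]   # best among neighbors
            r1, r2 = rng.random(dim), rng.random(dim)
            v[i] = w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (lbest - x[i])
            x[i] += v[i]
            fx = f(x[i])
            if fx < pval[i]:
                pbest[i], pval[i] = x[i].copy(), fx
    return pbest[np.argmin(pval)], pval.min()

sphere = lambda z: float(np.sum(z**2))
print(sf_pso(sphere, dim=10))
```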

  14. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  15. Preliminary optimal configuration on free standing hybrid riser

    Directory of Open Access Journals (Sweden)

    Kyoung-Su Kim

    2018-05-01

    A Free Standing Hybrid Riser (FSHR) is composed of vertical steel risers and Flexible Jumpers (FJ), jointly connected to a submerged Buoyancy Can (BC). Several factors influence the behavior of an FSHR, such as the span distance between the offshore platform and the foundation, the BC up-lift force, the BC submerged location and the FJ length. An optimization method through a parametric study is presented. Firstly, the overall arrangement and characteristics of the FSHR are introduced. Secondly, a flowchart for optimization of the FSHR is suggested. Following that, it is described how to select reasonable ranges for a parametric study and determine each of the optimal configuration options. Lastly, numerical analysis based on this procedure is performed through a case study. In conclusion, the relations among those parameters are analyzed and non-dimensional parametric ranges for optimal arrangements are suggested. Additionally, strength analysis is performed with variation in the configuration. Keywords: Free standing hybrid riser, Hybrid riser system, Buoyancy can, Flexible jumper, Deepwater, Multi-body dynamics

  16. Benchmarking variable-density flow in saturated and unsaturated porous media

    Science.gov (United States)

    Guevara Morel, Carlos Roberto; Cremer, Clemens; Graf, Thomas

    2015-04-01

    In natural environments, fluid density and viscosity can be affected by spatial and temporal variations of solute concentration and/or temperature. These variations can occur, for example, due to salt water intrusion in coastal aquifers, leachate infiltration from waste disposal sites and upconing of saline water from deep aquifers. As a consequence, potentially unstable situations may exist in which a dense fluid overlies a less dense fluid. This situation can produce instabilities that manifest as dense plume fingers that move vertically downwards, counterbalanced by vertical upwards flow of the less dense fluid. The resulting free convection increases solute transport rates over large distances and times relative to constant-density flow. Therefore, the understanding of free convection is relevant for the protection of freshwater aquifer systems. The results from a laboratory experiment of saturated and unsaturated variable-density flow and solute transport (Simmons et al., Transport in Porous Media, 2002) are used as the physical basis to define a mathematical benchmark. The HydroGeoSphere code coupled with PEST is used to estimate the optimal parameter set capable of reproducing the physical model. A grid convergence analysis (in space and time) is also undertaken in order to obtain adequate spatial and temporal discretizations. The new mathematical benchmark is useful for model comparison and testing of variable-density variably saturated flow in porous media.

  17. PORTFOLIO COMPOSITION WITH MINIMUM VARIANCE: COMPARISON WITH MARKET BENCHMARKS

    Directory of Open Access Journals (Sweden)

    Daniel Menezes Cavalcante

    2016-07-01

    Portfolio optimization strategies are advocated as being able to allow the composition of stock portfolios that provide returns above market benchmarks. This study aims to determine whether, in fact, portfolios based on the minimum variance strategy, optimized by Modern Portfolio Theory, are able to achieve earnings above market benchmarks in Brazil. Time series of 36 securities traded on the BM&FBOVESPA were analyzed over a long period of time (1999-2012), with sample windows of 12, 36, 60 and 120 monthly observations. The results indicated that the minimum variance portfolio's performance is superior to the market benchmarks (CDI and IBOVESPA) in terms of return and risk-adjusted return, especially over medium- and long-term investment horizons.
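
    For a fixed covariance estimate, the unconstrained minimum-variance weights have the closed form w = Σ^{-1}1 / (1'Σ^{-1}1). The sketch below uses hypothetical return data; the study's actual asset universe, estimation windows and constraints differ.

```python
import numpy as np

def min_variance_weights(returns):
    """Closed-form minimum-variance portfolio (no short-sale constraint):
    w = Sigma^{-1} 1 / (1' Sigma^{-1} 1)."""
    cov = np.cov(returns, rowvar=False)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)   # solve instead of explicit inverse
    return w / w.sum()

# Hypothetical monthly returns for 36 securities over a 60-month window.
rng = np.random.default_rng(3)
rets = rng.normal(0.01, 0.05, size=(60, 36))
w = min_variance_weights(rets)
print(w.round(3), float(w @ np.cov(rets, rowvar=False) @ w))  # weights, variance
```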

  18. Behaviour - The keystone in optimizing free-ranging ungulate production

    Science.gov (United States)

    Free-ranging animal behaviour is a keystone to optimizing free-ranging domestic animal production. This chapter focuses on several aspects that emanate from foraging including defining terms, concepts and the complexity that underlie managing animals and landscapes. Behaviour is investigated in li...

  19. A topological derivative method for topology optimization

    DEFF Research Database (Denmark)

    Norato, J.; Bendsøe, Martin P.; Haber, RB

    2007-01-01

    We propose a fictitious domain method for topology optimization in which a level set of the topological derivative field for the cost function identifies the boundary of the optimal design. We describe a fixed-point iteration scheme that implements this optimality criterion subject to a volumetric resource constraint. A smooth and consistent projection of the region bounded by the level set onto the fictitious analysis domain simplifies the response analysis and enhances the convergence of the optimization algorithm. Moreover, the projection supports the reintroduction of solid material in void regions, a critical requirement for robust topology optimization. We present several numerical examples that demonstrate compliance minimization of fixed-volume, linearly elastic structures.

  20. Perprof-py: A Python Package for Performance Profile of Mathematical Optimization Software

    Directory of Open Access Journals (Sweden)

    Abel Soares Siqueira

    2016-04-01

    A very important area of research in the field of mathematical optimization is the benchmarking of optimization packages to compare solvers. During benchmarking, one usually collects a large amount of information such as CPU time, number of function evaluations, number of iterations, and much more. This information, if presented as tables, can be difficult to analyze and compare due to the large amount of data. Therefore, tools to better process and understand optimization benchmark data have been developed. One of the most widespread tools is the performance profile graphic proposed by Dolan and Moré [2]. In this context, this paper describes perprof-py, a free/open source software that creates performance profile graphics. This software produces graphics in PDF using LaTeX with the PGF/TikZ [22] and PGFPLOTS [4] packages, in PNG using matplotlib [9], and in HTML using Bokeh [1]. Perprof-py can also be easily extended to be used with other plot libraries. It is implemented in Python 3 with support for internationalization, and is under the General Public License Version 3 (GPLv3).
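
    The performance profile of Dolan and Moré is simple to compute: for each solver s, ρ_s(τ) is the fraction of problems whose cost is within a factor τ of the best cost achieved on that problem. A compact sketch, independent of perprof-py, with hypothetical timing data:

```python
import numpy as np
import matplotlib.pyplot as plt

def performance_profile(T, taus):
    """T[p, s]: cost of solver s on problem p (np.inf on failure).
    Returns rho[s](tau): fraction of problems solved within a factor
    tau of the best solver, following Dolan and More."""
    ratios = T / T.min(axis=1, keepdims=True)
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(T.shape[1])])

# Hypothetical timing data: 100 problems, 3 solvers.
rng = np.random.default_rng(4)
T = rng.lognormal(mean=0.0, sigma=1.0, size=(100, 3)) * [1.0, 1.3, 2.0]
taus = np.linspace(1, 10, 200)
for s, rho in enumerate(performance_profile(T, taus)):
    plt.step(taus, rho, label=f"solver {s}")
plt.xlabel("performance ratio tau"); plt.ylabel("fraction of problems")
plt.legend(); plt.savefig("profile.png")
```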

  1. Orchard navigation using derivative free Kalman filtering

    DEFF Research Database (Denmark)

    Hansen, Søren; Bayramoglu, Enis; Andersen, Jens Christian

    2011-01-01

    This paper describes the use of derivative-free filters for mobile robot localization and navigation in an orchard. The localization algorithm fuses odometry and gyro measurements with line features representing the surrounding fruit trees of the orchard. The line features are created on the basis of 2...

  2. XWeB: The XML Warehouse Benchmark

    Science.gov (United States)

    Mahboubi, Hadj; Darmont, Jérôme

    With the emergence of XML as a standard for representing business data, new decision support applications are being developed. These XML data warehouses aim at supporting On-Line Analytical Processing (OLAP) operations that manipulate irregular XML data. To ensure feasibility of these new tools, important performance issues must be addressed. Performance is customarily assessed with the help of benchmarks. However, decision support benchmarks do not currently support XML features. In this paper, we introduce the XML Warehouse Benchmark (XWeB), which aims at filling this gap. XWeB derives from the relational decision support benchmark TPC-H. It is mainly composed of a test data warehouse that is based on a unified reference model for XML warehouses and that features XML-specific structures, and its associated XQuery decision support workload. XWeB's usage is illustrated by experiments on several XML database management systems.

  3. Topological Derivatives in Shape Optimization

    CERN Document Server

    Novotny, Antonio André

    2013-01-01

    The topological derivative is defined as the first term (correction) of the asymptotic expansion of a given shape functional with respect to a small parameter that measures the size of singular domain perturbations, such as holes, inclusions, defects, source-terms and cracks. Over the last decade, topological asymptotic analysis has become a broad, rich and fascinating research area from both theoretical and numerical standpoints. It has applications in many different fields such as shape and topology optimization, inverse problems, imaging processing and mechanical modeling, including synthesis and/or optimal design of microstructures, sensitivity analysis in fracture mechanics and damage evolution modeling. Since there is no monograph on the subject at present, the authors provide here the first account of the theory, which combines classical sensitivity analysis in shape optimization with asymptotic analysis by means of compound asymptotic expansions for elliptic boundary value problems. This book is intended...

  4. Optimization and benchmarking of a perturbative Metropolis Monte Carlo quantum mechanics/molecular mechanics program.

    Science.gov (United States)

    Feldt, Jonas; Miranda, Sebastião; Pratas, Frederico; Roma, Nuno; Tomás, Pedro; Mata, Ricardo A

    2017-12-28

    In this work, we present an optimized perturbative quantum mechanics/molecular mechanics (QM/MM) method for use in Metropolis Monte Carlo simulations. The model adopted is particularly tailored for the simulation of molecular systems in solution but can be readily extended to other applications, such as catalysis in enzymatic environments. The electrostatic coupling between the QM and MM systems is simplified by applying perturbation theory to estimate the energy changes caused by a movement in the MM system. This approximation, together with the effective use of GPU acceleration, leads to a negligible added computational cost for the sampling of the environment. Benchmark calculations are carried out to evaluate the impact of the approximations applied and the overall computational performance.

  5. Structural optimization of free-form reciprocal structures

    DEFF Research Database (Denmark)

    Parigi, Dario

    2014-01-01

    This paper presents an optimization algorithm for the design of structurally efficient free-form reciprocal structures. Because of the geometric complexity of reciprocal structures, only a few structural studies have been carried out so far, and we have a limited knowledge of the relation between...

  6. Case mix classification and a benchmark set for surgery scheduling

    NARCIS (Netherlands)

    Leeftink, Gréanne; Hans, Erwin W.

    Numerous benchmark sets exist for combinatorial optimization problems. However, in healthcare scheduling, only a few benchmark sets are known, mainly focused on nurse rostering. One of the most studied topics in the healthcare scheduling literature is surgery scheduling, for which there is no widely used benchmark set.

  7. A benchmarking tool to evaluate computer tomography perfusion infarct core predictions against a DWI standard.

    Science.gov (United States)

    Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G

    2016-10-01

    Differences in research methodology have hampered the optimization of Computed Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-ROIs was optimal at a specific rCBF threshold. The benchmarking tool can play an important role in optimizing CTP software, as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages. © The Author(s) 2015.

  8. A Field-Based Aquatic Life Benchmark for Conductivity in ...

    Science.gov (United States)

    This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for dissolved salts as measured by conductivity in Central Appalachian streams using data from West Virginia and Kentucky. This report provides scientific evidence for a conductivity benchmark in a specific region rather than for the entire United States.

  9. Benchmarking transaction and analytical processing systems the creation of a mixed workload benchmark and its application

    CERN Document Server

    Bog, Anja

    2014-01-01

    This book introduces a new benchmark for hybrid database systems, gauging the effect of adding OLAP to an OLTP workload and analyzing the impact of commonly used optimizations in historically separate OLTP and OLAP domains in mixed-workload scenarios.

  10. Interior beam searchlight semi-analytical benchmark

    International Nuclear Information System (INIS)

    Ganapol, Barry D.; Kornreich, Drew E.

    2008-01-01

    Multidimensional semi-analytical benchmarks to provide highly accurate standards to assess routine numerical particle transport algorithms are few and far between. Because of the well-established 1D theory for the analytical solution of the transport equation, it is sometimes possible to 'bootstrap' a 1D solution to generate a more comprehensive solution representation. Here, we consider the searchlight problem (SLP) as a multidimensional benchmark. A variation of the usual SLP is the interior beam SLP (IBSLP) where a beam source lies beneath the surface of a half space and emits directly towards the free surface. We consider the establishment of a new semi-analytical benchmark based on a new FN formulation. This problem is important in radiative transfer experimental analysis to determine cloud absorption and scattering properties. (authors)

  11. Benchmarking von Krankenhausinformationssystemen – eine vergleichende Analyse deutschsprachiger Benchmarkingcluster

    Directory of Open Access Journals (Sweden)

    Jahn, Franziska

    2015-08-01

    Benchmarking is a method of strategic information management used by many hospitals today. During the last years, several benchmarking clusters have been established within the German-speaking countries. They support hospitals in comparing and positioning their information systems' and information management's costs, performance and efficiency against those of other hospitals. In order to differentiate between these benchmarking clusters and to provide decision support in selecting an appropriate benchmarking cluster, a classification scheme is developed. The classification scheme covers both the general conditions and the contents examined by the benchmarking clusters. It is applied to seven benchmarking clusters which have been active in the German-speaking countries within the last years. Currently, performance benchmarking is the most frequent benchmarking type, whereas the observed benchmarking clusters differ in the number of benchmarking partners and their cooperation forms. The benchmarking clusters also deal with different benchmarking subjects. Assessing the costs and quality of application systems, physical data processing systems, organizational structures of information management, and IT service processes are the most frequent benchmarking subjects. There is still potential for further activities within the benchmarking clusters to measure strategic and tactical information management, IT governance and the quality of data and data-processing processes. Based on the classification scheme and the comparison of the benchmarking clusters, we derive general recommendations for benchmarking of hospital information systems.

  12. On a variational principle for shape optimization and elliptic free boundary problems

    Directory of Open Access Journals (Sweden)

    Raúl B. González De Paz

    2009-02-01

    A variational principle for several free boundary value problems using a relaxation approach is presented. The relaxed energy functional is concave and it is defined on a convex set, so that the minimizing points are characteristic functions of sets. As a consequence of the first-order optimality conditions, it is shown that the corresponding sets are domains bounded by free boundaries, so that the equivalence of the solution of the relaxed problem with the solutions of several free boundary value problems is proved. Keywords: Calculus of variations, optimization, free boundary problems.

  13. PROCEDURES FOR THE DERIVATION OF EQUILIBRIUM PARTITIONING SEDIMENT BENCHMARKS (ESBS) FOR THE PROTECTION OF BENTHIC ORGANISMS: COMPENDIUM OF TIER 2 VALUES FOR NONIONIC ORGANICS

    Science.gov (United States)

    This equilibrium partitioning sediment benchmark (ESB) document describes procedures to derive concentrations for 32 nonionic organic chemicals in sediment which are protective of the presence of freshwater and marine benthic organisms. The equilibrium partitioning (EqP) approach...

  14. Multi-objective superstructure-free synthesis and optimization of thermal power plants

    International Nuclear Information System (INIS)

    Wang, Ligang; Lampe, Matthias; Voll, Philip; Yang, Yongping; Bardow, André

    2016-01-01

    The merits of superstructure-free synthesis are demonstrated for bi-objective design of thermal power plants. The design of thermal power plants is complex and thus best solved by optimization. Common optimization methods require specification of a superstructure which becomes a tedious and error-prone task for complex systems. Superstructure specification is avoided by the presented superstructure-free approach, which is shown to successfully solve the design task yielding a high-quality Pareto front of promising structural alternatives. The economic objective function avoids introducing infinite numbers of units (e.g., turbine, reheater and feedwater preheater) as favored by pure thermodynamic optimization. The number of feasible solutions found per number of mutation tries is still high even after many generations but declines after introducing highly-nonlinear cost functions leading to challenging MINLP problems. The identified Pareto-optimal solutions tend to employ more units than found in modern power plants indicating the need for cost functions to reflect current industrial practice. In summary, the multi-objective superstructure-free synthesis framework is a robust approach for very complex problems in the synthesis of thermal power plants. - Highlights: • A generalized multi-objective superstructure-free synthesis framework for thermal power plants is presented. • The superstructure-free synthesis framework is comprehensively evaluated by complex bi-objective synthesis problems. • The proposed framework is effective to explore the structural design space even for complex problems.

  15. Globally-Optimized Local Pseudopotentials for (Orbital-Free) Density Functional Theory Simulations of Liquids and Solids.

    Science.gov (United States)

    Del Rio, Beatriz G; Dieterich, Johannes M; Carter, Emily A

    2017-08-08

    The accuracy of local pseudopotentials (LPSs) is one of two major determinants of the fidelity of orbital-free density functional theory (OFDFT) simulations. We present a global optimization strategy for LPSs that enables OFDFT to reproduce solid and liquid properties obtained from Kohn-Sham DFT. Our optimization strategy can fit arbitrary properties from both solid and liquid phases, so the resulting globally optimized local pseudopotentials (goLPSs) can be used in solid and/or liquid-phase simulations depending on the fitting process. We show three test cases proving that we can (1) improve solid properties compared to our previous bulk-derived local pseudopotential generation scheme; (2) refine predicted liquid and solid properties by adding force matching data; and (3) generate a from-scratch, accurate goLPS from the local channel of a non-local pseudopotential. The proposed scheme therefore serves as a full and improved LPS construction protocol.

  16. Arbitrage-free valuation of energy derivatives

    International Nuclear Information System (INIS)

    Amin, K.; Ng, V.; Pirrong, C.

    1999-01-01

    This chapter focuses on techniques available for valuing energy-contingent claims and develops an arbitrage-free framework to value energy derivatives. The relationship between the spot, forward and futures prices is explained. Option valuation with deterministic convenience yields is discussed using an extension of the Black (1976) framework, and details of the risk-neutral valuation of European options, and valuation of American and European-style options are given. Option valuations with stochastic convenience yields, the evolution of the term structure of convenience yield, and a tree approach to valuing American and other options are discussed. Applications and limitations of the models for pricing energy derivative products are considered. The stochastic differential equation for the futures prices when the convenience yields are stochastic is presented in an appendix
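    As a concrete anchor for the Black (1976) framework mentioned above, the following is a minimal sketch of the Black-76 price of a European call on a futures contract; the numerical inputs are purely illustrative.

        from math import erf, exp, log, sqrt

        def norm_cdf(x):
            return 0.5 * (1.0 + erf(x / sqrt(2.0)))

        def black76_call(F, K, T, r, sigma):
            """European call on a futures contract (Black 1976)."""
            d1 = (log(F / K) + 0.5 * sigma ** 2 * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            return exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

        # e.g. a call on an energy future: F=70, K=75, T=0.5 y, r=5%, vol=35%
        print(black76_call(70.0, 75.0, 0.5, 0.05, 0.35))   # about 4.79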

  17. Transfer functions for solid-solution partitioning of cadmium, copper, nickel, lead and zinc in soils. Derivation of relationships for free metal ion activities and validation with independent data

    Energy Technology Data Exchange (ETDEWEB)

    Groenenberg, J.E.; Roemkens, P.F.A.M.; De Vries, W. [Soil Science Centre, Wageningen University and Research Centre, P.O. Box 47, 6700 AA Wageningen (Netherlands); Comans, R.N.J. [Energy Research Centre of the Netherlands, P.O. Box 1, 1755 ZG Petten (Netherlands); Luster, J. [Research Unit Soil Sciences, Swiss Federal Institute for Forest, Snow and Landscape Research, Zuercherstrasse 111 CH-8903 Birmensdorf (Switzerland); Pampura, T. [Laboratory of Physical Chemistry of Soils, Institute of Physicochemical and Biological Problems in Soil Science RAS, Pushchino, Moscow Region, 142290 (Russian Federation); Shotbolt, L. [Department of Geography, Queen Mary, University of London, Mile End Road, London E1 4NS (United Kingdom); Tipping, E. [Centre for Ecology & Hydrology, Lancaster Environment Centre, Library Avenue, Bailrigg, Lancaster, LA1 4AP (United Kingdom)

    2010-07-01

    Models to predict the solid-solution partitioning of trace metals are important tools in risk assessment, providing information on the biological availability of metals and their leaching. Empirically based models, or transfer functions, published to date differ with respect to the mathematical model used, the optimization method, the methods used to determine metal concentrations in the solid and solution phases and the soil properties accounted for. Here we review these methodological aspects before deriving our own transfer functions that relate free metal ion activities to reactive metal contents in the solid phase. One single function was able to predict free-metal ion activities estimated by a variety of soil solution extraction methods. Evaluation of the mathematical formulation showed that transfer functions derived to optimize the Freundlich adsorption constant (Kf), in contrast to functions derived to optimize either the solid or solution concentration, were most suitable for predicting concentrations in solution from solid phase concentrations and vice versa. The model was shown to be generally applicable on the basis of a large number of independent data, for which predicted free metal activities were within one order of magnitude of the observations. The model only over-estimated free-metal ion activities at alkaline pH (>7). The use of the reactive metal content measured by 0.43 M HNO3 rather than the total metal content resulted in a close correlation with measured data, particularly for nickel and zinc.
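    A minimal sketch of how a Freundlich-type transfer function of the kind optimized here can be inverted for the free metal ion activity; the coefficients below are placeholders, not the fitted values from the paper.

        # log Q = log Kf + n * log a, with log Kf regressed on soil properties
        # (organic matter, clay, pH). Coefficients b0..b_ph and n are
        # hypothetical illustrations only.
        b0, b_om, b_clay, b_ph, n = -1.0, 0.8, 0.2, 0.3, 0.7

        def log_free_ion_activity(log_q, log_om, log_clay, ph):
            """Free ion activity (log10) from reactive solid-phase content
            Q (log10) and soil properties, by inverting the Freundlich fit."""
            log_kf = b0 + b_om * log_om + b_clay * log_clay + b_ph * ph
            return (log_q - log_kf) / n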

  18. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    Energy Technology Data Exchange (ETDEWEB)

    Alan Black; Arnis Judzis

    2003-01-01

    Progress during current reporting year 2002 by quarter--Progress during Q1 2002: (1) In accordance with Task 7.0 (D. No.2 Technical Publications) TerraTek, NETL, and the Industry Contributors successfully presented a paper detailing Phase 1 testing results at the February 2002 IADC/SPE Drilling Conference, a prestigious venue for presenting DOE and private sector drilling technology advances. The full reference is as follows: IADC/SPE 74540 ''World's First Benchmarking of Drilling Mud Hammer Performance at Depth Conditions'' authored by Gordon A. Tibbitts, TerraTek; Roy C. Long, US Department of Energy; Brian E. Miller, BP America, Inc.; Arnis Judzis, TerraTek; and Alan D. Black, TerraTek. Gordon Tibbitts, TerraTek, presented the well-attended paper in February 2002. The full text of the Mud Hammer paper was included in the last quarterly report. (2) The Phase 2 project planning meeting (Task 6) was held at ExxonMobil's Houston Greenspoint offices on February 22, 2002. In attendance were representatives from TerraTek, DOE, BP, ExxonMobil, PDVSA, Novatek, and SDS Digger Tools. (3) PDVSA has joined the advisory board to this DOE mud hammer project. PDVSA's commitment of cash and in-kind contributions was reported during the last quarter. (4) Strong industry support remains for the DOE project. Both Andergauge and Smith Tools have expressed an interest in participating in the ''optimization'' phase of the program. The potential for increased testing with additional industry cash support was discussed at the planning meeting in February 2002. Progress during Q2 2002: (1) Presentation material was provided to the DOE/NETL project manager (Dr. John Rogers) for the DOE exhibit at the 2002 Offshore Technology Conference. (2) Two meetings at Smith International and one at Andergauge in Houston were held to investigate their interest in joining the Mud Hammer Performance study. (3) SDS Digger Tools (Task 3

  19. Analysis of a multigroup stylized CANDU half-core benchmark

    International Nuclear Information System (INIS)

    Pounders, Justin M.; Rahnema, Farzad; Serghiuta, Dumitru

    2011-01-01

    Highlights: → This paper provides a benchmark that is a stylized model problem in more than two energy groups that is realistic with respect to the underlying physics. → An 8-group cross section library is provided to augment a previously published 2-group 3D stylized half-core CANDU benchmark problem. → Reference eigenvalues and selected pin and bundle fission rates are included. → 2-, 4- and 47-group Monte Carlo solutions are compared to analyze homogenization-free transport approximations that result from energy condensation. - Abstract: An 8-group cross section library is provided to augment a previously published 2-group 3D stylized half-core Canadian deuterium uranium (CANDU) reactor benchmark problem. Reference eigenvalues and selected pin and bundle fission rates are also included. This benchmark is intended to provide computational reactor physicists and methods developers with a stylized model problem in more than two energy groups that is realistic with respect to the underlying physics. In addition to transport theory code verification, the 8-group energy structure provides reactor physicists with an ideal problem for examining cross section homogenization and collapsing effects in a full-core environment. To this end, additional 2-, 4- and 47-group full-core Monte Carlo benchmark solutions are compared to analyze homogenization-free transport approximations incurred as a result of energy group condensation.
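    For readers unfamiliar with energy condensation, the following is a minimal sketch of flux-weighted group collapsing, sigma_G = sum_g sigma_g phi_g / sum_g phi_g; the numbers are illustrative and are not taken from the 8-group library.

        import numpy as np

        sigma = np.array([1.2, 1.1, 0.9, 0.8, 0.6, 0.5, 0.4, 0.3])  # 8-group XS
        phi   = np.array([0.5, 0.8, 1.0, 1.2, 1.4, 1.1, 0.7, 0.3])  # weight flux
        groups = [[0, 1, 2, 3], [4, 5, 6, 7]]                       # 8 -> 2 groups

        # Collapse each coarse group using the reference spectrum as weight.
        sigma_2g = [float((sigma[g] * phi[g]).sum() / phi[g].sum()) for g in groups]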

  20. Equilibrium Partitioning Sediment Benchmarks (ESBs) for the ...

    Science.gov (United States)

    This document describes procedures to determine the concentrations of nonionic organic chemicals in sediment interstitial waters. In previous ESB documents, the general equilibrium partitioning (EqP) approach was chosen for the derivation of sediment benchmarks because it accounts for the varying bioavailability of chemicals in different sediments and allows for the incorporation of the appropriate biological effects concentration. This provides for the derivation of benchmarks that are causally linked to the specific chemical, applicable across sediments, and appropriately protective of benthic organisms.  This equilibrium partitioning sediment benchmark (ESB) document was prepared by scientists from the Atlantic Ecology Division, Mid-Continent Ecology Division, and Western Ecology Division, the Office of Water, and private consultants. The document describes procedures to determine the interstitial water concentrations of nonionic organic chemicals in contaminated sediments. Based on these concentrations, guidance is provided on the derivation of toxic units to assess whether the sediments are likely to cause adverse effects to benthic organisms. The equilibrium partitioning (EqP) approach was chosen because it is based on the concentrations of chemical(s) that are known to be harmful and bioavailable in the environment.  This document, and five others published over the last nine years, will be useful for the Program Offices, including Superfund, a

  1. Optimal economic order quantity for buyer-distributor-vendor supply chain with backlogging derived without derivatives

    Science.gov (United States)

    Teng, Jinn-Tsair; Cárdenas-Barrón, Leopoldo Eduardo; Lou, Kuo-Ren; Wee, Hui Ming

    2013-05-01

    In this article, we first correct a mathematical error in the total cost presented in the previously published paper by Chung and Wee [2007, 'Optimizing the Economic Lot Size of a Three-stage Supply Chain With Backlogging Derived Without Derivatives', European Journal of Operational Research, 183, 933-943] on a buyer-distributor-vendor three-stage supply chain with backlogging derived without derivatives. Then, an arithmetic-geometric inequality method is proposed, not only to simplify the algebraic method of completing perfect squares, but also to remedy its shortcomings. In addition, we provide a closed-form solution for the integral number of deliveries for the distributor and the vendor without using complex derivatives. Furthermore, our method can solve many cases that theirs cannot, because they did not consider that the square root of a negative number does not exist. Finally, we use some numerical examples to show that our proposed optimal solution is cheaper to operate than theirs.
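    For orientation, the arithmetic-geometric-mean step is easiest to see on the classical single-stage lot size, where D*K/Q + h*Q/2 >= sqrt(2*D*K*h) with equality when the two terms coincide; the sketch below uses this textbook case, not the paper's three-stage cost function.

        from math import sqrt

        def eoq_am_gm(D, K, h):
            """AM-GM bound: D*K/Q + h*Q/2 >= 2*sqrt((D*K/Q)*(h*Q/2)),
            tight when D*K/Q = h*Q/2, i.e. Q* = sqrt(2*D*K/h)."""
            q_star = sqrt(2 * D * K / h)
            tc_min = sqrt(2 * D * K * h)
            return q_star, tc_min

        # e.g. demand D=1200/yr, ordering cost K=50, holding cost h=6/unit/yr
        print(eoq_am_gm(1200, 50, 6))   # (141.42..., 848.52...)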

  2. Deriving optimal exploration target zones on mineral prospectivity maps

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2008-08-01

    Full Text Available into an objective function in simulated annealing in order to derive a set of optimal exploration focal points. Each optimal exploration focal point represents a pixel or location within a circular neighborhood of pixels with high posterior probability of mineral...
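    A generic simulated-annealing sketch in the spirit of this abstract, selecting a set of focal pixels with high posterior probability; the objective and the neighbourhood handling are simplified placeholders for the paper's actual formulation.

        import math
        import random

        def anneal(posterior, k=5, steps=20000, t0=1.0):
            """posterior: dict mapping pixel -> posterior probability."""
            pixels = list(posterior)
            state = random.sample(pixels, k)
            score = lambda s: sum(posterior[p] for p in s)
            for i in range(steps):
                t = t0 * (1.0 - i / steps) + 1e-12        # linear cooling
                cand = state[:]
                cand[random.randrange(k)] = random.choice(pixels)
                if len(set(cand)) < k:                    # keep points distinct
                    continue
                delta = score(cand) - score(state)
                if delta >= 0 or random.random() < math.exp(delta / t):
                    state = cand
            return state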

  3. Effect of concentration and molecular weight of chitosan and its derivative on the free radical scavenging ability.

    Science.gov (United States)

    Li, Huili; Xu, Qing; Chen, Yun; Wan, Ajun

    2014-03-01

    Chitosan is a biodegradable and biocompatible natural scaffold material, which has numerous applications in biomedical sciences. In this study, the in vitro antioxidant activity of chitosan scaffold material was investigated by the chemiluminescence signal generated from the hydroxyl radical (•OH) scavenging assay. The scavenging mechanism is also discussed. The results indicated that the free radical scavenging ability of chitosan scaffold material depends significantly on the chitosan concentration and shows an interesting kinetic change. Within the experimental concentration range, the optimal concentration of chitosan was 0.2 mg/mL. The molecular weight of chitosan also contributed to the free radical scavenging ability. A comparison between chitosan and its derivative showed that carboxymethyl chitosan possessed the higher scavenging ability. Copyright © 2013 Society of Plastics Engineers.

  4. Benchmarking to improve the quality of cystic fibrosis care.

    Science.gov (United States)

    Schechter, Michael S

    2012-11-01

    Benchmarking involves the identification of healthcare programs with the most favorable outcomes as a means to identify and spread effective strategies for delivery of care. The recent interest in the development of patient registries for patients with cystic fibrosis (CF) has been fueled in part by an interest in using them to facilitate benchmarking. This review summarizes reports of how benchmarking has been operationalized in attempts to improve CF care. Although certain goals of benchmarking can be accomplished with an exclusive focus on registry data analysis, benchmarking programs in Germany and the United States have supplemented these data analyses with exploratory interactions and discussions to better understand successful approaches to care and encourage their spread throughout the care network. Benchmarking allows the discovery and facilitates the spread of effective approaches to care. It provides a pragmatic alternative to traditional research methods such as randomized controlled trials, providing insights into methods that optimize delivery of care and allowing judgments about the relative effectiveness of different therapeutic approaches.

  5. A Comparison of Evidence-Based Estimates and Empirical Benchmarks of the Appropriate Rate of Use of Radiation Therapy in Ontario

    International Nuclear Information System (INIS)

    Mackillop, William J.; Kong, Weidong; Brundage, Michael; Hanna, Timothy P.; Zhang-Salomons, Jina; McLaughlin, Pierre-Yves; Tyldesley, Scott

    2015-01-01

    Purpose: Estimates of the appropriate rate of use of radiation therapy (RT) are required for planning and monitoring access to RT. Our objective was to compare estimates of the appropriate rate of use of RT derived from mathematical models, with the rate observed in a population of patients with optimal access to RT. Methods and Materials: The rate of use of RT within 1 year of diagnosis (RT1Y) was measured in the 134,541 cases diagnosed in Ontario between November 2009 and October 2011. The lifetime rate of use of RT (RTLIFETIME) was estimated by the multicohort utilization table method. Poisson regression was used to evaluate potential barriers to access to RT and to identify a benchmark subpopulation with unimpeded access to RT. Rates of use of RT were measured in the benchmark subpopulation and compared with published evidence-based estimates of the appropriate rates. Results: The benchmark rate for RT1Y, observed under conditions of optimal access, was 33.6% (95% confidence interval [CI], 33.0%-34.1%), and the benchmark for RTLIFETIME was 41.5% (95% CI, 41.2%-42.0%). Benchmarks for RTLIFETIME for 4 of 5 selected sites and for all cancers combined were significantly lower than the corresponding evidence-based estimates. Australian and Canadian evidence-based estimates of RTLIFETIME for 5 selected sites differed widely. RTLIFETIME in the overall population of Ontario was just 7.9% short of the benchmark but 20.9% short of the Australian evidence-based estimate of the appropriate rate. Conclusions: Evidence-based estimates of the appropriate lifetime rate of use of RT may overestimate the need for RT in Ontario.

  6. An Optimization of the Risk Management using Derivatives

    Directory of Open Access Journals (Sweden)

    Ovidiu ŞONTEA

    2011-07-01

    Full Text Available This article aims to provide a process that can be used in financial risk management by solving problems of minimizing the risk measure (VaR) using derivative products, namely bonds and options. This optimization problem was formulated for the hedging of a portfolio formed by an asset and a put option on that asset, and by a bond and an option on that bond, respectively. In the first optimization problem we obtain the hedge ratio and the optimal exercise price of the option, which is in fact the relative cost of the option's value. In the second optimization problem we obtain the optimal exercise price for a put option written on a bond.

  7. Toxicological Benchmarks for Screening Potential Contaminants of Concern for Effects on Terrestrial Plants

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II

    1993-01-01

    One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.

  8. A derivation of the free-free emission on the Galactic plane between ℓ= 20° and 44°

    Science.gov (United States)

    Alves, Marta I. R.; Davies, Rodney D.; Dickinson, Clive; Calabretta, Mark; Davis, Richard; Staveley-Smith, Lister

    2012-05-01

    We present the derivation of the free-free emission on the Galactic plane between ℓ= 20° and 44° and |b|≤ 4°, using radio recombination line (RRL) data from the H I Parkes All Sky Survey (HIPASS). Following an upgrade of the RRL data reduction technique, which improves significantly the quality of the final RRL spectra, we have extended the analysis to three times the area covered in Alves et al. The final RRL map has an angular resolution of 14.8 arcmin and a velocity resolution of 20 km s⁻¹. The electron temperature (Te) distribution of the ionized gas in the area under study at 1.4 GHz is derived using the line and continuum data from the present survey. The mean Te on the Galactic plane is 6000 K. The first direct measure of the free-free emission is obtained based on the derived Te distribution. Subtraction of this thermal component from the total continuum leads to the first direct measurement of the synchrotron emission at 1.4 GHz. A narrow component of width 2° is identified in the latitude distribution of the synchrotron emission. We present a list of H II regions and supernova remnants (SNRs) extracted from the present free-free and synchrotron maps, where we confirm the synchrotron nature of the SNRs G42.0-0.1 and G41.5+0.4 proposed by Kaplan et al. and the SNR G35.6-0.4 recently re-identified by Green. The latitude distribution of the RRL-derived free-free emission shows that the Wilkinson Microwave Anisotropy Probe (WMAP) maximum entropy method estimate is too high by ~50 per cent, in agreement with other recent results. The extension of this study to the inner Galaxy region ℓ=-50° to 50° will allow a better overall comparison of the RRL result with WMAP.

  9. Protein Folding Free Energy Landscape along the Committor - the Optimal Folding Coordinate.

    Science.gov (United States)

    Krivov, Sergei V

    2018-06-06

    Recent advances in simulation and experiment have led to dramatic increases in the quantity and complexity of produced data, which makes the development of automated analysis tools very important. A powerful approach to analyzing the dynamics contained in such data sets is to describe/approximate it by diffusion on a free energy landscape - free energy as a function of reaction coordinates (RC). For the description to be quantitatively accurate, RCs should be chosen in an optimal way. Recent theoretical results show that such an optimal RC exists; however, determining it for practical systems is a very difficult unsolved problem. Here we describe a solution to this problem: an adaptive nonparametric approach to accurately determine the optimal RC (the committor) for an equilibrium trajectory of a realistic system. In contrast to alternative approaches, which require a functional form with many parameters to approximate an RC and thus extensive expertise with the system, the suggested approach is nonparametric and can approximate any RC with high accuracy without system-specific information. To avoid overfitting for a realistically sampled system, the approach performs RC optimization in an adaptive manner by focusing optimization on less optimized spatiotemporal regions of the RC. The power of the approach is illustrated on a long equilibrium atomistic folding simulation of the HP35 protein. We have determined the optimal folding RC - the committor, which was confirmed by passing a stringent committor validation test. It allowed us to determine the first quantitatively accurate protein folding free energy landscape. We have confirmed the recent theoretical results that diffusion on such a free energy profile can be used to compute exactly the equilibrium flux, the mean first passage times, and the mean transition path times between any two points on the profile. We have shown that the mean squared displacement along the optimal RC grows linearly with time as for
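    For intuition only: in the special case of 1D diffusion on a profile F(x) with constant diffusion coefficient, the committor has the closed form q(x) = Int_a^x exp(bF) dy / Int_a^b exp(bF) dy. The sketch below evaluates this on an illustrative double well; it is not the paper's nonparametric estimator.

        import numpy as np

        x = np.linspace(-2.0, 2.0, 2001)
        F = (x ** 2 - 1.0) ** 2          # double well, minima at x = -1 and 1
        w = np.exp(F)                    # beta = 1 in units of kT

        # Cumulative trapezoid integral of w, normalized to end at 1.
        q = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
        q /= q[-1]                       # q(-2) = 0 (reactant), q(2) = 1 (product)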

  10. Toxicological benchmarks for screening contaminants of potential concern for effects on freshwater biota

    International Nuclear Information System (INIS)

    Suter, G.W. II

    1996-01-01

    An important early step in the assessment of ecological risks at contaminated sites is the screening of chemicals detected on the site to identify those that constitute a potential risk. Part of this screening process is the comparison of measured ambient concentrations to concentrations that are believed to be nonhazardous, termed benchmarks. This article discusses 13 methods by which benchmarks may be derived for aquatic biota and presents benchmarks for 105 chemicals. It then compares them with respect to their sensitivity, availability, magnitude relative to background concentrations, and conceptual bases. This compilation is limited to chemicals that have been detected on the US Department of Energy's Oak Ridge Reservation (ORR) and to benchmarks derived from studies of toxic effects on freshwater organisms. The list of chemicals includes 45 metals and 56 industrial organic chemicals but only four pesticides. Although some individual values can be shown to be too high to be protective and others are too low to be useful for screening, none of the approaches to benchmark derivation can be rejected without further definition of what constitutes adequate protection. The most appropriate screening strategy is to use multiple benchmark values along with background concentrations, knowledge of waste composition, and physicochemical properties to identify contaminants of potential concern

  11. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  12. A Seafloor Benchmark for 3-dimensional Geodesy

    Science.gov (United States)

    Chadwell, C. D.; Webb, S. C.; Nooner, S. L.

    2014-12-01

    We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone

  13. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome this weakness of the classical BBO algorithm for solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.
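    A skeletal sketch of the hybrid described above: BBO-style migration mixes entries between candidate permutations, a repair step restores feasibility, and a short tabu search stands in for the mutation operator. The QAP matrices, migration rate and tabu tenure are illustrative.

        import random

        def qap_cost(p, flow, dist):
            n = len(p)
            return sum(flow[i][j] * dist[p[i]][p[j]]
                       for i in range(n) for j in range(n))

        def repair(p):
            """Turn a vector with duplicates back into a permutation."""
            missing = iter(set(range(len(p))) - set(p))
            seen = set()
            for i, v in enumerate(p):
                p[i] = v if v not in seen else next(missing)
                seen.add(p[i])
            return p

        def migrate(p, donor, rate=0.3):
            """Immigrate entries from a fitter donor, then repair."""
            return repair([d if random.random() < rate else v
                           for v, d in zip(p, donor)])

        def tabu_search(p, flow, dist, iters=100, tenure=10):
            best, best_cost, tabu = p[:], qap_cost(p, flow, dist), []
            for _ in range(iters):
                i, j = random.sample(range(len(p)), 2)
                if (i, j) in tabu:
                    continue
                p[i], p[j] = p[j], p[i]            # try a swap move
                c = qap_cost(p, flow, dist)
                if c < best_cost:
                    best, best_cost = p[:], c
                else:
                    p[i], p[j] = p[j], p[i]        # undo worsening swap
                tabu = (tabu + [(i, j)])[-tenure:]
            return best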

  14. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome this weakness of the classical BBO algorithm for solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585

  15. Space Weather Action Plan Solar Radio Burst Phase 1 Benchmarks and the Steps to Phase 2

    Science.gov (United States)

    Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Love, J. J.; Pierson, J.

    2017-12-01

    Solar radio bursts, when at the right frequency and when strong enough, can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years, and also for a theoretical maximum intensity benchmark. The benchmark team has developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and Microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived based on previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima requires additional work, where it is even possible, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks, the basis used to derive them, and the limitations of that work. We will also discuss the work that needs to be done to complete the phase 2 benchmarks.

  16. Developing a Benchmarking Process in Perfusion: A Report of the Perfusion Downunder Collaboration

    Science.gov (United States)

    Baker, Robert A.; Newland, Richard F.; Fenton, Carmel; McDonald, Michael; Willcox, Timothy W.; Merry, Alan F.

    2012-01-01

    Abstract: Improving and understanding clinical practice is an appropriate goal for the perfusion community. The Perfusion Downunder Collaboration has established a multi-center perfusion focused database aimed at achieving these goals through the development of quantitative quality indicators for clinical improvement through benchmarking. Data were collected using the Perfusion Downunder Collaboration database from procedures performed in eight Australian and New Zealand cardiac centers between March 2007 and February 2011. At the Perfusion Downunder Meeting in 2010, it was agreed by consensus, to report quality indicators (QI) for glucose level, arterial outlet temperature, and pCO2 management during cardiopulmonary bypass. The values chosen for each QI were: blood glucose ≥4 mmol/L and ≤10 mmol/L; arterial outlet temperature ≤37°C; and arterial blood gas pCO2 ≥ 35 and ≤45 mmHg. The QI data were used to derive benchmarks using the Achievable Benchmark of Care (ABC™) methodology to identify the incidence of QIs at the best performing centers. Five thousand four hundred and sixty-five procedures were evaluated to derive QI and benchmark data. The incidence of the blood glucose QI ranged from 37–96% of procedures, with a benchmark value of 90%. The arterial outlet temperature QI occurred in 16–98% of procedures with the benchmark of 94%; while the arterial pCO2 QI occurred in 21–91%, with the benchmark value of 80%. We have derived QIs and benchmark calculations for the management of several key aspects of cardiopulmonary bypass to provide a platform for improving the quality of perfusion practice. PMID:22730861
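    A hedged sketch of the ABC(TM) idea as commonly described: rank centers by their QI rate, pool the best performers until they cover at least 10% of all procedures, and take the pooled rate as the benchmark. The published method also applies a Bayesian adjustment for small denominators, which is omitted here.

        def abc_benchmark(centres, min_fraction=0.10):
            """centres: list of (numerator, denominator) pairs per centre."""
            total = sum(d for _, d in centres)
            ranked = sorted(centres, key=lambda nd: nd[0] / nd[1], reverse=True)
            num = den = 0
            for n_i, d_i in ranked:
                num += n_i
                den += d_i
                if den >= min_fraction * total:    # enough coverage pooled
                    break
            return num / den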

  17. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells.

    Science.gov (United States)

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-02-24

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424-7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20-30%) extent of Hartree-Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO-LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed.

  18. Benchmarking DFT and semi-empirical methods for a reliable and cost-efficient computational screening of benzofulvene derivatives as donor materials for small-molecule organic solar cells

    International Nuclear Information System (INIS)

    Tortorella, Sara; Talamo, Maurizio Mastropasqua; Cardone, Antonio; Pastore, Mariachiara; De Angelis, Filippo

    2016-01-01

    A systematic computational investigation on the optical properties of a group of novel benzofulvene derivatives (Martinelli 2014 Org. Lett. 16 3424–7), proposed as possible donor materials in small molecule organic photovoltaic (smOPV) devices, is presented. A benchmark evaluation against experimental results on the accuracy of different exchange and correlation functionals and semi-empirical methods in predicting both reliable ground state equilibrium geometries and electronic absorption spectra is carried out. The benchmark of the geometry optimization level indicated that the best agreement with x-ray data is achieved by using the B3LYP functional. Concerning the optical gap prediction, we found that, among the employed functionals, MPW1K provides the most accurate excitation energies over the entire set of benzofulvenes. Similarly reliable results were also obtained for range-separated hybrid functionals (CAM-B3LYP and wB97XD) and for global hybrid methods incorporating a large amount of non-local exchange (M06-2X and M06-HF). Density functional theory (DFT) hybrids with a moderate (about 20–30%) extent of Hartree–Fock exchange (HFexc) (PBE0, B3LYP and M06) were also found to deliver HOMO–LUMO energy gaps which compare well with the experimental absorption maxima, thus representing a valuable alternative for a prompt and predictive estimation of the optical gap. The possibility of using completely semi-empirical approaches (AM1/ZINDO) is also discussed. (paper)

  19. Fast exploration of an optimal path on the multidimensional free energy surface

    Science.gov (United States)

    Chen, Changjun

    2017-01-01

    In a reaction, determination of an optimal path with a high reaction rate (or a low free energy barrier) is important for the study of the reaction mechanism. This is a complicated problem that involves many degrees of freedom. For simple models, one can build an initial path in the collective variable space by interpolation first and then update the whole path repeatedly during the optimization. However, such an interpolation method can be risky in high-dimensional spaces for large molecules. On the path, steric clashes between neighboring atoms could cause extremely high energy barriers and thus make the optimization fail. Moreover, performing simulations for all the snapshots on the path is also time-consuming. In this paper, we build and optimize the path by a growing method on the free energy surface. The method grows a path from the reactant and extends its length in the collective variable space step by step. The growing direction is determined by both the free energy gradient at the end of the path and the direction vector pointing at the product. With fewer snapshots on the path, this strategy lets the path avoid high-energy states during the growing process and saves precious simulation time at each iteration step. Applications show that the presented method is efficient enough to produce optimal paths on either the two-dimensional or the twelve-dimensional free energy surfaces of different small molecules. PMID:28542475
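    A minimal sketch of the growing strategy: each extension step blends the negative free energy gradient at the current path end with the unit vector toward the product. The callable grad_F, the step size and the mixing weight are hypothetical placeholders.

        import numpy as np

        def grow_path(start, product, grad_F, step=0.05, mix=0.5, max_steps=1000):
            path = [np.asarray(start, dtype=float)]
            product = np.asarray(product, dtype=float)
            for _ in range(max_steps):
                x = path[-1]
                to_prod = product - x
                if np.linalg.norm(to_prod) < step:        # reached the product
                    path.append(product)
                    break
                d = -mix * grad_F(x) + (1 - mix) * to_prod / np.linalg.norm(to_prod)
                path.append(x + step * d / np.linalg.norm(d))
            return np.array(path)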

  20. Full sphere hydrodynamic and dynamo benchmarks

    KAUST Repository

    Marti, P.

    2014-01-26

    Convection in planetary cores can generate fluid flow and magnetic fields, and a number of sophisticated codes exist to simulate the dynamic behaviour of such systems. We report on the first community activity to compare numerical results of computer codes designed to calculate fluid flow within a whole sphere. The flows are incompressible and rapidly rotating, and the forcing of the flow is either due to thermal convection or due to moving boundaries. All problems defined have solutions that allow easy comparison, since they are either steady, slowly drifting or perfectly periodic. The first two benchmarks are defined based on uniform internal heating within the sphere under the Boussinesq approximation, with boundary conditions that are uniform in temperature and stress-free for the flow. Benchmark 1 is purely hydrodynamic, and has a drifting solution. Benchmark 2 is a magnetohydrodynamic benchmark that can generate oscillatory, purely periodic, flows and magnetic fields. In contrast, Benchmark 3 is a hydrodynamic rotating bubble benchmark using no-slip boundary conditions that has a stationary solution. Results from a variety of types of code are reported, including codes that are fully spectral (based on spherical harmonic expansions in angular coordinates and polynomial expansions in radius), mixed spectral and finite difference, finite volume, finite element, and also a mixed Fourier-finite element code. There is good agreement between codes. It is found that in Benchmarks 1 and 2, approximating the whole sphere by a spherical shell domain (a sphere possessing an inner core) is not adequate, since the results differ from whole sphere results. © The Authors 2014. Published by Oxford University Press on behalf of The Royal Astronomical Society.

  1. Optimization of the culturing conditions of human umbilical cord blood-derived endothelial colony-forming cells under xeno-free conditions applying a transcriptomic approach

    NARCIS (Netherlands)

    Zeisberger, Steffen M.; Zoller, Stefan; Riegel, Mariluce; Chen, Shuhua; Krenning, Guido; Harmsen, Martin C.; Sachinidis, Agapios; Zisch, Andreas H.

    Establishment of fetal bovine serum (FBS)-free cell culture conditions is essential for transplantation therapies. Blood-derived endothelial colony-forming cells (ECFCs) are potential candidates for regenerative medicine applications. ECFCs were isolated from term umbilical cord blood units and

  2. Introduction to 'International Handbook of Criticality Safety Benchmark Experiments'

    International Nuclear Information System (INIS)

    Komuro, Yuichi

    1998-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) is now an official activity of the Organization for Economic Cooperation and Development-Nuclear Energy Agency (OECD-NEA). 'International Handbook of Criticality Safety Benchmark Experiments' was prepared and is updated year by year by the working group of the project. This handbook contains criticality safety benchmark specifications that have been derived from experiments that were performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculation techniques used. The author briefly introduces the informative handbook and would like to encourage Japanese engineers who are in charge of nuclear criticality safety to use the handbook. (author)

  3. Numerically derived parametrisation of optimal RMP coil phase as a guide to experiments on ASDEX Upgrade

    Science.gov (United States)

    Ryan, D. A.; Liu, Y. Q.; Li, L.; Kirk, A.; Dunne, M.; Dudson, B.; Piovesan, P.; Suttrop, W.; Willensdorfer, M.; the ASDEX Upgrade Team; the EUROfusion MST1 Team

    2017-02-01

    of the parametrisation relative to a plasma response computation, ΔΦ_opt is also computed using MARS-F for a set of benchmarking points. Each benchmarking point consists of a distinct free boundary equilibrium reconstructed from an ASDEX Upgrade RMP experiment, and a set of experimental kinetic profiles and coil currents. Comparing the MARS-F predictions of ΔΦ_opt for these benchmarking points to predictions of the 2D quadratic shows that, relative to a plasma response computation with MARS-F, the 2D quadratic is accurate to 26.5° for n = 1, and 20.6° for n = 2. Potential sources of uncertainty are assessed.

  4. Calculation of Free-Free Opacities

    Science.gov (United States)

    Bhatia, A. K.; Maiden, D.; Ritchie, A. B., Jr.

    2003-01-01

    Free-free absorption is an important contribution to the opacity for radiation transport through hot materials. Temperatures can be as high as several keV, such that it becomes a computational challenge to solve the Schrödinger equation efficiently for rapidly oscillating continuum functions of high angular momenta. Several groups, including ours, have studied the phase amplitude solution (PAS) of the Schrödinger equation, in which one solves equations for the wave function amplitude and phase, which are smooth functions of the electron energy. It is also important to have an accurate Schrödinger benchmark for the development of the PAS method. We present results for dipole matrix elements, Gaunt factors, and cross sections for the absorption of radiation at various energies for Cs XIX at temperature = 100 eV and density = 0.187 g/cc for our newly developed PAS and Schrödinger benchmarks.

  5. Shearlets and Optimally Sparse Approximations

    DEFF Research Database (Denmark)

    Kutyniok, Gitta; Lemvig, Jakob; Lim, Wang-Q

    2012-01-01

    Multivariate functions are typically governed by anisotropic features such as edges in images or shock fronts in solutions of transport-dominated equations. One major goal both for the purpose of compression as well as for an efficient analysis is the provision of optimally sparse approximations...... optimally sparse approximations of this model class in 2D as well as 3D. Even more, in contrast to all other directional representation systems, a theory for compactly supported shearlet frames was derived which moreover also satisfy this optimality benchmark. This chapter shall serve as an introduction...... to and a survey about sparse approximations of cartoon-like images by band-limited and also compactly supported shearlet frames as well as a reference for the state-of-the-art of this research field....

  6. Optimal trajectory planning of free-floating space manipulator using differential evolution algorithm

    Science.gov (United States)

    Wang, Mingming; Luo, Jianjun; Fang, Jing; Yuan, Jianping

    2018-03-01

    The existence of path-dependent dynamic singularities limits the volume of the available workspace of a free-floating space robot and induces enormous joint velocities when such singularities are met. In order to overcome this drawback, this paper presents an optimal joint trajectory planning method using the forward kinematics equations of the free-floating space robot, while joint motion laws are delineated by applying the concept of the reaction null-space. Bézier curves, in conjunction with the null-space column vectors, are applied to describe the joint trajectories. Considering the forward kinematics equations of the free-floating space robot, the trajectory planning issue is consequently transformed into an optimization issue in which the control points constructing the Bézier curve are the design variables. A constrained differential evolution (DE) scheme with a premature-handling strategy is implemented to find the optimal values of the design variables while specific objectives and imposed constraints are satisfied. Unlike traditional methods, we synthesize the null space and a specialized curve to provide a novel viewpoint for trajectory planning of free-floating space robots. Simulation results are presented for the trajectory planning of a 7-degree-of-freedom (DOF) kinematically redundant manipulator mounted on a free-floating spacecraft and demonstrate the feasibility and effectiveness of the proposed method.
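    A simplified sketch of the two ingredients named in the abstract: a Bézier curve evaluated by de Casteljau's algorithm, and a plain differential evolution loop over its control points. The cost callable, bounds, constraint handling and reaction null-space terms of the paper are not reproduced.

        import numpy as np

        def bezier(ctrl, t):
            """de Casteljau evaluation of a 1-D Bezier curve at parameters t."""
            pts = [np.full_like(t, c) for c in ctrl]
            while len(pts) > 1:
                pts = [(1 - t) * a + t * b for a, b in zip(pts[:-1], pts[1:])]
            return pts[0]

        def de_optimize(cost, dim, pop=20, gens=200, F=0.7, CR=0.9):
            """Minimize cost(x) over control-point vectors x of length dim."""
            rng = np.random.default_rng(0)
            X = rng.uniform(-1.0, 1.0, (pop, dim))
            f = np.array([cost(x) for x in X])
            for _ in range(gens):
                for i in range(pop):
                    a, b, c = X[rng.choice(pop, 3, replace=False)]
                    trial = np.where(rng.random(dim) < CR, a + F * (b - c), X[i])
                    ft = cost(trial)
                    if ft < f[i]:                  # greedy selection
                        X[i], f[i] = trial, ft
            return X[f.argmin()]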

  7. Two-dimensional optimization of free-electron-laser designs

    Science.gov (United States)

    Prosnitz, D.; Haas, R.A.

    1982-05-04

    Off-axis, two-dimensional designs for free electron lasers are described that maintain correspondence of a light beam with a synchronous electron at an optimal transverse radius r > 0 to achieve increased beam trapping efficiency and enhanced laser beam wavefront control so as to decrease optical beam diffraction and other deleterious effects.

  8. An opposition-based harmony search algorithm for engineering optimization problems

    Directory of Open Access Journals (Sweden)

    Abhik Banerjee

    2014-03-01

    Full Text Available Harmony search (HS) is a derivative-free real parameter optimization algorithm. It draws inspiration from the musical improvisation process of searching for a perfect state of harmony. The proposed opposition-based HS (OHS) of the present work employs opposition-based learning for harmony memory initialization and also for generation jumping. The concept of the opposite number is utilized in OHS to improve the convergence rate of the HS algorithm. The potential of the proposed algorithm is assessed by means of an extensive comparative study of the numerical results on sixteen benchmark test functions. Additionally, the effectiveness of the proposed algorithm is tested for reactive power compensation of an autonomous power system. For real-time reactive power compensation of the studied model, Takagi-Sugeno fuzzy logic (TSFL) is employed. Time-domain simulation reveals that the proposed OHS-TSFL yields on-line, off-nominal model parameters, resulting in real-time incremental change in the terminal voltage response profile.
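    A minimal sketch of the opposition-based ingredient: the opposite of a point x in [lo, hi] is lo + hi - x, and the better member of each pair is kept at initialization; generation jumping applies the same idea occasionally during the run. The objective f is a placeholder.

        import numpy as np

        def opposition_init(f, lo, hi, size, dim, rng=None):
            """Opposition-based initialization of a harmony memory."""
            rng = rng or np.random.default_rng(0)
            X = rng.uniform(lo, hi, (size, dim))
            O = lo + hi - X                       # opposite harmonies
            both = np.vstack([X, O])
            fit = np.array([f(x) for x in both])
            return both[np.argsort(fit)[:size]]   # keep the best half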

  9. Radiological protection optimization using derivatives

    International Nuclear Information System (INIS)

    Freitas Acosta Perez, C. de; Sordi, G.M.A.A.

    2006-01-01

    The aim of this paper is to provide a different approach to the integral cost-benefit and extended cost-benefit analyses used as decision-aiding techniques. In ICRP publication 55 the annual protection cost is envisaged as a set of points, each of them representing an option, linked by straight lines. The detriment cost function is considered a linear function whose slope is determined by the alpha value. In this paper the uranium mine example considered in ICRP publication 55 was used, but a fitted curve was introduced both in the integral cost-benefit analysis and in the extended cost-benefit analysis, to which the individual dose distribution attribute is added. The result was obtained using derivatives. Evaluating the detriment cost, Y, is not necessary because the alpha value is known: the derivative dY/dS is the alpha value itself, and so attention is directed to the derivative -dX/dS at the points that, together with the alpha value, identify the optimum option. The results make clear that the prevailing factor in the selection of the optimum option is the alpha value imputed, and thus a single alpha value, as suggested now, probably has little efficiency in the optimization process. Obtaining a curve for the alpha value and using the derivative technique introduced in this paper, the analytical solution is more convenient and reliable than the one used now. (authors)
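    Assuming the usual ICRP cost-benefit notation (X the protection cost as a function of the collective dose S, and detriment cost Y = alpha*S), the optimality condition the paper exploits can be stated compactly:

        % Optimum of the total cost X(S) + Y(S) with Y = \alpha S:
        \[
          \frac{d}{dS}\bigl(X(S) + Y(S)\bigr) = 0
          \quad\Longrightarrow\quad
          \frac{dX}{dS} = -\frac{dY}{dS} = -\alpha ,
        \]
        % i.e. the optimum option lies where the fitted protection-cost
        % curve has slope $-\alpha$.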

  10. Toxicological benchmarks for screening potential contaminants of concern for effects on terrestrial plants: 1994 revision

    International Nuclear Information System (INIS)

    Will, M.E.; Suter, G.W. II.

    1994-09-01

    One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern

  11. Toxicological benchmarks for screening potential contaminants of concern for effects on terrestrial plants: 1994 revision

    Energy Technology Data Exchange (ETDEWEB)

    Will, M.E.; Suter, G.W. II

    1994-09-01

    One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.

  12. Toxicological benchmarks for screening potential contaminants of concern for effects on aquatic biota: 1996 revision

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II [Oak Ridge National Lab., TN (United States); Tsao, C.L. [Duke Univ., Durham, NC (United States). School of the Environment

    1996-06-01

    This report presents potential screening benchmarks for the protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. This report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility. Also included are updates of benchmark values where appropriate, new benchmark values, replacement of secondary sources by primary sources, and more complete documentation of the sources and derivation of all values.

  13. On benchmarking Stochastic Global Optimization Algorithms

    NARCIS (Netherlands)

    Hendrix, E.M.T.; Lancinskas, A.

    2015-01-01

    A multitude of heuristic stochastic optimization algorithms have been described in literature to obtain good solutions of the box-constrained global optimization problem often with a limit on the number of used function evaluations. In the larger question of which algorithms behave well on which

  14. Quality management benchmarking: FDA compliance in pharmaceutical industry.

    Science.gov (United States)

    Jochem, Roland; Landgraf, Katja

    2010-01-01

    By analyzing and comparing industry and business best practices, processes can be optimized and become more successful, mainly because efficiency and competitiveness increase. This paper aims to illustrate this with some examples. Case studies are used to show knowledge exchange in the pharmaceutical industry. Best-practice solutions were identified in two companies using a benchmarking method and a five-stage model. Despite large administrations, there is much potential regarding business process organization. This project makes it possible for participants to fully understand their business processes. The benchmarking method gives an opportunity to critically analyze value chains (a string of companies or players working together to satisfy market demands for a special product). Knowledge exchange is interesting for companies that want to be global players. Benchmarking supports information exchange and improves competitive ability between different enterprises. Findings suggest that the five-stage model improves efficiency and effectiveness. Furthermore, the model increases the chances of reaching targets. The method gives security to partners that did not have benchmarking experience. The study identifies new quality management procedures. Process management, and especially benchmarking, is shown to support pharmaceutical industry improvements.

  15. Smallest-Small-World Cellular Harmony Search for Optimization of Unconstrained Benchmark Problems

    Directory of Open Access Journals (Sweden)

    Sung Soo Im

    2013-01-01

    We present a new hybrid method that combines cellular harmony search algorithms with the Smallest-Small-World theory. A harmony search (HS) algorithm is based on musical performance processes that occur when a musician searches for a better state of harmony. Harmony search has been applied successfully to a wide variety of practical optimization problems. Most previous research has sought to improve the performance of the HS algorithm by changing the pitch adjusting rate and the harmony memory considering rate. However, few studies have tried to improve performance through the formation of population structures. We therefore propose an improved HS algorithm that uses a cellular automata formation and the topological structure of a Smallest-Small-World network. The improved HS algorithm has a high clustering coefficient and a short characteristic path length, giving it good exploration and exploitation efficiency. Nine benchmark functions were used to evaluate the performance of the proposed algorithm. Unlike existing improved HS algorithms, the proposed algorithm is expected to gain algorithmic efficiency from the formation of the population structure.
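
    For orientation, here is a minimal sketch of the canonical HS step that this record builds on (Python; hmcr, par, and bw are the usual HS hyper-parameters). The cellular/small-world population structure described above is not reproduced here.

```python
import random

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    """Canonical harmony search: minimize f over box constraints."""
    dim = len(bounds)
    # Harmony memory: a population of candidate solutions.
    hm = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:            # memory consideration
                val = random.choice(hm)[d]
                if random.random() < par:         # pitch adjustment
                    val += bw * (hi - lo) * random.uniform(-1.0, 1.0)
            else:                                 # random selection
                val = random.uniform(lo, hi)
            new.append(min(max(val, lo), hi))     # clamp to the box
        worst = max(range(hms), key=scores.__getitem__)
        s = f(new)
        if s < scores[worst]:                     # replace the worst harmony
            hm[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return hm[best], scores[best]
```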

  16. Sieve of Eratosthenes benchmarks for the Z8 FORTH microcontroller

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, R.

    1989-02-01

    This report presents benchmarks for the Z8 FORTH microcontroller system that ORNL uses extensively in proving concepts and developing prototype test equipment for the Smart House Project. The results are based on the sieve of Eratosthenes algorithm, a calculation used extensively to rate computer systems and programming languages. Three benchmark refinements are presented, each showing how the execution speed of a FORTH program can be improved by use of a particular optimization technique. The last version of the FORTH benchmark shows that optimization is worth the effort: It executes 20 times faster than the Gilbreaths' widely-published FORTH benchmark program. The National Association of Home Builders Smart House Project is a cooperative research and development effort being undertaken by American home builders and a number of major corporations serving the home building industry. The major goal of the project is to help the participating organizations incorporate advanced technology in communications, energy distribution, and appliance control products for American homes. This information is provided to help project participants use the Z8 FORTH prototyping microcontroller in developing Smart House concepts and equipment. The discussion is technical in nature and assumes some experience with microcontroller devices and the techniques used to develop software for them. 7 refs., 5 tabs.
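
    The report's FORTH listings and its FORTH-specific refinements are not reproduced here; as a language-neutral sketch of the algorithm being benchmarked, a basic Python sieve:

```python
def sieve_count(limit):
    """Classic sieve of Eratosthenes: count the primes below `limit`."""
    if limit < 3:
        return 0
    is_prime = bytearray([1]) * limit          # one flag per candidate
    is_prime[0] = is_prime[1] = 0
    for i in range(2, int(limit ** 0.5) + 1):
        if is_prime[i]:
            # Strike out multiples of i, starting at i*i.
            is_prime[i * i::i] = bytes(len(range(i * i, limit, i)))
    return sum(is_prime)

print(sieve_count(100))   # -> 25 primes below 100
```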

  17. Development of Multivariable Models to Predict and Benchmark Transfusion in Elective Surgery Supporting Patient Blood Management.

    Science.gov (United States)

    Hayn, Dieter; Kreiner, Karl; Ebner, Hubert; Kastner, Peter; Breznik, Nada; Rzepka, Angelika; Hofmann, Axel; Gombotz, Hans; Schreier, Günter

    2017-06-14

    Blood transfusion is a highly prevalent procedure in hospitalized patients, and in some clinical scenarios it has lifesaving potential. In most cases, however, transfusion is administered to hemodynamically stable patients with no benefit, but with increased odds of adverse patient outcomes and substantial direct and indirect cost. Therefore, the concept of Patient Blood Management has increasingly gained importance to pre-empt and reduce transfusion and to identify the optimal transfusion volume for an individual patient when transfusion is indicated. Our aim was to describe how predictive modeling and machine learning tools applied to pre-operative data can be used to predict the amount of red blood cells to be transfused during surgery and to prospectively optimize blood ordering schedules. In addition, the data derived from the predictive models should be used to benchmark different hospitals concerning their blood transfusion patterns. 6,530 case records obtained for elective surgeries from 16 centers taking part in two studies conducted in 2004-2005 and 2009-2010 were analyzed. Transfused red blood cell volume was predicted using random forests. Separate models were trained on the overall data, for each center, and for each of the two studies. Important characteristics of the different models were compared with one another. Our results indicate that predictive modeling applied prior to surgery can predict the transfused volume of red blood cells more accurately (correlation coefficient cc = 0.61) than state-of-the-art algorithms (cc = 0.39). We found significantly different patterns of feature importance (a) in different hospitals and (b) between study 1 and study 2. We conclude that predictive modeling can be used to benchmark the importance of different features on models derived from data of different hospitals. This might help to optimize crucial processes in a specific hospital, even in scenarios beyond Patient Blood Management.
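
    The modeling step described above maps naturally onto a standard random-forest regression workflow. A minimal sketch on synthetic stand-in data (the study's actual pre-operative features are not public here; the feature count, effect sizes, and evaluation split below are invented for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for pre-operative features (e.g. age, hemoglobin, ...).
rng = np.random.default_rng(0)
X = rng.normal(size=(6530, 8))
y = np.clip(-0.8 * X[:, 1] + rng.normal(size=6530), 0, None)  # fake RBC volume

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# Predictive accuracy as a correlation coefficient (cf. the cc values above).
print(np.corrcoef(model.predict(X_te), y_te)[0, 1])
# Per-feature importances: the quantity compared across hospitals/studies.
print(model.feature_importances_)
```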

  18. Self-benchmarking Guide for Cleanrooms: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Sartor, Dale; Tschudi, William

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in cleanrooms. This guide is primarily intended for personnel who have responsibility for managing energy use in existing cleanroom facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, cleanroom planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including cleanroom designers and energy managers.

  19. Storage-Intensive Supercomputing Benchmark Study

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, J; Dossa, D; Gokhale, M; Hysom, D; May, J; Pearce, R; Yoo, A

    2007-10-30

    Critical data science applications requiring frequent access to storage perform poorly on today's computing architectures. This project addresses efficient computation of data-intensive problems in national security and basic science by exploring, advancing, and applying a new form of computing called storage-intensive supercomputing (SISC). Our goal is to enable applications that simply cannot run on current systems and, for a broad range of data-intensive problems, to deliver an order of magnitude improvement in price/performance over today's data-intensive architectures. This technical report documents much of the work done under LDRD 07-ERD-063 Storage Intensive Supercomputing during the period 05/07-09/07. The following chapters describe: (1) a new file I/O monitoring tool, iotrace, developed to capture the dynamic I/O profiles of Linux processes; (2) an out-of-core graph benchmark for level-set expansion of scale-free graphs; (3) an entity extraction benchmark consisting of a pipeline of eight components; and (4) an image resampling benchmark drawn from the SWarp program in the LSST data processing pipeline. The performance of the graph and entity extraction benchmarks was measured in three different scenarios: data sets residing on the NFS file server and accessed over the network; data sets stored on local disk; and data sets stored on the Fusion I/O parallel NAND Flash array. The image resampling benchmark compared the performance of software-only and GPU-accelerated implementations. In addition to the work reported here, an additional text processing application was developed that used an FPGA to accelerate n-gram profiling for language classification. The n-gram application will be presented at SC07 at the High Performance Reconfigurable Computing Technologies and Applications Workshop. The graph and entity extraction benchmarks were run on a Supermicro server housing the NAND Flash 40GB parallel disk array, the Fusion-io. The Fusion system specs are as follows ...

  20. Experimental Determination of Third Derivative of the Gibbs Free Energy, G II

    DEFF Research Database (Denmark)

    Koga, Yoshikata; Westh, Peter; Inaba, Akira

    2010-01-01

    We have been evaluating third derivative quantities of the Gibbs free energy, G, by graphically differentiating the second derivatives that are accessible experimentally, and demonstrated their power in elucidating the mixing schemes in aqueous solutions. Here we determine directly one of the third...

  1. Support-free interior carving for 3D printing

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2017-03-01

    Recent interior carving methods for functional design necessitate a cumbersome cut-and-glue process in fabrication. We propose a method to generate interior voids which not only satisfy the functional purposes but are also support-free during the 3D printing process. We introduce a support-free unit structure for voxelization and derive a wall-thickness parametrization for continuous optimization. We also design a discrete dithering algorithm to ensure the printability of ghost voxels. The interior voids are iteratively carved by alternating the optimization and dithering. We apply our method to optimize static and rotational stability, and print various results to evaluate its efficacy. Keywords: Interior carving, Support-free, Voxels dithering, Shape optimization, 3D printing

  2. Performance of Multi-chaotic PSO on a shifted benchmark functions set

    Energy Technology Data Exchange (ETDEWEB)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan [Tomas Bata University in Zlín, Faculty of Applied Informatics Department of Informatics and Artificial Intelligence nám. T.G. Masaryka 5555, 760 01 Zlín (Czech Republic)

    2015-03-10

    In this paper the performance of the Multi-chaotic PSO algorithm is investigated using two shifted benchmark functions. The purpose of shifted benchmark functions is to simulate time-variant real-world problems. The results of the multi-chaotic PSO are compared with the canonical version of the algorithm. It is concluded that the multi-chaotic approach can lead to better results in the optimization of shifted functions.

  3. Performance of Multi-chaotic PSO on a shifted benchmark functions set

    International Nuclear Information System (INIS)

    Pluhacek, Michal; Senkerik, Roman; Zelinka, Ivan

    2015-01-01

    In this paper the performance of the Multi-chaotic PSO algorithm is investigated using two shifted benchmark functions. The purpose of shifted benchmark functions is to simulate time-variant real-world problems. The results of the multi-chaotic PSO are compared with the canonical version of the algorithm. It is concluded that the multi-chaotic approach can lead to better results in the optimization of shifted functions.
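
    As a concrete illustration of a shifted benchmark function (not taken from the two records above): the sphere function with a displaced optimum. Re-drawing the shift vector between runs mimics the time-variant behaviour the records describe.

```python
import numpy as np

def shifted_sphere(x, shift):
    """Sphere function with a shifted optimum: f(x) = ||x - o||^2."""
    return float(np.sum((np.asarray(x) - shift) ** 2))

rng = np.random.default_rng(1)
dim = 10
shift = rng.uniform(-50.0, 50.0, size=dim)   # unknown optimum location o
print(shifted_sphere(np.zeros(dim), shift))  # large value at the origin
print(shifted_sphere(shift, shift))          # 0.0 at the shifted optimum
```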

  4. Optimal Order Strategy in Uncertain Demands with Free Shipping Option

    Directory of Open Access Journals (Sweden)

    Qing-Chun Meng

    2014-01-01

    Conditional free shipping has become one of the most effective marketing tools; more and more companies, especially e-business companies, prefer to offer free shipping to buyers whenever their orders exceed a specified minimum quantity. In practice, however, the demands of buyers are uncertain, being affected by weather, season, and many other factors. We first model the centralized ordering problem of retailers who face stochastic demands when suppliers offer free shipping, in which only limited distributional information, such as the known mean, support, and some deviation measures of the random data, is needed. Then, based on the linear decision rule commonly used in stochastic programming, we analyze the optimal order strategies of retailers and discuss the approximate solution. Further, we present the core allocation among all retailers via duality and cooperative game theory. The existence of the core shows that each retailer is willing to cooperate with others in the centralized problem. Finally, a numerical example is implemented to discuss how uncertain data and parameters affect the optimal solution.

  5. Application of an improved PSO algorithm to optimal tuning of PID gains for water turbine governor

    International Nuclear Information System (INIS)

    Fang Hongqing; Chen Long; Shen Zuyi

    2011-01-01

    In this paper, an improved particle swarm optimization (IPSO) algorithm is proposed. Besides the individual best position and the global best position, a nominal average position of the swarm is introduced in IPSO. The performance of IPSO is compared to that of different PSO variants on five well-known benchmark functions. The experimental results show that the proposed IPSO algorithm improves searching performance on the benchmark functions. IPSO, as well as the other PSO variants, is then applied to the optimal tuning of Proportional-Integral-Derivative (PID) gains for a typical PID control system of a water turbine governor. Computer simulation results for an actual hydro power plant in China show that the IPSO algorithm has stable convergence characteristics and good computational ability, and that it is an effective and easily implemented method for the optimal tuning of PID gains of water turbine governors.
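
    For orientation, a sketch of a PSO velocity update extended with an attraction toward the swarm's average position, in the spirit of the record above. The coefficient c3 and the exact form of the extra term are illustrative assumptions, not the paper's IPSO rule.

```python
import numpy as np

def ipso_velocity(v, x, pbest, gbest, swarm_mean, w=0.7, c1=1.5, c2=1.5, c3=0.5):
    """One velocity update for a single particle; all array arguments are
    numpy arrays of equal shape."""
    r1, r2, r3 = np.random.rand(3)
    return (w * v
            + c1 * r1 * (pbest - x)         # cognitive pull toward personal best
            + c2 * r2 * (gbest - x)         # social pull toward global best
            + c3 * r3 * (swarm_mean - x))   # pull toward the swarm's average position
```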

  6. International handbook of evaluated criticality safety benchmark experiments

    International Nuclear Information System (INIS)

    2010-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Organization for Economic Cooperation and Development - Nuclear Energy Agency (OECD-NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirements and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span over 55,000 pages and contain 516 evaluations with benchmark specifications for 4,405 critical, near critical, or subcritical configurations, 24 criticality alarm placement / shielding configurations with multiple dose points for each, and 200 configurations that have been categorized as fundamental physics measurements that are relevant to criticality safety applications. Experiments that are found unacceptable for use as criticality safety benchmark experiments are discussed in these evaluations; however, benchmark specifications are not derived for such experiments (in some cases models are provided in an appendix). Approximately 770 experimental configurations are categorized as unacceptable for use as criticality safety benchmark experiments. Additional evaluations are in progress and will be ...

  7. Rotor design optimization using a free wake analysis

    Science.gov (United States)

    Quackenbush, Todd R.; Boschitsch, Alexander H.; Wachspress, Daniel A.; Chua, Kiat

    1993-01-01

    The aim of this effort was to develop a comprehensive performance optimization capability for tiltrotor and helicopter blades. The analysis incorporates the validated EHPIC (Evaluation of Hover Performance using Influence Coefficients) model of helicopter rotor aerodynamics within a general linear/quadratic programming algorithm that allows optimization using a variety of objective functions involving the performance. The resulting computer code, EHPIC/HERO (HElicopter Rotor Optimization), improves upon several features of the previous EHPIC performance model and allows optimization utilizing a wide spectrum of design variables, including twist, chord, anhedral, and sweep. The new analysis supports optimization of a variety of objective functions, including weighted measures of rotor thrust, power, and propulsive efficiency. The fundamental strength of the approach is that an efficient search for improved versions of the baseline design can be carried out while retaining the demonstrated accuracy inherent in the EHPIC free wake/vortex lattice performance analysis. Sample problems are described that demonstrate the success of this approach for several representative rotor configurations in hover and axial flight. Features that were introduced to convert earlier demonstration versions of this analysis into a generally applicable tool for researchers and designers are also discussed.

  8. Optimization of mixed quantum-classical dynamics: Time-derivative coupling terms and selected couplings

    International Nuclear Information System (INIS)

    Pittner, Jiri; Lischka, Hans; Barbatti, Mario

    2009-01-01

    The use of time-derivative non-adiabatic coupling terms and partially coupled time-dependent equations is investigated to accelerate non-adiabatic dynamics simulations at the multireference configuration interaction (MRCI) level. The quality of the results and the computational costs are compared against non-adiabatic benchmark dynamics calculations using non-adiabatic coupling vectors. In the comparison between the time-derivative couplings and coupling vectors, deviations in the adiabatic population of individual trajectories were observed in regions of rapid variation of the coupling terms. They, however, affected the average adiabatic population by only about 5%. For small multiconfiguration spaces, dynamics with time-derivative couplings are significantly faster than those with coupling vectors. This relation inverts for larger configuration spaces. The use of the partially coupled equations approach speeds up the simulations significantly while keeping the deviations in the population below a few percent. Imidazole and the methaniminium cation are used as test examples.

  9. Towards benchmarking an in-stream water quality model

    Directory of Open Access Journals (Sweden)

    2007-01-01

    A method of model evaluation is presented which utilises a comparison with a benchmark model. The proposed benchmarking concept is one that can be applied to many hydrological models but, in this instance, is implemented in the context of an in-stream water quality model. The benchmark model is defined in such a way that it is easily implemented within the framework of the test model, i.e. the approach relies on two applications of the same model code rather than the application of two separate model codes. This is illustrated using two case studies from the UK, the Rivers Aire and Ouse, with the objective of simulating a water quality classification, general quality assessment (GQA, which is based on dissolved oxygen, biochemical oxygen demand and ammonium. Comparisons between the benchmark and test models are made based on GQA, as well as a step-wise assessment against the components required in its derivation. The benchmarking process yields a great deal of important information about the performance of the test model and raises issues about a priori definition of the assessment criteria.

  10. Worst-Case Investment and Reinsurance Optimization for an Insurer under Model Uncertainty

    Directory of Open Access Journals (Sweden)

    Xiangbo Meng

    2016-01-01

    In this paper, we study optimal investment-reinsurance strategies for an insurer who faces model uncertainty. The insurer is allowed to acquire new business and invest into a financial market which consists of one risk-free asset and one risky asset whose price process is modeled by a geometric Brownian motion. Minimizing the expected quadratic distance of the terminal wealth to a given benchmark under the "worst-case" scenario, we obtain the closed-form expressions of the optimal strategies and the corresponding value function by solving the Hamilton-Jacobi-Bellman (HJB) equation. Numerical examples are presented to show the impact of model parameters on the optimal strategies.
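
    In symbols, the criterion described above can be sketched as follows (notation assumed here, not taken from the paper): over admissible strategies pi and a set Q of candidate models, the insurer solves

```latex
\min_{\pi}\ \sup_{Q \in \mathcal{Q}}\ \mathbb{E}^{Q}\!\left[\left(X_T^{\pi} - B\right)^{2}\right]
```

    where X_T is terminal wealth under strategy pi and B is the given benchmark.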

  11. Benchmarking Data Sets for the Evaluation of Virtual Ligand Screening Methods: Review and Perspectives.

    Science.gov (United States)

    Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu

    2015-07-27

    Virtual screening methods are commonly used nowadays in drug discovery processes. However, to ensure their reliability, they have to be carefully evaluated. The evaluation of these methods is usually performed retrospectively, notably by studying the enrichment of benchmarking data sets. To this end, numerous benchmarking data sets have been developed over the years, and the resulting improvements have led to the availability of high-quality benchmarking data sets. However, some points still have to be considered in the selection of the active compounds, decoys, and protein structures to obtain optimal benchmarking data sets.

  12. Free-flight odor tracking in Drosophila is consistent with an optimal intermittent scale-free search.

    Directory of Open Access Journals (Sweden)

    Andy M Reynolds

    2007-04-01

    During their trajectories in still air, fruit flies (Drosophila melanogaster) explore their landscape using a series of straight flight paths punctuated by rapid 90-degree body saccades [1]. Some saccades are triggered by visual expansion associated with collision avoidance. Yet many saccades are not triggered by visual cues but rather appear spontaneously. Our analysis reveals that the control of these visually independent saccades, and the flight intervals between them, constitutes an optimal scale-free active searching strategy. Two characteristics of mathematical optimality apparent during free flight in Drosophila are inter-saccade interval lengths distributed according to an inverse square law, which does not vary across landscape scale, and 90-degree saccade angles, which increase the likelihood that territory will be revisited and thereby reduce the likelihood that nearby targets will be missed. We also show that searching is intermittent, such that active searching phases randomly alternate with relocation phases. Behaviorally, this intermittency is reflected in frequently occurring short, slow-speed inter-saccade intervals randomly alternating with rarer, longer, faster inter-saccade intervals. Searching patterns that scale similarly across orders of magnitude of length (i.e., scale-free) have been revealed in animals as diverse as microzooplankton, bumblebees, albatrosses, and spider monkeys, but these do not appear to be optimised with respect to turning angle, whereas Drosophila free-flight search does. Also, intermittent searching patterns, such as those reported here for Drosophila, have been observed in foragers such as planktivorous fish and ground-foraging birds. Our results with freely flying Drosophila may constitute the first reported example of searching behaviour that is both scale-free and intermittent.
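
    The inverse square law mentioned above admits a simple inverse-CDF sampler: for p(l) proportional to l^-2 with l >= l_min, the survival function is l_min/l, so L = l_min/U with U uniform on (0,1]. A small illustrative sketch (the cutoff l_min is an assumption for demonstration, not a measured quantity from the paper):

```python
import random

def sample_intersaccade_interval(l_min=0.1):
    """Draw an interval length from p(l) ~ l^-2, l >= l_min, by inverse-CDF
    sampling; 1 - random() lies in (0, 1], avoiding division by zero."""
    return l_min / (1.0 - random.random())

lengths = [sample_intersaccade_interval() for _ in range(10_000)]
# Scale-free property: the distribution of L / l_min does not depend on l_min.
print(min(lengths), sorted(lengths)[len(lengths) // 2])
```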

  13. Benchmark and parametric study of a passive flow controller (fluidic device) for the development of optimal designs using a CFD code

    International Nuclear Information System (INIS)

    Lim, Sang-Gyu; Lee, Seok-Ho; Kim, Han-Gon

    2010-01-01

    A passive flow controller, or fluidic device (FD), is used in a safety injection system (SIS) for efficient use of nuclear reactor emergency cooling water, since it can control the injection flow rate in a passive and optimal way. The performance of the FD is represented by the pressure loss coefficient (K-factor), which is affected by the configuration of components such as the control port direction and the nozzle angle. The flow control mechanism, which varies according to the water level inside the vortex chamber, determines the duration of the safety injection. This paper deals with a computational fluid dynamics (CFD) analysis for simulating the flow characteristics of the FD using ANSYS CFX 11.0. The CFD analysis is benchmarked against existing experimental data to establish its applicability to the prediction of FD performance in terms of the K-factor. The CFD calculation is implemented with the Shear Stress Transport (SST) model to capture the swirling flow and strong streamline curvature in the vortex chamber of the FD while maintaining numerical efficiency. Based on the benchmark results, parametric analyses are performed for an optimal design of the FD by varying the control port direction and the nozzle angle. Consequently, the FD performance is enhanced according to the angle of the control port nozzle.

  14. Benchmarking motion planning algorithms for bin-picking applications

    DEFF Research Database (Denmark)

    Iversen, Thomas Fridolin; Ellekilde, Lars-Peter

    2017-01-01

    Purpose - For robot motion planning there exists a large number of different algorithms, each appropriate for a certain domain, and the right choice of planner depends on the specific use case. The purpose of this paper is to consider the application of bin picking and benchmark a set of motion planning algorithms to identify which are most suited in the given context. Design/methodology/approach - The paper presents a selection of motion planning algorithms and defines benchmarks based on three different bin-picking scenarios. The evaluation is done based on a fixed set of tasks, which are planned and executed on a real and a simulated robot. Findings - The benchmarking shows a clear difference between the planners and generally indicates that algorithms integrating optimization, despite longer planning time, perform better due to a faster execution. Originality/value - The originality of this work lies ...

  15. Global Optimization Based on the Hybridization of Harmony Search and Particle Swarm Optimization Methods

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2014-01-01

    We consider a class of stochastic search algorithms for global optimization which in various publications are called behavioural, intellectual, metaheuristic, nature-inspired, swarm, multi-agent, population, etc. We use the last term. Experience in using population algorithms to solve challenging global optimization problems shows that the application of a single such algorithm may not always be effective. Hence great attention is now paid to the hybridization of population algorithms for global optimization. Hybrid algorithms unite various algorithms, or identical algorithms with different values of their free parameters, so that the efficiency of one algorithm can compensate for the weakness of another. The purposes of this work are the development of a hybrid algorithm for global optimization based on the known harmony search (HS) and particle swarm optimization (PSO) algorithms, a software implementation of the algorithm, and a study of its efficiency on a number of known benchmark problems and on a problem of dimensional optimization of a truss structure. We state the global optimization problem, consider the basic HS and PSO algorithms, give a flow chart of the proposed hybrid algorithm, called PSO-HS, present the results of computational experiments with the developed algorithm and software, and formulate the main results of the work and prospects for its development.

  16. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    Science.gov (United States)

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as both an opportunity and a risk for clinical workflows, health IT must undergo continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means of providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of similar size and ownership drawn from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for new IT projects.

  17. Benchmarking local healthcare-associated infections: Available benchmarks and interpretation challenges

    Directory of Open Access Journals (Sweden)

    Aiman El-Saed

    2013-10-01

    Growing numbers of healthcare facilities are routinely collecting standardized data on healthcare-associated infection (HAI), which can be used not only to track internal performance but also to compare local data to national and international benchmarks. Benchmarking overall (crude) HAI surveillance metrics without accounting or adjusting for potential confounders can result in misleading conclusions. Methods commonly used to provide risk-adjusted metrics include multivariate logistic regression analysis, stratification, indirect standardization, and restrictions. The characteristics of recognized benchmarks worldwide, including their advantages and limitations, are described. The choice of the right benchmark for the data from the Gulf Cooperation Council (GCC) states is challenging. The chosen benchmark should have similar data collection and presentation methods. Additionally, differences in surveillance environments, including regulations, should be taken into consideration when considering such a benchmark. The GCC center for infection control took some steps to unify HAI surveillance systems in the region. GCC hospitals still need to overcome legislative and logistic difficulties in sharing data to create their own benchmark. The availability of a regional GCC benchmark may better enable health care workers and researchers to obtain more accurate and realistic comparisons. Keywords: Benchmarking, Comparison, Surveillance, Healthcare-associated infections

  18. A Derivative Method with Free Radical Oxidation to Predict Resveratrol Metabolites by Tandem Mass Spectrometry.

    Science.gov (United States)

    Liu, Wangta; Shiue, Yow-Ling; Lin, Yi-Reng; Lin, Hugo You-Hsien; Liang, Shih-Shin

    2015-10-01

    In this study, we demonstrate an oxidative method using free radicals to generate 3,5,4'-trihydroxy-trans-stilbene (trans-resveratrol) metabolites and detect them sequentially with an autosampler coupled to a liquid chromatography electrospray ionization tandem mass spectrometer (LC-ESI-MS/MS). In this oxidative method, the free radical initiator, ammonium persulfate (APS), is placed in a sample bottle containing resveratrol to produce oxidative derivatives, and the reaction progress is tracked by autosampler sequencing. Resveratrol, a natural product with purported cancer-preventative qualities, produces metabolites including dihydroresveratrol, 3,4'-dihydroxy-trans-stilbene, lunularin, resveratrol monosulfate, and dihydroresveratrol monosulfate by free radical oxidation. Using the APS free radical, the concentrations of resveratrol derivatives differ as a function of time. Besides being simple, convenient, and time- and labor-saving, the free radical oxidative method generates oxidative derivatives in situ, which, followed by LC-ESI-MS/MS, can be used to evaluate different metabolites under various conditions.

  19. Solving the Traveling Salesman’s Problem Using the African Buffalo Optimization

    Directory of Open Access Journals (Sweden)

    Julius Beneoluchi Odili

    2016-01-01

    This paper proposes the African Buffalo Optimization (ABO), a new metaheuristic algorithm derived from careful observation of African buffalos, a species of wild cows, in the African forests and savannahs. This animal displays uncommon intelligence, strategic organizational skills, and exceptional navigational ingenuity in its traversal of the African landscape in search of food. The African Buffalo Optimization builds a mathematical model from the behavior of this animal and uses the model to solve 33 symmetric benchmark Traveling Salesman's Problem instances and six difficult asymmetric instances from the TSPLIB. This study shows that the buffalos are able to ensure excellent exploration and exploitation of the search space through regular communication, cooperation, and good memory of previous personal exploits, as well as by tapping into the herd's collective exploits. The results obtained by using the ABO to solve these TSP cases were benchmarked against the results obtained using other popular algorithms, and are very competitive.
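
    For context, metaheuristics such as ABO are typically benchmarked against simple constructive baselines on the same instances. A minimal sketch (illustrative, not from the paper) of a tour-length evaluator and a nearest-neighbor baseline on random cities:

```python
import math
import random

def tour_length(tour, coords):
    """Total length of a closed tour over city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbor(coords, start=0):
    """Greedy nearest-neighbor construction, a standard TSP baseline."""
    unvisited = set(range(len(coords))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: math.dist(coords[tour[-1]], coords[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

coords = [(random.random(), random.random()) for _ in range(50)]
print(tour_length(nearest_neighbor(coords), coords))
```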

  20. HEK293 cell culture media study towards bioprocess optimization: Animal derived component free and animal derived component containing platforms.

    Science.gov (United States)

    Liste-Calleja, Leticia; Lecina, Martí; Cairó, Jordi Joan

    2014-04-01

    The increasing demand for biopharmaceuticals produced in mammalian cells has led industries to enhance bioprocess volumetric productivity through different strategies. Among these strategies, cell culture media development is of major interest. In the present work, several commercially available culture media for Human Embryonic Kidney (HEK293) cells were evaluated in terms of the maximal specific growth rate and the maximal viable cell concentration supported. The main objective was to provide different cell culture platforms suitable for a wide range of applications, depending on the type and final use of the product obtained. Performing simple media supplementations with and without animal-derived components, an enhancement of cell concentration from 2 × 10(6) cells/mL to 17 × 10(6) cells/mL was achieved in batch mode operation. Additionally, the media were evaluated for adenovirus production as a specific application of HEK293 cells. None of the supplements interfered significantly with adenovirus infection, although some differences were encountered in viral productivity. To the best of our knowledge, the high cell density achieved in the work presented has never been reported before for HEK293 batch cell cultures; our results are thus highly promising for further study of cell culture strategies in bioreactors towards bioprocess optimization.

  1. Benchmarking HIV health care

    DEFF Research Database (Denmark)

    Podlekareva, Daria; Reekie, Joanne; Mocroft, Amanda

    2012-01-01

    ABSTRACT: BACKGROUND: State-of-the-art care involving the utilisation of multiple health care interventions is the basis for an optimal long-term clinical prognosis for HIV-patients. We evaluated health care for HIV-patients based on four key indicators. METHODS: Four indicators of health care we ... document pronounced regional differences in adherence to guidelines and can help to identify gaps and direct target interventions. It may serve as a tool for assessment and benchmarking the clinical management of HIV-patients in any setting worldwide.

  2. Binary Cockroach Swarm Optimization for Combinatorial Optimization Problem

    Directory of Open Access Journals (Sweden)

    Ibidun Christiana Obagbuwa

    2016-09-01

    The Cockroach Swarm Optimization (CSO) algorithm is inspired by cockroach social behavior. It is a simple and efficient meta-heuristic algorithm that has been applied successfully to solve global optimization problems. The original CSO algorithm and its variants operate mainly in continuous search space and cannot solve binary-coded optimization problems directly, yet many optimization problems have binary decision variables. Binary Cockroach Swarm Optimization (BCSO) is proposed in this paper to tackle such problems and was evaluated on the popular Traveling Salesman Problem (TSP), which is considered to be an NP-hard combinatorial optimization problem (COP). A transfer function was employed to map the continuous-search-space CSO to a binary search space. The performance of the proposed algorithm was first tested on benchmark functions through simulation studies and compared with the performance of existing binary particle swarm optimization and continuous-space versions of CSO. The proposed BCSO was then adapted to the TSP and applied to a set of benchmark instances of the symmetric TSP from the TSP library. The results of the proposed BCSO algorithm on the TSP were compared to those of other meta-heuristic algorithms.
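
    The transfer-function idea mentioned above is commonly realized with an S-shaped (sigmoid) map from a continuous velocity to a bit-flip probability; the specific function used in the BCSO paper is not reproduced here, so the sigmoid below is an illustrative assumption:

```python
import math
import random

def sigmoid_transfer(v):
    """Map a continuous velocity component to a binary position component:
    the sigmoid of v is used as the probability of the bit being 1."""
    return 1 if random.random() < 1.0 / (1.0 + math.exp(-v)) else 0

# Example: binarize one continuous step of a swarm update.
velocity = [0.8, -2.1, 0.0, 3.5]
position = [sigmoid_transfer(v) for v in velocity]
print(position)
```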

  3. An optimized rapid bisulfite conversion method with high recovery of cell-free DNA.

    Science.gov (United States)

    Yi, Shaohua; Long, Fei; Cheng, Juanbo; Huang, Daixin

    2017-12-19

    Methylation analysis of cell-free DNA is an encouraging tool for tumor diagnosis, monitoring, and prognosis. The sensitivity of methylation analysis is critical, given the tiny amounts of cell-free DNA available in plasma. Most current methods of DNA methylation analysis are based on the difference in bisulfite-mediated deamination between cytosine and 5-methylcytosine. However, the recovery of bisulfite-converted DNA with current methods is very poor for the methylation analysis of cell-free DNA. We optimized a rapid method for the crucial steps of bisulfite conversion with high recovery of cell-free DNA. A rapid deamination step and alkaline desulfonation were combined with purification of the DNA on a silica column. The conversion efficiency and recovery of bisulfite-treated DNA were investigated by droplet digital PCR. The optimized reaction achieves complete cytosine conversion in 30 min at 70 °C and about 65% recovery of bisulfite-treated cell-free DNA, which is higher than current methods. The method allows high recovery from low levels of bisulfite-treated cell-free DNA, enhancing the sensitivity of methylation detection from cell-free DNA.

  4. Self-benchmarking Guide for Laboratory Buildings: Metrics, Benchmarks, Actions

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul; Greenberg, Steve; Sartor, Dale

    2009-07-13

    This guide describes energy efficiency metrics and benchmarks that can be used to track the performance of and identify potential opportunities to reduce energy use in laboratory buildings. This guide is primarily intended for personnel who have responsibility for managing energy use in existing laboratory facilities - including facilities managers, energy managers, and their engineering consultants. Additionally, laboratory planners and designers may also use the metrics and benchmarks described in this guide for goal-setting in new construction or major renovation. This guide provides the following information: (1) A step-by-step outline of the benchmarking process. (2) A set of performance metrics for the whole building as well as individual systems. For each metric, the guide provides a definition, performance benchmarks, and potential actions that can be inferred from evaluating this metric. (3) A list and descriptions of the data required for computing the metrics. This guide is complemented by spreadsheet templates for data collection and for computing the benchmarking metrics. This guide builds on prior research supported by the national Laboratories for the 21st Century (Labs21) program, supported by the U.S. Department of Energy and the U.S. Environmental Protection Agency. Much of the benchmarking data are drawn from the Labs21 benchmarking database and technical guides. Additional benchmark data were obtained from engineering experts including laboratory designers and energy managers.

  5. International Criticality Safety Benchmark Evaluation Project (ICSBEP) - ICSBEP 2015 Handbook

    International Nuclear Information System (INIS)

    Bess, John D.

    2015-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy (DOE). The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Nuclear Energy Agency (NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculation techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirements and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross-section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span approximately 69000 pages and contain 567 evaluations with benchmark specifications for 4874 critical, near-critical or subcritical configurations, 31 criticality alarm placement/shielding configurations with multiple dose points for each, and 207 configurations that have been categorised as fundamental physics measurements that are relevant to criticality safety applications. New to the handbook are benchmark specifications for neutron activation foil and thermoluminescent dosimeter measurements performed at the SILENE critical assembly in Valduc, France as part of a joint venture in 2010 between the US DOE and the French Alternative Energies and Atomic Energy Commission (CEA). A photograph of this experiment is shown on the front cover. Experiments that are found unacceptable for use as criticality safety benchmark experiments are discussed in these evaluations.

  6. Structure activity relationships of quinoxalin-2-one derivatives as platelet-derived growth factor-beta receptor (PDGFbeta R) inhibitors, derived from molecular modeling.

    Science.gov (United States)

    Mori, Yoshikazu; Hirokawa, Takatsugu; Aoki, Katsuyuki; Satomi, Hisanori; Takeda, Shuichi; Aburada, Masaki; Miyamoto, Ken-ichi

    2008-05-01

    We previously reported a quinoxalin-2-one compound (Compound 1) with inhibitory activity equivalent to existing platelet-derived growth factor-beta receptor (PDGFbeta R) inhibitors. Lead optimization of Compound 1 to increase its activity and selectivity, using structural information on PDGFbeta R-ligand interactions, is urgently needed. Here we present models of the PDGFbeta R kinase domain complexed with quinoxalin-2-one derivatives. The models were constructed using comparative modeling, molecular dynamics (MD), and ligand docking. In particular, conformations derived from MD, and ligand binding site information presented by alpha-spheres in the pre-docking processing, allowed us to identify optimal protein structures for docking of the target ligands. By carrying out molecular modeling and MD of PDGFbeta R in its inactive state, we obtained two structural models with good Compound 1 binding potentials. To distinguish the optimal candidate, we evaluated the structure-activity relationships (SAR) between the ligand-binding free energies and inhibitory activity values (IC50 values) for the available quinoxalin-2-one derivatives. Consequently, a final model with a high SAR was identified. This model includes a molecular interaction between the hydrophobic pocket behind the ATP binding site and the substitution region of the quinoxalin-2-one derivatives. These findings should prove useful in the lead optimization of quinoxalin-2-one derivatives as PDGFbeta R inhibitors.

  7. Free-time and fixed end-point multi-target optimal control theory: Application to quantum computing

    International Nuclear Information System (INIS)

    Mishima, K.; Yamashita, K.

    2011-01-01

    An extension of free-time and fixed end-point optimal control theory (FRFP-OCT) to monotonically convergent free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) is presented. The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, optimization of the temporal duration, monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. The advantage of the theory, and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT), are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. The two-state Deutsch-Jozsa algorithm is used to demonstrate the utility of the theory. The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. The calculation examples show that our theory is useful for minor adjustment of the external fields.

  8. BSMBench: a flexible and scalable supercomputer benchmark from computational particle physics

    CERN Document Server

    Bennett, Ed; Del Debbio, Luigi; Jordan, Kirk; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2016-01-01

    Benchmarking plays a central role in the evaluation of High Performance Computing architectures. Several benchmarks have been designed that allow users to stress various components of supercomputers. In order for the figures they provide to be useful, benchmarks need to be representative of the most common real-world scenarios. In this work, we introduce BSMBench, a benchmarking suite derived from Monte Carlo code used in computational particle physics. The advantage of this suite (which can be freely downloaded from http://www.bsmbench.org/) over others is the capacity to vary the relative importance of computation and communication. This enables the tests to simulate various practical situations. To showcase BSMBench, we perform a wide range of tests on various architectures, from desktop computers to state-of-the-art supercomputers, and discuss the corresponding results. Possible future directions of development of the benchmark are also outlined.

  9. Simplified two and three dimensional HTTR benchmark problems

    International Nuclear Information System (INIS)

    Zhang Zhan; Rahnema, Farzad; Zhang Dingkang; Pounders, Justin M.; Ougouag, Abderrafi M.

    2011-01-01

    To assess the accuracy of diffusion or transport methods for reactor calculations, it is desirable to create heterogeneous benchmark problems that are typical of whole core configurations. In this paper we have created two- and three-dimensional numerical benchmark problems typical of high temperature gas cooled prismatic cores. Additionally, single-cell and single-block benchmark problems are also included. These problems were derived from the HTTR start-up experiment. Since the primary utility of the benchmark problems is in code-to-code verification, minor details regarding geometry and material specification of the original experiment have been simplified while retaining the heterogeneity and the major physics properties of the core from a neutronics viewpoint. A six-group material (macroscopic) cross section library has been generated for the benchmark problems using the lattice depletion code HELIOS. Using this library, Monte Carlo solutions are presented for three configurations (all-rods-in, partially-controlled and all-rods-out) for both the 2D and 3D problems. These solutions include the core eigenvalues, the block (assembly) averaged fission densities, local peaking factors, the absorption densities in the burnable poison and control rods, and the pin fission density distribution for selected blocks. Also included are the solutions for the single-cell and single-block problems.

  10. Benchmarking and performance management in health care

    OpenAIRE

    Buttigieg, Sandra; EHMA Annual Conference : Public Health Care : Who Pays, Who Provides?

    2012-01-01

    Current economic conditions challenge health care providers globally. Healthcare organizations need to deliver optimal financial, operational, and clinical performance to sustain quality of service delivery. Benchmarking is one of the most potent and under-utilized management tools available and an analytic tool to understand organizational performance. Additionally, it is required for financial survival and organizational excellence.

  11. Merton's problem for an investor with a benchmark in a Barndorff-Nielsen and Shephard market.

    Science.gov (United States)

    Lennartsson, Jan; Lindberg, Carl

    2015-01-01

    To try to outperform an externally given benchmark with known weights is the most common equity mandate in the financial industry. For quantitative investors, this task is predominantly approached by optimizing portfolios consecutively over short time horizons with one-period models. In this paper we seek to provide a theoretical justification for this practice when the underlying market is of Barndorff-Nielsen and Shephard type. This is done by verifying that an investor who seeks to maximize her expected terminal exponential utility of wealth in excess of her benchmark will in fact use an optimal portfolio equivalent to the one-period Markowitz mean-variance problem in continuum under the corresponding Black-Scholes market. Further, the solution to the optimization problem can be represented in Feynman-Kac form. Hence the problem, and its solution, is analogous to Merton's classical portfolio problem, with the main difference that Merton maximizes expected utility of terminal wealth, not of wealth in excess of a benchmark.
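
    In symbols, the objective described above can be sketched as follows (notation assumed here, not taken from the paper): an investor with risk aversion gamma maximizes expected exponential utility of terminal wealth X_T in excess of the benchmark value B_T,

```latex
\max_{\pi}\ \mathbb{E}\!\left[-\exp\!\left(-\gamma\left(X_T^{\pi} - B_T\right)\right)\right]
```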

  12. ICSBEP-2007, International Criticality Safety Benchmark Experiment Handbook

    International Nuclear Information System (INIS)

    Blair Briggs, J.

    2007-01-01

    1 - Description: The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) is now an official activity of the Organization for Economic Cooperation and Development - Nuclear Energy Agency (OECD-NEA). This handbook contains criticality safety benchmark specifications that have been derived from experiments that were performed at various nuclear critical facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material. The example calculations presented do not constitute a validation of the codes or cross section data. The work of the ICSBEP is documented as an International Handbook of Evaluated Criticality Safety Benchmark Experiments. Currently, the handbook spans over 42,000 pages and contains 464 evaluations representing 4,092 critical, near-critical, or subcritical configurations, 21 criticality alarm placement/shielding configurations with multiple dose points for each, and 46 configurations that have been categorized as fundamental physics measurements relevant to criticality safety applications. The handbook is intended for use by criticality safety analysts to perform necessary validations of their calculational techniques and is expected to be a valuable tool for decades to come. The ICSBEP Handbook is available on DVD; a DVD may be requested by completing the DVD Request Form on the internet. Access to the handbook on the internet requires a password, which may be requested by completing the Password Request Form. The web address is: http://icsbep.inel.gov/handbook.shtml 2 - Method of solution: Experiments that are found ...

  13. Optimally stopped variational quantum algorithms

    Science.gov (United States)

    Vinci, Walter; Shabani, Alireza

    2018-04-01

    Quantum processors promise a paradigm shift in high-performance computing which needs to be assessed by accurate benchmarking measures. In this article, we introduce a benchmark for the variational quantum algorithm (VQA), recently proposed as a heuristic algorithm for small-scale quantum processors. In VQA, a classical optimization algorithm guides the processor's quantum dynamics to yield the best solution for a given problem. A complete assessment of the scalability and competitiveness of VQA should take into account both the quality and the time of dynamics optimization. The method of optimal stopping, employed here, provides such an assessment by explicitly including time as a cost factor. We showcase this measure for benchmarking VQA as a solver for quadratic unconstrained binary optimization (QUBO) problems. Moreover, we show that a better choice of the cost function for the classical routine can significantly improve the performance of the VQA algorithm and even improve its scaling properties.
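
    One way to make "time as a cost factor" concrete (a sketch under assumed notation, not the paper's formulation): if E_tau is the best objective value found by time tau and c prices each unit of optimization time, an optimally stopped run targets

```latex
\tau^{\ast} = \arg\min_{\tau}\ \mathbb{E}\!\left[\,E_{\tau} + c\,\tau\,\right]
```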

  14. Toxicological benchmarks for screening potential contaminants of concern for effects on terrestrial plants

    International Nuclear Information System (INIS)

    Suter, G.W. II; Will, M.E.; Evans, C.

    1993-09-01

    One of the initial stages in ecological risk assessment for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as "contaminants of potential concern." This process is termed "contaminant screening." It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 34 chemicals potentially associated with US Department of Energy (DOE) sites. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern. The purpose of this report is to present plant toxicity data and discuss their utility as benchmarks for determining the hazard to terrestrial plants caused by contaminants in soil. Benchmarks are provided for soils and solutions.

  15. Multiple energy supply risks, optimal reserves, and optimal domestic production capacities

    International Nuclear Information System (INIS)

    Zweifel, P.; Ferrari, M.

    1992-01-01

    This study starts from the observation that today's Western trading nations are exposed to multiple risks to their energy supplies, e.g. a simultaneous shortage of oil and electricity supplies. To cope with these risks, oil can be stockpiled and domestic capacity for power production built up. Adopting the viewpoint of a policy maker who aims at minimizing the expected cost of security of supply, optimal simultaneous adjustments of oil stocks and electric production capacities to exogenous changes such as economic growth are derived. Against this benchmark, one-dimensional rules such as 'oil reserves for 90 days' turn out not only to be suboptimal but also to foster adjustments that exacerbate the suboptimality. 9 refs., 1 tab

  16. Analytical benchmarks for nuclear engineering applications. Case studies in neutron transport theory

    International Nuclear Information System (INIS)

    2008-01-01

    The developers of computer codes involving neutron transport theory for nuclear engineering applications seldom apply analytical benchmarking strategies to ensure the quality of their programs. A major reason for this is the lack of analytical benchmarks and their documentation in the literature. The few such benchmarks that do exist are difficult to locate, as they are scattered throughout the neutron transport and radiative transfer literature. The motivation for this benchmark compendium, therefore, is to gather several analytical benchmarks appropriate for nuclear engineering applications under one cover. We consider the following three subject areas: neutron slowing down and thermalization without spatial dependence, one-dimensional neutron transport in infinite and finite media, and multidimensional neutron transport in a half-space and an infinite medium. Each benchmark is briefly described, followed by a detailed derivation of the analytical solution representation. Finally, a demonstration of the evaluation of the solution representation includes qualified numerical benchmark results. All accompanying computer codes are suitable for the PC computational environment and can serve as educational tools for courses in nuclear engineering. While this benchmark compilation does not contain all possible benchmarks, by any means, it does include some of the most prominent ones and should serve as a valuable reference. (author)

  17. Optimized Free Energies from Bidirectional Single-Molecule Force Spectroscopy

    Science.gov (United States)

    Minh, David D. L.; Adib, Artur B.

    2008-05-01

    An optimized method for estimating path-ensemble averages using data from processes driven in opposite directions is presented. Based on this estimator, bidirectional expressions for reconstructing free energies and potentials of mean force from single-molecule force spectroscopy—valid for biasing potentials of arbitrary stiffness—are developed. Numerical simulations on a model potential indicate that these methods perform better than unidirectional strategies.

  18. Toxicological Benchmarks for Screening Potential Contaminants of Concern for Effects on Soil and Litter Invertebrates and Heterotrophic Process

    Energy Technology Data Exchange (ETDEWEB)

    Will, M.E.

    1994-01-01

    This report presents a standard method for deriving benchmarks for the purpose of ''contaminant screening,'' performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. The work was performed under Work Breakdown Structure 1.4.12.2.3.04.07.02 (Activity Data Sheet 8304). In addition, this report presents sets of data concerning the effects of chemicals in soil on invertebrates and soil microbial processes, benchmarks for chemicals potentially associated with United States Department of Energy sites, and literature describing the experiments from which data were drawn for benchmark derivation.

  19. Free terminal time optimal control problem for the treatment of HIV infection

    Directory of Open Access Journals (Sweden)

    Amine Hamdache

    2016-01-01

    to provide the explicit formulations of the optimal controls. The corresponding optimality system with the additional transversality condition for the terminal time is derived and solved numerically using an adapted iterative method with a Runge-Kutta fourth order scheme and a gradient method routine.

  20. Free Vibration of Rectangular Plates with Attached Discrete Sprung Masses

    Directory of Open Access Journals (Sweden)

    Ding Zhou

    2012-01-01

    Full Text Available A direct approach is used to derive the exact solution for the free vibration of thin rectangular plates with discrete sprung masses attached. The plate is simply supported along two opposite edges and elastically supported along the two other edges. The elastic support can represent a range of boundary conditions from free to clamped supports. Considering only the compatibility of the internal forces between the plate and the sprung masses, the equations of the coupled vibration of the plate-spring-mass system are derived. The exact expressions for mode and frequency equations of the coupled vibration of the plate and sprung masses are determined. The solutions converge steadily and monotonically to exact values. The correctness and accuracy of the solutions are demonstrated through comparison with published results. A parametric study is undertaken focusing on the plate with one or two sprung masses. The results can be used as a benchmark for further investigation.

  1. Advancements in rationally designed PGM-free fuel cell catalysts derived from metal–organic frameworks

    International Nuclear Information System (INIS)

    Barkholtz, Heather M.; Liu, Di-Jia

    2016-01-01

    Over the past several years, metal-organic framework (MOF)-derived platinum group metal-free (PGM-free) electrocatalysts have gained considerable attention due to their high efficiency and low cost as potential replacements for platinum in catalyzing the oxygen reduction reaction (ORR). In this review, we summarize recent advancements in the design, synthesis, and characterization of MOF-derived ORR catalysts and their performance in acidic and alkaline media. We also discuss key challenges, such as durability and activity enhancement, that are critical to moving this emerging electrocatalyst science forward.

  2. Library Benchmarking

    Directory of Open Access Journals (Sweden)

    Wiji Suwarno

    2017-02-01

    Full Text Available The term benchmarking is encountered in the implementation of total quality management (TQM), in Indonesian termed holistic quality management, because benchmarking is a tool for finding ideas and learning from other libraries. Benchmarking is a systematic and continuous process of measuring and comparing an organization's business processes against those of other organizations in order to obtain information that can help the organization improve its performance.

  3. CBLIB 2014: a benchmark library for conic mixed-integer and continuous optimization

    DEFF Research Database (Denmark)

    Friberg, Henrik Alsing

    2016-01-01

    The Conic Benchmark Library is an ongoing community-driven project aiming to challenge commercial and open source solvers on mainstream cone support. In this paper, 121 mixed-integer and continuous second-order cone problem instances have been selected from 11 categories as representative...

  4. Toxicological benchmarks for screening potential contaminants of concern for effects on soil and litter invertebrates and heterotrophic process

    International Nuclear Information System (INIS)

    Will, M.E.; Suter, G.W. II.

    1994-09-01

    One of the initial stages in ecological risk assessments for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as ''contaminants of potential concern.'' This process is termed ''contaminant screening.'' It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to soil- and litter-dwelling invertebrates, including earthworms, other micro- and macroinvertebrates, or heterotrophic bacteria and fungi. This report presents a standard method for deriving benchmarks for this purpose, sets of data concerning effects of chemicals in soil on invertebrates and soil microbial processes, and benchmarks for chemicals potentially associated with United States Department of Energy sites. In addition, it provides literature describing the experiments from which data were drawn for benchmark derivation. Chemicals that are found in soil at concentrations exceeding both the benchmarks and the background concentration for the soil type should be considered contaminants of potential concern.

  5. Toxicological benchmarks for screening potential contaminants of concern for effects on soil and litter invertebrates and heterotrophic process

    Energy Technology Data Exchange (ETDEWEB)

    Will, M.E.; Suter, G.W. II

    1994-09-01

    One of the initial stages in ecological risk assessments for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as ''contaminants of potential concern.'' This process is termed ''contaminant screening.'' It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to soil- and litter-dwelling invertebrates, including earthworms, other micro- and macroinvertebrates, or heterotrophic bacteria and fungi. This report presents a standard method for deriving benchmarks for this purpose, sets of data concerning effects of chemicals in soil on invertebrates and soil microbial processes, and benchmarks for chemicals potentially associated with United States Department of Energy sites. In addition, it provides literature describing the experiments from which data were drawn for benchmark derivation. Chemicals that are found in soil at concentrations exceeding both the benchmarks and the background concentration for the soil type should be considered contaminants of potential concern.

  6. FENDL neutronics benchmark: Specifications for the calculational neutronics and shielding benchmark

    International Nuclear Information System (INIS)

    Sawan, M.E.

    1994-12-01

    During the IAEA Advisory Group Meeting on ''Improved Evaluations and Integral Data Testing for FENDL'' held in Garching near Munich, Germany in the period 12-16 September 1994, the Working Group II on ''Experimental and Calculational Benchmarks on Fusion Neutronics for ITER'' recommended that a calculational benchmark representative of the ITER design should be developed. This report describes the neutronics and shielding calculational benchmark available for scientists interested in performing analysis for this benchmark. (author)

  7. MicroRNA Array Normalization: An Evaluation Using a Randomized Dataset as the Benchmark

    Science.gov (United States)

    Qin, Li-Xuan; Zhou, Qin

    2014-01-01

    MicroRNA arrays possess a number of unique data features that challenge the assumption key to many normalization methods. We assessed the performance of existing normalization methods using two microRNA array datasets derived from the same set of tumor samples: one dataset was generated using a blocked randomization design when assigning arrays to samples and hence was free of confounding array effects; the second dataset was generated without blocking or randomization and exhibited array effects. The randomized dataset was assessed for differential expression between two tumor groups and treated as the benchmark. The non-randomized dataset was assessed for differential expression after normalization and compared against the benchmark. Normalization improved the true positive rate significantly in the non-randomized data but still possessed a false discovery rate as high as 50%. Adding a batch adjustment step before normalization further reduced the number of false positive markers while maintaining a similar number of true positive markers, which resulted in a false discovery rate of 32% to 48%, depending on the specific normalization method. We concluded the paper with some insights on possible causes of false discoveries to shed light on how to improve normalization for microRNA arrays. PMID:24905456

  8. Benchmarking and Performance Measurement.

    Science.gov (United States)

    Town, J. Stephen

    This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…

  9. Topology optimization using the improved element-free Galerkin method for elasticity*

    International Nuclear Information System (INIS)

    Wu Yi; Ma Yong-Qi; Feng Wei; Cheng Yu-Min

    2017-01-01

    The improved element-free Galerkin (IEFG) method of elasticity is used to solve topology optimization problems. In this method, the improved moving least-squares approximation is used to form the shape function. In the topology optimization process, the entire structure volume is taken as the constraint. Following the solid isotropic microstructure with penalization approach, the relative node density is selected as the design variable. The minimization of compliance is then chosen as the objective function, and its sensitivity is computed with the adjoint method, as sketched below. The IEFG method in this paper overcomes the singular matrices that sometimes appear in the conventional element-free Galerkin (EFG) method. The central processing unit (CPU) time of each example is given to show that the IEFG method is more efficient than the EFG method at the same precision, and the advantage that the IEFG method does not form singular matrices is also shown. (paper)
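
    For orientation, this formulation follows the familiar penalized compliance-minimization pattern; a schematic statement in standard SIMP-style notation (not necessarily the paper's exact symbols) is

        \min_{\rho} \; c(\rho) = \mathbf{f}^{T}\mathbf{u}(\rho)
        \quad \text{s.t.} \quad \mathbf{K}(\rho)\,\mathbf{u} = \mathbf{f}, \qquad
        \sum_i \rho_i v_i \le V_{\max}, \qquad 0 < \rho_{\min} \le \rho_i \le 1,

    with penalized stiffness E_i = \rho_i^{p} E_0; because the load \mathbf{f} is design-independent, the adjoint sensitivity reduces to \partial c / \partial \rho_i = -\mathbf{u}^{T} (\partial \mathbf{K} / \partial \rho_i)\, \mathbf{u}.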

  10. First principles molecular dynamics without self-consistent field optimization

    International Nuclear Information System (INIS)

    Souvatzis, Petros; Niklasson, Anders M. N.

    2014-01-01

    We present a first principles molecular dynamics approach that is based on time-reversible extended Lagrangian Born-Oppenheimer molecular dynamics [A. M. N. Niklasson, Phys. Rev. Lett. 100, 123004 (2008)] in the limit of vanishing self-consistent field optimization. The optimization-free dynamics keeps the computational cost to a minimum and typically provides molecular trajectories that closely follow the exact Born-Oppenheimer potential energy surface. Only one single diagonalization and Hamiltonian (or Fockian) construction are required in each integration time step. The proposed dynamics is derived for a general free-energy potential surface valid at finite electronic temperatures within hybrid density functional theory. Even in the event of irregular functional behavior that may cause a dynamical instability, the optimization-free limit represents a natural starting guess for force calculations that may require a more elaborate iterative electronic ground state optimization. Our optimization-free dynamics thus represents a flexible theoretical framework for a broad and general class of ab initio molecular dynamics simulations

  11. Benchmarking in the Netherlands

    International Nuclear Information System (INIS)

    1999-01-01

    In two articles, an overview is given of the benchmarking activities in the Dutch industry and energy sector. In benchmarking, the operational processes of competing businesses are compared in order to improve one's own performance. Benchmark covenants for energy efficiency between the Dutch government and industrial sectors contribute to a growth in the number of benchmark surveys in the energy-intensive industry in the Netherlands. However, some doubt the effectiveness of the benchmark studies

  12. Grey Wolf Optimizer Based on Powell Local Optimization Method for Clustering Analysis

    Directory of Open Access Journals (Sweden)

    Sen Zhang

    2015-01-01

    Full Text Available One recently proposed heuristic evolutionary algorithm is the grey wolf optimizer (GWO), inspired by the leadership hierarchy and hunting mechanism of grey wolves in nature. This paper presents an extended GWO algorithm based on Powell's local optimization method, called PGWO. The PGWO algorithm significantly improves the original GWO in solving complex optimization problems. Clustering is a popular data analysis and data mining technique; hence, PGWO can be applied to solving clustering problems. In this study, the PGWO algorithm is first tested on seven benchmark functions. Second, the PGWO algorithm is used for data clustering on nine data sets. Compared to other state-of-the-art evolutionary algorithms, the results on the benchmark functions and on data clustering demonstrate the superior performance of the PGWO algorithm.
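
    A minimal sketch of the hybrid idea (not the authors' PGWO implementation): a bare-bones GWO position update, with the best wolf refined each iteration by Powell's derivative-free local search from SciPy. The population size, iteration budget, and sphere test function are arbitrary choices.

        import numpy as np
        from scipy.optimize import minimize

        def gwo_powell(f, dim, lo, hi, n_wolves=20, iters=50, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.uniform(lo, hi, (n_wolves, dim))
            for t in range(iters):
                fit = np.apply_along_axis(f, 1, X)
                alpha, beta, delta = X[np.argsort(fit)[:3]]   # three leading wolves
                a = 2.0 * (1.0 - t / iters)                   # shrinking exploration coefficient
                for i in range(n_wolves):
                    step = np.zeros(dim)
                    for leader in (alpha, beta, delta):
                        r1, r2 = rng.random(dim), rng.random(dim)
                        A, C = 2.0 * a * r1 - a, 2.0 * r2
                        step += leader - A * np.abs(C * leader - X[i])
                    X[i] = np.clip(step / 3.0, lo, hi)
                # Powell polish of the best wolf; the refined point replaces the worst wolf.
                res = minimize(f, alpha, method="Powell")
                fit = np.apply_along_axis(f, 1, X)
                X[np.argmax(fit)] = np.clip(res.x, lo, hi)
            fit = np.apply_along_axis(f, 1, X)
            return X[np.argmin(fit)], float(fit.min())

        sphere = lambda x: float(np.sum(x ** 2))
        best_x, best_f = gwo_powell(sphere, dim=5, lo=-5.0, hi=5.0)
        print(best_x, best_f)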

  13. The fifth AER dynamic benchmark calculation with hextran-smabre

    International Nuclear Information System (INIS)

    Haemaelaeinen, A.; Kyrki-Rajamaeki, R.

    1998-01-01

    The first AER benchmark for the coupling of thermohydraulic codes and three-dimensional reactor dynamics core models is discussed. HEXTRAN 2.7 is used for the core dynamics and SMABRE 4.6 as the thermohydraulic model for the primary and secondary loops. The plant model for SMABRE is based mainly on two input models, the Loviisa model and the standard VVER-440/213 plant model. The primary circuit includes six separate loops, with 505 nodes and 652 junctions in total. The reactor pressure vessel is divided into six parallel channels. In the HEXTRAN calculation, 1/6 symmetry is used in the core. In the calculations, the nuclear data are based on the ENDF/B-IV library and have been evaluated with the CASMO-HEX code. The importance of the nuclear data was illustrated by repeating the benchmark calculation with three different data sets. Optimal extensive data valid from hot to cold conditions were not available for all types of fuel enrichments needed in this benchmark. (author)

  14. Towards optimal dosing of coumarin derivatives: the role of pharmacogenetics

    NARCIS (Netherlands)

    van Schie, R.M.F.

    2013-01-01

    Coumarin derivatives are effective in the prevention and treatment of thromboembolic diseases. Examples of indications are atrial fibrillation and venous thromboembolism. Although coumarins have been on the market for decades, it is still challenging to find the optimal dosage for each patient since

  15. Statistical identifiability and convergence evaluation for nonlinear pharmacokinetic models with particle swarm optimization.

    Science.gov (United States)

    Kim, Seongho; Li, Lang

    2014-02-01

    The statistical identifiability of nonlinear pharmacokinetic (PK) models with the Michaelis-Menten (MM) kinetic equation is considered using a global optimization approach, namely particle swarm optimization (PSO). If a model is statistically non-identifiable, the conventional derivative-based estimation approach is often terminated early without converging, due to the singularity. To circumvent this difficulty, we develop a derivative-free global optimization algorithm by combining PSO with a derivative-free local optimization algorithm to improve the rate of convergence of PSO. We further propose an efficient approach to not only checking the convergence of estimation but also detecting the identifiability of nonlinear PK models. PK simulation studies demonstrate that the convergence and identifiability of the PK model can be detected efficiently through the proposed approach. The proposed approach is then applied to clinical PK data along with a two-compartmental model. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
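
    A minimal sketch of the setting (not the authors' code): fitting a one-compartment Michaelis-Menten elimination model with a derivative-free optimizer. Here SciPy's Nelder-Mead stands in for the PSO-plus-local-search pipeline, and all parameter values are synthetic.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import minimize

        def mm_conc(params, times, c0=10.0):
            """Concentration profile for dC/dt = -Vmax * C / (Km + C)."""
            vmax, km = params
            sol = solve_ivp(lambda t, c: -vmax * c / (km + c), (0.0, times[-1]),
                            [c0], t_eval=times, rtol=1e-8)
            return sol.y[0]

        def sse(params, times, data):
            if np.any(np.asarray(params) <= 0.0):   # keep Vmax, Km positive
                return np.inf
            return float(np.sum((mm_conc(params, times) - data) ** 2))

        times = np.linspace(0.0, 12.0, 13)
        data = mm_conc([2.0, 5.0], times)           # synthetic, noise-free observations
        fit = minimize(sse, x0=[1.0, 1.0], args=(times, data), method="Nelder-Mead")
        print(fit.x)                                # should approach [2.0, 5.0]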

  16. Multicomponent One-Pot Synthesis of Substituted Hantzsch Thiazole Derivatives Under Solvent Free Conditions

    Directory of Open Access Journals (Sweden)

    Bhaskar S. Dawane

    2009-01-01

    Full Text Available Thiazole derivatives were prepared by a one-pot procedure from the reaction of α-haloketones, thiourea, and substituted o-hydroxybenzaldehydes under environmentally friendly, solvent-free conditions.

  17. WLUP benchmarks

    International Nuclear Information System (INIS)

    Leszczynski, Francisco

    2002-01-01

    The IAEA-WIMS Library Update Project (WLUP) is in its final stage. The final library will be released in 2002. It is the result of research and development by more than ten investigators over 10 years. The organization of benchmarks for testing and choosing the best set of data has been coordinated by the author of this paper. The organization, naming conventions, contents, and documentation of the WLUP benchmarks are presented, together with an updated list of the main parameters for all cases. First, the benchmark objectives and types are given. Then, comparisons of results from different WIMSD libraries are included. Finally, the program QVALUE for the analysis and plotting of results is described, and some examples are given. The set of benchmarks implemented in this work is a fundamental tool for testing new multigroup libraries. (author)

  18. Asynchronous Gossip-Based Gradient-Free Method for Multiagent Optimization

    OpenAIRE

    Deming Yuan

    2014-01-01

    This paper considers the constrained multiagent optimization problem. The objective function of the problem is a sum of convex functions, each of which is known by a specific agent only. For solving this problem, we propose an asynchronous distributed method that is based on gradient-free oracles and gossip algorithm. In contrast to the existing work, we do not require that agents be capable of computing the subgradients of their objective functions and coordinating their...

  19. Electrochemical behavior of free-radical derivatives of tetra(4-hydroxy-3,5-di-tert-butylphenyl) porphyrin

    Energy Technology Data Exchange (ETDEWEB)

    Pokhodenko, V.D.; Melezhik, A.V.; Platonova, E.P.; Vovk, D.N.

    1984-08-01

    The electrochemical behavior of free-radical derivatives of tetra(4-hydroxy-3,5-di-tert-butylphenyl) porphyrins and their complexes with Mg(II), Zn(II), Ni(II), Cu(II), and Pd(II) ions was studied by voltammetry, ESR, and spectrophotometry. It was shown that the introduction of free-radical substituents into the porphin macrocycle leads to a substantial decrease in the oxidation and reduction potentials of the complexes. The degree of conjugation of the substituents with the porphin macrocycle is estimated from the difference between the redox potentials of the free-radical and quinoid derivatives of the metalloporphyrins.

  20. Benchmarking electricity distribution

    Energy Technology Data Exchange (ETDEWEB)

    Watts, K. [Department of Justice and Attorney-General, QLD (Australia)

    1995-12-31

    Benchmarking has been described as a method of continuous improvement that involves an ongoing and systematic evaluation and incorporation of external products, services and processes recognised as representing best practice. It is a management tool similar to total quality management (TQM) and business process re-engineering (BPR), and is best used as part of a total package. This paper discusses benchmarking models and approaches and suggests a few key performance indicators that could be applied to benchmarking electricity distribution utilities. Some recent benchmarking studies are used as examples and briefly discussed. It is concluded that benchmarking is a strong tool to be added to the range of techniques that can be used by electricity distribution utilities and other organizations in search of continuous improvement, and that there is now a high level of interest in Australia. Benchmarking represents an opportunity for organizations to approach learning from others in a disciplined and highly productive way, which will complement the other micro-economic reforms being implemented in Australia. (author). 26 refs.

  1. Optimal Willingness to Supply Wholesale Electricity Under Asymmetric Linearized Marginal Costs

    Directory of Open Access Journals (Sweden)

    David Hudgins

    2012-01-01

    Full Text Available This analysis derives the profit-maximizing willingness to supply functions for single-plant and multi-plant wholesale electricity suppliers that all incur linear marginal costs. The optimal strategy must result in linear residual demand functions in the absence of capacity constraints. This necessarily leads to a linear pricing rule structure that can be used by firm managers to construct their offer curves and to serve as a benchmark to evaluate firm profit-maximizing behavior. The procedure derives the cost functions and the residual demand curves for merged or multi-plant generators, and uses these to construct the individual generator plant offer curves for a multi-plant firm.
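
    As an illustrative single-plant case (my notation, not the paper's): with linear residual demand p(q) = a - bq and cost C(q) = cq + (d/2)q^2, profit is maximized where marginal revenue equals marginal cost,

        \pi(q) = (a - bq)\,q - cq - \tfrac{d}{2}q^{2}, \qquad
        \pi'(q) = a - 2bq - c - dq = 0 \;\Longrightarrow\; q^{*} = \frac{a - c}{2b + d},

    so the optimal quantity is linear in the demand intercept a; this is the structure behind the linear pricing rule that managers can use to construct offer curves.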

  2. Energy-efficient relay selection and optimal power allocation for performance-constrained dual-hop variable-gain AF relaying

    KAUST Repository

    Zafar, Ammar

    2013-12-01

    This paper investigates the energy-efficiency enhancement of a variable-gain dual-hop amplify-and-forward (AF) relay network utilizing selective relaying. The objective is to minimize the total consumed power while keeping the end-to-end signal-to-noise-ratio (SNR) above a certain peak value and satisfying the peak power constraints at the source and relay nodes. To achieve this objective, an optimal relay selection and power allocation strategy is derived by solving the power minimization problem. Numerical results show that the derived optimal strategy enhances the energy-efficiency as compared to a benchmark scheme in which both the source and the selected relay transmit at peak power. © 2013 IEEE.
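
    In schematic form (my notation, not necessarily the paper's exact formulation), with per-hop SNRs \gamma_1 \propto P_s and \gamma_2 \propto P_{r,k} for relay k, the problem is

        \min_{k,\;P_s,\;P_{r,k}} \; P_s + P_{r,k}
        \quad \text{s.t.} \quad
        \frac{\gamma_1 \gamma_2}{\gamma_1 + \gamma_2 + 1} \ge \gamma_{\mathrm{th}},
        \qquad 0 \le P_s \le P_s^{\max}, \qquad 0 \le P_{r,k} \le P_r^{\max},

    where the fractional expression is the standard end-to-end SNR of variable-gain AF relaying and \gamma_{\mathrm{th}} is the required SNR threshold.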

  3. Effect of tiger nut-derived products in gluten-free batter and bread.

    Science.gov (United States)

    Aguilar, Núria; Albanell, Elena; Miñarro, Begoña; Guamis, Buenaventura; Capellas, Marta

    2015-07-01

    Tiger nut is a tuber used to produce tiger nut milk that yields a high quantity of solid waste, which can be dried and used as fiber source. The objective of this paper was to evaluate the quality of gluten-free bread formulated with different tiger nut-derived products in order to substitute soya flour (which is an allergen ingredient) and, at the same time, increase the use of tiger nut-derived products. Four gluten-free formulations based on corn starch and containing tiger nut milk, tiger nut milk by-product, tiger nut flour, or soya flour (as reference formulation) were studied. Tiger nut milk increased G' of gluten-free batter and rendered breads with the softest crumb (502.46 g ± 102.05), the highest loaf-specific volume (3.35 cm(3)/g ± 0.25), and it was mostly preferred by consumers (61.02%). Breads elaborated with tiger nut flour had similar characteristics than soya flour breads (except in color and crumb structure). The addition of tiger nut milk by-product resulted in a hard (1047.64 g ± 145.74) and dark (L(*)  = 70.02 ± 3.38) crumb bread, which was the least preferred by consumers. Results showed that tiger nut is a promising ingredient to formulate gluten-free baked products. © The Author(s) 2014.

  4. Optimal investment strategies and hedging of derivatives in the presence of transaction costs (Invited Paper)

    Science.gov (United States)

    Muratore-Ginanneschi, Paolo

    2005-05-01

    Investment strategies in multiplicative Markovian market models with transaction costs are defined using growth optimal criteria. The optimal strategy is shown to consist in holding the amount of capital invested in stocks within an interval around an ideal optimal investment. The size of the holding interval is determined by the intensity of the transaction costs and the time horizon. The inclusion of financial derivatives in the models is also considered. All the results presented in this contributions were previously derived in collaboration with E. Aurell.

  5. Benchmarking semantic web technology

    CERN Document Server

    García-Castro, R

    2009-01-01

    This book addresses the problem of benchmarking Semantic Web Technologies; first, from a methodological point of view, proposing a general methodology to follow in benchmarking activities over Semantic Web Technologies and, second, from a practical point of view, presenting two international benchmarking activities that involved benchmarking the interoperability of Semantic Web technologies using RDF(S) as the interchange language in one activity and OWL in the other.The book presents in detail how the different resources needed for these interoperability benchmarking activities were defined:

  6. RUNE benchmarks

    DEFF Research Database (Denmark)

    Peña, Alfredo

    This report contains the description of a number of benchmarks with the purpose of evaluating flow models for near-shore wind resource estimation. The benchmarks are designed based on the comprehensive database of observations that the RUNE coastal experiment established from onshore lidar...

  7. MCNP neutron benchmarks

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Whalen, D.J.; Cardon, D.A.; Uhle, J.L.

    1991-01-01

    Over 50 neutron benchmark calculations have recently been completed as part of an ongoing program to validate the MCNP Monte Carlo radiation transport code. The new and significant aspects of this work are as follows: These calculations are the first attempt at a validation program for MCNP and the first official benchmarking of version 4 of the code. We believe the chosen set of benchmarks is a comprehensive set that may be useful for benchmarking other radiation transport codes and data libraries. These calculations provide insight into how well neutron transport calculations can be expected to model a wide variety of problems

  8. First and second order derivatives for optimizing parallel RF excitation waveforms

    Science.gov (United States)

    Majewski, Kurt; Ritter, Dieter

    2015-09-01

    For piecewise constant magnetic fields, the Bloch equations (without relaxation terms) can be solved explicitly. This way the magnetization created by an excitation pulse can be written as a concatenation of rotations applied to the initial magnetization. For fixed gradient trajectories, the problem of finding parallel RF waveforms, which minimize the difference between achieved and desired magnetization on a number of voxels, can thus be represented as a finite-dimensional minimization problem. We use quaternion calculus to formulate this optimization problem in the magnitude least squares variant and specify first and second order derivatives of the objective function. We obtain a small tip angle approximation as first order Taylor development from the first order derivatives and also develop algorithms for first and second order derivatives for this small tip angle approximation. All algorithms are accompanied by precise floating point operation counts to assess and compare the computational efforts. We have implemented these algorithms as callback functions of an interior-point solver. We have applied this numerical optimization method to example problems from the literature and report key observations.
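
    Schematically (my notation), the two ingredients described above are the final magnetization at voxel v, written as a product of rotations determined by the RF samples b, and the magnitude least squares objective:

        \mathbf{m}_v(b) = R_N(b) \cdots R_1(b)\, \mathbf{m}_v(0), \qquad
        \min_{b} \; \sum_{v} \Big( \big|m_v^{xy}(b)\big| - d_v \Big)^{2},

    where d_v is the desired transverse magnetization magnitude at voxel v; the derivatives in question are those of this objective with respect to b.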

  9. Serious injuries: an additional indicator to fatalities for road safety benchmarking.

    Science.gov (United States)

    Shen, Yongjun; Hermans, Elke; Bao, Qiong; Brijs, Tom; Wets, Geert

    2015-01-01

    Almost all of the current road safety benchmarking studies focus entirely on fatalities, which, however, represent only one measure of the magnitude of the road safety problem. The main objective of this article was to investigate the possibility of including the number of serious injuries in addition to the number of fatalities for road safety benchmarking and to further illuminate its impact on the countries' rankings. We introduced the technique of data envelopment analysis (DEA) to the road safety domain and developed a DEA-based road safety model (DEA-RS) in this study. Moreover, we outlined different types of possible weight restrictions and adopted 2 of them to indicate the relationship between road fatalities and serious injuries for the sake of rational benchmarking. One was a relative weight restriction based on information about their shadow prices, and the other was a virtual weight restriction using a priori knowledge about the importance level of these 2 aspects. By computing the optimal road safety risk scores of 10 European countries based on the different models, we found that the United Kingdom was the only best-performing country no matter which model was utilized. However, countries such as The Netherlands, Sweden, and Switzerland were no longer best-performing when serious injuries were integrated. On the contrary, Spain, which ranked almost at the bottom among all of the countries when only the number of road fatalities was considered, became a relatively well-performing country when its number of serious injuries was integrated in the evaluation. In general, no matter whether a country's road safety ranking improved or deteriorated, most of the countries achieved a higher risk score when the number of serious injuries was included, which implied that, compared to road fatalities, more policy attention has to be paid to improving the situation of serious injuries in most countries. Given the importance of considering the serious

  10. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing the older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.

  11. Design and Optimization of Tube Type Interior Permanent Magnets Generator for Free Piston Applications

    Directory of Open Access Journals (Sweden)

    Serdal ARSLAN

    2017-05-01

    Full Text Available In this study, a generator for free-piston applications was designed and optimized. In order to supply the required initial force, an IPM (interior permanent magnet) cavity tube-type linear generator was selected. Basic dimensioning of the generator was carried out using analytical equations, and its dimensioning, analysis, and optimization were then performed with Ansys Maxwell. The effects of the basic design variables (pole step ratio, cavity step ratio, inner-to-outer diameter ratio, primary final length, air gap) on the pinking force were examined using parametric analyses. Among these variables, the cavity step ratio, the inner-to-outer diameter ratio, and the primary final length were determined optimally by the optimization algorithm and by sequential nonlinear programming, and the two methods were compared on the pinking-force calculation problem. A preliminary application of the linear generator was carried out for a free-piston application.

  12. Benchmarks of programming languages for special purposes in the space station

    Science.gov (United States)

    Knoebel, Arthur

    1986-01-01

    Although Ada is likely to be chosen as the principal programming language for the Space Station, certain needs, such as expert systems and robotics, may be better served by special-purpose languages. The languages LISP and Prolog are studied and some benchmarks derived. The mathematical foundations for these languages are reviewed. Areas of the space station where automation and robotics might be applicable are sought out. Benchmarks are designed which are functional, mathematical, relational, and expert in nature. The coding will depend on the particular versions of the languages which become available for testing.

  13. A primal-dual interior point method for large-scale free material optimization

    DEFF Research Database (Denmark)

    Weldeyesus, Alemseged Gebrehiwot; Stolpe, Mathias

    2015-01-01

    Free Material Optimization (FMO) is a branch of structural optimization in which the design variable is the elastic material tensor that is allowed to vary over the design domain. The requirements are that the material tensor is symmetric positive semidefinite with bounded trace. The resulting...... optimization problem is a nonlinear semidefinite program with many small matrix inequalities for which a special-purpose optimization method should be developed. The objective of this article is to propose an efficient primal-dual interior point method for FMO that can robustly and accurately solve large...... of iterations the interior point method requires is modest and increases only marginally with problem size. The computed optimal solutions obtain a higher precision than other available special-purpose methods for FMO. The efficiency and robustness of the method is demonstrated by numerical experiments on a set...
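
    In schematic form (conventions vary across the FMO literature), the minimum-compliance problem over element material tensors E_i reads

        \min_{E_1,\dots,E_m,\;\mathbf{u}} \; \mathbf{f}^{T}\mathbf{u}
        \quad \text{s.t.} \quad
        \mathbf{K}(E)\,\mathbf{u} = \mathbf{f}, \qquad
        E_i \succeq 0, \qquad \operatorname{tr}(E_i) \le \bar{\rho}, \quad i = 1,\dots,m,

    possibly with an additional bound on \sum_i \operatorname{tr}(E_i)\,v_i; the constraints E_i \succeq 0 are the many small matrix inequalities that motivate a special-purpose interior point method.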

  14. Benchmark selection

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Tvede, Mich

    2002-01-01

    Within a production theoretic framework, this paper considers an axiomatic approach to benchmark selection. It is shown that two simple and weak axioms, efficiency and comprehensive monotonicity, characterize a natural family of benchmarks which typically becomes unique. Further axioms are added...... in order to obtain a unique selection...

  15. Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points

    Science.gov (United States)

    Regis, Rommel G.

    2014-02-01

    This article develops two new algorithms for constrained expensive black-box optimization that use radial basis function surrogates for the objective and constraint functions. These algorithms are called COBRA and Extended ConstrLMSRBF and, unlike previous surrogate-based approaches, they can be used for high-dimensional problems where all initial points are infeasible. They both follow a two-phase approach where the first phase finds a feasible point while the second phase improves this feasible point. COBRA and Extended ConstrLMSRBF are compared with alternative methods on 20 test problems and on the MOPTA08 benchmark automotive problem (D.R. Jones, Presented at MOPTA 2008), which has 124 decision variables and 68 black-box inequality constraints. The alternatives include a sequential penalty derivative-free algorithm, a direct search method with kriging surrogates, and two multistart methods. Numerical results show that COBRA algorithms are competitive with Extended ConstrLMSRBF and they generally outperform the alternatives on the MOPTA08 problem and most of the test problems.
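
    A minimal sketch of one surrogate step in this spirit (not the COBRA or ConstrLMSRBF code; the penalty weight and the use of differential evolution on the cheap surrogate are my choices):

        import numpy as np
        from scipy.interpolate import RBFInterpolator
        from scipy.optimize import differential_evolution

        def surrogate_step(X, f_vals, g_vals, bounds):
            """Fit RBF surrogates to objective f and constraint g (g <= 0 feasible),
            then pick the next expensive evaluation point by minimizing the
            surrogate objective with a quadratic penalty on constraint violation."""
            f_hat = RBFInterpolator(X, f_vals)
            g_hat = RBFInterpolator(X, g_vals)
            def penalized(x):
                x = np.atleast_2d(x)
                return float(f_hat(x)[0] + 1e3 * max(0.0, float(g_hat(x)[0])) ** 2)
            res = differential_evolution(penalized, bounds, seed=0, maxiter=60)
            return res.x   # next point at which to run the expensive simulation

        # Toy usage: minimize (x0-1)^2 + (x1-1)^2 subject to x0 + x1 - 1 <= 0.
        rng = np.random.default_rng(0)
        X = rng.uniform(-2.0, 2.0, (30, 2))
        f_vals = np.sum((X - 1.0) ** 2, axis=1)
        g_vals = X[:, 0] + X[:, 1] - 1.0
        print(surrogate_step(X, f_vals, g_vals, bounds=[(-2.0, 2.0)] * 2))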

  16. Benchmarking school nursing practice: the North West Regional Benchmarking Group

    OpenAIRE

    Littler, Nadine; Mullen, Margaret; Beckett, Helen; Freshney, Alice; Pinder, Lynn

    2016-01-01

    It is essential that the quality of care is reviewed regularly through robust processes such as benchmarking to ensure all outcomes and resources are evidence-based so that children and young people’s needs are met effectively. This article provides an example of the use of benchmarking in school nursing practice. Benchmarking has been defined as a process for finding, adapting and applying best practices (Camp, 1994). This concept was first adopted in the 1970s ‘from industry where it was us...

  17. Using Participatory Action Research to Study the Implementation of Career Development Benchmarks at a New Zealand University

    Science.gov (United States)

    Furbish, Dale S.; Bailey, Robyn; Trought, David

    2016-01-01

    Benchmarks for career development services at tertiary institutions have been developed by Careers New Zealand. The benchmarks are intended to provide standards derived from international best practices to guide career development services. A new career development service was initiated at a large New Zealand university just after the benchmarks…

  18. Toxicological benchmarks for screening potential contaminants of concern for effects on terrestrial plants. Environmental Restoration Program

    Energy Technology Data Exchange (ETDEWEB)

    Suter, G.W. II; Will, M.E.; Evans, C.

    1993-09-01

    One of the initial stages in ecological risk assessment for hazardous waste sites is the screening of contaminants to determine which of them are worthy of further consideration as ''contaminants of potential concern.'' This process is termed ''contaminant screening.'' It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 34 chemicals potentially associated with US Department of Energy (DOE) sites. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern. The purpose of this report is to present plant toxicity data and discuss their utility as benchmarks for determining the hazard to terrestrial plants caused by contaminants in soil. Benchmarks are provided for soils and solutions.

  19. Chemical optimization algorithm for fuzzy controller design

    CERN Document Server

    Astudillo, Leslie; Castillo, Oscar

    2014-01-01

    In this book, a novel optimization method inspired by a paradigm from nature is introduced. The chemical reactions are used as a paradigm to propose an optimization method that simulates these natural processes. The proposed algorithm is described in detail and then a set of typical complex benchmark functions is used to evaluate the performance of the algorithm. Simulation results show that the proposed optimization algorithm can outperform other methods in a set of benchmark functions. This chemical reaction optimization paradigm is also applied to solve the tracking problem for the dynamic model of a unicycle mobile robot by integrating a kinematic and a torque controller based on fuzzy logic theory. Computer simulations are presented confirming that this optimization paradigm is able to outperform other optimization techniques applied to this particular robot application

  20. Towards a public, standardized, diagnostic benchmarking system for land surface models

    Directory of Open Access Journals (Sweden)

    G. Abramowitz

    2012-06-01

    Full Text Available This work examines different conceptions of land surface model benchmarking and the importance of internationally standardized evaluation experiments that specify data sets, variables, metrics and model resolutions. It additionally demonstrates how essential the definition of a priori expectations of model performance can be, based on the complexity of a model and the amount of information being provided to it, and gives an example of how these expectations might be quantified. Finally, the Protocol for the Analysis of Land Surface models (PALS is introduced – a free, online land surface model benchmarking application that is structured to meet both of these goals.

  1. A comparison of global optimization algorithms with standard benchmark functions and real-world applications using Energy Plus

    Energy Technology Data Exchange (ETDEWEB)

    Kamph, Jerome Henri; Robinson, Darren; Wetter, Michael

    2009-09-01

    There is an increasing interest in the use of computer algorithms to identify combinations of parameters which optimise the energy performance of buildings. For such problems, the objective function can be multi-modal and needs to be approximated numerically using building energy simulation programs. As these programs contain iterative solution algorithms, they introduce discontinuities in the numerical approximation to the objective function. Metaheuristics often work well for such problems, but their convergence to a global optimum cannot be established formally. Moreover, different algorithms tend to be suited to particular classes of optimization problems. To shed light on this issue we compared the performance of two metaheuristics, the hybrid CMA-ES/HDE and the hybrid PSO/HJ, in minimizing standard benchmark functions and real-world building energy optimization problems of varying complexity. From this we find that the CMA-ES/HDE performs well on more complex objective functions, but that the PSO/HJ more consistently identifies the global minimum for simpler objective functions. Both identified similar values in the objective functions arising from energy simulations, but with different combinations of model parameters. This may suggest that the objective function is multi-modal. The algorithms also correctly identified some non-intuitive parameter combinations that were caused by a simplified control sequence of the building energy system that does not represent actual practice, further reinforcing their utility.

  2. An Optimization Principle for Deriving Nonequilibrium Statistical Models of Hamiltonian Dynamics

    Science.gov (United States)

    Turkington, Bruce

    2013-08-01

    A general method for deriving closed reduced models of Hamiltonian dynamical systems is developed using techniques from optimization and statistical estimation. Given a vector of resolved variables, selected to describe the macroscopic state of the system, a family of quasi-equilibrium probability densities on phase space corresponding to the resolved variables is employed as a statistical model, and the evolution of the mean resolved vector is estimated by optimizing over paths of these densities. Specifically, a cost function is constructed to quantify the lack-of-fit to the microscopic dynamics of any feasible path of densities from the statistical model; it is an ensemble-averaged, weighted, squared-norm of the residual that results from submitting the path of densities to the Liouville equation. The path that minimizes the time integral of the cost function determines the best-fit evolution of the mean resolved vector. The closed reduced equations satisfied by the optimal path are derived by Hamilton-Jacobi theory. When expressed in terms of the macroscopic variables, these equations have the generic structure of governing equations for nonequilibrium thermodynamics. In particular, the value function for the optimization principle coincides with the dissipation potential that defines the relation between thermodynamic forces and fluxes. The adjustable closure parameters in the best-fit reduced equations depend explicitly on the arbitrary weights that enter into the lack-of-fit cost function. Two particular model reductions are outlined to illustrate the general method. In each example the set of weights in the optimization principle contracts into a single effective closure parameter.
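
    Schematically (my notation, suppressing details of the weighting operator W), a path of quasi-equilibrium densities \rho(t) is scored by the Liouville residual it leaves,

        J[\rho] = \int_{0}^{T} \Big\langle \big\| W \big( \partial_t \rho + \{\rho, H\} \big) \big\|^{2} \Big\rangle \, dt,

    and minimizing J over feasible paths of the statistical model yields the best-fit evolution of the mean resolved vector; the closure parameters in the resulting reduced equations inherit their dependence on W from this cost.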

  3. Optimization of Serum Immunoglobulin Free Light Chain Analysis for Subclassification of Cardiac Amyloidosis.

    Science.gov (United States)

    Halushka, Marc K; Eng, George; Collins, A Bernard; Judge, Daniel P; Semigran, Marc J; Stone, James R

    2015-06-01

    Accurate and rapid classification of cardiac amyloidosis is important for patient management. We have optimized the use of serum free light chain kappa and lambda values to differentiate immunoglobulin light chain amyloid (AL) amyloidosis from transthyretin amyloid and amyloid A using 85 cases of tissue-proven cardiac amyloidosis, in which there was direct classification of amyloidosis by mass spectrometry or immunofluorescence. The serum free light chain kappa/lambda ratios were non-overlapping for the three major groups: AL-lambda (0.01-0.41, n = 30), non-AL (0.52-2.7, n = 43), and AL-kappa (6.7-967, n = 12). A kappa/lambda ratio value between 0.5 and 5.0 had 100 % sensitivity and 100 % specificity for distinguishing AL amyloidosis from non-AL amyloidosis. This optimized range for serum light chain kappa/lambda ratio provides extremely robust classification of cardiac amyloidosis. Cases of cardiac amyloidosis in which the serum kappa/lambda free light chain ratio falls close to these new cutoff values may benefit most from direct amyloid subtyping.
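
    The resulting decision rule is simple enough to state as code; a sketch using the cutoffs reported above (illustrative only, not a clinical tool):

        def classify_flc_ratio(kappa, lam):
            """Subclassify cardiac amyloidosis from the serum free light chain
            kappa/lambda ratio using the optimized cutoffs of 0.5 and 5.0."""
            ratio = kappa / lam
            if ratio < 0.5:
                return "AL-lambda"
            if ratio <= 5.0:
                return "non-AL (e.g., transthyretin or amyloid A)"
            return "AL-kappa"

        print(classify_flc_ratio(1.8, 1.5))   # ratio 1.2 -> non-AL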

  4. Interactive benchmarking

    DEFF Research Database (Denmark)

    Lawson, Lartey; Nielsen, Kurt

    2005-01-01

    We discuss individual learning by interactive benchmarking using stochastic frontier models. The interactions allow the user to tailor the performance evaluation to preferences and explore alternative improvement strategies by selecting and searching the different frontiers using directional...... in the suggested benchmarking tool. The study investigates how different characteristics of dairy farms influence technical efficiency.

  5. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  6. First and second order derivatives for optimizing parallel RF excitation waveforms.

    Science.gov (United States)

    Majewski, Kurt; Ritter, Dieter

    2015-09-01

    For piecewise constant magnetic fields, the Bloch equations (without relaxation terms) can be solved explicitly. This way the magnetization created by an excitation pulse can be written as a concatenation of rotations applied to the initial magnetization. For fixed gradient trajectories, the problem of finding parallel RF waveforms, which minimize the difference between achieved and desired magnetization on a number of voxels, can thus be represented as a finite-dimensional minimization problem. We use quaternion calculus to formulate this optimization problem in the magnitude least squares variant and specify first and second order derivatives of the objective function. We obtain a small tip angle approximation as first order Taylor development from the first order derivatives and also develop algorithms for first and second order derivatives for this small tip angle approximation. All algorithms are accompanied by precise floating point operation counts to assess and compare the computational efforts. We have implemented these algorithms as callback functions of an interior-point solver. We have applied this numerical optimization method to example problems from the literature and report key observations. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Lipid-derived free radical production in superantigen-induced interstitial pneumonia

    Science.gov (United States)

    Miyakawa, Hisako; Mason, Ronald P.; Jiang, JinJie; Kadiiska, Maria B.

    2009-01-01

    We studied the free radical generation involved in the development of interstitial pneumonia (IP) in an animal model of autoimmune disease. We observed an electron spin resonance (ESR) spectrum of α-(4-pyridyl-1-oxide)-N-tert-butylnitrone (POBN) radical adducts detected in the lipid extract of lungs in autoimmune-prone mice after intratracheal instillation of staphylococcal enterotoxin B. The POBN adducts detected by ESR were paralleled by infiltration of macrophages and neutrophils in the bronchoalveolar lavage fluid. To further investigate the mechanism of free radical generation, mice were pretreated with the macrophage toxicant gadolinium chloride, which significantly suppressed the radical generation. Free radical generation was also decreased by pretreatment with the xanthine oxidase (XO) inhibitor allopurinol, the iron chelator Desferal, and the inducible nitric oxide synthase (iNOS) inhibitor 1400W. Histopathologically, these drugs significantly reduced both the cell infiltration to alveolar septal walls and the synthesis of pulmonary collagen fibers. Experiments with NADPH oxidase knockout mice showed that NADPH oxidase did not contribute to lipid radical generation. These results suggest that lipid-derived carbon-centered free radical production is important in the manifestation of IP and that a macrophage toxicant, an XO inhibitor, an iron chelator, and an iNOS inhibitor protect against both radical generation and the manifestation of IP. PMID:19376221

  8. The KMAT: Benchmarking Knowledge Management.

    Science.gov (United States)

    de Jager, Martha

    Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…

  9. Application of the docking program SOL for CSAR benchmark.

    Science.gov (United States)

    Sulimov, Alexey V; Kutov, Danil C; Oferkin, Igor V; Katkova, Ekaterina V; Sulimov, Vladimir B

    2013-08-26

    This paper is devoted to results obtained by the docking program SOL and the post-processing program DISCORE at the CSAR benchmark, and describes both programs. SOL is an original docking program developed on the basis of a genetic algorithm, the MMFF94 force field, a rigid protein, and a precalculated energy grid including desolvation in the frame of a simplified GB model, vdW, and electrostatic interactions, while taking into account the ligand internal strain energy. An important feature of SOL is its single- or multi-processor performance, scaling up to hundreds of CPUs. DISCORE improves the binding energy scoring by local energy optimization of the ligand's docked pose and a simple linear regression on the basis of available experimental data. The docking program SOL demonstrated a good ability to position ligands correctly in the active sites of the tested proteins in most of the CSAR exercises. SOL and DISCORE did not demonstrate very exciting results for protein-ligand binding free energy estimation. Nevertheless, for some target proteins, SOL and DISCORE were among the first in predicting inhibition activity. Ways to improve SOL and DISCORE are discussed.
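
    A toy version of the regression-based rescoring step (not the actual DISCORE program; the scores and affinities below are hypothetical): fit a linear correction from docking scores to experimental binding free energies, then apply it to a new pose.

        import numpy as np

        scores = np.array([-45.2, -38.9, -52.1, -41.0])  # hypothetical SOL scores
        exper = np.array([-8.1, -6.9, -9.4, -7.2])       # measured dG, kcal/mol
        A = np.vstack([scores, np.ones_like(scores)]).T
        slope, intercept = np.linalg.lstsq(A, exper, rcond=None)[0]
        print(slope * -47.0 + intercept)                 # rescored new docking score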

  10. Benchmarking in Mobarakeh Steel Company

    OpenAIRE

    Sasan Ghasemi; Mohammad Nazemi; Mehran Nejati

    2008-01-01

    Benchmarking is considered as one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how th...

  11. OECD/NEA benchmark for time-dependent neutron transport calculations without spatial homogenization

    Energy Technology Data Exchange (ETDEWEB)

    Hou, Jason, E-mail: jason.hou@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Ivanov, Kostadin N. [Department of Nuclear Engineering, North Carolina State University, Raleigh, NC 27695 (United States); Boyarinov, Victor F.; Fomichenko, Peter A. [National Research Centre “Kurchatov Institute”, Kurchatov Sq. 1, Moscow (Russian Federation)

    2017-06-15

    Highlights: • A time-dependent homogenization-free neutron transport benchmark was created. • The first phase, known as the kinetics phase, was described in this work. • Preliminary results for selected 2-D transient exercises were presented. - Abstract: A Nuclear Energy Agency (NEA), Organization for Economic Co-operation and Development (OECD) benchmark for the time-dependent neutron transport calculations without spatial homogenization has been established in order to facilitate the development and assessment of numerical methods for solving the space-time neutron kinetics equations. The benchmark has been named the OECD/NEA C5G7-TD benchmark, and later extended with three consecutive phases each corresponding to one modelling stage of the multi-physics transient analysis of the nuclear reactor core. This paper provides a detailed introduction of the benchmark specification of Phase I, known as the “kinetics phase”, including the geometry description, supporting neutron transport data, transient scenarios in both two-dimensional (2-D) and three-dimensional (3-D) configurations, as well as the expected output parameters from the participants. Also presented are the preliminary results for the initial state 2-D core and selected transient exercises that have been obtained using the Monte Carlo method and the Surface Harmonic Method (SHM), respectively.

  12. Toxicological benchmarks for potential contaminants of concern for effects on soil and litter invertebrates and heterotrophic process

    Energy Technology Data Exchange (ETDEWEB)

    Will, M.E.; Suter, G.W. II

    1995-09-01

    An important step in ecological risk assessments is screening the chemicals occurring on a site for contaminants of potential concern. Screening may be accomplished by comparing reported ambient concentrations to a set of toxicological benchmarks. Multiple endpoints for assessing risks posed by soil-borne contaminants to organisms directly impacted by them have been established. This report presents benchmarks for soil invertebrates and microbial processes and addresses only chemicals found at United States Department of Energy (DOE) sites. No benchmarks for pesticides are presented. After discussing methods, this report presents the results of the literature review and benchmark derivation for toxicity to earthworms (Sect. 3), heterotrophic microbes and their processes (Sect. 4), and other invertebrates (Sect. 5). The final sections compare the benchmarks to other criteria and background and draw conclusions concerning the utility of the benchmarks.

  13. Recommendations for Benchmarking Preclinical Studies of Nanomedicines.

    Science.gov (United States)

    Dawidczyk, Charlene M; Russell, Luisa M; Searson, Peter C

    2015-10-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small-molecule drug therapy for cancer and to achieve both therapeutic and diagnostic functions in the same platform. Preclinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of preclinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of preclinical trials and propose a protocol for benchmarking that we recommend be included in in vivo preclinical studies of drug-delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. ©2015 American Association for Cancer Research.

  14. An Improved Real-Coded Population-Based Extremal Optimization Method for Continuous Unconstrained Optimization Problems

    Directory of Open Access Journals (Sweden)

    Guo-Qiang Zeng

    2014-01-01

    As a novel evolutionary optimization method, extremal optimization (EO) has been successfully applied to a variety of combinatorial optimization problems. However, applications of EO to continuous optimization problems are relatively rare. This paper proposes an improved real-coded population-based EO method (IRPEO) for continuous unconstrained optimization problems. The key operations of IRPEO include generation of a real-coded random initial population, evaluation of individual and population fitness, selection of bad elements according to a power-law probability distribution, generation of a new population based on uniform random mutation, and updating of the population by accepting the new population unconditionally. Experimental results on 10 benchmark test functions with dimension N=30 have shown that IRPEO is competitive with, or even better than, recently reported genetic algorithm (GA) variants with different mutation operations in terms of simplicity, effectiveness, and efficiency. Furthermore, the superiority of IRPEO over other evolutionary algorithms such as the original population-based EO, particle swarm optimization (PSO), and the hybrid PSO-EO is also demonstrated by experimental results on some benchmark functions.
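
    A quick sketch may help make the key operations concrete. The following is a minimal, illustrative Python implementation of the population-based EO loop as the abstract describes it — not the authors' code; the sphere objective, population size, power-law exponent tau and mutation width are assumptions chosen for demonstration.

    ```python
    import numpy as np

    def sphere(x):
        """Benchmark objective f(x) = sum(x_i^2); global minimum 0 at the origin."""
        return np.sum(x**2, axis=-1)

    def irpeo_sketch(f=sphere, dim=30, pop=20, tau=1.5, iters=5000,
                     lo=-100.0, hi=100.0, step=0.1, seed=0):
        rng = np.random.default_rng(seed)
        X = rng.uniform(lo, hi, size=(pop, dim))  # real-coded random initial population
        best_x, best_f = None, np.inf
        for _ in range(iters):
            fit = f(X)                            # evaluate individual fitness
            i = int(np.argmin(fit))
            if fit[i] < best_f:
                best_f, best_x = fit[i], X[i].copy()
            # Rank individuals worst-first and pick a "bad" one with
            # power-law probability P(k) ~ k^(-tau), k = 1 for the worst.
            order = np.argsort(fit)[::-1]
            p = np.arange(1, pop + 1, dtype=float) ** -tau
            p /= p.sum()
            bad = order[rng.choice(pop, p=p)]
            # Uniform random mutation of the selected individual; the new
            # population is accepted unconditionally (the hallmark of EO).
            X[bad] = np.clip(X[bad] + rng.uniform(-1, 1, dim) * step * (hi - lo),
                             lo, hi)
        return best_x, best_f

    x_star, f_star = irpeo_sketch()
    print("best objective found:", f_star)
    ```

    Note the absence of any acceptance test or selection pressure beyond the power-law replacement of poor solutions — that design choice is what distinguishes EO from GA- or PSO-style methods.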

  15. How Activists Use Benchmarks

    DEFF Research Database (Denmark)

    Seabrooke, Leonard; Wigan, Duncan

    2015-01-01

    Non-governmental organisations use benchmarks as a form of symbolic violence to place political pressure on firms, states, and international organisations. The development of benchmarks requires three elements: (1) salience, that the community of concern is aware of the issue and views … are put to the test. The first is a reformist benchmarking cycle where organisations defer to experts to create a benchmark that conforms with the broader system of politico-economic norms. The second is a revolutionary benchmarking cycle driven by expert-activists that seek to contest strong vested interests and challenge established politico-economic norms. Differentiating these cycles provides insights into how activists work through organisations and with expert networks, as well as how campaigns on complex economic issues can be mounted and sustained.

  16. Benchmarking in Mobarakeh Steel Company

    Directory of Open Access Journals (Sweden)

    Sasan Ghasemi

    2008-05-01

    Benchmarking is considered one of the most effective ways of improving performance in companies. Although benchmarking in business organizations is a relatively new concept and practice, it has rapidly gained acceptance worldwide. This paper introduces the benchmarking project conducted in Esfahan's Mobarakeh Steel Company, as the first systematic benchmarking project conducted in Iran. It aims to share the process deployed for the benchmarking project in this company and illustrate how the project's systematic implementation led to success.

  17. Aquatic Life Benchmarks

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Aquatic Life Benchmarks is an EPA-developed set of criteria for freshwater species. These benchmarks are based on toxicity values reviewed by EPA and used in the...

  18. Regulatory Benchmarking

    DEFF Research Database (Denmark)

    Agrell, Per J.; Bogetoft, Peter

    2017-01-01

    Benchmarking methods, and in particular Data Envelopment Analysis (DEA), have become well-established and informative tools for economic regulation. DEA is now routinely used by European regulators to set reasonable revenue caps for energy transmission and distribution system operators. The application of benchmarking in regulation, however, requires specific steps in terms of data validation, model specification and outlier detection that are not systematically documented in open publications, leading to discussions about regulatory stability and economic feasibility of these techniques…

  20. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    Science.gov (United States)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-03-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data-space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper we first use massive asymptotically optimal data compression to reduce the dimensionality of the data-space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parameterized model for the joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate Density Estimation Likelihood-Free Inference with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological datasets.
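
    For context, a standard way to achieve this "one number per parameter" compression — stated here as general background (a score/MOPED-type construction), not necessarily the paper's exact estimator — is, for a Gaussian likelihood with mean \(\mu(\theta)\) and fixed covariance \(C\), to evaluate the score at a fiducial point \(\theta_*\):

    \[
    t \;=\; \nabla_\theta \mu^{\mathsf T}\, C^{-1}\,\big(d - \mu\big)\Big|_{\theta_*},
    \]

    which has exactly one component per parameter and, to first order in \(\theta - \theta_*\), preserves the Fisher information carried by the full data vector \(d\).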

  1. Deriving the Normalized Min-Sum Algorithm from Cooperative Optimization

    OpenAIRE

    Huang, Xiaofei

    2006-01-01

    The normalized min-sum algorithm can achieve near-optimal performance at decoding LDPC codes. However, it is a critical question to understand the mathematical principle underlying the algorithm. Traditionally, people thought that the normalized min-sum algorithm is a good approximation to the sum-product algorithm, the best known algorithm for decoding LDPC codes and Turbo codes. This paper offers an alternative approach to understand the normalized min-sum algorithm. The algorithm is derive...
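
    As generic background to the algorithm under discussion — a textbook check-node update for normalized min-sum, not the cooperative-optimization derivation the paper develops — the rule can be written in a few lines of Python; the scaling factor alpha = 0.8 is an illustrative value.

    ```python
    import numpy as np

    def check_node_update(msgs, alpha=0.8):
        """Normalized min-sum check-node update.

        msgs: nonzero incoming variable-to-check LLRs for one check node.
        Returns outgoing check-to-variable LLRs: for each edge, the product
        of the signs of the *other* edges times the minimum magnitude of the
        other edges, scaled by alpha to correct min-sum's overestimate
        relative to the sum-product rule."""
        msgs = np.asarray(msgs, dtype=float)
        sgn, mag = np.sign(msgs), np.abs(msgs)
        total_sign = np.prod(sgn)
        i1 = int(np.argmin(mag))                                  # smallest magnitude
        m1 = mag[i1]
        m2 = np.min(np.delete(mag, i1)) if len(mag) > 1 else m1   # second smallest
        out = np.empty_like(mag)
        for j in range(len(mag)):
            other_min = m2 if j == i1 else m1
            out[j] = alpha * total_sign * sgn[j] * other_min
        return out

    print(check_node_update([1.2, -0.4, 3.0, -2.1]))
    ```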

  2. Benchmark neutron porosity log calculations

    International Nuclear Information System (INIS)

    Little, R.C.; Michael, M.; Verghese, K.; Gardner, R.P.

    1989-01-01

    Calculations have been made for a benchmark neutron porosity log problem with the general purpose Monte Carlo code MCNP and the specific purpose Monte Carlo code McDNL. For accuracy and timing comparison purposes the CRAY XMP and MicroVax II computers have been used with these codes. The CRAY has been used for an analog version of the MCNP code while the MicroVax II has been used for the optimized variance reduction versions of both codes. Results indicate that the two codes give the same results within calculated standard deviations. Comparisons are given and discussed for accuracy (precision) and computation times for the two codes

  3. Serum-free media formulations are cell line-specific and require optimization for microcarrier culture.

    Science.gov (United States)

    Tan, Kah Yong; Teo, Kim Leng; Lim, Jessica F Y; Chen, Allen K L; Choolani, Mahesh; Reuveny, Shaul; Chan, Jerry; Oh, Steve Kw

    2015-08-01

    Mesenchymal stromal cells (MSCs) are being investigated as potential cell therapies for many different indications. Current methods of production rely on traditional monolayer culture on tissue-culture plastic, usually with the use of serum-supplemented growth media. However, the monolayer culturing system has scale-up limitations and may not meet the projected hundreds of billions to trillions of cells needed for therapy. Furthermore, serum-free medium offers several advantages over serum-supplemented medium, which may have supply and contaminant issues; this has led to many serum-free medium formulations being developed. We cultured seven MSC lines in six different serum-free media and compared their growth between monolayer and microcarrier culture. We show that (i) expansion levels of MSCs in serum-free monolayer cultures may not correlate with expansion in serum-containing media; (ii) optimal culture conditions (serum-free media for monolayer or microcarrier culture) differ for each cell line; (iii) growth in static microcarrier culture does not correlate with growth in stirred spinner culture; and (iv) early cell attachment and spreading onto microcarriers does not necessarily predict the efficiency of cell expansion in agitated microcarrier culture. Current serum-free media developed for monolayer cultures of MSCs may not support MSC proliferation in microcarrier cultures. Further optimization of medium composition will be required for microcarrier suspension culture for each cell line. Copyright © 2015 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  4. Benchmarking and the laboratory

    Science.gov (United States)

    Galloway, M; Nadin, L

    2001-01-01

    This article describes how benchmarking can be used to assess laboratory performance. Two benchmarking schemes are reviewed, the Clinical Benchmarking Company's Pathology Report and the College of American Pathologists' Q-Probes scheme. The Clinical Benchmarking Company's Pathology Report is undertaken by staff based in the clinical management unit, Keele University with appropriate input from the professional organisations within pathology. Five annual reports have now been completed. Each report is a detailed analysis of 10 areas of laboratory performance. In this review, particular attention is focused on the areas of quality, productivity, variation in clinical practice, skill mix, and working hours. The Q-Probes scheme is part of the College of American Pathologists programme in studies of quality assurance. The Q-Probes scheme and its applicability to pathology in the UK is illustrated by reviewing two recent Q-Probe studies: routine outpatient test turnaround time and outpatient test order accuracy. The Q-Probes scheme is somewhat limited by the small number of UK laboratories that have participated. In conclusion, as a result of the government's policy in the UK, benchmarking is here to stay. Benchmarking schemes described in this article are one way in which pathologists can demonstrate that they are providing a cost-effective and high-quality service. PMID:11477112

  5. Benchmarking for Higher Education.

    Science.gov (United States)

    Jackson, Norman, Ed.; Lund, Helen, Ed.

    The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…

  6. Deriving consensus rankings via multicriteria decision making methodology

    OpenAIRE

    Amy Poh Ai Ling; Mohamad Nasir Saludin; Masao Mukaidono

    2012-01-01

    Purpose - This paper seeks to take a cautionary stance to the impact of the marketing mix on customer satisfaction, via a case study deriving consensus rankings for benchmarking on selected retail stores in Malaysia. Design/methodology/approach - The ELECTRE I model is used in deriving consensus rankings via a multicriteria decision making method for benchmarking based on the marketing mix model (4Ps). Descriptive analysis is used to analyze best practice among the four marketing tactics. Finding...

  7. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  8. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  9. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking applications. The present study analyses voluntary benchmarking in a public setting that is oriented towards learning. The study contributes by showing how benchmarking can be mobilised for learning and offers evidence of the effects of such benchmarking for performance outcomes. It concludes that benchmarking can enable learning in public settings but that this requires actors to invest in ensuring that benchmark data are directed towards improvement.

  10. Benchmark job – Watch out!

    CERN Multimedia

    Staff Association

    2017-01-01

    On 12 December 2016, in Echo No. 259, we already discussed at length the MERIT and benchmark jobs. Still, we find that a couple of issues warrant further discussion. Benchmark job – administrative decision on 1 July 2017 On 12 January 2017, the HR Department informed all staff members of a change to the effective date of the administrative decision regarding benchmark jobs. The benchmark job title of each staff member will be confirmed on 1 July 2017, instead of 1 May 2017 as originally announced in HR’s letter on 18 August 2016. Postponing the administrative decision by two months will leave a little more time to address the issues related to incorrect placement in a benchmark job. Benchmark job – discuss with your supervisor, at the latest during the MERIT interview In order to rectify an incorrect placement in a benchmark job, it is essential that the supervisor and the supervisee go over the assigned benchmark job together. In most cases, this placement has been done autom...

  11. Distributed Multi-Commodity Network Flow Algorithm for Energy Optimal Routing in Wireless Sensor Networks.

    Directory of Open Access Journals (Sweden)

    J. Trdlicka

    2010-12-01

    Full Text Available This work proposes a distributed algorithm for energy optimal routing in a wireless sensor network. The routing problem is described as a mathematical problem by the minimum-cost multi-commodity network flow problem. Due to the separability of the problem, we use the duality theorem to derive the distributed algorithm. The algorithm computes the energy optimal routing in the network without any central node or knowledge of the whole network structure. Each node only needs to know the flow which is supposed to send or receive and the costs and capacities of the neighboring links. An evaluation of the presented algorithm on benchmarks for the energy optimal data flow routing in sensor networks with up to 100 nodes is presented.
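
    The duality-based derivation mentioned above can be summarized generically; the notation below is a standard Lagrangian-relaxation sketch of minimum-cost multi-commodity flow, not necessarily the authors' exact formulation. With flow \(x_l^k\) of commodity \(k\) on link \(l\), link costs \(c_l\), capacities \(u_l\) and per-commodity conservation constraints \(A x^k = b^k\),

    \[
    \min_{x \ge 0} \; \sum_k \sum_l c_l\, x_l^k
    \quad \text{s.t.} \quad A x^k = b^k \;\; \forall k, \qquad \sum_k x_l^k \le u_l \;\; \forall l,
    \]

    relaxing the coupling capacity constraints with prices \(\lambda_l \ge 0\) makes the problem separable per commodity (each subproblem is a min-cost routing under modified link costs \(c_l + \lambda_l\)), and each link price can be updated locally by projected subgradient:

    \[
    \lambda_l^{(t+1)} \;=\; \Big[\lambda_l^{(t)} + \alpha_t \Big(\textstyle\sum_k x_l^k\big(\lambda^{(t)}\big) - u_l\Big)\Big]_+ .
    \]

    Since each update needs only the flows and capacity of the local link, no central node is required — consistent with the claim in the abstract.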

  12. Design optimization of structural parameters for highly sensitive photonic crystal label-free biosensors.

    Science.gov (United States)

    Ju, Jonghyun; Han, Yun-ah; Kim, Seok-min

    2013-03-07

    The effects of structural design parameters on the performance of nano-replicated photonic crystal (PC) label-free biosensors were examined by the analysis of simulated reflection spectra of PC structures. The grating pitch, duty, scaled grating height and scaled TiO2 layer thickness were selected as the design factors to optimize the PC structure. The peak wavelength value (PWV), full width at half maximum of the peak, figure of merit (FOM) for the bulk and surface sensitivities, and surface/bulk sensitivity ratio were also selected as the responses to optimize the PC label-free biosensor performance. A parametric study showed that the grating pitch was the dominant factor for PWV, and that it had low interaction effects with the other scaled design factors. Therefore, we can isolate the effect of grating pitch using scaled design factors. For the design of a PC label-free biosensor, one should consider that: (1) the PWV can be measured by the reflection peak measurement instruments, (2) the grating pitch and duty can be manufactured using conventional lithography systems, and (3) the optimum design is less sensitive to the grating height and TiO2 layer thickness variations in the fabrication process. In this paper, we suggest a design guide for a highly sensitive PC biosensor in which one selects the grating pitch and duty based on the limitations of the lithography and measurement systems, and conducts a multi-objective optimization of the grating height and TiO2 layer thickness for maximizing performance and minimizing the influence of parameter variation. Through multi-objective optimization of a PC structure with a fixed grating pitch of 550 nm and a duty of 50%, we obtained a surface FOM of 66.18 RIU-1 and an S/B ratio of 34.8%, with a grating height of 117 nm and a TiO2 height of 210 nm.

  13. Design Optimization of Structural Parameters for Highly Sensitive Photonic Crystal Label-Free Biosensors

    Directory of Open Access Journals (Sweden)

    Yun-ah Han

    2013-03-01

    The effects of structural design parameters on the performance of nano-replicated photonic crystal (PC) label-free biosensors were examined by the analysis of simulated reflection spectra of PC structures. The grating pitch, duty, scaled grating height and scaled TiO2 layer thickness were selected as the design factors to optimize the PC structure. The peak wavelength value (PWV), full width at half maximum of the peak, figure of merit (FOM) for the bulk and surface sensitivities, and surface/bulk sensitivity ratio were also selected as the responses to optimize the PC label-free biosensor performance. A parametric study showed that the grating pitch was the dominant factor for PWV, and that it had low interaction effects with the other scaled design factors. Therefore, we can isolate the effect of grating pitch using scaled design factors. For the design of a PC label-free biosensor, one should consider that: (1) the PWV can be measured by the reflection peak measurement instruments, (2) the grating pitch and duty can be manufactured using conventional lithography systems, and (3) the optimum design is less sensitive to the grating height and TiO2 layer thickness variations in the fabrication process. In this paper, we suggest a design guide for a highly sensitive PC biosensor in which one selects the grating pitch and duty based on the limitations of the lithography and measurement systems, and conducts a multi-objective optimization of the grating height and TiO2 layer thickness for maximizing performance and minimizing the influence of parameter variation. Through multi-objective optimization of a PC structure with a fixed grating pitch of 550 nm and a duty of 50%, we obtained a surface FOM of 66.18 RIU−1 and an S/B ratio of 34.8%, with a grating height of 117 nm and a TiO2 height of 210 nm.

  14. Proportional–Integral–Derivative (PID Controller Tuning using Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    J. S. Bassi

    2012-08-01

    The proportional-integral-derivative (PID) controllers are the most popular controllers used in industry because of their remarkable effectiveness, simplicity of implementation and broad applicability. However, manual tuning of these controllers is time-consuming and tedious, and generally leads to poor performance. This tuning, which is application specific, also deteriorates with time as a result of plant parameter changes. This paper presents an artificial intelligence (AI) method, the particle swarm optimization (PSO) algorithm, for tuning the optimal proportional-integral-derivative (PID) controller parameters for industrial processes. This approach has superior features, including easy implementation, stable convergence characteristics and good computational efficiency, over conventional methods. The Ziegler-Nichols tuning method was applied to the PID tuning, and the results were compared with the PSO-based PID for optimum control. Simulation results are presented to show that the PSO-based optimized PID controller is capable of providing improved closed-loop performance over the Ziegler-Nichols tuned PID controller parameters. Compared to the heuristic Ziegler-Nichols PID tuning method, the proposed method was more efficient in improving step response characteristics such as reducing the steady-state error, rise time, settling time and maximum overshoot in the speed control of a DC motor.
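
    To make the tuning loop concrete, here is a minimal, self-contained Python sketch — not the paper's code — of a PSO searching (Kp, Ki, Kd) to minimize an ITAE cost on a first-order, DC-motor-like plant. The plant parameters, search bounds and PSO coefficients are illustrative assumptions.

    ```python
    import numpy as np

    def itae_cost(gains, T=2.0, dt=1e-3):
        """ITAE of a unit-step response for a PID loop around an assumed
        first-order plant G(s) = K/(tau*s + 1), integrated with Euler steps."""
        kp, ki, kd = gains
        K, tau = 1.0, 0.1                    # assumed plant parameters
        y, integ, prev_e, cost = 0.0, 0.0, 1.0, 0.0
        for k in range(int(T / dt)):
            e = 1.0 - y                      # unit-step reference
            integ += e * dt
            deriv = (e - prev_e) / dt
            prev_e = e
            u = kp * e + ki * integ + kd * deriv
            y += dt * (K * u - y) / tau      # plant state update
            cost += (k * dt) * abs(e) * dt   # integral of t*|e|
        return cost

    def pso(cost, lo, hi, n=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=1):
        rng = np.random.default_rng(seed)
        x = rng.uniform(lo, hi, size=(n, len(lo)))
        v = np.zeros_like(x)
        pbest, pcost = x.copy(), np.array([cost(xi) for xi in x])
        g = pbest[np.argmin(pcost)].copy()
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)
            c = np.array([cost(xi) for xi in x])
            better = c < pcost
            pbest[better], pcost[better] = x[better], c[better]
            g = pbest[np.argmin(pcost)].copy()
        return g, pcost.min()

    gains, best = pso(itae_cost, lo=np.array([0.0, 0.0, 0.0]),
                      hi=np.array([20.0, 50.0, 1.0]))
    print("tuned (Kp, Ki, Kd):", gains, "| ITAE:", best)
    ```

    The ITAE weighting (time multiplied by absolute error) penalizes slowly decaying errors, which is why it tends to produce well-damped responses with short settling times.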

  15. Benchmarking reference services: an introduction.

    Science.gov (United States)

    Marshall, J G; Buchanan, H S

    1995-01-01

    Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.

  16. Design optimization and transverse coherence analysis for an x-ray free electron laser driven by SLAC LINAC

    International Nuclear Information System (INIS)

    Xie, M.

    1995-01-01

    I present a design study for an X-ray Free Electron Laser driven by the SLAC linac, the Linac Coherent Light Source (LCLS). The study assumes the LCLS is based on Self-Amplified Spontaneous Emission (SASE). Following a brief review of the fundamentals of SASE, I will provide without derivation a collection of formulas relating SASE performance to the system parameters. These formulas allow quick evaluation of FEL designs and provide powerful tools for optimization in multi-dimensional parameter space. Optimization is carried out for the LCLS over all independent system parameters modeled, subject to a number of practical constraints. In addition to the optimizations concerning gain and power, another important consideration for a single-pass FEL starting from noise is the transverse coherence property of the amplified radiation, especially at short wavelength. A widely used emittance criterion for FELs requires that the emittance be smaller than the radiation wavelength divided by 4π. For the LCLS this criterion is violated by a factor of 5, at a normalized emittance of 1.5 mm-mrad, a wavelength of 1.5 angstrom, and a beam energy of 15 GeV. Thus it is important to check quantitatively the effect of emittance on the transverse coherence. I will examine this effect by analyzing different transverse modes and show that full transverse coherence can be obtained even in the LCLS parameter regime.
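
    The quoted "factor of 5" can be verified directly; this small worked check uses only the numbers given in the abstract and is added here for the reader, not taken from the paper:

    \[
    \gamma = \frac{15\ \text{GeV}}{0.511\ \text{MeV}} \approx 2.9\times10^{4}, \qquad
    \varepsilon = \frac{\varepsilon_n}{\gamma} = \frac{1.5\times10^{-6}\ \text{m·rad}}{2.9\times10^{4}} \approx 5.1\times10^{-11}\ \text{m},
    \]
    \[
    \frac{\lambda}{4\pi} = \frac{1.5\times10^{-10}\ \text{m}}{4\pi} \approx 1.2\times10^{-11}\ \text{m},
    \qquad \frac{\varepsilon}{\lambda/4\pi} \approx 4.3,
    \]

    i.e., the geometric emittance exceeds \(\lambda/4\pi\) by roughly a factor of 5, as stated.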

  17. The Global Benchmarking as a Method of Countering the Intellectual Migration in Ukraine

    Directory of Open Access Journals (Sweden)

    Striy Lyubov A.

    2017-05-01

    The publication is aimed at studying global benchmarking as a method of countering intellectual migration in Ukraine. The article explores the process of intellectual migration in Ukraine; analyzes the current status of the country in the light of the crisis and the problems that have arisen; provides statistical data on the migration process and determines a method of countering it; considers types of benchmarking; analyzes the benchmarking method as a way of achieving the objective; and identifies the benefits to be derived from this method, as well as the «bottlenecks» in the State process of regulating migratory flows, not only to call attention to them but also to take corrective actions.

  18. Energy benchmarking in wastewater treatment plants: the importance of site operation and layout.

    Science.gov (United States)

    Belloir, C; Stanford, C; Soares, A

    2015-01-01

    Energy benchmarking is a powerful tool in the optimization of wastewater treatment plants (WWTPs), helping to reduce costs and greenhouse gas emissions. Traditionally, energy benchmarking methods focused solely on reporting electricity consumption; however, recent developments in this area have led to the inclusion of other types of energy, including electrical, manual, chemical and mechanical consumption, all of which can be expressed in kWh/m3. In this study, two full-scale WWTPs were benchmarked; both incorporated preliminary, secondary (oxidation ditch) and tertiary treatment processes, and Site 1 also had an additional primary treatment step. The results indicated that Site 1 required 2.32 kWh/m3 against 0.98 kWh/m3 for Site 2. Aeration presented the highest energy consumption for both sites, with 2.08 kWh/m3 required for Site 1 and 0.91 kWh/m3 in Site 2. Mechanical energy represented the second biggest consumption for Site 1 (9%, 0.212 kWh/m3), and chemical input was significant in Site 2 (4.1%, 0.026 kWh/m3). The analysis of the results indicated that Site 2 could be optimized by constructing a primary settling tank that would reduce the biochemical oxygen demand, total suspended solids and NH4 loads to the oxidation ditch by 55%, 75% and 12%, respectively, and at the same time reduce the aeration requirements by 49%. This study demonstrated the effectiveness of the energy benchmarking exercise in identifying the highest energy-consuming assets; nevertheless, it points out the need to develop a holistic overview of the WWTP and to include parameters such as effluent quality, site operation and plant layout to allow adequate benchmarking.

  19. A hybrid bird mating optimizer algorithm with teaching-learning-based optimization for global numerical optimization

    Directory of Open Access Journals (Sweden)

    Qingyang Zhang

    2015-02-01

    Bird Mating Optimizer (BMO) is a novel meta-heuristic optimization algorithm inspired by the intelligent mating behavior of birds. However, it is still deficient in convergence speed and solution quality. To overcome these drawbacks, this paper proposes a hybrid algorithm (TLBMO), which is established by combining the advantages of Teaching-learning-based optimization (TLBO) and Bird Mating Optimizer (BMO). The performance of TLBMO is evaluated on 23 benchmark functions and compared with seven state-of-the-art approaches, namely BMO, TLBO, Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), Fast Evolution Programming (FEP), Differential Evolution (DE), and Group Search Optimization (GSO). Experimental results indicate that the proposed method performs better than the other existing algorithms for global numerical optimization.

  20. Internet based benchmarking

    DEFF Research Database (Denmark)

    Bogetoft, Peter; Nielsen, Kurt

    2005-01-01

    We discuss the design of interactive, internet-based benchmarking using parametric (statistical) as well as nonparametric (DEA) models. The user receives benchmarks and improvement potentials. The user is also given the possibility to search different efficiency frontiers and thereby to explore...

  1. Toxicological Benchmarks for Wildlife

    Energy Technology Data Exchange (ETDEWEB)

    Sample, B.E.; Opresko, D.M.; Suter, G.W.

    1993-01-01

    Ecological risks of environmental contaminants are evaluated by using a two-tiered process. In the first tier, a screening assessment is performed where concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals (i.e., concentrations presumed to be nonhazardous to the biota) in environmental media (water, sediment, soil, food, etc.). While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessment of effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) or 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, and red

  2. Employing Sensitivity Derivatives for Robust Optimization under Uncertainty in CFD

    Science.gov (United States)

    Newman, Perry A.; Putko, Michele M.; Taylor, Arthur C., III

    2004-01-01

    A robust optimization is demonstrated on a two-dimensional inviscid airfoil problem in subsonic flow. Given uncertainties in statistically independent, random, normally distributed flow parameters (input variables), an approximate first-order statistical moment method is employed to represent the Computational Fluid Dynamics (CFD) code outputs as expected values with variances. These output quantities are used to form the objective function and constraints. The constraints are cast in probabilistic terms; that is, the probability that a constraint is satisfied is greater than or equal to some desired target probability. Gradient-based robust optimization of this stochastic problem is accomplished through use of both first and second-order sensitivity derivatives. For each robust optimization, the effect of increasing both input standard deviations and target probability of constraint satisfaction are demonstrated. This method provides a means for incorporating uncertainty when considering small deviations from input mean values.
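
    The "approximate first-order statistical moment method" has a compact generic form (standard first-order second-moment propagation, stated here as background): for independent normal inputs \(x_i\) with means \(\bar{x}_i\) and standard deviations \(\sigma_i\), a CFD output \(f\) is represented as

    \[
    \mu_f \approx f(\bar{x}), \qquad
    \sigma_f^2 \approx \sum_i \left(\frac{\partial f}{\partial x_i}\right)^{\!2} \sigma_i^{2},
    \]

    and a probabilistic constraint "\(g \le 0\) with probability at least \(P\)" is recast deterministically as \(\mu_g + k_P\,\sigma_g \le 0\), with \(k_P\) the standard-normal quantile for \(P\). The first- and second-order sensitivity derivatives mentioned in the abstract supply \(\partial f/\partial x_i\) and the gradients of \(\mu_f\) and \(\sigma_f\) needed by the gradient-based optimizer.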

  3. Optimization of corn, rice and buckwheat formulations for gluten-free wafer production.

    Science.gov (United States)

    Dogan, Ismail Sait; Yildiz, Onder; Meral, Raciye

    2016-07-01

    Gluten-free baked products for celiac sufferers are essential for healthy living. Cereals containing gluten, such as wheat and rye, must be removed from the diet for clinical and histological improvement. A variety of gluten-free foods should therefore be offered to sufferers. In this study, gluten-free wafer formulas were optimized using corn, rice and buckwheat flours and a xanthan and guar gum blend, as an alternative product for celiac sufferers. Wafer sheet attributes and textural properties were investigated. Considering all wafer sheet properties in the gluten-free formulas, the best results were obtained by using 163.5% water, 0.5% guar and 0.1% xanthan gum in the corn formula; 173.3% water, 0.45% guar and 0.15% xanthan gum in the rice formula; and 176% water, 0.1% guar and 0.5% xanthan gum in the buckwheat formula. Average desirability values for the gluten-free formulas were between 0.86 and 0.91, indicating that they had visual and textural profiles similar to a control sheet made with wheat flour. © The Author(s) 2015.

  4. Application of Markowitz Portfolio Theory by Building Optimal Portfolio on the US Stock Market

    Directory of Open Access Journals (Sweden)

    Martin Širůček

    2015-01-01

    This paper is focused on building investment portfolios using Markowitz Portfolio Theory (MPT). A derivation based on the Capital Asset Pricing Model (CAPM) is used to calculate the weights of individual securities in the portfolios. The calculated portfolios include a portfolio copying the benchmark made using the CAPM model, portfolios with low and high beta coefficients, and a random portfolio. Only stocks were selected for the examined sample from all the asset classes. Stocks in each portfolio are put together according to predefined criteria. All stocks were selected from the Dow Jones Industrial Average (DJIA) index, which serves as a benchmark, too. Portfolios were compared based on their risk and return profiles. The results of this work provide general recommendations on the optimal approach to choosing securities for an investor's portfolio.
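
    As an illustration of the weight-calculation step, here is a short numpy sketch of a maximum-Sharpe (tangency) portfolio from mean-variance inputs. The returns, covariance matrix and risk-free rate below are made-up numbers for demonstration, not data from the study.

    ```python
    import numpy as np

    # Hypothetical annualized mean returns and covariance of three stocks.
    mu = np.array([0.08, 0.12, 0.10])
    Sigma = np.array([[0.040, 0.006, 0.004],
                      [0.006, 0.090, 0.010],
                      [0.004, 0.010, 0.060]])
    rf = 0.02                                  # assumed risk-free rate

    # Tangency (maximum Sharpe ratio) portfolio: w ~ Sigma^{-1} (mu - rf*1),
    # normalized so the weights sum to one.
    w = np.linalg.solve(Sigma, mu - rf)
    w /= w.sum()

    ret = w @ mu                               # expected portfolio return
    vol = np.sqrt(w @ Sigma @ w)               # portfolio volatility
    print("weights:", w.round(3), "| return:", round(ret, 4),
          "| volatility:", round(vol, 4), "| Sharpe:", round((ret - rf) / vol, 3))
    ```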

  5. An Improved Quantum-Behaved Particle Swarm Optimization Algorithm with Elitist Breeding for Unconstrained Optimization.

    Science.gov (United States)

    Yang, Zhen-Lun; Wu, Angus; Min, Hua-Qing

    2015-01-01

    An improved quantum-behaved particle swarm optimization with elitist breeding (EB-QPSO) for unconstrained optimization is presented and empirically studied in this paper. In EB-QPSO, the novel elitist breeding strategy acts on the elitists of the swarm to escape from likely local optima and guide the swarm to perform a more efficient search. During the iterative optimization process of EB-QPSO, when the criteria are met, the personal best of each particle and the global best of the swarm are used to generate new diverse individuals through the transposon operators. The newly generated individuals with better fitness are selected to be the new personal best particles and global best particle to guide the swarm in further solution exploration. A comprehensive simulation study is conducted on a set of twelve benchmark functions. Compared with five state-of-the-art quantum-behaved particle swarm optimization algorithms, the proposed EB-QPSO performs more competitively on all of the benchmark functions in terms of better global search capability and faster convergence rate.

  6. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL]; Allgood, Glenn O [ORNL]; Knox, John R [ORNL]

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the

  7. Benchmarking in academic pharmacy departments.

    Science.gov (United States)

    Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann

    2010-10-11

    Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.

  8. BENCHMARKING AND CONFIGURATION OF OPENSOURCE MANUFACTURING EXECUTION SYSTEM (MES) APPLICATION

    Directory of Open Access Journals (Sweden)

    Ganesha Nur Laksmana

    2013-05-01

    Full Text Available Information now is an important element to every growing industry in the world. Inorder to keep up with other competitors, endless improvements in optimizing overall efficiency areneeded. There still exist barriers that separate departments in PT. XYZ and cause limitation to theinformation sharing in the system. Open-Source Manufacturing Execution System (MES presentsas an IT-based application that offers wide variety of customization to eliminate stovepipes bysharing information between departments. Benchmarking is used to choose the best Open-SourceMES Application; and Dynamic System Development Method (DSDM is adopted as this workguideline. As a result, recommendations of the chosen Open-Source MES Application arerepresented.Keywords: Manufacturing Execution System (MES; Open Source; Dynamic SystemDevelopment Method (DSDM; Benchmarking; Configuration

  9. Oxidation of free, peptide and protein tryptophan residues mediated by AAPH-derived free radicals: role of alkoxyl and peroxyl radicals

    DEFF Research Database (Denmark)

    Fuentes-Lemus, E.; Dorta, E.; Escobar, E.

    2016-01-01

    The oxidation of tryptophan (Trp) residues mediated by peroxyl radicals (ROO•) follows a complex mechanism involving free radical intermediates and short chain reactions. The reactivity of Trp towards ROO• should be strongly affected by its inclusion in peptides and proteins. To examine the latter, we investigated (by fluorescence) the kinetics of the consumption of free, peptide- and protein-Trp residues exposed to AAPH (2,2′-azobis(2-amidinopropane) dihydrochloride)-derived free radicals. Interestingly, the initial consumption rates (Ri) were only slightly influenced by the inclusion of Trp … At low concentrations (10–50 μM), the values of Ri were nearly constant; and at high Trp concentrations (50 μM to 1 mM), a slower increase of Ri than expected for chain reactions. Similar behavior was detected for all three systems (free Trp, and Trp in peptides and proteins). For the first time we are showing…

  10. Generation and periodontal differentiation of human gingival fibroblasts-derived integration-free induced pluripotent stem cells

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Xiaohui [Department of Periodontology, School and Hospital of Stomatology, Peking University, 22 South Avenue Zhong-Guan-Cun, Beijing 100081 (China); Peking University Stem Cell Research Center and Department of Cell Biology, School of Basic Medical Sciences, Peking University, 38 Xueyuan Road, Beijing 100191 (China); Li, Yang [Peking University Stem Cell Research Center and Department of Cell Biology, School of Basic Medical Sciences, Peking University, 38 Xueyuan Road, Beijing 100191 (China); Li, Jingwen [Department of Periodontology, School and Hospital of Stomatology, Peking University, 22 South Avenue Zhong-Guan-Cun, Beijing 100081 (China); Li, Peng [Faculty of Dentistry, The University of Hong Kong, 34 Hospital Road, Hong Kong SAR (China); Liu, Yinan [Peking University Stem Cell Research Center and Department of Cell Biology, School of Basic Medical Sciences, Peking University, 38 Xueyuan Road, Beijing 100191 (China); Wen, Jinhua, E-mail: jhwen@bjmu.edu.cn [Peking University Stem Cell Research Center and Department of Cell Biology, School of Basic Medical Sciences, Peking University, 38 Xueyuan Road, Beijing 100191 (China); Luan, Qingxian, E-mail: kqluanqx@126.com [Department of Periodontology, School and Hospital of Stomatology, Peking University, 22 South Avenue Zhong-Guan-Cun, Beijing 100081 (China)

    2016-05-06

    Induced pluripotent stem cells (iPSCs) have been recognized as a promising cell source for periodontal tissue regeneration. However, the conventional virus-based reprogramming approach is associated with a high risk of genetic mutation and limits their therapeutic utility. Here, we successfully generated iPSCs from readily accessible human gingival fibroblasts (hGFs) through an integration-free and feeder-free approach via delivery of reprogramming factors of Oct4, Sox2, Klf4, L-myc, Lin28 and TP53 shRNA with episomal plasmid vectors. The iPSCs presented similar morphology and proliferation characteristics as embryonic stem cells (ESCs), and expressed pluripotent markers including Oct4, Tra181, Nanog and SSEA-4. Additionally, these cells maintained a normal karyotype and showed decreased CpG methylation ratio in the promoter regions of Oct4 and Nanog. In vivo teratoma formation assay revealed the development of tissues representative of three germ layers, confirming the acquisition of pluripotency. Furthermore, treatment of the iPSCs in vitro with enamel matrix derivative (EMD) or growth/differentiation factor-5 (GDF-5) significantly up-regulated the expression of periodontal tissue markers associated with bone, periodontal ligament and cementum respectively. Taken together, our data demonstrate that hGFs are a valuable cell source for generating integration-free iPSCs, which could be sequentially induced toward periodontal cells under the treatment of EMD and GDF-5. - Highlights: • Integration-free iPSCs are successfully generated from hGFs via an episomal approach. • EMD promotes differentiation of the hGFs-derived iPSCs toward periodontal cells. • GDF-5 promotes differentiation of the hGFs-derived iPSCs toward periodontal cells. • hGFs-derived iPSCs could be a promising cell source for periodontal regeneration.

  11. Generation and periodontal differentiation of human gingival fibroblasts-derived integration-free induced pluripotent stem cells

    International Nuclear Information System (INIS)

    Yin, Xiaohui; Li, Yang; Li, Jingwen; Li, Peng; Liu, Yinan; Wen, Jinhua; Luan, Qingxian

    2016-01-01

    Induced pluripotent stem cells (iPSCs) have been recognized as a promising cell source for periodontal tissue regeneration. However, the conventional virus-based reprogramming approach is associated with a high risk of genetic mutation and limits their therapeutic utility. Here, we successfully generated iPSCs from readily accessible human gingival fibroblasts (hGFs) through an integration-free and feeder-free approach via delivery of reprogramming factors of Oct4, Sox2, Klf4, L-myc, Lin28 and TP53 shRNA with episomal plasmid vectors. The iPSCs presented similar morphology and proliferation characteristics as embryonic stem cells (ESCs), and expressed pluripotent markers including Oct4, Tra181, Nanog and SSEA-4. Additionally, these cells maintained a normal karyotype and showed decreased CpG methylation ratio in the promoter regions of Oct4 and Nanog. In vivo teratoma formation assay revealed the development of tissues representative of three germ layers, confirming the acquisition of pluripotency. Furthermore, treatment of the iPSCs in vitro with enamel matrix derivative (EMD) or growth/differentiation factor-5 (GDF-5) significantly up-regulated the expression of periodontal tissue markers associated with bone, periodontal ligament and cementum respectively. Taken together, our data demonstrate that hGFs are a valuable cell source for generating integration-free iPSCs, which could be sequentially induced toward periodontal cells under the treatment of EMD and GDF-5. - Highlights: • Integration-free iPSCs are successfully generated from hGFs via an episomal approach. • EMD promotes differentiation of the hGFs-derived iPSCs toward periodontal cells. • GDF-5 promotes differentiation of the hGFs-derived iPSCs toward periodontal cells. • hGFs-derived iPSCs could be a promising cell source for periodontal regeneration.

  12. Benchmarking: applications to transfusion medicine.

    Science.gov (United States)

    Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M

    2012-10-01

    Benchmarking is a structured, continuous, collaborative process in which comparisons for selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations to move the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.

  13. Validation of tautomeric and protomeric binding modes by free energy calculations. A case study for the structure based optimization of d-amino acid oxidase inhibitors

    Science.gov (United States)

    Orgován, Zoltán; Ferenczy, György G.; Steinbrecher, Thomas; Szilágyi, Bence; Bajusz, Dávid; Keserű, György M.

    2018-02-01

    Optimization of fragment-sized d-amino acid oxidase (DAAO) inhibitors was investigated using a combination of computational and experimental methods. Retrospective free energy perturbation (FEP) calculations were performed for benzo[d]isoxazole derivatives, a series of known inhibitors with two potential binding modes derived from X-ray structures of other DAAO inhibitors. The good agreement between experimental and computed binding free energies in only one of the hypothesized binding modes strongly supports this bioactive conformation. Then, a series of 1H-indazol-3-ol derivatives not previously described as DAAO inhibitors was investigated. Binding geometries could be reliably identified by structural similarity to the benzo[d]isoxazole and other well-characterized series, and FEP calculations were performed for several tautomers of the deprotonated and protonated compounds, since all these forms are potentially present owing to the experimental pKa values of representative compounds in the series. Deprotonated compounds are proposed to be the most important bound species owing to the significantly better agreement between their calculated and measured affinities compared to the protonated forms. FEP calculations were also used to predict the affinities of compounds not previously tested as DAAO inhibitors and for a comparative structure-activity relationship study of the benzo[d]isoxazole and indazole series. Selected indazole derivatives were synthesized, and their measured binding affinity towards DAAO was in good agreement with the FEP predictions.
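
    For readers unfamiliar with FEP, the quantity computed is a free energy difference estimated from sampled energy differences; a standard textbook statement of this estimator is the Zwanzig relation (general background, not a formula quoted from the record above):

```latex
\Delta F_{A \to B} \;=\; -k_{B}T \,\ln \left\langle \exp\!\left(-\frac{U_{B}-U_{A}}{k_{B}T}\right) \right\rangle_{A}
```

    The average is taken over configurations sampled in state A, which is why relative affinities of, e.g., two tautomers can be compared without simulating the full binding process.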

  14. Higgs Pair Production: Choosing Benchmarks With Cluster Analysis

    CERN Document Server

    Carvalho, Alexandra; Dorigo, Tommaso; Goertz, Florian; Gottardo, Carlo A.; Tosi, Mia

    2016-01-01

    New physics theories often depend on a large number of free parameters. The precise values of those parameters in some cases drastically affect the resulting phenomenology of fundamental physics processes, while in others finite variations can leave it basically invariant at the level of detail experimentally accessible. When designing a strategy for the analysis of experimental data in the search for a signal predicted by a new physics model, it appears advantageous to categorize the parameter space describing the model according to the corresponding kinematical features of the final state. A multi-dimensional test statistic can be used to gauge the degree of similarity in the kinematics of different models; a clustering algorithm using that metric may then allow the division of the space into homogeneous regions, each of which can be successfully represented by a benchmark point. Searches targeting those benchmark points are then guaranteed to be sensitive to a large area of the parameter space. In this doc...

  15. International Handbook of Evaluated Criticality Safety Benchmark Experiments - ICSBEP (DVD), Version 2013

    International Nuclear Information System (INIS)

    2013-01-01

    The Criticality Safety Benchmark Evaluation Project (CSBEP) was initiated in October of 1992 by the United States Department of Energy. The project quickly became an international effort as scientists from other interested countries became involved. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) became an official activity of the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (NEA) in 1995. This handbook contains criticality safety benchmark specifications that have been derived from experiments performed at various nuclear critical experiment facilities around the world. The benchmark specifications are intended for use by criticality safety engineers to validate calculational techniques used to establish minimum subcritical margins for operations with fissile material and to determine criticality alarm requirements and placement. Many of the specifications are also useful for nuclear data testing. Example calculations are presented; however, these calculations do not constitute a validation of the codes or cross section data. The evaluated criticality safety benchmark data are given in nine volumes. These volumes span nearly 66,000 pages and contain 558 evaluations with benchmark specifications for 4,798 critical, near-critical or subcritical configurations, 24 criticality alarm placement/shielding configurations with multiple dose points for each, and 200 configurations that have been categorised as fundamental physics measurements relevant to criticality safety applications. New to the Handbook are benchmark specifications for Critical, Bare, HEU(93.2)-Metal Sphere experiments, referred to as ORSphere, that were performed by a team of experimenters at Oak Ridge National Laboratory in the early 1970s. A photograph of this assembly is shown on the front cover.

  16. Benchmarking Commercial Conformer Ensemble Generators.

    Science.gov (United States)

    Friedrich, Nils-Ole; de Bruyn Kops, Christina; Flachsenberg, Florian; Sommer, Kai; Rarey, Matthias; Kirchmair, Johannes

    2017-11-27

    We assess and compare the performance of eight commercial conformer ensemble generators (ConfGen, ConfGenX, cxcalc, iCon, MOE LowModeMD, MOE Stochastic, MOE Conformation Import, and OMEGA) and one leading free algorithm, the distance geometry algorithm implemented in RDKit. The comparative study is based on a new version of the Platinum Diverse Dataset, a high-quality benchmarking dataset of 2859 protein-bound ligand conformations extracted from the PDB. Differences in the performance of commercial algorithms are much smaller than those observed for free algorithms in our previous study (J. Chem. Inf. 2017, 57, 529-539). For commercial algorithms, the median minimum root-mean-square deviations measured between protein-bound ligand conformations and ensembles of a maximum of 250 conformers are between 0.46 and 0.61 Å. Commercial conformer ensemble generators are characterized by their high robustness, with at least 99% of all input molecules successfully processed and few or even no substantial geometrical errors detectable in their output conformations. The RDKit distance geometry algorithm (with minimization enabled) appears to be a good free alternative since its performance is comparable to that of the midranked commercial algorithms. Based on a statistical analysis, we elaborate on which algorithms to use and how to parametrize them for best performance in different application scenarios.
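
    The minimum-RMSD statistic used in this benchmark can be reproduced with the free RDKit toolkit named in the record. Below is a minimal sketch under stated assumptions: the SMILES string, conformer count, and random seeds are arbitrary illustrations, and the reference conformer is generated here rather than extracted from the PDB as in the study.

```python
# Minimal sketch: generate a conformer ensemble with RDKit's distance
# geometry algorithm and measure the best (minimum) RMSD to a reference
# conformer, mirroring the benchmark's evaluation metric.
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = "CC(=O)Oc1ccccc1C(=O)O"  # arbitrary example molecule (aspirin)
ref = Chem.AddHs(Chem.MolFromSmiles(smiles))
AllChem.EmbedMolecule(ref, randomSeed=1)    # stand-in for a protein-bound pose

probe = Chem.AddHs(Chem.MolFromSmiles(smiles))
cids = AllChem.EmbedMultipleConfs(probe, numConfs=250, randomSeed=42)
AllChem.MMFFOptimizeMoleculeConfs(probe)    # minimization enabled, as in the study

# The best RMSD over the ensemble approximates the "minimum RMSD" statistic.
best = min(AllChem.GetBestRMS(probe, ref, prbId=cid) for cid in cids)
print(f"minimum RMSD over {len(cids)} conformers: {best:.2f} Å")
```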

  17. Efficient synthesis of sulfonamide derivatives on solid supports catalyzed using solvent-free and microwave-assisted methods

    Energy Technology Data Exchange (ETDEWEB)

    Camargo-Ordonez, Argelia; Moreno-Reyes, Christian; Olazaran-Santibanez, Fabian; Martinez-Hernandez, Sheila; Bocanegra-Garcia, Virgilio; Rivera, Gildardo [Universidad Autonoma de Tamaulipas, Reynosa (Mexico). Dep. de Farmacia y Quimica Medicinal

    2011-07-01

    In this work we report the synthesis of sulfonamide derivatives using a conventional procedure and with solid supports, such as silica gel, florisil, alumina, 4 Å molecular sieves, montmorillonite KSF, and montmorillonite K10, using solvent-free and microwave-assisted methods. Our results show that solid supports have catalytic activity in the formation of sulfonamide derivatives. We found that florisil, montmorillonite KSF, and K10 could be used as inexpensive alternative catalysts that are easily separated from the reaction media. Additionally, the solvent-free and microwave-assisted methods were more efficient in reducing reaction time and increasing yield. (author)

  18. Efficient synthesis of sulfonamide derivatives on solid supports catalyzed using solvent-free and microwave-assisted methods

    International Nuclear Information System (INIS)

    Camargo-Ordonez, Argelia; Moreno-Reyes, Christian; Olazaran-Santibanez, Fabian; Martinez-Hernandez, Sheila; Bocanegra-Garcia, Virgilio; Rivera, Gildardo

    2011-01-01

    In this work we report the synthesis of sulfonamide derivatives using a conventional procedure and with solid supports, such as silica gel, florisil, alumina, 4 Å molecular sieves, montmorillonite KSF, and montmorillonite K10, using solvent-free and microwave-assisted methods. Our results show that solid supports have catalytic activity in the formation of sulfonamide derivatives. We found that florisil, montmorillonite KSF, and K10 could be used as inexpensive alternative catalysts that are easily separated from the reaction media. Additionally, the solvent-free and microwave-assisted methods were more efficient in reducing reaction time and increasing yield. (author)

  19. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    Science.gov (United States)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton-Krylov method. A physics-based preconditioning technique, which can be adjusted to target varying physics, is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities and the decay of the Taylor-Green vortex. Additionally, we show a test of hydrostatic equilibrium in a stellar environment dominated by radiative effects; in this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during the development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each test case was analysed with a simple scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes, including ATHENA and the PENCIL code. MUSIC is able to reproduce both the behaviour of established and widely used codes and the results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
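
    As background on the solver class, a Jacobian-free Newton-Krylov method never forms the Jacobian explicitly; the Krylov iteration only needs Jacobian-vector products, which are approximated by a finite difference of the residual (a standard textbook identity, not a formula from the paper):

```latex
J(\mathbf{u})\,\mathbf{v} \;\approx\; \frac{\mathbf{F}(\mathbf{u}+\varepsilon\mathbf{v})-\mathbf{F}(\mathbf{u})}{\varepsilon}
```

    Each Krylov step thus costs only one extra residual evaluation, which is what makes implicit time integration affordable for large stellar-interior problems; the preconditioner mentioned in the abstract accelerates this inner iteration.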

  20. Topology optimization based on spline-based meshfree method using topological derivatives

    International Nuclear Information System (INIS)

    Hur, Junyoung; Youn, Sung-Kie; Kang, Pilseong

    2017-01-01

    The spline-based meshfree method (SBMFM) originates from isogeometric analysis (IGA), which integrates design and analysis through non-uniform rational B-spline (NURBS) basis functions. SBMFM utilizes the trimming technique of CAD systems by representing the domain with NURBS curves. In this work, an explicit boundary topology optimization using SBMFM is presented with an effective boundary update scheme. There have been similar works on this subject; however, unlike previous works, where a semi-analytic method is employed to calculate design sensitivities, here the design update is performed using topological derivatives. In this research, the topological derivative is used to derive the sensitivity of the boundary curves and to create new holes. Based on the values of the topological derivatives, the shapes of the boundary curves are updated, and topological changes are achieved by inserting and removing inner holes. The presented approach is validated through several compliance minimization problems.

  1. Topology optimization based on spline-based meshfree method using topological derivatives

    Energy Technology Data Exchange (ETDEWEB)

    Hur, Junyoung; Youn, Sung-Kie [KAIST, Daejeon (Korea, Republic of); Kang, Pilseong [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2017-05-15

    The spline-based meshfree method (SBMFM) originates from isogeometric analysis (IGA), which integrates design and analysis through non-uniform rational B-spline (NURBS) basis functions. SBMFM utilizes the trimming technique of CAD systems by representing the domain with NURBS curves. In this work, an explicit boundary topology optimization using SBMFM is presented with an effective boundary update scheme. There have been similar works on this subject; however, unlike previous works, where a semi-analytic method is employed to calculate design sensitivities, here the design update is performed using topological derivatives. In this research, the topological derivative is used to derive the sensitivity of the boundary curves and to create new holes. Based on the values of the topological derivatives, the shapes of the boundary curves are updated, and topological changes are achieved by inserting and removing inner holes. The presented approach is validated through several compliance minimization problems.
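
    As shared background for the two records above, the topological derivative measures the sensitivity of a cost functional to nucleating an infinitesimal hole in the domain; a common textbook definition (not quoted from the paper) is

```latex
D_{T}\psi(\mathbf{x}) \;=\; \lim_{\rho \to 0^{+}} \frac{\psi(\Omega_{\rho}) - \psi(\Omega)}{f(\rho)}
```

    where Ω_ρ denotes the domain with a small hole of radius ρ centered at x, and f(ρ) is a positive normalizing function that vanishes with ρ. Points where this quantity is most negative are natural candidates for hole insertion during the boundary update.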

  2. Modelling Fanconi anemia pathogenesis and therapeutics using integration-free patient-derived iPSCs.

    Science.gov (United States)

    Liu, Guang-Hui; Suzuki, Keiichiro; Li, Mo; Qu, Jing; Montserrat, Nuria; Tarantino, Carolina; Gu, Ying; Yi, Fei; Xu, Xiuling; Zhang, Weiqi; Ruiz, Sergio; Plongthongkum, Nongluk; Zhang, Kun; Masuda, Shigeo; Nivet, Emmanuel; Tsunekawa, Yuji; Soligalla, Rupa Devi; Goebl, April; Aizawa, Emi; Kim, Na Young; Kim, Jessica; Dubova, Ilir; Li, Ying; Ren, Ruotong; Benner, Chris; Del Sol, Antonio; Bueren, Juan; Trujillo, Juan Pablo; Surralles, Jordi; Cappelli, Enrico; Dufour, Carlo; Esteban, Concepcion Rodriguez; Belmonte, Juan Carlos Izpisua

    2014-07-07

    Fanconi anaemia (FA) is a recessive disorder characterized by genomic instability, congenital abnormalities, cancer predisposition and bone marrow (BM) failure. However, the pathogenesis of FA is not fully understood, partly due to the limitations of current disease models. Here, we derive integration-free induced pluripotent stem cells (iPSCs) from an FA patient without genetic complementation and report in situ gene correction in FA-iPSCs as well as the generation of isogenic FANCA-deficient human embryonic stem cell (ESC) lines. FA cellular phenotypes are recapitulated in iPSCs/ESCs and their adult stem/progenitor cell derivatives. By using isogenic pathogenic mutation-free controls as well as cellular and genomic tools, our model serves to facilitate the discovery of novel disease features. We validate our model as a drug-screening platform by identifying several compounds that improve hematopoietic differentiation of FA-iPSCs. These compounds are also able to rescue the hematopoietic phenotype of FA patient BM cells.

  3. Modeling Fanconi Anemia pathogenesis and therapeutics using integration-free patient-derived iPSCs

    Science.gov (United States)

    Montserrat, Nuria; Tarantino, Carolina; Gu, Ying; Yi, Fei; Xu, Xiuling; Zhang, Weiqi; Ruiz, Sergio; Plongthongkum, Nongluk; Zhang, Kun; Masuda, Shigeo; Nivet, Emmanuel; Tsunekawa, Yuji; Soligalla, Rupa Devi; Goebl, April; Aizawa, Emi; Kim, Na Young; Kim, Jessica; Dubova, Ilir; Li, Ying; Ren, Ruotong; Benner, Chris; del Sol, Antonio; Bueren, Juan; Trujillo, Juan Pablo; Surralles, Jordi; Cappelli, Enrico; Dufour, Carlo; Esteban, Concepcion Rodriguez; Belmonte, Juan Carlos Izpisua

    2014-01-01

    Fanconi Anemia (FA) is a recessive disorder characterized by genomic instability, congenital abnormalities, cancer predisposition and bone marrow failure. However, the pathogenesis of FA is not fully understood, partly due to the limitations of current disease models. Here, we derive integration-free induced pluripotent stem cells (iPSCs) from an FA patient without genetic complementation and report in situ gene correction in FA-iPSCs as well as the generation of isogenic FANCA-deficient human embryonic stem cell (ESC) lines. FA cellular phenotypes are recapitulated in iPSCs/ESCs and their adult stem/progenitor cell derivatives. By using isogenic pathogenic mutation-free controls as well as cellular and genomic tools, our model serves to facilitate the discovery of novel disease features. We validate our model as a drug-screening platform by identifying several compounds that improve hematopoietic differentiation of FA-iPSCs. These compounds are also able to rescue the hematopoietic phenotype of FA-patient bone marrow cells. PMID:24999918

  4. Practical mine ventilation optimization based on genetic algorithms for free splitting networks

    Energy Technology Data Exchange (ETDEWEB)

    Acuna, E.; Maynard, R.; Hall, S. [Laurentian Univ., Sudbury, ON (Canada). Mirarco Mining Innovation; Hardcastle, S.G.; Li, G. [Natural Resources Canada, Sudbury, ON (Canada). CANMET Mining and Mineral Sciences Laboratories; Lowndes, I.S. [Nottingham Univ., Nottingham (United Kingdom). Process and Environmental Research Division; Tonnos, A. [Bestech, Sudbury, ON (Canada)

    2010-07-01

    The method used to optimize the design and operation of mine ventilation has generally been based on case studies and expert knowledge. It has yet to benefit from optimization techniques used and proven in other fields of engineering. Currently, optimization of mine ventilation systems is a manual decision process performed by an experienced mine ventilation specialist assisted by commercial ventilation distribution solvers. These analysis tools are widely used in the mining industry to evaluate the practical and economic viability of alternative ventilation system configurations. The scenario usually selected is the one that reports the lowest energy consumption while delivering the required airflow distribution. Since most commercial solvers do not include an integrated network optimization algorithm, the process of generating a series of potential ventilation solutions using the conventional iterative design strategy can be time consuming. For that reason, a genetic algorithm (GA) optimization routine was developed in combination with a ventilation solver to determine potential optimal solutions of a primary mine ventilation system based on a free-splitting network. The optimization method was applied to a small mine ventilation network. The technique was shown to have the capacity to generate good feasible solutions and improve upon the manual results obtained by mine ventilation specialists. 9 refs., 7 tabs., 3 figs.

  5. Modified harmony search

    Science.gov (United States)

    Mohamed, Najihah; Lutfi Amri Ramli, Ahmad; Majid, Ahmad Abd; Piah, Abd Rahni Mt

    2017-09-01

    The metaheuristic algorithm called Harmony Search (HS) is widely applied to parameter optimization in many areas. HS is a derivative-free, real-parameter optimization algorithm that draws its inspiration from the musical improvisation process of searching for a perfect state of harmony. This paper proposes a Modified Harmony Search (MHS) for solving optimization problems, which employs concepts from the genetic algorithm and particle swarm optimization to generate new solution vectors, enhancing the performance of the HS algorithm. The performance of MHS and HS is investigated on ten benchmark optimization problems in order to compare the two algorithms in terms of final accuracy, convergence speed and robustness.
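
    To make the improvisation analogy concrete, here is a minimal sketch of the basic HS loop (generic parameter names and values; this is a generic illustration, not the authors' MHS code):

```python
import random

def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
    """Minimal Harmony Search: minimize f over box constraints given as
    a list of (low, high) pairs, one per dimension."""
    dim = len(bounds)
    # Initialize the harmony memory (HM) with random solution vectors.
    hm = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    costs = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:            # memory consideration
                val = hm[random.randrange(hms)][d]
                if random.random() < par:         # pitch adjustment
                    val += random.uniform(-bw, bw) * (hi - lo)
            else:                                 # random improvisation
                val = random.uniform(lo, hi)
            new.append(min(max(val, lo), hi))
        c = f(new)
        worst = max(range(hms), key=costs.__getitem__)
        if c < costs[worst]:                      # replace the worst harmony
            hm[worst], costs[worst] = new, c
    best = min(range(hms), key=costs.__getitem__)
    return hm[best], costs[best]

# Example on the sphere function, a common benchmark:
x, fx = harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 10)
```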

  6. Quantum-teleportation benchmarks for independent and identically distributed spin states and displaced thermal states

    International Nuclear Information System (INIS)

    Guta, Madalin; Bowles, Peter; Adesso, Gerardo

    2010-01-01

    A successful state-transfer (or teleportation) experiment must perform better than the benchmark set by the best 'measure-and-prepare' procedure. We consider the benchmark problem for the following families of states: (i) displaced thermal equilibrium states of a given temperature; (ii) independent identically prepared qubits with a completely unknown state. For the first family we show that the optimal procedure is heterodyne measurement followed by the preparation of a coherent state. This procedure was known to be optimal for coherent states and for squeezed states with the 'overlap fidelity' as the figure of merit. Here, we prove its optimality with respect to the trace norm distance and supremum risk. For the second problem we consider $n$ independent and identically distributed (i.i.d.) spin-1/2 systems in an arbitrary unknown state $\rho$ and look for the measurement-preparation pair $(M_n, P_n)$ for which the reconstructed state $\omega_n := P_n \circ M_n(\rho^{\otimes n})$ is as close as possible to the input state (i.e., $\|\omega_n - \rho^{\otimes n}\|_1$ is small). The figure of merit is based on the trace norm distance between the input and output states. We show that asymptotically with $n$ this problem is equivalent to the first one. The proof and construction of $(M_n, P_n)$ use the theory of local asymptotic normality developed for state estimation, which shows that i.i.d. quantum models can be approximated in a strong sense by quantum Gaussian models. The measurement part is identical to 'optimal estimation', showing that 'benchmarking' and estimation are closely related problems in the asymptotic setup.

  7. Performance indices and evaluation of algorithms in building energy efficient design optimization

    International Nuclear Information System (INIS)

    Si, Binghui; Tian, Zhichao; Jin, Xing; Zhou, Xin; Tang, Peng; Shi, Xing

    2016-01-01

    Building energy efficient design optimization is an emerging technique that is increasingly being used to design buildings with better overall performance and a particular emphasis on energy efficiency. To achieve building energy efficient design optimization, algorithms are vital to generate new designs and thus drive the design optimization process. Therefore, the performance of algorithms is crucial to achieving effective energy efficient design techniques. This study evaluates algorithms used for building energy efficient design optimization. A set of performance indices, namely, stability, robustness, validity, speed, coverage, and locality, is proposed to evaluate the overall performance of algorithms. A benchmark building and a design optimization problem are also developed. Hooke–Jeeves algorithm, Multi-Objective Genetic Algorithm II, and Multi-Objective Particle Swarm Optimization algorithm are evaluated by using the proposed performance indices and benchmark design problem. Results indicate that no algorithm performs best in all six areas. Therefore, when facing an energy efficient design problem, the algorithm must be carefully selected based on the nature of the problem and the performance indices that matter the most. - Highlights: • Six indices of algorithm performance in building energy optimization are developed. • For each index, its concept is defined and the calculation formulas are proposed. • A benchmark building and benchmark energy efficient design problem are proposed. • The performance of three selected algorithms are evaluated.

  8. Benchmarking ENDF/B-VII.0

    International Nuclear Information System (INIS)

    Marck, Steven C. van der

    2006-01-01

    The new major release VII.0 of the ENDF/B nuclear data library has been tested extensively using benchmark calculations. These were based upon MCNP-4C3 continuous-energy Monte Carlo neutronics simulations, together with nuclear data processed using the code NJOY. Three types of benchmarks were used, viz., criticality safety benchmarks, (fusion) shielding benchmarks, and reference systems for which the effective delayed neutron fraction is reported. For criticality safety, more than 700 benchmarks from the International Handbook of Criticality Safety Benchmark Experiments were used. Benchmarks from all categories were used, ranging from low-enriched uranium, compound fuel, thermal spectrum ones (LEU-COMP-THERM) to mixed uranium-plutonium, metallic fuel, fast spectrum ones (MIX-MET-FAST). For fusion shielding, many benchmarks were based on IAEA specifications for the Oktavian experiments (for Al, Co, Cr, Cu, LiF, Mn, Mo, Si, Ti, W, Zr), the Fusion Neutronics Source in Japan (for Be, C, N, O, Fe, Pb), and Pulsed Sphere experiments at Lawrence Livermore National Laboratory (for ⁶Li, ⁷Li, Be, C, N, O, Mg, Al, Ti, Fe, Pb, D₂O, H₂O, concrete, polyethylene and teflon). For testing delayed neutron data, more than thirty measurements in widely varying systems were used. Among these were measurements in the Tank Critical Assembly (TCA in Japan) and IPEN/MB-01 (Brazil), both with a thermal spectrum, and two cores in Masurca (France) and three cores in the Fast Critical Assembly (FCA, Japan), all with fast spectra. In criticality safety, many benchmarks were chosen from the category with a thermal spectrum, low-enriched uranium, compound fuel (LEU-COMP-THERM), because this is typical of most current-day reactors, and because these benchmarks were previously underpredicted by as much as 0.5% by most nuclear data libraries (such as ENDF/B-VI.8, JEFF-3.0). The calculated results presented here show that this underprediction is no longer there for ENDF/B-VII.0. The average over 257

  9. Practice benchmarking in the age of targeted auditing.

    Science.gov (United States)

    Langdale, Ryan P; Holland, Ben F

    2012-11-01

    The frequency and sophistication of health care reimbursement auditing has progressed rapidly in recent years, leaving many oncologists wondering whether their private practices would survive a full-scale Office of the Inspector General (OIG) investigation. The Medicare Part B claims database provides a rich source of information for physicians seeking to understand how their billing practices measure up to their peers, both locally and nationally. This database was dissected by a team of cancer specialists to uncover important benchmarks related to targeted auditing. All critical Medicare charges, payments, denials, and service ratios in this article were derived from the full 2010 Medicare Part B claims database. Relevant claims were limited by using Medicare provider specialty codes 83 (hematology/oncology) and 90 (medical oncology), with an emphasis on claims filed from the physician office place of service (11). All charges, denials, and payments were summarized at the Current Procedural Terminology code level to drive practice benchmarking standards. A careful analysis of this data set, combined with the published audit priorities of the OIG, produced germane benchmarks from which medical oncologists can monitor, measure and improve on common areas of billing fraud, waste or abuse in their practices. Part II of this series and analysis will focus on information pertinent to radiation oncologists.

  10. A Field-Based Aquatic Life Benchmark for Conductivity in Central Appalachian Streams (2010) (External Review Draft)

    Science.gov (United States)

    This report adapts the standard U.S. EPA methodology for deriving ambient water quality criteria. Rather than use toxicity test results, the adaptation uses field data to determine the loss of 5% of genera from streams. The method is applied to derive effect benchmarks for disso...

  11. California commercial building energy benchmarking

    Energy Technology Data Exchange (ETDEWEB)

    Kinney, Satkartar; Piette, Mary Ann

    2003-07-01

    Building energy benchmarking is the comparison of whole-building energy use relative to a set of similar buildings. It provides a useful starting point for individual energy audits and for targeting buildings for energy-saving measures in multiple-site audits. Benchmarking is of interest and practical use to a number of groups. Energy service companies and performance contractors communicate energy savings potential with "typical" and "best-practice" benchmarks, while control companies and utilities can provide direct tracking of energy use and combine data from multiple buildings. Benchmarking is also useful in the design stage of a new building or retrofit to determine if a design is relatively efficient. Energy managers and building owners have an ongoing interest in comparing energy performance to others. Large corporations, schools, and government agencies with numerous facilities also use benchmarking methods to compare their buildings to each other. The primary goal of Task 2.1.1 Web-based Benchmarking was the development of a web-based benchmarking tool, dubbed Cal-Arch, for benchmarking energy use in California commercial buildings. While there were several other benchmarking tools available to California consumers prior to the development of Cal-Arch, there were none based solely on California data. Most available benchmarking information, including the Energy Star performance rating, was developed using DOE's Commercial Building Energy Consumption Survey (CBECS), which does not provide state-level data. Each database and tool has advantages as well as limitations, such as the number of buildings and the coverage by type, climate regions and end uses. There is considerable commercial interest in benchmarking because it provides an inexpensive method of screening buildings for tune-ups and retrofits. However, private companies who collect and manage consumption data are concerned that the

  12. Benchmarking in Foodservice Operations

    National Research Council Canada - National Science Library

    Johnson, Bonnie

    1998-01-01

    The objective of this study was to identify usage of foodservice performance measures, important activities in foodservice benchmarking, and benchmarking attitudes, beliefs, and practices by foodservice directors...

  13. Benchmarking, benchmarks, or best practices? Applying quality improvement principles to decrease surgical turnaround time.

    Science.gov (United States)

    Mitchell, L

    1996-01-01

    The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.

  14. Benchmarking i den offentlige sektor

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Dietrichson, Lars; Sandalgaard, Niels

    2008-01-01

    In this article we briefly discuss the need for benchmarking in the absence of traditional market mechanisms. We then explain in more detail what benchmarking is, taking four different applications of benchmarking as a starting point. The regulation of utility companies is then treated, after which...

  15. Regional Competitive Intelligence: Benchmarking and Policymaking

    OpenAIRE

    Huggins , Robert

    2010-01-01

    Benchmarking exercises have become increasingly popular within the sphere of regional policymaking in recent years. The aim of this paper is to analyse the concept of regional benchmarking and its links with regional policymaking processes. It develops a typology of regional benchmarking exercises and regional benchmarkers, and critically reviews the literature, both academic and policy oriented. It is argued that critics who suggest regional benchmarking is a flawed concept and technique fai...

  16. Predictive uncertainty reduction in coupled neutron-kinetics/thermal hydraulics modeling of the BWR-TT2 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Badea, Aurelian F., E-mail: aurelian.badea@kit.edu [Karlsruhe Institute of Technology, Vincenz-Prießnitz-Str. 3, 76131 Karlsruhe (Germany); Cacuci, Dan G. [Center for Nuclear Science and Energy/Dept. of ME, University of South Carolina, 300 Main Street, Columbia, SC 29208 (United States)

    2017-03-15

    Highlights: • BWR Turbine Trip 2 (BWR-TT2) benchmark. • Substantial (up to 50%) reduction of uncertainties in the predicted transient power. • 6660 uncertain model parameters were calibrated. - Abstract: By applying a comprehensive predictive modeling methodology, this work demonstrates a substantial (up to 50%) reduction of uncertainties in the predicted total transient power in the BWR Turbine Trip 2 (BWR-TT2) benchmark while calibrating the numerical simulation of this benchmark, comprising 6090 macroscopic cross sections and 570 thermal-hydraulics parameters involved in modeling the phase-slip correlation, transient outlet pressure, and total mass flow. The BWR-TT2 benchmark is based on an experiment carried out in 1977 in the NPP Peach Bottom 2, involving the closure of the turbine stop valve, which caused a pressure wave that propagated with attenuation into the reactor core. The condensation of steam in the reactor core caused by the pressure increase led to a positive reactivity insertion. The subsequent rise of power was limited by the feedback and the insertion of the control rods. The BWR-TT2 benchmark was modeled with the three-dimensional reactor physics code system DYN3D, which couples neutron kinetics with two-phase thermal-hydraulics. All 6660 DYN3D model parameters were calibrated by applying a predictive modeling methodology that combines experimental and computational information to produce optimally predicted best-estimate results with reduced predicted uncertainties. Simultaneously, the predictive modeling methodology yields optimally predicted values for the BWR total transient power while significantly reducing the accompanying predicted standard deviations.

  17. Predictive uncertainty reduction in coupled neutron-kinetics/thermal hydraulics modeling of the BWR-TT2 benchmark

    International Nuclear Information System (INIS)

    Badea, Aurelian F.; Cacuci, Dan G.

    2017-01-01

    Highlights: • BWR Turbine Trip 2 (BWR-TT2) benchmark. • Substantial (up to 50%) reduction of uncertainties in the predicted transient power. • 6660 uncertain model parameters were calibrated. - Abstract: By applying a comprehensive predictive modeling methodology, this work demonstrates a substantial (up to 50%) reduction of uncertainties in the predicted total transient power in the BWR Turbine Trip 2 (BWR-TT2) benchmark while calibrating the numerical simulation of this benchmark, comprising 6090 macroscopic cross sections and 570 thermal-hydraulics parameters involved in modeling the phase-slip correlation, transient outlet pressure, and total mass flow. The BWR-TT2 benchmark is based on an experiment carried out in 1977 in the NPP Peach Bottom 2, involving the closure of the turbine stop valve, which caused a pressure wave that propagated with attenuation into the reactor core. The condensation of steam in the reactor core caused by the pressure increase led to a positive reactivity insertion. The subsequent rise of power was limited by the feedback and the insertion of the control rods. The BWR-TT2 benchmark was modeled with the three-dimensional reactor physics code system DYN3D, which couples neutron kinetics with two-phase thermal-hydraulics. All 6660 DYN3D model parameters were calibrated by applying a predictive modeling methodology that combines experimental and computational information to produce optimally predicted best-estimate results with reduced predicted uncertainties. Simultaneously, the predictive modeling methodology yields optimally predicted values for the BWR total transient power while significantly reducing the accompanying predicted standard deviations.

  18. Derived heuristics-based consistent optimization of material flow in a gold processing plant

    Science.gov (United States)

    Myburgh, Christie; Deb, Kalyanmoy

    2018-01-01

    Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, translate into if-then-else statements. The formulation of an optimization problem for such processes therefore becomes complicated, with nonlinear and non-differentiable objective and constraint functions. In handling such problems with classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. In a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure capable of handling all the complexities posed by the problem is developed. While the evolutionary approach already produced results with comparatively low variance over multiple runs, its performance was further enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented, and their importance for quick convergence of the overall algorithm is demonstrated.
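
    To illustrate the modelling point, the sketch below minimizes a toy objective containing if-then-else plant logic, which has no useful derivatives, with a simple (1+1) evolution strategy; the flow variables, capacity limit, and penalty are hypothetical stand-ins, not values from the gold plant study:

```python
import random

CAPACITY = 100.0  # hypothetical plant capacity limit (illustrative only)

def plant_cost(flows):
    """Toy material-flow objective with discrete control-law scenarios;
    the if-then-else branches make it nonlinear and non-differentiable."""
    total = sum(flows)
    if total > CAPACITY:                     # capacity constraint as a penalty
        return 1e6 + (total - CAPACITY) ** 2
    if flows[0] > 40.0:                      # scenario switch in the control law
        return 0.8 * total + 50.0
    return total + abs(flows[1] - 25.0)

def one_plus_one_es(f, x, sigma=5.0, iters=2000, seed=7):
    """Minimal (1+1) evolution strategy: mutate, keep the better point.
    No gradients are required, so the branchy objective poses no problem."""
    rng = random.Random(seed)
    fx = f(x)
    for _ in range(iters):
        y = [v + rng.gauss(0.0, sigma) for v in x]
        fy = f(y)
        if fy <= fx:
            x, fx = y, fy
    return x, fx

best, cost = one_plus_one_es(plant_cost, [50.0, 10.0])
```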

  19. Benchmarking Using Basic DBMS Operations

    Science.gov (United States)

    Crolotte, Alain; Ghazal, Ahmad

    The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
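
    In the same spirit as XMarq's basic-operation queries, the sketch below times a scan, an aggregation, and a join on a toy SQLite database; the schema, row counts, and query set are invented placeholders, not part of XMarq itself:

```python
import sqlite3, time, random

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, cust INTEGER, amount REAL)")
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(i, random.choice("NSEW")) for i in range(1000)])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, random.randrange(1000), random.random() * 100)
                 for i in range(100000)])
con.commit()

def timed(label, sql):
    """Run one basic-operation query and report its wall-clock time."""
    t0 = time.perf_counter()
    cur.execute(sql).fetchall()
    print(f"{label:10s} {time.perf_counter() - t0:.4f} s")

timed("scan",      "SELECT * FROM orders")
timed("aggregate", "SELECT cust, SUM(amount) FROM orders GROUP BY cust")
timed("join",      "SELECT c.region, SUM(o.amount) FROM orders o "
                   "JOIN customers c ON o.cust = c.id GROUP BY c.region")
```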

  20. High Energy Physics (HEP) benchmark program

    International Nuclear Information System (INIS)

    Yasu, Yoshiji; Ichii, Shingo; Yashiro, Shigeo; Hirayama, Hideo; Kokufuda, Akihiro; Suzuki, Eishin.

    1993-01-01

    High Energy Physics (HEP) benchmark programs are indispensable tools for selecting a suitable computer for an HEP application system. Industry-standard benchmark programs cannot be used for this particular kind of selection. The CERN and SSC benchmark suites are well-known HEP benchmark programs for this purpose. The CERN suite includes event reconstruction and event generator programs, while the SSC suite includes event generators. In this paper, we find that the results from these two suites are not consistent, and that the results from industry benchmarks agree with neither. We also compare benchmark results from the EGS4 Monte Carlo simulation program with those from the two HEP benchmark suites and find that the EGS4 results are not consistent with either of them. The industry-standard SPECmark values on various computer systems are not consistent with the EGS4 results either. Because of these inconsistencies, we point out the necessity of standardizing HEP benchmark suites. An EGS4 benchmark suite should also be developed for users of applications in fields such as medical science, nuclear power plants, nuclear physics and high energy physics. (author)

  1. Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory

    Science.gov (United States)

    Matsumura, Koki; Kawamoto, Masaru

    This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), which is one of the evolutionary computation methods. Moreover, this paper proposes a method using prospect theory, from behavioral finance, to model the psychological bias toward profit and loss, and attempts to select the appropriate option strike price for higher investment efficiency. As a result, this technique produced good results and demonstrated the effectiveness of the trading model through the optimized dealing strategy.
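
    The psychological bias toward profit and loss mentioned above is typically encoded with a prospect-theory value function; the sketch below uses the standard Kahneman-Tversky form with their commonly cited parameter estimates (illustrative defaults, not the authors' fitted values):

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper for losses (loss aversion, lam > 1) relative to a reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# A trading rule can rank option payoffs by prospect value instead of raw
# profit, which penalizes potential losses more heavily than it rewards gains:
payoffs = [-50.0, -10.0, 0.0, 10.0, 50.0]
print([round(prospect_value(p), 2) for p in payoffs])
```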

  2. Benchmarking Tool Kit.

    Science.gov (United States)

    Canadian Health Libraries Association.

    Nine Canadian health libraries participated in a pilot test of the Benchmarking Tool Kit between January and April, 1998. Although the Tool Kit was designed specifically for health libraries, the content and approach are useful to other types of libraries as well. Used to its full potential, benchmarking can provide a common measuring stick to…

  3. Benchmarking Academic Anatomic Pathologists

    Directory of Open Access Journals (Sweden)

    Barbara S. Ducatman MD

    2016-10-01

    The most common benchmarks for faculty productivity are derived from Medical Group Management Association (MGMA) or Vizient-AAMC Faculty Practice Solutions Center® (FPSC) databases. The Association of Pathology Chairs has also collected similar survey data for several years. We examined the Association of Pathology Chairs annual faculty productivity data and compared it with MGMA and FPSC data to understand the value, inherent flaws, and limitations of benchmarking data. We hypothesized that the variability in calculated faculty productivity is due to the type of practice model and clinical effort allocation. Data from the Association of Pathology Chairs survey on 629 surgical pathologists and/or anatomic pathologists from 51 programs were analyzed. From a review of service assignments, we were able to assign each pathologist to a specific practice model: general anatomic pathologists/surgical pathologists, 1 or more subspecialties, or a hybrid of the 2 models. There were statistically significant differences among academic ranks and practice types. When we analyzed our data using each organization's methods, the median results for the anatomic pathologists/surgical pathologists general practice model compared to MGMA and FPSC results for anatomic and/or surgical pathology were quite close. Both MGMA and FPSC data exclude a significant proportion of academic pathologists with clinical duties. We used the more inclusive FPSC definition of clinical "full-time faculty" (0.60 clinical full-time equivalent and above). The correlation between clinical full-time equivalent effort allocation, annual days on service, and annual work relative value unit productivity was poor. This study demonstrates that effort allocations are variable across academic departments of pathology and do not correlate well with either work relative value unit effort or reported days on service. Although the Association of Pathology Chairs-reported median work relative

  4. Benchmarking of industrial control systems via case-based reasoning

    International Nuclear Information System (INIS)

    Hadjiiski, M.; Boshnakov, K.; Georgiev, Z.

    2013-01-01

    The recent development of information and communication technologies enables the establishment of virtual consultation centers related to the control of specific processes, which can be offered worldwide since the location of the installations has no influence on the results. The centers can provide consultations regarding the quality of process control and overall enterprise management, taking into account correction factors such as weather conditions, the product or service and associated technology, production level, quality of feedstock used, and others. The benchmarking technique is chosen as a tool for analyzing and comparing the quality of the assessed control systems in individual plants. It is a process of gathering, analyzing and comparing data on the characteristics of comparable units to assess and compare these characteristics and improve the performance of the particular process, enterprise or organization. By comparing different processes and adopting best practices, energy efficiency can be improved, and hence the competitiveness of the participating organizations will increase. In the present work, an algorithm for benchmarking and parametric optimization of a given control system is developed by applying the approaches of Case-Based Reasoning (CBR) and Data Envelopment Analysis (DEA). Expert knowledge and approaches for optimal tuning of control systems are combined. Two of the most common systems for automatic control of different variables in the case of biological wastewater treatment are presented and discussed. Based on an analysis of the processes, different cases are defined. Using DEA, the relative efficiencies of 10 systems for automatic control of dissolved oxygen are estimated. The CBR and DEA components designed and implemented in the current work are applicable for the purposes of virtual consultation centers. Key words: benchmarking technique, energy efficiency, Case-Based Reasoning (CBR

  5. Novel Water Soluble Chitosan Derivatives with 1,2,3-Triazolium and Their Free Radical-Scavenging Activity

    Directory of Open Access Journals (Sweden)

    Qing Li

    2018-03-01

    Chitosan is an abundant and renewable polysaccharide, which exhibits attractive bioactivities and natural properties. Improvements such as chemical modification are often performed on chitosan for their potential to provide high bioactivity and good water solubility. A new class of chitosan derivatives possessing charged 1,2,3-triazolium units was designed and synthesized by combining the "click reaction" with efficient 1,2,3-triazole quaternization. Their free radical-scavenging activity against three free radicals was tested. The inhibitory properties and water solubility of the synthesized chitosan derivatives exhibited a remarkable improvement over chitosan. It is hypothesized that the triazole or triazolium groups give the synthesized chitosan its markedly better radical-scavenging activity. Moreover, the scavenging activity against the superoxide radical of the chitosan derivatives with triazolium (IC₅₀ < 0.01 mg mL⁻¹) was more efficient than that of the derivatives with triazole and of Vitamin C. In the 1,1-diphenyl-2-picrylhydrazyl (DPPH) and hydroxyl radical-scavenging assays, the same pattern was observed, which should be related to the triazolium groups grafted at the periphery of the molecular chains.

  6. Optimal design of an alignment-free two-DOF rehabilitation robot for the shoulder complex.

    Science.gov (United States)

    Galinski, Daniel; Sapin, Julien; Dehez, Bruno

    2013-06-01

    This paper presents the optimal design of an alignment-free exoskeleton for the rehabilitation of the shoulder complex. This robot structure consists of two actuated joints and is linked to the arm through passive degrees of freedom (DOFs) to drive the flexion-extension and abduction-adduction movements of the upper arm. The optimal design of this structure is performed in two steps. The first step is a multi-objective optimization process aiming to find the best parameters characterizing the robot and its position relative to the patient. The second step is a comparison process aiming to select the best solution from the optimization results on the basis of several criteria related to practical considerations. The optimal design process leads to a solution that outperforms an existing solution on aspects such as kinematics and ergonomics while being simpler.

  7. Higgs pair production: choosing benchmarks with cluster analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Alexandra; Dall’Osso, Martino; Dorigo, Tommaso [Dipartimento di Fisica e Astronomia and INFN, Sezione di Padova,Via Marzolo 8, I-35131 Padova (Italy); Goertz, Florian [CERN,1211 Geneva 23 (Switzerland); Gottardo, Carlo A. [Physikalisches Institut, Universität Bonn,Nussallee 12, 53115 Bonn (Germany); Tosi, Mia [CERN,1211 Geneva 23 (Switzerland)

    2016-04-20

    New physics theories often depend on a large number of free parameters. The phenomenology they predict for fundamental physics processes is in some cases drastically affected by the precise value of those free parameters, while in other cases is left basically invariant at the level of detail experimentally accessible. When designing a strategy for the analysis of experimental data in the search for a signal predicted by a new physics model, it appears advantageous to categorize the parameter space describing the model according to the corresponding kinematical features of the final state. A multi-dimensional test statistic can be used to gauge the degree of similarity in the kinematics predicted by different models; a clustering algorithm using that metric may allow the division of the space into homogeneous regions, each of which can be successfully represented by a benchmark point. Searches targeting those benchmarks are then guaranteed to be sensitive to a large area of the parameter space. In this document we show a practical implementation of the above strategy for the study of non-resonant production of Higgs boson pairs in the context of extensions of the standard model with anomalous couplings of the Higgs bosons. A non-standard value of those couplings may significantly enhance the Higgs boson pair-production cross section, such that the process could be detectable with the data that the LHC will collect in Run 2.
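
    A minimal sketch of the benchmark-selection idea follows, using a hypothetical distance matrix between kinematic distributions and a simple k-medoids pass (NumPy only; the features, distance, and number of clusters are placeholders, not the analysis's actual multi-dimensional test statistic):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical kinematic summaries (e.g., normalized di-Higgs mass shapes)
# for 200 parameter-space points; each row is a normalized histogram.
feats = rng.random((200, 30))
feats /= feats.sum(axis=1, keepdims=True)

# Pairwise Euclidean distances stand in for the test statistic that
# gauges kinematic similarity between models.
dist = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=2)

def k_medoids(dist, k=12, iters=50, seed=1):
    """Crude k-medoids: the benchmark point of each cluster is its medoid."""
    n = dist.shape[0]
    medoids = np.random.default_rng(seed).choice(n, size=k, replace=False)
    for _ in range(iters):
        labels = dist[:, medoids].argmin(axis=1)      # assign to nearest medoid
        new = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size:                          # member minimizing total
                sub = dist[np.ix_(members, members)]  # intra-cluster distance
                new[c] = members[sub.sum(axis=0).argmin()]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, labels

benchmarks, labels = k_medoids(dist)  # indices of representative benchmark points
```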

  8. Polymerized Nile Blue derivatives for plasticizer-free fluorescent ion optode microsphere sensors.

    Science.gov (United States)

    Ngeontae, Wittaya; Xu, Chao; Ye, Nan; Wygladacz, Katarzyna; Aeungmaitrepirom, Wanlapa; Tuntulani, Thawatchai; Bakker, Eric

    2007-09-05

    Lipophilic H⁺-selective fluorophores such as Nile Blue derivatives are widely used in ISE-based pH sensors and bulk optodes, and are commonly dissolved in a plasticized matrix such as PVC. Unfortunately, leaching of the active sensing ingredients and plasticizer from the matrix dictates the lifetime of the sensors and hampers their applications in vivo, especially with miniaturized particle-based sensors. We find that classical copolymerization of Nile Blue derivatives containing an acrylic side group gives rise to multiple reaction products with different spectral and H⁺-binding properties, making this approach unsuitable for the development of reliable sensor materials. This limitation was overcome by grafting Nile Blue to a self-plasticized poly(n-butyl acrylate) matrix via a urea or amide linkage between the Nile Blue base structure and the polymer. Optode leaching experiments into methanol confirmed the successful covalent attachment of the two chromoionophores to the polymer matrix. Both polymerized Nile Blue derivatives have a satisfactory pH response and optical properties suitable for use in ion-selective electrodes and optodes. Plasticizer-free Na⁺-selective microsphere sensors using the polymerized chromoionophores were fabricated under mild conditions with an in-house sonic microparticle generator for the measurement of sodium activities at physiological pH. The measuring range for sodium was found to be 10⁻¹-10⁻⁴ M and 1-10⁻³ M for the Nile Blue derivatives linked via urea and amide functionalities, respectively, at physiological pH. The observed ion-exchange constants of the plasticizer-free microspheres were log K_exch = -5.6 and log K_exch = -6.5 for the same two systems, respectively. Compared with earlier Na⁺-selective bulk optodes, the optical sensing microbeads reported here have agreeable selectivity patterns, reasonably fast response times, and more appropriate measuring ranges for the determination of Na⁺ activity

  9. A Global Vision over Benchmarking Process: Benchmarking Based Enterprises

    OpenAIRE

    Sitnikov, Catalina; Giurca Vasilescu, Laura

    2008-01-01

    Benchmarking uses the knowledge and experience of others to improve the enterprise. Starting from an analysis of performance and underlining the strengths and weaknesses of the enterprise, one should assess what must be done in order to improve its activity. Using benchmarking techniques, an enterprise looks at how processes in the value chain are performed. The approach based on the vision “from the whole towards the parts” (a fragmented image of the enterprise’s value chain) redu...

  10. Simple mathematical law benchmarks human confrontations

    Science.gov (United States)

    Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-01-01

    Many high-profile societal problems involve an individual or group repeatedly attacking another – from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a ‘lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds. PMID:24322528

  11. Simple mathematical law benchmarks human confrontations

    Science.gov (United States)

    Johnson, Neil F.; Medina, Pablo; Zhao, Guannan; Messinger, Daniel S.; Horgan, John; Gill, Paul; Bohorquez, Juan Camilo; Mattson, Whitney; Gangi, Devon; Qi, Hong; Manrique, Pedro; Velasquez, Nicolas; Morgenstern, Ana; Restrepo, Elvira; Johnson, Nicholas; Spagat, Michael; Zarama, Roberto

    2013-12-01

    Many high-profile societal problems involve an individual or group repeatedly attacking another - from child-parent disputes, sexual violence against women, civil unrest, violent conflicts and acts of terror, to current cyber-attacks on national infrastructure and ultrafast cyber-trades attacking stockholders. There is an urgent need to quantify the likely severity and timing of such future acts, shed light on likely perpetrators, and identify intervention strategies. Here we present a combined analysis of multiple datasets across all these domains which account for >100,000 events, and show that a simple mathematical law can benchmark them all. We derive this benchmark and interpret it, using a minimal mechanistic model grounded by state-of-the-art fieldwork. Our findings provide quantitative predictions concerning future attacks; a tool to help detect common perpetrators and abnormal behaviors; insight into the trajectory of a `lone wolf'; identification of a critical threshold for spreading a message or idea among perpetrators; an intervention strategy to erode the most lethal clusters; and more broadly, a quantitative starting point for cross-disciplinary theorizing about human aggression at the individual and group level, in both real and online worlds.

  12. Novel Water Soluble Chitosan Derivatives with 1,2,3-Triazolium and Their Free Radical-Scavenging Activity.

    Science.gov (United States)

    Li, Qing; Sun, Xueqi; Gu, Guodong; Guo, Zhanyong

    2018-03-28

    Chitosan is an abundant and renewable polysaccharide, which exhibits attractive bioactivities and natural properties. Improvement such as chemical modification of chitosan is often performed for its potential to provide high bioactivity and good water solubility. A new class of chitosan derivatives possessing charged 1,2,3-triazolium units, obtained by combining the "click reaction" with efficient 1,2,3-triazole quaternization, was designed and synthesized. Their scavenging activity against three free radicals was tested. The inhibitory property and water solubility of the synthesized chitosan derivatives exhibited a remarkable improvement over chitosan. It is hypothesized that the triazole or triazolium groups give the synthesized chitosan its markedly better radical-scavenging activity. Moreover, the scavenging activity against the superoxide radical of the chitosan derivatives with triazolium (IC50 …) followed the same pattern in the radical-scavenging assay, which should be related to the triazolium groups grafted at the periphery of the molecular chains.

  13. A dynamic inertia weight particle swarm optimization algorithm

    International Nuclear Information System (INIS)

    Jiao Bin; Lian Zhigang; Gu Xingsheng

    2008-01-01

    The particle swarm optimization (PSO) algorithm has developed rapidly and has been applied widely since its introduction, as it is easily understood and implemented. This paper presents an improved particle swarm optimization algorithm (IPSO) to improve the performance of standard PSO, using a dynamic inertia weight that decreases as the iteration count increases. It is tested on a set of 6 benchmark functions in 30, 50 and 150 dimensions and compared with standard PSO. Experimental results indicate that the IPSO improves the search performance on the benchmark functions significantly.
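
    The abstract specifies a decreasing inertia weight but not its exact schedule. The following is a minimal sketch of an IPSO-style loop, assuming a linear decay from w_max = 0.9 to w_min = 0.4 (common PSO defaults, not values taken from the paper); the function name ipso and all parameter values are illustrative.

    ```python
    import numpy as np

    def ipso(f, dim, n_particles=30, iters=1000, w_max=0.9, w_min=0.4,
             c1=2.0, c2=2.0, lo=-5.0, hi=5.0, seed=0):
        """Minimal PSO whose inertia weight decays as generations increase."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(lo, hi, (n_particles, dim))          # positions
        v = np.zeros((n_particles, dim))                     # velocities
        pbest = x.copy()
        pbest_val = np.apply_along_axis(f, 1, x)
        g = pbest[np.argmin(pbest_val)].copy()               # global best
        for t in range(iters):
            w = w_max - (w_max - w_min) * t / (iters - 1)    # dynamic inertia weight
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
            x = np.clip(x + v, lo, hi)                       # keep iterates in bounds
            val = np.apply_along_axis(f, 1, x)
            improved = val < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], val[improved]
            g = pbest[np.argmin(pbest_val)].copy()
        return g, pbest_val.min()

    # e.g. the 30-dimensional sphere function used in many benchmark suites
    best_x, best_f = ipso(lambda z: float(np.sum(z * z)), dim=30)
    ```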

  14. An optimized ultra-fine energy group structure for neutron transport calculations

    International Nuclear Information System (INIS)

    Huria, Harish; Ouisloumen, Mohamed

    2008-01-01

    This paper describes an optimized energy group structure that was developed for neutron transport calculations in lattices using the Westinghouse lattice physics code PARAGON. The currently used 70-energy group structure results in significant discrepancies when the predictions are compared with those from the continuous energy Monte Carlo methods. The main source of the differences is the approximations employed in the resonance self-shielding methodology. This, in turn, leads to ambiguous adjustments in the resonance range cross-sections. The main goal of developing this group structure was to bypass the self-shielding methodology altogether thereby reducing the neutronic calculation errors. The proposed optimized energy mesh has 6064 points with 5877 points spanning the resonance range. The group boundaries in the resonance range were selected so that the micro group cross-sections matched reasonably well with those derived from reaction tallies of MCNP for a number of resonance absorbers of interest in reactor lattices. At the same time, however, the fast and thermal energy range boundaries were also adjusted to match the MCNP reaction rates in the relevant ranges. The resulting multi-group library was used to obtain eigenvalues for a wide variety of reactor lattice numerical benchmarks and also the Doppler reactivity defect benchmarks to establish its adequacy. (authors)

  15. A note on bound constraints handling for the IEEE CEC'05 benchmark function suite.

    Science.gov (United States)

    Liao, Tianjun; Molina, Daniel; de Oca, Marco A Montes; Stützle, Thomas

    2014-01-01

    The benchmark functions and some of the algorithms proposed for the special session on real parameter optimization of the 2005 IEEE Congress on Evolutionary Computation (CEC'05) have played and still play an important role in the assessment of the state of the art in continuous optimization. In this article, we show that if bound constraints are not enforced for the final reported solutions, state-of-the-art algorithms produce infeasible best candidate solutions for the majority of functions of the IEEE CEC'05 benchmark function suite. This occurs even though the optima of the CEC'05 functions are within the specified bounds. This phenomenon has important implications on algorithm comparisons, and therefore on algorithm designs. This article's goal is to draw the attention of the community to the fact that some authors might have drawn wrong conclusions from experiments using the CEC'05 problems.
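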
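
    The practical remedy implied by the article is simply to check, and if desired repair, final reported solutions against the benchmark's box constraints. A minimal sketch, assuming hypothetical helper names and a [-100, 100] box of the kind used by several CEC'05 functions:

    ```python
    import numpy as np

    def is_feasible(x, lower, upper):
        """True if a reported solution respects the benchmark's box constraints."""
        x, lower, upper = map(np.asarray, (x, lower, upper))
        return bool(np.all(x >= lower) and np.all(x <= upper))

    def clamp(x, lower, upper):
        """One simple repair: project an infeasible point back onto the box."""
        return np.clip(x, lower, upper)

    x_reported = np.array([101.2, -34.7])       # hypothetical final solution
    print(is_feasible(x_reported, -100, 100))   # False: outside the box
    print(clamp(x_reported, -100, 100))         # [100.  -34.7]
    ```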

  16. Argonne Code Center: Benchmark problem book.

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    1977-06-01

    This book is an outgrowth of activities of the Computational Benchmark Problems Committee of the Mathematics and Computation Division of the American Nuclear Society. This is the second supplement of the original benchmark book, which was first published in February 1968 and contained computational benchmark problems in four different areas. Supplement No. 1, which was published in December 1972, contained corrections to the original benchmark book plus additional problems in three new areas. The current supplement, Supplement No. 2, contains problems in eight additional new areas. The objectives of computational benchmark work and the procedures used by the committee in pursuing the objectives are outlined in the original edition of the benchmark book (ANL-7416, February 1968). The members of the committee who have made contributions to Supplement No. 2 are listed below, followed by the contributors to the earlier editions of the benchmark book.

  17. Benchmarks for GADRAS performance validation

    International Nuclear Information System (INIS)

    Mattingly, John K.; Mitchell, Dean James; Rhykerd, Charles L. Jr.

    2009-01-01

    The performance of the Gamma Detector Response and Analysis Software (GADRAS) was validated by comparing GADRAS model results to experimental measurements for a series of benchmark sources. Sources for the benchmark include a plutonium metal sphere, bare and shielded in polyethylene, plutonium oxide in cans, a highly enriched uranium sphere, bare and shielded in polyethylene, a depleted uranium shell and spheres, and a natural uranium sphere. The benchmark experimental data were previously acquired and consist of careful collection of background and calibration source spectra along with the source spectra. The calibration data were fit with GADRAS to determine response functions for the detector in each experiment. A one-dimensional model (pie chart) was constructed for each source based on the dimensions of the benchmark source. The GADRAS code made a forward calculation from each model to predict the radiation spectrum for the detector used in the benchmark experiment. The comparisons between the GADRAS calculation and the experimental measurements are excellent, validating that GADRAS can correctly predict the radiation spectra for these well-defined benchmark sources.

  18. Benchmarking in Czech Higher Education

    Directory of Open Access Journals (Sweden)

    Plaček Michal

    2015-12-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used on this level today, but most actors show some interest in its introduction. The expressed need for it, and the importance of benchmarking as a very suitable performance-management tool in less developed countries, are the impetus for the second part of our article. Based on an analysis of the current situation and existing needs in the Czech Republic, as well as on a comparison with international experience, recommendations for public policy are made; these lie in the design of a collaborative benchmarking model for Czech higher-education programs in economics and management. Because the fully complex model cannot be implemented immediately (which is also confirmed by structured interviews with academics who have practical experience with benchmarking), the final model is designed as a multi-stage model. This approach helps eliminate major barriers to the implementation of benchmarking.

  19. Implementation, capabilities, and benchmarking of Shift, a massively parallel Monte Carlo radiation transport code

    International Nuclear Information System (INIS)

    Pandya, Tara M.; Johnson, Seth R.; Evans, Thomas M.; Davidson, Gregory G.; Hamilton, Steven P.; Godfrey, Andrew T.

    2015-01-01

    This paper discusses the implementation, capabilities, and validation of Shift, a massively parallel Monte Carlo radiation transport package developed and maintained at Oak Ridge National Laboratory. It has been developed to scale well from laptop to small computing clusters to advanced supercomputers. Special features of Shift include hybrid capabilities for variance reduction such as CADIS and FW-CADIS, and advanced parallel decomposition and tally methods optimized for scalability on supercomputing architectures. Shift has been validated and verified against various reactor physics benchmarks and compares well to other state-of-the-art Monte Carlo radiation transport codes such as MCNP5, CE KENO-VI, and OpenMC. Some specific benchmarks used for verification and validation include the CASL VERA criticality test suite and several Westinghouse AP1000® problems. These benchmark and scaling studies show promising results

  20. Deterministic global optimization an introduction to the diagonal approach

    CERN Document Server

    Sergeyev, Yaroslav D

    2017-01-01

    This book begins with a concentrated introduction into deterministic global optimization and moves forward to present new original results from the authors, who are well-known experts in the field. Multiextremal continuous problems that have an unknown structure with Lipschitz objective functions and functions having the first Lipschitz derivatives defined over hyperintervals are examined. A class of algorithms using several Lipschitz constants is introduced which has its origins in the DIRECT (DIviding RECTangles) method. This new class is based on an efficient strategy that is applied for the search domain partitioning. In addition a survey on derivative-free methods and methods using the first derivatives is given for both one-dimensional and multi-dimensional cases. Non-smooth and smooth minorants and acceleration techniques that can speed up several classes of global optimization methods with examples of applications and problems arising in numerical testing of global optimization algorithms are discussed...

  1. New scale-down methodology from commercial to lab scale to optimize plant-derived soft gel capsule formulations on a commercial scale.

    Science.gov (United States)

    Oishi, Sana; Kimura, Shin-Ichiro; Noguchi, Shuji; Kondo, Mio; Kondo, Yosuke; Shimokawa, Yoshiyuki; Iwao, Yasunori; Itai, Shigeru

    2018-01-15

    A new scale-down methodology from commercial rotary die scale to laboratory scale was developed to optimize a plant-derived soft gel capsule formulation and eventually manufacture superior soft gel capsules on a commercial scale, in order to reduce the time and cost for formulation development. Animal-derived and plant-derived soft gel film sheets were prepared using an applicator on a laboratory scale and their physicochemical properties, such as tensile strength, Young's modulus, and adhesive strength, were evaluated. The tensile strength of the animal-derived and plant-derived soft gel film sheets was 11.7 MPa and 4.41 MPa, respectively. The Young's modulus of the animal-derived and plant-derived soft gel film sheets was 169 MPa and 17.8 MPa, respectively, and both sheets showed a similar adhesion strength of approximately 4.5-10 MPa. Using a D-optimal mixture design, plant-derived soft gel film sheets were prepared and optimized by varying their composition, including variations in the mass of κ-carrageenan, ι-carrageenan, oxidized starch and heat-treated starch. The physicochemical properties of the sheets were evaluated to determine the optimal formulation. Finally, plant-derived soft gel capsules were manufactured using the rotary die method and the prepared soft gel capsules showed equivalent or superior physical properties compared with pre-existing soft gel capsules. Therefore, we successfully developed a new scale-down methodology to optimize the formulation of plant-derived soft gel capsules on a commercial scale. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. ANN-Benchmarks: A Benchmarking Tool for Approximate Nearest Neighbor Algorithms

    DEFF Research Database (Denmark)

    Aumüller, Martin; Bernhardsson, Erik; Faithfull, Alexander

    2017-01-01

    This paper describes ANN-Benchmarks, a tool for evaluating the performance of in-memory approximate nearest neighbor algorithms. It provides a standard interface for measuring the performance and quality achieved by nearest neighbor algorithms on different standard data sets. It supports several… visualise these as images, plots, and websites with interactive plots. ANN-Benchmarks aims to provide a constantly updated overview of the current state of the art of k-NN algorithms. In the short term, this overview allows users to choose the correct k-NN algorithm and parameters for their similarity search task; in the longer term, algorithm designers will be able to use this overview to test and refine automatic parameter tuning. The paper gives an overview of the system, evaluates the results of the benchmark, and points out directions for future work. Interestingly, very different…

  3. Exchange Rate Exposure Management: The Benchmarking Process of Industrial Companies

    DEFF Research Database (Denmark)

    Aabo, Tom

    Based on a cross-case study of Danish industrial companies the paper analyzes the benchmarking of the optimal hedging strategy. A stock market approach is pursued but a serious question mark is put on the validity of the obtained information seen from a corporate value-adding point of view… The conducted interviews show that empirical reasons behind actual hedging strategies vary considerably - some in accordance with mainstream finance theory, some resting on asymmetric information. The diversity of attitudes seems to be partly a result of different competitive environments, partly a result of practices and strategies that have been established in each company fairly independently over time. The paper argues that hedge benchmarks are useful in their creation process (by forcing a comprehensive analysis) as well as in their final status (by the establishment of a consistent hedging strategy…)

  4. Benchmarking Swiss electricity grids

    International Nuclear Information System (INIS)

    Walti, N.O.; Weber, Ch.

    2001-01-01

    This extensive article describes a pilot benchmarking project initiated by the Swiss Association of Electricity Enterprises that assessed 37 Swiss utilities. The data collected from these utilities on a voluntary basis included data on technical infrastructure, investments and operating costs. These various factors are listed and discussed in detail. The assessment methods and rating mechanisms that provided the benchmarks are discussed, and the results of the pilot study are presented; these are to form the basis of benchmarking procedures for the grid regulation authorities under Switzerland's planned electricity market law. Examples of the practical use of the benchmarking methods are given, and cost-efficiency questions still open in the area of investment and operating costs are listed. Prefaces by the Swiss Association of Electricity Enterprises and the Swiss Federal Office of Energy complete the article.

  5. Development of free-piston Stirling engine performance and optimization codes based on Martini simulation technique

    Science.gov (United States)

    Martini, William R.

    1989-01-01

    A FORTRAN computer code is described that could be used to design and optimize a free-displacer, free-piston Stirling engine similar to the RE-1000 engine made by Sunpower. The code contains options for specifying displacer and power piston motion or for allowing these motions to be calculated by a force balance. The engine load may be a dashpot, inertial compressor, hydraulic pump or linear alternator. Cycle analysis may be done by isothermal analysis or adiabatic analysis. Adiabatic analysis may be done using the Martini moving gas node analysis or the Rios second-order Runge-Kutta analysis. Flow loss and heat loss equations are included. Graphical displays of engine motions, pressures and temperatures are included. Programming for optimizing up to 15 independent dimensions is included. Sample performance results are shown for both specified and unconstrained piston motions; these results are shown as generated by each of the two Martini analyses. Two sample optimization searches are shown using specified piston motion and isothermal analysis: one for three adjustable inputs and one for four. Also, two optimization searches for calculated piston motion are presented, for three and for four adjustable inputs. The effect of leakage is evaluated. Suggestions for further work are given.
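
    The design code itself is FORTRAN and is not reproduced in the abstract. As a hedged illustration of the optimization step only, the sketch below drives a stand-in engine model with a derivative-free Nelder-Mead search over three adjustable inputs; the function engine_power and its toy response surface are assumptions, not the Martini code.

    ```python
    from scipy.optimize import minimize

    def engine_power(params):
        """Hypothetical stand-in for an engine simulation: returns negative
        indicated power for given adjustable dimensions (e.g. bore,
        displacer stroke, heater length)."""
        bore, stroke, heater_len = params
        # toy response surface with an interior optimum, purely illustrative
        return -(bore * stroke * heater_len
                 - 0.5 * bore**2 - 0.3 * stroke**2 - 0.2 * heater_len**2)

    # three adjustable inputs, as in the first sample search described above
    result = minimize(engine_power, x0=[1.0, 1.0, 1.0], method="Nelder-Mead")
    print(result.x, -result.fun)
    ```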

  6. Maximizing Use of Extension Beef Cattle Benchmarks Data Derived from Cow Herd Appraisal Performance Software

    Science.gov (United States)

    Ramsay, Jennifer M.; Hanna, Lauren L. Hulsman; Ringwall, Kris A.

    2016-01-01

    One goal of Extension is to provide practical information that makes a difference to producers. Cow Herd Appraisal Performance Software (CHAPS) has provided beef producers with production benchmarks for 30 years, creating a large historical data set. Many such large data sets contain useful information but are underutilized. Our goal was to create…

  7. Adaptive Conflict-Free Optimization of Rule Sets for Network Security Packet Filtering Devices

    Directory of Open Access Journals (Sweden)

    Andrea Baiocchi

    2015-01-01

    Packet filtering and processing rules management in firewalls and security gateways has become commonplace in increasingly complex networks. On one side there is a need to maintain the logic of high-level policies, which requires administrators to implement and update a large number of filtering rules while keeping them conflict-free, that is, avoiding security inconsistencies. On the other side, traffic-adaptive optimization of large rule lists is useful for general-purpose computers used as filtering devices, without specifically designed hardware, to face growing link speeds and to harden filtering devices against DoS and DDoS attacks. Our work joins the two issues in an innovative way and defines a traffic-adaptive algorithm to find conflict-free optimized rule sets by relying on information gathered from traffic logs. The proposed approach suits current technology architectures and exploits available features, like traffic log databases, to minimize the impact of adaptive conflict-free optimization (ACO) development on the packet filtering devices. We demonstrate the benefit entailed by the proposed algorithm through measurements on a test bed made up of real-life, commercial packet filtering devices.
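
    The abstract does not spell out the algorithm, so the following is only a plausible sketch of its central constraint: rules may be promoted according to traffic-log hit counts, but two rules whose match sets overlap with different actions must keep their relative order for the list to stay conflict-free. All rule and field names are hypothetical.

    ```python
    def conflicts(r1, r2):
        """Hypothetical predicate: do two rules match a common packet set
        with different actions? If so, their relative order must be kept."""
        return bool(r1["match"] & r2["match"]) and r1["action"] != r2["action"]

    def reorder(rules, hits):
        """Stable bubble passes: move frequently hit rules forward, but never
        across a rule they conflict with (keeps the list conflict-free)."""
        rules = list(rules)
        for _ in range(len(rules)):
            for i in range(len(rules) - 1):
                a, b = rules[i], rules[i + 1]
                if hits[b["id"]] > hits[a["id"]] and not conflicts(a, b):
                    rules[i], rules[i + 1] = b, a
        return rules

    # toy rules whose 'match' sets are frozensets of header patterns
    rules = [{"id": 1, "match": frozenset({"tcp/80"}), "action": "drop"},
             {"id": 2, "match": frozenset({"udp/53"}), "action": "accept"},
             {"id": 3, "match": frozenset({"tcp/80"}), "action": "accept"}]
    hits = {1: 5, 2: 900, 3: 40}       # counts gathered from traffic logs
    print([r["id"] for r in reorder(rules, hits)])   # [2, 1, 3]: 3 stays behind 1
    ```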

  8. Free terminal time optimal control problem of an HIV model based on a conjugate gradient method.

    Science.gov (United States)

    Jang, Taesoo; Kwon, Hee-Dae; Lee, Jeehyun

    2011-10-01

    The minimum duration of treatment periods and the optimal multidrug therapy for human immunodeficiency virus (HIV) type 1 infection are considered. We formulate an optimal tracking problem, attempting to drive the states of the model to a "healthy" steady state in which the viral load is low and the immune response is strong. We study an optimal time frame as well as HIV therapeutic strategies by analyzing the free terminal time optimal tracking control problem. The minimum duration of treatment periods and the optimal multidrug therapy are found by solving the corresponding optimality systems with the additional transversality condition for the terminal time. We demonstrate by numerical simulations that the optimal dynamic multidrug therapy can lead to the long-term control of HIV by the strong immune response after discontinuation of therapy.

  9. Calculation of free-energy differences from computer simulations of initial and final states

    International Nuclear Information System (INIS)

    Hummer, G.; Szabo, A.

    1996-01-01

    A class of simple expressions of increasing accuracy for the free-energy difference between two states is derived based on numerical thermodynamic integration. The implementation of these formulas requires simulations of the initial and final (and possibly a few intermediate) states. They involve higher free-energy derivatives at these states which are related to the moments of the probability distribution of the perturbation. Given a specified number of such derivatives, these integration formulas are optimal in the sense that they are exact to the highest possible order of free-energy perturbation theory. The utility of this approach is illustrated for the hydration free energy of water. This problem provides a quite stringent test because the free energy is a highly nonlinear function of the charge so that even fourth order perturbation theory gives a very poor estimate of the free-energy change. Our results should prove most useful for complex, computationally demanding problems where free-energy differences arise primarily from changes in the electrostatic interactions (e.g., electron transfer, charging of ions, protonation of amino acids in proteins). copyright 1996 American Institute of Physics
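
    The paper's exact formulas are not reproduced in the abstract; the sketch below illustrates the general idea with the simplest such estimator, a two-point rule that combines the endpoint means (the trapezoid term, exact to second order) with a variance correction obtained from the second free-energy derivative d2F/dlambda2 = -beta*var(dU). Linear coupling and units of kcal/mol at 300 K are assumptions of this sketch.

    ```python
    import numpy as np

    BETA = 1.0 / (0.0019872 * 300.0)   # 1/kT in kcal/mol at 300 K

    def endpoint_free_energy(dU0, dU1, beta=BETA):
        """Two-state estimate of dF from samples of the perturbation energy
        dU = U1 - U0 collected in state 0 and in state 1.  The mean term is
        the trapezoid rule for thermodynamic integration; the variance term
        is the first derivative correction, in the spirit of the paper's
        endpoint-based integration formulas."""
        m0, m1 = np.mean(dU0), np.mean(dU1)
        v0, v1 = np.var(dU0), np.var(dU1)
        return 0.5 * (m0 + m1) - (beta / 12.0) * (v0 - v1)

    # synthetic Gaussian perturbation energies at the two endpoints
    rng = np.random.default_rng(1)
    print(endpoint_free_energy(rng.normal(2.0, 1.0, 10000),
                               rng.normal(-2.0, 1.0, 10000)))
    ```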

  10. Dynamic PET of human liver inflammation: impact of kinetic modeling with optimization-derived dual-blood input function.

    Science.gov (United States)

    Wang, Guobao; Corwin, Michael T; Olson, Kristin A; Badawi, Ramsey D; Sarkar, Souvik

    2018-05-30

    The hallmark of nonalcoholic steatohepatitis is hepatocellular inflammation and injury in the setting of hepatic steatosis. Recent work has indicated that dynamic 18F-FDG PET with kinetic modeling has the potential to assess hepatic inflammation noninvasively, whereas static FDG-PET did not show promise. Because the liver has dual blood supplies, kinetic modeling of dynamic liver PET data is challenging in human studies. The objective of this study is to evaluate and identify a dual-input kinetic modeling approach for dynamic FDG-PET of human liver inflammation. Fourteen human patients with nonalcoholic fatty liver disease were included in the study. Each patient underwent a one-hour dynamic FDG-PET/CT scan and had a liver biopsy within six weeks. Three models were tested for kinetic analysis: the traditional two-tissue compartmental model with an image-derived single-blood input function (SBIF), a model with a population-based dual-blood input function (DBIF), and a modified model with an optimization-derived DBIF obtained through a joint estimation framework. The three models were compared using the Akaike information criterion (AIC), the F test and a histopathologic inflammation reference. The results showed that the optimization-derived DBIF model improved the fitting of liver time activity curves and achieved lower AIC values and higher F values than the SBIF and population-based DBIF models in all patients. The optimization-derived model significantly increased FDG K1 estimates by 101% and 27% as compared with the traditional SBIF and population-based DBIF. K1 by the optimization-derived model was significantly associated with histopathologic grades of liver inflammation, while the other two models did not provide statistical significance. In conclusion, modeling of the DBIF is critical for kinetic analysis of dynamic liver FDG-PET data in human studies. The optimization-derived DBIF model is more appropriate than SBIF and population-based DBIF for dynamic FDG-PET of liver inflammation.

  11. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals and (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
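
    As a concrete illustration of the prediction-type construction the paper starts from (which treats the benchmark proportion as known and therefore ignores its uncertainty), exact binomial limits can be computed for a range of provider sizes; the function and parameter names below are illustrative.

    ```python
    import numpy as np
    from scipy.stats import binom

    def funnel_limits(p0, n, alpha=0.05):
        """Approximate 95% funnel-plot control limits for provider sizes n,
        treating the benchmark proportion p0 as known (prediction-type
        limits; the paper's point is that uncertainty in p0 is ignored)."""
        n = np.asarray(n)
        lower = binom.ppf(alpha / 2, n, p0) / n
        upper = binom.ppf(1 - alpha / 2, n, p0) / n
        return lower, upper

    sizes = np.arange(20, 500, 20)
    lo95, hi95 = funnel_limits(p0=0.10, n=sizes)
    # a provider with 30 events out of 100 (30%) falls far above hi95 at n=100
    ```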

  12. Benchmarking af kommunernes sagsbehandling

    DEFF Research Database (Denmark)

    Amilon, Anna

    From 2007, the Danish National Social Appeals Board (Ankestyrelsen) is to carry out benchmarking of the quality of municipal casework. The purpose of the benchmarking is to develop the design of the practice reviews with a view to better follow-up, and to improve municipal casework. This working paper discusses methods for benchmarking...

  13. Perspective: Recommendations for benchmarking pre-clinical studies of nanomedicines

    Science.gov (United States)

    Dawidczyk, Charlene M.; Russell, Luisa M.; Searson, Peter C.

    2015-01-01

    Nanoparticle-based delivery systems provide new opportunities to overcome the limitations associated with traditional small molecule drug therapy for cancer, and to achieve both therapeutic and diagnostic functions in the same platform. Pre-clinical trials are generally designed to assess therapeutic potential and not to optimize the design of the delivery platform. Consequently, progress in developing design rules for cancer nanomedicines has been slow, hindering progress in the field. Despite the large number of pre-clinical trials, several factors restrict comparison and benchmarking of different platforms, including variability in experimental design, reporting of results, and the lack of quantitative data. To solve this problem, we review the variables involved in the design of pre-clinical trials and propose a protocol for benchmarking that we recommend be included in in vivo pre-clinical studies of drug delivery platforms for cancer therapy. This strategy will contribute to building the scientific knowledge base that enables development of design rules and accelerates the translation of new technologies. PMID:26249177

  14. A new enhanced index tracking model in portfolio optimization with sum weighted approach

    Science.gov (United States)

    Siew, Lam Weng; Jaaman, Saiful Hafizah; Hoe, Lam Weng

    2017-04-01

    Index tracking is a portfolio management approach which aims to construct an optimal portfolio that achieves a return similar to the benchmark index return, at minimum tracking error, without purchasing all the stocks that make up the index. Enhanced index tracking is an improved approach which aims to generate a higher portfolio return than the benchmark index return, besides minimizing the tracking error. The objective of this paper is to propose a new enhanced index tracking model with a sum weighted approach to improve the existing index tracking model for tracking the benchmark Technology Index in Malaysia. The optimal portfolio composition and performance of both models are determined and compared in terms of portfolio mean return, tracking error and information ratio. The results of this study show that the optimal portfolio of the proposed model is able to generate a higher mean return than the benchmark index at minimum tracking error. Besides that, the proposed model is able to outperform the existing model in tracking the benchmark index. The significance of this study is to propose a new enhanced index tracking model with a sum weighted approach, which yields a 67% improvement in portfolio mean return as compared to the existing model.
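
    The paper's sum weighted formulation is not given in the abstract, so the sketch below shows only a generic enhanced index tracking program: minimize tracking error less a multiple of mean excess return, over long-only weights summing to one. The data, the trade-off weight lam and the solver choice are all assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def enhanced_index_tracking(R, r_index, lam=0.5):
        """R: T x N matrix of stock returns, r_index: length-T index returns.
        Minimize tracking error minus lam * mean excess return, with
        long-only weights summing to one (a generic formulation; the
        paper's own sum-weighted objective may differ)."""
        T, N = R.shape
        def objective(w):
            diff = R @ w - r_index                      # active returns
            return np.std(diff) - lam * np.mean(diff)   # TE minus excess return
        cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
        res = minimize(objective, np.full(N, 1.0 / N), method="SLSQP",
                       bounds=[(0.0, 1.0)] * N, constraints=cons)
        return res.x

    rng = np.random.default_rng(0)
    R = rng.normal(0.001, 0.02, (250, 8))               # synthetic daily returns
    r_index = R @ np.full(8, 1.0 / 8) + rng.normal(0, 0.001, 250)
    w = enhanced_index_tracking(R, r_index)
    ```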

  15. Free-energy coarse-grained potential for C60

    International Nuclear Information System (INIS)

    Edmunds, D. M.; Tangney, P.; Vvedensky, D. D.; Foulkes, W. M. C.

    2015-01-01

    We propose a new deformable free energy method for generating a free-energy coarse-graining potential for C60. Potentials generated from this approach exhibit a strong temperature dependence and produce excellent agreement with benchmark fully atomistic molecular dynamics simulations. Parameter sets for analytical fits to this potential are provided at four different temperatures.

  16. Benchmarking of refinery CO2 emissions. The CWT methodology provides a way forward

    Energy Technology Data Exchange (ETDEWEB)

    Larive, J.F. [CONCAWE, Brussels (Belgium)

    2009-10-01

    The EU Greenhouse Gas Emissions Trading Scheme foresees a number of mechanisms for distributing emission allowances amongst market players. For those economic sectors exposed to international competition, a portion of the required allowances will be distributed free of charge. In order to do this in an equitable manner, the amount of free allowances will be based on a sectoral benchmark representing best practice in the sector. In cooperation with Solomon Associates, CONCAWE has developed the so-called Complexity Weighted Tonne (CWT) methodology which provides a common and balanced basis for comparing the performance of refineries.

  17. Optimizing Combinations of Flavonoids Deriving from Astragali Radix in Activating the Regulatory Element of Erythropoietin by a Feedback System Control Scheme

    Directory of Open Access Journals (Sweden)

    Hui Yu

    2013-01-01

    Identifying a potent drug combination from a herbal mixture is usually quite challenging, due to the large number of possible trials. Using an engineering approach, the feedback system control (FSC) scheme, we identified the potential best combinations of four flavonoids, including formononetin, ononin, calycosin, and calycosin-7-O-β-D-glucoside, deriving from Astragali Radix (AR; Huangqi), which provided the best biological action at minimal doses. Out of more than one thousand possible combinations, only tens of trials were required to optimize the flavonoid combinations that stimulated a maximal transcriptional activity of the hypoxia response element (HRE), a critical regulator for erythropoietin (EPO) transcription, in cultured human embryonic kidney fibroblasts (HEK293T). By using the FSC scheme, 90% of the work and time could be saved, and the optimized flavonoid combinations increased the HRE-mediated transcriptional activity by ~3-fold as compared with individual flavonoids, while the amount of flavonoids was reduced by ~10-fold. Our study suggests that the optimized combination of flavonoids may have a strong effect in activating the regulatory element of erythropoietin at very low dosage, and may be used as a new source of natural hematopoietic agents. The present work also indicates that the FSC scheme is able to serve as an efficient and model-free approach to optimizing the combination of different ingredients within a herbal decoction.
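
    The FSC scheme's internals are not described in the abstract; as a rough illustration of why only tens of trials can suffice on a combinatorial dose grid, the sketch below runs a model-free closed-loop search (here plain stochastic hill climbing, not the actual FSC controller) against a hypothetical HRE-activity readout.

    ```python
    import random

    DOSES = [0, 1, 3, 10]          # arbitrary dose levels for each flavonoid
    random.seed(0)

    def hre_activity(combo):
        """Hypothetical stand-in for the HRE reporter assay readout of a
        4-flavonoid combination; purely illustrative response surface."""
        a, b, c, d = combo
        return a * b * 0.05 + c * 0.3 + d * 0.2 - 0.01 * (a + b + c + d) ** 1.5

    def closed_loop_search(n_iters=30):
        """Model-free search over the 4^4 = 256 dose grid: evaluate a
        combination, perturb one component, keep the change if the
        assay readout improves."""
        combo = [random.choice(DOSES) for _ in range(4)]
        best = hre_activity(combo)
        for _ in range(n_iters):                # tens of trials, not 256
            trial = combo[:]
            trial[random.randrange(4)] = random.choice(DOSES)
            y = hre_activity(trial)
            if y > best:
                combo, best = trial, y
        return combo, best

    print(closed_loop_search())
    ```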

  18. MFTF TOTAL benchmark

    International Nuclear Information System (INIS)

    Choy, J.H.

    1979-06-01

    A benchmark of the TOTAL data base management system as applied to the Mirror Fusion Test Facility (MFTF) data base was implemented and run in February and March of 1979. The benchmark was run on an Interdata 8/32 and involved the following tasks: (1) data base design, (2) data base generation, (3) data base load, and (4) develop and implement programs to simulate MFTF usage of the data base

  19. The Drill Down Benchmark

    NARCIS (Netherlands)

    P.A. Boncz (Peter); T. Rühl (Tim); F. Kwakkel

    1998-01-01

    Data Mining places specific requirements on DBMS query performance that cannot be evaluated satisfactorily using existing OLAP benchmarks. The DD Benchmark - defined here - provides a practical case and yardstick to explore how well a DBMS is able to support Data Mining applications. It…

  20. Fluorescent in situ folding control for rapid optimization of cell-free membrane protein synthesis.

    Directory of Open Access Journals (Sweden)

    Annika Müller-Lucks

    Cell-free synthesis is an open and powerful tool for high-yield protein production in small reaction volumes, predestined for high-throughput structural and functional analysis. Membrane proteins require the addition of detergents for solubilization, liposomes, or nanodiscs. Hence, the number of parameters to be tested is significantly higher than with soluble proteins. Optimization is commonly done with respect to protein yield, yet without knowledge of the protein folding status. This approach contains a large inherent risk of ending up with non-functional protein. We show that fluorophore formation in C-terminal fusions with green fluorescent protein (GFP) indicates the folding state of a membrane protein in situ, i.e. within the cell-free reaction mixture, as confirmed by circular dichroism (CD), proteoliposome reconstitution, and functional assays. Quantification of protein yield and in-gel fluorescence intensity imply suitability of the method for membrane proteins of bacterial, protozoan, plant, and mammalian origin, representing vacuolar and plasma membrane localization, as well as intra- and extracellular positioning of the C-terminus. We conclude that GFP-fusions provide an extension to cell-free protein synthesis systems, eliminating the need for experimental folding control and, thus, enabling rapid optimization towards membrane protein quality.

  1. Benchmarking and Learning in Public Healthcare

    DEFF Research Database (Denmark)

    Buckmaster, Natalie; Mouritsen, Jan

    2017-01-01

    This research investigates the effects of learning-oriented benchmarking in public healthcare settings. Benchmarking is a widely adopted yet little explored accounting practice that is part of the paradigm of New Public Management. Extant studies are directed towards mandated coercive benchmarking...

  2. Particle swarm optimization: an alternative in marine propeller optimization?

    Science.gov (United States)

    Vesting, F.; Bensow, R. E.

    2018-01-01

    This article deals with improving and evaluating the performance of two evolutionary algorithm approaches for automated engineering design optimization. Here a marine propeller design with constraints on cavitation nuisance is the intended application. For this purpose, the particle swarm optimization (PSO) algorithm is adapted for multi-objective optimization and constraint handling for use in propeller design. Three PSO algorithms are developed and tested for the optimization of four commercial propeller designs for different ship types. The results are evaluated by interrogating the generation medians and the Pareto front development. The same propellers are also optimized utilizing the well established NSGA-II genetic algorithm to provide benchmark results. The authors' PSO algorithms deliver comparable results to NSGA-II, but converge earlier and enhance the solution in terms of constraints violation.

  3. A high-fidelity airbus benchmark for system fault detection and isolation and flight control law clearance

    Science.gov (United States)

    Goupil, Ph.; Puyou, G.

    2013-12-01

    This paper presents a high-fidelity generic twin engine civil aircraft model developed by Airbus for advanced flight control system research. The main features of this benchmark are described to make the reader aware of the model complexity and representativeness. It is a complete representation including the nonlinear rigid-body aircraft model with a full set of control surfaces, actuator models, sensor models, flight control laws (FCL), and pilot inputs. Two applications of this benchmark in the framework of European projects are presented: FCL clearance using optimization and advanced fault detection and diagnosis (FDD).

  4. Benchmarking & European Sustainable Transport Policies

    DEFF Research Database (Denmark)

    Gudmundsson, H.

    2003-01-01

    Benchmarking is one of the management tools that have recently been introduced in the transport sector. It is rapidly being applied to a wide range of transport operations, services and policies. This paper is a contribution to the discussion of the role of benchmarking in the future efforts to… contribution to the discussions within the EU-sponsored BEST Thematic Network (Benchmarking European Sustainable Transport), which ran from 2000 to 2003.

  5. Benchmarking – A tool for judgment or improvement?

    DEFF Research Database (Denmark)

    Rasmussen, Grane Mikael Gregaard

    2010-01-01

    Change in construction is high on the agenda for the Danish government, and a comprehensive effort is being made to improve quality and efficiency. This has led to a governmental effort to bring benchmarking into the Danish construction sector. This paper is an appraisal of benchmarking as it is presently carried out in the Danish construction sector. Many different perceptions of benchmarking, and of the nature of the construction sector, lead to an uncertainty in how to perceive and use benchmarking, hence generating an uncertainty in understanding the effects of benchmarking. This paper addresses… Two perceptions of benchmarking will be presented: public benchmarking and best practice benchmarking. These two types of benchmarking are used to characterize and discuss the Danish benchmarking system and to examine which effects, possibilities and challenges follow in the wake of using this kind…

  6. Exploring the QSAR's predictive truthfulness of the novel N-tuple discrete derivative indices on benchmark datasets.

    Science.gov (United States)

    Martínez-Santiago, O; Marrero-Ponce, Y; Vivas-Reyes, R; Rivera-Borroto, O M; Hurtado, E; Treto-Suarez, M A; Ramos, Y; Vergara-Murillo, F; Orozco-Ugarriza, M E; Martínez-López, Y

    2017-05-01

    Graph derivative indices (GDIs) have recently been defined over N-atoms (N = 2, 3 and 4) simultaneously, based on the concept of derivatives in discrete mathematics (finite differences), metaphorical to the derivative concept in classical mathematical analysis. These molecular descriptors (MDs) codify topo-chemical and topo-structural information based on the concept of the derivative of a molecular graph with respect to a given event (S) over duplex, triplex and quadruplex relations of atoms (vertices). These GDIs have been successfully applied in the description of physicochemical properties like reactivity, solubility and chemical shift, among others, and in several comparative quantitative structure activity/property relationship (QSAR/QSPR) studies. Although satisfactory results have been obtained in previous modelling studies with the aforementioned indices, it is necessary to develop new, more rigorous analyses to assess the true predictive performance of the novel structure codification. So, in the present paper, an assessment and statistical validation of the performance of these novel approaches in QSAR studies are executed, as well as a comparison with those of other QSAR procedures reported in the literature. To achieve the main aim of this research, QSARs were developed on eight chemical datasets widely used as benchmarks in the evaluation/validation of several QSAR methods and/or many different MDs (fundamentally 3D MDs). Three- to seven-variable QSAR models were built for each chemical dataset, according to the original dissection into training/test sets. The models were developed by using multiple linear regression (MLR) coupled with a genetic algorithm as the feature wrapper selection technique in the MobyDigs software. Each family of GDIs (for duplex, triplex and quadruplex) behaves similarly in all modelling, although there were some exceptions. However, when all families were used in combination, the results achieved were quantitatively…

  7. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  8. The Isprs Benchmark on Indoor Modelling

    Science.gov (United States)

    Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.

    2017-09-01

    Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in the literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.

  9. Benchmarking the energy efficiency of commercial buildings

    International Nuclear Information System (INIS)

    Chung, William; Hui, Y.V.; Lam, Y. Miu

    2006-01-01

    Benchmarking energy efficiency is an important tool to promote the efficient use of energy in commercial buildings. Benchmarking models are mostly constructed as a simple benchmark table (percentile table) of energy use, normalized by floor area and temperature. This paper describes a benchmarking process for energy efficiency by means of multiple regression analysis, where the relationship between energy-use intensities (EUIs) and the explanatory factors (e.g., operating hours) is developed. Using the resulting regression model, these EUIs are then normalized by removing the effect of deviance in the significant explanatory factors. The empirical cumulative distribution of the normalized EUI gives a benchmark table (or percentile table of EUI) for benchmarking an observed EUI. The advantage of this approach is that the benchmark table represents a normalized distribution of EUI, taking into account all the significant explanatory factors that affect energy consumption. An application to supermarkets is presented to illustrate the development and the use of the benchmarking method.
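
    A minimal sketch of the normalization step: regress EUI on the significant explanatory factors, subtract their fitted effect, and percentile-rank the normalized values to form the benchmark table. The single factor (operating hours) and all data below are synthetic assumptions.

    ```python
    import numpy as np

    def benchmark_table(eui, factors):
        """Fit EUI = a + factors @ b by least squares, remove the effect of
        the explanatory factors, and return normalized EUIs plus a
        percentile rank for each building (the benchmark table)."""
        X = np.column_stack([np.ones(len(eui)), factors])
        coef, *_ = np.linalg.lstsq(X, eui, rcond=None)
        adjusted = eui - factors @ coef[1:]            # normalized EUI
        ranks = adjusted.argsort().argsort() / (len(eui) - 1) * 100
        return adjusted, ranks                         # percentile of each EUI

    rng = np.random.default_rng(2)
    hours = rng.uniform(60, 120, 40)                   # weekly operating hours
    eui = 150 + 1.2 * hours + rng.normal(0, 15, 40)    # synthetic supermarket EUIs
    adj, pct = benchmark_table(eui, hours[:, None])
    ```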

  10. Numisheet2005 Benchmark Analysis on Forming of an Automotive Underbody Cross Member: Benchmark 2

    International Nuclear Information System (INIS)

    Buranathiti, Thaweepat; Cao Jian

    2005-01-01

    This report presents an international cooperation benchmark effort focusing on simulations of a sheet metal stamping process. A forming process of an automotive underbody cross member using steel and aluminum blanks is used as a benchmark. Simulation predictions from each submission are analyzed via comparison with the experimental results. A brief summary of various models submitted for this benchmark study is discussed. Prediction accuracy of each parameter of interest is discussed through the evaluation of cumulative errors from each submission

  11. Research on loading pattern optimization for VVER reactor

    International Nuclear Information System (INIS)

    Tran Viet Phu; Nguyen Thi Mai Huong; Nguyen Huu Tiep; Ta Duy Long; Tran Vinh Thanh; Tran Hoai Nam

    2017-01-01

    A study on fuel loading pattern optimization of a VVER reactor was performed. A core physics simulator was developed based on multi-group diffusion theory for use in the fuel loading optimization of VVER reactors. The core simulator can handle the triangular meshes of the core, and its computational speed is fast. The core simulator was verified against a benchmark problem of a VVER-1000 reactor. Several optimization methods, such as DS, SA, TS and combinations of them, were investigated and implemented in coupling with the core simulator. Calculations were performed to optimize the fuel loading pattern of the core using these methods on a benchmark core model, in comparison with the reference core. Comparison among these methods has shown that the combination SA+TS is the most effective for the fuel loading pattern optimization problem. Advanced methods are being researched continuously. (author)
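
    The in-house simulator and the exact SA+TS coupling are not given in the abstract, so the following is only a schematic of the hybrid idea: simulated annealing over assembly swaps, with a tabu list that blocks recently used moves. The objective core_objective is a hypothetical stand-in for the diffusion-theory core simulator score.

    ```python
    import math, random
    random.seed(3)

    def core_objective(pattern):
        """Hypothetical stand-in for the core simulator score
        (e.g. power peaking to be minimized at a target k-eff)."""
        return sum(abs(pattern[i] - i) for i in range(len(pattern)))

    def sa_ts(n_assemblies=20, iters=2000, T0=5.0, tabu_len=50):
        """Simulated annealing over fuel-assembly swaps, with a tabu list
        that forbids recently used swaps (the SA+TS hybrid)."""
        x = list(range(n_assemblies))
        random.shuffle(x)
        best, best_val, tabu = x[:], core_objective(x), []
        for t in range(iters):
            i, j = random.sample(range(n_assemblies), 2)
            if (i, j) in tabu:
                continue                               # tabu: skip recent swaps
            y = x[:]
            y[i], y[j] = y[j], y[i]
            dE = core_objective(y) - core_objective(x)
            T = T0 * (1 - t / iters) + 1e-9            # cooling schedule
            if dE < 0 or random.random() < math.exp(-dE / T):
                x = y
                tabu.append((i, j))
                tabu = tabu[-tabu_len:]
                if core_objective(x) < best_val:
                    best, best_val = x[:], core_objective(x)
        return best, best_val

    print(sa_ts()[1])
    ```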

  12. Hierarchical Swarm Model: A New Approach to Optimization

    Directory of Open Access Journals (Sweden)

    Hanning Chen

    2010-01-01

    This paper presents a novel optimization model called hierarchical swarm optimization (HSO), which simulates a natural hierarchical complex system from which more complex intelligence can emerge for complex problem solving. The proposed model is intended to suggest ways that the performance of HSO-based algorithms on complex optimization problems can be significantly improved. This performance improvement is obtained by constructing HSO hierarchies, which means that an agent in a higher-level swarm can be composed of swarms of other agents from a lower level, and different swarms of different levels evolve on different spatiotemporal scales. A novel optimization algorithm (named PS2O), based on the HSO model, is instantiated and tested to illustrate the ideas of the HSO model clearly. Experiments were conducted on a set of 17 benchmark optimization problems including both continuous and discrete cases. The results demonstrate remarkable performance of the PS2O algorithm on all chosen benchmark functions when compared to several successful swarm intelligence and evolutionary algorithms.

  13. Application of different measures of bioavailability to support the derivation of risk-based remedial benchmarks for PHC-contaminated sites

    Energy Technology Data Exchange (ETDEWEB)

    Stephenson, G. [Stantec Consulting Ltd., Surrey, BC (Canada)

    2009-07-01

    Risk estimates and exposure scenarios hardly ever take into consideration site-specific bioavailability of contaminants. Risk assessors frequently adopt the assumption that a contaminant in soils is 100 percent bioavailable, resulting in an overestimation of the risks associated with contamination. Remedial targets or benchmarks derived in light of this assumption are needlessly low and might be technically unattainable or prohibitive in terms of cost. This presentation discussed a research project whose goal was to develop a tool kit to measure or determine site-specific bioavailability of contaminants (PHCs) in soils to ecological receptors. Tools that were discussed included: biological measures such as toxicity tests, contaminant residues in tissues, and bioaccumulation tests. Chemical measures such as bioaccessibility tests and other biomimetic devices (SPMDs), biotic ligand modeling, and chemical extractions were also presented. Preliminary investigation results were provided. Other topics that were discussed included: single-species toxicity tests; preliminary comparisons; the site; bioaccumulation; and toxicity to earthworms. It was concluded that total soil and water-extractable concentrations did not correlate well with toxicity. tabs., figs.

  14. SKaMPI: A Comprehensive Benchmark for Public Benchmarking of MPI

    Directory of Open Access Journals (Sweden)

    Ralf Reussner

    2002-01-01

    The main objective of the MPI communication library is to enable portable parallel programming with high performance within the message-passing paradigm. Since the MPI standard has no associated performance model, and makes no performance guarantees, comprehensive, detailed and accurate performance figures for different hardware platforms and MPI implementations are important for the application programmer, both for understanding and possibly improving the behavior of a given program on a given platform, as well as for assuring a degree of predictable behavior when switching to another hardware platform and/or MPI implementation. We term this latter goal performance portability, and address the problem of attaining performance portability by benchmarking. We describe the SKaMPI benchmark which covers a large fraction of MPI, and incorporates well-accepted mechanisms for ensuring accuracy and reliability. SKaMPI is distinguished among other MPI benchmarks by an effort to maintain a public performance database with performance data from different hardware platforms and MPI implementations.
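
    A flavor of what such MPI micro-benchmarks measure can be given in a few lines. The sketch below times a two-rank ping-pong with mpi4py (an assumed choice; SKaMPI itself is a C program) and must be launched with two processes, e.g. mpiexec -n 2 python pingpong.py.

    ```python
    from mpi4py import MPI
    import time

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    n_reps, msg = 1000, bytearray(1024)        # 1 KiB ping-pong payload

    comm.Barrier()                             # synchronize before timing
    t0 = time.perf_counter()
    for _ in range(n_reps):
        if rank == 0:
            comm.Send(msg, dest=1)
            comm.Recv(msg, source=1)
        elif rank == 1:
            comm.Recv(msg, source=0)
            comm.Send(msg, dest=0)
    t1 = time.perf_counter()

    if rank == 0:
        print("one-way latency:", (t1 - t0) / n_reps / 2, "s")
    ```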

  15. MODIS-derived daily PAR simulation from cloud-free images and its validation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Liangfu; Gu, Xingfa; Tian, Guoliang [State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, Beijing 100101 (China); The Center for National Spaceborne Demonstration, Beijing 100101 (China); Gao, Yanhua [State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, Beijing 100101 (China); Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101 (China); Yang, Lei [State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, Beijing 100101 (China); Jilin University, Changchun 130026 (China); Liu, Qinhuo [State Key Laboratory of Remote Sensing Science, Jointly Sponsored by Institute of Remote Sensing Applications of Chinese Academy of Sciences and Beijing Normal University, Beijing 100101 (China)

    2008-06-15

    In this paper, a MODIS-derived daily PAR (photosynthetically active radiation) simulation model for cloud-free images over land surfaces has been developed based on Bird and Riordan's model. In this model, the total downwelling spectral surface irradiance is divided into two parts: one is beam irradiance, and the other is diffuse irradiance. The attenuation of solar beam irradiance comprises scattering by the gas mixture, absorption by ozone, the gas mixture and water vapor, and scattering and absorption by aerosols. The diffuse irradiance is scattered out of the direct beam and towards the surface. The multiple ground-air interactions have been taken into account in the diffuse irradiance model. The parameters needed in this model are atmospheric water vapor content, aerosol optical thickness and spectral albedo ranging from 400 nm to 700 nm. They are all retrieved from MODIS data. Then, the instantaneous photosynthetically available radiation (IPAR) is integrated by using a weighted sum at each of the visible MODIS wavebands. Finally, the daily PAR is derived by integration of IPAR. In order to validate the MODIS-derived PAR model, we compared field PAR measurements from 2003 and 2004 against the simulated PAR. The measurements were made at the Qianyanzhou ecological experimental station, Chinese Ecosystem Research Network. A total of 54 days of cloud-free MODIS L1B level images were used for the PAR simulation. Our results show that the simulated PAR is consistent with the field measurements, where the correlation coefficient of the linear regression between calculated PAR and measured PAR is 0.93396. However, there were some uncertainties in the comparison of the 1 km pixel PAR with the tower flux stand measurement. (author)

  16. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs of the original series. We show that the widely used variants of the Denton (1971) method and the growth…

  17. Derivation of xeno-free and GMP-grade human embryonic stem cells--platforms for future clinical applications.

    Directory of Open Access Journals (Sweden)

    Shelly E Tannenbaum

    Clinically compliant human embryonic stem cells (hESCs) should be developed in adherence to ethical standards, without risk of contamination by adventitious agents. Here we developed, for the first time, animal-component-free and good manufacturing practice (GMP)-compliant hESCs. After vendor and raw material qualification, we derived xeno-free, GMP-grade feeders from umbilical cord tissue, and utilized them within a novel, xeno-free hESC culture system. We derived and characterized three hESC lines in adherence to regulations for embryo procurement, and good tissue, manufacturing and laboratory practices. To minimize freezing and thawing, we continuously expanded the lines from initial outgrowths, and samples were cryopreserved as early stocks and banks. Batch release criteria included DNA fingerprinting and HLA typing for identity, characterization of pluripotency-associated marker expression, proliferation, karyotyping and differentiation in vitro and in vivo. These hESCs may be valuable for regenerative therapy. The ethical, scientific and regulatory methodology presented here may serve for the development of additional clinical-grade hESCs.

  18. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS.

    Science.gov (United States)

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A; Lempicki, Richard A; Huang, Da Wei

    2013-07-31

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nanoscale sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning the Circular Consensus Sequence (CCS) reads derived from this sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, improving to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that, even though CCS reads are already error-corrected, it is still necessary to perform appropriate QC on them in order to produce successful downstream bioinformatics results.
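
    The SVM-based QC step described in this record can be pictured with a few lines of scikit-learn. This is a hedged sketch only: the features (number of passes, read length, mean quality value) and the labels are hypothetical stand-ins, not the study's actual QC parameters.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Hypothetical per-read features: [passes, length (bp), mean QV].
      X = np.array([[5, 450, 30], [2, 300, 18], [8, 500, 35], [1, 250, 12]], float)
      y = np.array([1, 0, 1, 0])          # 1 = read passed alignment-based QC

      qc = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      qc.fit(X, y)
      print(qc.predict([[4, 420, 28]]))   # classify a new CCS read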

  19. Power reactor pressure vessel benchmarks

    International Nuclear Information System (INIS)

    Rahn, F.J.

    1978-01-01

    A review is given of the current status of experimental and calculational benchmarks for use in understanding the radiation embrittlement effects in the pressure vessels of operating light water power reactors. The requirements of such benchmarks for application to pressure vessel dosimetry are stated. Recent developments in active and passive neutron detectors sensitive in the energy ranges of importance to embrittlement studies are summarized, and recommendations for improvements in the benchmarks are made. (author)

  20. Scalable implementation of ancilla-free optimal 1→M phase-covariant quantum cloning by combining quantum Zeno dynamics and adiabatic passage

    International Nuclear Information System (INIS)

    Shao, Xiao-Qiang; Zheng, Tai-Yu; Zhang, Shou

    2011-01-01

    A scalable way to implement ancilla-free optimal 1→M phase-covariant quantum cloning (PCC) is proposed by combining quantum Zeno dynamics and adiabatic passage. An optimal 1→M PCC can be achieved directly from the existing optimal 1→(M-1) PCC without populating excited states during the whole process. The cases of optimal 1→3 (4) PCC are discussed in detail to show that the scheme is robust against the effects of decoherence. Moreover, the time for carrying out each cloning transformation is regular, which may reduce the complexity of achieving the optimal PCC in experiment. -- Highlights: → We implement the ancilla-free optimal 1→M phase-covariant quantum cloning machine. → The scheme is robust against cavity decay and the spontaneous emission of the atom. → The time for carrying out each cloning transformation is regular.

  1. Scalable implementation of ancilla-free optimal 1→M phase-covariant quantum cloning by combining quantum Zeno dynamics and adiabatic passage

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Xiao-Qiang, E-mail: xqshao83@yahoo.cn [School of Physics, Northeast Normal University, Changchun 130024 (China); Zheng, Tai-Yu, E-mail: zhengty@nenu.edu.cn [School of Physics, Northeast Normal University, Changchun 130024 (China); Zhang, Shou [Department of Physics, College of Science, Yanbian University, Yanji, Jilin 133002 (China)

    2011-09-19

    A scalable way to implement ancilla-free optimal 1→M phase-covariant quantum cloning (PCC) is proposed by combining quantum Zeno dynamics and adiabatic passage. An optimal 1→M PCC can be achieved directly from the existing optimal 1→(M-1) PCC without populating excited states during the whole process. The cases of optimal 1→3 (4) PCC are discussed in detail to show that the scheme is robust against the effects of decoherence. Moreover, the time for carrying out each cloning transformation is regular, which may reduce the complexity of achieving the optimal PCC in experiment. -- Highlights: → We implement the ancilla-free optimal 1→M phase-covariant quantum cloning machine. → The scheme is robust against cavity decay and the spontaneous emission of the atom. → The time for carrying out each cloning transformation is regular.

  2. Shielding benchmark problems, (2)

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Shin, Kazuo; Tada, Keiko.

    1980-02-01

    Shielding benchmark problems prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan were compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Fourteen new shielding benchmark problems are presented in addition to the twenty-one problems proposed previously, for evaluating the calculational algorithms and accuracy of computer codes based on the discrete ordinates and Monte Carlo methods, and for evaluating the nuclear data used in those codes. The present benchmark problems principally address the backscattering and streaming of neutrons and gamma rays in two- and three-dimensional configurations. (author)

  3. Electricity consumption in school buildings - benchmark and web tools; Elforbrug i skoler - benchmark og webvaerktoej

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The aim of this project has been to produce benchmarks for electricity consumption in Danish schools in order to encourage electricity conservation. An internet programme has been developed to facilitate schools' access to benchmarks and to evaluate energy consumption. The overall purpose is to draw increased attention to the electricity consumption of each individual school by publishing benchmarks which take the school's age and number of pupils, as well as after-school activities, into account. Benchmarks can be used to make green accounts and serve as markers in, e.g., energy conservation campaigns, energy management and educational activities. The internet tool can be found on www.energiguiden.dk. (BA)

  4. Optimal free will on one side in reproducing the singlet correlation

    International Nuclear Information System (INIS)

    Banik, Manik; Gazi, MD. Rajjak; Das, Subhadipa; Rai, Ashutosh; Kunkri, Samir

    2012-01-01

    Bell’s theorem teaches us that there are quantum correlations that cannot be simulated by shared randomness (local hidden variables) alone. Some recent results simulate the singlet correlation by using either 1 bit of communication or a binary (no-signaling) correlation which violates Bell’s inequality maximally. But there is one more possible way to simulate quantum correlations: relaxing the condition that the measurements are chosen independently of the shared randomness. Recently, Hall showed that the statistics of a singlet state can be generated by sacrificing measurement independence, where the underlying distribution of hidden variables depends on the measurement directions of both parties (Hall 2010 Phys. Rev. Lett. 105 250404). He also proved that for any model of the singlet correlation, 86% measurement independence is optimal. In this paper, we show that 59% measurement independence is optimal for simulating the singlet correlation when the underlying distribution of hidden variables depends only on the measurements of one party. We also show that a distribution corresponding to this optimal lack of free will already exists in the literature, where it first appeared in the context of the detection-efficiency loophole (Gisin and Gisin 1999 Phys. Lett. A 323–7). (paper)

  5. HS06 Benchmark for an ARM Server

    Science.gov (United States)

    Kluth, Stefan

    2014-06-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  6. HS06 benchmark for an ARM server

    International Nuclear Information System (INIS)

    Kluth, Stefan

    2014-01-01

    We benchmarked an ARM cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.

  7. Performance Targets and External Benchmarking

    DEFF Research Database (Denmark)

    Friis, Ivar; Hansen, Allan; Vámosi, Tamás S.

    Research on relative performance measures, transfer pricing, beyond budgeting initiatives, target costing, piece rate systems and value based management has for decades underlined the importance of external benchmarking in performance management. Research conceptualises external benchmarking as a market mechanism that can be brought inside the firm to provide incentives for continuous improvement and the development of competitive advantages. However, whereas extant research has primarily focused on the importance and effects of using external benchmarks, less attention has been directed towards the conditions upon which the market mechanism performs within organizations. This paper aims to contribute to research by providing more insight into the conditions for the use of external benchmarking as an element of performance management in organizations. Our study explores a particular type of external

  8. Multiplier-free filters for wideband SAR

    DEFF Research Database (Denmark)

    Dall, Jørgen; Christensen, Erik Lintz

    2001-01-01

    This paper derives a set of parameters to be optimized when designing filters for digital demodulation and range prefiltering in SAR systems. Aiming at an implementation in field programmable gate arrays (FPGAs), an approach for the design of multiplier-free filters is outlined. Design results are presented in terms of filter complexity and performance. One filter has been coded in VHDL, and preliminary results indicate that the filter can meet a 2 GHz input sample rate.

  9. Proposed biopsy performance benchmarks for MRI based on an audit of a large academic center.

    Science.gov (United States)

    Sedora Román, Neda I; Mehta, Tejas S; Sharpe, Richard E; Slanetz, Priscilla J; Venkataraman, Shambhavi; Fein-Zachary, Valerie; Dialani, Vandana

    2018-05-01

    Performance benchmarks exist for mammography (MG); however, performance benchmarks for magnetic resonance imaging (MRI) are not yet fully developed. The purpose of our study was to perform an MRI audit based on established MG and screening-MRI benchmarks and to review whether these benchmarks can be applied to an MRI practice. An IRB-approved retrospective review of breast MRIs performed at our center from 1/1/2011 through 12/31/13 was conducted. For patients with a biopsy recommendation, core biopsy and surgical pathology results were reviewed. The data were used to derive mean performance parameter values, including abnormal interpretation rate (AIR), positive predictive value (PPV), cancer detection rate (CDR), and percentage of minimal cancers and axillary-node-negative cancers, and compared with MG and screening-MRI benchmarks. MRIs were also divided by screening and diagnostic indications to assess for differences in performance benchmarks between these two groups. Of the 2455 MRIs performed over 3 years, 1563 were performed for screening indications and 892 for diagnostic indications. With the exception of PPV2 for screening breast MRIs from 2011 to 2013, PPVs were met for our screening and diagnostic populations when compared to the MRI screening benchmarks established by the Breast Imaging Reporting and Data System (BI-RADS) 5 Atlas®. AIR and CDR were lower for screening indications than for diagnostic indications. New MRI screening benchmarks can be used for screening MRI audits, while the American College of Radiology (ACR) desirable goals for diagnostic MG can be used for diagnostic MRI audits. Our study corroborates established findings regarding differences in AIR and CDR between screening and diagnostic indications.

  10. Three Component Synthesis of Substituted 4H-[1,3]Dioxin Derivatives Under Solvent-Free Conditions

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Hosseini-Tabatabaei

    2012-01-01

    Reaction between aryl aldehydes, acetylacetone and alkyl isocyanides under solvent-free conditions provided a simple and efficient one-pot route for the synthesis of 1-(2-alkylamino-6-methyl-4-aryl-4H-[1,3]dioxin-5-yl)ethanone derivatives in excellent yields.

  11. Optimization of some electrochemical etching parameters for cellulose derivatives

    International Nuclear Information System (INIS)

    Chowdhury, Annis; Gammage, R.B.

    1978-01-01

    Electrochemical etching of fast neutron induced recoil particle tracks in cellulose derivatives and other polymers provides an inexpensive and sensitive means of fast neutron personnel dosimetry. A study of the shape, clarity, and size of the tracks in Transilwrap polycarbonate indicated that the optimum normality of the potassium hydroxide etching solution is 9 N. Optimizations have also been attempted for cellulose nitrate, triacetate, and acetobutyrate with respect to such electrochemical etching parameters as frequency, voltage gradient, and concentration of the etching solution. The measurement of differential leakage currents between the undamaged and the neutron damaged foils aided in the selection of optimum frequencies. (author)

  12. Benchmarking in Czech Higher Education

    OpenAIRE

    Plaček Michal; Ochrana František; Půček Milan

    2015-01-01

    The first part of this article surveys the current experience with the use of benchmarking at Czech universities specializing in economics and management. The results indicate that collaborative benchmarking is not used at this level today, but most actors show some interest in its introduction. The expression of the need for it, and the importance of benchmarking as a very suitable performance-management tool in less developed countries, are the impetus for the second part of our article. Base...

  13. Benchmark simulation models, quo vadis?

    Science.gov (United States)

    Jeppsson, U; Alex, J; Batstone, D J; Benedetti, L; Comas, J; Copp, J B; Corominas, L; Flores-Alsina, X; Gernaey, K V; Nopens, I; Pons, M-N; Rodríguez-Roda, I; Rosen, C; Steyer, J-P; Vanrolleghem, P A; Volcke, E I P; Vrecko, D

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to provide their insights, highlighting areas where knowledge may still be deficient and where new opportunities are emerging, and to propose potential avenues for future development and application of the general benchmarking framework and its associated tools. The paper focuses on the topics of temporal and spatial extension, process modifications within the WWTP, the realism of models, control strategy extensions and the potential for new evaluation tools within the existing benchmark system. We find that there are major opportunities for application within all of these areas, either from existing work already being done within the context of the benchmarking simulation models (BSMs) or applicable work in the wider literature. Of key importance is increasing capability, usability and transparency of the BSM package while avoiding unnecessary complexity.

  14. Dissipativity analysis of the base isolated benchmark structure with magnetorheological fluid dampers

    International Nuclear Information System (INIS)

    Erkus, Baris; Johnson, Erik A

    2011-01-01

    This paper investigates the dissipativity and performance characteristics of the semiactive control of the base isolated benchmark structure with magnetorheological (MR) fluid dampers. Previously, the authors introduced the concepts of dissipativity and dissipativity indices in the semiactive control of structures with smart dampers and studied the dissipativity characteristics of simple structures with idealized dampers. To investigate the effects of semiactive controller dissipativity characteristics on the overall performance of the base isolated benchmark building, a clipped optimal control strategy with a linear quadratic Gaussian (LQG) controller and a 20 ton MR fluid damper model is used. A cumulative index is proposed for quantifying the overall dissipativity of a control system with multiple control devices. Two control designs with different dissipativity and performance characteristics are considered as the primary controller in clipped optimal control. Numerical simulations reveal that the dissipativity indices can be classified into two groups that exhibit distinct patterns. It is shown that the dissipativity indices identify primary controllers that are more suitable for application with MR dampers and provide useful information in the semiactive design process that complements other performance indices. The computational efficiency of the proposed dissipativity indices is verified by comparing computation times

  15. The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example

    Science.gov (United States)

    Steyn, H. J.

    2015-01-01

    Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…

  16. RCQ-GA: RDF Chain Query Optimization Using Genetic Algorithms

    Science.gov (United States)

    Hogenboom, Alexander; Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay

    The application of Semantic Web technologies in an Electronic Commerce environment implies a need for good support tools. Fast query engines are needed for efficient querying of large amounts of data, usually represented using RDF. We focus on optimizing a special class of SPARQL queries, the so-called RDF chain queries. For this purpose, we devise a genetic algorithm called RCQ-GA that determines the order in which joins need to be performed for an efficient evaluation of RDF chain queries. The approach is benchmarked against a two-phase optimization algorithm, previously proposed in literature. The more complex a query is, the more RCQ-GA outperforms the benchmark in solution quality, execution time needed, and consistency of solution quality. When the algorithms are constrained by a time limit, the overall performance of RCQ-GA compared to the benchmark further improves.
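
    As an illustration of the kind of search RCQ-GA performs, the toy GA below evolves a join order under a stand-in cost function; the paper's RDF-specific cost model and operators are not reproduced here.

      import random

      def cost(order):
          # Stand-in cost: cheaper when selective joins come first.
          return sum((i + 1) * sel for i, sel in enumerate(order))

      def mutate(order):
          a, b = random.sample(range(len(order)), 2)
          order = order[:]
          order[a], order[b] = order[b], order[a]   # swap two joins
          return order

      selectivities = [0.9, 0.1, 0.5, 0.3, 0.7]     # one value per join in the chain
      pop = [random.sample(selectivities, len(selectivities)) for _ in range(20)]
      for _ in range(100):
          pop.sort(key=cost)                        # elitist selection
          pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]
      best = min(pop, key=cost)
      print(best, cost(best))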

  17. Aerodynamic Benchmarking of the Deepwind Design

    DEFF Research Database (Denmark)

    Bedona, Gabriele; Schmidt Paulsen, Uwe; Aagaard Madsen, Helge

    2015-01-01

    The aerodynamic benchmarking for the DeepWind rotor is conducted by comparing different rotor geometries and solutions, keeping the comparison as fair as possible. The objective of the benchmarking is to find the most suitable configuration in order to maximize the power production and minimize the blade loading and the cost of energy. Different parameters are considered for the benchmarking study. The DeepWind blade is characterized by a shape similar to the Troposkien geometry but asymmetric between the top and bottom parts: this shape is considered as a fixed parameter in the benchmarking...

  18. Vver-1000 Mox core computational benchmark

    International Nuclear Information System (INIS)

    2006-01-01

    The NEA Nuclear Science Committee has established an Expert Group that deals with the status and trends of reactor physics, fuel performance and fuel cycle issues related to disposing of weapons-grade plutonium in mixed-oxide fuel. The objectives of the group are to provide NEA member countries with up-to-date information on, and to develop consensus regarding, core and fuel cycle issues associated with burning weapons-grade plutonium in thermal water reactors (PWR, BWR, VVER-1000, CANDU) and fast reactors (BN-600). These issues concern core physics, fuel performance and reliability, and the capability and flexibility of thermal water reactors and fast reactors to dispose of weapons-grade plutonium in standard fuel cycles. The activities of the NEA Expert Group on Reactor-based Plutonium Disposition are carried out in close co-operation (jointly, in most cases) with the NEA Working Party on Scientific Issues in Reactor Systems (WPRS). A prominent part of these activities includes benchmark studies. At the time of preparation of this report, the following benchmarks were completed or in progress: VENUS-2 MOX Core Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); VVER-1000 LEU and MOX Benchmark (completed); KRITZ-2 Benchmarks: carried out jointly with the WPRS (formerly the WPPR) (completed); Hollow and Solid MOX Fuel Behaviour Benchmark (completed); PRIMO MOX Fuel Performance Benchmark (ongoing); VENUS-2 MOX-fuelled Reactor Dosimetry Calculation (ongoing); VVER-1000 In-core Self-powered Neutron Detector Calculational Benchmark (started); MOX Fuel Rod Behaviour in Fast Power Pulse Conditions (started); Benchmark on the VENUS Plutonium Recycling Experiments Configuration 7 (started). This report describes the detailed results of the benchmark investigating the physics of a whole VVER-1000 reactor core using two-thirds low-enriched uranium (LEU) and one-third MOX fuel. It contributes to the computer code certification process and to the

  19. Shielding benchmark problems

    International Nuclear Information System (INIS)

    Tanaka, Shun-ichi; Sasamoto, Nobuo; Oka, Yoshiaki; Kawai, Masayoshi; Nakazawa, Masaharu.

    1978-09-01

    Shielding benchmark problems were prepared by the Working Group of Assessment of Shielding Experiments in the Research Committee on Shielding Design of the Atomic Energy Society of Japan, and compiled by the Shielding Laboratory of the Japan Atomic Energy Research Institute. Twenty-one kinds of shielding benchmark problems are presented for evaluating the calculational algorithm and the accuracy of computer codes based on the discrete ordinates method and the Monte Carlo method, and for evaluating the nuclear data used in the codes. (author)

  20. Improving IMRT-plan quality with MLC leaf position refinement post plan optimization

    International Nuclear Information System (INIS)

    Niu Ying; Zhang Guowei; Berman, Barry L.; Parke, William C.; Yi Byongyong; Yu, Cedric X.

    2012-01-01

    Purpose: In intensity-modulated radiation therapy (IMRT) planning, reducing the pencil-beam size may lead to a significant improvement in dose conformity, but it also increases the time needed for the dose calculation and plan optimization. The authors develop and evaluate a postoptimization refinement (POpR) method, which makes fine adjustments to the multileaf collimator (MLC) leaf positions after plan optimization, enhancing the spatial precision and improving the plan quality without a significant impact on the computational burden. Methods: The authors’ POpR method is implemented using a commercial treatment planning system based on direct aperture optimization. After an IMRT plan is optimized using pencil beams with a regular pencil-beam step size, a greedy search is conducted by looping through all of the involved MLC leaves to see if moving the MLC leaf in or out by half of a pencil-beam step size will improve the objective function value. The half-sized pencil beams, which are used for updating the dose distribution in the greedy search, are derived from the existing full-sized pencil beams without the need for further pencil-beam dose calculations. A benchmark phantom case and a head-and-neck (HN) case are studied for testing the authors’ POpR method. Results: Using a benchmark phantom and an HN case, the authors have verified that their POpR method can be an efficient technique in the IMRT planning process. The effectiveness of POpR is confirmed by noting significant improvements in objective function values. The dosimetric benefits of POpR are comparable to those of using a finer pencil-beam size from the start of the optimization, but with far less computation and time. Conclusions: POpR is a feasible and practical method to significantly improve IMRT-plan quality without compromising the planning efficiency.
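
    A minimal sketch of the greedy half-step refinement loop this record describes is given below; the objective function and the leaf model are stand-ins, not the commercial planning system's dose engine.

      def refine_leaves(leaves, objective, half_step):
          """Keep any half-step leaf move that strictly lowers the objective."""
          improved = True
          while improved:
              improved = False
              for i in range(len(leaves)):
                  base = objective(leaves)
                  for delta in (half_step, -half_step):
                      leaves[i] += delta
                      if objective(leaves) < base:
                          improved = True
                          break                 # keep this move
                      leaves[i] -= delta        # undo and try the other direction
          return leaves

      # Toy objective: leaf positions should approach a target profile.
      target = [1.25, 2.75, 3.50]
      obj = lambda xs: sum((x - t) ** 2 for x, t in zip(xs, target))
      print(refine_leaves([1.0, 3.0, 3.0], obj, 0.25))   # -> [1.25, 2.75, 3.5]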

  1. Dynamic benchmarking of simulation codes

    International Nuclear Information System (INIS)

    Henry, R.E.; Paik, C.Y.; Hauser, G.M.

    1996-01-01

    Computer simulation of nuclear power plant response can take the form of a full-scope control room simulator, an engineering simulator representing the general behavior of the plant under normal and abnormal conditions, or a model of the plant response to conditions that would eventually lead to core damage. In any of these, the underlying foundation for their use in analysing situations, training of vendor/utility personnel, etc. is how well they represent what has been learned from industrial experience, large integral experiments and separate-effects tests. Typically, simulation codes are benchmarked against some of these, the necessary level of agreement being dependent upon the ultimate use of the simulation tool. However, these analytical models are computer codes, and as a result, their capabilities are continually enhanced, errors are corrected, new situations are imposed on the code that are outside of the original design basis, etc. Consequently, there is a continual need to assure that the benchmarks with important transients are preserved as the computer code evolves. Retention of this benchmarking capability is essential to develop trust in the computer code. Given the evolving world of computer codes, how is this retention of benchmarking capabilities accomplished? For the MAAP4 codes, this capability is accomplished through a 'dynamic benchmarking' feature embedded in the source code. In particular, a set of dynamic benchmarks is included in the source code, and these are exercised every time the archive codes are upgraded and distributed to the MAAP users. Three different types of dynamic benchmarks are used: plant transients, large integral experiments, and separate-effects tests. Each of these is performed in a different manner. The first is accomplished by developing a parameter file for the plant modeled and an input deck to describe the sequence; i.e. the entire MAAP4 code is exercised. The pertinent plant data is included in the source code and the computer

  2. Seven-Spot Ladybird Optimization: A Novel and Efficient Metaheuristic Algorithm for Numerical Optimization

    Directory of Open Access Journals (Sweden)

    Peng Wang

    2013-01-01

    This paper presents a novel biologically inspired metaheuristic algorithm called seven-spot ladybird optimization (SLO). The SLO is inspired by recent discoveries on the foraging behavior of the seven-spot ladybird. In this paper, the performance of the SLO is compared with that of the genetic algorithm, particle swarm optimization, and artificial bee colony algorithms using five multimodal numerical benchmark functions. The results show that SLO has the ability to find the best solution with a comparatively small population size and is suitable for solving optimization problems with lower dimensions.

  3. (U) Analytic First and Second Derivatives of the Uncollided Leakage for a Homogeneous Sphere

    Energy Technology Data Exchange (ETDEWEB)

    Favorite, Jeffrey A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-04-26

    The second-order adjoint sensitivity analysis methodology (2nd-ASAM), developed by Cacuci, has been applied by its author to derive second derivatives of a response with respect to input parameters for uncollided particles in an inhomogeneous transport problem. In this memo, we present an analytic benchmark for verifying the derivatives of the 2nd-ASAM. The problem is a homogeneous sphere, and the response is the uncollided total leakage. This memo does not repeat the formulas given in Ref. 2. We are preparing a journal article that will include the derivation of Ref. 2 and the benchmark of this memo.

  4. Analytic Optimization of Near-Field Optical Chirality Enhancement

    Science.gov (United States)

    2017-01-01

    We present an analytic derivation for the enhancement of local optical chirality in the near field of plasmonic nanostructures by tuning the far-field polarization of external light. We illustrate the results by means of simulations with an achiral and a chiral nanostructure assembly and demonstrate that local optical chirality is significantly enhanced with respect to circular polarization in free space. The optimal external far-field polarizations are different from both circular and linear. Symmetry properties of the nanostructure can be exploited to determine whether the optimal far-field polarization is circular. Furthermore, the optimal far-field polarization depends on the frequency, which results in complex-shaped laser pulses for broadband optimization. PMID:28239617

  5. SAAFEC: Predicting the Effect of Single Point Mutations on Protein Folding Free Energy Using a Knowledge-Modified MM/PBSA Approach.

    Science.gov (United States)

    Getov, Ivan; Petukh, Marharyta; Alexov, Emil

    2016-04-07

    Folding free energy is an important biophysical characteristic of proteins that reflects the overall stability of the 3D structure of macromolecules. Changes in the amino acid sequence, naturally occurring or made in vitro, may affect the stability of the corresponding protein and thus could be associated with disease. Several approaches that predict the changes of folding free energy caused by mutations have been proposed, but no method is clearly superior to the others. The optimal goal is not only to accurately predict the folding free energy changes, but also to characterize the structural changes induced by mutations and the physical nature of the predicted folding free energy changes. Here we report a new method to predict the Single Amino Acid Folding free Energy Changes (SAAFEC) based on a knowledge-modified Molecular Mechanics Poisson-Boltzmann Surface Area (MM/PBSA) approach. The method comprises two main components: an MM/PBSA component and a set of knowledge-based terms derived from a statistical study of the biophysical characteristics of proteins. The predictor utilizes a multiple linear regression model with weighted coefficients of various terms optimized against a set of experimental data. This approach yields a correlation coefficient of 0.65 when benchmarked against 983 cases from 42 proteins in the ProTherm database. The webserver can be accessed via http://compbio.clemson.edu/SAAFEC/.
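
    The final fitting step the record describes, a multiple linear regression over MM/PBSA and knowledge-based terms, can be sketched as follows; the feature names and numbers are hypothetical, not SAAFEC's actual terms or data.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Hypothetical columns: dE_vdw, dE_elec, dE_polar_solv, knowledge_term
      X = np.array([[ 1.2, -0.4,  0.8,  0.1],
                    [ 0.3,  0.9, -0.2,  0.5],
                    [-0.5,  0.2,  0.4, -0.3],
                    [ 0.8, -0.1,  0.6,  0.2],
                    [-0.2,  0.4, -0.5,  0.0]])
      ddg_exp = np.array([0.7, 1.1, -0.2, 0.9, 0.1])   # experimental ddG values

      model = LinearRegression().fit(X, ddg_exp)       # optimize the term weights
      print(model.coef_, model.intercept_)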

  6. A Model-Free No-arbitrage Price Bound for Variance Options

    Energy Technology Data Exchange (ETDEWEB)

    Bonnans, J. Frederic, E-mail: frederic.bonnans@inria.fr [Ecole Polytechnique, INRIA-Saclay (France); Tan Xiaolu, E-mail: xiaolu.tan@polytechnique.edu [Ecole Polytechnique, CMAP (France)

    2013-08-01

    We suggest a numerical approximation for an optimization problem, motivated by its applications in finance to find the model-free no-arbitrage bound of variance options given the marginal distributions of the underlying asset. A first approximation restricts the computation to a bounded domain. Then we propose a gradient projection algorithm together with the finite difference scheme to solve the optimization problem. We prove the general convergence, and derive some convergence rate estimates. Finally, we give some numerical examples to test the efficiency of the algorithm.
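
    The generic shape of the gradient projection iteration mentioned in the record is easy to state; the sketch below uses a placeholder quadratic objective and box constraints rather than the paper's variance-option functional.

      import numpy as np

      def projected_gradient(grad, x0, lo, hi, step=0.1, iters=500):
          """Projected gradient descent onto the box [lo, hi]."""
          x = x0
          for _ in range(iters):
              x = np.clip(x - step * grad(x), lo, hi)   # gradient step + projection
          return x

      c = np.array([0.3, 1.7])
      grad = lambda x: 2 * (x - c)                      # gradient of ||x - c||^2
      print(projected_gradient(grad, np.zeros(2), 0.0, 1.0))   # -> [0.3, 1.0]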

  7. PEBBLES Simulation of Static Friction and New Static Friction Benchmark

    International Nuclear Information System (INIS)

    Cogliati, Joshua J.; Ougouag, Abderrafi M.

    2010-01-01

    Pebble bed reactors contain large numbers of spherical fuel elements arranged randomly. Determining the motion and location of these fuel elements is required for calculating certain parameters of pebble bed reactor operation. This paper documents the PEBBLES static friction model, which uses a three-dimensional differential static friction approximation extended from the two-dimensional Cundall and Strack model. The derivation of the rotational transformation of the pebble-to-pebble static friction force is provided. A new implementation of a differential rotation method for the pebble-to-container static friction force has been created, as previously published methods are insufficient for pebble bed reactor geometries. A new analytical static friction benchmark is documented that can be used to verify key static friction simulation parameters. This benchmark is based on determining the exact pebble-to-pebble and pebble-to-container static friction coefficients required to maintain a stable five-sphere pyramid.
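
    The Cundall-Strack idea the model extends can be summarized in a few lines: a tangential spring force is accumulated each time step and capped at the Coulomb limit. The sketch below is illustrative; the constants and the full 3D vector bookkeeping of the PEBBLES model are not reproduced.

      import numpy as np

      def tangential_force(f_t, v_t, f_n, k_t, mu, dt):
          """Incremental static friction: spring update capped by mu*|f_n|."""
          f_t = f_t - k_t * v_t * dt                 # accumulate the spring force
          limit = mu * np.linalg.norm(f_n)
          norm = np.linalg.norm(f_t)
          if norm > limit:                           # sliding: cap at the Coulomb limit
              f_t = f_t * (limit / norm)
          return f_t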

  8. Healthcare Analytics: Creating a Prioritized Improvement System with Performance Benchmarking.

    Science.gov (United States)

    Kolker, Eugene; Kolker, Evelyne

    2014-03-01

    The importance of healthcare improvement is difficult to overstate. This article describes our collaborative work with experts at Seattle Children's to create a prioritized improvement system using performance benchmarking. We applied analytics and modeling approaches to compare and assess performance metrics derived from U.S. News and World Report benchmarking data. We then compared a wide range of departmental performance metrics, including patient outcomes, structural and process metrics, survival rates, clinical practices, and subspecialist quality. By applying empirically simulated transformations and imputation methods, we built a predictive model that achieves departments' average rank correlation of 0.98 and average score correlation of 0.99. The results are then translated into prioritized departmental and enterprise-wide improvements, following a data to knowledge to outcomes paradigm. These approaches, which translate data into sustainable outcomes, are essential to solving a wide array of healthcare issues, improving patient care, and reducing costs.

  9. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction

    Science.gov (United States)

    Puton, Tomasz; Kozlowski, Lukasz P.; Rother, Kristian M.; Bujnicki, Janusz M.

    2013-01-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks. PMID:23435231

  10. CompaRNA: a server for continuous benchmarking of automated methods for RNA secondary structure prediction.

    Science.gov (United States)

    Puton, Tomasz; Kozlowski, Lukasz P; Rother, Kristian M; Bujnicki, Janusz M

    2013-04-01

    We present a continuous benchmarking approach for the assessment of RNA secondary structure prediction methods implemented in the CompaRNA web server. As of 3 October 2012, the performance of 28 single-sequence and 13 comparative methods has been evaluated on RNA sequences/structures released weekly by the Protein Data Bank. We also provide a static benchmark generated on RNA 2D structures derived from the RNAstrand database. Benchmarks on both data sets offer insight into the relative performance of RNA secondary structure prediction methods on RNAs of different size and with respect to different types of structure. According to our tests, on the average, the most accurate predictions obtained by a comparative approach are generated by CentroidAlifold, MXScarna, RNAalifold and TurboFold. On the average, the most accurate predictions obtained by single-sequence analyses are generated by CentroidFold, ContextFold and IPknot. The best comparative methods typically outperform the best single-sequence methods if an alignment of homologous RNA sequences is available. This article presents the results of our benchmarks as of 3 October 2012, whereas the rankings presented online are continuously updated. We will gladly include new prediction methods and new measures of accuracy in the new editions of CompaRNA benchmarks.

  11. Medical school benchmarking - from tools to programmes.

    Science.gov (United States)

    Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T

    2015-02-01

    Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.

  12. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  13. Novel Verification Method for Timing Optimization Based on DPSO

    Directory of Open Access Journals (Sweden)

    Chuandong Chen

    2018-01-01

    Timing optimization for logic circuits is one of the key steps in logic synthesis. Existing results are mainly based on various intelligence algorithms; hence, they are neither comparable with timing optimization data collected by the mainstream electronic design automation (EDA) tool nor able to verify the superiority of intelligence algorithms over the EDA tool in terms of optimization ability. To address these shortcomings, a novel verification method is proposed in this study. First, a discrete particle swarm optimization (DPSO) algorithm was applied to optimize the timing of the mixed polarity Reed-Muller (MPRM) logic circuit. Second, the Design Compiler (DC) algorithm was used to optimize the timing of the same MPRM logic circuit through special settings and constraints. Finally, the timing optimization results of the two algorithms were compared based on MCNC benchmark circuits. The timing optimization results obtained using DPSO are compared with those obtained from DC, and DPSO demonstrates an average reduction of 9.7% in the timing delays of critical paths for a number of MCNC benchmark circuits. The proposed verification method directly ascertains whether the intelligence algorithm has a better timing optimization ability than DC.

  14. Issues in Benchmark Metric Selection

    Science.gov (United States)

    Crolotte, Alain

    It is true that a metric can influence a benchmark but will esoteric metrics create more problems than they will solve? We answer this question affirmatively by examining the case of the TPC-D metric which used the much debated geometric mean for the single-stream test. We will show how a simple choice influenced the benchmark and its conduct and, to some extent, DBMS development. After examining other alternatives our conclusion is that the “real” measure for a decision-support benchmark is the arithmetic mean.
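
    A two-line computation shows why the choice of mean matters for a benchmark metric: a single very fast query moves the geometric mean far more than the arithmetic mean. The numbers are illustrative only.

      from statistics import geometric_mean, mean

      times = [100.0, 100.0, 100.0, 1.0]     # seconds per query
      print(mean(times))                     # 75.25
      print(geometric_mean(times))           # ~31.6: one outlier dominates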

  15. Benchmarking clinical photography services in the NHS.

    Science.gov (United States)

    Arbon, Giles

    2015-01-01

    Benchmarking is used across the National Health Service (NHS) through various benchmarking programs. Clinical photography services do not have such a program in place and have to rely on ad hoc surveys of other services. A trial benchmarking exercise was undertaken with 13 services in NHS Trusts. This highlighted valuable data and comparisons that can be used to benchmark and improve services throughout the profession.

  16. Benchmarking Danish Industries

    DEFF Research Database (Denmark)

    Gammelgaard, Britta; Bentzen, Eric; Aagaard Andreassen, Mette

    2003-01-01

    compatible survey. The International Manufacturing Strategy Survey (IMSS) does bring up the question of supply chain management, but unfortunately, we did not have access to the database. Data from the members of the SCOR-model, in the form of benchmarked performance data, may exist, but are nonetheless not public. The survey is a cooperative project "Benchmarking Danish Industries" with CIP/Aalborg University, the Danish Technological University, the Danish Technological Institute and Copenhagen Business School as consortia partners. The project has been funded by the Danish Agency for Trade and Industry...

  17. QM/MM Geometry Optimization on Extensive Free-Energy Surfaces for Examination of Enzymatic Reactions and Design of Novel Functional Properties of Proteins.

    Science.gov (United States)

    Hayashi, Shigehiko; Uchida, Yoshihiro; Hasegawa, Taisuke; Higashi, Masahiro; Kosugi, Takahiro; Kamiya, Motoshi

    2017-05-05

    Many remarkable molecular functions of proteins use their characteristic global and slow conformational dynamics through coupling of local chemical states in reaction centers with global conformational changes of proteins. To theoretically examine the functional processes of proteins in atomic detail, a methodology of quantum mechanical/molecular mechanical (QM/MM) free-energy geometry optimization is introduced. In the methodology, a geometry optimization of a local reaction center is performed with a quantum mechanical calculation on a free-energy surface constructed with conformational samples of the surrounding protein environment obtained by a molecular dynamics simulation with a molecular mechanics force field. Geometry optimizations on extensive free-energy surfaces by a QM/MM reweighting free-energy self-consistent field method designed to be variationally consistent and computationally efficient have enabled examinations of the multiscale molecular coupling of local chemical states with global protein conformational changes in functional processes and analysis and design of protein mutants with novel functional properties.

  18. Comparative performance of an elitist teaching-learning-based optimization algorithm for solving unconstrained optimization problems

    Directory of Open Access Journals (Sweden)

    R. Venkata Rao

    2013-01-01

    Teaching-Learning-Based Optimization (TLBO) is a recently proposed population-based algorithm which simulates the teaching-learning process of the classroom. This algorithm requires only the common control parameters and does not require any algorithm-specific control parameters. In this paper, the effect of elitism on the performance of the TLBO algorithm is investigated while solving unconstrained benchmark problems. The effects of common control parameters such as the population size and the number of generations on the performance of the algorithm are also investigated. The proposed algorithm is tested on 76 unconstrained benchmark functions with different characteristics, and its performance is compared with that of other well-known optimization algorithms. A statistical test is also performed to investigate the results obtained using the different algorithms. The results prove the effectiveness of the proposed elitist TLBO algorithm.
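
    For concreteness, the sketch below shows an elitist TLBO teacher phase on a stand-in sphere function; the parameter choices (population size, two elites) are illustrative, not the paper's experimental setup.

      import numpy as np

      def sphere(x):
          return float(np.sum(x * x))

      rng = np.random.default_rng(0)
      pop = rng.uniform(-5, 5, size=(20, 3))
      for _ in range(100):
          fit = np.apply_along_axis(sphere, 1, pop)
          elite = pop[np.argsort(fit)[:2]].copy()          # keep the two best learners
          teacher, mean_l = pop[np.argmin(fit)], pop.mean(axis=0)
          tf = rng.integers(1, 3)                          # teaching factor in {1, 2}
          cand = pop + rng.random(pop.shape) * (teacher - tf * mean_l)
          better = np.apply_along_axis(sphere, 1, cand) < fit
          pop[better] = cand[better]                       # greedy acceptance
          fit = np.apply_along_axis(sphere, 1, pop)
          pop[np.argsort(fit)[-2:]] = elite                # elites replace the worst
      print(min(np.apply_along_axis(sphere, 1, pop)))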

  19. Using benchmarking for the primary allocation of EU allowances. An application to the German power sector

    Energy Technology Data Exchange (ETDEWEB)

    Schleich, J.; Cremer, C.

    2007-07-01

    Basing allocation of allowances for existing installations under the EU Emissions Trading Scheme on specific emission values (benchmarks) rather than on historic emissions may have several advantages. Benchmarking may recognize early action, provide higher incentives for replacing old installations and result in fewer distortions in case of updating, facilitate EU-wide harmonization of allocation rules or allow for simplified and more efficient closure rules. Applying an optimization model for the German power sector, we analyze the distributional effects of various allocation regimes across and within different generation technologies. Results illustrate that regimes with a single uniform benchmark for all fuels or with a single benchmark for coal- and lignite-fired plants imply substantial distributional effects. In particular, lignite- and old coal-fired plants would be made worse off. Under a regime with fuel-specific benchmarks for gas, coal, and lignite, 50% of the gas-fired plants and 4% of the lignite- and coal-fired plants would face an allowance deficit of at least 10%, while primarily modern lignite-fired plants would benefit. Capping the surplus and shortage of allowances would further moderate the distributional effects, but may tarnish incentives for efficiency improvements and recognition of early action. (orig.)

  20. Benchmarking of human resources management

    Directory of Open Access Journals (Sweden)

    David M. Akinnusi

    2008-11-01

    This paper reviews the role of human resource management (HRM), which today plays a strategic partnership role in management. The focus of the paper is on HRM in the public sector, where much hope rests on HRM as a means of transforming the public service and achieving much-needed service delivery. However, a critical evaluation of HRM practices in the public sector reveals that these services leave much to be desired. The paper suggests the adoption of benchmarking as a process to revamp HRM in the public sector so that it is able to deliver on its promises. It describes the nature and process of benchmarking and highlights the inherent difficulties in applying benchmarking to HRM. It concludes with some suggestions for a plan of action. The process of identifying "best" practices in HRM requires the best collaborative efforts of HRM practitioners and academicians. If used creatively, benchmarking has the potential to bring about radical and positive changes in HRM in the public sector. The adoption of the benchmarking process is, in itself, a litmus test of the extent to which HRM in the public sector has grown professionally.

  1. Integrating Best Practice and Performance Indicators To Benchmark the Performance of a School System. Benchmarking Paper 940317.

    Science.gov (United States)

    Cuttance, Peter

    This paper provides a synthesis of the literature on the role of benchmarking, with a focus on its use in the public sector. Benchmarking is discussed in the context of quality systems, of which it is an important component. The paper describes the basic types of benchmarking, pertinent research about its application in the public sector, the…

  2. Benchmarking Computational Fluid Dynamics for Application to PWR Fuel

    International Nuclear Information System (INIS)

    Smith, L.D. III; Conner, M.E.; Liu, B.; Dzodzo, B.; Paramonov, D.V.; Beasley, D.E.; Langford, H.M.; Holloway, M.V.

    2002-01-01

    The present study demonstrates a process used to develop confidence in Computational Fluid Dynamics (CFD) as a tool to investigate flow and temperature distributions in a PWR fuel bundle. The velocity and temperature fields produced by a mixing spacer grid of a PWR fuel assembly are quite complex. Before using CFD to evaluate these flow fields, a rigorous benchmarking effort should be performed to ensure that reasonable results are obtained. Westinghouse has developed a method to quantitatively benchmark CFD tools against data at conditions representative of the PWR. Several measurements in a 5 x 5 rod bundle were performed. Lateral flow-field testing employed visualization techniques and Particle Image Velocimetry (PIV). Heat transfer testing involved measurements of the single-phase heat transfer coefficient downstream of the spacer grid. These test results were used for comparison with CFD predictions. The parameters optimized in the CFD models on the basis of this comparison with data include the computational mesh, the turbulence model, and the boundary conditions. As an outcome of this effort, a methodology was developed for CFD modeling that provides confidence in the numerical results. (authors)

  3. Benchmarking and Sustainable Transport Policy

    DEFF Research Database (Denmark)

    Gudmundsson, Henrik; Wyatt, Andrew; Gordon, Lucy

    2004-01-01

    Order to learn from the best. In 2000 the European Commission initiated research to explore benchmarking as a tool to promote policies for ‘sustainable transport’. This paper reports findings and recommendations on how to address this challenge. The findings suggest that benchmarking is a valuable tool that may indeed help to move forward the transport policy agenda. However, there are major conditions and limitations. First of all, it is not always so straightforward to delimit, measure and compare transport services in order to establish a clear benchmark. Secondly, ‘sustainable transport’ evokes a broad range of concerns that are hard to address fully at the level of specific practices. Thirdly, policies are not directly comparable across space and context. For these reasons, attempting to benchmark ‘sustainable transport policies’ against one another would be a highly complex task, which

  4. PROCEDURES FOR THE DERIVATION OF EQUILIBRIUM ...

    Science.gov (United States)

    This equilibrium partitioning sediment benchmark (ESB) document describes procedures to derive concentrations for 32 nonionic organic chemicals in sediment which are protective of the presence of freshwater and marine benthic organisms. The equilibrium partitioning (EqP) approach was chosen because it accounts for the varying biological availability of chemicals in different sediments and allows for the incorporation of the appropriate biological effects concentration. This provides for the derivation of benchmarks that are causally linked to the specific chemical, applicable across sediments, and appropriately protective of benthic organisms. EqP can be used to calculate ESBs for any toxicity endpoint for which there are water-only toxicity data; it is not limited to any single effect endpoint. For the purposes of this document, ESBs for 32 nonionic organic chemicals, including several low molecular weight aliphatic and aromatic compounds, pesticides, and phthalates, were derived using Final Chronic Values (FCV) from Water Quality Criteria (WQC) or Secondary Chronic Values (SCV) derived from existing toxicological data using the Great Lakes Water Quality Initiative (GLI) or narcosis theory approaches. These values are intended to be the concentration of each chemical in water that is protective of the presence of aquatic life. For nonionic organic chemicals demonstrating a narcotic mode of action, ESBs derived using the GLI approach specifically for freshwater

  5. Defining core elements and outstanding practice in Nutritional Science through collaborative benchmarking.

    Science.gov (United States)

    Samman, Samir; McCarthur, Jennifer O; Peat, Mary

    2006-01-01

    Benchmarking has been adopted by educational institutions as a potentially sensitive tool for improving learning and teaching. To date there has been limited application of benchmarking methodology in the Discipline of Nutritional Science. The aim of this survey was to define core elements and outstanding practice in Nutritional Science through collaborative benchmarking. Questionnaires that aimed to establish proposed core elements for Nutritional Science, and inquired about definitions of "good" and "outstanding" practice, were posted to named representatives at eight Australian universities. Seven respondents identified core elements that included knowledge of nutrient metabolism and requirement, food production and processing, modern biomedical techniques that could be applied to understanding nutrition, and social and environmental issues as related to Nutritional Science. Four of the eight institutions who agreed to participate in the present survey identified the integration of teaching with research as an indicator of outstanding practice. Nutritional Science is a rapidly evolving discipline. Further and more comprehensive surveys are required to consolidate and update the definition of the discipline, and to identify the optimal way of teaching it. Global ideas and specific regional requirements also need to be considered.

  6. Semiempirical Quantum-Chemical Orthogonalization-Corrected Methods: Benchmarks for Ground-State Properties.

    Science.gov (United States)

    Dral, Pavlo O; Wu, Xin; Spörkel, Lasse; Koslowski, Axel; Thiel, Walter

    2016-03-08

    The semiempirical orthogonalization-corrected OMx methods (OM1, OM2, and OM3) go beyond the standard MNDO model by including additional interactions in the electronic structure calculation. When augmented with empirical dispersion corrections, the resulting OMx-Dn approaches offer a fast and robust treatment of noncovalent interactions. Here we evaluate the performance of the OMx and OMx-Dn methods for a variety of ground-state properties using a large and diverse collection of benchmark sets from the literature, with a total of 13035 original and derived reference data. Extensive comparisons are made with the results from established semiempirical methods (MNDO, AM1, PM3, PM6, and PM7) that also use the NDDO (neglect of diatomic differential overlap) integral approximation. Statistical evaluations show that the OMx and OMx-Dn methods outperform the other methods for most of the benchmark sets.
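    The statistical evaluation described here amounts to computing error statistics over the reference data. A minimal sketch of the two most common summary measures follows; the heats of formation below are hypothetical stand-ins for benchmark entries, not values from the paper.

    import math

    def error_stats(predicted, reference):
        """Mean absolute deviation and RMSD against benchmark reference data."""
        errors = [p - r for p, r in zip(predicted, reference)]
        mad = sum(abs(e) for e in errors) / len(errors)
        rmsd = math.sqrt(sum(e * e for e in errors) / len(errors))
        return mad, rmsd

    # Hypothetical heats of formation (kcal/mol): method predictions vs reference
    reference = [-17.8, 12.5, -94.1]
    method    = [-16.9, 13.1, -92.8]
    print(error_stats(method, reference))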

  7. Coevolutionary particle swarm optimization using Gaussian distribution for solving constrained optimization problems.

    Science.gov (United States)

    Krohling, Renato A; Coelho, Leandro dos Santos

    2006-12-01

    In this correspondence, an approach based on coevolutionary particle swarm optimization to solve constrained optimization problems formulated as min-max problems is presented. In standard or canonical particle swarm optimization (PSO), a uniform probability distribution is used to generate random numbers for the accelerating coefficients of the local and global terms. We propose a Gaussian probability distribution to generate the accelerating coefficients of PSO. Two populations of PSO using Gaussian distribution are used on the optimization algorithm that is tested on a suite of well-known benchmark constrained optimization problems. Results have been compared with the canonical PSO (constriction factor) and with a coevolutionary genetic algorithm. Simulation results show the suitability of the proposed algorithm in terms of effectiveness and robustness.
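    The modified velocity update is easy to sketch. Below is a minimal Python illustration, assuming the variant in which the uniform random multipliers of the cognitive and social terms are replaced by the absolute value of standard normal draws; the inertia weight value and the absolute-value sampling scheme are assumptions for illustration, not details taken from the abstract.

    import random

    def velocity_update(v, x, pbest, gbest, w=0.729):
        """One PSO velocity update with Gaussian-sampled acceleration terms.
        The random multipliers are drawn from |N(0,1)| rather than U(0,1);
        the inertia weight is an assumed, commonly used constriction value."""
        r1 = abs(random.gauss(0.0, 1.0))
        r2 = abs(random.gauss(0.0, 1.0))
        return [w * vi + r1 * (p - xi) + r2 * (g - xi)
                for vi, xi, p, g in zip(v, x, pbest, gbest)]

    x, v = [0.5, -1.0], [0.0, 0.0]
    v = velocity_update(v, x, pbest=[0.2, -0.8], gbest=[0.0, 0.0])
    x = [xi + vi for xi, vi in zip(x, v)]   # position update follows as usual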

  8. On the Performance of Linear Decreasing Inertia Weight Particle Swarm Optimization for Global Optimization

    Science.gov (United States)

    Arasomwan, Martins Akugbe; Adewumi, Aderemi Oluyinka

    2013-01-01

    The linear decreasing inertia weight (LDIW) strategy was introduced to improve the performance of the original particle swarm optimization (PSO). However, the LDIW-PSO algorithm is known to suffer from premature convergence when solving complex (multipeak) optimization problems, because particles lack sufficient momentum for exploitation as the algorithm approaches its terminal point. Researchers have tried to address this shortcoming by modifying LDIW-PSO or proposing new PSO variants, some of which have been claimed to outperform LDIW-PSO. The major goal of this paper is to establish experimentally that LDIW-PSO is very efficient if its parameters are properly set. First, an experiment was conducted to acquire a percentage value of the search space limits from which to compute the particle velocity limits in LDIW-PSO, based on commonly used benchmark global optimization problems. Second, using the experimentally obtained values, five well-known benchmark optimization problems were used to show the outstanding performance of LDIW-PSO over some of its competitors which had previously claimed superiority over it. Two other recent PSO variants with different inertia weight strategies were also compared with LDIW-PSO, with the latter outperforming both in the simulation experiments conducted. PMID:24324383
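    The LDIW schedule itself is a one-line formula, w(t) = w_start - (w_start - w_end) * t / t_max. A minimal sketch follows, assuming the commonly used 0.9 to 0.4 range and a hypothetical velocity-limit percentage; the paper determines that percentage experimentally, so the 0.5 below is a placeholder only.

    def ldiw(t, t_max, w_start=0.9, w_end=0.4):
        """Linear decreasing inertia weight: interpolate w_start -> w_end."""
        return w_start - (w_start - w_end) * t / t_max

    def v_max(lower, upper, fraction=0.5):
        """Velocity clamp as a fraction of the search-space width; the
        fraction here is a hypothetical stand-in for the paper's value."""
        return fraction * (upper - lower)

    print(ldiw(0, 1000), ldiw(500, 1000), ldiw(1000, 1000))  # 0.9, 0.65, 0.4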

  9. Nitrogen and sulfur dual-doped chitin-derived carbon/graphene composites as effective metal-free electrocatalysts for dye sensitized solar cells

    Science.gov (United States)

    Di, Yi; Xiao, Zhanhai; Yan, Xiaoshuang; Ru, Geying; Chen, Bing; Feng, Jiwen

    2018-05-01

    The photovoltaic performance of a dye-sensitized solar cell (DSSC) is strongly influenced by the electrocatalytic ability of its counter electrode (CE) materials. To obtain affordable, high-performance electrocatalysts, N/S dual-doped chitin-derived carbon materials (SCCh) were manufactured via an in-situ S-doping method during the annealing process, which creates richer active sites than the pristine chitin-derived carbon matrix (CCh) and thus enhances the intrinsic catalytic activity of the carbon materials. When SCCh is incorporated with graphene, the resulting composites show further boosted catalytic activity owing to facilitated fast electron transfer. The DSSC assembled with the optimized rGO-SCCh-3 composite CE shows a favourable power conversion efficiency of 6.36%, comparable with that of a Pt-sputtered electrode (6.30%), indicative of the outstanding I3- reduction ability of the composite material. The electrochemical characterizations demonstrate that the low charge transfer resistance and excellent electrocatalytic activity both contribute to the superior photovoltaic performance. More importantly, the composite CE exhibits good electrochemical stability in practical operation. In consideration of the low cost and the simple preparation procedure, the present metal-free carbonaceous composites could be used as a promising counter electrode material in future large-scale production of DSSCs.

  10. Benchmarking for Cost Improvement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The US Department of Energy's (DOE) Office of Environmental Restoration and Waste Management (EM) conducted the Benchmarking for Cost Improvement initiative with three objectives: Pilot test benchmarking as an EM cost improvement tool; identify areas for cost improvement and recommend actions to address these areas; provide a framework for future cost improvement. The benchmarking initiative featured the use of four principal methods (program classification, nationwide cost improvement survey, paired cost comparison and component benchmarking). Interested parties contributed during both the design and execution phases. The benchmarking initiative was conducted on an accelerated basis. Of necessity, it considered only a limited set of data that may not be fully representative of the diverse and complex conditions found at the many DOE installations. The initiative generated preliminary data about cost differences and it found a high degree of convergence on several issues. Based on this convergence, the report recommends cost improvement strategies and actions. This report describes the steps taken as part of the benchmarking initiative and discusses the findings and recommended actions for achieving cost improvement. The results and summary recommendations, reported below, are organized by the study objectives.

  11. Benchmarking for controllere: metoder, teknikker og muligheder

    DEFF Research Database (Denmark)

    Bukh, Per Nikolaj; Sandalgaard, Niels Erik; Dietrichson, Lars Grubbe

    2008-01-01

    Benchmarking figures in many ways in the management practice of both private and public organizations. In management accounting, benchmark-based indicators (or key figures) are used, for example, when setting targets in performance contracts or to specify the desired level of certain key figures in a Balanced Scorecard or similar performance management models. The article explains the concept of benchmarking by presenting and discussing its different facets, and describes four different applications of benchmarking to show the breadth of the concept and the importance of clarifying the purpose of a benchmarking project. It then treats the difference between results benchmarking and process benchmarking, followed by the use of internal versus external benchmarking and the use of benchmarking in budgeting and budget follow-up.

  12. Professional Performance and Bureaucratic Benchmarking Information

    DEFF Research Database (Denmark)

    Schneider, Melanie L.; Mahlendorf, Matthias D.; Schäffer, Utz

    Prior research documents positive effects of benchmarking information provision on performance and attributes this to social comparisons. However, the effects on professional recipients are unclear. Studies of professional control indicate that professional recipients often resist bureaucratic controls because of organizational-professional conflicts. We therefore analyze the association between bureaucratic benchmarking information provision and professional performance and suggest that the association is more positive if prior professional performance was low. We test our hypotheses based on archival, publicly disclosed, professional performance data for 191 German orthopedics departments, matched with survey data on bureaucratic benchmarking information given to chief orthopedists by the administration. We find a positive association between bureaucratic benchmarking information provision...

  13. LQG/LTR optimal attitude control of small flexible spacecraft using free-free boundary conditions

    Science.gov (United States)

    Fulton, Joseph M.

    Due to the volume and power limitations of a small satellite, careful consideration must be given when designing an attitude control system for 3-axis stabilization. Placing redundancy in the system proves difficult, and utilizing power-hungry, high-accuracy, active actuators is not a viable option. Thus, it is customary to find dependable, passive actuators used in conjunction with small-scale active control components. This document describes the application of Elastic Memory Composite materials in the construction of a flexible spacecraft appendage, such as a gravity gradient boom. Assumed modes methods are used with Finite Element Modeling information to obtain the equations of motion for the system while assuming free-free boundary conditions. A discussion is provided to illustrate how cantilever mode shapes are not always the best assumption when modeling small flexible spacecraft. A key point of interest is that first resonant modes may be needed in the system design plant even though these modes are more than an order of magnitude higher in frequency than the crossover frequency of the controller. LQG/LTR optimal control techniques are implemented to compute attitude control gains while controller robustness considerations determine appropriate reduced order controllers and which flexible modes to include in the design model. Key satellite designer concerns in the areas of computer processor sizing, material uncertainty impacts on the system model, and system performance variations resulting from appendage length modifications are addressed.

  14. An extension to artifact-free projection overlaps

    International Nuclear Information System (INIS)

    Lin, Jianyu

    2015-01-01

    Purpose: In multipinhole single photon emission computed tomography, the overlapping of projections has been used to increase sensitivity. Avoiding artifacts in the reconstructed image associated with projection overlaps (multiplexing) is a critical issue. In our previous report, two types of artifact-free projection overlaps, i.e., projection overlaps that do not lead to artifacts in the reconstructed image, were formally defined and proved, and were validated via simulations. In this work, a new proposition is introduced to extend the previously defined type-II artifact-free projection overlaps so that a broader range of artifact-free overlaps is accommodated. One practical purpose of the new extension is to design a baffle window multipinhole system with artifact-free projection overlaps. Methods: First, the extended type-II artifact-free overlap was theoretically defined and proved. The new proposition accommodates the situation where the extended type-II artifact-free projection overlaps can be produced with incorrectly reconstructed portions in the reconstructed image. Next, to validate the theory, the extended type-II artifact-free overlaps were employed in designing the multiplexing multipinhole spiral orbit imaging systems with a baffle window. Numerical validations were performed via simulations, where the corresponding 1-pinhole nonmultiplexing reconstruction results were used as the benchmark for artifact-free reconstructions. The mean square error (MSE) was the metric used for comparisons of noise-free reconstructed images. Noisy reconstructions were also performed as part of the validations. Results: Simulation results show that for noise-free reconstructions, the MSEs of the reconstructed images of the artifact-free multiplexing systems are very similar to those of the corresponding 1-pinhole systems. No artifacts were observed in the reconstructed images. Therefore, the testing results for artifact-free multiplexing systems designed using the extended type-II artifact-free overlaps...

  15. Antioxidant and DPPH (1,1-diphenyl-2-picrylhydrazyl Free Radical Scavenging Activities of Boniger Acid and Calix[4]arene Derivative

    Directory of Open Access Journals (Sweden)

    E. ERDEM

    2014-07-01

    Full Text Available A diazonium derivative of calix[4]arene has been synthesized in three synthetic steps. Initially, p-tert-butylcalix[4]arene was synthesized by the condensation reaction of p-tert-butylphenol and formaldehyde under basic conditions. Calix[4]arene was obtained after the debutylation reaction of p-tert-butylcalix[4]arene with AlCl3. Calix[4]arene was then reacted with the diazonium salt of Böniger acid to yield 5,17-[(Bis(azo-bis(5-hydroxy-2,7-naphthalenedisulfonic acid]-25,26,27,28-tetrahydroxy calix[4]arene, which has eight free phenolic hydroxyl groups. Reaction steps are shown in Fig. 1. The antioxidant activity of Böniger acid and the calix[4]arene derivative was determined using the β-carotene-linoleic acid system, and their free radical scavenging activity was tested with the DPPH free radical. Both compounds showed strong antioxidant activity: the total antioxidant activity of Böniger acid and the calix[4]arene derivative, determined with the β-carotene-linoleic acid model system, was 84.00% and 85.60%, respectively, and the free radical scavenging activities were 83.05% and 84.69%. The results show that both compounds have antioxidant activity; the calix[4]arene derivative has somewhat higher activity than Böniger acid because it carries more hydroxyl groups.

  16. EPA's Benchmark Dose Modeling Software

    Science.gov (United States)

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  17. Accelerator shielding benchmark problems

    International Nuclear Information System (INIS)

    Hirayama, H.; Ban, S.; Nakamura, T.

    1993-01-01

    Accelerator shielding benchmark problems prepared by Working Group of Accelerator Shielding in the Research Committee on Radiation Behavior in the Atomic Energy Society of Japan were compiled by Radiation Safety Control Center of National Laboratory for High Energy Physics. Twenty-five accelerator shielding benchmark problems are presented for evaluating the calculational algorithm, the accuracy of computer codes and the nuclear data used in codes. (author)

  18. Belief Propagation Algorithm for Portfolio Optimization Problems.

    Science.gov (United States)

    Shinzato, Takashi; Yasuda, Muneki

    2015-01-01

    The typical behavior of optimal solutions to portfolio optimization problems with absolute deviation and expected shortfall models using replica analysis was pioneeringly estimated by S. Ciliberti et al. [Eur. Phys. B. 57, 175 (2007)]; however, they have not yet developed an approximate derivation method for finding the optimal portfolio with respect to a given return set. In this study, an approximation algorithm based on belief propagation for the portfolio optimization problem is presented using the Bethe free energy formalism, and the consistency of the numerical experimental results of the proposed algorithm with those of replica analysis is confirmed. Furthermore, the conjecture of H. Konno and H. Yamazaki, that the optimal solutions with the absolute deviation model and with the mean-variance model have the same typical behavior, is verified using replica analysis and the belief propagation algorithm.

  19. Belief Propagation Algorithm for Portfolio Optimization Problems.

    Directory of Open Access Journals (Sweden)

    Takashi Shinzato

    Full Text Available The typical behavior of optimal solutions to portfolio optimization problems with absolute deviation and expected shortfall models using replica analysis was pioneeringly estimated by S. Ciliberti et al. [Eur. Phys. B. 57, 175 (2007]; however, they have not yet developed an approximate derivation method for finding the optimal portfolio with respect to a given return set. In this study, an approximation algorithm based on belief propagation for the portfolio optimization problem is presented using the Bethe free energy formalism, and the consistency of the numerical experimental results of the proposed algorithm with those of replica analysis is confirmed. Furthermore, the conjecture of H. Konno and H. Yamazaki, that the optimal solutions with the absolute deviation model and with the mean-variance model have the same typical behavior, is verified using replica analysis and the belief propagation algorithm.

  20. G-Doob-Meyer Decomposition and Its Applications in Bid-Ask Pricing for Derivatives under Knightian Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2015-01-01

    Full Text Available The target of this paper is to establish the bid-ask pricing framework for the American contingent claims against risky assets with G-asset price systems on the financial market under Knightian uncertainty. First, we prove G-Dooby-Meyer decomposition for G-supermartingale. Furthermore, we consider bid-ask pricing American contingent claims under Knightian uncertainty, by using G-Dooby-Meyer decomposition; we construct dynamic superhedge strategies for the optimal stopping problem and prove that the value functions of the optimal stopping problems are the bid and ask prices of the American contingent claims under Knightian uncertainty. Finally, we consider a free boundary problem, prove the strong solution existence of the free boundary problem, and derive that the value function of the optimal stopping problem is equivalent to the strong solution to the free boundary problem.

  1. Mechanism of transfer of LDL-derived free cholesterol to HDL subfractions in human plasma

    International Nuclear Information System (INIS)

    Miida, T.; Fielding, C.J.; Fielding, P.E.

    1990-01-01

    The transfer of [ 3 H]cholesterol in low-density lipoprotein (LDL) to different high-density lipoprotein (HDL) species in native human plasma was determined by using nondenaturing two-dimensional electrophoresis. Transfer from LDL had a t 1/2 at 37 degree C of 51 ± 8 min and an activation energy of 18.0 kCal mol -1 . There was unexpected specificity among HDL species as acceptors of LDL-derived labeled cholesterol. The largest fraction of the major α-migrating class (HDL 2b ) was the major initial acceptor of LDL-derived cholesterol. Kinetic analysis indicated a rapid secondary transfer from HDL 2b to smaller αHDL (particularly HDL 3 ) driven enzymatically by the lecithin-cholesterol acyltransferase reaction. Rates of transfer among αHDL were most rapid from the largest αHDL fraction (HDL 2b ), suggesting possible protein-mediated facilitation. Simultaneous measurements of the transport of LDL-derived and cell-derived isotopic cholesterol indicated that the former preferably utilized the αHDL pathyway, with little label in pre-βHDL. The same experiments confirmed earlier data that cell-derived cholesterol is preferentially channeled through pre-βHDL. The authors suggest that the functional heterogeneity of HDL demonstrated here includes the ability to independently process cell- and LDL-derived free cholesterol

  2. Deviating From the Benchmarks

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; Carneiro, Anabela

    This paper studies three related questions: To what extent otherwise similar startups employ different quantities and qualities of human capital at the moment of entry? How persistent are initial human capital choices over time? And how does deviating from human capital benchmarks influence firm......, founders human capital, and the ownership structure of startups (solo entrepreneurs versus entrepreneurial teams). We then study the survival implications of exogenous deviations from these benchmarks, based on spline models for survival data. Our results indicate that (especially negative) deviations from...... the benchmark can be substantial, are persistent over time, and hinder the survival of firms. The implications may, however, vary according to the sector and the ownership structure at entry. Given the stickiness of initial choices, wrong human capital decisions at entry turn out to be a close to irreversible...

  3. A dynamic global and local combined particle swarm optimization algorithm

    International Nuclear Information System (INIS)

    Jiao Bin; Lian Zhigang; Chen Qunxian

    2009-01-01

    Particle swarm optimization (PSO) algorithm has been developing rapidly and many results have been reported. PSO algorithm has shown some important advantages by providing high speed of convergence in specific problems, but it has a tendency to get stuck in a near optimal solution and one may find it difficult to improve solution accuracy by fine tuning. This paper presents a dynamic global and local combined particle swarm optimization (DGLCPSO) algorithm to improve the performance of original PSO, in which all particles dynamically share the best information of the local particle, global particle and group particles. It is tested with a set of eight benchmark functions with different dimensions and compared with original PSO. Experimental results indicate that the DGLCPSO algorithm improves the search performance on the benchmark functions significantly, and shows the effectiveness of the algorithm to solve optimization problems.

  4. The influence of global benchmark oil prices on the regional oil spot market in multi-period evolution

    International Nuclear Information System (INIS)

    Jiang, Meihui; An, Haizhong; Jia, Xiaoliang; Sun, Xiaoqi

    2017-01-01

    Crude benchmark oil prices play a crucial role in energy policy and investment management. Previous research confined itself to studying the static, uncertain, short- or long-term relationship between global benchmark oil prices, ignoring the time-varying, quantitative, dynamic nature of the relationship during various stages of oil price volatility. This paper proposes a novel approach combining grey relation analysis, optimization wavelet analysis, and Bayesian network modeling to explore the multi-period evolution of the dynamic relationship between global benchmark oil prices and regional oil spot price. We analyze the evolution of the most significant decision-making risk periods, as well as the combined strategy-making reference oil prices and the corresponding periods during various stages of volatility. Furthermore, we determine that the network evolution of the quantitative lead/lag relationship between different influences of global benchmark oil prices shows a multi-period evolution phenomenon. For policy makers and market investors, our combined model can provide decision-making periods with the lowest expected risk and decision-making target reference oil prices and corresponding weights for strategy adjustment and market arbitrage. This study provides further information regarding period weights of target reference oil prices, facilitating efforts to perform multi-agent energy policy and intertemporal market arbitrage. - Highlights: • Multi-period evolution of the influence of different oil prices is discovered. • We combined grey relation analysis, optimization wavelet and Bayesian network. • The intensity of volatility, synchronization, and lead/lag effects are analyzed. • The target reference oil prices and corresponding period weights are determined.

  5. Developing integrated benchmarks for DOE performance measurement

    Energy Technology Data Exchange (ETDEWEB)

    Barancik, J.I.; Kramer, C.F.; Thode, Jr. H.C.

    1992-09-30

    The objectives of this task were to describe and evaluate selected existing sources of information on occupational safety and health with emphasis on hazard and exposure assessment, abatement, training, reporting, and control identifying for exposure and outcome in preparation for developing DOE performance benchmarks. Existing resources and methodologies were assessed for their potential use as practical performance benchmarks. Strengths and limitations of current data resources were identified. Guidelines were outlined for developing new or improved performance factors, which then could become the basis for selecting performance benchmarks. Data bases for non-DOE comparison populations were identified so that DOE performance could be assessed relative to non-DOE occupational and industrial groups. Systems approaches were described which can be used to link hazards and exposure, event occurrence, and adverse outcome factors, as needed to generate valid, reliable, and predictive performance benchmarks. Data bases were identified which contain information relevant to one or more performance assessment categories . A list of 72 potential performance benchmarks was prepared to illustrate the kinds of information that can be produced through a benchmark development program. Current information resources which may be used to develop potential performance benchmarks are limited. There is need to develop an occupational safety and health information and data system in DOE, which is capable of incorporating demonstrated and documented performance benchmarks prior to, or concurrent with the development of hardware and software. A key to the success of this systems approach is rigorous development and demonstration of performance benchmark equivalents to users of such data before system hardware and software commitments are institutionalized.

  6. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
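    The identity-circuit idea is simple to illustrate outside any particular hardware platform. Below is a minimal state-vector sketch, assuming ideal gates; on a real device, the probability of recovering the initial state drops below 1 in proportion to the gate errors such a benchmark is designed to expose. This is a plain simulation, not the interface of the cloud platform discussed in the record.

    import numpy as np

    X = np.array([[0, 1], [1, 0]], dtype=complex)            # Pauli-X gate
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    def fidelity_after(circuit, state):
        """Probability of recovering the initial state after the circuit."""
        out = state
        for gate in circuit:
            out = gate @ out
        return abs(np.vdot(state, out)) ** 2

    ket0 = np.array([1, 0], dtype=complex)
    print(fidelity_after([H, H], ket0))   # identity circuit: ~1.0 when ideal
    print(fidelity_after([X, X], ket0))   # identity circuit: ~1.0 when ideal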

  7. A Heterogeneous Medium Analytical Benchmark

    International Nuclear Information System (INIS)

    Ganapol, B.D.

    1999-01-01

    A benchmark, called benchmark BLUE, has been developed for one-group neutral particle (neutron or photon) transport in a one-dimensional sub-critical heterogeneous plane parallel medium with surface illumination. General anisotropic scattering is accommodated through the Green's Function Method (GFM). Numerical Fourier transform inversion is used to generate the required Green's functions, which are kernels to coupled integral equations that give the exiting angular fluxes. The interior scalar flux is then obtained through quadrature. A compound iterative procedure for quadrature order and slab surface source convergence provides highly accurate benchmark-quality (4 to 5 places of accuracy) results.

  8. Benchmarking i eksternt regnskab og revision

    DEFF Research Database (Denmark)

    Thinggaard, Frank; Kiertzner, Lars

    2001-01-01

    ... on an ongoing basis in a benchmarking process. This chapter broadly examines the extent to which the concept of benchmarking can reasonably be linked to external financial reporting and auditing. Section 7.1 deals with the external annual report, while Section 7.2 takes up the field of auditing. The final section of the chapter summarizes the considerations on benchmarking in relation to both areas.

  9. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties...
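    A scoring system of the kind sketched in the second challenge can be illustrated in a few lines. The toy example below is an assumption for illustration only: it normalizes each data-model mismatch by the observational spread and combines the results with weights; the exponential scoring rule, the variable choices, and the weights are all hypothetical.

    import math

    def variable_score(rmse, obs_std):
        """Toy per-variable score: 1 = perfect, decays toward 0 as the
        normalized mismatch grows (the exponential rule is an assumption)."""
        return math.exp(-rmse / obs_std)

    def overall_score(mismatches, weights):
        total = sum(weights)
        return sum(w * variable_score(r, s)
                   for (r, s), w in zip(mismatches, weights)) / total

    # (RMSE, observed std) for, e.g., GPP, ET, soil carbon; weights hypothetical
    mismatches = [(1.2, 2.0), (0.4, 1.0), (8.0, 10.0)]
    print(round(overall_score(mismatches, weights=[2, 1, 1]), 3))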

  10. EPRI depletion benchmark calculations using PARAGON

    International Nuclear Information System (INIS)

    Kucukboyaci, Vefa N.

    2015-01-01

    Highlights: • PARAGON depletion calculations are benchmarked against the EPRI reactivity decrement experiments. • Benchmarks cover a wide range of enrichments, burnups, cooling times, and burnable absorbers, and different depletion and storage conditions. • Results from the PARAGON-SCALE scheme are more conservative relative to the benchmark data. • ENDF/B-VII-based data reduce the excess conservatism and bring the predictions closer to benchmark reactivity decrement values. - Abstract: In order to conservatively apply burnup credit in spent fuel pool criticality analyses, code validation for both fresh and used fuel is required. Fresh fuel validation is typically done by modeling experiments from the “International Handbook.” A depletion validation can determine a bias and bias uncertainty for the worth of the isotopes not found in the fresh fuel critical experiments. Westinghouse’s burnup credit methodology uses PARAGON™ (Westinghouse 2-D lattice physics code) and its 70-group cross-section library, which have been benchmarked, qualified, and licensed both as a standalone transport code and as a nuclear data source for core design simulations. A bias and bias uncertainty for the worth of depletion isotopes, however, are not available for PARAGON. Instead, the 5% decrement approach for depletion uncertainty is used, as set forth in the Kopp memo. Recently, EPRI developed a set of benchmarks based on a large set of power distribution measurements to ascertain reactivity biases. The depletion reactivity has been used to create 11 benchmark cases for 10, 20, 30, 40, 50, and 60 GWd/MTU and 3 cooling times: 100 h, 5 years, and 15 years. These benchmark cases are analyzed with PARAGON and the SCALE package, and sensitivity studies are performed using different cross-section libraries based on ENDF/B-VI.3 and ENDF/B-VII data to assess that the 5% decrement approach is conservative for determining depletion uncertainty.

  11. Vegetable milks and their fermented derivative products

    Directory of Open Access Journals (Sweden)

    Neus Bernat

    2014-04-01

    Full Text Available The so-called vegetable milks are in the spotlight thanks to their lactose-free, animal-protein-free and cholesterol-free features, which fit well with the current demand for healthy food products. Nevertheless, and with the exception of soya, little information is available about these types of milks and their derivatives. The aims of this review, therefore, are to: highlight the main nutritional benefits of the nut and cereal vegetable milks available on the market, fermented or not; describe the basic processing steps involved in their manufacturing process; and analyze the major problems affecting their overall quality, together with the current feasible solutions. On the basis of the information gathered, vegetable milks and their derivatives have excellent nutritional properties which give them high potential and a positive market expectation. Nevertheless, optimal processing conditions for each raw material or the application of new technologies have to be researched in order to improve the quality of the products. Hence, further studies need to be developed to ensure the physical stability of the products throughout their whole shelf-life. These studies would also allow for a reduction in the amount of additives (hydrocolloids and/or emulsifiers) and thus reduce the cost of the products. In the particular case of fermented products, the use of starters which are able to both improve the quality (by synthesizing enhanced flavors and providing optimal textures) and exert health benefits for consumers (i.e. probiotics) is the main challenge to be faced in future studies.

  12. Ad hoc committee on reactor physics benchmarks

    International Nuclear Information System (INIS)

    Diamond, D.J.; Mosteller, R.D.; Gehin, J.C.

    1996-01-01

    In the spring of 1994, an ad hoc committee on reactor physics benchmarks was formed under the leadership of two American Nuclear Society (ANS) organizations. The ANS-19 Standards Subcommittee of the Reactor Physics Division and the Computational Benchmark Problem Committee of the Mathematics and Computation Division had both seen a need for additional benchmarks to help validate computer codes used for light water reactor (LWR) neutronics calculations. Although individual organizations had employed various means to validate the reactor physics methods that they used for fuel management, operations, and safety, additional work in code development and refinement is under way, and to increase accuracy, there is a need for a corresponding increase in validation. Both organizations thought that there was a need to promulgate benchmarks based on measured data to supplement the LWR computational benchmarks that have been published in the past. By having an organized benchmark activity, the participants also gain by being able to discuss their problems and achievements with others traveling the same route

  13. Murine pluripotent stem cells derived scaffold-free cartilage grafts from a micro-cavitary hydrogel platform.

    Science.gov (United States)

    He, Pengfei; Fu, Jiayin; Wang, Dong-An

    2016-04-15

    By means of an appropriate cell type and scaffold, tissue-engineering approaches aim to construct grafts for cartilage repair. Pluripotent stem cells, especially induced pluripotent stem cells (iPSCs), are promising cell candidates due to their pluripotent plasticity and abundant cell source. We explored three-dimensional (3D) culture and chondrogenesis of murine iPSCs (miPSCs) on an alginate-based micro-cavity hydrogel (MCG) platform in pursuit of fabricating synthetic-scaffold-free cartilage grafts. Murine embryonic stem cells (mESCs) were employed in parallel as the control. Chondrogenesis was fulfilled using a consecutive protocol via mesoderm differentiation followed by chondrogenic differentiation; subsequently, miPSC- and mESC-seeded constructs were further respectively cultured in chondrocyte culture (CC) medium. The alginate phase in the constructs was then removed to generate a graft comprised only of induced chondrocytic cells and cartilaginous extracellular matrix (ECM). We found that from the mESC-seeded constructs, intact grafts could be formed in greater sizes with relatively fewer chondrocytic cells and abundant ECM; from miPSC-seeded constructs, relatively smaller cartilaginous grafts could be formed by cells with a chondrocytic phenotype wrapped by abundant and better-assembled collagen type II. This study demonstrated the successful creation of a pluripotent stem cell-derived cartilage/chondroid graft from a 3D MCG interim platform. With the support of the materials and methodologies established in this study, and particularly given the autologous availability of iPSCs, engineered autologous cartilage engraftment may potentially be fulfilled without relying on the limited and invasive acquisition of autologous chondrocytes. In this study, we explored chondrogenic differentiation of pluripotent stem cells on a 3D micro-cavitary hydrogel interim platform and the creation of a pluripotent stem cell-derived cartilage/chondroid graft via a consecutive...

  14. Benchmarking NWP Kernels on Multi- and Many-core Processors

    Science.gov (United States)

    Michalakes, J.; Vachharajani, M.

    2008-12-01

    Increased computing power for weather, climate, and atmospheric science has provided direct benefits for defense, agriculture, the economy, the environment, and public welfare and convenience. Today, very large clusters with many thousands of processors are allowing scientists to move forward with simulations of unprecedented size. But time-critical applications such as real-time forecasting or climate prediction need strong scaling: faster nodes and processors, not more of them. Moreover, the need for good cost-performance has never been greater, both in terms of performance per watt and per dollar. For these reasons, the new generations of multi- and many-core processors being mass produced for commercial IT and "graphical computing" (video games) are being scrutinized for their ability to exploit the abundant fine-grain parallelism in atmospheric models. We present results of our work to date identifying key computational kernels within the dynamics and physics of a large community NWP model, the Weather Research and Forecast (WRF) model. We benchmark and optimize these kernels on several different multi- and many-core processors. The goals are to (1) characterize and model performance of the kernels in terms of computational intensity, data parallelism, memory bandwidth pressure, memory footprint, etc. (2) enumerate and classify effective strategies for coding and optimizing for these new processors, (3) assess difficulties and opportunities for tool or higher-level language support, and (4) establish a continuing set of kernel benchmarks that can be used to measure and compare effectiveness of current and future designs of multi- and many-core processors for weather and climate applications.
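    The kind of kernel characterization named in goal (1) can be sketched with a small timing harness. Below is a minimal illustration, assuming hand-counted flop and byte totals, with a toy saxpy loop standing in for an actual WRF kernel; none of the names or numbers come from the record.

    import time

    def characterize(kernel, args, flops, bytes_moved, repeats=5):
        """Time a kernel; report GFLOP/s and arithmetic intensity (flops/byte).
        The flop and byte counts must be supplied by hand for the kernel."""
        best = float("inf")
        for _ in range(repeats):                 # best-of-N damps OS noise
            t0 = time.perf_counter()
            kernel(*args)
            best = min(best, time.perf_counter() - t0)
        return flops / best / 1e9, flops / bytes_moved

    # Hypothetical stand-in for an NWP physics kernel: saxpy over n points.
    n = 1_000_000
    x, y = [1.0] * n, [2.0] * n
    kernel = lambda a, x, y: [a * xi + yi for xi, yi in zip(x, y)]
    gflops, intensity = characterize(kernel, (2.0, x, y),
                                     flops=2 * n, bytes_moved=3 * 8 * n)
    print(f"{gflops:.3f} GFLOP/s, intensity {intensity:.3f} flops/byte")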

  15. MTCB: A Multi-Tenant Customizable database Benchmark

    NARCIS (Netherlands)

    van der Zijden, WIm; Hiemstra, Djoerd; van Keulen, Maurice

    2017-01-01

    We argue that there is a need for Multi-Tenant Customizable OLTP systems. Such systems need a Multi-Tenant Customizable Database (MTC-DB) as a backing. To stimulate the development of such databases, we propose the benchmark MTCB. Benchmarks for OLTP exist and multi-tenant benchmarks exist, but no...

  16. Multiobjective anatomy-based dose optimization for HDR-brachytherapy with constraint free deterministic algorithms

    International Nuclear Information System (INIS)

    Milickovic, N.; Lahanas, M.; Papagiannopoulou, M.; Zamboglou, N.; Baltas, D.

    2002-01-01

    In high dose rate (HDR) brachytherapy, conventional dose optimization algorithms consider multiple objectives in the form of an aggregate function that transforms the multiobjective problem into a single-objective problem. As a result, there is a loss of information on the available alternative possible solutions. This method assumes that the treatment planner exactly understands the correlation between competing objectives and knows the physical constraints. This knowledge is provided by the Pareto trade-off set, obtained by repeatedly running single-objective optimization algorithms with different importance vectors. A mapping technique avoids non-feasible solutions with negative dwell weights and allows the use of constraint-free gradient-based deterministic algorithms. We compare various such algorithms and methods which could improve their performance. This finally allows us to generate a large number of solutions in a few minutes. We use objectives expressed in terms of dose variances obtained from a few hundred sampling points in the planning target volume (PTV) and in organs at risk (OAR). We compare two- to four-dimensional Pareto fronts obtained with the deterministic algorithms and with a fast-simulated annealing algorithm. For PTV-based objectives, due to the convex objective functions, the obtained solutions are globally optimal. If OARs are included, then the solutions found are also globally optimal, although local minima may be present, as suggested. (author)
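    The repeated weighted-sum optimization that traces out a Pareto front is easy to illustrate on toy objectives. Below is a minimal sketch with two convex quadratics standing in for PTV and OAR dose variances; the closed-form minimizer is specific to these toy functions, not to the clinical objectives.

    import numpy as np

    def f1(x):  # toy surrogate for a "PTV variance" objective
        return (x - 1.0) ** 2

    def f2(x):  # toy surrogate for an "OAR variance" objective
        return (x + 1.0) ** 2

    front = []
    for w in np.linspace(0.01, 0.99, 9):   # importance vectors (w, 1-w)
        x_star = 2.0 * w - 1.0             # minimizer of w*f1 + (1-w)*f2
        front.append((f1(x_star), f2(x_star)))

    for point in front:
        print(point)   # sweeping w traces out the Pareto trade-off curve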

  17. Internal Benchmarking for Institutional Effectiveness

    Science.gov (United States)

    Ronco, Sharron L.

    2012-01-01

    Internal benchmarking is an established practice in business and industry for identifying best in-house practices and disseminating the knowledge about those practices to other groups in the organization. Internal benchmarking can be done with structures, processes, outcomes, or even individuals. In colleges or universities with multiple campuses or a…

  18. Benchmarking the Netherlands. Benchmarking for growth

    International Nuclear Information System (INIS)

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity growth. Throughout...

  19. Benchmarking the Netherlands. Benchmarking for growth

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    This is the fourth edition of the Ministry of Economic Affairs' publication 'Benchmarking the Netherlands', which aims to assess the competitiveness of the Dutch economy. The methodology and objective of the benchmarking remain the same. The basic conditions for economic activity (institutions, regulation, etc.) in a number of benchmark countries are compared in order to learn from the solutions found by other countries for common economic problems. This publication is devoted entirely to the potential output of the Dutch economy. In other words, its ability to achieve sustainable growth and create work over a longer period without capacity becoming an obstacle. This is important because economic growth is needed to increase prosperity in the broad sense and to meet social needs. Prosperity in both a material (per capita GDP) and immaterial (living environment, environment, health, etc) sense, in other words. The economy's potential output is determined by two structural factors: the growth of potential employment and the structural increase in labour productivity. Analysis by the Netherlands Bureau for Economic Policy Analysis (CPB) shows that in recent years the increase in the capacity for economic growth has been realised mainly by increasing the supply of labour and reducing the equilibrium unemployment rate. In view of the ageing of the population in the coming years and decades the supply of labour is unlikely to continue growing at the pace we have become accustomed to in recent years. According to a number of recent studies, to achieve a respectable rate of sustainable economic growth the aim will therefore have to be to increase labour productivity. To realise this we have to focus on six pillars of economic policy: (1) human capital, (2) functioning of markets, (3) entrepreneurship, (4) spatial planning, (5) innovation, and (6) sustainability. These six pillars determine the course for economic policy aiming at higher productivity...

  20. Effects of exogenous oxygen derived free radicals on myocardial capillary permeability, vascular tone, and incidence of ventricular arrhythmias in the canine heart

    DEFF Research Database (Denmark)

    Svendsen, Jesper Hastrup; Bjerrum, P J

    1992-01-01

    The aim was to examine the effects of exogenous oxygen derived free radicals on myocardial capillary permeability for a small hydrophilic indicator, postischaemic vascular tone, and the occurrence of arrhythmias in the canine heart in vivo.

  1. Validation of the AZTRAN 1.1 code with problems Benchmark of LWR reactors; Validacion del codigo AZTRAN 1.1 con problemas Benchmark de reactores LWR

    Energy Technology Data Exchange (ETDEWEB)

    Vallejo Q, J. A.; Bastida O, G. E.; Francois L, J. L. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Ciudad Universitaria, 04510 Ciudad de Mexico (Mexico); Xolocostli M, J. V.; Gomez T, A. M., E-mail: amhed.jvq@gmail.com [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    The AZTRAN module is a computational program, part of the AZTLAN platform (the Mexican modeling platform for the analysis and design of nuclear reactors), that solves the neutron transport equation in three dimensions using the discrete ordinates method S_N, in steady state and Cartesian geometry. As part of the activities of Working Group 4 (users group) of the AZTLAN project, this work validates the AZTRAN code using the 2002 Yamamoto benchmark for LWR reactors. For comparison, the commercial code CASMO-4 and the free code Serpent-2 are used; in addition, the results are compared with the data obtained from an article of the PHYSOR 2002 conference. The benchmark consists of a fuel pin, two UO2 cells and two MOX cells, with a problem for each cell for each reactor type, PWR and BWR. Although the AZTRAN code is at an early stage of development, the results obtained are encouraging and close to those reported with other internationally accepted codes and methodologies. (Author)

  2. Optimal allocation of energy storage in a co-optimized electricity market: Benefits assessment and deriving indicators for economic storage ventures

    International Nuclear Information System (INIS)

    Krishnan, Venkat; Das, Trishna

    2015-01-01

    This paper presents a framework for optimally allocating storage technologies in a power system. This decision support tool helps in quantitatively answering the questions of “where to and how much to install”, considering the profits from arbitrage opportunities in a co-optimized electricity market. The developed framework is illustrated on a modified IEEE (Institute of Electrical and Electronics Engineers) 24-bus RTS (Reliability Test System), and the framework finds the optimal allocation solution and the revenues storage earns at each of these locations. Bulk energy storage, CAES (compressed air energy storage), is used as the representative storage technology, and the benefits of optimally allocated storage integration onto the grid are compared with a transmission expansion solution. The paper also discusses system-level indicators to identify candidate locations for economical storage ventures, which are derived based on the optimal storage allocation solution; and applies the market-price-based storage venture indicators to MISO (Mid-continental Independent System Operator) and PJM (Pennsylvania-New Jersey-Maryland Interconnection) electricity markets. - Highlights: • Storage optimal allocation framework based on high-fidelity storage dispatch model. • Storage with transmission addresses energy and ancillary issues under high renewables. • Bulk storage earns higher revenues from co-optimization (∼10× energy only market). • Grid offers distributed opportunities for investing in a strategic mix of storage. • Storage opportunities depend on cross-arbitrage, as seen from MISO (Mid-continental Independent System Operator) and PJM (Pennsylvania-New Jersey-Maryland Interconnection) markets
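    The arbitrage component of such a dispatch model can be sketched as a small linear program. Below is a minimal illustration with scipy, assuming a single price-taking storage unit with perfect round-trip efficiency and hypothetical prices; the co-optimized ancillary-service layers and network constraints the paper models are omitted.

    import numpy as np
    from scipy.optimize import linprog

    prices = np.array([20.0, 15.0, 40.0, 55.0])   # $/MWh over 4 hours (toy data)
    T, p_max, e_max = len(prices), 1.0, 2.0       # MW limit, MWh capacity

    # Decision variables z = [charge_1..T, discharge_1..T]; profit = p.(d - c)
    cost = np.concatenate([prices, -prices])      # linprog minimizes -profit
    L = np.tril(np.ones((T, T)))                  # cumulative-sum operator
    A_ub = np.block([[L, -L], [-L, L]])           # enforces 0 <= SoC_t <= e_max
    b_ub = np.concatenate([e_max * np.ones(T), np.zeros(T)])

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, p_max)] * (2 * T))
    charge, discharge = res.x[:T], res.x[T:]
    print("profit $", -res.fun)                   # buy low (h1, h2), sell high (h3, h4)
    print("charge", charge.round(2), "discharge", discharge.round(2))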

  3. Empirical optimization of DFT+U and HSE for the band structure of ZnO

    Science.gov (United States)

    Bashyal, Keshab; Pyles, Christopher K.; Afroosheh, Sajjad; Lamichhane, Aneer; Zayak, Alexey T.

    2018-02-01

    ZnO is a well-known wide band gap semiconductor with promising potential for applications in optoelectronics, transparent electronics, and spintronics. Computational simulations based on density functional theory (DFT) play an important role in the research of ZnO, but the standard functionals, like Perdew-Burke-Ernzerhof, result in largely underestimated values of the band gap and the binding energies of the Zn 3d electrons. Methods like DFT+U and hybrid functionals are meant to remedy the weaknesses of plain DFT. However, neither method is parameter-free. Direct comparison with experimental data is the best way to optimize the computational parameters. X-ray photoemission spectroscopy (XPS) is commonly considered a benchmark for the computed electronic densities of states. In this work, both DFT+U and HSE methods were parametrized to fit almost exactly the binding energies of electrons in ZnO obtained by XPS. The optimized parameterizations of DFT+U and HSE lead to significantly worse results in reproducing the ion-clamped static dielectric tensor, compared to standard high-level calculations, including GW, which in turn yield a perfect match for the dielectric tensor. The failure of our XPS-based optimization reveals the fact that XPS does not report the ground state electronic structure for ZnO and should not be used for benchmarking ground state electronic structure calculations.

  4. Benchmark simulation models, quo vadis?

    DEFF Research Database (Denmark)

    Jeppsson, U.; Alex, J; Batstone, D. J.

    2013-01-01

    As the work of the IWA Task Group on Benchmarking of Control Strategies for wastewater treatment plants (WWTPs) is coming to an end, it is essential to disseminate the knowledge gained. For this reason, all authors of the IWA Scientific and Technical Report on benchmarking have come together to p...

  5. Benchmark for Strategic Performance Improvement.

    Science.gov (United States)

    Gohlke, Annette

    1997-01-01

    Explains benchmarking, a total quality management tool used to measure and compare the work processes in a library with those in other libraries to increase library performance. Topics include the main groups of upper management, clients, and staff; critical success factors for each group; and benefits of benchmarking. (Author/LRW)

  6. Establishing benchmarks and metrics for utilization management.

    Science.gov (United States)

    Melanson, Stacy E F

    2014-01-01

    The changing environment of healthcare reimbursement is rapidly leading to a renewed appreciation of the importance of utilization management in the clinical laboratory. The process of benchmarking laboratory operations is well established for comparing organizational performance to that of other hospitals (peers) and for trending data over time through internal benchmarks. However, there are relatively few resources available to assist organizations in benchmarking for laboratory utilization management. This article will review the topic of laboratory benchmarking with a focus on the available literature and services to assist in managing physician requests for laboratory testing. © 2013.

  7. Derivation and characterization of the NIH registry human stem cell line NYSCF100 line under defined feeder-free conditions

    Directory of Open Access Journals (Sweden)

    Ana Sevilla

    2018-05-01

    Full Text Available The human embryonic stem cell line NYSCFe001-A was derived from a day-6 blastocyst under feeder-free and antibiotic-free conditions. The blastocyst was voluntarily donated for research as surplus after in vitro fertilization treatment, following informed consent. The NYSCFe001-A line, registered as NYSCF100 on the NIH registry, presents a normal karyotype, is mycoplasma-free, expresses all the pluripotency markers and has the potential to differentiate into all three germ layers in vitro.

  8. How Benchmarking and Higher Education Came Together

    Science.gov (United States)

    Levy, Gary D.; Ronco, Sharron L.

    2012-01-01

    This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…

  9. Benchmark for Evaluating Moving Object Indexes

    DEFF Research Database (Denmark)

    Chen, Su; Jensen, Christian Søndergaard; Lin, Dan

    2008-01-01

    Progress in science and engineering relies on the ability to measure, reliably and in detail, pertinent properties of artifacts under design. Progress in the area of database-index design thus relies on empirical studies based on prototype implementations of indexes. This paper proposes a benchmark that targets techniques for the indexing of the current and near-future positions of moving objects. This benchmark enables the comparison of existing and future indexing techniques. It covers important aspects of such indexes that have not previously been covered by any benchmark. Notable aspects covered include update efficiency, query efficiency, concurrency control, and storage requirements. Next, the paper applies the benchmark to half a dozen notable moving-object indexes, thus demonstrating the viability of the benchmark and offering new insight into the performance properties of the indexes.

  10. Benchmarking infrastructure for mutation text mining.

    Science.gov (United States)

    Klein, Artjom; Riazanov, Alexandre; Hindle, Matthew M; Baker, Christopher Jo

    2014-02-25

    Experimental research on the automatic extraction of information about mutations from texts is greatly hindered by the lack of consensus evaluation infrastructure for the testing and benchmarking of mutation text mining systems. We propose a community-oriented annotation and benchmarking infrastructure to support development, testing, benchmarking, and comparison of mutation text mining systems. The design is based on semantic standards, where RDF is used to represent annotations, an OWL ontology provides an extensible schema for the data and SPARQL is used to compute various performance metrics, so that in many cases no programming is needed to analyze results from a text mining system. While large benchmark corpora for biological entity and relation extraction are focused mostly on genes, proteins, diseases, and species, our benchmarking infrastructure fills the gap for mutation information. The core infrastructure comprises (1) an ontology for modelling annotations, (2) SPARQL queries for computing performance metrics, and (3) a sizeable collection of manually curated documents that can support mutation grounding and mutation impact extraction experiments. We have developed the principal infrastructure for the benchmarking of mutation text mining tasks. The use of RDF and OWL as the representation for corpora ensures extensibility. The infrastructure is suitable for out-of-the-box use in several important scenarios and is ready, in its current state, for initial community adoption.

  12. Design of proportional-integral-derivative type optimal controller for a nuclear reactor

    International Nuclear Information System (INIS)

    Pal, Jayanta

    1976-01-01

    A theoretical approach to the design of a proportional-integral-derivative (PID) type optimal controller for a nuclear reactor is considered. A linearized version of the state-space model of a nuclear reactor plant is investigated, which shows a very 'sluggish' response (settling time of the order of 600 seconds) to changes in the power demand and frequency. It is shown that with a judicious choice of state variables a PID type optimal controller realisation is possible. A controller is designed to minimise the effects of (a) a sudden increase or decrease in the electrical power demand and (b) a change in frequency at the grid. The above controller, designed for a tracking problem, reduces the steady-state error (in response to a step input) to zero, and the dynamics of the system become 'faster' (settling time of the order of 100 seconds). The controller is also insensitive to changes in system parameters. The superiority in the performance of the system with the optimal PID controller as compared with that of the conventional regulator is conclusively established. (author)
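
    As an illustration of the control law this record discusses, the sketch below simulates a discrete-time PID loop driving a generic first-order plant to a step change in demand. The plant model, gains, and time constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

def simulate_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.1, steps=600):
    """Discrete PID loop driving an assumed first-order plant to a step setpoint."""
    tau = 5.0                 # plant time constant (illustrative assumption)
    setpoint = 1.0            # step change in power demand
    y, integral, prev_err = 0.0, 0.0, 0.0
    history = []
    for _ in range(steps):
        err = setpoint - y
        integral += err * dt                   # I term removes steady-state error
        derivative = (err - prev_err) / dt     # D term damps the response
        u = kp * err + ki * integral + kd * derivative
        y += dt * (-y + u) / tau               # explicit Euler step of dy/dt = (u - y)/tau
        prev_err = err
        history.append(y)
    return np.array(history)

response = simulate_pid()
print(f"final output: {response[-1]:.4f}")     # settles near the setpoint 1.0
```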

  13. Benchmarking: A Process for Improvement.

    Science.gov (United States)

    Peischl, Thomas M.

    One problem with the outcome-based measures used in higher education is that they measure quantity but not quality. Benchmarking, or the use of some external standard of quality to measure tasks, processes, and outputs, partially solves that difficulty. Benchmarking allows for the establishment of a systematic process to indicate if outputs…

  14. A Gradient-Based Multistart Algorithm for Multimodal Aerodynamic Shape Optimization Problems Based on Free-Form Deformation

    Science.gov (United States)

    Streuber, Gregg Mitchell

    Environmental and economic factors motivate the pursuit of more fuel-efficient aircraft designs. Aerodynamic shape optimization is a powerful tool in this effort, but is hampered by the presence of multimodality in many design spaces. Gradient-based multistart optimization uses a sampling algorithm and multiple parallel optimizations to reliably apply fast gradient-based optimization to moderately multimodal problems. Ensuring that the sampled geometries remain physically realizable requires manually developing specialized linear constraints for each class of problem. Utilizing free-form deformation geometry control allows these linear constraints to be written in a geometry-independent fashion, greatly easing the process of applying the algorithm to new problems. This algorithm was used to assess the presence of multimodality when optimizing a wing in subsonic and transonic flows, under inviscid and viscous conditions, and a blended wing-body under transonic, viscous conditions. Multimodality was present in every wing case, while the blended wing-body was found to be generally unimodal.
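
    The gradient-based multistart idea described above can be sketched with an off-the-shelf local optimizer. In the hedged example below, the Rastrigin function stands in for the far more expensive aerodynamic objective and uniform random sampling replaces the paper's sampling algorithm; counting the distinct local minima found gives a crude multimodality assessment.

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    # Multimodal stand-in for the expensive aerodynamic objective
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

rng = np.random.default_rng(0)
bounds = [(-5.12, 5.12)] * 2
starts = rng.uniform(-5.12, 5.12, size=(20, 2))    # sampled starting "geometries"

# One fast gradient-based local optimization per start, run independently
results = [minimize(rastrigin, x0, method="L-BFGS-B", bounds=bounds) for x0 in starts]
best = min(results, key=lambda r: r.fun)
minima = {tuple(np.round(r.x, 3)) for r in results}

print(f"best f = {best.fun:.4f} at x = {np.round(best.x, 3)}")
print(f"{len(minima)} distinct local minima found -> evidence of multimodality")
```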

  15. A Local and Global Search Combined Particle Swarm Optimization Algorithm and Its Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Weitian Lin

    2014-01-01

    Full Text Available The particle swarm optimization algorithm (PSOA) is a powerful optimization tool. However, it has a tendency to get stuck in near-optimal solutions, especially for medium- and large-size problems, and it is difficult to improve solution accuracy by fine-tuning parameters. To address this shortcoming, this paper studies the local and global search combined particle swarm optimization algorithm (LGSCPSOA), analyzes its convergence, and obtains its convergence qualification. The algorithm is tested on a set of 8 continuous benchmark functions, and its optimization results are compared with those of the original particle swarm optimization algorithm (OPSOA). Experimental results indicate that LGSCPSOA significantly improves search performance, especially on the medium- and large-size benchmark functions.
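
    The abstract does not specify how LGSCPSOA combines its local and global searches, so the sketch below shows only the baseline global-best PSO loop (the OPSOA reference point) on the sphere benchmark; all parameter values are conventional defaults, not the paper's.

```python
import numpy as np

def sphere(x):
    return np.sum(x**2, axis=-1)

def pso(f, dim=10, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))     # positions
    v = np.zeros_like(x)                           # velocities
    pbest, pbest_f = x.copy(), f(x)                # personal bests
    g = pbest[np.argmin(pbest_f)].copy()           # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = f(x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, f(g)

best_x, best_f = pso(sphere)
print(f"best f = {best_f:.3e}")
```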

  16. Scavenging of free-radical metabolites of aniline xenobiotics and drugs by amino acid derivatives: toxicological implications of radical-transfer reactions.

    Science.gov (United States)

    Michail, Karim; Baghdasarian, Argishti; Narwaley, Malyaj; Aljuhani, Naif; Siraki, Arno G

    2013-12-16

    We investigated a novel scavenging mechanism of arylamine free radicals by poly- and monoaminocarboxylates. Free radicals of arylamine xenobiotics and drugs did not react with oxygen in peroxidase-catalyzed reactions; however, they showed marked oxygen uptake in the presence of an aminocarboxylate. These free-radical intermediates were identified using the spin trap 5,5-dimethyl-1-pyrroline-N-oxide (DMPO) and electron paramagnetic resonance (EPR) spectrometry. Diethylenetriaminepentaacetic acid (DTPA), a polyaminocarboxylate, caused a concentration-dependent attenuation of N-centered radicals produced by the peroxidative metabolism of arylamines with the subsequent formation of secondary aliphatic carbon-centered radicals stemming from the cosubstrate molecule. Analogously, N,N-dimethylglycine (DMG) and N-methyliminodiacetate (MIDA), but not iminodiacetic acid (IDA), demonstrated a similar scavenging effect on arylamine-derived free radicals in a horseradish peroxidase/H2O2 system. Using human promyelocytic leukemia (HL-60) cell lysate as a model of human neutrophils, DTPA, MIDA, and DMG readily reduced anilinium cation radicals derived from the arylamines and gave rise to the corresponding carbon radicals. The rate of peroxidase-triggered polymerization of aniline was studied as a measure of nitrogen-radical scavenging. Although IDA had no effect on the rate of aniline polymerization, this was almost nullified in the presence of DTPA and MIDA at half of the molar concentration of the aniline substrate, whereas a 20 molar excess of DMPO caused only a partial inhibition. Furthermore, the yield of formaldehyde, a specific reaction end product of the oxidation of aminocarboxylates by aniline free-radical metabolites, was quantitatively determined. Azobenzene, a specific reaction product of peroxidase-catalyzed free-radical dimerization of aniline, was fully abrogated in the presence of DTPA, as confirmed by GC/MS. Under aerobic conditions, a radical-transfer reaction

  17. Optimization Based Clearance of Flight Control Laws A Civil Aircraft Application

    CERN Document Server

    Hansson, Anders; Puyou, Guilhem

    2012-01-01

    This book summarizes the main achievements of the EC funded 6th Framework Program project COFCLUO – Clearance of Flight Control Laws Using Optimization. This project successfully contributed to the achievement of a top-level objective to meet society’s needs for a more efficient, safer and environmentally friendly air transport by providing new techniques and tools for the clearance of flight control laws. This is an important part of the certification and qualification process of an aircraft – a costly and time-consuming process for the aeronautical industry.   The overall objective of the COFCLUO project was to develop and apply optimization techniques to the clearance of flight control laws in order to improve efficiency and reliability. In the book, the new techniques are explained and benchmarked against traditional techniques currently used by the industry. The new techniques build on mathematical criteria derived from the certification and qualification requirements together with suitable models...

  18. Hospital benchmarking: are U.S. eye hospitals ready?

    Science.gov (United States)

    de Korne, Dirk F; van Wijngaarden, Jeroen D H; Sol, Kees J C A; Betz, Robert; Thomas, Richard C; Schein, Oliver D; Klazinga, Niek S

    2012-01-01

    Benchmarking is increasingly considered a useful management instrument to improve quality in health care, but little is known about its applicability in hospital settings. The aims of this study were to assess the applicability of a benchmarking project in U.S. eye hospitals and compare the results with an international initiative. We evaluated multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis entailed 46 semistructured face-to-face interviews with stakeholders, document analyses, and questionnaires. The case studies only partially met the conditions of the evaluation frame. Although learning and quality improvement were stated as overall purposes, the benchmarking initiative was at first focused on efficiency only. No ophthalmic outcomes were included, and clinicians were skeptical about their reporting relevance and disclosure. However, in contrast with earlier findings in international eye hospitals, all U.S. hospitals worked with internal indicators that were integrated in their performance management systems and supported benchmarking. Benchmarking can support performance management in individual hospitals. Having a certain number of comparable institutes provide similar services in a noncompetitive milieu seems to lay fertile ground for benchmarking. International benchmarking is useful only when these conditions are not met nationally. Although the literature focuses on static conditions for effective benchmarking, our case studies show that it is a highly iterative and learning process. The journey of benchmarking seems to be more important than the destination. Improving patient value (health outcomes per unit of cost) requires, however, an integrative perspective where clinicians and administrators closely cooperate on both quality and efficiency issues. If these worlds do not share such a relationship, the added

  19. Theoretical insights on the antioxidant activity of edaravone free radical scavengers derivatives

    Science.gov (United States)

    Cerón-Carrasco, José P.; Roy, Hélène M.; Cerezo, Javier; Jacquemin, Denis; Laurent, Adèle D.

    2014-04-01

    The prediction of antioxidant properties is not straightforward due to the complexity of the in vivo systems. Here, we use theoretical descriptors, including the ionization potential, the electrodonating power and the spin density distribution, to characterize the antioxidant capacity of edaravone (EDV) derivatives. Our computations reveal the relationship between these parameters and their potential bioactivity as free radical scavengers. We conclude that more efficient antioxidants could be synthesized by tuning the R1 and R2 positions of the EDV structure, rather than modifying the R3 group. Such modifications might improve the antioxidant activity in neutral and deprotonated forms.

  20. Laplace-Fourier-domain dispersion analysis of an average derivative optimal scheme for scalar-wave equation

    Science.gov (United States)

    Chen, Jing-Bo

    2014-06-01

    By using low-frequency components of the damped wavefield, Laplace-Fourier-domain full waveform inversion (FWI) can recover a long-wavelength velocity model from the original undamped seismic data lacking low-frequency information. Laplace-Fourier-domain modelling is an important foundation of Laplace-Fourier-domain FWI. Based on the numerical phase velocity and the numerical attenuation propagation velocity, a method for performing Laplace-Fourier-domain numerical dispersion analysis is developed in this paper. This method is applied to an average-derivative optimal scheme. The results show that within the relative error of 1 per cent, the Laplace-Fourier-domain average-derivative optimal scheme requires seven gridpoints per smallest wavelength and smallest pseudo-wavelength for both equal and unequal directional sampling intervals. In contrast, the classical five-point scheme requires 23 gridpoints per smallest wavelength and smallest pseudo-wavelength to achieve the same accuracy. Numerical experiments demonstrate the theoretical analysis.

  1. Virtual machine performance benchmarking.

    Science.gov (United States)

    Langer, Steve G; French, Todd

    2011-10-01

    The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever-tightening reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
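
    A minimal version of such a physical-versus-virtual comparison can be scripted by timing the same workloads on both platforms. The probes below are illustrative stand-ins for the memory, disk, and arithmetic metrics the record lists; they are not the authors' benchmark suite.

```python
import os
import time
import numpy as np

def bench(label, fn, repeats=5):
    """Best-of-N wall-clock timing; run the same script on bare metal and in the VM."""
    times = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        fn()
        times.append(time.perf_counter() - t0)
    print(f"{label:>14}: {min(times) * 1e3:8.1f} ms")

a = np.random.rand(2_000_000)
b = np.random.rand(2_000_000)

bench("memory copy", lambda: a.copy())                             # local memory bandwidth
bench("float math", lambda: np.sqrt(a) * b + a)                    # floating point throughput
bench("integer math", lambda: sum(i * i for i in range(500_000)))  # interpreter integer loop

def disk_write():
    # Buffered write probe; add os.fsync(fh.fileno()) to force physical I/O
    with open("scratch.bin", "wb") as fh:
        fh.write(a.tobytes())

bench("disk write", disk_write)
os.remove("scratch.bin")
```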

  2. The fifth Atomic Energy Research dynamic benchmark calculation with HEXTRAN-SMABRE

    International Nuclear Information System (INIS)

    Haenaelaeinen, Anitta

    1998-01-01

    The fifth Atomic Energy Research dynamic benchmark is the first Atomic Energy Research benchmark for coupling of thermohydraulic codes with three-dimensional reactor dynamic core models. At VTT, HEXTRAN 2.7 is used for the core dynamics and SMABRE 4.6 as the thermohydraulic model for the primary and secondary loops. The plant model for SMABRE is based mainly on two input models: the Loviisa model and a standard WWER-440/213 plant model. The primary circuit includes six separate loops, in total 505 nodes and 652 junctions. The reactor pressure vessel is divided into six parallel channels. In the HEXTRAN calculation, 1/6 symmetry is used in the core. In the sequence of a main steam header break at the hot standby state, the liquid temperature decreases symmetrically at the core inlet, which leads to a return to power. In the benchmark, no isolations of the steam generators are assumed, and the maximum core power in the HEXTRAN-SMABRE calculation is about 38% of the nominal power at four minutes after the break opening. Due to boric acid in the high pressure safety injection water, the power finally starts to decrease. The break flow is pure steam in the HEXTRAN-SMABRE calculation during the whole transient, even though the swell levels in the steam generators are very high due to flashing. Because of sudden peaks in the preliminary results for the steam generator heat transfer, the SMABRE drift-flux model was modified. The new model is a simplified version of the EPRI correlation based on test data. The modified correlation behaves smoothly. In the calculations, nuclear data are based on the ENDF/B-IV library and have been evaluated with the CASMO-HEX code. The importance of the nuclear data was illustrated by repeating the benchmark calculation with three different data sets. Optimal extensive data valid from hot to cold conditions were not available for all types of fuel enrichments needed in this benchmark. (Author)

  3. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.

  4. Optimization of free radical scavenging capacity and pH of Hylocereus polyrhizus peel by Response Surface Methodology

    Science.gov (United States)

    Putranto, A. W.; Dewi, S. R.; Puspitasari, Y.; Nuriah, F. A.

    2018-03-01

    Red dragon fruit (Hylocereus polyrhizus) peel, a by-product of juice processing, contains a high antioxidant that can be used for nutraceuticals. Hence, it is important to extract and investigate its antioxidant stability. The aim of this study was to optimize the free radical scavenging capacity and pH of H. polyrhizus peel extract using Central Composite Design (CCD) under Response Surface Methodology (RSM). The extraction of H. polyrhizus peel was done by using green-Pulsed Electric Field (PEF)-assisted extraction method. Factors optimized were electric field strength (kV/cm) and extraction time (seconds). The result showed that the correlation between responses (free radical-scavenging capacity and pH) and two factors was quadratic model. The optimum conditions was obtained at the electric field strength of 3.96 kV/cm, and treatment time of 31.9 seconds. Under these conditions, the actual free radical-scavenging capacity and pH were 75.86 ± 0.2 % and 4.8, respectively. The verification model showed that the actual values are in accordance with the predicted values, and have error rate values of free radical-scavenging capacity and pH responses were 0.1% and 3.98%, respectively. We suggest to extract the H. polyrhizus peel using a green and non-thermal extraction technology, PEF-assisted extraction, for research, food applications and nutraceuticals industry.

  5. Radiation Detection Computational Benchmark Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Shaver, Mark W.; Casella, Andrew M.; Wittman, Richard S.; McDonald, Ben S.

    2013-09-24

    Modeling forms an important component of radiation detection development, allowing for testing of new detector designs, evaluation of existing equipment against a wide variety of potential threat sources, and assessing operation performance of radiation detection systems. This can, however, result in large and complex scenarios which are time consuming to model. A variety of approaches to radiation transport modeling exist with complementary strengths and weaknesses for different problems. This variety of approaches, and the development of promising new tools (such as ORNL’s ADVANTG) which combine benefits of multiple approaches, illustrates the need for a means of evaluating or comparing different techniques for radiation detection problems. This report presents a set of 9 benchmark problems for comparing different types of radiation transport calculations, identifying appropriate tools for classes of problems, and testing and guiding the development of new methods. The benchmarks were drawn primarily from existing or previous calculations with a preference for scenarios which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22. From a technical perspective, the benchmarks were chosen to span a range of difficulty and to include gamma transport, neutron transport, or both and represent different important physical processes and a range of sensitivity to angular or energy fidelity. Following benchmark identification, existing information about geometry, measurements, and previous calculations were assembled. Monte Carlo results (MCNP decks) were reviewed or created and re-run in order to attain accurate computational times and to verify agreement with experimental data, when present. Benchmark information was then conveyed to ORNL in order to guide testing and development of hybrid calculations. The results of those ADVANTG calculations were then sent to PNNL for

  6. Iterative free-energy optimization for recurrent neural networks (INFERNO)

    Science.gov (United States)

    2017-01-01

    The intra-parietal lobe coupled with the Basal Ganglia forms a working memory that demonstrates strong planning capabilities for generating robust yet flexible neuronal sequences. Neurocomputational models, however, often fail to control long-range neural synchrony in recurrent spiking networks due to spontaneous activity. As a novel framework based on the free-energy principle, we propose to see the problem of spike synchrony as an optimization problem over the neurons' sub-threshold activity for the generation of long neuronal chains. Using a stochastic gradient descent, a reinforcement signal (presumably dopaminergic) evaluates the quality of one input vector to move the recurrent neural network to a desired activity; depending on the error made, this input vector is strengthened to hill-climb the gradient or elicited to search for another solution. This vector can then be learned by an associative memory, as a model of the basal ganglia, to control the recurrent neural network. Experiments on habit learning and on sequence retrieving demonstrate the capabilities of the dual system to generate very long and precise spatio-temporal sequences, of above two hundred iterations. Its features are then applied to the sequential planning of arm movements. In line with neurobiological theories, we discuss its relevance for modeling the cortico-basal working memory that initiates flexible goal-directed neuronal chains of causation, and its relation to novel architectures such as Deep Networks, Neural Turing Machines and the Free-Energy Principle. PMID:28282439

  7. WWER-1000 Burnup Credit Benchmark (CB5)

    International Nuclear Information System (INIS)

    Manolova, M.A.

    2002-01-01

    In the paper, the specification of the first phase of the WWER-1000 Burnup Credit Benchmark (depletion calculations) is given. The second phase, criticality calculations for the WWER-1000 fuel pin cell, will be given after the evaluation of the results obtained in the first phase. The proposed benchmark is a continuation of the WWER benchmark activities in this field. (Author)

  8. The role of benchmarking for yardstick competition

    International Nuclear Information System (INIS)

    Burns, Phil; Jenkins, Cloda; Riechmann, Christoph

    2005-01-01

    With the increasing interest in yardstick regulation, there is a need to understand the most appropriate method for realigning tariffs at the outset. Benchmarking is the tool used for such realignment and is therefore a necessary first-step in the implementation of yardstick competition. A number of concerns have been raised about the application of benchmarking, making some practitioners reluctant to move towards yardstick based regimes. We assess five of the key concerns often discussed and find that, in general, these are not as great as perceived. The assessment is based on economic principles and experiences with applying benchmarking to regulated sectors, e.g. in the electricity and water industries in the UK, The Netherlands, Austria and Germany in recent years. The aim is to demonstrate that clarity on the role of benchmarking reduces the concern about its application in different regulatory regimes. We find that benchmarking can be used in regulatory settlements, although the range of possible benchmarking approaches that are appropriate will be small for any individual regulatory question. Benchmarking is feasible as total cost measures and environmental factors are better defined in practice than is commonly appreciated and collusion is unlikely to occur in environments with more than 2 or 3 firms (where shareholders have a role in monitoring and rewarding performance). Furthermore, any concern about companies under-recovering costs is a matter to be determined through the regulatory settlement and does not affect the case for using benchmarking as part of that settlement. (author)

  9. Gaussian process regression for geometry optimization

    Science.gov (United States)

    Denzel, Alexander; Kästner, Johannes

    2018-03-01

    We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a twice-differentiable form of the Matérn kernel and the squared exponential kernel. The Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation vs. extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
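
    A bare-bones version of GPR-driven geometry optimization can be sketched with scikit-learn: fit a GPR surrogate with a twice-differentiable Matérn kernel to the evaluated points, step to the surrogate minimum, and repeat. The Rosenbrock function below stands in for a potential energy surface, and the loop omits the paper's overshooting and convergence machinery.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def energy(x):
    # Rosenbrock as a stand-in for a potential energy surface evaluation
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, (5, 2))                 # initial "geometries"
y = np.array([energy(x) for x in X])

kernel = Matern(nu=2.5, length_scale=1.0)      # nu=2.5 gives a twice-differentiable kernel
for _ in range(25):
    gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    x0 = X[np.argmin(y)]
    # Step to the minimum of the cheap GPR surrogate, not of the true surface
    res = minimize(lambda z: gpr.predict(z.reshape(1, -1))[0], x0,
                   method="L-BFGS-B", bounds=[(-2, 2), (-2, 2)])
    X = np.vstack([X, res.x])                  # evaluate the true surface at the new point
    y = np.append(y, energy(res.x))

print(f"best energy {y.min():.3e} at {X[np.argmin(y)]}")
```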

  10. Benchmarking a Soil Moisture Data Assimilation System for Agricultural Drought Monitoring

    Science.gov (United States)

    Hun, Eunjin; Crow, Wade T.; Holmes, Thomas; Bolten, John

    2014-01-01

    Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this need, this paper evaluates an LDAS for agricultural drought monitoring by benchmarking individual components of the system (i.e., a satellite soil moisture retrieval algorithm, a soil water balance model and a sequential data assimilation filter) against a series of linear models which perform the same function (i.e., have the same basic input/output structure) as the full system component. Benchmarking is based on the calculation of the lagged rank cross-correlation between the normalized difference vegetation index (NDVI) and soil moisture estimates acquired for various components of the system. Lagged soil moisture/NDVI correlations obtained using individual LDAS components versus their linear analogs reveal the degree to which non-linearities and/or complexities contained within each component actually contribute to the performance of the LDAS system as a whole. Here, a particular system based on surface soil moisture retrievals from the Land Parameter Retrieval Model (LPRM), a two-layer Palmer soil water balance model and an Ensemble Kalman filter (EnKF) is benchmarked. Results suggest significant room for improvement in each component of the system.
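
    The benchmarking statistic itself, a lagged rank cross-correlation between soil moisture and later NDVI, is straightforward to compute; a sketch on synthetic series follows, where the variable names and the 3-step lag are assumptions for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def lagged_rank_corr(soil_moisture, ndvi, max_lag=8):
    """Spearman rank correlation of soil moisture with NDVI `lag` steps later."""
    out = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            rho, _ = spearmanr(soil_moisture, ndvi)
        else:
            rho, _ = spearmanr(soil_moisture[:-lag], ndvi[lag:])
        out[lag] = rho
    return out

# Synthetic series: NDVI follows soil moisture with a 3-step delay plus noise
rng = np.random.default_rng(2)
sm = rng.standard_normal(200).cumsum()
ndvi = np.roll(sm, 3) + 0.5 * rng.standard_normal(200)
ndvi[:3] = ndvi[3]                     # patch the samples wrapped around by np.roll
for lag, rho in lagged_rank_corr(sm, ndvi).items():
    print(f"lag {lag}: rho = {rho:+.2f}")   # should peak near lag 3
```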

  11. Group Counseling Optimization: A Novel Approach

    Science.gov (United States)

    Eita, M. A.; Fahmy, M. M.

    A new population-based search algorithm, which we call the Group Counseling Optimizer (GCO), is presented. It mimics the group counseling behavior of humans in solving their problems. The algorithm is tested using seven known benchmark functions: Sphere, Rosenbrock, Griewank, Rastrigin, Ackley, Weierstrass, and Schwefel. A comparison is made with the recently published comprehensive learning particle swarm optimizer (CLPSO). The results demonstrate the efficiency and robustness of the proposed algorithm.
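
    Six of the seven test functions named in this record (Weierstrass omitted for brevity) have compact standard definitions; hedged reference implementations follow, using the forms common in the benchmarking literature.

```python
import numpy as np

# Standard definitions, vectorized over a 1-D point x
def sphere(x):
    return np.sum(x**2)

def rosenbrock(x):
    return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

def griewank(x):
    i = np.arange(1, x.size + 1)
    return 1 + np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i)))

def rastrigin(x):
    return np.sum(x**2 - 10 * np.cos(2 * np.pi * x) + 10)

def ackley(x):
    n = x.size
    return (-20 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)

def schwefel(x):
    return 418.9829 * x.size - np.sum(x * np.sin(np.sqrt(np.abs(x))))

x0 = np.zeros(10)
for f in (sphere, rosenbrock, griewank, rastrigin, ackley, schwefel):
    print(f"{f.__name__:>10} at the origin: {f(x0):10.4f}")
```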

  12. Nonlinear Optimization-Based Device-Free Localization with Outlier Link Rejection

    Directory of Open Access Journals (Sweden)

    Wendong Xiao

    2015-04-01

    Full Text Available Device-free localization (DFL) is an emerging wireless technique for estimating the location of a target that does not have any attached electronic device. It has found extensive use in Smart City applications such as healthcare at home and in hospitals, location-based services in smart spaces, city emergency response and infrastructure security. In DFL, wireless devices are used as sensors that can sense the target by transmitting and receiving wireless signals collaboratively. Many DFL systems are implemented based on received signal strength (RSS) measurements, and the location of the target is estimated by detecting the changes in the RSS measurements of the wireless links. Due to the uncertainty of the wireless channel, certain links may be seriously polluted and result in erroneous detection. In this paper, we propose a novel nonlinear optimization approach with outlier link rejection (NOOLR) for RSS-based DFL. It consists of three key strategies: (1) affected link identification by differential RSS detection; (2) outlier link rejection via the geometrical positional relationship among links; (3) target location estimation by formulating and solving a nonlinear optimization problem. Experimental results demonstrate that NOOLR is robust to the fluctuation of the wireless signals, with superior localization accuracy compared with the existing Radio Tomographic Imaging (RTI) approach.

  13. HIGHLY PRECISE APPROXIMATION OF FREE SURFACE GREEN FUNCTION AND ITS HIGH ORDER DERIVATIVES BASED ON REFINED SUBDOMAINS

    Directory of Open Access Journals (Sweden)

    Jiameng Wu

    2018-01-01

    Full Text Available The infinite depth free surface Green function (GF) and its high order derivatives for diffraction and radiation of water waves are considered. Second order derivatives in particular are essential requirements in high-order panel methods. In this paper, concerning the classical representation, composed of a semi-infinite integral involving a Bessel function and a Cauchy singularity, not only the GF and its first order derivatives but also its second order derivatives are derived from four kinds of analytical series expansion and a refined division of the whole calculation domain. Approximations of special functions, particularly the hypergeometric function, and the algorithmic applicability with different subdomains are implemented. As a result, the computation accuracy can reach 10⁻⁹ in the whole domain, compared with conventional methods based on direct numerical integration. Furthermore, numerical efficiency is almost equivalent to that of the classical method.

  14. SP2Bench: A SPARQL Performance Benchmark

    Science.gov (United States)

    Schmidt, Michael; Hornung, Thomas; Meier, Michael; Pinkel, Christoph; Lausen, Georg

    A meaningful analysis and comparison of both existing storage schemes for RDF data and evaluation approaches for SPARQL queries necessitates a comprehensive and universal benchmark platform. We present SP2Bench, a publicly available, language-specific performance benchmark for the SPARQL query language. SP2Bench is settled in the DBLP scenario and comprises a data generator for creating arbitrarily large DBLP-like documents and a set of carefully designed benchmark queries. The generated documents mirror vital key characteristics and social-world distributions encountered in the original DBLP data set, while the queries implement meaningful requests on top of this data, covering a variety of SPARQL operator constellations and RDF access patterns. In this chapter, we discuss requirements and desiderata for SPARQL benchmarks and present the SP2Bench framework, including its data generator, benchmark queries and performance metrics.

  15. Root Growth Optimizer with Self-Similar Propagation

    Directory of Open Access Journals (Sweden)

    Xiaoxian He

    2015-01-01

    Full Text Available Most nature-inspired algorithms simulate the intelligent behaviors of animals and insects that can move spontaneously and independently. The survival wisdom of plants, as another form of life, has been neglected to some extent even though they have evolved over a longer period of time. This paper presents a new plant-inspired algorithm called the root growth optimizer (RGO). RGO simulates the iterative growth behaviors of plant roots to optimize continuous space search. In the growing process, main roots and lateral roots, classified by fitness values, implement different strategies. Main roots carry out exploitation tasks by self-similar propagation in relatively nutrient-rich areas, while lateral roots explore other places to seek better chances. An inhibition mechanism based on plant hormones is applied to main roots in case of explosive propagation in some locally optimal areas. Once resources in a location are exhausted, roots shrink away from infertile conditions to preserve their activity. In order to validate the optimization effect of the algorithm, twelve benchmark functions, including eight classic functions and four CEC2005 test functions, are tested in the experiments. We compared RGO with other existing evolutionary algorithms, including the artificial bee colony, particle swarm optimizer, and differential evolution algorithm. The experimental results show that RGO outperforms the other algorithms on most benchmark functions.

  16. RB reactor benchmark cores

    International Nuclear Information System (INIS)

    Pesic, M.

    1998-01-01

    A selected set of the RB reactor benchmark cores is presented in this paper. The first results of validation of the well-known Monte Carlo MCNP™ code and the adjoining neutron cross section libraries are given. They confirm the idea behind the proposal of the new U-D2O criticality benchmark system and support the intention to include this system in the next edition of the recent OECD/NEA project: International Handbook of Evaluated Criticality Safety Experiments, in the near future. (author)

  17. Sensitivity Study of Regional TDC in MATRA-S code Using PSBT Benchmark Exercise

    International Nuclear Information System (INIS)

    Kim, Seong Jin; Cha, Jeong Hun; Seo, Kyong Won; Kwon, Hyuk; Hwang, Dae Hyun

    2012-01-01

    In a sub-channel analysis code, the modeling of interchannel exchanges between adjacent sub-channels is expressed as diversion cross flow, turbulent mixing, and so on. Turbulent mixing in the MATRA-S code is accounted for through the TDC (β, thermal diffusion coefficient). The TDC varies with the bundle, grid type, mixing vane, and so on. Generally, a thermal mixing test is conducted to optimize the TDC. In the OECD/NRC PSBT benchmark, a thermal mixing test was conducted and the optimized TDC was analyzed using the MATRA-S code. It was shown that the exit temperature distribution from the MATRA-S code differed from the experimental result even when the optimized TDC was applied. In this study, the concept of a regional TDC is introduced and a sensitivity analysis of the regional TDC is presented

  18. Benchmarking specialty hospitals, a scoping review on theory and practice.

    Science.gov (United States)

    Wind, A; van Harten, W H

    2017-04-04

    Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics. We searched PubMed and EMBASE for articles published in English in the last 10 years. Eligible articles described a project stating benchmarking as its objective and involving a specialty hospital or specific patient category; or those dealing with the methodology or evaluation of benchmarking. Of 1,817 articles identified in total, 24 were included in the study. Articles were categorized into: pathway benchmarking, institutional benchmarking, articles on benchmark methodology or -evaluation and benchmarking using a patient registry. There was a large degree of variability:(1) study designs were mostly descriptive and retrospective; (2) not all studies generated and showed data in sufficient detail; and (3) there was variety in whether a benchmarking model was just described or if quality improvement as a consequence of the benchmark was reported upon. Most of the studies that described a benchmark model described the use of benchmarking partners from the same industry category, sometimes from all over the world. Benchmarking seems to be more developed in eye hospitals, emergency departments and oncology specialty hospitals. Some studies showed promising improvement effects. However, the majority of the articles lacked a structured design, and did not report on benchmark outcomes. In order to evaluate the effectiveness of benchmarking to improve quality in specialty hospitals, robust and structured designs are needed including a follow up to check whether the benchmark study has led to improvements.

  19. Development of a California commercial building benchmarking database

    International Nuclear Information System (INIS)

    Kinney, Satkartar; Piette, Mary Ann

    2002-01-01

    Building energy benchmarking is a useful starting point for commercial building owners and operators to target energy savings opportunities. There are a number of tools and methods for benchmarking energy use. Benchmarking based on regional data can provide more relevant information for California buildings than national tools such as Energy Star. This paper discusses issues related to benchmarking commercial building energy use and the development of Cal-Arch, a building energy benchmarking database for California. Currently Cal-Arch uses existing survey data from California's Commercial End Use Survey (CEUS), a largely underutilized wealth of information collected by California's major utilities. DOE's Commercial Building Energy Consumption Survey (CBECS) is used by a similar tool, Arch, and by a number of other benchmarking tools. Future versions of Arch/Cal-Arch will utilize additional data sources, including modeled data and individual buildings, to expand the database.

  20. Optimized dose distribution of a high dose rate vaginal cylinder

    International Nuclear Information System (INIS)

    Li Zuofeng; Liu, Chihray; Palta, Jatinder R.

    1998-01-01

    Purpose: To present a comparison of optimized dose distributions for a set of high-dose-rate (HDR) vaginal cylinders calculated by a commercial treatment-planning system with benchmark calculations using Monte-Carlo-calculated dosimetry data. Methods and Materials: Optimized dose distributions using both an isotropic and an anisotropic dose calculation model were obtained for a set of HDR vaginal cylinders. Mathematical optimization techniques available in the computer treatment-planning system were used to calculate dwell times and positions. These dose distributions were compared with benchmark calculations based on the TG-43 formalism and Monte-Carlo-calculated data. The same dwell times and positions were used for a quantitative comparison of the dose calculated with the three dose models. Results: The isotropic dose calculation model can result in discrepancies as high as 50%. The anisotropic dose calculation model compared better with the benchmark calculations. The differences were more significant at the apex of the vaginal cylinder, which is typically used as the prescription point. Conclusion: Dose calculation models available in a computer treatment-planning system must be evaluated carefully to ensure their correct application. It should also be noted that when an optimized dose distribution at a distance from the cylinder surface is calculated using an accurate dose calculation model, the vaginal mucosa dose becomes significantly higher and should therefore be carefully monitored

  1. Precision and accuracy in smFRET based structural studies—A benchmark study of the Fast-Nano-Positioning System

    Science.gov (United States)

    Nagy, Julia; Eilert, Tobias; Michaelis, Jens

    2018-03-01

    Modern hybrid structural analysis methods have opened new possibilities to analyze and resolve flexible protein complexes where conventional crystallographic methods have reached their limits. Here, the Fast-Nano-Positioning System (Fast-NPS), a Bayesian parameter estimation-based analysis method and software, is an interesting method since it allows for the localization of unknown fluorescent dye molecules attached to macromolecular complexes based on single-molecule Förster resonance energy transfer (smFRET) measurements. However, the precision, accuracy, and reliability of structural models derived from results based on such complex calculation schemes are oftentimes difficult to evaluate. Therefore, we present two proof-of-principle benchmark studies where we use smFRET data to localize supposedly unknown positions on a DNA as well as on a protein-nucleic acid complex. Since we use complexes where structural information is available, we can compare Fast-NPS localization to the existing structural data. In particular, we compare different dye models and discuss how both accuracy and precision can be optimized.

  2. Design optimization for permanent magnet machine with efficient slot per pole ratio

    Science.gov (United States)

    Potnuru, Upendra Kumar; Rao, P. Mallikarjuna

    2018-04-01

    This paper presents a methodology for the enhancement of a Brushless Direct Current (BLDC) motor with 6 poles and 8 slots. In particular, it is focused on a multi-objective optimization using a Genetic Algorithm and Grey Wolf Optimization developed in MATLAB. The optimization aims to maximize the maximum output power value and minimize the total losses of the motor. This paper presents an application of the MATLAB optimization algorithms to brushless DC (BLDC) motor design, with 7 design parameters chosen to be free. The optimal design parameters of the motor derived by the GA are compared with those obtained by the Grey Wolf Optimization technique. A comparative report on the specified enhancement approaches shows that the Grey Wolf Optimization technique has better convergence.

  3. How benchmarking can improve patient nutrition.

    Science.gov (United States)

    Ellis, Jane

    Benchmarking is a tool that originated in business to enable organisations to compare their services with industry-wide best practice. Early last year the Department of Health published The Essence of Care, a benchmarking toolkit adapted for use in health care. It focuses on eight elements of care that are crucial to patients' experiences. Nurses and other health care professionals at a London NHS trust have begun a trust-wide benchmarking project. The aim is to improve patients' experiences of health care by sharing and comparing information, and by identifying examples of good practice and areas for improvement. The project began with two of the eight elements of The Essence of Care, with the intention of covering the rest later. This article describes the benchmarking process for nutrition and some of the consequent improvements in care.

  4. Non-grey benchmark results for two temperature non-equilibrium radiative transfer

    International Nuclear Information System (INIS)

    Su, B.; Olson, G.L.

    1999-01-01

    Benchmark solutions to time-dependent radiative transfer problems involving non-equilibrium coupling to the material temperature field are crucial for validating time-dependent radiation transport codes. Previous efforts on generating analytical solutions to non-equilibrium radiative transfer problems were all restricted to the one-group grey model. In this paper, a non-grey model, namely the picket-fence model, is considered for a two temperature non-equilibrium radiative transfer problem in an infinite medium. The analytical solutions, as functions of space and time, are constructed in the form of infinite integrals for both the diffusion description and transport description. These expressions are evaluated numerically and the benchmark results are generated. The asymptotic solutions for large and small times are also derived in terms of elementary functions and are compared with the exact results. Comparisons are given between the transport and diffusion solutions and between the grey and non-grey solutions.

  5. Pareto-Ranking Based Quantum-Behaved Particle Swarm Optimization for Multiobjective Optimization

    Directory of Open Access Journals (Sweden)

    Na Tian

    2015-01-01

    Full Text Available A study on Pareto-ranking based quantum-behaved particle swarm optimization (QPSO) for multiobjective optimization problems is presented in this paper. During the iteration, an external repository is maintained to remember the nondominated solutions, from which the global best position is chosen. A comparison between different elitist selection strategies (preference order, sigma value, and random selection) is performed on four benchmark functions and two metrics. The results demonstrate that QPSO with preference order has performance comparable to QPSO with sigma value, depending on the number of objectives. Finally, QPSO with sigma value is applied to solve multiobjective flexible job-shop scheduling problems.
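
    The external repository such Pareto-ranking methods maintain reduces to a non-dominated filter over objective vectors. The sketch below implements that filter for minimization; it is a generic building block, not the paper's QPSO code.

```python
import numpy as np

def non_dominated(F):
    """Boolean mask of Pareto-optimal rows of F (all objectives minimized)."""
    n = F.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # j dominates i if it is no worse everywhere and strictly better somewhere
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominates_i.any():
            keep[i] = False
    return keep

rng = np.random.default_rng(3)
F = rng.random((10, 2))              # 10 candidate solutions, 2 objectives
front = F[non_dominated(F)]
print(np.round(front[np.argsort(front[:, 0])], 3))
```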

  6. IAEA sodium void reactivity benchmark calculations

    International Nuclear Information System (INIS)

    Hill, R.N.; Finck, P.J.

    1992-01-01

    In this paper, the IAEA 1992 ''Benchmark Calculation of Sodium Void Reactivity Effect in Fast Reactor Core'' problem is evaluated. The proposed design is a large axially heterogeneous oxide-fueled fast reactor as described in Section 2; the core utilizes a sodium plenum above the core to enhance leakage effects. The calculation methods used in this benchmark evaluation are described in Section 3. In Section 4, the calculated core performance results for the benchmark reactor model are presented; and in Section 5, the influence of steel and interstitial sodium heterogeneity effects is estimated

  7. Benchmark Imagery FY11 Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pope, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-14

    This report details the work performed in FY11 under project LL11-GS-PD06, “Benchmark Imagery for Assessing Geospatial Semantic Extraction Algorithms.” The original LCP for the Benchmark Imagery project called for creating a set of benchmark imagery for verifying and validating algorithms that extract semantic content from imagery. More specifically, the first year was slated to deliver real imagery that had been annotated, the second year to deliver real imagery that had composited features, and the final year was to deliver synthetic imagery modeled after the real imagery.

  8. A Robot Trajectory Optimization Approach for Thermal Barrier Coatings Used for Free-Form Components

    Science.gov (United States)

    Cai, Zhenhua; Qi, Beichun; Tao, Chongyuan; Luo, Jie; Chen, Yuepeng; Xie, Changjun

    2017-10-01

    This paper is concerned with a robot trajectory optimization approach for thermal barrier coatings. As the requirements for high reproducibility of complex workpieces increase, an optimal thermal spraying trajectory should not only guarantee accurate control of the spray parameters defined by users (e.g., scanning speed, spray distance, scanning step, etc.) to achieve coating thickness homogeneity but also help to homogenize the heat transfer distribution on the coating surface. A mesh-based trajectory generation approach is introduced in this work to generate path curves on a free-form component. Then, two types of meander trajectories are generated by performing different connection methods. Additionally, this paper presents a research approach for introducing heat transfer analysis into the trajectory planning process. Combining heat transfer analysis with trajectory planning overcomes the defects of traditional trajectory planning methods (e.g., local over-heating), which helps form a uniform temperature field by optimizing the time sequence of the path curves. The influence of two different robot trajectories on the heat transfer process is estimated by coupled FEM models, which demonstrates the effectiveness of the presented optimization approach.

  9. Review for session K - benchmarks

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1980-01-01

    Eight of the papers to be considered in Session K are directly concerned, at least in part, with the Pool Critical Assembly (P.C.A.) benchmark at Oak Ridge. The remaining seven papers in this session, the subject of this review, are concerned with a variety of topics related to the general theme of Benchmarks and will be considered individually

  10. Tourism Destination Benchmarking: Evaluation and Selection of the Benchmarking Partners

    Directory of Open Access Journals (Sweden)

    Luštický Martin

    2012-03-01

    Full Text Available Tourism development has an irreplaceable role in the regional policy of almost all countries. This is due to its undeniable benefits for the local population with regard to the economic, social and environmental spheres. Tourist destinations compete for visitors in the tourism market and subsequently get into a relatively sharp competitive struggle. The main goal of regional governments and destination management institutions is to succeed in this struggle by increasing the competitiveness of their destination. The quality of strategic planning and final strategies is a key factor of competitiveness. Even though the tourism sector is not a typical field where benchmarking methods are widely used, such approaches can be successfully applied. The paper focuses on a key phase of the benchmarking process, which lies in the search for suitable referencing partners. The partners are consequently selected to meet general requirements that ensure the quality of the strategies. Following from this, some specific characteristics are developed according to the SMART approach. The paper tests this procedure through an expert evaluation of eight selected regional tourism strategies of regions in the Czech Republic, Slovakia and Great Britain. In this way it validates the selected criteria within an international environment. Hence, it makes it possible to find the strengths and weaknesses of the selected strategies and at the same time facilitates the discovery of suitable benchmarking partners.

  11. Statistical benchmarking in utility regulation: Role, standards and methods

    International Nuclear Information System (INIS)

    Newton Lowry, Mark; Getachew, Lullit

    2009-01-01

    Statistical benchmarking is being used with increasing frequency around the world in utility rate regulation. We discuss how and where benchmarking is in use for this purpose and the pros and cons of regulatory benchmarking. We then discuss alternative performance standards and benchmarking methods in regulatory applications. We use these to propose guidelines for the appropriate use of benchmarking in the rate setting process. The standards, which we term the competitive market and frontier paradigms, have a bearing on method selection. These along with regulatory experience suggest that benchmarking can either be used for prudence review in regulation or to establish rates or rate setting mechanisms directly

  13. 40 CFR 141.172 - Disinfection profiling and benchmarking.

    Science.gov (United States)

    2010-07-01

    Section 141.172 of 40 CFR (Protection of Environment, Environmental Protection Agency), under Disinfection-Systems Serving 10,000 or More People, addresses (a) disinfection profiling, including sanitary surveys conducted by the State, and (c) disinfection benchmarking, applying to any system required to develop a profile.

  14. Raising Quality and Achievement. A College Guide to Benchmarking.

    Science.gov (United States)

    Owen, Jane

    This booklet introduces the principles and practices of benchmarking as a way of raising quality and achievement at further education colleges in Britain. Section 1 defines the concept of benchmarking. Section 2 explains what benchmarking is not and the steps that should be taken before benchmarking is initiated. The following aspects and…

  15. Prismatic Core Coupled Transient Benchmark

    International Nuclear Information System (INIS)

    Ortensi, J.; Pope, M.A.; Strydom, G.; Sen, R.S.; DeHart, M.D.; Gougar, H.D.; Ellis, C.; Baxter, A.; Seker, V.; Downar, T.J.; Vierow, K.; Ivanov, K.

    2011-01-01

    The Prismatic Modular Reactor (PMR) is one of the High Temperature Reactor (HTR) design concepts that have existed for some time. Several prismatic units have operated around the world (DRAGON, Fort St. Vrain, Peach Bottom) and one unit is still in operation (HTTR). The deterministic neutronics and thermal-fluids transient analysis tools and methods currently available for the design and analysis of PMRs have lagged behind the state of the art compared with LWR technologies. This has motivated the development of more accurate and efficient tools for the design and safety evaluation of the PMR. In addition to the work invested in new methods, it is essential to develop appropriate benchmarks to verify and validate the new methods in computer codes. The purpose of this benchmark is to establish a well-defined problem, based on a common given set of data, for comparing methods and tools in core simulation and thermal-hydraulics analysis, with a specific focus on transient events. The benchmark working group is currently seeking OECD/NEA sponsorship. This benchmark is being pursued and is heavily based on the success of the PBMR-400 exercise.

  16. Biochemical systems identification by a random drift particle swarm optimization approach

    Science.gov (United States)

    2014-01-01

    Background Finding an efficient method to solve the parameter estimation problem (inverse problem) for nonlinear biochemical dynamical systems could help promote functional understanding of signalling pathways at the system level. The problem is stated as a data-driven nonlinear regression problem, which is converted into a nonlinear programming problem with many nonlinear differential and algebraic constraints. Due to the typically ill-conditioned and multimodal nature of the problem, it is in general difficult for gradient-based local optimization methods to obtain satisfactory solutions. To surmount this limitation, many stochastic optimization methods have been employed to find the global solution of the problem. Results This paper presents an effective search strategy for a particle swarm optimization (PSO) algorithm that enhances its ability to estimate the parameters of complex dynamic biochemical pathways. The proposed algorithm is a new variant of random drift particle swarm optimization (RDPSO), which is used to solve the above-mentioned inverse problem and compared with other well-known stochastic optimization methods. Two case studies on estimating the parameters of two nonlinear biochemical dynamic models serve as benchmarks, under both noise-free and noisy simulation data scenarios. Conclusions The experimental results show that the new RDPSO variant successfully solves the problem and obtains solutions of better quality than the other global optimization methods used for the inverse problems in this study. PMID:25078435
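
    For illustration, the sketch below estimates the parameters of a toy nonlinear model with a global-best PSO to which a small random drift term has been added. It is a simplified stand-in, not the paper's RDPSO update rule; the model, data, and hyperparameters are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        def model(theta, t):
            # Hypothetical two-parameter kinetic model standing in for a
            # biochemical pathway simulator.
            k1, k2 = theta
            return np.exp(-k1 * t) + k2 * t

        t = np.linspace(0.0, 5.0, 50)
        y_obs = model((0.8, 0.3), t) + rng.normal(0.0, 0.01, t.size)  # synthetic noisy data

        def sse(theta):
            # Sum of squared errors between model prediction and observations.
            return float(np.sum((model(theta, t) - y_obs) ** 2))

        n_particles, n_iter, dim = 30, 200, 2
        lo, hi = 0.0, 2.0
        pos = rng.uniform(lo, hi, (n_particles, dim))
        vel = np.zeros((n_particles, dim))
        pbest = pos.copy()
        pbest_f = np.array([sse(p) for p in pos])
        gbest = pbest[pbest_f.argmin()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, dim))
            drift = 0.01 * rng.standard_normal((n_particles, dim))  # random drift term
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos) + drift
            pos = np.clip(pos + vel, lo, hi)
            f = np.array([sse(p) for p in pos])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()

        print("estimated parameters:", gbest.round(3))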

  17. Numerical optimization of Combined Heat and Power Organic Rankine Cycles – Part A: Design optimization

    International Nuclear Information System (INIS)

    Martelli, Emanuele; Capra, Federico; Consonni, Stefano

    2015-01-01

    This two-part paper proposes an approach based on state-of-the-art numerical optimization methods for simultaneously determining the most profitable design and part-load operation of Combined Heat and Power Organic Rankine Cycles. Compared with usual design practice, the important advantages of the proposed approach are (i) considering the part-load performance of the ORC at the design stage, and (ii) optimizing not only the cycle variables but also the main turbine design variables (number of stages, stage loads, rotational speed). In this first part (Part A), the design model and the optimization algorithm are presented and tested on a real-world test case. PGS-COM, a recently proposed hybrid derivative-free algorithm, makes it possible to tackle the challenging non-smooth black-box problem efficiently. - Highlights: • Algorithm for the simultaneous optimization of the Organic Rankine Cycle and turbine. • Thermodynamic and economic models of the boiler, cycle, and turbine are developed. • The non-smooth black-box optimization problem is successfully tackled with PGS-COM. • Test cases show that the algorithm returns optimal solutions within 4 min. • Toluene outperforms MDM (a siloxane) in terms of efficiency and costs.
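
    PGS-COM itself is not publicly packaged, so the hedged sketch below uses SciPy's derivative-free Nelder-Mead method as a stand-in to show how a non-smooth black-box design objective is handed to such a solver. The objective is a hypothetical placeholder, not the paper's thermo-economic model.

        import numpy as np
        from scipy.optimize import minimize

        def negative_figure_of_merit(x):
            # Toy stand-in for a simulator-based ORC design evaluation.
            # x[0]: evaporation pressure (bar); x[1]: condensation temperature (K).
            # The abs() term introduces the kind of non-smoothness such
            # black-box objectives can exhibit.
            p_evap, t_cond = x
            eff = 0.25 - 0.01 * abs(p_evap - 12.0) - 0.002 * (t_cond - 310.0) ** 2
            return -eff  # minimize the negative of the figure of merit

        x0 = np.array([8.0, 315.0])  # initial design guess (hypothetical)
        res = minimize(negative_figure_of_merit, x0, method="Nelder-Mead",
                       options={"xatol": 1e-4, "fatol": 1e-6, "maxiter": 500})
        print("optimized design:", res.x.round(3), "figure of merit:", -res.fun)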

  18. Pediatric Academic Productivity: Pediatric Benchmarks for the h- and g-Indices.

    Science.gov (United States)

    Tschudy, Megan M; Rowe, Tashi L; Dover, George J; Cheng, Tina L

    2016-02-01

    To describe h- and g-index benchmarks in pediatric subspecialties and general academic pediatrics. Academic productivity is increasingly measured through bibliometrics that derive a statistical enumeration of academic output and impact. The h- and g-indices incorporate the number of publications and citations. Benchmarks for pediatrics have not been reported. Thirty programs were selected randomly from pediatric residency programs accredited by the Accreditation Council for Graduate Medical Education. The h- and g-indices of department chairs were calculated. For general academic pediatrics, pediatric gastroenterology, and pediatric nephrology, a random sample of 30 programs with fellowships was selected. Within each program, an MD faculty member from each academic rank was selected randomly. Google Scholar via Harzing's Publish or Perish was used to calculate the h-index, g-index, and total manuscripts. Only peer-reviewed, English-language publications were included. For Chairs, calculations from Google Scholar were compared with Scopus. For all specialties, the mean h- and g-indices increased significantly with academic rank; index calculations using different bibliographic databases differed by only ±1. Mean h-indices increased with academic rank and were not significantly different across the pediatric specialties. Benchmarks for h- and g-indices in pediatrics are provided and may be one measure of academic productivity and impact. Copyright © 2016 Elsevier Inc. All rights reserved.
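
    The two indices follow standard definitions, sketched below for a hypothetical list of per-paper citation counts: the h-index is the largest h such that h papers each have at least h citations, and the g-index is the largest g such that the g most-cited papers together have at least g² citations.

        def h_index(citations):
            # Largest h such that h papers have at least h citations each.
            ranked = sorted(citations, reverse=True)
            return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

        def g_index(citations):
            # Largest g such that the top g papers together have >= g^2 citations.
            ranked = sorted(citations, reverse=True)
            total, g = 0, 0
            for rank, c in enumerate(ranked, start=1):
                total += c
                if total >= rank * rank:
                    g = rank
            return g

        papers = [42, 18, 12, 9, 7, 5, 3, 2, 1, 0]  # hypothetical citation counts
        print(h_index(papers), g_index(papers))     # prints: 5 9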

  19. Benchmarking and Performance Management

    Directory of Open Access Journals (Sweden)

    Adrian TANTAU

    2010-12-01

    Full Text Available The relevance of the chosen topic lies in the concept of firm efficiency: the firm's revealed performance, that is, how well it performs in its actual market environment given the basic characteristics of the firm and its market that are expected to drive profitability (firm size, market power, etc.). This complex, relative performance may stem from product innovation, management quality, work organization, and other factors, some of which are not directly observed by the researcher. Managers who seek to continuously improve their firm's efficiency and effectiveness need to know the success factors and competitiveness determinants, and consequently which performance measures are most critical in determining their firm's overall success. Benchmarking, when done properly, can accurately identify both successful companies and the underlying reasons for their success. Innovation and benchmarking of firm-level performance are critical, interdependent activities. Firm-level variables used to infer performance are often interdependent for operational reasons, so managers need to take the dependencies among these variables into account when forecasting and benchmarking performance. This paper studies firm-level performance using financial ratios and other profitability measures; it uses econometric models to describe performance and then proposes a method to forecast and benchmark it.
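
    A minimal sketch of this kind of econometric benchmarking, assuming fabricated data and a simple linear specification rather than the paper's actual model: regress a profitability ratio on observable drivers and treat the residual as the unexplained component of performance.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 25
        size = rng.uniform(1, 100, n)             # hypothetical firm size (revenue, $M)
        market_share = rng.uniform(0.01, 0.4, n)  # hypothetical market power proxy
        roa = 0.05 + 0.0005 * size + 0.1 * market_share + rng.normal(0, 0.02, n)

        # Regress the profitability ratio on observable drivers; the residual
        # is the part of performance the drivers do not explain.
        X = np.column_stack([np.ones(n), size, market_share])
        beta, *_ = np.linalg.lstsq(X, roa, rcond=None)
        residual = roa - X @ beta

        # Firms with the largest positive residuals are benchmark candidates.
        top = np.argsort(residual)[::-1][:3]
        print("benchmark candidate firms (indices):", top)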

  20. Optimizing and benchmarking de novo transcriptome sequencing: from library preparation to assembly evaluation.

    Science.gov (United States)

    Hara, Yuichiro; Tatsumi, Kaori; Yoshida, Michio; Kajikawa, Eriko; Kiyonari, Hiroshi; Kuraku, Shigehiro

    2015-11-18

    RNA-seq enables gene expression profiling in selected spatiotemporal windows and yields massive sequence information with relatively low cost and time investment, even for non-model species. However, there remains considerable room for optimizing its workflow in order to take full advantage of continuously developing sequencing capacity. Transcriptome sequencing for three embryonic stages of the Madagascar ground gecko (Paroedura picta) was performed on the Illumina platform. The output reads were assembled de novo to reconstruct transcript sequences. To evaluate the completeness of the transcriptome assemblies, we prepared a reference gene set consisting of vertebrate one-to-one orthologs. To take advantage of increased read lengths of >150 nt, we demonstrated shortened RNA fragmentation time, which resulted in a dramatic shift of the insert size distribution. To evaluate the products of multiple de novo assembly runs incorporating reads with different RNA sources, read lengths, and insert sizes, we introduce a new reference gene set, core vertebrate genes (CVG), consisting of 233 genes that are shared as one-to-one orthologs by all 29 vertebrate genomes examined. The completeness assessment, performed with the computational pipelines CEGMA and BUSCO referring to CVG, demonstrated higher accuracy and resolution than with the gene set previously established for this purpose. As a result of the assessment with CVG, we derived the most comprehensive transcript sequence set of the Madagascar ground gecko by assembling individual libraries and then clustering the assembled sequences based on their overall similarities. Our results provide several insights into optimizing the de novo RNA-seq workflow, including the coordination between library insert size and read length, which manifested in improved connectivity of assemblies. The approach and assembly assessment with CVG demonstrated here should be applicable to transcriptome analysis of other species as well.
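
    The bookkeeping behind such a completeness assessment reduces to set arithmetic over the core gene set, as in the sketch below; the gene identifiers and recovery pattern are hypothetical.

        # 233 core vertebrate genes; identifiers here are hypothetical labels.
        core_genes = {f"CVG{i:03d}" for i in range(1, 234)}
        # Pretend hits recovered from an assembly (a real pipeline such as CEGMA
        # or BUSCO would map assembled transcripts to core genes by alignment).
        recovered = {f"CVG{i:03d}" for i in range(1, 234) if i % 10}

        completeness = 100.0 * len(core_genes & recovered) / len(core_genes)
        print(f"assembly completeness vs CVG: {completeness:.1f}%")  # 90.1%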