PARETO OPTIMAL SOLUTIONS FOR MULTI-OBJECTIVE GENERALIZED ASSIGNMENT PROBLEM
Directory of Open Access Journals (Sweden)
S. Prakash
2012-01-01
ENGLISH ABSTRACT: The Multi-Objective Generalized Assignment Problem (MGAP) with two objectives, where one objective is linear and the other non-linear, has been considered, subject to the constraint that each job is assigned to only one worker, although a worker may be assigned more than one job, depending upon the time available to him. An algorithm is proposed to find the set of Pareto optimal solutions of the problem, determining assignments of jobs to workers under the two objectives without setting priorities for them. The two objectives are to minimise the total cost of the assignment and to minimise the time taken to complete all the jobs.
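The enumeration-and-filter idea behind such a Pareto set can be illustrated with a toy sketch. This is not the paper's algorithm; the jobs, workers, costs, times, and capacities below are invented for illustration:

```python
# Toy sketch: brute-force enumeration of job-to-worker assignments, keeping
# the non-dominated set under two objectives (total cost, completion time).
# All data below is illustrative, not from the paper.
from itertools import product

COST = {("j1", "w1"): 4, ("j1", "w2"): 2,
        ("j2", "w1"): 3, ("j2", "w2"): 5}
TIME = {("j1", "w1"): 1, ("j1", "w2"): 3,
        ("j2", "w1"): 2, ("j2", "w2"): 1}
CAPACITY = {"w1": 3, "w2": 3}          # time available to each worker

def pareto_assignments(jobs=("j1", "j2"), workers=("w1", "w2")):
    candidates = []
    for choice in product(workers, repeat=len(jobs)):   # one worker per job
        load = {w: 0 for w in workers}
        for j, w in zip(jobs, choice):
            load[w] += TIME[(j, w)]
        if any(load[w] > CAPACITY[w] for w in workers):
            continue                                    # capacity violated
        cost = sum(COST[(j, w)] for j, w in zip(jobs, choice))
        time = max(load.values())                       # makespan
        candidates.append((cost, time, choice))
    # keep only the non-dominated (cost, time) pairs
    return [c for c in candidates
            if not any(o[0] <= c[0] and o[1] <= c[1] and o[:2] != c[:2]
                       for o in candidates)]
```

With this data the cheap-but-slow assignment and the fast-but-expensive one both survive, while the dominated middle option is filtered out.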
Multiobjective Optimization of Linear Cooperative Spectrum Sensing: Pareto Solutions and Refinement.
Yuan, Wei; You, Xinge; Xu, Jing; Leung, Henry; Zhang, Tianhang; Chen, Chun Lung Philip
2016-01-01
In linear cooperative spectrum sensing, the weights of secondary users and the detection threshold should be optimally chosen to minimize missed detection probability and to maximize secondary network throughput. Since these two objectives are not completely compatible, we study this problem from the viewpoint of multiple-objective optimization. We aim to obtain a set of evenly distributed Pareto solutions. To this end, here, we introduce the normal constraint (NC) method to transform the problem into a set of single-objective optimization (SOO) problems. Each SOO problem usually results in a Pareto solution. However, NC does not provide any solution method for these SOO problems, nor any indication of the optimal number of Pareto solutions. Furthermore, NC has no preference over all Pareto solutions, while a designer may be only interested in some of them. In this paper, we employ a stochastic global optimization algorithm to solve the SOO problems, and then propose a simple method to determine the optimal number of Pareto solutions under a computational complexity constraint. In addition, we extend NC to refine the Pareto solutions and select the ones of interest. Finally, we verify the effectiveness and efficiency of the proposed methods through computer simulations.
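The core transformation described above, turning one bi-objective problem into a family of single-objective problems, can be sketched on a toy discrete set. The true normal-constraint method adds hyperplane constraints along the utopia line; for brevity this sketch uses the closely related epsilon-constraint scheme, and all points are invented:

```python
# Sketch of bi-objective -> family-of-single-objective problems via the
# epsilon-constraint idea (a simpler relative of the normal constraint
# method). Both objectives are minimized. A finite sweep of the bound may
# miss some Pareto points, which is why the choice of the number of
# subproblems matters, as the paper discusses.
def epsilon_constraint_front(points, n_steps=5):
    """points: iterable of (f1, f2) pairs, both to be minimized."""
    f2_lo = min(p[1] for p in points)
    f2_hi = max(p[1] for p in points)
    front = set()
    for k in range(n_steps + 1):
        eps = f2_lo + (f2_hi - f2_lo) * k / n_steps  # sweep the bound on f2
        feasible = [p for p in points if p[1] <= eps]
        if feasible:                                 # single objective: min f1
            front.add(min(feasible, key=lambda p: p[0]))
    return sorted(front)

pts = [(1, 9), (2, 7), (3, 4), (5, 3), (8, 1), (9, 8)]
```

Every returned point is Pareto optimal, but with only six subproblems the Pareto point (5, 3) is missed here, illustrating the trade-off between the number of SOO problems solved and front coverage.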
DEFF Research Database (Denmark)
Andersen, Kurt Munk; Sandqvist, Allan
1997-01-01
We investigate the domain of definition and the domain of values for the successor function of a cooperative differential system x'=f(t,x), where the coordinate functions are concave in x for any fixed value of t. Moreover, we give a characterization of a weakly Pareto optimal solution.
Directory of Open Access Journals (Sweden)
Yang Sun
2018-01-01
Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions, and we compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system to assist network administrators in decision-making, specifically in defense strategy selection, and the experiment results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.
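The contrast between the two scalarizations compared above can be sketched on a toy candidate set. The strategies and aspiration levels below are invented, and STOM is reduced to its core min-max form; the full method is interactive:

```python
# Hedged sketch of two scalarizations on invented defense strategies scored
# by (detection_loss, resource_cost), both minimized. STOM is shown only in
# its core min-max-deviation form with fixed aspiration levels.
def linear_scalarize(points, weights):
    return min(points, key=lambda p: sum(w * v for w, v in zip(weights, p)))

def stom_like(points, aspiration):
    # pick the point minimizing the worst relative deviation from aspiration
    return min(points, key=lambda p: max((v - a) / max(a, 1e-9)
                                         for v, a in zip(p, aspiration)))

strategies = [(0.9, 0.1), (0.5, 0.5), (0.2, 0.9)]
```

With a balanced aspiration level the min-max form picks the balanced strategy, while a linear weighting can be pulled to an extreme point of the front, one intuition for why the scalarization choice changes the selected defense policy.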
International Nuclear Information System (INIS)
Ferreira, Jose C.; Gaspar-Cunha, Antonio; Fonseca, Carlos M.
2007-01-01
Most real-world optimization problems involve multiple, usually conflicting, optimization criteria. Generating Pareto optimal solutions plays an important role in multi-objective optimization, and the problem is considered to be solved when the Pareto optimal set is found, i.e., the set of non-dominated solutions. Multi-Objective Evolutionary Algorithms based on the principle of Pareto optimality are designed to produce the complete set of non-dominated solutions. However, this is not always enough, since the aim is not only to know the Pareto set but also to obtain one solution from this Pareto set. Thus, a methodology able to select a single solution from the set of non-dominated solutions (or a region of the Pareto frontier), taking into account the preferences of a Decision Maker (DM), is necessary. A different method, based on a weighted stress function, is proposed. It is able to integrate the user's preferences in order to find the best region of the Pareto frontier in accordance with these preferences. This method was tested on some benchmark test problems, with two and three criteria, and on a polymer extrusion problem. This methodology is able to efficiently select the best Pareto-frontier region for the specified relative importance of the criteria.
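The selection step described above, picking one solution from a computed front according to DM preferences, can be sketched with a weighted Chebyshev distance to the ideal point. This is a generic stand-in for the paper's weighted stress function, and the front points and weights below are invented:

```python
# Sketch: select one point from a Pareto front by weighted Chebyshev
# distance to the ideal point (per-objective minima). This is a stand-in
# for the paper's weighted stress function, not its actual formula.
def select_preferred(front, weights):
    ideal = tuple(min(p[i] for p in front) for i in range(len(front[0])))
    return min(front, key=lambda p: max(w * (v - z)
                                        for w, v, z in zip(weights, p, ideal)))

front = [(1.0, 9.0), (3.0, 4.0), (8.0, 1.0)]
```

Equal weights select the balanced knee point; down-weighting the second objective shifts the choice toward the point that is best on the first, which is exactly the kind of preference steering the abstract describes.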
Giesy, D. P.
1978-01-01
A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
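A finite toy version of this threshold idea can be sketched as follows. The paper treats continuous constrained problems; here the single-objective stages are reduced to filtered minimizations over an invented candidate set:

```python
# Sketch of the threshold-of-acceptability idea on a finite candidate set:
# solve one single-objective problem per axis, each time constraining every
# objective to be no worse than the incumbent. For two objectives the
# result is guaranteed non-dominated within the candidate set.
def threshold_refine(points):
    incumbent = points[0]
    for axis in (0, 1):                 # one single-objective pass per axis
        feasible = [p for p in points
                    if all(v <= t for v, t in zip(p, incumbent))]
        incumbent = min(feasible, key=lambda p: p[axis])
    return incumbent
```

Starting from a dominated point, the two constrained passes walk it down to a Pareto optimum, mirroring the convergence guarantee stated in the abstract.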
DEFF Research Database (Denmark)
Bligaard, Thomas; Johannesson, Gisli Holmar; Ruban, Andrei
2003-01-01
Large databases that can be used in the search for new materials with specific properties remain an elusive goal in materials science. The problem is complicated by the fact that the optimal material for a given application is usually a compromise between a number of materials properties and the cost. In this letter we present a database consisting of the lattice parameters, bulk moduli, and heats of formation for over 64 000 ordered metallic alloys, which has been established by direct first-principles density-functional-theory calculations. Furthermore, we use a concept from economic theory, the Pareto-optimal set, to determine optimal alloy solutions for the compromise between low compressibility, high stability, and cost.
Pareto optimization in algebraic dynamic programming.
Saule, Cédric; Giegerich, Robert
2015-01-01
Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization so far has been used only with a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator [Formula: see text] on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, the scoring scheme [Formula: see text] correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
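The idea of carrying a Pareto front through a dynamic program, rather than a single score, can be sketched on a bi-objective shortest-path problem. The paper's Pareto product is defined algebraically on scoring schemes; this sketch only illustrates the per-cell non-dominated-set bookkeeping, on an invented DAG:

```python
# Sketch of Pareto optimization inside dynamic programming: each DP cell
# stores the set of non-dominated (score_A, score_B) pairs instead of one
# scalar. Both scores are minimized. Graph and costs are illustrative.
def merge_front(pairs):
    pairs = sorted(set(pairs))              # sort by score_A, then score_B
    front, best_b = [], float("inf")
    for a, b in pairs:                      # classic skyline sweep
        if b < best_b:
            front.append((a, b))
            best_b = b
    return front

def pareto_shortest_paths(edges, start, goal, order):
    """edges: {node: [(next_node, cost_a, cost_b), ...]}; order: topological."""
    table = {start: [(0, 0)]}
    for u in order:
        for v, ca, cb in edges.get(u, []):
            new = [(a + ca, b + cb) for a, b in table.get(u, [])]
            table[v] = merge_front(table.get(v, []) + new)
    return table.get(goal, [])

edges = {"s": [("a", 1, 4), ("b", 3, 1)],
         "a": [("t", 1, 4)],
         "b": [("t", 1, 1)]}
```

Merging fronts at every cell is what makes the combined scheme more expensive than either single objective, while still cheaper than enumerating the whole search space, as the abstract observes.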
Pareto optimal pairwise sequence alignment.
DeRonne, Kevin W; Karypis, George
2013-01-01
Sequence alignment using evolutionary profiles is a commonly employed tool when investigating a protein. Many profile-profile scoring functions have been developed for use in such alignments, but there has not yet been a comprehensive study of Pareto optimal pairwise alignments for combining multiple such functions. We show that the problem of generating Pareto optimal pairwise alignments has an optimal substructure property, and develop an efficient algorithm for generating Pareto optimal frontiers of pairwise alignments. All possible sets of two, three, and four profile scoring functions are used from a pool of 11 functions and applied to 588 pairs of proteins in the ce_ref data set. The performance of the best objective combinations on ce_ref is also evaluated on an independent set of 913 protein pairs extracted from the BAliBASE RV11 data set. Our dynamic-programming-based heuristic approach produces approximated Pareto optimal frontiers of pairwise alignments that contain comparable alignments to those on the exact frontier, but on average in less than 1/58th the time in the case of four objectives. Our results show that the Pareto frontiers contain alignments whose quality is better than the alignments obtained by single objectives. However, the task of identifying a single high-quality alignment among those in the Pareto frontier remains challenging.
Directory of Open Access Journals (Sweden)
I. K. Romanova
2015-01-01
The article concerns multi-criteria optimization (MCO), which assumes that the quality criteria of system operation are independent, and specifies a way to improve the values of these criteria. Mutual contradiction of some criteria is a major problem in MCO. One of the most important areas of research is obtaining the so-called Pareto-optimal options. The subject of research is the Pareto front, also called the Pareto frontier. The article discusses front classifications by geometric representation for the case of a two-criterion task. It presents a mathematical description of the front characteristics using gradients and their projections. A review of current domestic and foreign literature has revealed that work on constructing the Pareto frontier aims at research under conditions of uncertainty, in the stochastic setting, and without restrictions. The topology of both the two- and the three-dimensional case is considered. The targets of modern applications are multi-agent systems and groups of players in differential games. However, none of the considered works addresses active management of the front. The objective of this article is to discuss the Pareto frontier problem in a new formulation, namely with active participation of the system developers and/or the decision makers (DM) in the management of the Pareto frontier. It notes that such a formulation differs from the traditionally accepted approach based on the analysis of already existing solutions. The article discusses three ways to describe the quality of an object management system. The first is to use direct quality criteria for the model of a closed system in general oscillatory form. The second is to study a specific two-loop aircraft control system using the angular velocity and normal acceleration loops. The third is the use of integrated quality criteria. In all three cases, the selected criteria are
Kudo, Fumiya; Yoshikawa, Tomohiro; Furuhashi, Takeshi
Recently, the Multi-objective Genetic Algorithm, the application of Genetic Algorithms to multi-objective optimization problems, has attracted attention in the engineering design field. In this field, the analysis of design variables in the acquired Pareto solutions, which gives designers useful knowledge about the applied problem, is as important as the acquisition of advanced solutions. This paper proposes a new visualization method using Isomap which visualizes the geometric distances of solutions in the design variable space while considering their distances in the objective space. The proposed method enables a user to analyze the design variables of the acquired solutions considering their relationship in the objective space. This paper applies the proposed method to the conceptual design optimization problem of a hybrid rocket engine and studies the effectiveness of the proposed method.
Pareto optimality in infinite horizon linear quadratic differential games
Reddy, P.V.; Engwerda, J.C.
2013-01-01
In this article we derive conditions for the existence of Pareto optimal solutions for linear quadratic infinite horizon cooperative differential games. First, we present a necessary and sufficient characterization for Pareto optimality which translates to solving a set of constrained optimal
Energy Technology Data Exchange (ETDEWEB)
Leimbach, Marian [Potsdam-Institut fuer Klimafolgenforschung e.V., Potsdam (Germany); Eisenack, Klaus [Oldenburg Univ. (Germany). Dept. of Economics and Statistics
2008-11-15
In this paper we present an algorithm that deals with trade interactions within a multi-region model. In contrast to traditional approaches, this algorithm is able to handle spillover externalities. Technological spillovers are expected to foster the diffusion of new technologies, which helps to lower the cost of climate change mitigation. We focus on technological spillovers which are due to capital trade. The algorithm for finding a Pareto-optimal solution in an intertemporal framework is embedded in a decomposed optimization process. The paper analyzes convergence and equilibrium properties of this algorithm. In the final part of the paper, we apply the algorithm to investigate possible impacts of technological spillovers. While benefits of technological spillovers are significant for the capital-importing region, benefits for the capital-exporting region depend on the type of regional disparities and the resulting specialization and terms-of-trade effects. (orig.)
Can we reach Pareto optimal outcomes using bottom-up approaches?
V. Sanchez-Anguix (Victor); R. Aydoğan (Reyhan); T. Baarslag (Tim); C.M. Jonker (Catholijn)
2016-01-01
Classically, disciplines like negotiation and decision making have focused on reaching Pareto optimal solutions due to their stability and efficiency properties. Despite the fact that many practical and theoretical algorithms have successfully attempted to provide Pareto optimal solutions,
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from the traditional single plant to a multi-site supply chain, where multiple plants serve customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer demand satisfaction level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
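The post-optimization selection step described above can be sketched with the geometric-mean approximation of AHP weight derivation followed by a ranking of normalized front points. The comparison matrix, front points, and normalization are all invented for illustration; the paper's full AHP procedure involves consistency checks and hierarchy levels:

```python
import math

# Hedged sketch: derive criterion weights from an AHP pairwise-comparison
# matrix via the geometric-mean approximation, then rank Pareto solutions
# by weighted min-max-normalized score (lower is better on each objective).
# Matrix entries (objective 1 judged 3x as important as objective 2) and
# the (cost, defect_rate) front points are illustrative.
def ahp_weights(cmp_matrix):
    gm = [math.prod(row) ** (1 / len(row)) for row in cmp_matrix]
    s = sum(gm)
    return [g / s for g in gm]

def rank_pareto(front, weights):
    lo = [min(p[i] for p in front) for i in range(len(weights))]
    hi = [max(p[i] for p in front) for i in range(len(weights))]
    def score(p):
        return sum(w * (v - l) / (h - l) if h > l else 0.0
                   for w, v, l, h in zip(weights, p, lo, hi))
    return sorted(front, key=score)
```

A 3-to-1 pairwise judgment yields weights of 0.75 and 0.25, so the ranking favors the solution that is best on the heavily weighted objective even though it is worst on the other.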
Tractable Pareto Optimization of Temporal Preferences
Morris, Robert; Morris, Paul; Khatib, Lina; Venable, Brent
2003-01-01
This paper focuses on temporal constraint problems where the objective is to optimize a set of local preferences for when events occur. In previous work, a subclass of these problems has been formalized as a generalization of Temporal CSPs, and a tractable strategy for optimization has been proposed, where global optimality is defined as maximizing the minimum of the component preference values. This criterion for optimality, which we call 'Weakest Link Optimization' (WLO), is known to have limited practical usefulness because solutions are compared only on the basis of their worst value; thus, there is no requirement to improve the other values. To address this limitation, we introduce a new algorithm that re-applies WLO iteratively in a way that leads to improvement of all the values. We show the value of this strategy by proving that, with suitable preference functions, the resulting solutions are Pareto Optimal.
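On a finite candidate set, the iterated weakest-link scheme described above reduces to a leximin comparison: sort each candidate's preference values ascending and compare lexicographically. This is a reduction for illustration, not the paper's constraint-propagation algorithm, and the candidates are invented:

```python
# Sketch of iterated WLO as leximin on a finite candidate set: maximize the
# minimum value, then (implicitly) the second-worst, and so on. The winner
# is Pareto optimal, unlike a plain max-min tie.
def wlo_iterated(candidates):
    # each candidate: tuple of local preference values, higher is better
    return max(candidates, key=lambda c: tuple(sorted(c)))
```

Plain WLO would tie the first two candidates below (both have worst value 0.4); the iterated criterion breaks the tie in favor of the one whose remaining values are better, which is the improvement the paper proves leads to Pareto optimal solutions.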
RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.
Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A
2013-12-01
Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.
Post-Pareto optimization: a case
Popov, Stoyan; Baeva, Silvia; Marinova, Daniela
2017-12-01
Simulation performance may be evaluated according to multiple quality measures that are in competition and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training.
PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning
International Nuclear Information System (INIS)
Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew
2011-01-01
Purpose: In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. Methods: pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. Results: pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows promise.
Pareto optimality in organelle energy metabolism analysis.
Angione, Claudio; Carapezza, Giovanni; Costanza, Jole; Lió, Pietro; Nicosia, Giuseppe
2013-01-01
In low and high eukaryotes, energy is collected or transformed in compartments, the organelles. The rich variety of size, characteristics, and density of the organelles makes it difficult to build a general picture. In this paper, we make use of the Pareto-front analysis to investigate the optimization of energy metabolism in mitochondria and chloroplasts. Using the Pareto optimality principle, we compare models of organelle metabolism on the basis of single- and multiobjective optimization, approximation techniques (the Bayesian Automatic Relevance Determination), robustness, and pathway sensitivity analysis. Finally, we report the first analysis of the metabolic model for the hydrogenosome of Trichomonas vaginalis, which is found in several protozoan parasites. Our analysis has shown the importance of the Pareto optimality for such comparison and for insights into the evolution of the metabolism from cytoplasmic to organelle bound, involving a model order reduction. We report that Pareto fronts represent an asymptotic analysis useful to describe the metabolism of an organism aimed at maximizing concurrently two or more metabolite concentrations.
How Well Do We Know Pareto Optimality?
Mathur, Vijay K.
1991-01-01
Identifies sources of ambiguity in economics textbooks' discussion of the condition for efficient output mix. Points out that diverse statements without accompanying explanations create confusion among students. Argues that conflicting views concerning the concept of Pareto optimality are one source of ambiguity. Suggests clarifying additions to…
Performance-based Pareto optimal design
Sariyildiz, I.S.; Bittermann, M.S.; Ciftcioglu, O.
2008-01-01
A novel approach for performance-based design is presented, where Pareto optimality is pursued. Design requirements may contain linguistic information, which is difficult to bring into computation or make consistent their impartial estimations from case to case. Fuzzy logic and soft computing are
Pareto-optimal phylogenetic tree reconciliation.
Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S; Kellis, Manolis
2014-06-15
Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Our Python tools are freely available at www.cs.hmc.edu/~hadas/xscape. © The Author 2014. Published by Oxford University Press.
Pareto Optimal Design for Synthetic Biology.
Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe
2015-08-01
Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high-dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing deeper biological insight into the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach by comparing it with state-of-the-art heuristics in the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform results previously obtained, in terms of 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction percentage is +662.7%, from 1.425 mmolh⁻¹gDW⁻¹ (wild type) to 10.869 mmolh⁻¹gDW⁻¹, with a knockout cost of 6. Moreover, the Pareto-optimal designs we have found in the fatty acid optimizations strictly dominate those obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvements of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmolh⁻¹gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm capabilities.
A. Bouter (Anton); K. Pirpinia (Kleopatra); T. Alderliesten (Tanja); P.A.N. Bosman (Peter)
2017-01-01
A multi-objective optimization approach is often followed by an a posteriori decision-making process, during which the most appropriate solution of the Pareto set is selected by a professional in the field. Conventional visualization methods do not correct for Pareto fronts with
Pareto optimal design of sectored toroidal superconducting magnet for SMES
Energy Technology Data Exchange (ETDEWEB)
Bhunia, Uttam, E-mail: ubhunia@vecc.gov.in; Saha, Subimal; Chakrabarti, Alok
2014-10-15
Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suitable for low-temperature superconducting cable for a medium-size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and the torus overall size or volume, which determines a significant part of the cost of realizing the SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions; a compact magnet size leads to an increase in required superconducting cable length, or vice versa. The final choice among Pareto optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low-temperature niobium–titanium based Rutherford-type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.
Pareto optimal design of sectored toroidal superconducting magnet for SMES
International Nuclear Information System (INIS)
Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok
2014-01-01
Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach suits low-temperature superconducting cable for a medium-size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage (SMES) coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and of the overall torus size or volume, which determine a significant part of the cost of realizing the SMES. The best trade-off between the conductor length needed for winding and the overall magnet size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among the Pareto-optimal configurations can be made with regard to other issues, such as AC loss during transient operation, the stray magnetic field outside the coil assembly, and the available discharge period, which are not considered in the optimization process. The proposed design approach is applied to a 4.5 MJ/1 MW SMES system using low-temperature niobium–titanium-based Rutherford-type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.
Derivative-free generation and interpolation of convex Pareto optimal IMRT plans
Hoffmann, Aswin L.; Siem, Alex Y. D.; den Hertog, Dick; Kaanders, Johannes H. A. M.; Huizenga, Henk
2006-12-01
In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.
Derivative-free generation and interpolation of convex Pareto optimal IMRT plans
International Nuclear Information System (INIS)
Hoffmann, Aswin L; Siem, Alex Y D; Hertog, Dick den; Kaanders, Johannes H A M; Huizenga, Henk
2006-01-01
In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning
Pareto optimal design of sectored toroidal superconducting magnet for SMES
Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok
2014-10-01
A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage (SMES) coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and of the overall torus size or volume, which determine a significant part of the cost of realizing the SMES. The best trade-off between the conductor length needed for winding and the overall magnet size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among the Pareto-optimal configurations can be made with regard to other issues, such as AC loss during transient operation, the stray magnetic field outside the coil assembly, and the available discharge period, which are not considered in the optimization process. The proposed design approach is applied to a 4.5 MJ/1 MW SMES system using low-temperature niobium-titanium-based Rutherford-type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.
Directory of Open Access Journals (Sweden)
Ziaul Huque
2012-01-01
Full Text Available A Computational Fluid Dynamics (CFD) and response-surface-based multiobjective design optimization was performed for six different 2D airfoil profiles, and the Pareto-optimal front of each airfoil is presented. FLUENT, a commercial CFD simulation code, was used to determine the relevant aerodynamic loads. The lift coefficient (CL) and drag coefficient (CD) data at angles of attack (α) ranging from 0° to 12° and at three different Reynolds numbers (Re = 68,459; 479,210; and 958,422) were obtained for all six airfoils. The realizable k-ε turbulence model with a second-order upwind solution method was used in the simulations. The standard least-squares method was used to generate the response surfaces with the statistical code JMP. The elitist non-dominated sorting genetic algorithm (NSGA-II) was used to determine the Pareto-optimal set based on the response surfaces. Each Pareto-optimal solution represents a different compromise between the design objectives. This gives the designer the choice of a design compromise that best suits the requirements from a set of optimal solutions. The Pareto solution set is presented in the form of a Pareto-optimal front.
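The Pareto-optimal front that NSGA-II extracts from the response surfaces rests on a pairwise dominance test: for the two objectives here, maximizing lift coefficient while minimizing drag coefficient, a design survives only if no other sampled design is at least as good in both. A minimal illustrative sketch (the sample values are invented, not taken from the study):

```python
def pareto_front(points):
    """Nondominated subset of (CL, CD) samples: maximize CL, minimize CD.
    A point is dominated if some other, distinct point has CL at least
    as high and CD no larger."""
    front = []
    for cl, cd in points:
        dominated = any(cl2 >= cl and cd2 <= cd and (cl2, cd2) != (cl, cd)
                        for cl2, cd2 in points)
        if not dominated:
            front.append((cl, cd))
    return sorted(front)

# Invented (CL, CD) samples for illustration only
samples = [(0.4, 0.010), (0.6, 0.012), (0.5, 0.015), (0.7, 0.020), (0.65, 0.012)]
front = pareto_front(samples)
```

Here (0.6, 0.012) drops out because (0.65, 0.012) offers more lift at the same drag; the surviving designs each trade lift against drag.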
Huang, Hui; Ning, Jixian
2017-01-01
Prederivatives play an important role in the study of set optimization problems. First, we establish several existence theorems for prederivatives of γ-paraconvex set-valued mappings in Banach spaces with [Formula: see text]. Then, in terms of prederivatives, we establish both necessary and sufficient conditions for the existence of Pareto minimal solutions of set optimization problems.
Phase transitions in Pareto optimal complex networks.
Seoane, Luís F; Solé, Ricard
2015-09-01
The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks, or in graphs where cost constraints are at work, as occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need for drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization play a determining role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.
Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.
Otero-Muras, Irene; Banga, Julio R
2017-07-21
In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
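For a finite discrete solution set such as the route problems above, the exact biobjective Pareto front can be recovered by a sort-and-sweep pass; this is the correctness yardstick any k-best construction must match. A sketch under the assumption that all candidate solutions can be enumerated:

```python
def exact_pareto(solutions):
    """Exact Pareto front of a finite set of (f1, f2) pairs, both minimized.
    Sort by f1 (ties broken by f2), then sweep, keeping points whose f2
    strictly improves on the best seen so far."""
    best_f2 = float("inf")
    front = []
    for f1, f2 in sorted(solutions):
        if f2 < best_f2:
            front.append((f1, f2))
            best_f2 = f2
    return front
```

Every kept point beats its predecessors in f2 and loses to them in f1, so the output is exactly the nondominated set (duplicates and weakly dominated ties are discarded).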
Determination of Pareto frontier in multi-objective maintenance optimization
International Nuclear Information System (INIS)
Certa, Antonella; Galante, Giacomo; Lupo, Toni; Passannanti, Gianfranco
2011-01-01
The objective of a maintenance policy is generally the minimization of the global maintenance cost, which involves not only the direct costs of the maintenance actions and spare parts, but also those due to system stops for preventive maintenance and downtime after failure. For some operating systems, the failure event can be dangerous, so they are required to operate with a very high reliability level between two consecutive fixed stops. The present paper attempts to identify the set of elements on which to perform maintenance actions so that the system can assure the required reliability level until the next fixed stop for maintenance, minimizing both the global maintenance cost and the total maintenance time. In order to solve this constrained multi-objective optimization problem, an effective approach is proposed to obtain the best solutions (that is, the Pareto-optimal frontier) among which the decision maker will choose the most suitable one. As is well known, describing the whole Pareto-optimal frontier is generally a troublesome task. The paper proposes an algorithm able to rapidly overcome this problem, and its effectiveness is shown by an application to a case study regarding a complex series-parallel system.
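The cost-time frontier under a reliability constraint described above can be illustrated with a small filter: discard candidate plans that miss the reliability target, then keep only those nondominated in (cost, time). The dictionary fields are illustrative, not taken from the paper:

```python
def pareto_maintenance(plans, r_min):
    """Keep plans meeting the reliability target, then return the ones
    nondominated in (cost, time), both minimized.
    Each plan is a dict with 'cost', 'time', 'reliability' (illustrative)."""
    feasible = [p for p in plans if p["reliability"] >= r_min]
    front = []
    for p in feasible:
        dominated = any(q["cost"] <= p["cost"] and q["time"] <= p["time"]
                        and (q["cost"], q["time"]) != (p["cost"], p["time"])
                        for q in feasible)
        if not dominated:
            front.append(p)
    return front
```

The decision maker then picks one plan from the returned frontier, trading maintenance cost against total maintenance time.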
A Pareto Optimal Auction Mechanism for Carbon Emission Rights
Directory of Open Access Journals (Sweden)
Mingxi Wang
2014-01-01
Full Text Available The carbon emission rights do not fit well into the framework of existing multi-item auction mechanisms because of their own unique features. This paper proposes a new auction mechanism which converges to a unique Pareto optimal equilibrium in a finite number of periods. In the proposed auction mechanism, the assignment outcome is Pareto efficient and the carbon emission rights’ resources are efficiently used. For commercial application and theoretical completeness, both discrete and continuous markets—represented by discrete and continuous bid prices, respectively—are examined, and the results show the existence of a Pareto optimal equilibrium under the constraint of individual rationality. With no ties, the Pareto optimal equilibrium can be further proven to be unique.
Projections onto the Pareto surface in multicriteria radiation therapy optimization.
Bokrantz, Rasmus; Miettinen, Kaisa
2015-10-01
To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested in which dose-volume histogram constraints are used to prevent the projection from causing a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window techniques, and for spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate case and a head and neck case. The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose-volume histogram constraints were used. No consistent improvements in target homogeneity were observed. There are situations in which Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.
Projections onto the Pareto surface in multicriteria radiation therapy optimization
International Nuclear Information System (INIS)
Bokrantz, Rasmus; Miettinen, Kaisa
2015-01-01
Purpose: To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. Methods: The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested in which dose–volume histogram constraints are used to prevent the projection from causing a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window techniques, and for spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate case and a head and neck case. Results: The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose–volume histogram constraints were used. No consistent improvements in target homogeneity were observed. Conclusions: There are situations in which Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.
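A discrete stand-in for the projection idea: among a finite set of precomputed plans, pick one that is at least as good as the navigated plan in every (minimized) objective, breaking ties by the smallest objective sum. This is only a simplified analogue of the authors' constrained continuous projection, assuming plans are represented by tuples of objective values:

```python
def project_navigated(plans, navigated):
    """Among stored plans (tuples of minimized objective values), pick one
    at least as good as `navigated` in every objective with the smallest
    objective sum; fall back to `navigated` if no plan weakly dominates it."""
    candidates = [p for p in plans
                  if all(pi <= ni for pi, ni in zip(p, navigated))]
    return min(candidates, key=sum) if candidates else navigated
```

As in the paper, the projected result is never worse than the navigated plan in any objective; the fallback corresponds to the case where the stored representation offers no improvement.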
Song, Q Chelsea; Wee, Serena; Newman, Daniel A
2017-12-01
To reduce adverse impact potential and improve diversity outcomes from personnel selection, one promising technique is De Corte, Lievens, and Sackett's (2007) Pareto-optimal weighting strategy. De Corte et al.'s strategy has been demonstrated on (a) a composite of cognitive and noncognitive (e.g., personality) tests (De Corte, Lievens, & Sackett, 2008) and (b) a composite of specific cognitive ability subtests (Wee, Newman, & Joseph, 2014). Both studies illustrated how Pareto-weighting (in contrast to unit weighting) could lead to substantial improvement in diversity outcomes (i.e., diversity improvement), sometimes more than doubling the number of job offers for minority applicants. The current work addresses a key limitation of the technique: the possibility of shrinkage, especially diversity shrinkage, in the Pareto-optimal solutions. Using Monte Carlo simulations, sample size and predictor combinations were varied and cross-validated Pareto-optimal solutions were obtained. Although diversity shrinkage was sizable for a composite of cognitive and noncognitive predictors when sample size was at or below 500, diversity shrinkage was typically negligible for a composite of specific cognitive subtest predictors when sample size was at least 100. Diversity shrinkage was larger when the Pareto-optimal solution suggested substantial diversity improvement. When sample size was at least 100, cross-validated Pareto-optimal weights typically outperformed unit weights, suggesting that diversity improvement is often possible, despite diversity shrinkage. Implications for Pareto-optimal weighting, adverse impact, sample size of validation studies, and optimizing the diversity-job performance tradeoff are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Diversity comparison of Pareto front approximations in many-objective optimization.
Li, Miqing; Yang, Shengxiang; Liu, Xiaohui
2014-12-01
Diversity assessment of Pareto front approximations is an important issue in the stochastic multiobjective optimization community. Most of the diversity indicators in the literature were designed to work in principle for any number of objectives, but in practice many of these indicators are infeasible or not workable when the number of objectives is large. In this paper, we propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimization. DCI evaluates the relative quality of different Pareto front approximations rather than providing an absolute measure of distribution for a single approximation. In DCI, all the concerned approximations are put into a grid environment so that there are some hyperboxes containing one or more solutions. The proposed indicator only considers the contribution of different approximations to nonempty hyperboxes. Therefore, the computational cost does not increase exponentially with the number of objectives. In fact, the implementation of DCI is of quadratic time complexity, which is fully independent of the number of divisions used in the grid. Systematic experiments are conducted using three groups of artificial Pareto front approximations and seven groups of real Pareto front approximations with different numbers of objectives to verify the effectiveness of DCI. Moreover, a comparison with two diversity indicators used widely in many-objective optimization is made analytically and empirically. Finally, a parametric investigation reveals interesting insights into the division number of the grid and also offers suggested settings for users with different preferences.
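The grid mechanism behind DCI can be sketched as follows: each objective vector is mapped to a hyperbox index tuple, and approximation sets are then compared only on the nonempty hyperboxes, so cost grows with the number of solutions rather than exponentially with the number of objectives. Bounds and division count below are assumptions of the sketch, not values from the paper:

```python
def hyperbox_ids(points, lower, upper, divisions):
    """Map each objective vector to the index tuple of its grid hyperbox.
    `lower`/`upper` bound each objective; `divisions` is the per-axis
    grid resolution. Returns the set of nonempty hyperbox indices."""
    boxes = set()
    for p in points:
        idx = tuple(min(divisions - 1,
                        int((x - lo) / (hi - lo) * divisions))
                    for x, lo, hi in zip(p, lower, upper))
        boxes.add(idx)
    return boxes
```

A DCI-style comparison would then score each approximation set by its contribution to the union of nonempty boxes across all compared sets.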
Pareto Efficient Solutions of Attack-Defence Trees
DEFF Research Database (Denmark)
Aslanyan, Zaruhi; Nielson, Flemming
2015-01-01
Attack-defence trees are a promising approach for representing threat scenarios and possible countermeasures in a concise and intuitive manner. An attack-defence tree describes the interaction between an attacker and a defender, and is evaluated by assigning parameters to the nodes, such as the probability or cost of attacks and defences. In the case of multiple parameters, most analytical methods optimise one parameter at a time, e.g., minimise cost or maximise probability of an attack. Such methods may lead to sub-optimal solutions when optimising conflicting parameters, e.g., minimising cost while maximising probability. In order to tackle this challenge, we devise automated techniques that optimise all parameters at once. Moreover, in the case of conflicting parameters our techniques compute the set of all optimal solutions, defined in terms of Pareto efficiency. The developments are carried out...
Reddy, P.V.; Engwerda, J.C.
2011-01-01
In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for infinite horizon cooperative differential games. We consider games defined by non autonomous and discounted autonomous systems. The obtained results are used to analyze the regular
Approximating the Pareto Set of Multiobjective Linear Programs via Robust Optimization
Gorissen, B.L.; den Hertog, D.
2012-01-01
Abstract: The Pareto set of a multiobjective optimization problem consists of the solutions for which one or more objectives cannot be improved without deteriorating one or more other objectives. We consider problems with linear objectives and linear constraints and use Adjustable Robust
COMPROMISE, OPTIMAL AND TRACTIONAL ACCOUNTS ON PARETO SET
Directory of Open Access Journals (Sweden)
V. V. Lahuta
2010-11-01
Full Text Available The problem of optimum traction calculations is considered as a problem of optimum distribution of a resource. The dynamic programming solution is based on a step-by-step calculation of the set of Pareto-optimal values of a criterion function (energy expenses) and a resource (time).
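The step-by-step calculation described above can be sketched as a dynamic program that, after each stage, prunes the cumulative (energy, time) labels down to the Pareto-optimal ones. The stage increments below are invented for illustration:

```python
def dp_pareto(stages):
    """Stage-wise DP: `stages` is a list of lists of (energy, time)
    increments, one list of options per step. After each stage, keep only
    the Pareto-optimal cumulative (energy, time) labels (both minimized)."""
    labels = {(0.0, 0.0)}
    for options in stages:
        expanded = {(e + de, t + dt) for e, t in labels for de, dt in options}
        labels = {p for p in expanded
                  if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                             for q in expanded)}
    return sorted(labels)
```

Pruning dominated labels at every stage keeps the label sets small while preserving all energy-time trade-offs reachable at the final step.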
Pareto-Ranking Based Quantum-Behaved Particle Swarm Optimization for Multiobjective Optimization
Directory of Open Access Journals (Sweden)
Na Tian
2015-01-01
Full Text Available A study on Pareto-ranking based quantum-behaved particle swarm optimization (QPSO) for multiobjective optimization problems is presented in this paper. During the iterations, an external repository is maintained to remember the nondominated solutions, from which the global best position is chosen. A comparison between different elitist selection strategies (preference order, sigma value, and random selection) is performed on four benchmark functions and two metrics. The results demonstrate that QPSO with preference order performs comparably to QPSO with sigma value, depending on the number of objectives. Finally, QPSO with sigma value is applied to solve multiobjective flexible job-shop scheduling problems.
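The external repository of nondominated solutions maintained during the QPSO iterations can be sketched as a simple archive update: a candidate is rejected if some archive member dominates it, and otherwise it replaces any members it dominates. Objectives are assumed minimized and solutions represented as tuples of objective values:

```python
def update_archive(archive, candidate):
    """Insert `candidate` into the nondominated archive: reject it if some
    member dominates it, otherwise drop the members it dominates."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, and not identical
        return all(x <= y for x, y in zip(a, b)) and a != b

    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

The global best position would then be drawn from this archive by whichever elitist strategy (preference order, sigma value, or random selection) is in use.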
Craft, David
2010-10-01
A discrete set of points and their convex combinations can serve as a sparse representation of the Pareto surface in multiple objective convex optimization. We develop a method to evaluate the quality of such a representation, and show by example that in multiple objective radiotherapy planning, the number of Pareto optimal solutions needed to represent Pareto surfaces of up to five dimensions grows at most linearly with the number of objectives. The method described is also applicable to the representation of convex sets. Copyright © 2009 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Kantian Optimization, Social Ethos, and Pareto Efficiency
John E. Roemer
2012-01-01
Although evidence accrues in biology, anthropology and experimental economics that homo sapiens is a cooperative species, the reigning assumption in economic theory is that individuals optimize in an autarkic manner (as in Nash and Walrasian equilibrium). I here postulate an interdependent kind of optimizing behavior, called Kantian. It is shown that in simple economic models, when there are negative externalities (such as congestion effects from use of a commonly owned resource) or positive ...
Pareto-Optimal Multi-objective Inversion of Geophysical Data
Schnaidt, Sebastian; Conway, Dennis; Krieger, Lars; Heinson, Graham
2018-01-01
In the process of modelling geophysical properties, jointly inverting different data sets can greatly improve model results, provided that the data sets are compatible, i.e., sensitive to similar features. Such a joint inversion requires a relationship between the different data sets, which can either be analytic or structural. Classically, the joint problem is expressed as a scalar objective function that combines the misfit functions of multiple data sets and a joint term which accounts for the assumed connection between the data sets. This approach suffers from two major disadvantages: first, it can be difficult to assess the compatibility of the data sets and second, the aggregation of misfit terms introduces a weighting of the data sets. We present a Pareto-optimal multi-objective joint inversion approach based on an existing genetic algorithm. The algorithm treats each data set as a separate objective, avoiding forced weighting and generating curves of the trade-off between the different objectives. These curves are analysed by their shape and evolution to evaluate data set compatibility. Furthermore, the statistical analysis of the generated solution population provides valuable estimates of model uncertainty.
Reddy, P.V.; Engwerda, J.C.
2010-01-01
In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for an N player cooperative infinite horizon differential game. Firstly, we write the problem of finding Pareto candidates as solving N constrained optimal control subproblems. We derive some
Directory of Open Access Journals (Sweden)
Lina Yang
2018-02-01
Full Text Available Land-use allocation is of great significance in urban development. This type of allocation is usually considered to be a complex multi-objective spatial optimization problem, whose optimized result is a set of Pareto-optimal solutions (the Pareto front) reflecting different tradeoffs among several objectives. However, obtaining a Pareto front is a challenging task, and the Pareto fronts obtained by state-of-the-art algorithms are still not sufficient. To achieve better Pareto solutions, taking the grid-representative land-use allocation problem with two objectives as an example, an artificial bee colony optimization algorithm for multi-objective land-use allocation (ABC-MOLA) is proposed. In this algorithm, the traditional ABC's search direction guiding scheme and solution maintaining process are modified. In addition, a knowledge-informed neighborhood search strategy, which utilizes the auxiliary knowledge of natural geography and spatial structures to facilitate the neighborhood spatial search around each solution, is developed to further improve the Pareto front's quality. A series of comparison experiments (a simulated experiment with a small data volume and a real-world data experiment for a large area) shows that all the Pareto fronts obtained by ABC-MOLA totally dominate the Pareto fronts of the other algorithms, which demonstrates ABC-MOLA's effectiveness in achieving Pareto fronts of high quality.
Pareto-Optimal Model Selection via SPRINT-Race.
Zhang, Tiantian; Georgiopoulos, Michael; Anagnostopoulos, Georgios C
2018-02-01
In machine learning, the notion of multi-objective model selection (MOMS) refers to the problem of identifying the set of Pareto-optimal models that optimize more than one predefined objective simultaneously by compromising among them. This paper introduces SPRINT-Race, the first multi-objective racing algorithm in a fixed-confidence setting, which is based on the sequential probability ratio test with an indifference zone. SPRINT-Race addresses the problem of MOMS with multiple stochastic optimization objectives in the proper Pareto-optimality sense. In SPRINT-Race, a pairwise dominance or non-dominance relationship is statistically inferred via a non-parametric, ternary-decision, dual-sequential probability ratio test. The overall probability of falsely eliminating any Pareto-optimal models or mistakenly returning any clearly dominated models is strictly controlled by a sequential Holm's step-down family-wise error rate control method. As a fixed-confidence model selection algorithm, the objective of SPRINT-Race is to minimize the computational effort required to achieve a prescribed confidence level about the quality of the returned models. The performance of SPRINT-Race is first examined via an artificially constructed MOMS problem with known ground truth. Subsequently, SPRINT-Race is applied to two real-world applications: 1) hybrid recommender system design and 2) multi-criteria stock selection. The experimental results verify that SPRINT-Race is an effective and efficient tool for such MOMS problems. The code of SPRINT-Race is available at https://github.com/watera427/SPRINT-Race.
The application of analytical methods to the study of Pareto - optimal control systems
Directory of Open Access Journals (Sweden)
I. K. Romanova
2014-01-01
Full Text Available The subject of this article is methods of multicriteria optimization and their application to the parametric synthesis of dual-loop control systems under conflicting individual criteria. The basis for solving multicriteria problems is the fundamental principle of multicriteria choice: the Edgeworth-Pareto principle. Obtaining Pareto-optimal variants, which arise because the individual criteria conflict, does not amount to reaching a final decision; the set of such variants is only offered to the designer (the decision maker). An important issue with traditional numerical methods is their computational cost. An example is the use of methods that probe the parameter space, including those based on uniform grids and uniformly distributed sequences. Computer methods for approximating the bounds of the Pareto set are likewise computationally demanding. The purpose of this work is the development of fairly simple methods for finding Pareto-optimal solutions in the case of criteria given in analytical form. The proposed solution is based on a study of the properties of the analytical dependences of the criteria. It covers a case not treated so far in the literature, namely a problem topology in which the indifference curves (level lines) do not touch. It is shown that such problems may still admit compromise solutions. The method uses the angular position of the antigradient to the indifference curves in the parameter space relative to the coordinate axes. Propositions are formulated on the comonotonicity and contramonotonicity characteristics and on the angular characteristics of the antigradient for determining Pareto-optimal solutions. A general algorithm of the calculation is considered: determine the admissible region of parameter values; investigate the comonotonicity and contramonotonicity properties; build the indifference (equal-level) curves; determine the type of touching: one-sided (the problem is not strictly multicriteria) or two-sided (the problem relates to the Pareto
Pareto-optimal multi-objective design of airplane control systems
Schy, A. A.; Johnson, K. G.; Giesy, D. P.
1980-01-01
A constrained minimization algorithm for the computer aided design of airplane control systems to meet many requirements over a set of flight conditions is generalized using the concept of Pareto-optimization. The new algorithm yields solutions on the boundary of the achievable domain in objective space in a single run, whereas the older method required a sequence of runs to approximate such a limiting solution. However, Pareto-optimality does not guarantee a satisfactory design, since such solutions may emphasize some objectives at the expense of others. The designer must still interact with the program to obtain a well-balanced set of objectives. Using the example of a fighter lateral stability augmentation system (SAS) design over five flight conditions, several effective techniques are developed for obtaining well-balanced Pareto-optimal solutions. For comparison, one of these techniques is also used in a recently developed algorithm of Kreisselmeier and Steinhauser, which replaces the hard constraints with soft constraints, using a special penalty function. It is shown that comparable results can be obtained.
Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan
2017-07-01
Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.
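The multi-objective step described above amounts to keeping only the non-dominated (MRE, MCE) pairs. A minimal sketch of that dominance filter, assuming both objectives are minimized (illustrative code only, not the paper's NSGA-based implementation; the objective values are hypothetical):

```python
def pareto_front(points):
    """Return the non-dominated subset of (mre, mce) objective pairs.

    A point is dominated if some other point is no worse in both
    objectives and not identical to it (both objectives minimized).
    """
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(p)
    return front
```

NSGA-II additionally sorts dominated points into successive fronts and applies crowding distance; this sketch extracts only the first front.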
Optimal PMU Placement with Uncertainty Using Pareto Method
Directory of Open Access Journals (Sweden)
A. Ketabi
2012-01-01
Full Text Available This paper proposes a method for optimal placement of Phasor Measurement Units (PMUs) in state estimation considering uncertainty. State estimation has first been turned into an optimization exercise in which the objective function is selected to be the number of unobservable buses, which is determined based on Singular Value Decomposition (SVD). For the normal condition, the Differential Evolution (DE) algorithm is used to find the optimal placement of PMUs. By considering uncertainty, a multiobjective optimization exercise is hence formulated. To achieve this, a DE algorithm based on the Pareto optimum method has been proposed here. The suggested strategy is applied to the IEEE 30-bus test system in several case studies to evaluate the optimal PMU placement.
Evolutionary tradeoffs, Pareto optimality and the morphology of ammonite shells.
Tendler, Avichai; Mayo, Avraham; Alon, Uri
2015-03-07
Organisms that need to perform multiple tasks face a fundamental tradeoff: no design can be optimal at all tasks at once. Recent theory based on Pareto optimality showed that such tradeoffs lead to a highly defined range of phenotypes, which lie in low-dimensional polyhedra in the space of traits. The vertices of these polyhedra are called archetypes: the phenotypes that are optimal at a single task. To rigorously test this theory requires measurements of thousands of species over hundreds of millions of years of evolution. Ammonoid fossil shells provide an excellent model system for this purpose. Ammonoids have a well-defined geometry that can be parameterized using three dimensionless features of their logarithmic-spiral-shaped shells. Their evolutionary history includes repeated mass extinctions. We find that ammonoids fill out a pyramid in morphospace, suggesting five specific tasks - one for each vertex of the pyramid. After mass extinctions, surviving species evolve to refill essentially the same pyramid, suggesting that the tasks are unchanging. We infer putative tasks for each archetype, related to economy of shell material, rapid shell growth, hydrodynamics and compactness. These results support Pareto optimality theory as an approach to study evolutionary tradeoffs, and demonstrate how this approach can be used to infer the putative tasks that may shape the natural selection of phenotypes.
International Nuclear Information System (INIS)
Gollub, C; De Vivie-Riedle, R
2009-01-01
A multi-objective genetic algorithm is applied to optimize picosecond laser fields, driving vibrational quantum processes. Our examples are state-to-state transitions and unitary transformations. The approach allows features of the shaped laser fields and of the excitation mechanisms to be controlled simultaneously with the quantum yield. Within the parameter range accessible to the experiment, we focus on short pulse durations and low pulse energies to optimize preferably robust laser fields. Multidimensional Pareto fronts for these conflicting objectives could be constructed. Comparison with previous work showed that the solutions from Pareto optimizations and from optimal control theory match very well.
Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah
2017-04-20
This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the positions exploration of the potential best host nests (solutions), and dynamic discovery rate to manage the fraction probability of finding the best host nests in 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto optimum trade-offs are done to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors, in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.
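The Roulette wheel selection operator mentioned above picks individuals with probability proportional to their fitness. A minimal sketch, assuming non-negative fitness values (illustrative only, not the authors' MCS code):

```python
import random

def roulette_wheel_select(fitnesses, rng=random):
    """Return an index chosen with probability proportional to its
    fitness value (fitnesses must be non-negative, not all zero)."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)
    acc = 0.0
    for i, f in enumerate(fitnesses):
        acc += f
        if r <= acc:
            return i
    return len(fitnesses) - 1  # guard against floating-point round-off
```

In the MCS context, the selected indices would identify the host nests carried into the next generation.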
Pareto optimization of an industrial ecosystem: sustainability maximization
Directory of Open Access Journals (Sweden)
J. G. M.-S. Monteiro
2010-09-01
Full Text Available This work investigates a procedure to design an Industrial Ecosystem for sequestering CO2 and consuming glycerol in a Chemical Complex with 15 integrated processes. The Complex is responsible for the production of methanol, ethylene oxide, ammonia, urea, dimethyl carbonate, ethylene glycol, glycerol carbonate, β-carotene, 1,2-propanediol and olefins, and is simulated using UNISIM Design (Honeywell). The process environmental impact (EI) is calculated using the Waste Reduction Algorithm, while Profit (P) is estimated using classic cost correlations. MATLAB (The Mathworks Inc) is connected to UNISIM to enable optimization. The objective is granting maximum process sustainability, which involves finding a compromise between high profitability and low environmental impact. Sustainability maximization is therefore understood as a multi-criteria optimization problem, addressed by means of the Pareto optimization methodology for trading off P vs. EI.
Wismans, Luc Johannes Josephus; van Berkum, Eric C.; Bliemer, Michiel; Allkim, T.P.; van Arem, Bart
2010-01-01
Multi objective optimization of externalities of traffic is performed solving a network design problem in which Dynamic Traffic Management measures are used. The resulting Pareto optimal set is determined by employing the SPEA2+ evolutionary algorithm.
Directory of Open Access Journals (Sweden)
Yan Sun
2015-09-01
Full Text Available Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem that aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network. The optimization is performed from two viewpoints: cost and time. Design/methodology/approach: In this study, a bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. Minimizing the total transportation cost and the total transportation time are set as the optimization objectives of the model. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by gaining its Pareto frontier. The Pareto frontier of the model can provide the multi-modal transportation operator (MTO) and customers with better decision support and it is gained by the normalized normal constraint method. Then, an experimental case study is designed to verify the feasibility of the model and Pareto optimality by using the mathematical programming software Lingo. Finally, the sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality have good performance in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of the variation of the demand and supply on the multi-modal transportation organization. Therefore, this method can be further promoted to practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. The Pareto frontier based sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case.
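For a finite set of candidate routes, the Pareto frontier over (cost, time) can be extracted with a single sort. This is far simpler than the normalized normal constraint method used in the paper, but it illustrates the trade-off set the MTO would be offered; the route values below are hypothetical:

```python
def pareto_frontier(routes):
    """Given (cost, time) pairs for candidate routes (both minimized),
    return the Pareto frontier sorted by increasing cost."""
    best_time = float("inf")
    frontier = []
    for cost, time in sorted(routes):
        if time < best_time:  # strictly faster than every cheaper route
            frontier.append((cost, time))
            best_time = time
    return frontier
```

Each retained route is a distinct cost/time compromise; no route on the frontier is both cheaper and faster than another.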
Pareto-Optimal Estimates of California Precipitation Change
Langenbrunner, Baird; Neelin, J. David
2017-12-01
In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Climate Model Intercomparison Project phase 5 ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.
Directory of Open Access Journals (Sweden)
Vimal Savsani
2017-01-01
Full Text Available Most of the modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however the search techniques of other recently developed metaheuristics are emerging topics among researchers. This paper proposes a novel multiobjective optimization algorithm named the multiobjective heat transfer search (MOHTS) algorithm, which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs the elitist nondominated sorting and crowding distance approach of the elitist based nondominated sorting genetic algorithm-II (NSGA-II) for obtaining different nondomination levels and to preserve the diversity among the optimal set of solutions, respectively. The capability of MOHTS in yielding a Pareto front as close as possible to the true Pareto front has been tested on the multiobjective optimization problem of vehicle suspension design, which has a set of five second-order linear ordinary differential equations. A half car passive ride model with two different sets of five objectives is employed for optimizing the suspension parameters using MOHTS and NSGA-II. The optimization studies demonstrate that MOHTS achieves a better nondominated Pareto front with a widespread (diverse) set of optimal solutions as compared to NSGA-II, and further the comparison of the extreme points of the obtained Pareto front reveals the dominance of MOHTS over NSGA-II, the multiobjective uniform diversity genetic algorithm (MUGA), and a combined PSO-GA based MOEA.
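The crowding distance that MOHTS borrows from NSGA-II to preserve diversity can be sketched as follows; boundary solutions in each objective receive infinite distance so they are always retained (an illustrative sketch, not the authors' code):

```python
def crowding_distance(front):
    """NSGA-II style crowding distance for a list of objective vectors.

    For each objective, solutions are sorted, the extremes get infinite
    distance, and interior solutions accumulate the normalized gap
    between their two neighbors.
    """
    n = len(front)
    dist = [0.0] * n
    if n == 0:
        return dist
    m = len(front[0])
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for pos in range(1, n - 1):
            i = order[pos]
            dist[i] += (front[order[pos + 1]][k] - front[order[pos - 1]][k]) / (hi - lo)
    return dist
```

Solutions with larger crowding distance lie in sparser regions of the front and are preferred when truncating the population.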
Strength Pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization.
Elhossini, Ahmed; Areibi, Shawki; Dony, Robert
2010-01-01
This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.
Kyroudi, Archonteia; Petersson, Kristoffer; Ghandour, Sarah; Pachoud, Marc; Matzinger, Oscar; Ozsahin, Mahmut; Bourhis, Jean; Bochud, François; Moeckli, Raphaël
2016-08-01
Multi-criteria optimization provides decision makers with a range of clinical choices through Pareto plans that can be explored during real time navigation and then converted into deliverable plans. Our study shows that dosimetric differences can arise between the two steps, which could compromise the clinical choices made during navigation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
A Pareto-based multi-objective optimization algorithm to design energy-efficient shading devices
International Nuclear Information System (INIS)
Khoroshiltseva, Marina; Slanzi, Debora; Poli, Irene
2016-01-01
Highlights: • We present a multi-objective optimization algorithm for shading design. • We combine Harmony search and Pareto-based procedures. • Thermal and daylighting performances of external shading were considered. • We applied the optimization process to a residential social housing in Madrid. - Abstract: In this paper we address the problem of designing new energy-efficient static daylight devices that will surround the external windows of a residential building in Madrid. Shading devices can in fact largely influence solar gains in a building and improve thermal and lighting comforts by selectively intercepting the solar radiation and by reducing the undesirable glare. A proper shading device can therefore significantly increase the thermal performance of a building by reducing its energy demand in different climate conditions. In order to identify the set of optimal shading devices that allow a low energy consumption of the dwelling while maintaining high levels of thermal and lighting comfort for the inhabitants we derive a multi-objective optimization methodology based on Harmony Search and Pareto front approaches. The results show that the multi-objective approach here proposed is an effective procedure in designing energy efficient shading devices when a large set of conflicting objectives characterizes the performance of the proposed solutions.
Pareto-optimal estimates that constrain mean California precipitation change
Langenbrunner, B.; Neelin, J. D.
2017-12-01
Global climate model (GCM) projections of greenhouse gas-induced precipitation change can exhibit notable uncertainty at the regional scale, particularly in regions where the mean change is small compared to internal variability. This is especially true for California, which is located in a transition zone between robust precipitation increases to the north and decreases to the south, and where GCMs from the Climate Model Intercomparison Project phase 5 (CMIP5) archive show no consensus on mean change (in either magnitude or sign) across the central and southern parts of the state. With the goal of constraining this uncertainty, we apply a multiobjective approach to a large set of subensembles (subsets of models from the full CMIP5 ensemble). These constraints are based on subensemble performance in three fields important to California precipitation: tropical Pacific sea surface temperatures, upper-level zonal winds in the midlatitude Pacific, and precipitation over the state. An evolutionary algorithm is used to sort through and identify the set of Pareto-optimal subensembles across these three measures in the historical climatology, and we use this information to constrain end-of-century California wet season precipitation change. This technique narrows the range of projections throughout the state and increases confidence in estimates of positive mean change. Furthermore, these methods complement and generalize emergent constraint approaches that aim to restrict uncertainty in end-of-century projections, and they have applications to even broader aspects of uncertainty quantification, including parameter sensitivity and model calibration.
Xu, Chuanpei; Niu, Junhao; Ling, Jing; Wang, Suyan
2018-03-01
In this paper, we present a parallel test strategy for bandwidth division multiplexing under the test access mechanism bandwidth constraint. The Pareto solution set is combined with a cloud evolutionary algorithm to optimize the test time and power consumption of a three-dimensional network-on-chip (3D NoC). In the proposed method, all individuals in the population are sorted in non-dominated order and allocated to the corresponding level. Individuals with extreme and similar characteristics are then removed. To increase the diversity of the population and prevent the algorithm from becoming stuck around local optima, a competition strategy is designed for the individuals. Finally, we adopt an elite reservation strategy and update the individuals according to the cloud model. Experimental results show that the proposed algorithm converges to the optimal Pareto solution set rapidly and accurately. This not only obtains the shortest test time, but also optimizes the power consumption of the 3D NoC.
Mukhopadhyay, Somparna; Hazra, Lakshminarayan
2015-11-01
Resolution capability of an optical imaging system can be enhanced by reducing the width of the central lobe of the point spread function. Attempts to achieve the same by pupil plane filtering give rise to a concomitant increase in sidelobe intensity. The mutual exclusivity between these two objectives may be considered as a multiobjective optimization problem that does not have a unique solution; rather, a class of trade-off solutions called Pareto optimal solutions may be generated. Pareto fronts in the synthesis of lossless phase-only pupil plane filters to achieve superresolution with prespecified lower limits for the Strehl ratio are explored by using the particle swarm optimization technique.
A Regionalization Approach to select the final watershed parameter set among the Pareto solutions
Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.
2017-12-01
The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists within neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Non-dominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude some of the parameter sets that show extreme values for any of the objective functions to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity for a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are useful to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter sets that minimize the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
He, Lu; Friedman, Alan M.; Bailey-Kellogg, Chris
2016-01-01
In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability vs. novelty, affinity vs. specificity, activity vs. immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not “dominated”; i.e., no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), in order to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, PEPFR (Protein Engineering Pareto FRontier), that hierarchically subdivides the objective space, employing appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. PMID:22180081
A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction
Danandeh Mehr, Ali; Kahya, Ercan
2017-06-01
Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP and, eventually, the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model puts forward a parsimonious solution, which is of noteworthy importance for application in practice. In addition, the approach allows the user to enter human insight into the problem to examine evolved models and pick the best performing programs out for further analysis.
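The data pre-processing ingredient is a plain moving average filter. A minimal sketch with an assumed window size (the paper's actual window choice is not stated here):

```python
def moving_average(series, window=3):
    """Smooth a streamflow series with a simple moving average filter.

    Returns len(series) - window + 1 values, each the mean of one
    contiguous window of observations.
    """
    if window < 1 or window > len(series):
        raise ValueError("window must be in [1, len(series)]")
    return [
        sum(series[i:i + window]) / window
        for i in range(len(series) - window + 1)
    ]
```

The smoothed series, rather than the raw record, would then feed the multigene GP identification step.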
Giller, C A
2011-12-01
The use of conformity indices to optimize Gamma Knife planning is common, but does not address important tradeoffs between dose to tumor and normal tissue. Pareto analysis has been used for this purpose in other applications, but not for Gamma Knife (GK) planning. The goal of this work is to use computer models to show that Pareto analysis may be feasible for GK planning to identify dosimetric tradeoffs. We define a GK plan A to be Pareto dominant to B if the prescription isodose volume of A covers more tumor but not more normal tissue than B, or if A covers less normal tissue but not less tumor than B. A plan is Pareto optimal if it is not dominated by any other plan. Two different Pareto optimal plans represent different tradeoffs between dose to tumor and normal tissue, because neither plan dominates the other. 'GK simulator' software calculated dose distributions for GK plans, and was called repetitively by a genetic algorithm to calculate Pareto dominant plans. Three irregular tumor shapes were tested in 17 trials using various combinations of shots. The mean number of Pareto dominant plans/trial was 59 ± 17 (sd). Different planning strategies were identified by large differences in shot positions, and 70 of the 153 coordinate plots (46%) showed differences of 5 mm or more. The Pareto dominant plans dominated other nearby plans. Pareto dominant plans represent different dosimetric tradeoffs and can be systematically calculated using genetic algorithms. Automatic identification of non-intuitive planning strategies may be feasible with these methods.
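The dominance rule defined in this abstract translates directly into code. A minimal sketch, where each plan is summarized by a hypothetical (tumor_covered, normal_tissue_covered) pair from its prescription isodose volume (tumor coverage maximized, normal tissue coverage minimized):

```python
def dominates(plan_a, plan_b):
    """Pareto dominance between two GK plans per the abstract's rule:
    A dominates B if A covers at least as much tumor and no more
    normal tissue, with strict improvement in at least one of the two."""
    tumor_a, normal_a = plan_a
    tumor_b, normal_b = plan_b
    return (tumor_a >= tumor_b and normal_a <= normal_b) and \
           (tumor_a > tumor_b or normal_a < normal_b)
```

A plan is then Pareto optimal when no plan in the candidate pool dominates it, which is the filter the genetic algorithm applies repeatedly.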
Saborido, Rubén; Ruiz, Ana B; Luque, Mariano
2017-01-01
In this article, we propose a new evolutionary algorithm for multiobjective optimization called Global WASF-GA (global weighting achievement scalarizing function genetic algorithm), which falls within the aggregation-based evolutionary algorithms. The main purpose of Global WASF-GA is to approximate the whole Pareto optimal front. Its fitness function is defined by an achievement scalarizing function (ASF) based on the Tchebychev distance, in which two reference points are considered (both utopian and nadir objective vectors) and the weight vector used is taken from a set of weight vectors whose inverses are well-distributed. At each iteration, all individuals are classified into different fronts. Each front is formed by the solutions with the lowest values of the ASF for the different weight vectors in the set, using the utopian vector and the nadir vector as reference points simultaneously. Varying the weight vector in the ASF while considering the utopian and the nadir vectors at the same time enables the algorithm to obtain a final set of nondominated solutions that approximate the whole Pareto optimal front. We compared Global WASF-GA to MOEA/D (different versions) and NSGA-II on two-, three-, and five-objective problems. The computational results obtained permit us to conclude that Global WASF-GA achieves better performance, regarding the hypervolume metric and the epsilon indicator, than the other two algorithms in many cases, especially in three- and five-objective problems.
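The fitness in Global WASF-GA is an achievement scalarizing function based on the weighted Tchebychev distance to a reference point (utopian or nadir). A minimal sketch of one ASF evaluation, omitting the weight-vector set and the front classification (illustrative, not the authors' implementation):

```python
def asf(objectives, reference, weights):
    """Weighted Tchebychev achievement scalarizing function
    (minimization): the largest weighted deviation of the objective
    vector from the reference point."""
    return max(w * (f - r) for f, r, w in zip(objectives, reference, weights))
```

In the algorithm, each solution would be scored twice per weight vector, once against the utopian vector and once against the nadir vector, and fronts are built from the lowest ASF values.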
Approximating the Pareto set of multiobjective linear programs via robust optimization
Gorissen, B.L.; den Hertog, D.
2012-01-01
We consider problems with multiple linear objectives and linear constraints and use adjustable robust optimization and polynomial optimization as tools to approximate the Pareto set with polynomials of arbitrarily large degree. The main difference with existing techniques is that we optimize a
Pardo-Montero, Juan; Fenwick, John D
2010-06-01
The purpose of this work is twofold: To further develop an approach to multiobjective optimization of rotational therapy treatments recently introduced by the authors [J. Pardo-Montero and J. D. Fenwick, "An approach to multiobjective optimization of rotational therapy," Med. Phys. 36, 3292-3303 (2009)], especially regarding its application to realistic geometries, and to study the quality (Pareto optimality) of plans obtained using such an approach by comparing them with Pareto optimal plans obtained through inverse planning. In the previous work of the authors, a methodology is proposed for constructing a large number of plans, with different compromises between the objectives involved, from a small number of geometrically based arcs, each arc prioritizing different objectives. Here, this method has been further developed and studied. Two different techniques for constructing these arcs are investigated, one based on image-reconstruction algorithms and the other based on more common gradient-descent algorithms. The difficulty of dealing with organs abutting the target, briefly reported in previous work of the authors, has been investigated using partial OAR unblocking. Optimality of the solutions has been investigated by comparison with a Pareto front obtained from inverse planning. A relative Euclidean distance has been used to measure the distance of these plans to the Pareto front, and dose volume histogram comparisons have been used to gauge the clinical impact of these distances. A prostate geometry has been used for the study. For geometries where a blocked OAR abuts the target, moderate OAR unblocking can substantially improve target dose distribution and minimize hot spots while not overly compromising dose sparing of the organ. Image-reconstruction type and gradient-descent blocked-arc computations generate similar results. The Pareto front for the prostate geometry, reconstructed using a large number of inverse plans, presents a hockey-stick shape
Jiang, Shouyong; Yang, Shengxiang
2016-02-01
The multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been shown to be very efficient in solving multiobjective optimization problems (MOPs). In practice, the Pareto-optimal front (POF) of many MOPs has complex characteristics. For example, the POF may have a long tail, a sharp peak, or disconnected regions, which significantly degrade the performance of MOEA/D. This paper proposes an improved MOEA/D for handling such complex problems. In the proposed algorithm, a two-phase strategy (TP) is employed to divide the whole optimization procedure into two phases. Based on the crowdedness of solutions found in the first phase, the algorithm decides whether or not to dedicate computational resources to handling unsolved subproblems in the second phase. In addition, a new niche scheme is introduced into the improved MOEA/D to guide the selection of mating parents and avoid producing duplicate solutions, which is very helpful for maintaining population diversity when the POF of the MOP being optimized is discontinuous. The performance of the proposed algorithm is investigated on some existing benchmark MOPs and newly designed MOPs with complex POF shapes in comparison with several MOEA/D variants and other approaches. The experimental results show that the proposed algorithm produces promising performance on these complex problems.
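MOEA/D's decomposition step can be sketched with the standard Tchebycheff scalarization; the weight vectors, ideal point, and objective values below are illustrative, not taken from the paper:

```python
def tchebycheff(f, weights, z_star):
    # Tchebycheff scalarization: each weight vector turns the MOP into one
    # single-objective subproblem, minimising the weighted worst-case
    # deviation of the objective vector f from the ideal point z*.
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, z_star))

z_star = (0.0, 0.0)   # assumed ideal point
f = (0.4, 0.8)        # objective vector of one candidate solution
g1 = tchebycheff(f, (0.5, 0.5), z_star)  # balanced subproblem -> 0.4
g2 = tchebycheff(f, (0.9, 0.1), z_star)  # subproblem weighting objective 1 -> 0.36
```

Different weight vectors score the same candidate differently, which is exactly how the decomposition spreads the population along the POF.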
DEFF Research Database (Denmark)
Barmby, Tim; Smith, Nina
1996-01-01
This paper analyses the labour supply behaviour of households in Denmark and Britain. It employs models in which the preferences of individuals within the household are explicitly represented. The households are then assumed to decide on their labour supply in a Pareto-Optimal fashion. Describing...
A multiobjective optimization algorithm is applied to a groundwater quality management problem involving remediation by pump-and-treat (PAT). The multiobjective optimization framework uses the niched Pareto genetic algorithm (NPGA) and is applied to simultaneously minimize the...
Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives
International Nuclear Information System (INIS)
Warmflash, Aryeh; Siggia, Eric D; Francois, Paul
2012-01-01
The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input–output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria. (paper)
Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives.
Warmflash, Aryeh; Francois, Paul; Siggia, Eric D
2012-10-01
The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input-output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria.
Directory of Open Access Journals (Sweden)
Carlos Pozo
Full Text Available Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study
Pozo, Carlos; Guillén-Gosálbez, Gonzalo; Sorribas, Albert; Jiménez, Laureano
2012-01-01
Optimization models in metabolic engineering and systems biology typically focus on optimizing a single criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analyses using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail to capture the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes the
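The epsilon constraint idea can be illustrated on a toy bi-objective problem (the objective functions and search grid below are invented for illustration, not the GMA models of the paper): one objective is kept, the other is recast as a constraint whose bound eps is swept, and each sweep step yields one Pareto optimal alternative.

```python
def f1(x):  # first objective, kept as the objective to minimise
    return x * x

def f2(x):  # second objective, recast as the constraint f2(x) <= eps
    return (x - 2.0) ** 2

grid = [i / 100.0 for i in range(201)]        # candidate x in [0, 2]
pareto = []
for eps in (4.0, 2.0, 1.0, 0.5, 0.25):
    feasible = [x for x in grid if f2(x) <= eps]
    best = min(feasible, key=f1)              # solve one SOO problem per eps
    pareto.append((f1(best), f2(best)))
# Tightening eps trades f1 for f2, tracing points along the Pareto front.
```

Each iteration is an ordinary single-objective problem, which is what makes the method attractive when the underlying model is hard to optimize.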
TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification
International Nuclear Information System (INIS)
Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D
2014-01-01
Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposes an alternative data-driven approach that implicitly incorporates the practical limitations and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested in 2D and 3D, where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck among four competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup, with k being the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed superiority of tomotherapy plans for the lower head-and-neck and revealed a potential nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been developed.
TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification
Energy Technology Data Exchange (ETDEWEB)
Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D [UCLA Department of Radiation Oncology, Los Angeles, CA (United States)
2014-06-15
Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposes an alternative data-driven approach that implicitly incorporates the practical limitations and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested in 2D and 3D, where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck among four competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup, with k being the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed superiority of tomotherapy plans for the lower head-and-neck and revealed a potential nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been developed.
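Stripped of the hierarchical EPH machinery, the core elimination step is plain Pareto dominance filtering. A minimal sketch over made-up bi-objective "plan scores" (both objectives minimised; this is the generic idea, not the authors' implementation):

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective (minimised)
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_frontier(points):
    # Keep only the points not dominated by any other point.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

plans = [(1, 9), (3, 3), (2, 7), (4, 4), (2, 8)]
print(pareto_frontier(plans))   # [(1, 9), (3, 3), (2, 7)]
```

The brute-force version above is quadratic in the cohort size, which is precisely why a hierarchical partition-then-aggregate scheme pays off for large plan cohorts.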
A new mechanism for maintaining diversity of Pareto archive in multi-objective optimization
Czech Academy of Sciences Publication Activity Database
Hájek, J.; Szöllös, A.; Šístek, Jakub
2010-01-01
Vol. 41, 7-8 (2010), pp. 1031-1057 ISSN 0965-9978 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords: multi-objective optimization * micro-genetic algorithm * diversity * Pareto archive Subject RIV: BA - General Mathematics Impact factor: 1.004, year: 2010 http://www.sciencedirect.com/science/article/pii/S0965997810000451
Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.
Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric
2010-07-20
Accurate protein loop structure models are important to understand the functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, a consensus model-ranking approach that integrates multiple knowledge- or physics-based scoring functions. The procedure for identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.
Optimal Reinsurance Design for Pareto Optimum: From the Perspective of Multiple Reinsurers
Directory of Open Access Journals (Sweden)
Xing Rong
2016-01-01
Full Text Available This paper investigates optimal reinsurance strategies for an insurer which cedes the insured risk to multiple reinsurers. Assuming that the insurer and every reinsurer apply coherent risk measures, we derive the necessary and sufficient conditions for the reinsurance market to achieve Pareto optimum; that is, every ceded-loss function and the retention function are in the form of “multiple layers reinsurance.”
Filatovas, Ernestas; Podkopaev, Dmitry; Kurasova, Olga
2015-01-01
Interactive methods of multiobjective optimization repetitively derive Pareto optimal solutions based on the decision maker’s preference information and present the obtained solutions for his/her consideration. Some interactive methods save the obtained solutions into a solution pool and, at each iteration, allow the decision maker to consider any of the solutions obtained earlier. This feature contributes to the flexibility of exploring the Pareto optimal set and learning about the op...
He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris
2012-03-01
In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability versus novelty, affinity versus specificity, activity versus immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not "dominated"; that is, no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, Protein Engineering Pareto FRontier (PEPFR), that hierarchically subdivides the objective space, using appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. Copyright © 2011 Wiley Periodicals, Inc.
The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space
Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri
2015-01-01
When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes—phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass. PMID:26465336
The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.
Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri
2015-10-01
When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.
The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.
Directory of Open Access Journals (Sweden)
Pablo Szekely
2015-10-01
Full Text Available When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.
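Placing a species in the archetype coordinate system reduces to computing its distance to each triangle vertex in log-trait space. A minimal sketch with invented vertex coordinates (the real archetype positions come from the fitted triangle, not from these numbers):

```python
import math

# Hypothetical archetype coordinates in (log10 mass, log10 longevity);
# the names follow the paper's exemplars, the values are illustrative only.
archetypes = {"bat": (1.0, 1.3), "shrew": (0.5, 0.3), "whale": (4.9, 1.9)}

def archetype_distances(point):
    # Euclidean distance from one species to each archetype vertex.
    return {name: math.dist(point, v) for name, v in archetypes.items()}

species = (0.7, 0.9)            # a made-up species in log-trait space
d = archetype_distances(species)
nearest = min(d, key=d.get)     # specialist closest to this archetype
```

Species near a vertex read as specialists in that archetype's strategy, while roughly equal distances to all vertices mark a generalist in the middle of the triangle.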
Zalazinsky, A. G.; Kryuchkov, D. I.; Nesterenko, A. V.; Titov, V. G.
2017-12-01
The results of an experimental study of the mechanical properties of pressed and sintered briquettes are presented; the briquettes consist of powders obtained from a high-strength VT-22 titanium alloy by plasma spraying, with additives of PTM-1 titanium powder obtained by the hydride-calcium method and of PV-N70Yu30 nickel-aluminum alloy powder. The task addressed is the choice of an optimal charge composition of the composite material that provides the required mechanical characteristics and cost of semi-finished products and items. Pareto optimal values for the composition of the composite material charge have been obtained.
International Nuclear Information System (INIS)
Zio, E.; Bazzo, R.
2010-01-01
In this paper, a framework is developed for identifying a limited number of representative solutions of a multiobjective optimization problem concerning the inspection intervals of the components of a safety system of a nuclear power plant. Pareto Front solutions are first clustered into 'families', which are then synthetically represented by a 'head of the family' solution. Three clustering methods are analyzed. Level Diagrams are then used to represent, analyse and interpret the Pareto Fronts reduced to their head-of-the-family solutions. Two decision situations are considered, without and with decision maker preferences; the latter implies the introduction of a scoring system to rank the solutions with respect to the different objectives, and a fuzzy preference assignment is employed to this purpose. The results of the application of the framework of analysis to the problem of optimizing the inspection intervals of a nuclear power plant safety system show that the clustering-based reduction maintains the Pareto Front shape and relevant characteristics, while making it easier for the decision maker to select the final solution.
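The reduction idea can be sketched with a tiny hand-rolled k-means (not one of the three clustering methods analysed in the paper, and with an invented toy front): cluster the Pareto points, then keep per cluster the member closest to its centroid as the 'head of the family'.

```python
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    centroids = points[:k]                     # naive initialisation
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[j].append(p)
        centroids = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centroids[j]
                     for j, cl in enumerate(clusters)]
    return centroids, clusters

def heads_of_families(front, k):
    # Representative per family: the actual Pareto point nearest its centroid.
    centroids, clusters = kmeans(sorted(front), k)
    return [min(cl, key=lambda p: dist2(p, c))
            for c, cl in zip(centroids, clusters) if cl]

front = [(0.0, 1.0), (0.1, 0.9), (0.5, 0.5), (0.6, 0.45), (1.0, 0.0)]
heads = heads_of_families(front, 3)
```

Because the representatives are members of the front rather than synthetic centroids, every head of a family is itself a feasible, non-dominated solution the decision maker can actually pick.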
Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph
2015-01-01
Multivariate biomarkers that can predict the effectiveness of targeted therapy in individual patients are highly desired. Previous biomarker discovery studies have largely focused on the identification of single biomarker signatures, aimed at maximizing prediction accuracy. Here, we present a different approach that identifies multiple biomarkers by simultaneously optimizing their predictive power, number of features, and proximity to the drug target in a protein-protein interaction network. To this end, we incorporated NSGA-II, a fast and elitist multi-objective optimization algorithm that is based on the principle of Pareto optimality, into the biomarker discovery workflow. The method was applied to quantitative phosphoproteome data of 19 non-small cell lung cancer (NSCLC) cell lines from a previous biomarker study. The algorithm successfully identified a total of 77 candidate biomarker signatures predicting response to treatment with dasatinib. Through filtering and similarity clustering, this set was trimmed to four final biomarker signatures, which then were validated on an independent set of breast cancer cell lines. All four candidates reached the same good prediction accuracy (83%) as the originally published biomarker. Although the newly discovered signatures were diverse in their composition and in their size, the central protein of the originally published signature, integrin β4 (ITGB4), was also present in all four Pareto signatures, confirming its pivotal role in predicting dasatinib response in NSCLC cell lines. In summary, the method presented here allows for a robust and simultaneous identification of multiple multivariate biomarkers that are optimized for prediction performance, size, and relevance.
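The backbone of NSGA-II is non-dominated sorting, which partitions a population into successive Pareto ranks. A compact quadratic sketch (not the fast bookkeeping variant NSGA-II actually uses) on made-up objective vectors, both objectives minimised:

```python
def non_dominated_sort(pop):
    # a dominates b: no worse in every objective, strictly better in one.
    dominates = lambda a, b: (all(x <= y for x, y in zip(a, b))
                              and any(x < y for x, y in zip(a, b)))
    remaining, fronts = list(pop), []
    while remaining:
        # Current front: members dominated by nobody still remaining.
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

pop = [(1, 5), (2, 2), (3, 4), (4, 1), (5, 5)]
print(non_dominated_sort(pop))  # [[(1, 5), (2, 2), (4, 1)], [(3, 4)], [(5, 5)]]
```

Rank-1 members are the candidate signatures of interest; the elitist selection in NSGA-II fills the next generation front by front, which is what drives the search toward the Pareto set.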
Finding the Pareto Optimal Equitable Allocation of Homogeneous Divisible Goods Among Three Players
Directory of Open Access Journals (Sweden)
Marco Dall'Aglio
2017-01-01
Full Text Available We consider the allocation of a finite number of homogeneous divisible items among three players. Under the assumption that each player assigns a positive value to every item, we develop a simple algorithm that returns a Pareto optimal and equitable allocation. This is based on the tight relationship between two geometric objects of fair division: the Individual Pieces Set (IPS) and the Radon-Nikodym Set (RNS). The algorithm can be considered as an extension of the Adjusted Winner procedure by Brams and Taylor to the three-player case, without the guarantee of envy-freeness. (original abstract)
DEFF Research Database (Denmark)
Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar
2013-01-01
Optimizing and controlling complex engineering systems is a challenge that has attracted increasing interest from numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques such as neural networks, fuzzy logic, game theory, support vector machines...... and stochastic algorithms have been proposed to facilitate control of engineering systems. In this study, an extended version of the mutable smart bee algorithm (MSBA) called Pareto based mutable smart bee (PBMSB) is proposed to cope with multi-objective problems. Besides, a set of benchmark problems and four...... well-known Pareto based optimizing algorithms, i.e. the multi-objective bee algorithm (MOBA), multi-objective particle swarm optimization (MOPSO) algorithm, non-dominated sorting genetic algorithm (NSGA-II), and strength Pareto evolutionary algorithm (SPEA 2), are utilized to confirm the acceptable...
Arkell, Karolina; Knutson, Hans-Kristian; Frederiksen, Søren S; Breil, Martin P; Nilsson, Bernt
2018-01-12
With the shift of focus of the regulatory bodies, from fixed process conditions towards flexible ones based on process understanding, model-based optimization is becoming an important tool for process development within the biopharmaceutical industry. In this paper, a multi-objective optimization study of separation of three insulin variants by reversed-phase chromatography (RPC) is presented. The decision variables were the load factor, the concentrations of ethanol and KCl in the eluent, and the cut points for the product pooling. In addition to the purity constraints, a solubility constraint on the total insulin concentration was applied. The insulin solubility is a function of the ethanol concentration in the mobile phase, and the main aim was to investigate the effect of this constraint on the maximal productivity. Multi-objective optimization was performed with and without the solubility constraint, and visualized as Pareto fronts, showing the optimal combinations of the two objectives productivity and yield for each case. Comparison of the constrained and unconstrained Pareto fronts showed that the former diverges when the constraint becomes active, because the increase in productivity with decreasing yield is almost halted. Consequently, we suggest the operating point at which the total outlet concentration of insulin reaches the solubility limit as the most suitable one. According to the results from the constrained optimizations, the maximal productivity on the C4 adsorbent (0.41 kg/(m³ column h)) is less than half of that on the C18 adsorbent (0.87 kg/(m³ column h)). This is partly caused by the higher selectivity between the insulin variants on the C18 adsorbent, but the main reason is the difference in how the solubility constraint affects the processes. Since the optimal ethanol concentration for elution on the C18 adsorbent is higher than for the C4 one, the insulin solubility is also higher, allowing a higher pool concentration
Nouiri, Issam
2017-11-01
This paper presents the development of multi-objective genetic algorithms to optimize chlorination design and management in drinking water networks (DWN). Three objectives were considered: improving chlorination uniformity (the health objective), and minimizing the number of chlorine booster stations and the injected chlorine mass (the economic objectives). The problem was decomposed into medium-term and short-term subproblems. The proposed methodology was tested on hypothetical and real DWNs. Results proved the ability of the developed optimization tool to identify the relationships between the health and economic objectives as Pareto fronts. The proposed approach was efficient in computing solutions that ensure better chlorination uniformity while requiring the lowest injected chlorine mass among the compared approaches. For the real DWN studied, chlorination optimization yielded a substantial improvement in free-chlorine-dosing uniformity and a meaningful reduction in chlorine mass compared with conventional chlorination.
Pareto-optimal electricity tariff rates in the Republic of Armenia
International Nuclear Information System (INIS)
Kaiser, M.J.
2000-01-01
The economic impact of electricity tariff rates on the residential sector of Yerevan, Armenia, is examined. The effect of tariff design on revenue generation and equity measures is considered, and the combination of energy pricing and compensatory social policies which provides the best mix of efficiency and protection for poor households is examined. An equity measure is defined in terms of a cumulative distribution function which describes the percent of the population that spends x percent or less of their income on electricity consumption. An optimal (Pareto-efficient) tariff is designed based on the analysis of survey data and an econometric model, and the Armenian tariff rate effective 1 January 1997 to 15 September 1997 is shown to be non-optimal relative to this rate. 22 refs
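The equity measure defined in the abstract above is an empirical distribution function over household spending shares. A minimal sketch follows; the household income and spending figures are hypothetical, purely to illustrate the computation:

```python
def equity_cdf(income, electricity_spend, x_percent):
    """Fraction of households spending x_percent or less of income on electricity.

    This is the empirical cumulative distribution function used as the
    equity measure: F(x) = P(100 * spend / income <= x).
    """
    shares = [100.0 * s / i for s, i in zip(electricity_spend, income)]
    return sum(1 for sh in shares if sh <= x_percent) / len(shares)

# Hypothetical survey data: monthly income and electricity spending per household.
income = [100, 200, 300, 400, 500]
spend = [10, 12, 12, 12, 10]   # shares: 10%, 6%, 4%, 3%, 2%

print(equity_cdf(income, spend, 5.0))  # fraction spending <= 5% of income
```

A tariff that shifts this curve upward (more households below a given spending share) is more equitable at the same revenue level.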
Mapping the Pareto optimal design space for a functionally deimmunized biotherapeutic candidate.
Salvat, Regina S; Parker, Andrew S; Choi, Yoonjoo; Bailey-Kellogg, Chris; Griswold, Karl E
2015-01-01
The immunogenicity of biotherapeutics can bottleneck development pipelines and poses a barrier to widespread clinical application. As a result, there is a growing need for improved deimmunization technologies. We have recently described algorithms that simultaneously optimize proteins for both reduced T cell epitope content and high-level function. In silico analysis of this dual objective design space reveals that there is no single global optimum with respect to protein deimmunization. Instead, mutagenic epitope deletion yields a spectrum of designs that exhibit tradeoffs between immunogenic potential and molecular function. The leading edge of this design space is the Pareto frontier, i.e. the undominated variants for which no other single design exhibits better performance in both criteria. Here, the Pareto frontier of a therapeutic enzyme has been designed, constructed, and evaluated experimentally. Various measures of protein performance were found to map a functional sequence space that correlated well with computational predictions. These results represent the first systematic and rigorous assessment of the functional penalty that must be paid for pursuing progressively more deimmunized biotherapeutic candidates. Given this capacity to rapidly assess and design for tradeoffs between protein immunogenicity and functionality, these algorithms may prove useful in augmenting, accelerating, and de-risking experimental deimmunization efforts.
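The Pareto frontier described above, the set of undominated variants, can be computed with a simple dominance filter. A minimal sketch with hypothetical (immunogenicity score, functional loss) pairs, both minimized:

```python
def pareto_frontier(points):
    """Return the undominated points, assuming both objectives are minimized.

    A point p dominates q if p is no worse in every objective and
    strictly better in at least one.
    """
    def dominates(p, q):
        return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (immunogenicity, functional loss) scores for candidate designs.
designs = [(3, 1), (1, 4), (2, 2), (3, 3), (4, 1)]
print(pareto_frontier(designs))  # (3, 3) and (4, 1) are dominated
```

The O(n²) pairwise check is fine for the small design sets typical of such studies; sorting-based filters scale better for large fronts.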
Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan
2014-10-01
The aim of this study is the optimization of a product-driven self-cycling bioprocess and presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics is designed and analysed. The optimization problem is given as the bi-objective one, where maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, the Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of emptying/refilling fraction and an increase of substrate feeding concentration cause an increase of the biomass productivity. An increase of emptying/refilling fraction and a decrease of substrate feeding concentration cause a decrease of unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, while giving proposals of their modifications derived from a decision maker's reactions to the generated solutions.
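The "minimum distance from an ideal solution" rule mentioned above can be sketched as follows. The front values are hypothetical and assumed already normalized to comparable scales, with both objectives cast as minimizations:

```python
import math

def closest_to_ideal(pareto_points):
    """Pick the Pareto point nearest (Euclidean) to the ideal point.

    The ideal point takes the best value of each objective over the front;
    it is generally infeasible, so the nearest feasible point is preferred.
    """
    ideal = tuple(min(p[i] for p in pareto_points) for i in range(len(pareto_points[0])))
    return min(pareto_points, key=lambda p: math.dist(p, ideal))

# Hypothetical normalized front: (1 - biomass productivity, substrate loss).
front = [(0.0, 1.0), (0.4, 0.4), (1.0, 0.0)]
print(closest_to_ideal(front))  # the knee point nearest the ideal (0, 0)
```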
McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George
2017-03-01
Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find the choice of heteroscedastic error modeling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
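The Box-Cox residual error scheme evaluated above models residuals as homoscedastic in transformed space. A minimal sketch of the transform and its inverse; the streamflow values are hypothetical:

```python
import math

def boxcox(y, lam):
    """Box-Cox transform z = (y**lam - 1) / lam; lam = 0 is the log scheme (limiting case)."""
    if lam == 0.0:
        return math.log(y)
    return (y ** lam - 1.0) / lam

def boxcox_inverse(z, lam):
    """Back-transform a value from Box-Cox space to streamflow space."""
    if lam == 0.0:
        return math.exp(z)
    return (lam * z + 1.0) ** (1.0 / lam)

# Residuals are computed in transformed space: eta = z_obs - z_sim.
q_obs, q_sim, lam = 12.0, 9.5, 0.2
eta = boxcox(q_obs, lam) - boxcox(q_sim, lam)
print(round(eta, 4))
```

Because the transform compresses high flows more than low flows, a constant error variance in z-space corresponds to larger errors at higher streamflow, which is exactly the heteroscedastic pattern the schemes aim to capture.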
Pareto-Optimization of HTS CICC for High-Current Applications in Self-Field
Directory of Open Access Journals (Sweden)
Giordano Tomassetti
2018-01-01
Full Text Available The ENEA superconductivity laboratory developed a novel design for Cable-in-Conduit Conductors (CICCs) comprised of stacks of 2nd-generation REBCO coated conductors. In its original version, the cable was made up of 150 HTS tapes distributed in five slots, twisted along an aluminum core. In this work, taking advantage of a 2D finite element model able to estimate the cable's current distribution in the cross-section, a multiobjective optimization procedure was implemented. The aim of the optimization was to simultaneously maximize both the engineering current density and the total current flowing inside the tapes when operating in self-field, by varying the cross-section layout. Since the optimization process involved both integer and real geometrical variables in a multiple-objective setting, a nonstandard, fast-converging evolutionary search algorithm was the natural choice for approaching the problem numerically. By means of this algorithm, the Pareto frontiers for the different configurations were calculated, providing a powerful tool for the designer to achieve the desired preliminary operating conditions in terms of engineering current density and/or total current, depending on the specific application field, that is, power transmission cables and bus bar systems.
Hurford, Anthony; Harou, Julien
2014-05-01
Water related eco-system services are important to the livelihoods of the poorest sectors of society in developing countries. Degradation or loss of these services can increase the vulnerability of people decreasing their capacity to support themselves. New approaches to help guide water resources management decisions are needed which account for the non-market value of ecosystem goods and services. In case studies from Brazil and Kenya we demonstrate the capability of many objective Pareto-optimal trade-off analysis to help decision makers balance economic and non-market benefits from the management of existing multi-reservoir systems. A multi-criteria search algorithm is coupled to a water resources management simulator of each basin to generate a set of Pareto-approximate trade-offs representing the best case management decisions. In both cases, volume dependent reservoir release rules are the management decisions being optimised. In the Kenyan case we further assess the impacts of proposed irrigation investments, and how the possibility of new investments impacts the system's trade-offs. During the multi-criteria search (optimisation), performance of different sets of management decisions (policies) is assessed against case-specific objective functions representing provision of water supply and irrigation, hydropower generation and maintenance of ecosystem services. Results are visualised as trade-off surfaces to help decision makers understand the impacts of different policies on a broad range of stakeholders and to assist in decision-making. These case studies show how the approach can reveal unexpected opportunities for win-win solutions, and quantify the trade-offs between investing to increase agricultural revenue and negative impacts on protected ecosystems which support rural livelihoods.
International Nuclear Information System (INIS)
Shojaeefard, Mohammad Hasan; Behnagh, Reza Abdi; Akbari, Mostafa; Givi, Mohammad Kazem Besharati; Farhani, Foad
2013-01-01
Highlights: ► Defect-free friction stir welds have been produced for AA5083-O/AA7075-O. ► Back-propagation was sufficient for predicting hardness and tensile strength. ► A hybrid multi-objective algorithm is proposed to deal with this MOP. ► Multi-objective particle swarm optimization was used to find the Pareto solutions. ► TOPSIS is used to rank the given alternatives of the Pareto solutions. -- Abstract: Friction Stir Welding (FSW) has been successfully used to weld similar and dissimilar cast and wrought aluminium alloys, especially aircraft aluminium alloys that generally exhibit low weldability in traditional fusion welding processes. This paper focuses on the microstructural and mechanical properties of the FSW of AA7075-O to AA5083-O aluminium alloys. Weld microstructures, hardness and tensile properties were evaluated in the as-welded condition. Tensile tests indicated that the mechanical properties of the joint were better than those of the base metals. An Artificial Neural Network (ANN) model was developed to simulate the correlation between the Friction Stir Welding parameters and the mechanical properties. Performance of the ANN model was excellent, and the model was employed to predict the ultimate tensile strength and hardness of the AA7075–AA5083 butt joint as functions of weld and rotational speeds. Multi-objective particle swarm optimization was used to obtain the Pareto-optimal set. Finally, the Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS) was applied to determine the best compromise solution.
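The TOPSIS ranking step used above scores each Pareto solution by its closeness to an ideal solution versus an anti-ideal one. A minimal sketch; the (tensile strength, hardness) values and the equal weights are hypothetical:

```python
import math

def topsis(alternatives, weights, benefit):
    """Rank alternatives by TOPSIS closeness to the ideal solution.

    alternatives: rows of criterion values; weights should sum to 1;
    benefit[j] is True when criterion j is to be maximized.
    """
    n = len(alternatives[0])
    # Vector-normalize each column, then apply the weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in alternatives)) for j in range(n)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n)] for row in alternatives]
    best = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best, d_worst = math.dist(row, best), math.dist(row, worst)
        scores.append(d_worst / (d_best + d_worst))  # closer to ideal -> higher score
    return scores

# Hypothetical Pareto set: (tensile strength in MPa, hardness in HV), both maximized.
pareto = [[310.0, 85.0], [325.0, 80.0], [300.0, 90.0]]
scores = topsis(pareto, weights=[0.5, 0.5], benefit=[True, True])
print(max(range(len(scores)), key=scores.__getitem__))  # index of the best compromise
```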
Energy Technology Data Exchange (ETDEWEB)
Kirlik, G; Zhang, H [University of Maryland School of Medicine, Baltimore, MD (United States)
2015-06-15
Purpose: To present a novel multi-criteria optimization (MCO) solution approach that generates a well-dispersed representation of the Pareto front for radiation treatment planning. Methods: Different algorithms have been proposed and implemented in commercial planning software to generate MCO plans for external-beam radiation therapy. These algorithms consider convex optimization problems. We propose a grid-based algorithm to generate well-dispersed treatment plans over the Pareto front. Our method is able to handle nonconvexity in the problem arising from dose-volume objectives/constraints and from biological objectives such as equivalent uniform dose (EUD), tumor control probability (TCP), and normal tissue complication probability (NTCP). In addition, our algorithm can provide a single MCO plan when clinicians are targeting narrow bounds of objectives for patients. In this situation, usually none of the generated plans is within the bounds, and a solution is difficult to identify via manual navigation. We use the subproblem formulation utilized in the grid-based algorithm to obtain a plan within the specified bounds. The subproblem aims to generate a solution that maps into the rectangle defined by the bounds. If such a solution does not exist, it generates the solution closest to the rectangle. We tested our method with 10 locally advanced head and neck cancer cases. Results: Eight objectives were used, including 3 different objectives for the primary, high-risk, and low-risk target volumes, and 5 objectives, one for each of the organs-at-risk (OARs) (two parotids, spinal cord, brain stem and oral cavity). Given tight bounds, uniform dose was achieved for all targets while as much as 26% improvement was achieved in OAR sparing compared with clinical plans without MCO and a previously proposed MCO method. Conclusion: Our method is able to obtain well-dispersed treatment plans to attain better approximation for convex and nonconvex Pareto fronts. Single treatment plan can
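The "closest to the rectangle" selection in the subproblem above can be sketched as a distance-to-box computation over objective vectors. The plan objectives and bounds below are hypothetical:

```python
import math

def distance_to_box(point, lower, upper):
    """Euclidean distance from an objective vector to the bounds rectangle.

    Zero means the plan meets all clinician-specified objective bounds;
    otherwise the value measures how far outside the bounds it falls.
    """
    d = [max(lo - p, 0.0, p - up) for p, lo, up in zip(point, lower, upper)]
    return math.hypot(*d)

def pick_plan(plans, lower, upper):
    """Return a plan inside the bounds if one exists, else the one closest to them."""
    return min(plans, key=lambda p: distance_to_box(p, lower, upper))

# Hypothetical (target dose deviation, OAR mean dose in Gy) objective vectors.
plans = [(1.2, 30.0), (0.8, 36.0), (2.5, 22.0)]
print(pick_plan(plans, lower=(0.0, 0.0), upper=(1.0, 35.0)))
```

The real subproblem optimizes over plan variables rather than a finite plan list, but the same box-distance criterion defines what "closest to the rectangle" means.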
PAPR-Constrained Pareto-Optimal Waveform Design for OFDM-STAP Radar
Energy Technology Data Exchange (ETDEWEB)
Sen, Satyabrata [ORNL
2014-01-01
We propose a peak-to-average power ratio (PAPR) constrained Pareto-optimal waveform design approach for an orthogonal frequency division multiplexing (OFDM) radar signal to detect a target using the space-time adaptive processing (STAP) technique. The use of an OFDM signal not only increases the frequency diversity of our system, but also enables us to adaptively design the OFDM coefficients in order to further improve the system performance. First, we develop a parametric OFDM-STAP measurement model by considering the effects of signal-dependent clutter and colored noise. Then, we observe that the resulting STAP performance can be improved by maximizing the output signal-to-interference-plus-noise ratio (SINR) with respect to the signal parameters. However, in practical scenarios, the computation of the output SINR depends on the estimated values of the spatial and temporal frequencies and target scattering responses. Therefore, we formulate a PAPR-constrained multi-objective optimization (MOO) problem to design the OFDM spectral parameters by simultaneously optimizing four objective functions: maximizing the output SINR, minimizing two separate Cramer-Rao bounds (CRBs) on the normalized spatial and temporal frequencies, and minimizing the trace of the CRB matrix on the estimates of the target scattering coefficients. We present several numerical examples to demonstrate the achieved performance improvement due to the adaptive waveform design.
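The PAPR constraint above is defined on the time-domain OFDM signal. A minimal, dependency-free sketch of its computation follows, using a naive oversampled IDFT and hypothetical subcarrier symbols:

```python
import cmath
import math

def papr_db(symbols, oversample=4):
    """PAPR (in dB) of the time-domain OFDM signal for given subcarrier symbols.

    The IDFT is written out directly (a naive O(N^2) transform) to keep the
    sketch dependency-free; oversampling approximates the analog peak.
    """
    n = len(symbols)
    m = n * oversample
    power = []
    for t in range(m):
        x = sum(s * cmath.exp(2j * math.pi * k * t / m)
                for k, s in enumerate(symbols)) / n
        power.append(abs(x) ** 2)
    peak, avg = max(power), sum(power) / m
    return 10.0 * math.log10(peak / avg)

# Identical symbols on all subcarriers add coherently at t = 0: the worst-case PAPR,
# 10*log10(N) dB for N subcarriers.
flat = [1 + 0j] * 8
print(round(papr_db(flat), 2))
```

Constraining this quantity keeps the designed OFDM coefficients compatible with power amplifiers, which is why it appears as a constraint in the MOO formulation.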
Pareto-Optimal Evaluation of Ultimate Limit States in Offshore Wind Turbine Structural Analysis
Directory of Open Access Journals (Sweden)
Michael Muskulus
2015-12-01
Full Text Available The ultimate capacity of support structures is checked with extreme loads. This is straightforward when the limit state equations depend on a single load component, and it has become common to report maxima for each load component. However, if more than one load component is influential, e.g., both axial force and bending moments, it is not straightforward how to define an extreme load. The combination of univariate maxima can be too conservative, and many different combinations of load components can result in the worst value of the limit state equations. The use of contemporaneous load vectors is typically non-conservative. Therefore, in practice, limit state checks are done for each possible load vector, from each time step of a simulation. This is not feasible when performing reliability assessments and structural optimization, where additional, time-consuming computations are involved for each load vector. We therefore propose to use Pareto-optimal loads, which are a small set of loads that together represent all possible worst case scenarios. Simulations with two reference wind turbines show that this approach can be very useful for jacket structures, whereas the design of monopiles is often governed by the bending moment only. Even in this case, the approach might be useful when approaching the structural limits during optimization.
Using Pareto optimality to explore the topology and dynamics of the human connectome.
Avena-Koenigsberger, Andrea; Goñi, Joaquín; Betzel, Richard F; van den Heuvel, Martijn P; Griffa, Alessandra; Hagmann, Patric; Thiran, Jean-Philippe; Sporns, Olaf
2014-10-05
Graph theory has provided a key mathematical framework to analyse the architecture of human brain networks. This architecture embodies an inherently complex relationship between connection topology, the spatial arrangement of network elements, and the resulting network cost and functional performance. An exploration of these interacting factors and driving forces may reveal salient network features that are critically important for shaping and constraining the brain's topological organization and its evolvability. Several studies have pointed to an economic balance between network cost and network efficiency with networks organized in an 'economical' small-world favouring high communication efficiency at a low wiring cost. In this study, we define and explore a network morphospace in order to characterize different aspects of communication efficiency in human brain networks. Using a multi-objective evolutionary approach that approximates a Pareto-optimal set within the morphospace, we investigate the capacity of anatomical brain networks to evolve towards topologies that exhibit optimal information processing features while preserving network cost. This approach allows us to investigate network topologies that emerge under specific selection pressures, thus providing some insight into the selectional forces that may have shaped the network architecture of existing human brains.
SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model
International Nuclear Information System (INIS)
Zhou, Z; Folkert, M; Wang, J
2016-01-01
Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. The multi-objective optimization yields a Pareto set containing many feasible solutions. In this work, an optimal solution Selection methodology for the Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. SMOLER uses the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules, and the solution with the highest utility is chosen as the optimal one. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for the multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
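The utility-based selection in SMOLER can be illustrated with a much-simplified stand-in for the evidential reasoning combination: here utility is just a weighted sum over a hypothetical (sensitivity, specificity) front, whereas the actual method combines rule-based belief degrees:

```python
def select_optimal(pareto_set, weights=(0.5, 0.5)):
    """Pick the (sensitivity, specificity) pair with the highest utility.

    A simplified stand-in for the evidential-reasoning combination:
    utility here is a plain weighted sum of the two objectives.
    """
    def utility(sol):
        return sum(w * v for w, v in zip(weights, sol))

    return max(pareto_set, key=utility)

# Hypothetical Pareto set of (sensitivity, specificity) trade-offs.
front = [(0.95, 0.60), (0.85, 0.80), (0.70, 0.90)]
print(select_optimal(front))  # the balanced solution wins under equal weights
```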
SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model
Energy Technology Data Exchange (ETDEWEB)
Zhou, Z; Folkert, M; Wang, J [UT Southwestern Medical Center, Dallas, TX (United States)
2016-06-15
Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. The multi-objective optimization yields a Pareto set containing many feasible solutions. In this work, an optimal solution Selection methodology for the Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. SMOLER uses the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules, and the solution with the highest utility is chosen as the optimal one. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for the multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.
International Nuclear Information System (INIS)
Amanifard, N.; Nariman-Zadeh, N.; Borji, M.; Khalkhali, A.; Habibdoust, A.
2008-01-01
Three-dimensional heat transfer characteristics and pressure drop of water flow in a set of rectangular microchannels are numerically investigated using Fluent and compared with experimental results. Two metamodels based on evolved group method of data handling (GMDH) type neural networks are then obtained for modelling both the pressure drop (ΔP) and the Nusselt number (Nu) with respect to design variables such as the geometrical parameters of the microchannels, the amount of heat flux, and the Reynolds number. Using the obtained polynomial neural networks, a multi-objective genetic algorithm (the non-dominated sorting genetic algorithm, NSGA-II) with a new diversity-preserving mechanism is then used for Pareto-based optimization of the microchannels, considering the two conflicting objectives ΔP and Nu. It is shown that some interesting and important relationships, serving as useful optimal design principles governing the performance of microchannels, can be discovered by Pareto-based multi-objective optimization of the obtained polynomial metamodels representing their heat transfer and flow characteristics. Such optimal principles would not have been obtained without the combined use of GMDH-type neural network modelling and the Pareto optimization approach.
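NSGA-II's standard diversity-preserving mechanism is the crowding distance (the record above uses a new mechanism; the standard one is shown here for illustration). It rewards solutions lying in sparse regions of the front. The (ΔP, Nu-derived) front values below are purely illustrative:

```python
def crowding_distance(front):
    """NSGA-II crowding distance over a front of objective vectors.

    Boundary solutions get infinite distance; interior solutions accumulate
    the normalized span between their neighbours in each objective.
    """
    n = len(front)
    dist = [0.0] * n
    for m in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float('inf')
        span = front[order[-1]][m] - front[order[0]][m]
        if span == 0:
            continue
        for a, i in enumerate(order[1:-1], start=1):
            dist[i] += (front[order[a + 1]][m] - front[order[a - 1]][m]) / span
    return dist

# Hypothetical front: (pressure drop, negated Nusselt number), both minimized.
front = [(1.0, 5.0), (2.0, 3.0), (4.0, 2.0), (8.0, 1.0)]
print(crowding_distance(front))  # boundary points are infinite, interior finite
```

Selection that breaks ties by larger crowding distance spreads the population evenly along the Pareto front instead of clustering in one region.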
TreePOD: Sensitivity-Aware Selection of Pareto-Optimal Decision Trees.
Muhlbacher, Thomas; Linhardt, Lorenz; Moller, Torsten; Piringer, Harald
2018-01-01
Balancing accuracy gains with other objectives such as interpretability is a key challenge when building decision trees. However, this process is difficult to automate because it involves know-how about the domain as well as the purpose of the model. This paper presents TreePOD, a new approach for sensitivity-aware model selection along trade-offs. TreePOD is based on exploring a large set of candidate trees generated by sampling the parameters of tree construction algorithms. Based on this set, visualizations of quantitative and qualitative tree aspects provide a comprehensive overview of possible tree characteristics. Along trade-offs between two objectives, TreePOD provides efficient selection guidance by focusing on Pareto-optimal tree candidates. TreePOD also conveys the sensitivities of tree characteristics to variations of selected parameters by extending the tree generation process with full-factorial sampling. We demonstrate how TreePOD supports a variety of tasks involved in decision tree selection and describe its integration in a holistic workflow for building and selecting decision trees. For evaluation, we illustrate a case study for predicting critical power grid states, and we report qualitative feedback from domain experts in the energy sector. This feedback suggests that TreePOD enables users both with and without a statistical background to identify suitable decision trees confidently and efficiently.
Directory of Open Access Journals (Sweden)
Anat Lerner
2014-04-01
Full Text Available We characterize the efficiency space of deterministic, dominant-strategy incentive compatible, individually rational and Pareto-optimal combinatorial auctions in a model with two players and k nonidentical items. We examine a model with multidimensional types, private values and quasilinear preferences for the players, with one relaxation: one of the players is subject to a publicly known budget constraint. We show that if it is publicly known that the valuation for the largest bundle is less than the budget for at least one of the players, then the Vickrey-Clarke-Groves (VCG) mechanism uniquely fulfills the basic properties of being deterministic, dominant-strategy incentive compatible, individually rational and Pareto optimal. Our characterization of the efficiency space for deterministic budget-constrained combinatorial auctions is similar in spirit to that of Maskin (2000) for Bayesian single-item constrained-efficiency auctions and comparable with Ausubel and Milgrom (2002) for non-constrained combinatorial auctions.
Penrod, Nadia M; Greene, Casey S; Moore, Jason H
2014-01-01
Molecularly targeted drugs promise a safer and more effective treatment modality than conventional chemotherapy for cancer patients. However, tumors are dynamic systems that readily adapt to these agents activating alternative survival pathways as they evolve resistant phenotypes. Combination therapies can overcome resistance but finding the optimal combinations efficiently presents a formidable challenge. Here we introduce a new paradigm for the design of combination therapy treatment strategies that exploits the tumor adaptive process to identify context-dependent essential genes as druggable targets. We have developed a framework to mine high-throughput transcriptomic data, based on differential coexpression and Pareto optimization, to investigate drug-induced tumor adaptation. We use this approach to identify tumor-essential genes as druggable candidates. We apply our method to a set of ER(+) breast tumor samples, collected before (n = 58) and after (n = 60) neoadjuvant treatment with the aromatase inhibitor letrozole, to prioritize genes as targets for combination therapy with letrozole treatment. We validate letrozole-induced tumor adaptation through coexpression and pathway analyses in an independent data set (n = 18). We find pervasive differential coexpression between the untreated and letrozole-treated tumor samples as evidence of letrozole-induced tumor adaptation. Based on patterns of coexpression, we identify ten genes as potential candidates for combination therapy with letrozole including EPCAM, a letrozole-induced essential gene and a target to which drugs have already been developed as cancer therapeutics. Through replication, we validate six letrozole-induced coexpression relationships and confirm the epithelial-to-mesenchymal transition as a process that is upregulated in the residual tumor samples following letrozole treatment. To derive the greatest benefit from molecularly targeted drugs it is critical to design combination
David, McInerney; Mark, Thyer; Dmitri, Kavetski; George, Kuczera
2017-04-01
This study provides guidance that enables hydrological researchers to deliver probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach.
Huang, Adam; Li, Jiang; Summers, Ronald M; Petrick, Nicholas; Hara, Amy K
2010-03-21
We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was significantly lower for the two-step algorithm, and the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms.
International Nuclear Information System (INIS)
Taboada, Heidi A.; Baheranwala, Fatema; Coit, David W.; Wattanapongsakorn, Naruemon
2007-01-01
For multiple-objective optimization problems, a common solution methodology is to determine a Pareto optimal set. Unfortunately, these sets are often large and can become difficult to comprehend and consider. Two methods are presented as practical approaches to reduce the size of the Pareto optimal set for multiple-objective system reliability design problems. The first method is a pseudo-ranking scheme that helps the decision maker select solutions that reflect his/her objective function priorities. The second approach uses data mining clustering techniques, specifically the k-means algorithm, to group the Pareto optimal set into clusters of similar solutions. This provides the decision maker with just k general solutions to choose from. With this second method, we attempted to find, within the clustered Pareto optimal set, solutions that are likely to be more relevant to the decision maker: solutions where a small improvement in one objective would lead to a large deterioration in at least one other objective. To demonstrate how these methods work, the well-known redundancy allocation problem was solved as a multiple-objective problem by using the NSGA genetic algorithm to initially find the Pareto optimal solutions; the two proposed methods were then applied to prune the Pareto set.
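The second pruning idea (k-means over the Pareto set) can be sketched as follows. The two-objective front, the choice k = 3, and the minimal textbook k-means loop below are all illustrative assumptions, not the authors' implementation:

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two objective vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Component-wise mean of a non-empty list of tuples."""
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(len(pts[0])))

def kmeans_representatives(points, k, iters=50, seed=0):
    """Plain k-means; returns, for each centroid, the nearest actual
    Pareto solution (a cluster's representative). Empty clusters keep
    their previous centroid."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[j].append(p)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [min(points, key=lambda p: dist2(p, c)) for c in centroids]

# Hypothetical Pareto front for (cost, unreliability), pruned to k = 3
front = [(c, 1.0 / (1.0 + c)) for c in [i * 0.5 for i in range(20)]]
representatives = kmeans_representatives(front, 3)
print(representatives)
```

The decision maker is then shown only the k representatives instead of the full front, which is the size reduction the abstract describes.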
Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.
2013-01-01
In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function and identify Pareto utility as a function satisfying all four desired properties.
2011-01-01
On how the Italian economist Vilfredo Pareto arrived at his famous principle, and on the influence of this principle on present-day management. According to the Pareto principle, the greater part of our activity does not bring us closer to results but is a waste of time. Includes a diagram.
Active learning of Pareto fronts.
Campigotto, Paolo; Passerini, Andrea; Battiti, Roberto
2014-03-01
This paper introduces the active learning of Pareto fronts (ALP) algorithm, a novel approach to recover the Pareto front of a multiobjective optimization problem. ALP casts the identification of the Pareto front into a supervised machine learning task. This approach enables an analytical model of the Pareto front to be built. The computational effort in generating the supervised information is reduced by an active learning strategy. In particular, the model is learned from a set of informative training objective vectors. The training objective vectors are approximated Pareto-optimal vectors obtained by solving different scalarized problem instances. The experimental results show that ALP achieves an accurate Pareto front approximation with a lower computational effort than state-of-the-art estimation of distribution algorithms and widely known genetic techniques.
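The scalarization step that generates ALP's training vectors can be illustrated on a toy biobjective problem. The convex objectives and the weighted-sum scalarization below are assumptions chosen for illustration; the paper does not prescribe this particular problem or scalarization:

```python
def pareto_training_vectors(weights, xs):
    """Solve one scalarized (single-objective) problem per weight; each
    minimiser maps to an (approximately) Pareto-optimal objective vector,
    i.e. one supervised training point for a front model."""
    f1 = lambda x: x ** 2           # illustrative objective 1
    f2 = lambda x: (x - 2.0) ** 2   # illustrative objective 2
    vectors = []
    for w in weights:
        # grid search stands in for a real single-objective solver
        x_best = min(xs, key=lambda x: w * f1(x) + (1 - w) * f2(x))
        vectors.append((f1(x_best), f2(x_best)))
    return vectors

grid = [i / 100.0 for i in range(0, 201)]   # x in [0, 2]
training = pareto_training_vectors([0.1, 0.3, 0.5, 0.7, 0.9], grid)
print(training)   # five approximated Pareto-optimal objective vectors
```

Each weight yields one point on the front (for w = 0.5 the minimiser is x = 1, giving the vector (1.0, 1.0)); a regression model fitted to such vectors is the analytical front approximation ALP builds, with active learning choosing which weights to solve next.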
Energy Technology Data Exchange (ETDEWEB)
2016-12-21
The JMP Add-In TopN-PFS provides an automated tool for finding layered Pareto fronts to identify the top N solutions from an enumerated list of candidates, subject to optimizing multiple criteria. The approach constructs N layers of Pareto fronts and then provides a suite of graphical tools to explore the alternatives based on different prioritizations of the criteria. The tool is designed to provide a set of alternatives from which the decision-maker can select the best option for their study goals.
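A layered Pareto front construction of the kind described above can be sketched as follows, assuming both criteria are minimised; the candidate list is hypothetical:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every (minimised) criterion
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_layers(points):
    """Peel off successive non-dominated fronts: layer 1 is the Pareto
    front, layer 2 is the front of what remains, and so on."""
    remaining = list(points)
    layers = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        layers.append(front)
        remaining = [p for p in remaining if p not in front]
    return layers

# Hypothetical candidates scored on two criteria, both to be minimised
candidates = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5), (2, 2)]
layers = pareto_layers(candidates)
print(layers)
```

Collecting the first few layers gives a top-N shortlist even when the first front alone contains fewer than N candidates, which is the point of layering.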
Cilla, Savino; Ianiro, Anna; Deodato, Francesco; Macchia, Gabriella; Digesù, Cinzia; Valentini, Vincenzo; Morganti, Alessio G
2017-11-27
We explored the Pareto fronts mathematical strategy to determine the optimal block margin and prescription isodose for stereotactic body radiotherapy (SBRT) treatments of liver metastases using the volumetric-modulated arc therapy (VMAT) technique. Three targets (planning target volumes [PTVs] = 20, 55, and 101 cc) were selected. A single fraction dose of 26 Gy was prescribed (prescription dose [PD]). VMAT plans were generated for 3 different beam energies. Pareto fronts based on (1) different multileaf collimator (MLC) block margins around the PTV and (2) different prescription isodose lines (IDL) were produced. For each block margin, the greatest IDL fulfilling the criteria (95% of PTV reached 100%) was considered as providing the optimal clinical plan for PTV coverage. Liver Dmean, V7Gy, and V12Gy were used against the PTV coverage to generate the fronts. Gradient indexes (GI and mGI), homogeneity index (HI), and healthy liver irradiation in terms of Dmean, V7Gy, and V12Gy were calculated to compare different plans. In addition, each target was also optimized with a full-inverse planning engine to obtain a direct comparison with anatomy-based treatment planning system (TPS) results. About 900 plans were calculated to generate the fronts. GI and mGI show a U-shaped behavior as a function of beam margin, with minimal values obtained with a +1 mm MLC margin. For these plans, the IDL ranges from 74% to 86%. GI and mGI also show a V-shaped behavior with respect to the HI index, with minimum values at 1 mm for all metrics, independent of tumor dimensions and beam energy. Fully inverse-optimized plans gave worse results than the Pareto plans. In conclusion, Pareto fronts provide a rigorous strategy to choose clinical optimal plans in SBRT treatments. We show that a 1-mm MLC block margin provides the best results with regard to healthy liver tissue irradiation and steepness of dose falloff. Copyright © 2017 American Association of Medical Dosimetrists
A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.
To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine
2015-11-14
Given a gene and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values or because of the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of the reliability for predicted events. It has been recently proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate the costs slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. However, often, the number of regions is not large enough to provide reliable supports. We present here a method to compute event supports efficiently via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the extra cost of a reconciliation. Our methods are faster than the region-based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on
Champion, H; Fiege, J; McCurdy, B; Potrebko, P; Cull, A
2012-07-01
PARETO (Pareto-Aware Radiotherapy Evolutionary Treatment Optimization) is a novel multiobjective treatment planning system that performs beam orientation and fluence optimization simultaneously using an advanced evolutionary algorithm. In order to reduce the number of parameters involved in this enormous search space, we present several methods for modeling the beam fluence. The parameterizations are compared using innovative tools that evaluate fluence complexity, solution quality, and run efficiency. A PARETO run is performed using the basic weight (BW), linear gradient (LG), cosine transform (CT), beam group (BG), and isodose-projection (IP) methods for applying fluence modulation over the projection of the Planning Target Volume in the beam's-eye-view plane. The solutions of each run are non-dominated with respect to other trial solutions encountered during the run. However, to compare the solution quality of independent runs, each run competes against every other run in a round robin fashion. Score is assigned based on the fraction of solutions that survive when a tournament selection operator is applied to the solutions of the two competitors. To compare fluence complexity, a modulation index, fractal dimension, and image gradient entropy are calculated for the fluence maps of each optimal plan. We have found that the LG method results in superior solution quality for a spine phantom, lung patient, and cauda equina patient. The BG method produces solutions with the highest degree of fluence complexity. Most methods result in comparable run times. The LG method produces superior solution quality using a moderate degree of fluence modulation. © 2012 American Association of Physicists in Medicine.
Directory of Open Access Journals (Sweden)
Rica Gonen
2013-11-01
Full Text Available We analyze the space of deterministic, dominant-strategy incentive compatible, individually rational and Pareto optimal combinatorial auctions. We examine a model with multidimensional types, nonidentical items, private values and quasilinear preferences for the players with one relaxation; the players are subject to publicly-known budget constraints. We show that the space includes dictatorial mechanisms and that if dictatorial mechanisms are ruled out by a natural anonymity property, then an impossibility of design is revealed. The same impossibility naturally extends to other abstract mechanisms with an arbitrary outcome set if one maintains the original assumptions of players with quasilinear utilities, public budgets and nonnegative prices.
Directory of Open Access Journals (Sweden)
Sergey E. Bukhtoyarov
2005-05-01
Full Text Available A multicriterion linear combinatorial problem with a parametric principle of optimality is considered. This principle is defined by a partitioning of the partial criteria into groups, with the Pareto preference relation within each group and the lexicographic preference relation between the groups. Quasistability of the problem is investigated. This type of stability is a discrete analog of lower Hausdorff semicontinuity of the multiple-valued mapping that defines the choice function. A formula for the quasistability radius is derived for the case of the metric l∞. Some known results are stated as corollaries. Mathematics Subject Classification 2000: 90C05, 90C10, 90C29, 90C31.
Liu, Xian
2010-02-10
This paper shows that optical signal transmission over intersatellite links with swaying transmitters can be described as an equivalent fading model. In this model, the instantaneous signal-to-noise ratio is stochastic and follows the reciprocal Pareto distribution. With this model, we show that the transmitter power can be minimized, subject to a specified outage probability, by appropriately adjusting some system parameters, such as the transmitter gain.
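Under the stated fading model, the outage probability has a simple closed form. The sketch below assumes SNR = 1/X with X Pareto-distributed (shape alpha, scale x_m), which is one plausible reading of "reciprocal Pareto" (the paper's exact parameterisation may differ), and checks the formula by Monte Carlo:

```python
import random

def outage_closed_form(gamma_th, alpha, x_m):
    """P(SNR <= gamma_th) when SNR = 1/X, X ~ Pareto(alpha, x_m):
    P(SNR <= g) = P(X >= 1/g) = (x_m * g)**alpha for g <= 1/x_m."""
    return min(1.0, (x_m * gamma_th) ** alpha)

def outage_monte_carlo(gamma_th, alpha, x_m, n=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        u = 1.0 - rng.random()               # uniform on (0, 1]
        x = x_m * u ** (-1.0 / alpha)        # inverse-CDF Pareto sample
        if 1.0 / x <= gamma_th:              # instantaneous SNR below threshold
            hits += 1
    return hits / n

p_exact = outage_closed_form(0.5, 2.0, 1.0)   # (1.0 * 0.5)**2 = 0.25
p_sim = outage_monte_carlo(0.5, 2.0, 1.0)
print(p_exact, p_sim)
```

Because outage grows as the alpha-th power of the threshold, increasing the transmitter gain (which scales the SNR and hence shifts gamma_th down) buys outage margin quickly, which is the lever the power-minimisation result exploits.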
Directory of Open Access Journals (Sweden)
Arnaut Dierck
2015-01-01
Full Text Available Designing textile antennas for real-life applications requires a design strategy that is able to produce antennas that are optimized over a wide bandwidth for often conflicting characteristics, such as impedance matching, axial ratio, efficiency, and gain, and, moreover, that is able to account for the variations that apply for the characteristics of the unconventional materials used in smart textile systems. In this paper, such a strategy, incorporating a multiobjective constrained Pareto optimization, is presented and applied to the design of a Galileo E6-band antenna with optimal return loss and wide-band axial ratio characteristics. Subsequently, different prototypes of the optimized antenna are fabricated and measured to validate the proposed design strategy.
Lechner, Wolfgang; Kragl, Gabriele; Georg, Dietmar
2013-12-01
To investigate the differences in treatment plan quality of IMRT and VMAT with and without flattening filter, using Pareto optimal fronts, for two treatment sites of different anatomic complexity. Pareto optimal fronts (POFs) were generated for six prostate and head-and-neck cancer patients by stepwise reduction of the constraint on the primary organ-at-risk (OAR) during the optimization process. Nine-field static IMRT and 360° single-arc VMAT plans with flattening filter (FF) and without flattening filter (FFF) were compared. The volume receiving 5 Gy or more (V5Gy) was used to estimate the low-dose exposure. Furthermore, the number of monitor units (MUs) and measurements of the delivery time (T) were used to assess the efficiency of the treatment plans. A significant increase in MUs was found when using FFF-beams, while the treatment plan quality was at least equivalent to that of the FF-beams. T was decreased by 18% for the prostate cases with IMRT using FFF-beams and by 4% for the head-and-neck cases, but increased by 22% and 16%, respectively, for VMAT. A reduction of up to 5% of V5Gy was found for IMRT prostate cases with FFF-beams. The evaluation of the POFs showed an at least comparable treatment plan quality of FFF-beams compared to FF-beams for both treatment sites and modalities. For smaller targets the advantageous characteristics of FFF-beams could be better exploited. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
International Nuclear Information System (INIS)
Ottosson, Rickard O.; Sjoestroem, David; Behrens, Claus F.; Karlsson, Anna; Engstroem, Per E.; Knoeoes, Tommy; Ceberg, Crister
2009-01-01
Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head and neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered
Ottosson, Rickard O; Engstrom, Per E; Sjöström, David; Behrens, Claus F; Karlsson, Anna; Knöös, Tommy; Ceberg, Crister
2009-01-01
Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.
Pareto front estimation for decision making.
Giagkiozis, Ioannis; Fleming, Peter J
2014-01-01
The set of available multi-objective optimisation algorithms continues to grow. This fact can be partially attributed to their widespread use and applicability. However, this increase also suggests several issues remain to be addressed satisfactorily. One such issue is the diversity and the number of solutions available to the decision maker (DM). Even for algorithms very well suited to a particular problem, it is difficult, mainly due to the computational cost, to use a population large enough to ensure the likelihood of obtaining a solution close to the DM's preferences. In this paper we present a novel methodology that produces additional Pareto optimal solutions from a Pareto optimal set obtained at the end of a run of any multi-objective optimisation algorithm, for two-objective and three-objective problem instances.
International Nuclear Information System (INIS)
Park, Jungsoo; Song, Soonho; Lee, Kyo Seung
2015-01-01
Highlights: • Model-based control of dual-loop EGR system is performed. • EGR split index is developed to provide non-dimensional index for optimization. • EGR rates are calibrated using EGR split index at specific operating conditions. • Multi-objective Pareto optimization is performed to minimize NOx and BSFC. • Optimum split strategies are suggested with LP-rich dual-loop EGR at high load. - Abstract: A proposed dual-loop exhaust-gas recirculation (EGR) system that combines the features of high-pressure (HP) and low-pressure (LP) systems is considered a key technology for improving the combustion behavior of diesel engines. The fraction of HP and LP flows, known as the EGR split, for a given dual-loop EGR rate plays an important role in determining the engine performance and emission characteristics. Therefore, identifying the proper EGR split is important for the engine optimization and calibration processes, which affect the EGR response and deNOx efficiencies. The objective of this research was to develop a dual-loop EGR split strategy using numerical analysis and one-dimensional (1D) cycle simulation. A control system was modeled by coupling the 1D cycle simulation and the control logic. An EGR split index was developed to investigate the HP/LP split effects on the engine performance and emissions. Using the model-based control system, a multi-objective Pareto (MOP) analysis was used to minimize the NOx formation and fuel consumption through optimized engine operating parameters. The MOP analysis was performed using a response surface model extracted from Latin hypercube sampling as a fractional factorial design of experiment. By using an LP-rich dual-loop EGR, a high EGR rate was attained at low, medium, and high engine speeds, increasing the applicable load ranges compared to base conditions.
Application of Pareto optimization method for ontology matching in nuclear reactor domain
International Nuclear Information System (INIS)
Meenachi, N. Madurai; Baba, M. Sai
2017-01-01
This article describes the need for ontology matching and the methods to achieve it. Efforts were put into the implementation of a semantic-web-based knowledge management system for the nuclear domain, which necessitated the development of ontology matching methods. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto-based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, bigram matching, and the Kullback-Leibler and cosine divergences are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same was illustrated.
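Of the listed similarity measures, the bigram one is the simplest to sketch. The Dice-coefficient variant below and the concept labels are illustrative assumptions, not the article's actual implementation:

```python
def bigrams(term):
    """Character bigrams of a term, case-folded."""
    t = term.lower()
    return [t[i:i + 2] for i in range(len(t) - 1)]

def dice_similarity(a, b):
    """Bigram (Sorensen-Dice) similarity: 2 * |shared bigrams| divided
    by the total number of bigrams, counted with multiplicity."""
    ba, bb = bigrams(a), bigrams(b)
    if not ba or not bb:
        return 1.0 if a.lower() == b.lower() else 0.0
    common, rest = 0, list(bb)
    for g in ba:
        if g in rest:
            rest.remove(g)
            common += 1
    return 2.0 * common / (len(ba) + len(bb))

# Hypothetical concept labels from two nuclear-domain ontologies
sim_close = dice_similarity("ReactorCore", "Reactor_Core")
sim_far = dice_similarity("ControlRod", "FuelAssembly")
print(sim_close, sim_far)
```

A matcher would keep candidate pairs whose similarity clears a threshold; in practice several such measures are combined, which is where the Pareto view of the trade-off between them comes in.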
Application of Pareto optimization method for ontology matching in nuclear reactor domain
Energy Technology Data Exchange (ETDEWEB)
Meenachi, N. Madurai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Planning and Human Resource Management Div.; Baba, M. Sai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Resources Management Group
2017-12-15
This article describes the need for ontology matching and the methods to achieve it. Efforts were put into the implementation of a semantic-web-based knowledge management system for the nuclear domain, which necessitated the development of ontology matching methods. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching ontologies are also discussed. A Pareto-based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms such as the Jaro-Winkler distance, the Needleman-Wunsch algorithm, bigram matching, and the Kullback-Leibler and cosine divergences are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same was illustrated.
International Nuclear Information System (INIS)
Gharari, Rahman; Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi
2016-01-01
In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (k-eff) for gaining possibly longer operation cycles, along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semi-optimized loading pattern for the BPP optimization of the KWU pressurized water reactor.
Energy Technology Data Exchange (ETDEWEB)
Gharari, Rahman [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi [Nuclear Engineering Dept, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)
2016-10-15
In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (k-eff) for gaining possibly longer operation cycles, along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, SPEA-II, in achieving a semi-optimized loading pattern for the BPP optimization of the KWU pressurized water reactor.
Directory of Open Access Journals (Sweden)
Kangji Li
2017-02-01
Full Text Available This paper is concerned with the development of a high-resolution and control-friendly optimization framework for enclosed environments that helps improve thermal comfort, indoor air quality (IAQ), and the energy costs of the heating, ventilation and air conditioning (HVAC) system simultaneously. A computational fluid dynamics (CFD)-based optimization method which couples algorithms implemented in Matlab with CFD simulation is proposed. The key part of this method is a data-interactive mechanism which efficiently passes parameters between CFD simulations and optimization functions. A two-person office room is modeled for the numerical optimization. The multi-objective evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), is realized to explore the environment/energy Pareto front of the enclosed space. Performance analysis demonstrates the effectiveness of the presented optimization method.
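The crowding-distance component of NSGA-II, which keeps the explored Pareto front evenly spread, can be sketched as follows; the (energy cost, thermal discomfort) trade-off points are invented for illustration:

```python
def crowding_distance(front):
    """NSGA-II crowding distance: for each objective, sort the front and
    add the normalised gap between each solution's two neighbours;
    boundary solutions get infinite distance so they are always kept."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for rank in range(1, n - 1):
            i = order[rank]
            gap = front[order[rank + 1]][obj] - front[order[rank - 1]][obj]
            dist[i] += gap / (hi - lo)
    return dist

# Hypothetical (energy cost, thermal discomfort) trade-off points
front = [(1.0, 9.0), (2.0, 5.0), (4.0, 4.0), (8.0, 1.0)]
dist = crowding_distance(front)
print(dist)
```

When the population is truncated, solutions with larger crowding distance win ties within the same non-dominated rank, so sparsely covered regions of the environment/energy front survive.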
Sánchez, M S; Sarabia, L A; Ortiz, M C
2012-11-19
Experimental designs for a given task should be selected on the basis of the problem being solved and of some criteria that measure their quality. There are several such criteria because there are several aspects to be taken into account when making a choice. The most used criteria are probably the so-called alphabetical optimality criteria (for example, the A-, E-, and D-criteria related to the joint estimation of the coefficients, or the I- and G-criteria related to the prediction variance). Selecting a proper design to solve a problem implies finding a balance among these several criteria that measure the performance of the design in different aspects. Technically this is a problem of multi-criteria optimization, which can be tackled from different views. The approach presented here addresses the problem in its real vector nature, so that ad hoc experimental designs are generated with an algorithm based on evolutionary algorithms to find the Pareto-optimal front. There is no theoretical limit to the number of criteria that can be studied and, contrary to other approaches, not just one experimental design is computed but a set of experimental designs, all of them with the property of being Pareto-optimal in the criteria needed by the user. Besides, the use of an evolutionary algorithm makes it possible to search in both continuous and discrete domains and avoids the need for a set of candidate points, usual in exchange algorithms. Copyright © 2012 Elsevier B.V. All rights reserved.
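The alphabetical criteria mentioned above can be illustrated with the D-criterion, det(X'X), which rewards designs whose information matrix is large. The first-order two-factor model and the candidate designs below are textbook examples, not taken from the paper:

```python
def xtx(design):
    """Information matrix X'X for a first-order model with intercept,
    where each design point is a tuple of coded factor levels."""
    rows = [[1.0] + list(point) for point in design]
    p = len(rows[0])
    return [[sum(r[i] * r[j] for r in rows) for j in range(p)]
            for i in range(p)]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

full_factorial = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
unbalanced = [(-1, -1), (1, -1), (-1, 1), (-1, -1)]   # one point replicated

d_full = det3(xtx(full_factorial))
d_unbal = det3(xtx(unbalanced))
print(d_full, d_unbal)   # 64.0 32.0 -- the factorial design is D-better
```

A multi-criteria search of the kind described above would evaluate several such criteria per candidate design and keep the Pareto-optimal designs rather than maximising any single one.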
De Kerf, Geert; Van Gestel, Dirk; Mommaerts, Lobke; Van den Weyngaert, Danielle; Verellen, Dirk
2015-09-17
Modulation factor (MF) and pitch have an impact on Helical TomoTherapy (HT) plan quality, and HT users mostly use vendor-recommended settings. This study analyses the effect of these two parameters on both plan quality and treatment time for plans made with TomoEdge planning software by using the concept of Pareto optimal fronts. More than 450 plans with different combinations of pitch [0.10-0.50] and MF [1.2-3.0] were produced. These HT plans, with a field width (FW) of 5 cm, were created for five head and neck patients, and the homogeneity index, conformity index, dose-near-maximum (D2), and dose-near-minimum (D98) were analysed for the planning target volumes, as well as the mean dose and D2 for the most critical organs at risk. For every dose metric the median value is plotted against treatment time. A Pareto-like method is used in the analysis to show how pitch and MF influence both treatment time and plan quality. For small pitches (≤0.20), MF does not influence treatment time. The contrary is true for larger pitches (≥0.25), as lowering MF will decrease both treatment time and plan quality until the maximum gantry speed is reached. At that point, treatment time saturates and only plan quality will further decrease. The Pareto front analysis showed optimal combinations of pitch [0.23-0.45] and MF > 2.0 for a FW of 5 cm. Outside this range, plans become less optimal. As the vendor-recommended settings fall within this range, the use of these settings is validated.
A numerical approach to weak Pareto solutions to equilibrium problems with equilibrium constraints
Czech Academy of Sciences Publication Activity Database
Červinka, Michal
2006-01-01
Roč. 57, č. 7 (2006), s. 14-17 ISSN 1335-3632. [ISCAM 2006. International Conference in Applied Mathematics for undergraduate and graduate students. Bratislava, 07.04.2006-08.04.2006] Institutional research plan: CEZ:AV0Z10750506 Keywords : equilibrium problems with complementarity constraints * multiobjective optimization * variational analysis Subject RIV: BD - Theory of Information
Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F
2016-06-07
IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created using the conventional trial-and-error treatment planning process. Consequently, it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. Therefore, we created a golden standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients. Next, the dataset was used to assess the performance of a treatment planning QA model that uses the overlap volume histogram (OVH). 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients. Next it was applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden standard plans for the rectum Dmean, V65, and V75, and the Dmean of the anus and the bladder. For the rectum, the prediction errors (predicted minus achieved) were only -0.2 ± 0.9 Gy (mean ± 1 SD) for Dmean, -1.0 ± 1.6% for V65, and -0.4 ± 1.1% for V75. For the Dmean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients only led to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated. This dataset can be used to train new, and validate and compare existing, treatment planning QA models, and has been made publicly available. The OVH model was highly accurate.
Toward computational screening in heterogeneous catalysis: Pareto-optimal methanation catalysts
DEFF Research Database (Denmark)
Andersson, Martin; Bligaard, Thomas; Kustov, Arkadii
2006-01-01
Finding the solids that are the best catalysts for a given reaction is a daunting task due to the large number of combinations and structures of multicomponent surfaces. In addition, it is not only the reaction rate that needs to be optimized: the selectivity, durability, and cost must also be ta...
Informed multi-objective decision-making in environmental management using Pareto optimality
Maureen C. Kennedy; E. David Ford; Peter Singleton; Mark Finney; James K. Agee
2008-01-01
Effective decision-making in environmental management requires the consideration of multiple objectives that may conflict. Common optimization methods use weights on the multiple objectives to aggregate them into a single value, neglecting valuable insight into the relationships among the objectives in the management problem.
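The contrast drawn above, between Pareto analysis and weighted-sum aggregation, starts from plain dominance filtering. A minimal illustrative sketch (toy data, not from the paper), with all objectives minimized:

```python
def pareto_front(points):
    """Keep only points not dominated by another point (all objectives minimized)."""
    def dominates(q, p):
        # q dominates p: q is no worse everywhere and q differs from p
        return all(qi <= pi for qi, pi in zip(q, p)) and q != p
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Five hypothetical management plans scored on (cost, habitat loss):
plans = [(1, 5), (2, 4), (3, 3), (4, 6), (5, 1)]
front = pareto_front(plans)  # (4, 6) is dominated by (3, 3) and drops out
```

Keeping the whole front, rather than a single weighted-sum optimum, is what preserves the inter-objective relationships the abstract refers to.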
A bridge network maintenance framework for Pareto optimization of stakeholders/users costs
International Nuclear Information System (INIS)
Orcesi, Andre D.; Cremona, Christian F.
2010-01-01
For managing highway bridges, stakeholders require efficient and practical decision making techniques. In a context of limited bridge management budget, it is crucial to determine the most effective breakdown of financial resources over the different structures of a bridge network. Bridge management systems (BMSs) have been developed for such a purpose. However, they generally rely on an individual approach: the influence of the position of bridges in the transportation network, and the consequences of inadequate service for the network users due to maintenance actions or bridge failure, are not taken into consideration. Therefore, maintenance strategies obtained with current BMSs do not necessarily lead to an optimal level of service (LOS) of the bridge network for the users of the transportation network. Besides, the assessment of the structural performance of highway bridges usually requires access to the geometrical and mechanical properties of their components. Such information might not be available for all structures in a bridge network for which managers try to schedule and prioritize maintenance strategies. In contrast, visual inspections are performed regularly, and the resulting information is generally available for all structures of the bridge network. The objective of this paper is threefold: (i) propose an advanced network-level bridge management system considering the position of each bridge in the transportation network, (ii) use information obtained at visual inspections to assess the performance of bridges, and (iii) compare optimal maintenance strategies, obtained with a genetic algorithm, when considering the interests of users and bridge owner either separately as conflicting criteria, or simultaneously as a common interest for the whole community. In each case, safety and serviceability aspects are taken into account in the model when determining optimal strategies. The theoretical and numerical developments are applied to a French bridge network.
Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation
International Nuclear Information System (INIS)
Zio, E.; Bazzo, R.
2011-01-01
Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.
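The Level Diagrams idea described above can be sketched compactly: normalize each objective over the front to [0, 1] and score each point by the norm of its normalized objective vector. This is an assumed, minimal version (the paper's exact normalization and metric may differ):

```python
import math

def level_diagram_norms(front):
    """For each Pareto point, return the Euclidean norm of its objective vector
    after normalizing every objective over the front to [0, 1].
    Assumes each objective actually varies over the front (hi > lo)."""
    n = len(front[0])
    lo = [min(p[j] for p in front) for j in range(n)]
    hi = [max(p[j] for p in front) for j in range(n)]
    def norm(p):
        return math.sqrt(sum(((p[j] - lo[j]) / (hi[j] - lo[j])) ** 2 for j in range(n)))
    return [norm(p) for p in front]

front = [(0, 10), (5, 5), (10, 0)]
levels = level_diagram_norms(front)  # the balanced point (5, 5) has the lowest level
```

Plotting each objective (and each decision variable) against these synchronized norm values is what lets the decision maker see how close each solution is to the ideal point.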
Mozaffari, Ahmad; Vajedi, Mahyar; Chehresaz, Maryyeh; Azad, Nasser L.
2016-03-01
The urgent need to meet increasingly tight environmental regulations and new fuel economy requirements has motivated system science researchers and automotive engineers to take advantage of emerging computational techniques to further advance hybrid electric vehicle and plug-in hybrid electric vehicle (PHEV) designs. In particular, research has focused on vehicle powertrain system design optimization, to reduce the fuel consumption and total energy cost while improving the vehicle's driving performance. In this work, two different natural optimization machines, namely the synchronous self-learning Pareto strategy and the elitist non-dominated sorting genetic algorithm, are implemented for component sizing of a specific power-split PHEV platform with a Toyota plug-in Prius as the baseline vehicle. To do this, a high-fidelity model of the Toyota plug-in Prius is employed for the numerical experiments using the Autonomie simulation software. Based on the simulation results, it is demonstrated that Pareto-based algorithms can successfully optimize the design parameters of the vehicle powertrain.
A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.
Brusco, Michael J; Steinley, Douglas
2012-02-01
There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.
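The supported/unsupported distinction above can be seen in a small numerical example (illustrative only, not the matrix-permutation setting): an unsupported Pareto-efficient point lies above the convex hull of the front, so no weighted sum ever selects it.

```python
# Three Pareto-optimal points for two minimized criteria. B is not dominated
# by A or C, but it lies above the segment joining them, so it is unsupported.
points = {"A": (0, 10), "B": (5, 6), "C": (10, 0)}

found = set()
for k in range(101):  # sweep the weight w over [0, 1] in steps of 0.01
    w = k / 100
    best = min(points, key=lambda name: w * points[name][0] + (1 - w) * points[name][1])
    found.add(best)
# found == {"A", "C"}: the weighted-sum sweep never returns B
```

This is exactly the limitation the abstract points out: exact weighted-sum solutions yield only supported Pareto-efficient solutions, and a heuristic that searches the front directly is needed to recover points like B.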
Energy Technology Data Exchange (ETDEWEB)
Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B [University Medical Center Utrecht, Utrecht (Netherlands); Breedveld, S; Sharfo, A; Heijmen, B [Erasmus University Medical Center Rotterdam, Rotterdam (Netherlands)
2016-06-15
Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan
International Nuclear Information System (INIS)
Kontaxis, C; Bol, G; Lagendijk, J; Raaymakers, B; Breedveld, S; Sharfo, A; Heijmen, B
2016-01-01
Purpose: To develop a new IMRT treatment planning methodology suitable for the new generation of MR-linear accelerator machines. The pipeline is able to deliver Pareto-optimal plans and can be utilized for conventional treatments as well as for inter- and intrafraction plan adaptation based on real-time MR-data. Methods: A Pareto-optimal plan is generated using the automated multicriterial optimization approach Erasmus-iCycle. The resulting dose distribution is used as input to the second part of the pipeline, an iterative process which generates deliverable segments that target the latest anatomical state and gradually converges to the prescribed dose. This process continues until a certain percentage of the dose has been delivered. Under a conventional treatment, a Segment Weight Optimization (SWO) is then performed to ensure convergence to the prescribed dose. In the case of inter- and intrafraction adaptation, post-processing steps like SWO cannot be employed due to the changing anatomy. This is instead addressed by transferring the missing/excess dose to the input of the subsequent fraction. In this work, the resulting plans were delivered on a Delta4 phantom as a final Quality Assurance test. Results: A conventional static SWO IMRT plan was generated for two prostate cases. The sequencer faithfully reproduced the input dose for all volumes of interest. For the two cases the mean relative dose difference of the PTV between the ideal input and sequenced dose was 0.1% and −0.02% respectively. Both plans were delivered on a Delta4 phantom and passed the clinical Quality Assurance procedures by achieving 100% pass rate at a 3%/3mm gamma analysis. Conclusion: We have developed a new sequencing methodology capable of online plan adaptation. In this work, we extended the pipeline to support Pareto-optimal input and clinically validated that it can accurately achieve these ideal distributions, while its flexible design enables inter- and intrafraction plan
Mokeddem, Diab; Khellaf, Abdelhafid
2009-01-01
Optimal design problems are widely known for their multiple performance measures that often compete with each other. In this paper, an optimal multiproduct batch chemical plant design is presented. The design is first formulated as a multiobjective optimization problem, to be solved using the well-suited non-dominated sorting genetic algorithm (NSGA-II). NSGA-II has the capability to achieve fine tuning of variables in determining a set of non-dominated solutions distributed along the Pareto front in a single run of the algorithm. The ability of NSGA-II to identify a set of optimal solutions provides the decision-maker (DM) with a complete picture of the optimal solution space from which to make better and more appropriate choices. An outranking with PROMETHEE II then helps the decision-maker to finalize the selection of the best compromise. The effectiveness of the NSGA-II method on multiobjective optimization problems is illustrated through two carefully referenced examples. PMID:19543537
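The non-dominated sorting at the core of NSGA-II can be sketched as follows. This is a simple quadratic-time version for illustration, not the fast sorting used in the actual algorithm, and it omits the crowding-distance step:

```python
def nondominated_sort(points):
    """Assign each point a front rank (0 = best), NSGA-II style (minimization)."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    rank = [None] * len(points)
    remaining = set(range(len(points)))
    r = 0
    while remaining:
        # the current front: points dominated by no other remaining point
        front = {i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)}
        for i in front:
            rank[i] = r
        remaining -= front
        r += 1
    return rank

ranks = nondominated_sort([(1, 5), (2, 4), (3, 3), (4, 6), (5, 5)])
```

The first three points are mutually non-dominated (rank 0), while (4, 6) and (5, 5) are each dominated by a rank-0 point and fall into the second front.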
Directory of Open Access Journals (Sweden)
Jarosław Rudy
2015-01-01
Full Text Available In this paper the job shop scheduling problem (JSP) with two criteria minimized simultaneously is considered. JSP is a frequently used model in real-world applications of combinatorial optimization, yet multi-objective job shop problems (MOJSP) have rarely been studied. We implement and compare two multi-agent nature-based methods, namely ant colony optimization (ACO) and a genetic algorithm (GA), for MOJSP. Both methods employ a technique taken from multi-criteria decision analysis in order to establish a ranking of solutions. ACO and GA differ in the method of keeping information about previously found solutions and their quality, which affects the course of the search. As a result, new features of the Pareto approximations provided by these algorithms are observed: aside from the slight superiority of the ACO method, the Pareto frontier approximations provided by the two methods are disjoint sets. Thus, both methods can be used to search mutually exclusive areas of the Pareto frontier.
International Nuclear Information System (INIS)
Zio, E.; Bazzo, R.
2010-01-01
In this paper, a procedure is developed for identifying a number of representative solutions manageable for decision-making in a multiobjective optimization problem concerning the test intervals of the components of a safety system of a nuclear power plant. Pareto Front solutions are identified by a genetic algorithm and then clustered by subtractive clustering into 'families'. On the basis of the decision maker's preferences, each family is then synthetically represented by a 'head of the family' solution. This is done by introducing a scoring system that ranks the solutions with respect to the different objectives; a fuzzy preference assignment is employed for this purpose. Level Diagrams are then used to represent, analyze and interpret the Pareto Fronts reduced to the head-of-the-family solutions.
Pareto joint inversion of 2D magnetotelluric and gravity data
Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek
2015-04-01
In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project are described. At this stage of development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. A Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on a modified Particle Swarm Optimization (PSO). This algorithm was adapted to handle two or more target functions at once. An additional algorithm was used to eliminate non-realistic solution proposals. Because PSO is a method of stochastic global optimization, it requires many proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. There are many advantages of the proposed solution of joint inversion problems. First of all, the Pareto scheme eliminates cumbersome rescaling of the target functions, which can highly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data, which are usually very hard to incorporate into a regular inversion scheme. SBI parameterisation not only limits the problem of dimensionality, but also makes constraining of the solution easier. At this stage of work, the decision was made to test the approach using MT and gravity data, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. Presented results were obtained for synthetic models, imitating real geological conditions, where
Directory of Open Access Journals (Sweden)
Yoichi Hayashi
2016-01-01
Full Text Available Historically, the assessment of credit risk has proved to be both highly important and extremely difficult. Currently, financial institutions rely on the use of computer-generated credit scores for risk assessment. However, automated risk evaluations are currently imperfect, and the loss of vast amounts of capital could be prevented by improving the performance of computerized credit assessments. A number of approaches have been developed for the computation of credit scores over the last several decades, but these methods have been considered too complex and lacking in interpretability, and have therefore not been widely adopted. Therefore, in this study, we provide the first comprehensive comparison of results regarding the assessment of credit risk obtained using 10 runs of 10-fold cross validation of the Re-RX algorithm family, including the Re-RX algorithm, the Re-RX algorithm with both discrete and continuous attributes (Continuous Re-RX), the Re-RX algorithm with J48graft, the Re-RX algorithm with a trained neural network (Sampling Re-RX), NeuroLinear, NeuroLinear+GRG, and three unique rule extraction techniques involving support vector machines and Minerva, on four real-life, two-class mixed credit-risk datasets. We also discuss the roles of various newly-extended types of the Re-RX algorithm and high performance classifiers from a Pareto optimal perspective. Our findings suggest that Continuous Re-RX, Re-RX with J48graft, and Sampling Re-RX comprise a powerful management tool that allows the creation of advanced, accurate, concise and interpretable decision support systems for credit risk evaluation. In addition, from a Pareto optimal perspective, the Re-RX algorithm family has superior features in relation to the comprehensibility of extracted rules and the potential for credit scoring with Big Data.
Searching for the Pareto frontier in multi-objective protein design.
Nanda, Vikas; Belure, Sandeep V; Shir, Ofer M
2017-08-01
The goal of protein engineering and design is to identify sequences that adopt three-dimensional structures of desired function. Often, this is treated as a single-objective optimization problem, identifying the sequence-structure solution with the lowest computed free energy of folding. However, many design problems are multi-state, multi-specificity, or otherwise require concurrent optimization of multiple objectives. There may be tradeoffs among objectives, where improving one feature requires compromising another. The challenge lies in determining solutions that are part of the Pareto optimal set: designs where no further improvement can be achieved in any of the objectives without degrading one of the others. Pareto optimality problems are found in all areas of study, from economics to engineering to biology, and computational methods have been developed specifically to identify the Pareto frontier. We review progress in multi-objective protein design, the development of Pareto optimization methods, and present a specific case study using multi-objective optimization methods to model the tradeoff among three parameters (stability, specificity, and complexity) of a set of interacting synthetic collagen peptides.
A Collaborative Neurodynamic Approach to Multiple-Objective Distributed Optimization.
Yang, Shaofu; Liu, Qingshan; Wang, Jun
2018-04-01
This paper is concerned with multiple-objective distributed optimization. Based on objective weighting and decision space decomposition, a collaborative neurodynamic approach to multiobjective distributed optimization is presented. In the approach, a system of collaborative neural networks is developed to search for Pareto optimal solutions, where each neural network is associated with one objective function and given constraints. Sufficient conditions are derived for ascertaining the convergence to a Pareto optimal solution of the collaborative neurodynamic system. In addition, it is proved that each connected subsystem can generate a Pareto optimal solution when the communication topology is disconnected. Then, a switching-topology-based method is proposed to compute multiple Pareto optimal solutions for discretized approximation of Pareto front. Finally, simulation results are discussed to substantiate the performance of the collaborative neurodynamic approach. A portfolio selection application is also given.
Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin
Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.
2018-01-01
Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
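The idea of a polynomially succinct ε-approximation of the frontier can be illustrated with a simple grid-snapping reduction. This toy is assumed for illustration and is far simpler than the paper's DP scheme; it merely shows how an ε-grid bounds the size of the retained set:

```python
def eps_pareto(points, eps):
    """Keep one representative per eps-grid box, then drop dominated
    representatives (all objectives minimized). The result is a succinct
    approximation of the Pareto frontier to within eps per objective."""
    boxes = {}
    for p in points:
        b = tuple(int(x // eps) for x in p)
        if b not in boxes or p < boxes[b]:  # lexicographic tie-break inside a box
            boxes[b] = p
    reps = list(boxes.values())
    def dominates(a, c):
        return all(x <= y for x, y in zip(a, c)) and a != c
    return sorted(p for p in reps if not any(dominates(q, p) for q in reps))

approx = eps_pareto([(0.1, 9.9), (0.15, 9.85), (5.0, 5.0), (9.9, 0.1)], eps=1.0)
```

The two near-identical points land in the same ε-box and are merged into one representative, so the approximation's size is bounded by the number of occupied boxes rather than by the number of Pareto points.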
Pareto-path multitask multiple kernel learning.
Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C
2015-01-01
A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.
Savsani, Vimal; Patel, Vivek; Gadhvi, Bhargav; Tawhid, Mohamed
2017-01-01
Most of the modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however, the search techniques of other recently developed metaheuristics are an emerging topic among researchers. This paper proposes a novel multiobjective optimization algorithm named multiobjective heat transfer search (MOHTS), which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs elitist nondominated sorting and crowding dis...
Computing gap free Pareto front approximations with stochastic search algorithms.
Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali
2010-01-01
Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation (which are entirely determined by the archiving strategy and the value of epsilon) have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we aim in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy (multi-objective continuation methods) by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
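An archive update under epsilon-dominance, of the kind these convergence results build on, might look like the following minimal sketch (an assumed textbook-style version, not one of the paper's archiving strategies):

```python
def eps_dominates(a, b, eps):
    """a eps-dominates b (minimization): a is within eps of beating b everywhere."""
    return all(ai - eps <= bi for ai, bi in zip(a, b))

def archive_add(archive, f_new, eps):
    """Reject f_new if some member eps-dominates it; otherwise insert it and
    evict members it strictly dominates. This keeps the archive finite on a
    bounded objective space, at the cost of eps-sized resolution (the 'gaps')."""
    if any(eps_dominates(a, f_new, eps) for a in archive):
        return archive
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and a != b
    return [a for a in archive if not dominates(f_new, a)] + [f_new]

A = []
for f in [(5.0, 5.0), (5.05, 5.05), (1.0, 9.0), (0.5, 5.0)]:
    A = archive_add(A, f, eps=0.1)
```

The second candidate is rejected because it falls within ε of an archived point; the last candidate dominates both survivors and replaces them, illustrating how the archive stays small but need not cover the front gap-free.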
Pareto fronts in clinical practice for pinnacle.
Janssen, Tomas; van Kesteren, Zdenko; Franssen, Gijs; Damen, Eugène; van Vliet, Corine
2013-03-01
Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 h for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT. Copyright © 2013 Elsevier Inc. All rights reserved.
International Nuclear Information System (INIS)
Okamoto, Takashi; Hanaoka, Yuya; Aiyoshi, Eitaro; Kobayashi, Yoko
2012-01-01
In this paper, we consider a multi-objective optimization method in order to obtain a preferred solution for the buffer material optimal design problem in the geological disposal of high-level radioactive waste. The buffer material optimal design problem is formulated as a constrained multi-objective optimization problem. Its Pareto optimal solutions are distributed evenly on the whole bounds of the feasible region. Hence, we develop a search method that lets a decision maker easily find a preferred solution among the Pareto optimal solutions, which are distributed evenly and widely. In the preferred solution search method, the visualization technique of a Pareto optimal solution set using the self-organizing map is introduced into the satisficing trade-off method, an interactive method to obtain a Pareto optimal solution that satisfies the decision maker. We confirm the effectiveness of the preferred solution search method on the buffer material optimal design problem. (author)
Coordinated Pitch & Torque Control of Large-Scale Wind Turbine Based on Pareto Efficiency Analysis
DEFF Research Database (Denmark)
Lin, Zhongwei; Chen, Zhenyu; Wu, Qiuwei
2018-01-01
For the existing pitch and torque control of the wind turbine generator system (WTGS), further development of coordinated control is necessary to improve effectiveness in practical applications. In this paper, the WTGS is modeled as a coupling combination of two subsystems: the generator torque control subsystem and the blade pitch control subsystem. Then, the pole positions in each control subsystem are adjusted coordinately to evaluate the controller participation and used as the objective of optimization. A two-level parameters-controllers coordinated optimization scheme is proposed and applied to optimize the controller coordination based on the Pareto optimization theory. Three solutions are obtained through optimization, which include the optimal torque solution, the optimal power solution, and a satisfactory solution. Detailed comparisons evaluate the performance of the three selected solutions...
Many-objective thermodynamic optimization of Stirling heat engine
International Nuclear Information System (INIS)
Patel, Vivek; Savsani, Vimal; Mudgal, Anurag
2017-01-01
This paper presents a rigorous investigation of many-objective (four-objective) thermodynamic optimization of a Stirling heat engine. The many-objective optimization problem is formed by considering maximization of thermal efficiency, power output, ecological function and exergy efficiency. A multi-objective heat transfer search (MOHTS) algorithm is proposed and applied to obtain a set of Pareto-optimal points. The many-objective optimization results form a solution set in a four-dimensional objective space; for visualization, they are projected onto two-dimensional objective spaces. Thus, the results of the four-objective optimization are represented by six Pareto fronts in two-dimensional objective spaces. These six Pareto fronts are compared with their corresponding two-objective Pareto fronts. Quantitative assessment of the obtained Pareto solutions is reported in terms of the spread and spacing measures. Different decision-making approaches such as LINMAP, TOPSIS and fuzzy are used to select a final optimal solution from the Pareto-optimal set of the many-objective optimization. Finally, to reveal the level of conflict between these objectives, the distribution of each decision variable in its allowable range is also shown in two-dimensional objective spaces. - Highlights: • Many-objective (i.e. four-objective) optimization of a Stirling engine is investigated. • The MOHTS algorithm is introduced and applied to obtain a set of Pareto points. • Comparative results of many-objective and two-objective optimization are presented. • Relationships of design variables in many-objective optimization are obtained. • The optimum solution is selected by using decision-making approaches.
Wakano, Joe Yuichiro; Miura, Chiaki
2014-02-01
Inheritance of culture is achieved by social learning and improvement is achieved by individual learning. To realize cumulative cultural evolution, social and individual learning should be performed in this order in one's life. However, it is not clear whether such a learning schedule can evolve by the maximization of individual fitness. Here we study optimal allocation of lifetime to learning and exploitation in a two-stage life history model under a constant environment. We show that the learning schedule by which high cultural level is achieved through cumulative cultural evolution is unlikely to evolve as a result of the maximization of individual fitness, if there exists a trade-off between the time spent in learning and the time spent in exploiting the knowledge that has been learned in earlier stages of one's life. Collapse of a fully developed culture is predicted by a game-theoretical analysis where individuals behave selfishly, e.g., less learning and more exploiting. The present study suggests that such factors as group selection, the ability of learning-while-working ("on the job training"), or environmental fluctuation might be important in the realization of rapid and cumulative cultural evolution that is observed in humans. Copyright © 2013 Elsevier Inc. All rights reserved.
Sensitivity analysis of efficient solution in vector MINMAX boolean programming problem
Directory of Open Access Journals (Sweden)
Vladimir A. Emelichev
2002-11-01
Full Text Available We consider a multiple criterion Boolean programming problem with MINMAX partial criteria. The extreme level of independent perturbations of the partial criteria parameters such that an efficient (Pareto optimal) solution preserves optimality is obtained.
Pareto navigation: algorithmic foundation of interactive multi-criteria IMRT planning.
Monz, M; Küfer, K H; Bortfeld, T R; Thieke, C
2008-02-21
Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle -- a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.
Pareto navigation-algorithmic foundation of interactive multi-criteria IMRT planning
International Nuclear Information System (INIS)
Monz, M; Kuefer, K H; Bortfeld, T R; Thieke, C
2008-01-01
Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle-a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far
Approximating convex Pareto surfaces in multiobjective radiotherapy planning
International Nuclear Information System (INIS)
Craft, David L.; Halabi, Tarek F.; Shih, Helen A.; Bortfeld, Thomas R.
2006-01-01
Radiotherapy planning involves inherent tradeoffs: the primary mission, to treat the tumor with a high, uniform dose, is in conflict with normal tissue sparing. We seek to understand these tradeoffs on a case-to-case basis, by computing for each patient a database of Pareto optimal plans. A treatment plan is Pareto optimal if there does not exist another plan which is better in every measurable dimension. The set of all such plans is called the Pareto optimal surface. This article presents an algorithm for computing well distributed points on the (convex) Pareto optimal surface of a multiobjective programming problem. The algorithm is applied to intensity-modulated radiation therapy inverse planning problems, and results of a prostate case and a skull base case are presented, in three and four dimensions, investigating tradeoffs between tumor coverage and critical organ sparing
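The Pareto-optimality definition used above (a plan is Pareto optimal if no other plan is better in every measurable dimension) can be illustrated with a brute-force dominance filter. This is a minimal sketch assuming all objectives are to be minimized; the plan scores below are toy values, not from the article:

```python
def dominates(a, b):
    """a dominates b: a is no worse in every objective (minimization
    assumed) and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated (Pareto optimal) subset of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Toy plans scored on (tumor-coverage penalty, organ-sparing penalty):
plans = [(1.0, 2.0), (2.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
front = pareto_front(plans)  # (2,2) and (3,3) are dominated
```

The filter is O(n^2) in the number of plans, which is adequate for the plan databases of a few dozen points typically discussed in this context.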
Approximative solutions of stochastic optimization problem
Czech Academy of Sciences Publication Activity Database
Lachout, Petr
2010-01-01
Roč. 46, č. 3 (2010), s. 513-523 ISSN 0023-5954 R&D Projects: GA ČR GA201/08/0539 Institutional research plan: CEZ:AV0Z10750506 Keywords : Stochastic optimization problem * sensitivity * approximative solution Subject RIV: BA - General Mathematics Impact factor: 0.461, year: 2010 http://library.utia.cas.cz/separaty/2010/SI/lachout-approximative solutions of stochastic optimization problem.pdf
Multiobjective optimization of urban water resources: Moving toward more practical solutions
Mortazavi, Mohammad; Kuczera, George; Cui, Lijie
2012-03-01
The issue of drought security is of paramount importance for cities located in regions subject to severe prolonged droughts. The prospect of "running out of water" for an extended period would threaten the very existence of the city. Managing drought security for an urban water supply is a complex task involving trade-offs between conflicting objectives. In this paper a multiobjective optimization approach for urban water resource planning and operation is developed to overcome practically significant shortcomings identified in previous work. A case study based on the headworks system for Sydney (Australia) demonstrates the approach and highlights the potentially serious shortcomings of Pareto optimal solutions conditioned on short climate records, incomplete decision spaces, and constraints to which system response is sensitive. Where high levels of drought security are required, optimal solutions conditioned on short climate records are flawed. Our approach addresses drought security explicitly by identifying approximate optimal solutions in which the system does not "run dry" in severe droughts with expected return periods up to a nominated (typically large) value. In addition, it is shown that failure to optimize the full mix of interacting operational and infrastructure decisions and to explore the trade-offs associated with sensitive constraints can lead to significantly more costly solutions.
An extension of the directed search domain algorithm to bilevel optimization
Wang, Kaiqiang; Utyuzhnikov, Sergey V.
2017-08-01
A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.
Multiobjective Optimization Involving Quadratic Functions
Directory of Open Access Journals (Sweden)
Oscar Brito Augusto
2014-01-01
Full Text Available Multiobjective optimization is nowadays a watchword in engineering projects. Although the underlying idea is simple, implementing a procedure to solve a general problem is not an easy task. Evolutionary algorithms are widespread as a satisfactory technique to find a candidate set for the solution. Usually they supply a discrete picture of the Pareto front even if this front is continuous. In this paper we propose three methods for solving unconstrained multiobjective optimization problems involving quadratic functions. In the first, for biobjective optimization defined in the bidimensional space, a continuous Pareto set is found analytically. In the second, applicable to multiobjective optimization, a condition test is proposed to check whether a point in the decision space is a Pareto optimum or not and, in the third, with functions defined in n-dimensional space, a direct noniterative algorithm is proposed to find the Pareto set. Simple problems highlight the suitability of the proposed methods.
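Why the Pareto set of a biobjective quadratic problem can be continuous is easy to see in the simplest case. For two convex quadratics with identical (identity) Hessians, the weighted-sum minimizer is a convex combination of the two individual minimizers, so the Pareto set is the straight segment joining them. The minimizers m1 and m2 below are illustrative values, not taken from the paper:

```python
# f1(x) = ||x - m1||^2, f2(x) = ||x - m2||^2 (identity Hessians assumed).
# Setting the gradient of w*f1 + (1-w)*f2 to zero gives
# x* = w*m1 + (1-w)*m2, so the Pareto set is the segment from m1 to m2.
m1 = (0.0, 0.0)  # illustrative minimizer of f1
m2 = (2.0, 1.0)  # illustrative minimizer of f2

def weighted_sum_minimizer(w):
    """Closed-form minimizer of w*f1 + (1-w)*f2 for 0 <= w <= 1."""
    return tuple(w * a + (1.0 - w) * b for a, b in zip(m1, m2))

# Sampling w in [0, 1] traces the continuous Pareto set analytically.
pareto_segment = [weighted_sum_minimizer(w / 4.0) for w in range(5)]
```

For unequal Hessians the segment becomes a curve, but the closed-form flavor of the paper's first method is the same: no iterative search is needed.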
Directory of Open Access Journals (Sweden)
Kareema Abed Al-Kadim
2017-12-01
Full Text Available In this paper the Rayleigh Pareto distribution is introduced, denoted by R_PD. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moment about the mean, the r-th moment about the origin, reliability, hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this work is to introduce a new distribution.
Multi-objective Optimization Strategies Using Adjoint Method and Game Theory in Aerodynamics
Tang, Zhili
2006-08-01
There are currently three different game strategies originating in economics: (1) cooperative games (Pareto front), (2) competitive games (Nash game) and (3) hierarchical games (Stackelberg game). Each game achieves a different equilibrium with different performance, and the players play different roles in each game. Here, we introduce the game concept into aerodynamic design and combine it with the adjoint method to solve multi-criteria aerodynamic optimization problems. The performance distinction of the equilibria of these three game strategies was investigated by numerical experiments. We computed the Pareto front and the Nash and Stackelberg equilibria of the same optimization problem, with two conflicting and hierarchical targets, under different parameterizations by using the deterministic optimization method. The numerical results show clearly that all the equilibrium solutions are inferior to the Pareto front. Non-dominated Pareto front solutions are obtained; however, the CPU cost to capture a set of solutions makes the Pareto front an expensive tool for the designer.
Multi-objective optimization strategies using adjoint method and game theory in aerodynamics
Institute of Scientific and Technical Information of China (English)
Zhili Tang
2006-01-01
There are currently three different game strategies originated in economics: (1) Cooperative games (Pareto front), (2) Competitive games (Nash game) and (3) Hierarchical games (Stackelberg game). Each game achieves different equilibria with different performance, and their players play different roles in the games. Here, we introduced game concept into aerodynamic design, and combined it with adjoint method to solve multicriteria aerodynamic optimization problems. The performance distinction of the equilibria of these three game strategies was investigated by numerical experiments. We computed Pareto front, Nash and Stackelberg equilibria of the same optimization problem with two conflicting and hierarchical targets under different parameterizations by using the deterministic optimization method. The numerical results show clearly that all the equilibria solutions are inferior to the Pareto front. Non-dominated Pareto front solutions are obtained, however the CPU cost to capture a set of solutions makes the Pareto front an expensive tool to the designer.
Monopoly, Pareto and Ramsey mark-ups
Ten Raa, T.
2009-01-01
Monopoly prices are too high. It is a price level problem, in the sense that the relative mark-ups have Ramsey optimal proportions, at least for independent constant elasticity demands. I show that this feature of monopoly prices breaks down the moment one demand is replaced by the textbook linear demand or, even within the constant elasticity framework, dependence is introduced. The analysis provides a single Generalized Inverse Elasticity Rule for the problems of monopoly, Pareto and Ramsey.
GENERALIZED DOUBLE PARETO SHRINKAGE.
Armagan, Artin; Dunson, David B; Lee, Jaeyong
2013-01-01
We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inferences in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and Normal-Jeffreys' priors. While it has a spike at zero like the Laplace density, it also has a Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. We investigate the properties of the maximum a posteriori estimator, as sparse estimation plays an important role in many problems, reveal connections with some well-established regularization procedures, and show some asymptotic results. The performance of the prior is tested through simulations and an application.
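In one common parameterization, the generalized double Pareto density is f(x) = (2ξ)^(-1) (1 + |x|/(αξ))^(-(α+1)), with a sharp peak at zero and polynomial (Student's t-like) tails. The sketch below uses that form; the parameter names α (shape) and ξ (scale) are this note's assumption and may differ from the paper's notation:

```python
def gdp_pdf(x, alpha=1.0, xi=1.0):
    """Generalized double Pareto density in one common parameterization:
    f(x) = 1/(2*xi) * (1 + |x|/(alpha*xi))**-(alpha + 1).
    Like the Laplace density it peaks sharply at zero; unlike it, the
    tails decay polynomially, which relaxes shrinkage of large signals."""
    return (1.0 / (2.0 * xi)) * (1.0 + abs(x) / (alpha * xi)) ** (-(alpha + 1.0))

# The density is symmetric with f(0) = 1/(2*xi):
peak = gdp_pdf(0.0)
```

The scale-mixture representation bridging the Laplace and Normal-Jeffreys' priors, and the Gibbs sampler built on it, are described in the abstract; the snippet only shows the density shape.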
Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning
International Nuclear Information System (INIS)
Serna, J I; Monz, M; Kuefer, K H; Thieke, C
2009-01-01
One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.
Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning.
Serna, J I; Monz, M; Küfer, K H; Thieke, C
2009-10-21
One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.
Kullback-Leibler divergence and the Pareto-Exponential approximation.
Weinberg, G V
2016-01-01
Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model assumption, simpler radar detection schemes can be applied. An information theory approach is introduced to investigate the Pareto-Exponential approximation. By analysing the Kullback-Leibler divergence between the two distributions it is possible not only to assess when the approximation is valid, but also to determine, for a given Pareto model, the optimal Exponential approximation.
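A numerical sketch of the idea, assuming a Pareto Type II (Lomax) clutter model with shape α and scale β (the paper's exact parameterization may differ): with β = α the Lomax law tends to a unit-mean Exponential as α grows, and the Kullback-Leibler divergence shrinks accordingly.

```python
import math

def lomax_pdf(x, alpha, beta):
    """Pareto Type II (Lomax) density on x >= 0 (assumed clutter model form)."""
    return alpha * beta ** alpha / (beta + x) ** (alpha + 1.0)

def exp_pdf(x, lam):
    """Exponential density on x >= 0."""
    return lam * math.exp(-lam * x)

def kl_divergence(p, q, hi=200.0, n=100000):
    """Midpoint-rule approximation of KL(p||q) on [0, hi]."""
    h = hi / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        px = p(x)
        if px > 0.0:
            total += px * math.log(px / q(x)) * h
    return total

# A heavy-tailed Pareto (small alpha) is far from Exponential...
kl_far = kl_divergence(lambda x: lomax_pdf(x, 2.0, 2.0),
                       lambda x: exp_pdf(x, 1.0))
# ...while for large alpha the Exponential approximation is excellent.
kl_near = kl_divergence(lambda x: lomax_pdf(x, 50.0, 50.0),
                        lambda x: exp_pdf(x, 1.0))
```

For very heavy tails (α ≤ 1 in this parameterization) the divergence against any Exponential is infinite, which is consistent with the intuition that the simpler detection schemes only apply in the near-Exponential regime.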
Solution quality improvement in chiller loading optimization
International Nuclear Information System (INIS)
Geem, Zong Woo
2011-01-01
In order to reduce greenhouse gas emissions, a multiple chiller system can be operated energy-efficiently using optimization techniques. So far, various optimization techniques have been proposed for the optimal chiller loading problem. Most of these techniques are meta-heuristic algorithms such as genetic algorithm, simulated annealing, and particle swarm optimization. However, this study applied a gradient-based method, named generalized reduced gradient, and obtained better results compared with other approaches. When two additional approaches were introduced (hybridization of meta-heuristic and gradient-based algorithms; and reformulation of the optimization structure by adding a binary variable denoting each chiller's operating status), the generalized reduced gradient found even better solutions. - Highlights: → Chiller loading problem is optimized by the generalized reduced gradient (GRG) method. → Results are compared with meta-heuristic algorithms such as genetic algorithm. → Results are further enhanced by hybridizing meta-heuristic and gradient techniques. → Results are further enhanced by modifying the optimization formulation.
Optimization of the annual construction program solutions
Directory of Open Access Journals (Sweden)
Oleinik Pavel
2017-01-01
Full Text Available The article considers possible optimization solutions in scheduling when forming the annual production programs of construction complex organizations. The optimization instrument is represented as a two-component system. As a fundamentally new approach in the first block of the annual program solutions, the authors propose to use a scientifically grounded methodology for determining the scope of work that can be transferred to a subcontractor without the General Contractor losing management control over the construction site. For this purpose, a special indicator characterizing the activity of the general construction organization is introduced: the coefficient of construction production management. In the second block, the principal methods for forming calendar plans for completing the critical work effort by the leading stream are proposed, depending on the intensity characteristic.
Zheng, Y.; Chen, J.
2017-09-01
A modified multi-objective particle swarm optimization method is proposed for obtaining Pareto-optimal solutions effectively. Different from traditional multi-objective particle swarm optimization methods, Kriging meta-models and the trapezoid index are introduced and integrated with the traditional one. Kriging meta-models are built to match expensive or black-box functions. By applying Kriging meta-models, the number of function evaluations is decreased and the boundary Pareto-optimal solutions are identified rapidly. For bi-objective optimization problems, the trapezoid index is calculated as the sum of the areas of the trapezoids formed by the Pareto-optimal solutions and one objective axis. It can serve as a measure of whether the Pareto-optimal solutions converge to the Pareto front. Illustrative examples indicate that, to obtain Pareto-optimal solutions, the proposed method needs fewer function evaluations than the traditional multi-objective particle swarm optimization method and the non-dominated sorting genetic algorithm II method, and both the accuracy and the computational efficiency are improved. The proposed method is also applied to the design of a deepwater composite riser, in which the structural performances are calculated by numerical analysis. The design aim is to enhance the tensile strength and minimize the cost. Under the buckling constraint, the optimal trade-off between tensile strength and material volume is obtained. The results demonstrate that the proposed method can effectively deal with multi-objective optimizations involving black-box functions.
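The trapezoid index for a bi-objective front can be sketched as follows. The exact definition in the paper is not reproduced here, so treat this as one plausible reading: the sum of trapezoid areas between consecutive sorted front points and the first-objective axis, with both objectives minimized.

```python
def trapezoid_index(points):
    """Area between the piecewise-linear front through the sorted points
    and the first-objective axis, i.e. the sum of trapezoid areas
    (one assumed reading of the index; both objectives minimized)."""
    pts = sorted(points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area
```

As the approximation converges toward the true front, the index stabilizes from one iteration to the next, which is what makes it usable as a convergence measure; a front closer to the axes also yields a smaller area.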
Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao
2016-01-01
Group scheduling is significant for an efficient and cost-effective production system. However, setup times exist between groups, and they need to be decreased by sequencing the groups in an efficient way. The current research focuses on a sequence-dependent group scheduling problem with the aim of simultaneously minimizing the makespan and the total weighted tardiness. In most production scheduling problems, the processing time of jobs is assumed to be fixed. However, the actual processing time of jobs may be reduced due to the "learning effect". The integration of the sequence-dependent group scheduling problem with learning effects has rarely been considered in the literature. Therefore, the current research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC) incorporating some steps of the genetic algorithm is proposed for the current problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small medium, medium, large medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three famous multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII) and the particle swarm optimization algorithm (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all the instances of the different sizes of problems.
A Pareto-Improving Minimum Wage
Eliav Danziger; Leif Danziger
2014-01-01
This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...
On the Truncated Pareto Distribution with applications
Zaninetti, Lorenzo; Ferraro, Mario
2008-01-01
The Pareto probability distribution is widely applied in different fields such as finance, physics, hydrology, geology and astronomy. This note deals with an application of the Pareto distribution to astrophysics, and more precisely to the statistical analysis of masses of stars and of diameters of asteroids. In particular, a comparison between the usual Pareto distribution and its truncated version is presented. Finally, a possible physical mechanism that produces Pareto tails for the distributio...
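The truncated version compared in the note has density α L^α x^(-α-1) / (1 - (L/H)^α) on [L, H]. A sketch of the density and inverse-CDF sampling follows; the symbols α (index), L (lower cutoff) and H (upper cutoff) are generic, not tied to the note's stellar or asteroid data:

```python
import random

def truncated_pareto_pdf(x, alpha, lo, hi):
    """Density of a Pareto(alpha) law with lower cutoff lo, truncated at hi."""
    if not lo <= x <= hi:
        return 0.0
    norm = 1.0 - (lo / hi) ** alpha
    return alpha * lo ** alpha * x ** (-(alpha + 1.0)) / norm

def sample_truncated_pareto(alpha, lo, hi, u=None):
    """Inverse-CDF draw: solve F(x) = u for x, with F the truncated CDF.
    u is a uniform(0,1) variate (drawn if omitted)."""
    if u is None:
        u = random.random()
    norm = 1.0 - (lo / hi) ** alpha
    return lo * (1.0 - u * norm) ** (-1.0 / alpha)
```

Unlike the plain Pareto law, the truncated variant has finite moments of all orders, which is the practical reason it can fit bounded physical samples such as asteroid diameters better.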
Record Values of a Pareto Distribution.
Ahsanullah, M.
The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (s>m) record values is obtained. A classical Pareto distribution provides reasonably…
Efficient approximation of black-box functions and Pareto sets
Rennen, G.
2009-01-01
In the case of time-consuming simulation models or other so-called black-box functions, we determine a metamodel which approximates the relation between the input- and output-variables of the simulation model. To solve multi-objective optimization problems, we approximate the Pareto set, i.e. the
Pareto law and Pareto index in the income distribution of Japanese companies
Ishikawa, Atushi
2004-01-01
In order to study in detail the phenomenon that income distribution follows a Pareto law, we analyze the database of high-income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index: the larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain that the Pareto index of company income distribution hardly changes, while the Pareto index of personal income distribution chang...
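The Pareto index α of a tail sample is commonly estimated with the maximum-likelihood (Hill-type) estimator α̂ = n / Σ ln(x_i / x_min). A minimal sketch on synthetic data (not the Japanese company database used in the paper):

```python
import math
import random

def pareto_index_mle(samples, x_min):
    """Hill/MLE estimate of the Pareto index for observations >= x_min."""
    tail = [x for x in samples if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

def sample_pareto(alpha, x_min, rng):
    """Inverse-CDF draw from a Pareto(alpha) law with scale x_min."""
    return x_min * rng.random() ** (-1.0 / alpha)

# Synthetic check: estimate recovers the index used to generate the data.
rng = random.Random(0)
incomes = [sample_pareto(2.0, 1.0, rng) for _ in range(20000)]
alpha_hat = pareto_index_mle(incomes, 1.0)
```

The standard error of this estimator is roughly α/√n, so with 20,000 samples the estimate lands very close to the true index.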
Investigating multi-objective fluence and beam orientation IMRT optimization
Potrebko, Peter S.; Fiege, Jason; Biagioli, Matthew; Poleszczuk, Jan
2017-07-01
Radiation Oncology treatment planning requires compromises to be made between clinical objectives that are invariably in conflict. It would be beneficial to have a ‘bird’s-eye-view’ perspective of the full spectrum of treatment plans that represent the possible trade-offs between delivering the intended dose to the planning target volume (PTV) while optimally sparing the organs-at-risk (OARs). In this work, the authors demonstrate Pareto-aware radiotherapy evolutionary treatment optimization (PARETO), a multi-objective tool featuring such bird’s-eye-view functionality, which optimizes fluence patterns and beam angles for intensity-modulated radiation therapy (IMRT) treatment planning. The problem of IMRT treatment plan optimization is managed as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. To achieve this, PARETO is built around a powerful multi-objective evolutionary algorithm, called Ferret, which simultaneously optimizes multiple fitness functions that encode the attributes of the desired dose distribution for the PTV and OARs. The graphical interfaces within PARETO provide useful information such as: the convergence behavior during optimization, trade-off plots between the competing objectives, and a graphical representation of the optimal solution database allowing for the rapid exploration of treatment plan quality through the evaluation of dose-volume histograms and isodose distributions. PARETO was evaluated for two relatively complex clinical cases, a paranasal sinus and a pancreas case. The end result of each PARETO run was a database of optimal (non-dominated) treatment plans that demonstrated trade-offs between the OAR and PTV fitness functions, which were all equally good in the Pareto-optimal sense (where no one objective can be improved without worsening at least one other). Ferret was able to produce high quality solutions even though a large number of parameters
Multi-objective optimal strategy for generating and bidding in the power market
International Nuclear Information System (INIS)
Peng Chunhua; Sun Huijuan; Guo Jianfeng; Liu Gang
2012-01-01
Highlights: ► A new benefit/risk/emission comprehensive generation optimization model is established. ► A hybrid multi-objective differential evolution optimization algorithm is designed. ► Fuzzy set theory and the entropy weighting method are employed to extract the general best solution. ► The proposed approach to generating and bidding is efficient for maximizing profit and minimizing both risk and emissions. - Abstract: Based on the coordinated interaction between unit output and electricity market prices, a comprehensive benefit/risk/emission generation optimization model with the objectives of maximal profit and minimal bidding risk and emissions is established. A hybrid multi-objective differential evolution optimization algorithm, which integrates Pareto non-dominated sorting with the differential evolution algorithm and improves the individual crowding distance mechanism and mutation strategy to avoid premature convergence and uneven search, is designed to obtain the Pareto optimal set of this model. Moreover, fuzzy set theory and the entropy weighting method are employed to extract one of the Pareto optimal solutions as the general best solution. Several optimization runs have been carried out on different cases of generation bidding and scheduling. The results confirm the potential and effectiveness of the proposed approach in solving the multi-objective optimization problem of generation bidding and scheduling. In addition, comparison with classical optimization algorithms demonstrates the superiority of the proposed algorithm in the completeness of the Pareto front, the even distribution of Pareto-optimal solutions, and high search speed.
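The Pareto non-dominated sorting step that such hybrid algorithms build on reduces to a dominance filter over objective vectors. A minimal generic sketch (all objectives minimized; not the authors' implementation, and the example numbers are illustrative):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Profit is negated so that both objectives are minimized:
# (negative profit, emissions)
solutions = [(-10.0, 2.0), (-8.0, 1.0), (-9.0, 3.0), (-7.0, 4.0)]
front = pareto_front(solutions)
```

With profit negated, the filter keeps exactly the trade-off solutions and discards the dominated ones; non-dominated sorting repeats this peeling to rank the whole population.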
Classification as clustering: a Pareto cooperative-competitive GP approach.
McIntyre, Andrew R; Heywood, Malcolm I
2011-01-01
Intuitively, population-based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet nonoverlapping behaviors, whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.
Directory of Open Access Journals (Sweden)
J. S. Sadaghiani
2014-04-01
Full Text Available The flexible job shop scheduling problem is a key factor in using production systems efficiently. This paper attempts to simultaneously optimize three objectives: minimization of the makespan, the total workload, and the maximum workload of jobs. Since the multi-objective flexible job shop scheduling problem is strongly NP-hard, an integrated heuristic approach is used to solve it. The proposed approach is based on a floating search procedure that employs local heuristic algorithms; it decomposes the considered problem into two sub-problems, assignment and sequencing. Search is first performed over the assignment space until an acceptable solution is reached, and then continues over the sequencing space using a heuristic algorithm. A multi-objective approach is used to produce Pareto solutions; the proposed approach is thus adapted to the NSGA-II algorithm and evaluated with Pareto archives. The elements and parameters of the proposed algorithms were tuned in preliminary experiments. Finally, computational results were used to analyze the efficiency of the proposed algorithm, and these results showed that the proposed algorithm is capable of producing efficient solutions.
Existence of pareto equilibria for multiobjective games without compactness
Shiraishi, Yuya; Kuroiwa, Daishi
2013-01-01
In this paper, we investigate the existence of Pareto and weak Pareto equilibria for multiobjective games without compactness. By employing an existence theorem of Pareto equilibria due to Yu and Yuan [10], several existence theorems of Pareto and weak Pareto equilibria for multiobjective games are established in a manner similar to that of Flores-Bazán.
The geometry of the Pareto front in biological phenotype space
Sheftel, Hila; Shoval, Oren; Mayo, Avi; Alon, Uri
2013-01-01
When organisms perform a single task, selection leads to phenotypes that maximize performance at that task. When organisms need to perform multiple tasks, a trade-off arises because no phenotype can optimize all tasks. Recent work addressed this question, and assumed that the performance at each task decays with distance in trait space from the best phenotype at that task. Under this assumption, the best-fitness solutions (termed the Pareto front) lie on simple low-dimensional shapes in trait space: line segments, triangles and other polygons. The vertices of these polygons are specialists at a single task. Here, we generalize this finding, by considering performance functions of general form, not necessarily functions that decay monotonically with distance from their peak. We find that, except for performance functions with highly eccentric contours, simple shapes in phenotype space are still found, but with mildly curving edges instead of straight ones. In a wide range of systems, complex data on multiple quantitative traits, which might be expected to fill a high-dimensional phenotype space, is predicted instead to collapse onto low-dimensional shapes; phenotypes near the vertices of these shapes are predicted to be specialists, and can thus suggest which tasks may be at play. PMID:23789060
Framatome ANP outage optimization support solutions
International Nuclear Information System (INIS)
Bombail, Jean Paul
2003-01-01
Over the last several years, leading plant operators have demonstrated that availability factors can be improved while safety and reliability are enhanced on a long-term basis and operating costs are reduced. Outage optimization is the term used to describe these long-term initiatives, through which a variety of measures aimed at shortening scheduled plant outages have been developed and successfully implemented by these leaders, working with service providers who introduced new technologies and process improvements. Following the leaders, all operators now have ambitious outage optimization plans, and median and average outage durations are decreasing worldwide. Future objectives are even more stringent and must include plant upgrades and component replacements performed to extend plant operating life. Outage optimization covers a broad range of activities, from modifications of plant systems to faster cooldown rates to human behavior improvements. It has been proven to reduce costs and avoid unplanned outages, thus supporting plant availability and helping to ensure the utility's competitive position in the marketplace.
The exponentiated generalized Pareto distribution | Adeyemi | Ife ...
African Journals Online (AJOL)
Recently Gupta et al. (1998) introduced the exponentiated exponential distribution as a generalization of the standard exponential distribution. In this paper, we introduce a three-parameter generalized Pareto distribution, the exponentiated generalized Pareto distribution (EGP). We present a comprehensive treatment of the ...
Finding Multiple Optimal Solutions to Optimal Load Distribution Problem in Hydropower Plant
Directory of Open Access Journals (Sweden)
Xinhao Jiang
2012-05-01
Full Text Available Optimal load distribution (OLD) among the generator units of a hydropower plant is a vital task for hydropower generation scheduling and management. Traditional optimization methods for solving this problem focus on finding a single optimal solution. However, many practical constraints on hydropower plant operation are very difficult, if not impossible, to model, and the optimal solution found by those models might be of limited practical use. This motivates us to find multiple optimal solutions to the OLD problem, which can provide more flexible choices for decision-making. Based on a special dynamic programming model, we use a modified shortest path algorithm to produce multiple solutions to the problem. It is shown that multiple optimal solutions exist for the case study of China's Geheyan hydropower plant, and that they are valuable for assessing the stability of generator units, showing the potential to reduce the number of times units cross vibration zones.
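The idea of harvesting multiple optima from a shortest-path formulation can be sketched on a toy DAG. This generic recursion is not the paper's modified algorithm; it assumes integer edge costs so that exact cost comparison is safe, and the node names are made up:

```python
from collections import defaultdict

def all_optimal_paths(edges, src, dst):
    """Enumerate every minimum-cost src->dst path in a DAG.
    `edges` maps (u, v) to an integer cost."""
    graph = defaultdict(list)
    for (u, v), c in edges.items():
        graph[u].append((v, c))

    best = {dst: 0}          # minimal cost-to-go per node
    paths = {dst: [[dst]]}   # all optimal suffix paths per node

    def solve(node):
        if node not in best:
            options = [(c + solve(v), v) for v, c in graph[node]]
            best[node] = min(c for c, _ in options)
            paths[node] = [[node] + tail
                           for c, v in options if c == best[node]
                           for tail in paths[v]]
        return best[node]

    solve(src)
    return best[src], paths[src]

# Two load-allocation routes with the same total cost:
edges = {("s", "a"): 1, ("s", "b"): 2, ("a", "t"): 2, ("b", "t"): 1}
cost, routes = all_optimal_paths(edges, "s", "t")
```

Instead of keeping one predecessor per state, the backward pass keeps every predecessor that attains the minimum, so all equally optimal schedules can be enumerated and screened against unmodeled operational constraints afterwards.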
Optimization of process and solution parameters in electrospinning polyethylene oxide
CSIR Research Space (South Africa)
Jacobs, V
2011-11-01
Full Text Available This paper reports the optimization of electrospinning process and solution parameters using a factorial design approach to obtain uniform polyethylene oxide (PEO) nanofibers. The parameters studied were the distance between nozzle and collector screen...
IMRT optimization: Variability of solutions and its radiobiological impact
International Nuclear Information System (INIS)
Mattia, Maurizio; Del Giudice, Paolo; Caccia, Barbara
2004-01-01
We aim at (1) defining and measuring a 'complexity' index for the optimization process of an intensity modulated radiation therapy treatment plan (IMRT TP), (2) devising an efficient approximate optimization strategy, and (3) evaluating the impact of the complexity of the optimization process on the radiobiological quality of the treatment. In this work, for a prostate therapy case, the IMRT TP optimization problem has been formulated in terms of dose-volume constraints. The cost function has been minimized in order to achieve the optimal solution, by means of an iterative procedure, which is repeated for many initial modulation profiles, and for each of them the final optimal solution is recorded. To explore the complexity of the space of such solutions we have chosen to minimize the cost function with an algorithm that is unable to avoid local minima. The size of the (sub)optimal solutions distribution is taken as an indicator of the complexity of the optimization problem. The impact of the estimated complexity on the probability of success of the therapy is evaluated using radiobiological indicators (Poissonian TCP model [S. Webb and A. E. Nahum, Phys. Med. Biol. 38(6), 653-666 (1993)] and NTCP relative seriality model [Kallman et al., Int. J. Radiat. Biol. 62(2), 249-262 (1992)]). We find in the examined prostate case a nontrivial distribution of local minima, which has symmetry properties allowing a good estimate of near-optimal solutions with a moderate computational load. We finally demonstrate that reducing the a priori uncertainty in the optimal solution results in a significant improvement of the probability of success of the TP, based on TCP and NTCP estimates
Optimal Mortgage Refinancing: A Closed Form Solution.
Agarwal, Sumit; Driscoll, John C; Laibson, David I
2013-06-01
We derive the first closed-form optimal refinancing rule: refinance when the current mortgage interest rate falls below the original rate by at least [Formula: see text]. In this formula W(.) is the Lambert W-function, [Formula: see text], ρ is the real discount rate, λ is the expected real rate of exogenous mortgage repayment, σ is the standard deviation of the mortgage rate, κ/M is the ratio of the tax-adjusted refinancing cost to the remaining mortgage value, and τ is the marginal tax rate. This expression is derived by solving a tractable class of refinancing problems. Our quantitative results closely match those reported by researchers using numerical methods.
Optimal Mortgage Refinancing: A Closed Form Solution
Agarwal, Sumit; Driscoll, John C.; Laibson, David I.
2013-01-01
We derive the first closed-form optimal refinancing rule: refinance when the current mortgage interest rate falls below the original rate by at least (1/ψ)[ϕ + W(−exp(−ϕ))]. In this formula W(.) is the Lambert W-function, ψ = √(2(ρ+λ))/σ, ϕ = 1 + ψ(ρ+λ)κ/(M(1−τ)), ρ is the real discount rate, λ is the expected real rate of exogenous mortgage repayment, σ is the standard deviation of the mortgage rate, κ/M is the ratio of the tax-adjusted refinancing cost to the remaining mortgage value, and τ is the marginal tax rate. This expression is derived by solving a tractable class of refinancing problems. Our quantitative results closely match those reported by researchers using numerical methods. PMID:25843977
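As a sketch of how such a rule is evaluated numerically, the snippet below assumes the intended definitions ψ = √(2(ρ+λ))/σ and ϕ = 1 + ψ(ρ+λ)κ/(M(1−τ)), and implements the Lambert W-function with a small Newton iteration (scipy.special.lambertw would serve equally well). The parameter values are illustrative, not taken from the paper:

```python
import math

def lambert_w(x, tol=1e-12):
    """Principal branch W0(x) via Newton's method, for x >= -1/e."""
    w = 0.0 if x > -0.25 else -0.5   # crude start; iterates stay above w = -1
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - x) / (ew * (1.0 + w))
        w -= step
        if abs(step) < tol:
            break
    return w

def refinance_threshold(rho, lam, sigma, kappa_over_M, tau):
    """(1/psi) * (phi + W(-exp(-phi))), with the assumed definitions of
    psi and phi given in the lead-in above."""
    psi = math.sqrt(2.0 * (rho + lam)) / sigma
    phi = 1.0 + psi * (rho + lam) * kappa_over_M / (1.0 - tau)
    return (phi + lambert_w(-math.exp(-phi))) / psi

# Illustrative inputs: 5% discount rate, 10% repayment hazard,
# 1% rate volatility, 1% cost-to-value ratio, 28% marginal tax rate.
gap = refinance_threshold(0.05, 0.10, 0.01, 0.01, 0.28)
```

For these inputs the threshold comes out on the order of one percentage point, which is the right magnitude for the rules of thumb the paper benchmarks against.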
Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas
2015-01-01
This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that a reliable access to generation through interconnected system could be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies are conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is in terms of flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
Directory of Open Access Journals (Sweden)
Muhammad Murtadha Othman
2015-01-01
Full Text Available This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that a reliable access to generation through interconnected system could be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies are conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is in terms of flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
Optimal configuration of power grid sources based on optimal particle swarm algorithm
Wen, Yuanhua
2018-04-01
To optimize the configuration of power grid sources, an improved particle swarm optimization algorithm is proposed. First, the concepts of multi-objective optimization and the Pareto solution set are introduced. Then, the performance of the classical genetic algorithm, the classical particle swarm optimization algorithm and the improved particle swarm optimization algorithm is analyzed, and the three algorithms are simulated. Comparison of the test results demonstrates the superiority of the improved algorithm in convergence and optimization performance, laying the foundation for the subsequent micro-grid power configuration optimization.
Optimization of multi-objective micro-grid based on improved particle swarm optimization algorithm
Zhang, Jian; Gan, Yang
2018-04-01
The paper presents a multi-objective optimal configuration model for an independent micro-grid with the aims of economy and environmental protection. The Pareto solution set can be obtained by solving the multi-objective optimization configuration model of the micro-grid with the improved particle swarm algorithm. The feasibility of the improved particle swarm optimization algorithm for the multi-objective model is verified, providing an important reference for multi-objective optimization of independent micro-grids.
Multi-objective Reactive Power Optimization Based on Improved Particle Swarm Algorithm
Cui, Xue; Gao, Jian; Feng, Yunbin; Zou, Chenlu; Liu, Huanlei
2018-01-01
In this paper, an optimization model is proposed for reactive power optimization problems, with minimum active power loss, minimum node voltage deviation and maximum static voltage stability margin as the optimization objectives. By defining an index value of reactive power compensation, the optimal reactive power compensation nodes were selected. The particle swarm optimization algorithm was improved by introducing a selection pool of global optima and a probabilistic global optimum (p-gbest). A set of Pareto optimal solutions is obtained by this algorithm, and by calculating the fuzzy membership values of the Pareto optimal solution set, the individual with the smallest fuzzy membership value is selected as the final optimization result. The improved algorithm is used to optimize the reactive power of the IEEE 14-bus standard node system. Comparison and analysis of the results show that the algorithm achieves a very good optimization effect.
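The fuzzy-membership extraction step used to pick one solution from a Pareto set is commonly implemented with linear membership functions. The following is a generic sketch under that assumption (the paper's exact membership function and its smallest-value selection convention may differ; here the solution with the largest normalized aggregate membership is returned as the compromise, and the example numbers are made up):

```python
def best_compromise(front):
    """Pick a compromise from a Pareto front (all objectives minimized)
    using linear fuzzy membership: 1 at an objective's best value on the
    front, 0 at its worst."""
    m = len(front[0])
    lo = [min(p[j] for p in front) for j in range(m)]
    hi = [max(p[j] for p in front) for j in range(m)]

    def membership(p):
        return [1.0 if hi[j] == lo[j] else (hi[j] - p[j]) / (hi[j] - lo[j])
                for j in range(m)]

    agg = [sum(membership(p)) for p in front]
    total = sum(agg)
    scores = [a / total for a in agg]   # normalized membership per solution
    return front[max(range(len(front)), key=scores.__getitem__)]

# (active power loss, voltage deviation): the middle point balances both.
pick = best_compromise([(0.10, 9.0), (0.20, 4.0), (0.60, 1.0)])
```

The extreme points each score 1 on one objective and 0 on the other, so the balanced middle solution wins; this is why fuzzy extraction tends to select knee-region solutions.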
Interactive Nonlinear Multiobjective Optimization Methods
Miettinen, Kaisa; Hakanen, Jussi; Podkopaev, Dmitry
2016-01-01
An overview of interactive methods for solving nonlinear multiobjective optimization problems is given. In interactive methods, the decision maker progressively provides preference information so that the most satisfactory Pareto optimal solution can be found for him or her. The basic features of several methods are introduced and some theoretical results are provided. In addition, references to modifications and applications as well as to other methods are indicated. As the...
Complicated problem solution techniques in optimal parameter searching
International Nuclear Information System (INIS)
Gergel', V.P.; Grishagin, V.A.; Rogatneva, E.A.; Strongin, R.G.; Vysotskaya, I.N.; Kukhtin, V.V.
1992-01-01
An algorithm of global search for the numerical solution of multidimensional multiextremal multicriteria optimization problems with complicated constraints is presented. Boundedness of changes in the object's characteristics is assumed under restricted changes of its parameters (Lipschitz condition). The algorithm was realized as a computer code. The programme was used in practice to solve various applied optimization problems. 10 refs.; 3 figs
Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.
Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O
2016-06-01
We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
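The depth assignment at the heart of PDA can be sketched as successive Pareto-front peeling of points in criterion space. This is a generic minimization version only; the dyad construction and the mapping from depths to anomaly scores follow the paper and are omitted, and the example points are made up:

```python
def pareto_depths(points):
    """Assign each point the index of the successive non-dominated front
    it lies on (all criteria minimized); depth 1 is the Pareto front."""
    remaining = dict(enumerate(points))
    depth, k = {}, 1
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(q != p and all(qj <= pj for qj, pj in zip(q, p))
                            for q in remaining.values())]
        for i in front:
            depth[i] = k
            del remaining[i]
        k += 1
    return depth

# Two dissimilarity criteria; each point is dominated by its predecessor:
d = pareto_depths([(1, 1), (2, 2), (3, 3)])
```

Because depth is defined by dominance alone, no weights over the criteria are needed, which is exactly what lets PDA avoid re-running a scalarized detector for many weight choices.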
Analytic solution to variance optimization with no short positions
Kondor, Imre; Papp, Gábor; Caccioli, Fabio
2017-12-01
We consider the variance portfolio optimization problem with a ban on short selling. We provide an analytical solution by means of the replica method for the case of a portfolio of independent, but not identically distributed, assets. We study the behavior of the solution as a function of the ratio r between the number N of assets and the length T of the time series of returns used to estimate risk. The no-short-selling constraint acts as an asymmetric ...
Kinetics of wealth and the Pareto law.
Boghosian, Bruce M
2014-04-01
An important class of economic models involve agents whose wealth changes due to transactions with other agents. Several authors have pointed out an analogy with kinetic theory, which describes molecules whose momentum and energy change due to interactions with other molecules. We pursue this analogy and derive a Boltzmann equation for the time evolution of the wealth distribution of a population of agents for the so-called Yard-Sale Model of wealth exchange. We examine the solutions to this equation by a combination of analytical and numerical methods and investigate its long-time limit. We study an important limit of this equation for small transaction sizes and derive a partial integrodifferential equation governing the evolution of the wealth distribution in a closed economy. We then describe how this model can be extended to include features such as inflation, production, and taxation. In particular, we show that the model with taxation exhibits the basic features of the Pareto law, namely, a lower cutoff to the wealth density at small values of wealth, and approximate power-law behavior at large values of wealth.
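The Yard-Sale Model underlying this Boltzmann equation is easy to simulate directly. A minimal Monte Carlo sketch with illustrative parameter choices (a fair coin flip and a fixed fraction beta of the poorer party's wealth per transaction; no inflation, production, or taxation):

```python
import random

def yard_sale(n_agents=1000, steps=200_000, beta=0.1, seed=42):
    """Yard-Sale Model: each transaction transfers a fraction `beta` of the
    poorer party's wealth to the winner of a fair coin flip; total wealth
    is conserved and no agent can go negative."""
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        stake = beta * min(w[i], w[j])
        if rng.random() < 0.5:
            i, j = j, i            # the other party wins this flip
        w[i] += stake
        w[j] -= stake
    return w

wealth = yard_sale()
top_share = sum(sorted(wealth)[-10:]) / sum(wealth)  # wealth concentrates
```

Even though each flip is fair, the multiplicative stake makes wealth concentrate over time, which is the condensation behavior whose taxed variant reproduces the Pareto-law features described in the abstract.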
Inverse planning and optimization: a comparison of solutions
Energy Technology Data Exchange (ETDEWEB)
Ringor, Michael [School of Health Sciences, Purdue University, West Lafayette, IN (United States); Papiez, Lech [Department of Radiation Oncology, Indiana University, Indianapolis, IN (United States)
1998-09-01
The basic problem in radiation therapy treatment planning is to determine an appropriate set of treatment parameters that would induce an effective dose distribution inside a patient. One can approach this task as an inverse problem, or as an optimization problem. In this presentation, we compare both approaches. The inverse problem is presented as a dose reconstruction problem similar to tomography reconstruction. We formulate the optimization problem as linear and quadratic programs. Explicit comparisons are made between the solutions obtained by inversion and those obtained by optimization for the case in which scatter and attenuation are ignored (the NS-NA approximation)
Ghosh, D.; Sierksma, G.
2000-01-01
Sensitivity analysis of ε-optimal solutions is the problem of calculating the range within which a problem parameter may lie so that the given solution remains ε-optimal. In this paper we study the sensitivity analysis problem for ε-optimal solutions to combinatorial optimization problems with
Optimization of the solution of the problem of scheduling theory ...
African Journals Online (AJOL)
This article describes a genetic algorithm used to solve a problem from scheduling theory. A large number of different methods are described in the scientific literature. The main difficulty of the problem in question is that the optimal solution must be sought in a large search space for the set of ...
Optimality conditions for the numerical solution of optimization problems with PDE constraints :
Energy Technology Data Exchange (ETDEWEB)
Aguilo Valentin, Miguel Alejandro; Ridzal, Denis
2014-03-01
A theoretical framework for the numerical solution of partial differential equation (PDE) constrained optimization problems is presented in this report. This theoretical framework embodies the fundamental infrastructure required to efficiently implement and solve this class of problems. Detailed derivations of the optimality conditions required to accurately solve several parameter identification and optimal control problems are also provided in this report. This will allow the reader to further understand how the theoretical abstraction presented in this report translates to the application.
International Nuclear Information System (INIS)
Feng, Yongqiang; Hung, TzuChen; Zhang, Yaning; Li, Bingxi; Yang, Jinfu; Shi, Yang
2015-01-01
Based on thermoeconomic multi-objective optimization and decision making, considering both exergy efficiency and LEC (levelized energy cost), the performance of low-grade ORCs (organic Rankine cycles) using R245fa, pentane and their mixtures has been compared. The effects of the mass fraction of R245fa and four key parameters on the exergy efficiency and LEC are examined. The Pareto-optimal solutions are selected from the Pareto optimal frontier, obtained by the NSGA-II algorithm, using three decision-making methods: Shannon entropy, LINMAP and TOPSIS. A deviation index is introduced to evaluate the different decision-making methods. The research demonstrates that as the mass fraction of R245fa increases, the exergy efficiency first decreases and then increases, while LEC shows the reverse trend. The optimum values from the TOPSIS decision making are selected as the preferred Pareto-optimal solution because of its lowest deviation index. The Pareto-optimal solutions for pentane, R245fa, and 0.5pentane/0.5R245fa in pairs of (exergy efficiency, LEC) are (0.5425, 0.104), (0.5502, 0.111), and (0.5212, 0.108), respectively. The mixture working fluids present lower thermodynamic performance and moderate economic performance compared with the pure working fluids under the Pareto optimization. - Highlights: • The thermoeconomic comparison between pure and mixture working fluids is investigated. • The Pareto-optimal solutions with a bi-objective function using three decision makings are obtained. • The optimum values from the TOPSIS decision making are selected as the preferred Pareto-optimal solution. • The mixture yields lower thermodynamic performance and moderate economic performance.
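The TOPSIS selection step can be sketched generically. The three candidate (exergy efficiency, LEC) pairs below are the ones quoted in the abstract, while the equal weights and vector normalization are illustrative assumptions rather than the paper's exact settings:

```python
import math

def topsis(points, benefit, weights=None):
    """Score alternatives by relative closeness to the ideal point.
    benefit[j] is True if criterion j is to be maximized."""
    m = len(points[0])
    weights = weights or [1.0 / m] * m
    norms = [math.sqrt(sum(p[j] ** 2 for p in points)) for j in range(m)]
    v = [tuple(weights[j] * p[j] / norms[j] for j in range(m)) for p in points]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    nadir = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    return [math.dist(x, nadir) / (math.dist(x, ideal) + math.dist(x, nadir))
            for x in v]

# (exergy efficiency [maximize], LEC [minimize]) for pentane, R245fa, mixture:
pts = [(0.5425, 0.104), (0.5502, 0.111), (0.5212, 0.108)]
scores = topsis(pts, benefit=[True, False])
best = max(range(len(pts)), key=scores.__getitem__)
```

Each score lies in [0, 1]; the alternative closest to the ideal and farthest from the nadir wins, which under these assumed weights favors the low-LEC pentane point.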
Comparative analysis of Pareto surfaces in multi-criteria IMRT planning
Energy Technology Data Exchange (ETDEWEB)
Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Fraunhofer Platz 1, 67663 Kaiserslautern (Germany); Thieke, C, E-mail: katrin.teichert@itwm.fhg.de [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)
2011-06-21
In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.
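The two set-wise operations the comparison relies on, flagging which points of one Pareto set are dominated by the other set and measuring distances between the sets in objective space, can be sketched generically for minimized objectives (the beam-configuration fronts below are hypothetical illustrative points, not clinical data):

```python
import math

def dominated_by_set(p, other):
    """True if some point of `other` weakly dominates p (minimization)."""
    return any(q != p and all(qj <= pj for qj, pj in zip(q, p)) for q in other)

def distances_to_set(a, b):
    """Euclidean distance from each point of set `a` to its nearest point in `b`."""
    return [min(math.dist(p, q) for q in b) for p in a]

# Hypothetical fronts for a seven-beam and a nine-beam configuration:
seven_beam = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
nine_beam = [(1.5, 3.5), (2.0, 2.5), (3.5, 1.5)]
dom = [dominated_by_set(p, seven_beam) for p in nine_beam]  # only the middle point
gaps = distances_to_set(nine_beam, seven_beam)
```

Plotting dominance flags and distances along each front is what lets the planner see in which region of the trade-off one beam configuration is superior, rather than comparing single plans.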
Comparative analysis of Pareto surfaces in multi-criteria IMRT planning.
Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C
2011-06-21
In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than with the classical method of comparing single treatment plans.
Comparative analysis of Pareto surfaces in multi-criteria IMRT planning
International Nuclear Information System (INIS)
Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H; Thieke, C
2011-01-01
In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context, optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified, and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal, and to re-adjust the choice accordingly during navigation. The method is applied to an artificial case and two clinical head-and-neck cases. In all cases, no configuration dominates its competitor over the whole Pareto set. For example, in one of the head-and-neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than with the classical method of comparing single treatment plans.
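The dominance test and set-to-set distance described in the abstract can be sketched as follows; this is a minimal illustration assuming both objectives are minimized, and `A` and `B` are small hypothetical objective vectors, not clinical data:

```python
import numpy as np

def dominates(p, q):
    """True if p Pareto-dominates q (all objectives to be minimized)."""
    return bool(np.all(p <= q) and np.any(p < q))

def compare_pareto_sets(A, B):
    """Mark which points of A are dominated by some point of B, and
    measure each point's Euclidean distance to the nearest point of B."""
    dominated = [any(dominates(b, a) for b in B) for a in A]
    dist = [min(float(np.linalg.norm(a - b)) for b in B) for a in A]
    return dominated, dist

# hypothetical objective vectors, e.g. (organ-at-risk dose, coverage loss)
A = np.array([[1.0, 4.0], [2.0, 4.0], [4.0, 1.0]])   # one beam configuration
B = np.array([[1.5, 3.0], [3.0, 1.5]])               # a competing configuration
dominated, dist = compare_pareto_sets(A, B)
print(dominated)   # -> [False, True, False]: only the middle compromise is beaten
```

As in the abstract, neither set dominates the other everywhere: which configuration is preferable depends on which compromise on the front the planner cares about.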
Multi-objective optimization approach for air traffic flow management
Directory of Open Access Journals (Sweden)
Fadil Rabie
2017-01-01
The decision-making stage was then performed with the aid of data clustering techniques to reduce the size of the Pareto-optimal set and obtain a smaller representation of the multi-objective design space, thereby making it easier for the decision-maker to find satisfactory and meaningful trade-offs, and to select a preferred final design solution.
Pareto versus lognormal: a maximum entropy test.
Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano
2011-08-01
It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating process even when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with a lognormal body and a Pareto tail can be generated as mixtures of lognormally distributed units.
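The lognormal-body/Pareto-tail question can be illustrated numerically with the classical Hill estimator on the upper order statistics (a stand-in illustration; the abstract's maximum-entropy statistic is not spelled out there and is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def hill_tail_index(x, k):
    """Hill estimator of the Pareto tail index from the k largest observations."""
    xs = np.sort(x)
    top = xs[-k:]
    return k / np.sum(np.log(top / xs[-k - 1]))

# Pure Pareto sample with tail index alpha = 2, drawn by inverse-CDF sampling.
# The Hill estimate sits near 2; for a lognormal sample it would drift upward
# as k shrinks, which is what tail tests exploit.
pareto = (1.0 / rng.uniform(size=20_000)) ** (1 / 2.0)
alpha_hat = hill_tail_index(pareto, k=2000)
print(round(alpha_hat, 2))
```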
[Optimal solution and analysis of muscular force during standing balance].
Wang, Hongrui; Zheng, Hui; Liu, Kun
2015-02-01
The present study was aimed at the optimal solution of the main muscular force distribution in the lower extremity during standing balance in humans. The musculoskeletal movement system of the lower extremity was simplified to a physical model with 3 joints and 9 muscles. On the basis of this model, an optimization mathematical model was built to solve the problem of redundant muscle forces. The particle swarm optimization (PSO) algorithm was used to solve the single-objective and multi-objective problems, respectively. The numerical results indicated that multi-objective optimization is more reasonable for obtaining the distribution and variation of the 9 muscular forces. Finally, the coordination of each muscle group while maintaining standing balance under passive movement was qualitatively analyzed using the simulation results obtained.
Optimal Design Solutions for Permanent Magnet Synchronous Machines
Directory of Open Access Journals (Sweden)
POPESCU, M.
2011-11-01
Full Text Available This paper presents optimal design solutions for reducing the cogging torque of permanent magnet synchronous machines. A first solution proposed in the paper consists in using closed stator slots, which yields a nearly isotropic magnetic structure of the stator core and reduces the mutual attraction between the permanent magnets and the slotted armature. To avoid complications in the winding manufacturing technology, the stator slots are closed using wedges made of soft magnetic composite materials. The second solution consists in properly choosing the combination of pole number and stator slot number, which typically leads to a winding with a fractional number of slots per pole per phase. The proposed measures for cogging torque reduction are analyzed by means of 2D/3D finite element models developed using the professional Flux software package. Numerical results are discussed and compared with experimental ones obtained by testing a PMSM prototype.
International Nuclear Information System (INIS)
Zarepisheh, Masoud; Uribe-Sanchez, Andres F.; Li, Nan; Jia, Xun; Jiang, Steve B.
2014-01-01
Purpose: To establish a new mathematical framework for radiotherapy treatment optimization with voxel-dependent optimization parameters. Methods: In the treatment plan optimization problem for radiotherapy, a clinically acceptable plan is usually generated by an optimization process with weighting factors or reference doses adjusted for a set of objective functions associated with the organs. Recent discoveries indicate that adjusting parameters associated with each voxel may lead to better plan quality, but the mathematical reasons behind this are still unclear. Furthermore, questions about the objective function selection and parameter adjustment to assure Pareto optimality, as well as the relationship between the optimal solutions obtained from the organ-based and voxel-based models, remain unanswered. To answer these questions, the authors establish in this work a new mathematical framework equipped with two theorems. Results: The new framework clarifies the different consequences of adjusting organ-dependent and voxel-dependent parameters for the treatment plan optimization of radiation therapy, as well as the impact of using different objective functions on plan qualities and Pareto surfaces. The main discoveries are threefold: (1) While in the organ-based model the selection of the objective function has an impact on the quality of the optimized plans, this is no longer an issue for the voxel-based model, since the Pareto surface is independent of the objective function selection and the entire Pareto surface can be generated as long as the objective function satisfies certain mathematical conditions; (2) All Pareto solutions generated by the organ-based model with different objective functions are parts of a unique Pareto surface generated by the voxel-based model with any appropriate objective function; (3) A much larger Pareto surface is explored by adjusting voxel-dependent parameters than by adjusting organ-dependent parameters, possibly
Multiresolution strategies for the numerical solution of optimal control problems
Jain, Sachin
There exist many numerical techniques for solving optimal control problems, but less work has been done on making these algorithms run faster and more robustly. The main motivation of this work is to solve optimal control problems accurately in a fast and efficient way. Optimal control problems are often characterized by discontinuities or switchings in the control variables. One way of accurately capturing the irregularities in the solution is to use a high-resolution (dense) uniform grid. This requires a large amount of computational resources, both in terms of CPU time and memory. Hence, in order to accurately capture any irregularities in the solution using few computational resources, one can refine the mesh locally in the region close to an irregularity instead of refining the mesh uniformly over the whole domain. Therefore, a novel multiresolution scheme for data compression has been designed which is shown to outperform similar data compression schemes. Specifically, we have shown that the proposed approach results in fewer points in the grid compared to a common multiresolution data compression scheme. The validity of the proposed mesh refinement algorithm has been verified by solving several challenging initial-boundary value problems for evolution equations in 1D. The examples have demonstrated the stability and robustness of the proposed algorithm. The algorithm adapted dynamically to any existing or emerging irregularities in the solution by automatically allocating more grid points to the region where the solution exhibited sharp features and fewer points to the region where the solution was smooth. Thereby, the computational time and memory usage have been reduced significantly, while maintaining an accuracy equivalent to the one obtained using a fine uniform mesh. Next, a direct multiresolution-based approach for solving trajectory optimization problems is developed. The original optimal control problem is transcribed into a
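The idea of refining the grid only near an irregularity can be sketched with a toy curvature-driven rule (an illustration of the principle only, not the author's wavelet-style multiresolution scheme):

```python
import numpy as np

def refine_locally(x, f, tol):
    """Insert midpoints only in intervals where the discrete curvature |f''|
    is large, keeping the grid coarse where the function is smooth."""
    curv = np.abs(np.diff(f(x), 2))   # second differences at interior points
    new_pts = []
    for i in range(len(x) - 1):
        # refine the interval if either endpoint flags high curvature
        flag = (i > 0 and curv[i - 1] > tol) or (i < len(curv) and curv[i] > tol)
        if flag:
            new_pts.append(0.5 * (x[i] + x[i + 1]))
    return np.sort(np.concatenate([x, new_pts]))

x = np.linspace(-1.0, 1.0, 41)
f = lambda t: np.tanh(50 * t)        # sharp switching-like feature at t = 0
xr = refine_locally(x, f, tol=0.1)
print(len(x), len(xr))               # extra points appear only near the feature
```

Iterating this step produces a grid dense near the discontinuity-like region and coarse elsewhere, which is the resource saving the abstract describes.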
Robust bayesian inference of generalized Pareto distribution ...
African Journals Online (AJOL)
Using an exhaustive Monte Carlo study, we prove that, given an adequate generalized loss function, one can construct a robust Bayesian estimator of the model. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...
Axiomatizations of Pareto Equilibria in Multicriteria Games
Voorneveld, M.; Vermeulen, D.; Borm, P.E.M.
1997-01-01
We focus on axiomatizations of the Pareto equilibrium concept in multicriteria games based on consistency. Axiomatizations of the Nash equilibrium concept by Peleg and Tijs (1996) and Peleg, Potters, and Tijs (1996) have immediate generalizations. The axiomatization of Norde et al. (1996) cannot be
van Zyl, J. Martin
2012-01-01
Random variables of the generalized Pareto distribution can be transformed to those of the Pareto distribution. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of the estimation of the shape parameter of the generalized Pareto distribution using transformed observations, based on the probability weighted method, is tested. It was found to improve the performance of the probability weighted estimator and performs good wit...
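The explicit Pareto maximum likelihood expressions the abstract refers to are simple enough to state directly; this is a sketch on synthetic data (the transformation step and the probability-weighted estimator themselves are not reproduced here):

```python
import numpy as np

def pareto_mle(x):
    """Closed-form maximum likelihood estimates for the Pareto(x_m, alpha)
    density f(t) = alpha * x_m**alpha / t**(alpha + 1), t >= x_m."""
    x = np.asarray(x, dtype=float)
    xm_hat = x.min()                                  # MLE of the scale
    alpha_hat = len(x) / np.sum(np.log(x / xm_hat))   # MLE of the shape
    return xm_hat, alpha_hat

rng = np.random.default_rng(1)
# Pareto(x_m = 2, alpha = 3) sample via inverse-CDF sampling
sample = 2.0 * rng.uniform(size=50_000) ** (-1 / 3.0)
xm_hat, alpha_hat = pareto_mle(sample)
print(round(xm_hat, 3), round(alpha_hat, 2))
```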
Improved Shape Parameter Estimation in Pareto Distributed Clutter with Neural Networks
Directory of Open Access Journals (Sweden)
José Raúl Machado-Fernández
2016-12-01
Full Text Available The main problem faced by naval radars is the elimination of the clutter input, which is a distortion signal appearing mixed with target reflections. Recently, the Pareto distribution has been related to sea clutter measurements, suggesting that it may provide a better fit than other traditional distributions. The authors propose a new method for estimating the Pareto shape parameter based on artificial neural networks. The solution achieves a precise estimation of the parameter, having a low computational cost, and outperforming the classic method which uses Maximum Likelihood Estimates (MLE). The presented scheme contributes to the development of the NATE detector for Pareto clutter, which uses the knowledge of clutter statistics for improving the stability of the detection, among other applications.
Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.
Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin
2015-02-01
To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.
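The Pareto frontier notion of best-fitting input sets can be sketched as a plain dominance filter over per-target misfits; the numbers below are hypothetical, not the study's calibration targets:

```python
import numpy as np

def pareto_frontier(errors):
    """Indices of input sets not simultaneously beaten on every calibration
    target (rows = candidate input sets, cols = target misfits, smaller is
    better)."""
    n = len(errors)
    keep = []
    for i in range(n):
        beaten = any(
            np.all(errors[j] <= errors[i]) and np.any(errors[j] < errors[i])
            for j in range(n) if j != i
        )
        if not beaten:
            keep.append(i)
    return keep

# misfits of five hypothetical input sets against two calibration targets
E = np.array([[0.10, 0.90],
              [0.20, 0.40],
              [0.50, 0.30],
              [0.55, 0.35],   # dominated by the row above
              [0.90, 0.10]])
front = pareto_frontier(E)
print(front)   # -> [0, 1, 2, 4]
```

No weights are needed: an input set survives exactly when no other set fits all targets at least as well and at least one target strictly better, which is the transparency the abstract argues for.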
Multi-objective optimization of a series–parallel system using GPSIA
International Nuclear Information System (INIS)
Okafor, Ekene Gabriel; Sun Youchao
2012-01-01
The optimal solution of a multi-objective optimization problem (MOP) corresponds to a Pareto set that is characterized by a tradeoff between objectives. Genetic Pareto Set Identification Algorithm (GPSIA) proposed for reliability-redundant MOPs is a hybrid technique which combines genetic and heuristic principles to generate non-dominated solutions. Series–parallel system with active redundancy is studied in this paper. Reliability and cost were the research objective functions subject to cost and weight constraints. The results reveal an evenly distributed non-dominated front. The distances between successive Pareto points were used to evaluate the general performance of the method. Plots were also used to show the computational results for the type of system studied and the robustness of the technique is discussed in comparison with NSGA-II and SPEA-2.
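The evaluation via distances between successive Pareto points can be illustrated with Schott's spacing metric, one concrete choice of such a measure (an assumption here; the abstract does not name the exact formula it uses):

```python
import numpy as np

def spacing(front):
    """Schott's spacing metric: standard deviation of nearest-neighbour
    L1 distances; values near 0 indicate an evenly distributed front."""
    front = np.asarray(front, dtype=float)
    d = np.array([
        min(np.abs(front[i] - front[j]).sum() for j in range(len(front)) if j != i)
        for i in range(len(front))
    ])
    return float(np.sqrt(np.mean((d - d.mean()) ** 2)))

even = [[0.0, 1.0], [0.25, 0.75], [0.5, 0.5], [0.75, 0.25], [1.0, 0.0]]
uneven = [[0.0, 1.0], [0.05, 0.95], [0.5, 0.5], [0.95, 0.05], [1.0, 0.0]]
s_even, s_uneven = spacing(even), spacing(uneven)
print(s_even, s_uneven)   # the evenly spread front scores lower
```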
Genetic Algorithm Optimizes Q-LAW Control Parameters
Lee, Seungwon; von Allmen, Paul; Petropoulos, Anastassios; Terrile, Richard
2008-01-01
A document discusses a multi-objective genetic algorithm designed to optimize Lyapunov feedback control law (Q-law) parameters in order to efficiently find Pareto-optimal solutions for low-thrust trajectories for electric propulsion systems. These would be propellant-optimal solutions for a given flight time, or flight-time-optimal solutions for a given propellant requirement. The approximate solutions are used as good initial solutions for high-fidelity optimization tools. When the good initial solutions are used, the high-fidelity optimization tools quickly converge to a locally optimal solution near the initial solution. Q-law control parameters are represented as real-valued genes in the genetic algorithm. The performance of the Q-law control parameters is evaluated in the multi-objective space (flight time vs. propellant mass) and sorted by the non-dominated sorting method, which assigns a better fitness value to solutions that are dominated by fewer other solutions. With the ranking result, the genetic algorithm encourages the solutions with higher fitness values to participate in the reproduction process, improving the solutions in the evolution process. The population of solutions converges to the Pareto front that is permitted within the Q-law control parameter space.
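The ranking step described above can be sketched by counting, for each candidate, how many others dominate it in the (flight time, propellant mass) plane; the four trajectories below are hypothetical:

```python
import numpy as np

def domination_counts(points):
    """For each solution, count how many others dominate it (both objectives
    minimized). Count 0 marks the current Pareto front; fewer dominators
    means a better fitness rank."""
    pts = np.asarray(points, dtype=float)
    counts = []
    for i, p in enumerate(pts):
        c = sum(
            1 for j, q in enumerate(pts)
            if j != i and np.all(q <= p) and np.any(q < p)
        )
        counts.append(c)
    return counts

# (flight time [days], propellant mass [kg]) of hypothetical candidates
cands = [[100.0, 5.0], [120.0, 4.0], [130.0, 4.5], [140.0, 6.0]]
counts = domination_counts(cands)
print(counts)   # -> [0, 0, 1, 3]
```

The first two candidates trade one objective against the other and form the front; selection pressure then favors them in reproduction, pushing the population toward the Pareto front.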
Risk Sensitivity, Independence of Irrelevant Alternatives and Continuity of Bargaining Solutions
M.B.M. de Koster (René); H.J.M. Peters (Hans); S.H. Tijs; P.P. Wakker (Peter)
1983-01-01
textabstractBargaining solutions are considered which have the following four properties: individual rationality, Pareto optimality, independence of equivalent utility representations, and independence of irrelevant alternatives. A main result of this paper is a simple proof of the fact that all
Spectral-Efficiency - Illumination Pareto Front for Energy Harvesting Enabled VLC System
Abdelhady, Amr Mohamed Abdelaziz
2017-12-13
The continuous improvement in optical energy harvesting devices motivates visible light communication (VLC) system developers to utilize such available free energy sources. An outdoor VLC system is considered where an optical base station sends data to multiple users that are capable of harvesting the optical energy. The proposed VLC system serves multiple users using time division multiple access (TDMA) with unequal time and power allocation, which are allocated to improve the system performance. The adopted optical system provides users with illumination and data communication services. The outdoor optical design objective is to maximize the illumination, while the communication design objective is to maximize the spectral efficiency (SE). The design objectives are shown to be conflicting; therefore, a multiobjective optimization problem is formulated to obtain the Pareto front performance curve for the proposed system. To this end, the marginal optimization problems are solved first using low-complexity algorithms. Then, based on the proposed algorithms, a low-complexity algorithm is developed to obtain an inner bound of the Pareto front for the illumination-SE tradeoff. The inner bound is shown to be close to the optimal Pareto front via several simulation scenarios for different system parameters.
Vorozheikin, A.; Gonchar, T.; Panfilov, I.; Sopov, E.; Sopov, S.
2009-01-01
A new algorithm for the solution of complex constrained optimization problems, based on a probabilistic genetic algorithm with optimal solution prediction, is proposed. Results of an efficiency investigation, in comparison with a standard genetic algorithm, are presented.
A choice of the parameters of NPP steam generators on the basis of vector optimization
International Nuclear Information System (INIS)
Lemeshev, V.U.; Metreveli, D.G.
1981-01-01
The optimization problem for the parameters of designed systems is considered as a problem of multicriterion optimization. It is proposed to choose non-dominated (Pareto-optimal) parameters. An algorithm is built on the basis of necessary and sufficient conditions of non-dominance to find non-dominated solutions. This algorithm has been employed to solve the problem of choosing optimal parameters for the counterflow shell-and-tube steam generator of an NPP of the BRGD type [ru
Directory of Open Access Journals (Sweden)
A. P. Karpenko
2016-01-01
Full Text Available We consider a class of algorithms for multi-objective optimization - Pareto-approximation algorithms, which presuppose a preliminary construction of a finite-dimensional approximation of the Pareto set, and thereby also of the Pareto front of the problem. The article gives an overview of population and non-population Pareto-approximation algorithms, identifies their strengths and weaknesses, and presents the canonical "predator-prey" algorithm, showing its shortcomings. We offer a number of modifications of the canonical "predator-prey" algorithm with the aim of overcoming its drawbacks, and present the results of a broad study of the efficiency of these modifications. The peculiarity of the study is the use of quality indicators of the Pareto-approximation, which previous publications have not used. In addition, we present the results of the meta-optimization of the modified algorithm, i.e., determining the optimal values of some free parameters of the algorithm. The study of the efficiency of the modified "predator-prey" algorithm has shown that the proposed modifications improve the following indicators of the basic algorithm: cardinality of the set of archive solutions, uniformity of archive solutions, and computation time. By and large, the research results have shown that the modified and meta-optimized algorithm achieves the same approximation as the basic algorithm, but with an order of magnitude fewer prey individuals. Computational costs are proportionally reduced.
Evaluation of Preanalytical Quality Indicators by Six Sigma and Pareto's Principle.
Kulkarni, Sweta; Ramesh, R; Srinivasan, A R; Silvia, C R Wilma Delphine
2018-01-01
Preanalytical steps are the major sources of error in the clinical laboratory. Analytical errors can be corrected by quality control procedures, but there is a need for stringent quality checks in the preanalytical area as these processes are done outside the laboratory. The sigma value depicts the performance of a laboratory and its quality measures. Hence, in the present study, Six Sigma and the Pareto principle were applied to preanalytical quality indicators to evaluate clinical biochemistry laboratory performance. This observational study was carried out for a period of 1 year, from November 2015 to November 2016. A total of 144,208 samples and 54,265 test requisition forms were screened for preanalytical errors such as missing patient information and sample collection details in forms, and hemolysed, lipemic, inappropriate, or insufficient samples; the total number of errors was calculated and converted into defects per million and the sigma scale. A Pareto chart was drawn using the total number of errors and cumulative percentages. In 75% of test requisition forms the diagnosis was not mentioned, and a sigma value of 0.9 was obtained; for other errors such as sample receiving time, stat requests, and sample type, the sigma values were 2.9, 2.6, and 2.8, respectively. For insufficient samples and improper ratio of blood to anticoagulant, the sigma value was 4.3. The Pareto chart depicts that 80% of the errors in requisition forms are contributed by 20% of the causes, such as missing information like the diagnosis. The development of quality indicators and the application of Six Sigma and the Pareto principle are quality measures by which not only the preanalytical phase but the total testing process can be improved.
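The defects-per-million to sigma conversion and the Pareto chart's cumulative percentages can be sketched as follows (illustrative numbers, assuming the conventional 1.5-sigma shift; these are not the study's data):

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Short-term sigma level from a defect rate, with the usual 1.5 shift."""
    dpmo = 1_000_000 * defects / opportunities
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

def pareto_cumulative(error_counts):
    """Cumulative percentages for a Pareto chart, largest category first."""
    counts = sorted(error_counts, reverse=True)
    total = sum(counts)
    running, cum = 0, []
    for c in counts:
        running += c
        cum.append(100.0 * running / total)
    return cum

s = sigma_level(defects=75, opportunities=100)   # a 75% error rate
cum = pareto_cumulative([75, 10, 8, 7])          # hypothetical error categories
print(round(s, 1), cum)
```

A 75% defect rate lands below sigma 1, consistent with the poor performance the abstract reports for the missing-diagnosis indicator, and the cumulative column shows the single largest category dominating the chart.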
Directory of Open Access Journals (Sweden)
Firat Evirgen
2016-04-01
Full Text Available In this paper, a class of Nonlinear Programming problems is modeled with a gradient-based system of fractional order differential equations in Caputo's sense. To see the overlap between the equilibrium point of the fractional order dynamic system and the optimal solution of the NLP problem over a longer timespan, the Multistage Variational Iteration Method is applied. The comparisons among the multistage variational iteration method, the variational iteration method and the fourth order Runge-Kutta method in fractional and integer order show that the fractional order model and techniques can be seen as an effective and reliable tool for finding optimal solutions of Nonlinear Programming problems.
Multiclass gene selection using Pareto-fronts.
Rajapakse, Jagath C; Mundra, Piyushkumar A
2013-01-01
Filter methods are often used for selection of genes in multiclass sample classification by using microarray data. Such techniques usually tend to bias toward a few classes that are easily distinguishable from other classes due to imbalances of strong features and sample sizes of different classes. It could therefore lead to selection of redundant genes while missing the relevant genes, leading to poor classification of tissue samples. In this manuscript, we propose to decompose multiclass ranking statistics into class-specific statistics and then use Pareto-front analysis for selection of genes. This alleviates the bias induced by class intrinsic characteristics of dominating classes. The use of Pareto-front analysis is demonstrated on two filter criteria commonly used for gene selection: F-score and KW-score. A significant improvement in classification performance and reduction in redundancy among top-ranked genes were achieved in experiments with both synthetic and real-benchmark data sets.
Pareto vs Simmel: residui ed emozioni [Pareto vs Simmel: residues and emotions]
Directory of Open Access Journals (Sweden)
Silvia Fornari
2017-08-01
Full Text Available One hundred years after the publication of the Trattato di sociologia generale (Pareto 1988, we keep the study of Pareto alive and current with a contemporary rereading of his thought. Remembered by economists for his great intellectual versatility, he remains a rigorous and analytical scientist whose contributions are still discussed internationally. We analyze the aspects that led him to approach sociology, with the introduction of his well-known distinction of social action: logical and non-logical. This dichotomy was used to account for social changes concerning the modes of action of men and women. As is well known, logical actions are those concerning behaviors driven by logic and reasoning, in which there is a direct cause-effect relation; such actions are the object of study of economists, and sociologists do not deal with them. Non-logical actions concern all the types of human action that fall within the purview of the social sciences, and they represent the largest part of social action. These are the actions guided by sentiments, emotion, superstition, etc., illustrated by Pareto in the Trattato di sociologia generale and in later essays, where he also takes up the concept of the heterogenesis of ends, first formulated by Giambattista Vico. According to this concept, human history, while potentially preserving the realization of certain ends, is not linear, and along its evolutionary path it may happen that man, in trying to reach one goal, arrives at opposite conclusions. Pareto links the Neapolitan philosopher's definition to the types of social action and to their distinction (logical, non-logical. The heterogenesis of ends for Pareto is thus the outcome of a particular type of non-logical action of the human being and of the collectivity.
Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio
2010-05-01
This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It tries to boost two conflicting main objectives of multiclassifiers: a high correct classification rate level and a high classification rate for each class. This last objective is not usually optimized in classification, but is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (extremes in the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.
Chevrier , Rémy
2010-01-01
An approach for speed tuning in railway management is presented for optimizing both travel duration and energy saving. This approach is based on a state-of-the-art evolutionary algorithm with a Pareto approach. This algorithm provides a set of diversified non-dominated solutions to the decision-maker. A case study on the Gonesse connection (France) is also reported and analyzed.
Multi-objective optimization of a joule cycle for re-liquefaction of the Liquefied Natural Gas
International Nuclear Information System (INIS)
Sayyaadi, Hoseyn; Babaelahi, M.
2011-01-01
Highlights: → A typical LNG boil-off gas re-liquefaction plant system is optimized. → Objective functions based on thermodynamic and thermoeconomic analysis are obtained. → The cost of the system product and the exergetic efficiency are optimized simultaneously. → A decision-making process for selection of the final optimal design is introduced. → Results obtained using various optimization scenarios are compared and discussed. - Abstract: A LNG re-liquefaction plant is optimized with a multi-objective approach which simultaneously considers exergetic and exergoeconomic objectives. In this regard, optimization is performed in order to simultaneously maximize the exergetic efficiency of the plant and minimize the unit cost of the system product (refrigeration effect). Thermodynamic modeling is performed based on energy and exergy analyses, while an exergoeconomic model based on the total revenue requirement (TRR) is developed. Optimization programming in MATLAB is performed using one of the most powerful and robust multi-objective optimization algorithms, namely NSGA-II. This approach, which is based on the Genetic Algorithm, is applied to find a set of Pareto optimal solutions. The Pareto optimal frontier is obtained and a final optimal solution is selected in a decision-making process. An example of the decision-making process for selection of the final solution from the available optimal points of the Pareto frontier is presented here. The features of the selected final optimal system are compared with the corresponding features of the base case and of the exergoeconomic single-objective optimized systems, and discussed.
International Nuclear Information System (INIS)
Elsays, Mostafa A.; Naguib Aly, M.; Badawi, Alya A.
2009-01-01
In this paper, the Particle Swarm Optimization (PSO) algorithm is modified to deal with Multiobjective Optimization Problems (MOPs). A mathematical model for predicting the dynamic response of the H. B. Robinson nuclear power plant, which represents an Initial Value Problem (IVP) of Ordinary Differential Equations (ODEs), is solved using a Runge-Kutta formula. The resulting data values are represented as a system of nonlinear algebraic equations by interpolation schemes for data fitting. This system of fitted nonlinear algebraic equations represents a nonlinear multiobjective optimization problem. A Multiobjective Particle Swarm Optimizer (MOPSO) based on the Pareto optimality concept is developed and applied to the above-mentioned maximization problem. Results show that MOPSO efficiently copes with the problem and finds multiple Pareto optimal solutions. (orig.)
Optimal resource allocation solutions for heterogeneous cognitive radio networks
Directory of Open Access Journals (Sweden)
Babatunde Awoyemi
2017-05-01
Full Text Available Cognitive radio networks (CRN) are currently gaining immense recognition as the most likely next-generation wireless communication paradigm, because of their enticing promise of mitigating the spectrum scarcity and/or underutilisation challenge. Indisputably, for this promise to ever materialise, CRN must of necessity devise appropriate mechanisms to judiciously allocate their rather scarce or limited resources (spectrum and others) among their numerous users. ‘Resource allocation (RA) in CRN’, which essentially describes mechanisms that can effectively and optimally carry out such allocation so as to achieve the utmost for the network, has therefore recently become an important research focus. However, in most research works on RA in CRN, a highly significant factor that describes a more realistic and practical consideration of CRN has been ignored (or only partially explored), i.e., the aspect of the heterogeneity of CRN. To address this important aspect, in this paper, RA models that incorporate the most essential concepts of heterogeneity, as applicable to CRN, are developed, and the implications of such inclusion for the overall networking are investigated. Furthermore, to fully explore the relevance and implications of the various heterogeneous classifications for the RA formulations, weights are attached to the different classes and their effects on the network performance are studied. In solving the developed complex RA problems for heterogeneous CRN, a solution approach that examines and exploits the structure of the problem in achieving a less complex reformulation is extensively employed. This approach, as the presented results show, makes it possible to obtain optimal solutions to the rather difficult RA problems of heterogeneous CRN.
A probabilistic computational framework for bridge network optimal maintenance scheduling
International Nuclear Information System (INIS)
Bocchini, Paolo; Frangopol, Dan M.
2011-01-01
This paper presents a probabilistic computational framework for the Pareto optimization of preventive maintenance applications to bridges of a highway transportation network. The bridge characteristics are represented by their uncertain reliability index profiles. The in/out-of-service states of the bridges are simulated taking into account their correlation structure. Multi-objective Genetic Algorithms have been chosen as the numerical tool for the solution of the optimization problem. The design variables of the optimization are the preventive maintenance schedules of all the bridges of the network. The two conflicting objectives are the minimization of the total present maintenance cost and the maximization of the network performance indicator. The final result is the Pareto front of optimal solutions among which the managers should choose, depending on engineering and economic factors. A numerical example illustrates the application of the proposed approach.
International Nuclear Information System (INIS)
Tian, Hua; Chang, Liwen; Shu, Gequn; Shi, Lingfeng
2017-01-01
Highlights: • A systematic optimization methodology is presented for the carbon dioxide power cycle. • Adding a regenerator is a significant means of improving system performance. • Decision making based on the optimization results is conducted in depth. • Specific optimal solutions are selected from Pareto fronts for different demands. - Abstract: In this paper, a systematic multi-objective optimization methodology is presented for the carbon dioxide transcritical power cycle with various configurations used in engine waste heat recovery to generate more power efficiently and economically. The parametric optimization is performed for the maximum net power output and exergy efficiency, as well as the minimum electricity production cost, by using the genetic algorithm. The comparison of the optimization results shows that the thermodynamic performance can be most enhanced by simultaneously adding the preheater and regenerator to the basic configuration; the highest net power output and exergy efficiency are 25.89 kW and 40.95%, respectively. Meanwhile, the best economic performance, corresponding to the lowest electricity production cost of $0.560/kW·h, is achieved by simply applying an additional regenerator. Moreover, a thorough decision-making process is conducted for a further screening of the obtained optimal solutions. A most preferred Pareto optimal solution or a representative subset of the Pareto optimal solutions is obtained according to additional subjective preferences, while a referential optimal solution is also provided for the case of no additional preference.
Optimal Water-Power Flow Problem: Formulation and Distributed Optimal Solution
Energy Technology Data Exchange (ETDEWEB)
Dall'Anese, Emiliano [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhao, Changhong [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zamzam, Ahmed S. [University of Minnesota; Sidiropoulos, Nicholas D. [University of Minnesota; Taylor, Josh A. [University of Toronto
2018-01-12
This paper formalizes an optimal water-power flow (OWPF) problem to optimize the use of controllable assets across power and water systems while accounting for the couplings between the two infrastructures. Tanks and pumps are optimally managed to satisfy water demand while improving power grid operations; for the power network, an AC optimal power flow formulation is augmented to accommodate the controllability of water pumps. Unfortunately, the physics governing the operation of the two infrastructures and coupling constraints lead to a nonconvex (and, in fact, NP-hard) problem; however, after reformulating OWPF as a nonconvex, quadratically-constrained quadratic problem, a feasible point pursuit-successive convex approximation approach is used to identify feasible and optimal solutions. In addition, a distributed solver based on the alternating direction method of multipliers enables water and power operators to pursue individual objectives while respecting the couplings between the two networks. The merits of the proposed approach are demonstrated for the case of a distribution feeder coupled with a municipal water distribution network.
Electrical Discharge Platinum Machining Optimization Using Stefan Problem Solutions
Directory of Open Access Journals (Sweden)
I. B. Stavitskiy
2015-01-01
Full Text Available The article presents the results of a theoretical study of the workability of platinum by electrical discharge machining (EDM), based on the solution of the thermal problem with a moving phase-change boundary, i.e. the Stefan problem. The problem solution enables determining the melt penetration depth into the material surface under a given heat flow, based on the duration of its action and the physical properties of the processed material. To determine rational EDM operating conditions for platinum, the article suggests relating its workability to the machinability of materials for which rational EDM operating conditions are already defined. It is shown that at the low heat flow densities corresponding to finishing EDM operating conditions, the processing conditions used for steel 45 are appropriate for platinum machining; for EDM at higher heat flow densities (e.g. 50 GW/m²), copper processing conditions are used for this purpose; at the high heat flow densities corresponding to heavy roughing EDM, it is reasonable to use tungsten processing conditions. The article also shows how the minimum width of the current pulses, at which platinum starts melting and, accordingly, the EDM process becomes possible, depends on the heat flow density. It is shown that the processing of platinum is expedient at a pulse width corresponding to values called the effective pulse width. Exceeding these values does not lead to a substantial increase in material removal per pulse, but considerably reduces the maximum repetition rate and, therefore, the EDM capacity. The paper shows the effective pulse width versus the heat flow density. It also presents the dependences of the maximum platinum surface melt penetration and the corresponding pulse width on the heat flow density. Results obtained using solutions of the Stefan heat problem can be used to optimize EDM operating conditions for platinum machining.
Centralized Stochastic Optimal Control of Complex Systems
Energy Technology Data Exchange (ETDEWEB)
Malikopoulos, Andreas [ORNL
2015-01-01
In this paper we address the problem of online optimization of the supervisory power management control in parallel hybrid electric vehicles (HEVs). We model HEV operation as a controlled Markov chain using the long-run expected average cost per unit time criterion, and we show that the control policy yielding the Pareto optimal solution minimizes the average cost criterion online. The effectiveness of the proposed solution is validated through simulation and compared to the solution derived with dynamic programming using the average cost criterion.
International Nuclear Information System (INIS)
Li Zhaojun; Liao Haitao; Coit, David W.
2009-01-01
This paper proposes a two-stage approach for solving multi-objective system reliability optimization problems. In this approach, a Pareto optimal solution set is initially identified at the first stage by applying a multiple objective evolutionary algorithm (MOEA). Quite often there are a large number of Pareto optimal solutions, and it is difficult, if not impossible, to effectively choose the representative solutions for the overall problem. To overcome this challenge, an integrated multiple objective selection optimization (MOSO) method is utilized at the second stage. Specifically, a self-organizing map (SOM), with the capability of preserving the topology of the data, is applied first to classify those Pareto optimal solutions into several clusters with similar properties. Then, within each cluster, the data envelopment analysis (DEA) is performed, by comparing the relative efficiency of those solutions, to determine the final representative solutions for the overall problem. Through this sequential solution identification and pruning process, the final recommended solutions to the multi-objective system reliability optimization problem can be easily determined in a more systematic and meaningful way.
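The two-stage pruning idea above — cluster the Pareto set, then keep one efficient representative per cluster — can be sketched as follows (illustrative only: plain k-means stands in for the SOM, a simple output/input ratio stands in for the DEA efficiency score, and the (reliability, cost) pairs are hypothetical):

```python
import random

def kmeans(points, k, iters=20, seed=1):
    """Tiny k-means, standing in for the SOM clustering stage (illustrative)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return clusters

# hypothetical Pareto-optimal (reliability, cost) designs from an MOEA run
front = [(0.90, 10.0), (0.91, 11.0), (0.92, 12.0), (0.98, 30.0), (0.99, 33.0)]
clusters = kmeans(front, k=2)
# one representative per cluster: best reliability-per-cost ratio,
# a crude stand-in for the DEA relative-efficiency comparison
representatives = [max(cl, key=lambda p: p[0] / p[1]) for cl in clusters if cl]
```

The effect is the same as in the paper's pipeline: a large Pareto set is reduced to a few representatives, one per region of the trade-off curve.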
Particle Swarm Optimization with Various Inertia Weight Variants for Optimal Power Flow Solution
Directory of Open Access Journals (Sweden)
Prabha Umapathy
2010-01-01
Full Text Available This paper proposes an efficient method to solve the optimal power flow problem in power systems using Particle Swarm Optimization (PSO). The objective of the proposed method is to find the steady-state operating point which minimizes the fuel cost, while maintaining an acceptable system performance in terms of limits on generator power, line flow, and voltage. Three different inertia weights, a constant inertia weight (CIW), a time-varying inertia weight (TVIW), and a global-local best inertia weight (GLbestIW), are considered with the particle swarm optimization algorithm to analyze the impact of the inertia weight on the performance of the PSO algorithm. The PSO algorithm is simulated for each of the methods individually. It is observed that the PSO algorithm with the proposed inertia weight yields better results, both in terms of optimal solution and faster convergence. The proposed method has been tested on the standard IEEE 30 bus test system to prove its efficacy. The algorithm is computationally faster, in terms of the number of load flows executed, and provides better results than other heuristic techniques.
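A time-varying inertia weight (TVIW) schedule of the kind compared in this paper can be sketched on a toy minimization problem (the sphere function stands in for the fuel-cost objective; the acceleration coefficients and velocity clamp are conventional choices, not the paper's settings):

```python
import random

def pso_sphere(dim=2, n=20, iters=200, seed=3):
    """Minimal global-best PSO minimizing the sphere function, with a
    time-varying inertia weight (TVIW) decaying linearly from 0.9 to 0.4."""
    rng = random.Random(seed)
    f = lambda x: sum(v * v for v in x)
    c1 = c2 = 1.49445          # conventional acceleration coefficients
    vmax = 4.0                 # velocity clamp
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in X]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        w = 0.9 - 0.5 * t / iters              # TVIW schedule: 0.9 -> 0.4
        for i in range(n):
            for d in range(dim):
                v = (w * V[i][d]
                     + c1 * rng.random() * (pbest[i][d] - X[i][d])
                     + c2 * rng.random() * (gbest[d] - X[i][d]))
                V[i][d] = max(-vmax, min(vmax, v))
                X[i][d] += V[i][d]
            if f(X[i]) < f(pbest[i]):
                pbest[i] = X[i][:]
                if f(X[i]) < f(gbest):
                    gbest = X[i][:]
    return gbest, f(gbest)

best, best_val = pso_sphere()
```

The large early inertia favors exploration and the small late inertia favors exploitation, which is the usual rationale for TVIW over a constant weight.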
Keulen, van T.A.C.; Gillot, J.; Jager, de A.G.; Steinbuch, M.
2014-01-01
This paper presents a numerical solution for scalar state constrained optimal control problems. The algorithm rewrites the constrained optimal control problem as a sequence of unconstrained optimal control problems which can be solved recursively as a two point boundary value problem. The solution
Software Support for Optimizing Layout Solution in Lean Production
Directory of Open Access Journals (Sweden)
Naqib Daneshjo
2018-02-01
Full Text Available As progressive managerial styles, the techniques based on "lean thinking" are being increasingly promoted. They are focused on applying lean production concepts to all phases of the product lifecycle and also to the business environment. This innovative approach strives to eliminate any wasting of resources and shortens the time to respond to customer requirements, including redesigning the structure of the organization's supply chain. A lean organization is created mainly by employees, their creative potential, knowledge, self-realization and motivation for continuous improvement of the processes and the production systems. A set of tools, techniques and methods of lean production is basically always very similar. Only the form of their presentation or classification into individual phases of the product lifecycle may differ. The authors present the results of their research from the design phases of production systems to optimize their layout solution with software support and 3D simulation and visualization. Modelling is based on the use of Tecnomatix's and Photomodeler's progressive software tools and a dynamic model for capacity dimensioning of a more intelligent production system.
Directory of Open Access Journals (Sweden)
Ajibade Oluwaseyi Ayodele
2016-01-01
Full Text Available In this study, which is the second part of discussions on tapped density optimisation for four agricultural wastes (particles of coconut, periwinkle, palm kernel and egg shells), a performance analysis on a comparative basis is made. This paper pioneers a study direction in which the optimisation of process variables is pursued using the Taguchi method integrated with the Pareto 80-20 rule. Negative percentage improvements resulted when the optimal tapped density was compared with the average tapped density. However, the performance analysis between the optimal tapped density and the peak tapped density values yielded positive percentage improvements for the four filler particles. The performance analysis results validate the effectiveness of using the Taguchi method in improving the tapped density properties of the filler particles. The application of the Pareto 80-20 rule to the table of parameters and levels produced revised tables of parameters and levels, which helped to identify the factor-level settings of each parameter that are economical for optimality. The Pareto 80-20 rule also produced revised S/N response tables, which were used to identify the S/N ratios relevant to optimality.
Directory of Open Access Journals (Sweden)
Kristoffer Petersson
2017-07-01
Full Text Available We present a clinical distance measure for Pareto front evaluation studies in radiotherapy, which we show strongly correlates (r = 0.74 and 0.90) with clinical plan quality evaluation. For five prostate cases, sub-optimal treatment plans located at a clinical distance value of >0.32 (0.28–0.35) from fronts of Pareto optimal plans were assessed to be of lower plan quality by our 12 observers (p < .05). In conclusion, the clinical distance measure can be used to determine whether the difference between a front and a given plan (or between different fronts) corresponds to a clinically significant plan quality difference.
International Nuclear Information System (INIS)
Elsays, Mostafa A.; Naguib Aly, M; Badawi, Alya A.
2010-01-01
The Particle Swarm Optimization (PSO) algorithm is used to optimize the design of shell-and-tube heat exchangers and determine the optimal feasible solutions so as to eliminate trial and error during the design process. The design formulation takes into account the area and the total annual cost of heat exchangers as two objective functions, together with operating as well as geometrical constraints. The Nonlinear Constrained Single Objective Particle Swarm Optimization (NCSOPSO) algorithm is used to minimize and find the optimal feasible solution for each of the nonlinear constrained objective functions alone. Then, a novel Nonlinear Constrained Multi-objective Particle Swarm Optimization (NCMOPSO) algorithm is used to minimize and find the Pareto optimal solutions for both of the nonlinear constrained objective functions together. The experimental results show that the two algorithms are very efficient and fast, and can find accurate optimal feasible solutions to the shell-and-tube heat exchanger design optimization problem. (orig.)
Pareto frontier analyses based decision making tool for transportation of hazardous waste
International Nuclear Information System (INIS)
Das, Arup; Mazumder, T.N.; Gupta, A.K.
2012-01-01
Highlights: ► Posteriori method using multi-objective approach to solve bi-objective routing problem. ► System optimization (with multiple source–destination pairs) in a capacity-constrained network using non-dominated sorting. ► Tools like cost elasticity and angle-based focus used to analyze the Pareto frontier to aid stakeholders in making informed decisions. ► A real-life case study of the Kolkata Metropolitan Area to explain the workability of the model. - Abstract: Transportation of hazardous wastes through a region poses an immense threat to development along its road network. The risk to the population exposed to such activities has been documented in the past. However, a comprehensive framework for routing hazardous wastes has often been overlooked. A regional hazardous waste management scheme should incorporate a comprehensive framework for hazardous waste transportation, one that accounts for the various stakeholders involved in decision making. Hence, a multi-objective approach is required to safeguard the interests of all the concerned stakeholders. The objective of this study is to design a methodology for the routing of hazardous wastes between the generating units and the disposal facilities through a capacity-constrained network. The proposed methodology uses a posteriori method with a multi-objective approach to find non-dominated solutions for a system consisting of multiple origins and destinations. A case study of the transportation of hazardous wastes in the Kolkata Metropolitan Area is also provided to elucidate the methodology.
Tuning rules for robust FOPID controllers based on multi-objective optimization with FOPDT models.
Sánchez, Helem Sabina; Padula, Fabrizio; Visioli, Antonio; Vilanova, Ramon
2017-01-01
In this paper a set of optimally balanced tuning rules for fractional-order proportional-integral-derivative controllers is proposed. The control problem of minimizing at once the integrated absolute error for both the set-point and the load disturbance responses is addressed. The control problem is stated as a multi-objective optimization problem where a first-order-plus-dead-time process model, subject to a robustness constraint based on maximum sensitivity, has been considered. A set of Pareto optimal solutions is obtained for different normalized dead times, and the optimal balance between the competing objectives is then obtained by choosing the Nash solution among the Pareto optimal ones. A curve fitting procedure has then been applied in order to generate suitable tuning rules. Several simulation results show the effectiveness of the proposed approach.
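Choosing the Nash solution among Pareto-optimal points, as done above, amounts to maximizing the product of gains over a disagreement point (a generic sketch; here the disagreement point is taken as the component-wise worst values on the front, and the trade-off data are hypothetical):

```python
def nash_solution(front):
    """Pick the Nash point from a Pareto front of minimized objectives:
    maximize the product of gains over the disagreement point, here the
    component-wise worst values on the front."""
    worst = [max(p[j] for p in front) for j in range(len(front[0]))]
    def gain(p):
        g = 1.0
        for j, v in enumerate(p):
            g *= worst[j] - v
        return g
    return max(front, key=gain)

# hypothetical (set-point IAE, load-disturbance IAE) trade-off points
front = [(1.0, 9.0), (2.0, 5.0), (4.0, 4.0), (9.0, 1.0)]
balanced = nash_solution(front)   # the extremes earn zero gain in one objective
```

Because each extreme point has zero gain in one objective, the product criterion automatically selects a balanced interior point of the front.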
Asymptotic Normality of the Optimal Solution in Multiresponse Surface Mathematical Programming
Díaz-García, José A.; Caro-Lopera, Francisco J.
2015-01-01
An explicit form for the effect of perturbations of the matrix of regression coefficients on the optimal solution in multiresponse surface methodology is obtained in this paper. Then, the sensitivity analysis of the optimal solution is studied, and the critical point characterisation of the convex program associated with the optimum of a multiresponse surface is also analysed. Finally, the asymptotic normality of the optimal solution is derived by standard methods.
Xu, Gongxian; Liu, Ying; Gao, Qunwang
2016-02-10
This paper deals with the multi-objective optimization of the continuous bio-dissimilation process of glycerol to 1,3-propanediol. In order to maximize the production rate of 1,3-propanediol, maximize the conversion rate of glycerol to 1,3-propanediol, maximize the conversion rate of glycerol, and minimize the concentration of the by-product ethanol, we first propose six new multi-objective optimization models that can simultaneously optimize any two of the four objectives above. These multi-objective optimization problems are then solved using the weighted-sum and normal-boundary intersection methods, respectively. Both the Pareto filter algorithm and removal criteria are used to remove the non-Pareto-optimal points obtained by the normal-boundary intersection method. The results show that the normal-boundary intersection method can successfully obtain the approximate Pareto optimal sets of all the proposed multi-objective optimization problems, while the weighted-sum approach cannot achieve the overall Pareto optimal solutions of some multi-objective problems.
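The weakness of the weighted-sum approach noted here — it cannot reach Pareto points lying on non-convex parts of the front — can be demonstrated on a three-point example (a generic sketch unrelated to the bioprocess model; objective values are hypothetical and both objectives are minimized):

```python
def weighted_sum_scan(candidates, weights):
    """Scalarize two minimized objectives as w*f1 + (1-w)*f2 for each weight
    and collect the distinct optima found."""
    found = set()
    for w in weights:
        found.add(min(candidates, key=lambda p: w * p[0] + (1 - w) * p[1]))
    return found

# Pareto-optimal points forming a non-convex front: (0.6, 0.6) lies above the
# line joining the extremes, so no weight can make it the scalarized optimum
candidates = [(0.0, 1.0), (0.6, 0.6), (1.0, 0.0)]
found = weighted_sum_scan(candidates, [i / 10 for i in range(11)])
```

The scan only ever returns the two extreme points; methods such as normal-boundary intersection are designed precisely to recover the missed interior point.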
Mahmoodabadi, M J; Taherkhorsandi, M; Bagheri, A
2014-01-01
An optimal robust state feedback tracking controller is introduced to control a biped robot. In the literature, the parameters of the controller are usually determined by a tedious trial-and-error process. To eliminate this process and design the parameters of the proposed controller, multiobjective evolutionary algorithms, namely the proposed method, modified NSGA-II, the Sigma method, and MATLAB's Toolbox MOGA, are employed in this study. Among the evolutionary optimization algorithms used to design the controller for biped robots, the proposed method performs better at designing the controller, since it provides ample opportunities for designers to choose the most appropriate point based upon the design criteria. Three points are chosen from the non-dominated solutions of the obtained Pareto front based on two conflicting objective functions, that is, the normalized summation of angle errors and the normalized summation of control effort. The obtained results demonstrate the efficiency of the proposed controller in controlling a biped robot.
An Investigation of the Pareto Distribution as a Model for High Grazing Angle Clutter
2011-03-01
radar detection schemes under controlled conditions. Complicated clutter models result in mathematical difficulties in the determination of optimal and...a population [7]. It has been used in the modelling of actuarial data; an example is in excess of loss quotations in insurance [8]. Its usefulness as...UNCLASSIFIED modified Bessel functions, making it difficult to employ in radar detection schemes. The Pareto Distribution is amenable to mathematical
Solution of optimal power flow using evolutionary-based algorithms
African Journals Online (AJOL)
It aims to estimate the optimal settings of real generator output power, bus voltage, ...... Lansey, K. E., 2003, Optimization of water distribution network design using ... Pandit, M., 2016, Economic load dispatch of wind-solar-thermal system using ...
Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II
Pal, Kamal; Pal, Surjya K.
2018-05-01
Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization yields a number of solutions on the Pareto-optimal front. Optimization methods based on mathematical regression models are often found to be inadequate for highly non-linear arc welding processes. Thus, various global evolutionary approaches such as artificial neural networks and genetic algorithms (GA) have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process, using back-propagation neural network (BPNN) based weld quality feature models. The primary objective, maintaining butt joint weld quality, is the maximization of tensile strength with minimum plate distortion. BPNN has been used to compute the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments have been conducted on low carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after the 20th generation was considered the best, with no further improvement observed. Both the joint strength and the transverse shrinkage were found to be drastically improved over the design-of-experiments results, as per the validated Pareto-optimal solutions obtained.
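The non-dominated sorting step that gives NSGA-II its ranked fronts can be sketched as a simple peeling procedure (an illustrative sketch, not the paper's implementation; both objectives are treated as minimized, whereas the paper pairs a maximized strength objective with a minimized distortion objective):

```python
def non_dominated_sort(points):
    """Peel points into ranked Pareto fronts (both objectives minimized),
    a simplified O(n^2)-per-front version of NSGA-II's sorting stage."""
    def dom(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining if not any(dom(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# hypothetical bi-objective evaluations, e.g. (distortion, negated strength)
pts = [(1, 4), (2, 2), (4, 1), (3, 3), (4, 4)]
fronts = non_dominated_sort(pts)   # ranks 1, 2, 3
```

Elitism in NSGA-II then fills the next generation front by front, which is what drives the population toward the first (Pareto-optimal) front.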
Pareto Improving Price Regulation when the Asset Market is Incomplete
Herings, P.J.J.; Polemarchakis, H.M.
1999-01-01
When the asset market is incomplete, competitive equilibria are constrained suboptimal, which provides scope for Pareto-improving interventions. Price regulation can be such a Pareto-improving policy, even when the welfare effects of rationing are taken into account. An appealing aspect of price
Pareto 80/20 Law: Derivation via Random Partitioning
Lipovetsky, Stan
2009-01-01
The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not previously been given a theoretical basis. The article considers a derivation of this 80/20 rule and some other standard…
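For context, in a Pareto-tailed population the share of the total effect attributable to the top fraction p is p^((α−1)/α), so the exact 80/20 proportion corresponds to a tail index α = log 5 / log 4 ≈ 1.16 (a standard textbook computation, not the article's random-partitioning derivation):

```python
import math

def top_share(p, alpha):
    """Share of the total effect due to the top fraction p of a Pareto(alpha)
    population (alpha > 1): p ** ((alpha - 1) / alpha)."""
    return p ** ((alpha - 1) / alpha)

# tail index for which the top 20% account for exactly 80% of the effect
alpha_8020 = math.log(5) / math.log(4)     # = log_4(5), about 1.16
share = top_share(0.2, alpha_8020)         # recovers 0.8
```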
The exponential age distribution and the Pareto firm size distribution
Coad, Alex
2008-01-01
Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firm age, to obtain the empirical Pareto distribution.
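The mechanism can be sketched as a Monte Carlo experiment: multiplicative Gibrat growth accumulated over an exponentially distributed age makes log-size approximately Laplace, hence power-law (Pareto) tails in size (an illustrative simulation with arbitrary parameters, not the paper's calibration):

```python
import math
import random

def simulate_firm_sizes(n=10000, lam=0.1, sigma=0.2, seed=7):
    """Gibrat-type multiplicative growth over an exponentially distributed
    age: each firm's log-size is a sum of Gaussian shocks over its lifetime,
    so the cross-section of log-sizes is approximately Laplace."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(n):
        age = int(rng.expovariate(lam))                     # age in periods
        log_size = sum(rng.gauss(0.0, sigma) for _ in range(age))
        sizes.append(math.exp(log_size))
    return sizes

sizes = simulate_firm_sizes()
```

The resulting cross-section is strongly right-skewed: most firms stay near the entry size of 1 while a few grow (or shrink) by large multiplicative factors.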
Families of optimal thermodynamic solutions for combined cycle gas turbine (CCGT) power plants
International Nuclear Information System (INIS)
Godoy, E.; Scenna, N.J.; Benz, S.J.
2010-01-01
Optimal designs of a CCGT power plant characterized by maximum second-law efficiency values are determined for a wide range of power demands and different values of the available heat transfer area. These thermodynamically optimal solutions are found within a feasible operation region by means of a non-linear mathematical programming (NLP) model, where decision variables (i.e. transfer areas, power production, mass flow rates, temperatures and pressures) can vary freely. Technical relationships among them are used to systematize the optimal values of the design and operative variables of a CCGT power plant into optimal solution sets, named here optimal solution families. From an operative and design point of view, the families of optimal solutions allow knowing in advance the optimal values of the CCGT variables when facing changes in power demand or when adjusting the design to an available heat transfer area.
van de Schoot, A J A J; Visser, J; van Kesteren, Z; Janssen, T M; Rasch, C R N; Bel, A
2016-02-21
The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D99%) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D99%, rectum V30Gy and bladder V40Gy to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans, of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configurations. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D99% on average by 0.2 Gy and decreased the median rectum V30Gy and median bladder V40Gy on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal.
Craft, David; Monz, Michael
2010-02-01
The purpose of this work is to introduce a method to simultaneously explore a collection of Pareto surfaces. The method will allow radiotherapy treatment planners to interactively explore treatment plans for different beam angle configurations as well as different treatment modalities. The authors assume a convex optimization setting and represent the Pareto surface for each modality or given beam set by a set of discrete points on the surface. Weighted averages of these discrete points produce a continuous representation of each Pareto surface. The authors calculate a set of Pareto surfaces and use linear programming to navigate across the individual surfaces, allowing switches between surfaces. The switches are organized such that the plan profits in the requested way, while trying to keep the change in dose as small as possible. The system is demonstrated on a phantom pancreas IMRT case using 100 different five-beam configurations and a multicriteria formulation with six objectives. The system has intuitive behavior and is easy to control. Also, because the underlying linear programs are small, the system is fast enough to offer real-time exploration of the Pareto surfaces of the given beam configurations. The system presented offers a sound starting point for building clinical systems for multicriteria exploration of different modalities, and offers a controllable way to explore hundreds of beam angle configurations in IMRT planning, allowing the users to focus their attention on the dose distribution and treatment planning objectives instead of spending excessive time on the technicalities of delivery.
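The continuous surface representation used for this navigation is simply the set of convex combinations of the discrete Pareto points, valid in the convex optimization setting assumed above (a minimal sketch with hypothetical plan objective values):

```python
def interpolate(plans, weights):
    """Convex combination of discrete Pareto-surface points: the continuous
    surface representation that the navigation scheme moves across."""
    assert all(w >= 0 for w in weights) and abs(sum(weights) - 1.0) < 1e-9
    dim = len(plans[0])
    return tuple(sum(w * p[d] for w, p in zip(weights, plans)) for d in range(dim))

# two hypothetical plans on one surface: (coverage deficit, OAR dose)
plans = [(0.0, 10.0), (4.0, 2.0)]
midway = interpolate(plans, [0.5, 0.5])   # a plan trading off both objectives
```

In the convex setting, any such combination corresponds to an achievable (deliverable) plan, which is what makes linear-programming navigation across the stored points sound.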
Coupled Low-thrust Trajectory and System Optimization via Multi-Objective Hybrid Optimal Control
Vavrina, Matthew A.; Englander, Jacob Aldo; Ghosh, Alexander R.
2015-01-01
The optimization of low-thrust trajectories is tightly coupled with the spacecraft hardware. Trading trajectory characteristics against system parameters to identify viable solutions and determine mission sensitivities across discrete hardware configurations is labor intensive. Local independent optimization runs can sample the design space, but a global exploration that resolves the relationships between the system variables across multiple objectives enables a full mapping of the optimal solution space. A multi-objective, hybrid optimal control algorithm is formulated using a multi-objective genetic algorithm as an outer-loop systems optimizer around a global trajectory optimizer. The coupled problem is solved simultaneously to generate Pareto-optimal solutions in a single execution. The automated approach is demonstrated on two boulder return missions.
An Improved Particle Swarm Optimization for Solving Bilevel Multiobjective Programming Problem
Directory of Open Access Journals (Sweden)
Tao Zhang
2012-01-01
Full Text Available An improved particle swarm optimization (PSO) algorithm is proposed for solving the bilevel multiobjective programming problem (BLMPP). For such problems, the proposed algorithm directly simulates the decision process of bilevel programming, which differs from most traditional algorithms designed for specific versions or based on specific assumptions. The BLMPP is transformed into interactively solving multiobjective optimization problems at the upper and lower levels with the improved PSO, and a set of approximate Pareto optimal solutions for the BLMPP is obtained using an elite strategy. This interactive procedure is repeated until accurate Pareto optimal solutions of the original problem are found. Finally, some numerical examples are given to illustrate the feasibility of the proposed algorithm.
Multi-Objective Optimization for Smart House Applied Real Time Pricing Systems
Directory of Open Access Journals (Sweden)
Yasuaki Miyazato
2016-12-01
Full Text Available A smart house generally has a photovoltaic panel (PV), a heat pump (HP), a solar collector (SC) and a fixed battery. Since the fixed battery can buy and store inexpensive electricity during the night, the electricity bill can be reduced. However, a large-capacity fixed battery is very expensive, so there is a need to determine the economical capacity of the fixed battery. Furthermore, surplus electric power can be sold through a buyback program, by which the PV can be effectively utilized and contribute to the reduction of the electricity bill. With this in mind, this research proposes a multi-objective optimization whose purposes are electric demand control and reduction of the electricity bill in the smart house. In this optimization problem, the Pareto optimal solutions are searched for depending on the fixed battery capacity. Additionally, it is shown that consumers can choose what suits them by comparing the Pareto optimal solutions.
A Pareto Algorithm for Efficient De Novo Design of Multi-functional Molecules.
Daeyaert, Frits; Deem, Michael W
2017-01-01
We have introduced a Pareto sorting algorithm into Synopsis, a de novo design program that generates synthesizable molecules with desirable properties. We give a detailed description of the algorithm and illustrate its working in two different de novo design settings: the design of putative dual and selective FGFR and VEGFR inhibitors, and the successful design of organic structure-directing agents (OSDAs) for the synthesis of zeolites. We show that the introduction of Pareto sorting not only enables the simultaneous optimization of multiple properties but also greatly improves the performance of the algorithm in generating molecules with hard-to-meet constraints. This in turn allows us to suggest approaches to address the problem of false positive hits in de novo structure-based drug design by introducing structural and physicochemical constraints in the designed molecules, and by forcing essential interactions between these molecules and their target receptor. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
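The abstract above does not reproduce the sorting routine itself, but the core of any Pareto sorting step is a dominance filter over objective vectors. A minimal sketch in Python, using the minimization convention; the candidate property vectors below are invented for illustration and are not from the paper:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors,
    preserving input order."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

# Hypothetical two-objective scores per molecule, e.g. (affinity, selectivity penalty)
candidates = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (2.5, 2.5)]
front = pareto_front(candidates)
# (3.0, 3.0) and (2.5, 2.5) are dominated by (2.0, 2.0) and drop out
```

For large candidate sets, dedicated non-dominated sorting algorithms (e.g. the fast non-dominated sort popularized by NSGA-II) avoid this quadratic pairwise comparison.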
Directory of Open Access Journals (Sweden)
Sie Long Kek
2015-01-01
Full Text Available A computational approach is proposed for solving the discrete time nonlinear stochastic optimal control problem. Our aim is to obtain the optimal output solution of the original optimal control problem through solving the simplified model-based optimal control problem iteratively. In our approach, the adjusted parameters are introduced into the model used such that the differences between the real system and the model used can be computed. Particularly, system optimization and parameter estimation are integrated interactively. On the other hand, the output is measured from the real plant and is fed back into the parameter estimation problem to establish a matching scheme. During the calculation procedure, the iterative solution is updated in order to approximate the true optimal solution of the original optimal control problem despite model-reality differences. For illustration, a wastewater treatment problem is studied and the results show the efficiency of the approach proposed.
International Nuclear Information System (INIS)
Jiang, R.
2009-01-01
It is difficult to find the optimal solution of the sequential age replacement policy for a finite time horizon. This paper presents an accurate approximation for finding an approximate optimal solution of the sequential replacement policy. The proposed approximation is computationally simple and suitable for any failure distribution. Its accuracy is illustrated by two examples. Based on the approximate solution, an approximate estimate for the total cost is derived.
Chen, Zhihuan; Yuan, Yanbin; Yuan, Xiaohui; Huang, Yuehua; Li, Xianshan; Li, Wenwu
2015-05-01
A hydraulic turbine regulating system (HTRS) is one of the most important components of a hydropower plant, playing a key role in maintaining the safety, stability and economical operation of hydro-electrical installations. At present, the conventional PID controller is widely applied in HTRS systems for its practicability and robustness, and the primary problem with this control law is how to tune the parameters optimally, i.e. to determine the PID controller gains for satisfactory performance. In this paper, a multi-objective evolutionary algorithm named adaptive grid particle swarm optimization (AGPSO) is applied to the PID gain tuning problem of the HTRS system. Unlike traditional single-objective optimization methods, this AGPSO-based method is designed to handle settling time and overshoot level simultaneously, generating a set of non-inferior alternative solutions (i.e. a Pareto set). Furthermore, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto set. An illustrative example of parameter tuning for the nonlinear HTRS system is introduced to verify the feasibility and effectiveness of the proposed AGPSO-based approach, as compared with two other prominent multi-objective algorithms, i.e. Non-dominated Sorting Genetic Algorithm II (NSGA-II) and Strength Pareto Evolutionary Algorithm II (SPEA-II), in terms of the quality and diversity of the obtained Pareto solution sets. Simulation results show that the AGPSO approach outperforms the compared methods, with higher efficiency and better quality, whether the HTRS system works under unload or load conditions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
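The fuzzy membership selection mentioned in the abstract is a standard recipe in the literature: each objective value on the front is mapped linearly to a satisfaction degree in [0, 1], and the solution with the highest total satisfaction is taken as the best compromise. A hedged sketch (the front values below are invented, not the paper's data):

```python
def best_compromise(front):
    """Pick the best-compromise point from a Pareto front (minimization) via
    linear fuzzy membership: mu_j = (f_max_j - f_j) / (f_max_j - f_min_j)."""
    m = len(front[0])                      # number of objectives
    f_min = [min(p[j] for p in front) for j in range(m)]
    f_max = [max(p[j] for p in front) for j in range(m)]
    def total_membership(p):
        return sum((f_max[j] - p[j]) / (f_max[j] - f_min[j]) for j in range(m))
    return max(front, key=total_membership)

# Hypothetical front trading settling time (s) against overshoot (%)
front = [(0.8, 12.0), (1.2, 6.0), (2.0, 1.0)]
choice = best_compromise(front)  # the middle point balances both objectives
```

The extreme points each score 1.0 in one objective and 0.0 in the other, so a balanced interior point tends to win; weighting the memberships would let a decision-maker bias the choice.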
Particle Swarm Optimization Toolbox
Grant, Michael J.
2010-01-01
The Particle Swarm Optimization Toolbox is a library of evolutionary optimization tools developed in the MATLAB environment. The algorithms contained in the library include a genetic algorithm (GA), a single-objective particle swarm optimizer (SOPSO), and a multi-objective particle swarm optimizer (MOPSO). Development focused on the SOPSO and MOPSO; the GA was included mainly for comparison purposes, and the particle swarm optimizers appeared to perform better for a wide variety of optimization problems. All algorithms are capable of performing unconstrained and constrained optimization. The particle swarm optimizers are capable of performing single- and multi-objective optimization. The SOPSO and MOPSO algorithms are based on swarming theory and bird-flocking patterns, searching the trade space for the optimal solution or the optimal trade between competing objectives. The MOPSO generates Pareto fronts for objectives that are in competition. The GA, based on Darwinian evolutionary theory, consists of individuals that form a population in the design space. The population mates to form offspring at new locations in the design space, and these offspring contain traits from both parents. The algorithm relies on this combination of parental traits to yield, ideally, an improved solution over either of the original parents; as the algorithm progresses, individuals that hold optimal traits emerge as the optimal solutions. Due to the generic design of all the optimization algorithms, each algorithm interfaces with a user-supplied objective function. This function serves as a "black box" to the optimizers, whose only purpose is to evaluate solutions provided by the optimizers. Hence, the user-supplied function can be a numerical simulation, an analytical function, etc., since the specific detail of this function is of no concern to the optimizer. These algorithms were originally developed to support entry
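As a rough illustration of the black-box interface described above, here is a minimal single-objective PSO in Python. The inertia and acceleration coefficients are common textbook defaults, not the toolbox's actual settings, and the sphere function stands in for the user-supplied simulation:

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal single-objective PSO treating `objective` as a black box."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # velocity: inertia + pull toward personal and global bests
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])                 # the only use of the black box
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
best, value = pso(lambda x: sum(xi * xi for xi in x), dim=3, bounds=(-5.0, 5.0))
```

The objective is called only to score candidate positions, which is exactly the "black-box" contract the toolbox description relies on.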
System Approach of Logistic Costs Optimization Solution in Supply Chain
Majerčák, Peter; Masárová, Gabriela; Buc, Daniel; Majerčáková, Eva
2013-01-01
This paper is focused on the possibility of using cost simulation in the supply chain, where costs are at a relatively high level. Our goal is to determine costs using logistics cost optimization, which must necessarily be applied to business activities in supply chain management. The paper emphasizes the need to perform optimization across the whole supply chain rather than in isolation. Our goal is to compare the classic approach, in which every part tracks its costs in isolation and tries to minimize them, with the system (l...
Efficient solution method for optimal control of nuclear systems
International Nuclear Information System (INIS)
Naser, J.A.; Chambre, P.L.
1981-01-01
To improve the utilization of existing fuel sources, the use of optimization techniques is becoming more important. A technique for solving systems of coupled ordinary differential equations with initial, boundary, and/or intermediate conditions is given. This method has a number of inherent advantages over existing techniques as well as being efficient in terms of computer time and space requirements. An example of computing the optimal control for a spatially dependent reactor model with and without temperature feedback is given. 10 refs
Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar
Directory of Open Access Journals (Sweden)
Graham V. Weinberg
2012-01-01
Full Text Available The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.
Energy Technology Data Exchange (ETDEWEB)
Dall'Anese, Emiliano; Dhople, Sairaj V.; Giannakis, Georgios B.
2015-07-01
This paper considers a collection of networked nonlinear dynamical systems, and addresses the synthesis of feedback controllers that seek optimal operating points corresponding to the solution of pertinent network-wide optimization problems. Particular emphasis is placed on the solution of semidefinite programs (SDPs). The design of the feedback controller is grounded on a dual ε-subgradient approach, with the dual iterates utilized to dynamically update the dynamical-system reference signals. Global convergence is guaranteed for diminishing stepsize rules, even when the reference inputs are updated at a faster rate than the dynamical-system settling time. The application of the proposed framework to the control of power-electronic inverters in AC distribution systems is discussed. The objective is to bridge the time-scale separation between real-time inverter control and network-wide optimization. Optimization objectives assume the form of SDP relaxations of prototypical AC optimal power flow problems.
Wang, Hongrui; Liu, Hongwei; Cai, Leixin; Wang, Caixia; Lv, Qiang
2017-07-10
In this study, we extended the replica exchange Monte Carlo (REMC) sampling method to protein-small molecule docking conformational prediction using RosettaLigand. In contrast to the traditional Monte Carlo (MC) and REMC sampling methods, these methods use multi-objective optimization Pareto front information to facilitate the selection of replicas for exchange. The Pareto front information generated to select lower energy conformations as representative conformation structure replicas can facilitate the convergence of the available conformational space, including available near-native structures. Furthermore, our approach directly provides min-min scenario Pareto optimal solutions, as well as a hybrid of the min-min and max-min scenario Pareto optimal solutions with lower energy conformations for use as structure templates in the REMC sampling method. These methods were validated based on a thorough analysis of a benchmark data set containing 16 benchmark test cases. An in-depth comparison between MC, REMC, multi-objective optimization-REMC (MO-REMC), and hybrid MO-REMC (HMO-REMC) sampling methods was performed to illustrate the differences between the four conformational search strategies. Our findings demonstrate that the MO-REMC and HMO-REMC conformational sampling methods are powerful approaches for obtaining protein-small molecule docking conformational predictions based on the binding energy of complexes in RosettaLigand.
Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1
Langenbrunner, B.; Neelin, J. D.
2017-09-01
Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.
Research on vehicle routing optimization for the terminal distribution of B2C E-commerce firms
Zhang, Shiyun; Lu, Yapei; Li, Shasha
2018-05-01
In this paper, we establish a half-open multi-objective optimization model for the vehicle routing problem of B2C (business-to-customer) e-commerce firms. To minimize both the current transport distance and the disparity between the expected shipments and the transport capacity in the next distribution, we apply the concepts of dominated solutions and Pareto solutions to standard particle swarm optimization and propose a MOPSO (multi-objective particle swarm optimization) algorithm to support the model. We also obtain the optimized solution of the MOPSO algorithm on randomly generated data, which verifies the validity of the model.
Optimal allocation and adaptive VAR control of PV-DG in distribution networks
International Nuclear Information System (INIS)
Fu, Xueqian; Chen, Haoyong; Cai, Runqing; Yang, Ping
2015-01-01
Highlights: • A methodology for optimal PV-DG allocation based on a combination of algorithms. • Dealing with the randomicity of solar power energy using CCSP. • Presenting a VAR control strategy to balance the technical demands. • Finding the Pareto solutions using MOPSO and SVM. • Evaluating the Pareto solutions using WRSR. - Abstract: The development of distributed generation (DG) has brought new challenges to power networks. One of them that catches extensive attention is the voltage regulation problem of distribution networks caused by DG. Optimal allocation of DG in distribution networks is another well-known problem being widely investigated. This paper proposes a new method for the optimal allocation of photovoltaic distributed generation (PV-DG) considering the non-dispatchable characteristics of PV units. An adaptive reactive power control model is introduced in PV-DG allocation as to balance the trade-off between the improvement of voltage quality and the minimization of power loss in a distribution network integrated with PV-DG units. The optimal allocation problem is formulated as a chance-constrained stochastic programming (CCSP) model for dealing with the randomness of solar power energy. A novel algorithm combining the multi-objective particle swarm optimization (MOPSO) with support vector machines (SVM) is proposed to find the Pareto front consisting of a set of possible solutions. The Pareto solutions are further evaluated using the weighted rank sum ratio (WRSR) method to help the decision-maker obtain the desired solution. Simulation results on a 33-bus radial distribution system show that the optimal allocation method can fully take into account the time-variant characteristics and probability distribution of PV-DG, and obtain the best allocation scheme
Optimization problems with equilibrium constraints and their numerical solution
Czech Academy of Sciences Publication Activity Database
Kočvara, Michal; Outrata, Jiří
Roč. 101, č. 1 (2004), s. 119-149 ISSN 0025-5610 R&D Projects: GA AV ČR IAA1075005 Grant - others: BMBF(DE) 03ZOM3ER Institutional research plan: CEZ:AV0Z1075907 Keywords: optimization problems * MPEC * MPCC Subject RIV: BA - General Mathematics Impact factor: 1.016, year: 2004
Optimization of chromium biosorption in aqueous solution by marine ...
African Journals Online (AJOL)
Optimization of a chromium biosorption process was performed by varying three independent variables pH (0.5 to 3.5), initial chromium ion concentration (10 to 30 mg/L), and Yarrowia lipolytica dosage (2 to 4 g/L) using a Doehlert experimental design (DD) involving response surface methodology (RSM). For the maximum ...
Optimal Component Lumping: problem formulation and solution techniques
DEFF Research Database (Denmark)
Lin, Bao; Leibovici, Claude F.; Jørgensen, Sten Bay
2008-01-01
This paper presents a systematic method for optimal lumping of a large number of components in order to minimize the loss of information. In principle, a rigorous composition-based model is preferable to describe a system accurately. However, computational intensity and numerical issues restrict ...
Optimized Baxter model of protein solutions : Electrostatics versus adhesion
Prinsen, P.; Odijk, T.
2004-01-01
A theory is set up of spherical proteins interacting by screened electrostatics and constant adhesion, in which the effective adhesion parameter is optimized by a variational principle for the free energy. An analytical approach to the second virial coefficient is first outlined by balancing the
Directory of Open Access Journals (Sweden)
Jorge Caldera-Serrano
2015-09-01
Full Text Available The reuse of the audiovisual collections of television networks is analysed in order to detect whether the Pareto principle holds, providing mechanisms for the control and exploitation of the least-used part of the audiovisual collection. It is found that the Pareto correlation appears not only in usage but also in the presence of thematic and onomastic elements in the archive and in the dissemination of content, so forms of control are proposed both for the integration of information into the collection and for the allocation of resources in dissemination. The Pareto index, Media Asset Management systems and the paradigm shift to digital are also described, as they are fundamental to understanding the problems, and the solutions for eliminating them, in retrieval and in shaping the collection. Keywords: Information processing. Television. Electronic media. Information systems evaluation.
The linear ordering problem: an algorithm for the optimal solution ...
African Journals Online (AJOL)
In this paper we describe and implement an algorithm for the exact solution of the Linear Ordering problem. Linear Ordering is the problem of finding a linear order of the nodes of a graph such that the sum of the weights consistent with this order is as large as possible. It is an NP-hard combinatorial optimisation ...
Yu, Xiang; Zhang, Xueqing
2017-01-01
Comprehensive learning particle swarm optimization (CLPSO) is a powerful state-of-the-art single-objective metaheuristic. Extending from CLPSO, this paper proposes multiswarm CLPSO (MSCLPSO) for multiobjective optimization. MSCLPSO involves multiple swarms, with each swarm associated with a separate original objective. Each particle's personal best position is determined just according to the corresponding single objective. Elitists are stored externally. MSCLPSO differs from existing multiobjective particle swarm optimizers in three aspects. First, each swarm focuses on optimizing the associated objective using CLPSO, without learning from the elitists or any other swarm. Second, mutation is applied to the elitists and the mutation strategy appropriately exploits the personal best positions and elitists. Third, a modified differential evolution (DE) strategy is applied to some extreme and least crowded elitists. The DE strategy updates an elitist based on the differences of the elitists. The personal best positions carry useful information about the Pareto set, and the mutation and DE strategies help MSCLPSO discover the true Pareto front. Experiments conducted on various benchmark problems demonstrate that MSCLPSO can find nondominated solutions distributed reasonably over the true Pareto front in a single run.
Using Pareto points for model identification in predictive toxicology
2013-01-01
Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
Paasche, H.; Tronicke, J.
2012-04-01
In many near-surface geophysical applications, multiple tomographic data sets are routinely acquired to explore subsurface structures and parameters. Linking the model generation process of multi-method geophysical data sets can significantly reduce ambiguities in geophysical data analysis and model interpretation. Most geophysical inversion approaches rely on local search optimization methods used to find an optimal model in the vicinity of a user-given starting model, so the final solution may critically depend on the initial model. Alternatively, global optimization (GO) methods have been used to invert geophysical data. They explore the solution space in more detail and determine the optimal model independently of the starting model. Additionally, they can be used to find sets of optimal models, allowing a further analysis of model parameter uncertainties. Here we employ particle swarm optimization (PSO) to realize the global optimization of tomographic data. PSO is an emergent method based on swarm intelligence, characterized by fast and robust convergence towards optimal solutions. The fundamental principle of PSO is inspired by nature: the algorithm mimics the behavior of a flock of birds searching for food. In PSO, a number of particles cruise a multi-dimensional solution space striving to find optimal model solutions explaining the acquired data. The particles communicate their positions and success and direct their movement according to the position of the currently most successful particle of the swarm. The success of a particle, i.e. the quality of the model it has currently found, must be uniquely quantifiable to identify the swarm leader. When jointly inverting disparate data sets, the optimization solution has to satisfy multiple optimization objectives, at least one for each data set. Unique determination of the most successful particle currently leading the swarm is then not possible. Instead, only statements about the Pareto
AN APPLICATION OF MULTICRITERIA OPTIMIZATION TO THE TWO-CARRIER TWO-SPEED PLANETARY GEAR TRAINS
Directory of Open Access Journals (Sweden)
Jelena Stefanović-Marinović
2017-04-01
Full Text Available The objective of this study is the application of multi-criteria optimization to two-carrier two-speed planetary gear trains. To formulate the mathematical model of the multi-criteria optimization, the variables, objective functions and constraints must be defined. The subject of the paper is two-carrier two-speed planetary gears with brakes on single shafts. Apart from the determination of the set of Pareto optimal solutions, the weighted coefficient method for choosing an optimal solution from this set is also included in the mathematical model.
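The weighted coefficient method named above reduces to normalizing each objective over the Pareto set and ranking the solutions by a weighted sum. A small sketch; the gear-train objective values (volume, power loss) and the equal weights are invented for illustration:

```python
def weighted_choice(front, weights):
    """Choose from a discrete Pareto set by the weighted-coefficient method:
    normalize each (minimized) objective to [0, 1] over the set, then pick
    the solution with the smallest weighted sum."""
    m = len(weights)
    f_min = [min(p[j] for p in front) for j in range(m)]
    f_max = [max(p[j] for p in front) for j in range(m)]
    def score(p):
        return sum(w * (p[j] - f_min[j]) / ((f_max[j] - f_min[j]) or 1.0)
                   for j, w in enumerate(weights))
    return min(front, key=score)

# Hypothetical Pareto set: (volume, power loss), both to be minimized
front = [(10.0, 0.9), (12.0, 0.7), (16.0, 0.5)]
choice = weighted_choice(front, weights=(0.5, 0.5))
```

Normalization matters here: without it, the objective with the larger numeric range would silently dominate the weighted sum regardless of the chosen coefficients.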
The equivalence of multi-criteria methods for radiotherapy plan optimization
International Nuclear Information System (INIS)
Breedveld, Sebastiaan; Storchi, Pascal R M; Heijmen, Ben J M
2009-01-01
Several methods can be used to achieve multi-criteria optimization of radiation therapy treatment planning, all striving for Pareto optimality. Pareto optimality of the solution is desired because it guarantees that no criterion can be improved without deteriorating another. The most widely used methods are the weighted-sum method, in which the different treatment objectives are weighted, and constrained optimization methods, in which treatment goals are set and the algorithm has to find the best plan fulfilling these goals. The constrained method used in this paper, the 2pεc (2-phase ε-constraint) method, is based on the ε-constraint method, which generates Pareto-optimal solutions. The two approaches are uniquely related to each other. In this paper, we show that it is possible to switch from the constrained method to the weighted-sum method by using the Lagrange multipliers from the constrained optimization problem, and vice versa by setting the appropriate constraints. In general, the theory presented in this paper can be useful in cases where a new situation is slightly different from the original one, e.g. in online treatment planning, with deformations of the volumes of interest, or in automated treatment planning, where changes to the automated plan have to be made. An example of the latter is given in which the planner is not satisfied with the result of the constrained method and wishes to decrease the dose in a structure. By using the Lagrange multipliers, a weighted-sum optimization problem is constructed which generates a Pareto-optimal solution in the neighbourhood of the original plan but fulfils the new treatment objectives.
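The ε-constraint idea underlying the method above is easy to sketch: fix a bound ε on one objective, minimize the other, and sweep ε to trace the Pareto front. A toy example on a discretized bi-objective problem (not the paper's treatment-planning formulation, where each subproblem is a full constrained optimization):

```python
def eps_constraint(f1, f2, xs, eps_values):
    """Trace a Pareto front with the eps-constraint method: for each bound
    eps on f1, minimize f2 over the feasible candidates {x : f1(x) <= eps}."""
    front = []
    for eps in eps_values:
        feasible = [x for x in xs if f1(x) <= eps]
        if feasible:
            x_star = min(feasible, key=f2)
            front.append((f1(x_star), f2(x_star)))
    return front

# Toy bi-objective problem: f1 = x^2, f2 = (x - 2)^2 over a grid on [0, 2]
xs = [i / 100 for i in range(201)]
front = eps_constraint(lambda x: x * x, lambda x: (x - 2) ** 2,
                       xs, eps_values=[0.25, 1.0, 2.25, 4.0])
# Each eps yields one Pareto point, at x = 0.5, 1.0, 1.5 and 2.0
```

Sweeping a finer grid of ε values yields a denser front; in continuous settings each subproblem is handed to a constrained solver rather than enumerated on a grid.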
Optimal harvesting for a predator-prey agent-based model using difference equations.
Oremland, Matthew; Laubenbacher, Reinhard
2015-03-01
In this paper, a method known as Pareto optimization is applied in the solution of a multi-objective optimization problem. The system in question is an agent-based model (ABM) wherein global dynamics emerge from local interactions. A system of discrete mathematical equations is formulated in order to capture the dynamics of the ABM; while the original model is built up analytically from the rules of the model, the paper shows how minor changes to the ABM rule set can have a substantial effect on model dynamics. To address this issue, we introduce parameters into the equation model that track such changes. The equation model is amenable to mathematical theory—we show how stability analysis can be performed and validated using ABM data. We then reduce the equation model to a simpler version and implement changes to allow controls from the ABM to be tested using the equations. Cohen's weighted κ is proposed as a measure of similarity between the equation model and the ABM, particularly with respect to the optimization problem. The reduced equation model is used to solve a multi-objective optimization problem via a technique known as Pareto optimization, a heuristic evolutionary algorithm. Results show that the equation model is a good fit for ABM data; Pareto optimization provides a suite of solutions to the multi-objective optimization problem that can be implemented directly in the ABM.
Numerical solution of optimal departure frequency of Taipei TMS
Young, Lih-jier; Chiu, Chin-Hsin
2016-05-01
Route Number 5 (Bannan Line) of the Taipei Mass Rapid Transit (MRT) is the most popular line in the Taipei Metro system, especially during rush-hour periods. It has been estimated that there are more than 8,000 passengers on the ticket platform during 18:00∼19:00 at Taipei Main Station. The purpose of this research is to predict an appropriate train departure frequency for this demand. Monte Carlo simulation is used to optimize the departure frequency according to the passenger information provided by the 22 stations of Route Number 5, i.e. 22 random variables. It is worth mentioning that 30,000 iterations were used to obtain the different samples of the optimized departure frequency, i.e. 10 trains/h, which matches the practical situation.
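A Monte Carlo calculation of this general kind can be sketched as follows: sample hourly platform demand, then pick the smallest departure frequency whose capacity covers the sampled demand with a target probability. All figures below (demand distribution, per-train capacity, coverage target) are illustrative assumptions, not the study's data, so the sketch does not reproduce the paper's 10 trains/h result:

```python
import random

def optimal_frequency(mean_demand, sd, capacity, max_trains=20,
                      target=0.95, iters=30000, seed=1):
    """Monte Carlo sketch: simulate hourly platform demand and return the
    smallest trains-per-hour whose total capacity covers the simulated
    demand in at least `target` of the iterations."""
    rng = random.Random(seed)
    samples = [max(0.0, rng.gauss(mean_demand, sd)) for _ in range(iters)]
    for trains in range(1, max_trains + 1):
        covered = sum(s <= trains * capacity for s in samples) / iters
        if covered >= target:
            return trains
    return max_trains

# Assumed figures: ~8,000 passengers/h on average, 1,500 passengers per train
trains_per_hour = optimal_frequency(mean_demand=8000, sd=1000, capacity=1500)
```

A fuller model would sample demand per station (the study's 22 random variables) and sum the boardings, but the accept/reject structure of the simulation is the same.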
MINLP solution for an optimal isotope separation system
International Nuclear Information System (INIS)
Boisset-Baticle, L.; Latge, C.; Joulia, X.
1994-01-01
This paper deals with the design of cryogenic distillation systems for the separation of hydrogen isotopes in a thermonuclear fusion process. The design must minimize the tritium inventory in the distillation columns and satisfy the separation requirements, which requires optimizing both the structure and the operating conditions of the columns. Such a problem is solved by use of a Mixed-Integer NonLinear Programming (MINLP) tool coupled to a process simulator. The MINLP procedure is based on the iterative, alternating treatment of two subproblems: an NLP problem, solved by a reduced-gradient method, and an MILP problem, solved with a branch-and-bound method coupled to a simplex algorithm. The formulation of the problem and the choice of an appropriate superstructure are detailed here, and results are finally presented concerning the optimal design of a specific isotope separation system. (author)
Pareto Efficient Solution of Attack-Defence Trees
Aslanyan, Zaruhi; Nielson, Flemming
Attack-defence trees are a promising approach for representing threat scenarios and possible countermeasures in a concise and intuitive manner. An attack-defence tree describes the interaction between an attacker and a defender, and is evaluated by assigning parameters to the nodes, such as
A solution to the optimal power flow using multi-verse optimizer
Directory of Open Access Journals (Sweden)
Bachir Bentouati
2016-12-01
Full Text Available In this work, the most common problem of the modern power system, named optimal power flow (OPF), is optimized using the novel meta-heuristic Multi-Verse Optimizer (MVO) algorithm. In order to solve the optimal power flow problem, the IEEE 30-bus and IEEE 57-bus systems are used. MVO is applied to solve the proposed problem. The objectives considered in the OPF problem are fuel cost reduction, voltage profile improvement, and voltage stability enhancement. The obtained results are compared with recently published meta-heuristics. Simulation results clearly reveal the effectiveness and the rapidity of the proposed algorithm for solving the OPF problem.
Simple Machines Forum, a Solution for Dialogue Optimization between Physicians
Directory of Open Access Journals (Sweden)
Laura SÎNGIORZAN
2013-02-01
Full Text Available We developed an instrument which can ensure a quick and easy dialogue between the physicians of the Oncology Institute and family physicians. The platform we chose was Simple Machines Forum (abbreviated SMF), a free Internet forum (BBS - Bulletin Board System) application. The purpose of this article is not to detail the software platform, but to emphasize the facilities and advantages of using this solution in the medical community.
The Fundamental Solution and Its Role in the Optimal Control of Infinite Dimensional Neutral Systems
International Nuclear Information System (INIS)
Liu Kai
2009-01-01
In this work, we shall consider standard optimal control problems for a class of neutral functional differential equations in Banach spaces. As the basis of a systematic theory of neutral models, the fundamental solution is constructed and a variation of constants formula for mild solutions is established. We introduce a class of neutral resolvents and show that the Laplace transform of the fundamental solution is its neutral resolvent operator. Necessary conditions in terms of the solutions of neutral adjoint systems are established to deal with the fixed-time integral convex cost problem of optimality. Based on the optimality conditions, the maximum principle for a time-varying control domain is presented. Finally, the time-optimal control problem to a target set is investigated.
Iterative solution to the optimal poison management problem in pressurized water reactors
International Nuclear Information System (INIS)
Colletti, J.P.; Levine, S.H.; Lewis, J.B.
1983-01-01
A new method for solving the optimal poison management problem for a multiregion pressurized water reactor has been developed. The optimization objective is to maximize the end-of-cycle core excess reactivity for any given beginning-of-cycle fuel loading. The problem is treated as an optimal control problem, with the region burnup and control absorber concentrations acting as the state and control variables, respectively. Constraints are placed on the power peaking, soluble boron concentration, and control absorber concentrations. The solution method consists of successive relinearizations of the system equations, resulting in a sequence of nonlinear programming problems whose solutions converge to the desired optimal control solution. Application of the method to several test problems based on a simplified three-region reactor suggests a bang-bang optimal control strategy, with the peak power location switching between the inner and outer regions of the core and the critical soluble boron concentration kept as low as possible throughout the cycle.
Optimized positioning of autonomous surgical lamps
Teuber, Jörn; Weller, Rene; Kikinis, Ron; Oldhafer, Karl-Jürgen; Lipp, Michael J.; Zachmann, Gabriel
2017-03-01
We consider the problem of automatically finding optimal positions for surgical lamps throughout an entire surgical procedure, under the assumption that future lamps could be robotized. We propose a two-tiered optimization technique for the real-time autonomous positioning of such robotized surgical lamps. Typically, finding optimal positions for surgical lamps is a multi-dimensional problem with several, partly conflicting, objectives, such as optimal lighting conditions at every point in time while minimizing the movement of the lamps in order to avoid distracting the surgeon. Consequently, we use multi-objective optimization (MOO) to find optimal positions in real time during the entire surgery. Due to the conflicting objectives, there is usually not a single optimal solution for such problems, but a set of solutions that forms a Pareto front. When our algorithm selects a solution from this set, it additionally has to consider the individual preferences of the surgeon. This is a highly non-trivial task because the relationship between the solution and the parameters is not obvious. We have developed a novel meta-optimization that addresses exactly this challenge. It delivers an easy-to-understand set of presets for the parameters and allows a balance between lamp movement and lamp obstruction. This meta-optimization can be pre-computed for different kinds of operations and is then used by our online optimization for the selection of the appropriate Pareto solution. Both optimization approaches use data obtained by a depth camera that captures the surgical site as well as the environment around the operating table. We have evaluated our algorithms with data recorded during a real open abdominal surgery; the data is available for scientific purposes. The results show that our meta-optimization produces viable parameter sets for different parts of an intervention, even when trained on only a small portion of it.
International Nuclear Information System (INIS)
Cao, Dingzhou; Murat, Alper; Chinnam, Ratna Babu
2013-01-01
This paper proposes a decomposition-based approach to exactly solve the multi-objective Redundancy Allocation Problem for series-parallel systems. Redundancy allocation problem is a form of reliability optimization and has been the subject of many prior studies. The majority of these earlier studies treat redundancy allocation problem as a single objective problem maximizing the system reliability or minimizing the cost given certain constraints. The few studies that treated redundancy allocation problem as a multi-objective optimization problem relied on meta-heuristic solution approaches. However, meta-heuristic approaches have significant limitations: they do not guarantee that Pareto points are optimal and, more importantly, they may not identify all the Pareto-optimal points. In this paper, we treat redundancy allocation problem as a multi-objective problem, as is typical in practice. We decompose the original problem into several multi-objective sub-problems, efficiently and exactly solve sub-problems, and then systematically combine the solutions. The decomposition-based approach can efficiently generate all the Pareto-optimal solutions for redundancy allocation problems. Experimental results demonstrate the effectiveness and efficiency of the proposed method over meta-heuristic methods on a numerical example taken from the literature.
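The nondominated (Pareto-optimal) points that such an exact approach must enumerate are characterized by a simple dominance relation; a minimal sketch, assuming both objectives (e.g., cost and unreliability) are minimized and using a made-up set of alternatives:

```python
def pareto_front(points):
    """Keep the nondominated points: q dominates p when q is no worse
    in every objective and differs from p (both objectives minimized)."""
    return [p for p in points
            if not any(all(q[i] <= p[i] for i in range(len(p))) and q != p
                       for q in points)]

# toy (cost, unreliability) alternatives
alts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(pareto_front(alts))  # → [(1, 5), (2, 3), (4, 1)]
```

Meta-heuristics return an approximation of this set; the decomposition approach described above guarantees that every point of the exact front is found.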
Particle swarm optimization: an alternative in marine propeller optimization?
Vesting, F.; Bensow, R. E.
2018-01-01
This article deals with improving and evaluating the performance of two evolutionary algorithm approaches for automated engineering design optimization. Here a marine propeller design with constraints on cavitation nuisance is the intended application. For this purpose, the particle swarm optimization (PSO) algorithm is adapted for multi-objective optimization and constraint handling for use in propeller design. Three PSO algorithms are developed and tested for the optimization of four commercial propeller designs for different ship types. The results are evaluated by interrogating the generation medians and the Pareto front development. The same propellers are also optimized utilizing the well established NSGA-II genetic algorithm to provide benchmark results. The authors' PSO algorithms deliver comparable results to NSGA-II, but converge earlier and enhance the solution in terms of constraints violation.
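The article's constrained multi-objective PSO variants are not reproduced here; as an illustration of the underlying update rule they build on, a plain single-objective PSO minimizing a sphere function (all parameter values are generic textbook defaults, not the authors'):

```python
import random

def pso(f, dim, bounds, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain single-objective particle swarm optimization: each particle
    is pulled toward its own best position and the swarm's best."""
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    v = [[0.0] * dim for _ in range(n)]
    pbest = [xi[:] for xi in x]          # personal best positions
    pval = [f(xi) for xi in x]
    gbest = min(pbest, key=f)[:]         # global best position
    gval = f(gbest)
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                v[i][d] = (w * v[i][d]
                           + c1 * rng.random() * (pbest[i][d] - x[i][d])
                           + c2 * rng.random() * (gbest[d] - x[i][d]))
                x[i][d] = min(hi, max(lo, x[i][d] + v[i][d]))
            val = f(x[i])
            if val < pval[i]:
                pbest[i], pval[i] = x[i][:], val
                if val < gval:
                    gbest, gval = x[i][:], val
    return gbest, gval

# minimize the 3-D sphere function; the optimum is 0 at the origin
best, val = pso(lambda p: sum(t * t for t in p), dim=3, bounds=(-5.0, 5.0))
print(val)
```

As the abstract notes, no gradient or initial guess is required, which is what makes the method attractive for propeller design spaces with cavitation constraints.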
A fast method for optimal reactive power flow solution
Energy Technology Data Exchange (ETDEWEB)
Sadasivam, G; Khan, M A [Anna Univ., Madras (IN). Coll. of Engineering
1990-01-01
A fast successive linear programming (SLP) method for minimizing transmission losses and improving the voltage profile is proposed. The method uses the same compactly stored, factorized constant matrices in all the LP steps, both for power flow solution and for constructing the LP model. The inherent oscillatory convergence of SLP methods is overcome by proper selection of initial step sizes and their gradual reduction. Detailed studies on three systems, including a 109-bus system, reveal the fast and reliable convergence property of the method. (author).
International Nuclear Information System (INIS)
Sreepathi, Bhargava Krishna; Rangaiah, G.P.
2015-01-01
Heat exchanger network (HEN) retrofitting improves the energy efficiency of the current process by reducing external utilities. In this work, HEN retrofitting involving streams having variable heat capacity is studied. For this, the enthalpy values of a stream are fitted to a continuous cubic polynomial instead of the stepwise approach employed in previous studies [1,2]. The former methodology is closer to reality, as enthalpy or heat capacity changes gradually rather than in steps. Using the polynomial fitting formulation, single objective optimization (SOO) and multi-objective optimization (MOO) of a HEN retrofit problem are investigated. The results obtained show an improvement in the utility savings, and MOO provides many Pareto-optimal solutions to choose from. Also, Pareto-optimal solutions involving area addition in existing heat exchangers only (but no new exchangers and no structural modifications) are found and provided for comparison with those involving new exchangers and structural modifications as well. - Highlights: • HEN retrofitting involving streams with variable heat capacities is studied. • A continuous approach to handle variable heat capacity is proposed and tested. • Better and practical solutions are obtained for HEN retrofitting in process plants. • Pareto-optimal solutions provide many alternate choices for HEN retrofitting.
Synthetic optimization of air turbine for dental handpieces.
Shi, Z Y; Dong, T
2014-01-01
A synthetic optimization of the Pelton air turbine in dental handpieces, considering power output, compressed air consumption, and rotation speed simultaneously, is implemented by employing a standard design procedure and variable limitations from practical dentistry. The Pareto optimal solution sets acquired by using the Normalized Normal Constraint method are mainly comprised of two piecewise continuous parts. On the Pareto frontier, the supply air stagnation pressure stalls at the lower boundary of the design space; the rotation speed is a constant value within the range recommended in the literature; the blade tip clearance is insensitive; and the nozzle radius increases with power output and the mass flow rate of compressed air, to which the remaining geometric dimensions show an opposite trend within their respective "pieces".
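The Normalized Normal Constraint method adds normal-line constraints to a family of evenly spaced scalar subproblems; as a simpler illustration of the same idea of reducing a MOO to a family of single-objective problems, here is a weighted-sum scalarization on a toy bi-objective problem (all functions and the design grid are hypothetical, not the turbine model):

```python
def scalarize_front(f1, f2, xs, n_weights=11):
    """Trace a bi-objective Pareto front by solving single-objective
    problems min_x  w*f1(x) + (1-w)*f2(x) over a grid of designs xs.
    (Weighted-sum scalarization; NNC instead adds normal-line
    constraints, but the MOO-to-SOO reduction is the same idea.)"""
    front = set()
    for k in range(n_weights):
        w = k / (n_weights - 1)
        x = min(xs, key=lambda x: w * f1(x) + (1 - w) * f2(x))
        front.add((round(f1(x), 6), round(f2(x), 6)))
    return sorted(front)

# toy objectives standing in for, e.g., power output vs. air consumption
xs = [i / 100 for i in range(201)]                 # design grid on [0, 2]
pts = scalarize_front(lambda x: x * x, lambda x: (x - 2) ** 2, xs)
print(pts[0], pts[-1])  # → (0.0, 4.0) (4.0, 0.0)
```

Each weight yields one Pareto point; NNC's advantage over the plain weighted sum is that its points come out evenly distributed even on non-convex fronts.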
Optimal management of night eating syndrome: challenges and solutions
Directory of Open Access Journals (Sweden)
Kucukgoncu S
2015-03-01
Full Text Available Suat Kucukgoncu, Margaretta Midura, Cenk Tek Department of Psychiatry, Yale University, New Haven, CT, USA Abstract: Night Eating Syndrome (NES) is a unique disorder characterized by a delayed pattern of food intake, in which recurrent episodes of nocturnal eating and/or excessive food consumption occur after the evening meal. NES is a clinically important disorder due to its relationship to obesity, its association with other psychiatric disorders, and problems concerning sleep. However, NES often goes unrecognized by both health professionals and patients. The lack of knowledge regarding NES in clinical settings may lead to inadequate diagnoses and inappropriate treatment approaches. Therefore, the proper diagnosis of NES is the most important issue when identifying NES and providing treatment for this disorder. Clinical assessment tools such as the Night Eating Questionnaire may help health professionals working with populations vulnerable to NES. Although NES treatment studies are still in their infancy, antidepressant treatments and psychological therapies can be used for optimal management of patients with NES. Other treatment options such as melatonergic medications, light therapy, and the anticonvulsant topiramate also hold promise as future treatment options. The purpose of this review is to provide a summary of NES, including its diagnosis, comorbidities, and treatment approaches. Possible challenges in addressing patients with NES and management options are also discussed. Keywords: night eating, obesity, psychiatric disorders, weight, depression
On Attainability of Optimal Solutions for Linear Elliptic Equations with Unbounded Coefficients
Directory of Open Access Journals (Sweden)
P. I. Kogut
2011-12-01
Full Text Available We study an optimal boundary control problem (OCP) associated to a linear elliptic equation −div(∇y + A(x)∇y) = f describing diffusion in a turbulent flow. The characteristic feature of this equation is the fact that, in applications, the stream matrix A(x) = [a_ij(x)], i,j = 1,...,N, is skew-symmetric, a_ij(x) = −a_ji(x), measurable, and belongs to L²-space (rather than L^∞). An optimal solution to such a problem can inherit the singular character of the original stream matrix A. We show that optimal solutions can be attained by solutions of special optimal boundary control problems.
Smooth Solutions to Optimal Investment Models with Stochastic Volatilities and Portfolio Constraints
International Nuclear Information System (INIS)
Pham, H.
2002-01-01
This paper deals with an extension of Merton's optimal investment problem to a multidimensional model with stochastic volatility and portfolio constraints. The classical dynamic programming approach leads to a characterization of the value function as a viscosity solution of the highly nonlinear associated Bellman equation. A logarithmic transformation expresses the value function in terms of the solution to a semilinear parabolic equation with quadratic growth on the derivative term. Using a stochastic control representation and some approximations, we prove the existence of a smooth solution to this semilinear equation. An optimal portfolio is shown to exist, and is expressed in terms of the classical solution to this semilinear equation. This reduction is useful for studying numerical schemes for both the value function and the optimal portfolio. We illustrate our results with several examples of stochastic volatility models popular in the financial literature.
OPTIMAL SOLUTIONS FOR IMPLEMENTING THE SUPPLY-SALES CHAIN MANAGEMENT
Directory of Open Access Journals (Sweden)
Elena COFAS
2014-04-01
Full Text Available The supply chain comprises all physical, information, and financial flows linking suppliers and customers. It conveys, on the one hand, the idea of a chain in which the various elements of an industrial production system are interrelated, and, on the other hand, a broad definition of supply (flows between plants, flows between a supplier and a customer, flows between two workstations, etc.). For many enterprise managers, the supply chain is a topic of major interest. By contrast, when the chain is not coordinated, losses may result for the enterprise: obsolete inventory, devaluation, impairment, etc. Since the 1980s, several companies have brought together in a single service all functions dealing with the logistic flow from supply to distribution, through production management and resource planning. At the same time, the notion of "time" was developed to accelerate these flows, increase quality, and reduce inventory. The 1990s promoted the trend of broadening the concept of integrated logistics to a more open organization, the "supply chain", which encompasses the whole organization of the enterprise, designed around streams: sales, distribution, manufacturing, purchasing, and supply. This is the area where, through this work, I try to make a contribution towards finding practical solutions for implementing an efficient supply chain that contributes to increased economic performance of companies.
Optimizing the pathology workstation "cockpit": Challenges and solutions
Directory of Open Access Journals (Sweden)
Elizabeth A Krupinski
2010-01-01
Full Text Available The 21st century has brought numerous changes to the clinical reading (i.e., image or virtual pathology slide interpretation) environment of pathologists, and it will continue to change even more dramatically as information and communication technologies (ICTs) become more widespread in the integrated healthcare enterprise. The extent to which these changes impact the practicing pathologist differs as a function of the technology under consideration, but digital "virtual slides", and the viewing of images on computer monitors instead of glass slides through a microscope, clearly represent a significant change in the way that pathologists extract information from these images and render diagnostic decisions. One of the major challenges facing pathologists in this new era is how best to optimize the pathology workstation, the reading environment, and the new and varied types of information available in order to ensure efficient and accurate processing of this information. Although workstations can be stand-alone units with images imported via external storage devices, this scenario is becoming less common as pathology departments connect to information highways within their hospitals and to external sites. Picture Archiving and Communication Systems are no longer confined to radiology departments but are serving the entire integrated healthcare enterprise, including pathology. In radiology, the workstation is often referred to as the "cockpit" with a "digital dashboard", and the reading room as the "control room." Although pathology has yet to "go digital" to the extent that radiology has, lessons derived from radiology reading "cockpits" can be quite valuable in setting up the digital pathology reading room. In this article, we describe the concept of the digital dashboard and provide some recent examples of informatics-based applications that have been shown to improve workflow and quality in digital reading environments.
A Note on Parameter Estimation in the Composite Weibull–Pareto Distribution
Directory of Open Access Journals (Sweden)
Enrique Calderín-Ojeda
2018-02-01
Full Text Available Composite models have received much attention in the recent actuarial literature for describing heavy-tailed insurance loss data. One of the models that performs well in describing this kind of data is the composite Weibull–Pareto (CWL) distribution. In this note, this distribution is revisited to carry out estimation of parameters via the mle and mle2 optimization functions in R. The results are compared with those obtained in a previous paper using the nlm function, in terms of analytical and graphical methods of model selection. In addition, the consistency of the parameter estimation is examined via a simulation study.
Adaptive surrogate model based multiobjective optimization for coastal aquifer management
Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin
2018-06-01
In this study, a novel surrogate model assisted multiobjective memetic algorithm (SMOMA) is developed for optimal pumping strategies of large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorted genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence in optimization. The surrogate model, based on a Kernel Extreme Learning Machine (KELM), is developed and evaluated as an approximate simulator to generate the patterns of regional groundwater flow and salinity levels in coastal aquifers, reducing a huge computational burden. The KELM model is adaptively trained during the evolutionary search to satisfy the desired surrogate fidelity level, so that it inhibits the accumulation of forecasting error and converges correctly to the true Pareto-optimal front. The proposed methodology is then applied to large-scale coastal aquifer management in Baldwin County, Alabama. The objectives of minimizing the saltwater mass increase and maximizing the total pumping rate in the coastal aquifers are considered. The optimal solutions achieved by the proposed adaptive surrogate model are compared against those obtained from a one-shot surrogate model and from the original simulation model. The adaptive surrogate model not only improves the prediction accuracy of Pareto-optimal solutions compared with the one-shot surrogate model, but also maintains quality of the Pareto-optimal solutions equivalent to those obtained by NSGAII coupled with the original simulation model, while retaining the advantage of surrogate models in reducing the computational burden by up to 94% in time savings. This study shows that the proposed methodology is a computationally efficient and promising tool for the multiobjective optimization of coastal aquifer management.
New numerical methods for open-loop and feedback solutions to dynamic optimization problems
Ghosh, Pradipto
The topic of the first part of this research is trajectory optimization of dynamical systems via computational swarm intelligence. Particle swarm optimization is a nature-inspired heuristic search method that relies on a group of potential solutions to explore the fitness landscape. Conceptually, each particle in the swarm uses its own memory as well as the knowledge accumulated by the entire swarm to iteratively converge on an optimal or near-optimal solution. It is relatively straightforward to implement and unlike gradient-based solvers, does not require an initial guess or continuity in the problem definition. Although particle swarm optimization has been successfully employed in solving static optimization problems, its application in dynamic optimization, as posed in optimal control theory, is still relatively new. In the first half of this thesis particle swarm optimization is used to generate near-optimal solutions to several nontrivial trajectory optimization problems including thrust programming for minimum fuel, multi-burn spacecraft orbit transfer, and computing minimum-time rest-to-rest trajectories for a robotic manipulator. A distinct feature of the particle swarm optimization implementation in this work is the runtime selection of the optimal solution structure. Optimal trajectories are generated by solving instances of constrained nonlinear mixed-integer programming problems with the swarming technique. For each solved optimal programming problem, the particle swarm optimization result is compared with a nearly exact solution found via a direct method using nonlinear programming. Numerical experiments indicate that swarm search can locate solutions to very great accuracy. The second half of this research develops a new extremal-field approach for synthesizing nearly optimal feedback controllers for optimal control and two-player pursuit-evasion games described by general nonlinear differential equations. A notable revelation from this development
Multi-objective optimization of the management of a waterworks using an integrated well field model
DEFF Research Database (Denmark)
Hansen, Annette Kirstine; Bauer-Gottwein, Peter; Rosbjerg, Dan
2012-01-01
of predicting the water level and the energy consumption of the individual production wells. The model has been applied to Søndersø waterworks in Denmark, where it predicts the energy consumption within 1.8% of the observed. The objectives of the optimization problem are to minimize the specific energy...... provides the decision-makers with compromise solutions between the two competing objectives. In the test case the Pareto optimal solutions are compared with an exhaustive benchmark solution. It is shown that the energy consumption can be reduced by 4% by changing the pumping configuration without violating...
Energy Technology Data Exchange (ETDEWEB)
Giantsoudi, Drosoula, E-mail: dgiantsoudi@partners.org [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States); Grassberger, Clemens; Craft, David; Niemierko, Andrzej; Trofimov, Alexei; Paganetti, Harald [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, Massachusetts (United States)
2013-09-01
Purpose: To investigate the feasibility and potential clinical benefit of linear energy transfer (LET) guided plan optimization in intensity modulated proton therapy (IMPT). Methods and Materials: A multicriteria optimization (MCO) module was used to generate a series of Pareto-optimal IMPT base plans (BPs), corresponding to defined objectives, for 5 patients with head-and-neck cancer and 2 with pancreatic cancer. A Monte Carlo platform was used to calculate dose and LET distributions for each BP. A custom-designed MCO navigation module allowed the user to interpolate between BPs to produce deliverable Pareto-optimal solutions. Differences among the BPs were evaluated for each patient, based on dose–volume and LET–volume histograms and 3-dimensional distributions. An LET-based relative biological effectiveness (RBE) model was used to evaluate the potential clinical benefit when navigating the space of Pareto-optimal BPs. Results: The mean LET values for the target varied up to 30% among the BPs for the head-and-neck patients and up to 14% for the pancreatic cancer patients. Variations were more prominent in organs at risk (OARs), where mean LET values differed by a factor of up to 2 among the BPs for the same patient. An inverse relation between dose and LET distributions for the OARs was typically observed. Accounting for LET-dependent variable RBE values, a potential improvement on RBE-weighted dose of up to 40%, averaged over several structures under study, was noticed during MCO navigation. Conclusions: We present a novel strategy for optimizing proton therapy to maximize dose-averaged LET in tumor targets while simultaneously minimizing dose-averaged LET in normal tissue structures. MCO BPs show substantial LET variations, leading to potentially significant differences in RBE-weighted doses. Pareto-surface navigation, using both dose and LET distributions for guidance, provides the means for evaluating a large variety of deliverable plans and aids in
A Pareto upper tail for capital income distribution
Oancea, Bogdan; Pirjol, Dan; Andrei, Tudorel
2018-02-01
We present a study of the capital income distribution and of its contribution to the total income (capital income share) using individual tax income data in Romania, for 2013 and 2014. Using a parametric representation we show that the capital income is Pareto distributed in the upper tail, with a Pareto coefficient α ∼ 1.44, which is much smaller than the corresponding coefficient for wage and non-wage income (excluding capital income), of α ∼ 2.53. Including the capital income contribution has the effect of increasing the overall inequality measures.
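A tail coefficient like the α above is commonly obtained by maximum likelihood on the observations exceeding a threshold (the Hill estimator); a sketch on synthetic Pareto data, not the Romanian tax records:

```python
import math
import random

def pareto_alpha_mle(sample, x_min):
    """Maximum-likelihood (Hill) estimate of the Pareto tail index:
    alpha_hat = n / sum(log(x_i / x_min)) over observations x_i >= x_min."""
    tail = [x for x in sample if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# synthetic Pareto(alpha = 1.44) incomes via inverse-transform sampling:
# if U is uniform on (0, 1], then x_min * U**(-1/alpha) is Pareto
rng = random.Random(7)
alpha_true, x_min = 1.44, 1.0
sample = [x_min * (1 - rng.random()) ** (-1 / alpha_true)
          for _ in range(50000)]
print(pareto_alpha_mle(sample, x_min))
```

With 50,000 draws the estimate lands close to the true α = 1.44; on real income data, choosing the threshold x_min where the Pareto regime begins is the delicate step.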
Designing Pareto-superior demand-response rate options
International Nuclear Information System (INIS)
Horowitz, I.; Woo, C.K.
2006-01-01
We explore three voluntary service options-real-time pricing, time-of-use pricing, and curtailable/interruptible service-that a local distribution company might offer its customers in order to encourage them to alter their electricity usage in response to changes in the electricity-spot-market price. These options are simple and practical, and make minimal information demands. We show that each of the options is Pareto-superior ex ante, in that it benefits both the participants and the company offering it, while not affecting the non-participants. The options are shown to be Pareto-superior ex post as well, except under certain exceptional circumstances. (author)
Pareto-Zipf law in growing systems with multiplicative interactions
Ohtsuki, Toshiya; Tanimoto, Satoshi; Sekiyama, Makoto; Fujihara, Akihiro; Yamamoto, Hiroshi
2018-06-01
Numerical simulations of multiplicatively interacting stochastic processes with weighted selections were conducted. A feedback mechanism to control the weight w of selections was proposed. It becomes evident that when w is moderately controlled around 0, such systems spontaneously exhibit the Pareto-Zipf distribution. The simulation results are universal in the sense that microscopic details, such as parameter values and the type of control and weight, are irrelevant. The central ingredient of the Pareto-Zipf law is argued to be the mild control of interactions.
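A standard toy mechanism behind such results (not the authors' weighted-selection model, which is more elaborate) is multiplicative growth with a reflecting lower bound: the stationary distribution then has a Pareto tail with exponent 2|μ|/σ², so choosing μ and σ to make the exponent 1 yields a Zipf tail. A hedged sketch with hypothetical parameters:

```python
import numpy as np

# Toy mechanism: each agent's size is hit by i.i.d. multiplicative shocks with
# negative log-drift mu and volatility sigma, and is reflected at a lower bound
# of 1. The stationary law is Pareto with tail exponent 2*|mu|/sigma**2;
# mu = -0.005, sigma = 0.1 gives exponent 1, i.e. a Zipf tail.
rng = np.random.default_rng(1)
n_agents, n_steps = 20_000, 4_000
x = np.ones(n_agents)
for _ in range(n_steps):
    x *= rng.lognormal(mean=-0.005, sigma=0.1, size=n_agents)  # multiplicative shock
    np.maximum(x, 1.0, out=x)                                  # reflecting lower bound

ranked = np.sort(x)[::-1]  # rank-size view: the upper tail is ~linear on log-log axes
```

The heavy tail emerges for a wide range of (μ, σ), echoing the paper's observation that microscopic details are largely irrelevant.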
Multi-objective optimization of glycopeptide antibiotic production in batch and fed batch processes
DEFF Research Database (Denmark)
Maiti, Soumen K.; Eliasson Lantz, Anna; Bhushan, Mani
2011-01-01
batch operations using a process model for Amycolatopsis balhimycina, a glycopeptide antibiotic producer. This resulted in a set of several Pareto optimal solutions, with the two objectives ranging from (0.75 g l−1, 3.97 g $−1) to (0.44 g l−1, 5.19 g $−1) for batch and from (1.5 g l−1, 5.46 g $−1) to (1.1 g l−1, 6.34 g...
Frenklach, Michael; Wang, Hai; Rabinowitz, Martin J.
1992-01-01
A method of systematic optimization, solution mapping, as applied to a large-scale dynamic model is presented. The basis of the technique is parameterization of model responses in terms of model parameters by simple algebraic expressions. These expressions are obtained by computer experiments arranged in a factorial design. The developed parameterized responses are then used in a joint multiparameter multidata-set optimization. A brief review of the mathematical background of the technique is given. The concept of active parameters is discussed. The technique is applied to determine an optimum set of parameters for a methane combustion mechanism. Five independent responses - comprising ignition delay times, pre-ignition methyl radical concentration profiles, and laminar premixed flame velocities - were optimized with respect to thirteen reaction rate parameters. The numerical predictions of the optimized model are compared to those computed with several recent literature mechanisms. The utility of the solution mapping technique in situations where the optimum is not unique is also demonstrated.
International Nuclear Information System (INIS)
Zhang, Enze; Wu, Yifei; Chen, Qingwei
2014-01-01
This paper proposes a practical approach, combining bare-bones particle swarm optimization and sensitivity-based clustering, for solving multi-objective reliability redundancy allocation problems (RAPs). A two-stage process is performed to identify promising solutions. Specifically, a new bare-bones multi-objective particle swarm optimization algorithm (BBMOPSO) is developed and applied in the first stage to identify a Pareto-optimal set. This algorithm mainly differs from other multi-objective particle swarm optimization algorithms in its parameter-free particle updating strategy, which is especially suitable for handling the complexity and nonlinearity of RAPs. Moreover, by utilizing an approach based on the adaptive grid to update the global particle leaders, a mutation operator to improve the exploration ability and an effective constraint handling strategy, the integrated BBMOPSO algorithm can generate an excellent approximation of the true Pareto-optimal front for RAPs. This is followed by a data clustering technique based on difference sensitivity in the second stage to prune the obtained Pareto-optimal set and obtain a small, workable sized set of promising solutions for system implementation. Two illustrative examples are presented to show the feasibility and effectiveness of the proposed approach.
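The Pareto-optimal set that algorithms such as BBMOPSO approximate is defined by the dominance relation. A minimal reference implementation of dominance and non-dominated filtering (illustrative only, minimization convention assumed):

```python
import numpy as np

def dominates(a, b):
    """True if a Pareto-dominates b under minimization:
    a is no worse in every objective and strictly better in at least one."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    pts = np.asarray(points, dtype=float)
    keep = [i for i, p in enumerate(pts)
            if not any(dominates(q, p) for j, q in enumerate(pts) if j != i)]
    return pts[keep]

front = pareto_front([[1, 5], [2, 3], [4, 1], [3, 4], [5, 5]])
# front holds the three trade-off points; [3, 4] and [5, 5] are dominated
```

This brute-force filter is O(n²) in the archive size; production multi-objective optimizers use faster bookkeeping, but the definition is the same.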
Pareto Distribution of Firm Size and Knowledge Spillover Process as a Network
Tomohiko Konno
2013-01-01
The firm size distribution is considered to follow a Pareto distribution. In the present paper, we show that the Pareto distribution of firm size results from the spillover network model introduced in Konno (2010).
Optimization of the recycling process of precipitation barren solution in a uranium mine
International Nuclear Information System (INIS)
Long Qing; Yu Suqin; Zhao Wucheng; Han Wei; Zhang Hui; Chen Shuangxi
2014-01-01
An alkaline leaching process was adopted to recover uranium from ores in a uranium mine, and a high concentration uranium solution, later used in precipitation, was obtained after ion-exchange and elution steps. The eluting agent consisted of NaCl and NaHCO3. Though the precipitation barren solution contained as much as 80 g/L Na2CO3, it still could not be recycled due to the presence of a high Cl− concentration. So, both the elution and precipitation processes were optimized in order to control the Cl− concentration in the precipitation barren solution within the recyclable concentration range. Because the precipitation barren solution can be recycled after optimization, agent consumption was lowered and the discharge of waste water was reduced. (authors)
International Nuclear Information System (INIS)
Sayyaadi, Hoseyn; Babaie, Meisam; Farmani, Mohammad Reza
2011-01-01
Multi-objective optimization for the design of a benchmark cogeneration system, namely the CGAM cogeneration system, is performed. In the optimization approach, exergetic, exergoeconomic and environmental objectives are considered simultaneously. In this regard, the set of Pareto optimal solutions known as the Pareto frontier is obtained using the MOPSO (multi-objective particle swarm optimizer). The exergetic efficiency as an exergetic objective is maximized, while the unit cost of the system product and the cost of the environmental impact, respectively as exergoeconomic and environmental objectives, are minimized. The economic model utilized in the exergoeconomic analysis is built based on both a simple model (used in the original research on the CGAM system) and comprehensive modeling, namely the TTR (total revenue requirement) method (used in sophisticated exergoeconomic analysis). Finally, a final optimal solution from the optimal set of the Pareto frontier is selected using a fuzzy decision-making process based on the Bellman-Zadeh approach, and the results are compared with the corresponding results obtained in a traditional decision-making process. Further, results are compared with the corresponding performance of the base case CGAM system and with optimal designs of previous works, and discussed. -- Highlights: → A multi-objective optimization approach has been implemented in the optimization of a benchmark cogeneration system. → Objective functions based on environmental impact evaluation, thermodynamic and economic analysis are obtained and optimized. → A particle swarm optimizer is implemented and its robustness is compared with NSGA-II. → A final optimal configuration is found using various decision-making approaches. → Results are compared with previous works in the field.
International Nuclear Information System (INIS)
Feng, Yongqiang; Zhang, Yaning; Li, Bingxi; Yang, Jinfu; Shi, Yang
2015-01-01
Highlights: • The thermoeconomic comparison of regenerative RORC and BORC is investigated. • The bi-objective Pareto frontier solution is compared with the corresponding single-objective solutions. • The three-objective optimization of the RORC and BORC is studied. • The RORC yields 8.1% higher exergy efficiency and 21.1% higher LEC than the BORC under the Pareto-optimal solution. - Abstract: Based on thermoeconomic multi-objective optimization using the non-dominated sorting genetic algorithm (NSGA-II), considering both thermodynamic performance and economic factors, the thermoeconomic comparison of regenerative organic Rankine cycles (RORC) and basic organic Rankine cycles (BORC) is investigated. The effects of five key parameters, including evaporator outlet temperature, condenser temperature, degree of superheat, pinch point temperature difference and degree of supercooling, on the exergy efficiency and levelized energy cost (LEC) are examined. Meanwhile, the Pareto frontier solution with the bi-objective of maximizing exergy efficiency and minimizing LEC is obtained and compared with the corresponding single-objective solutions. The research demonstrates a significant negative correlation between thermodynamic performance and economic factors. The optimum exergy efficiency and LEC for the Pareto-optimal solution of the RORC are 55.97% and 0.142 $/kW h, respectively, which are 8.1% higher and 21.1% higher than those of the BORC under the considered conditions. The highest exergy and thermal efficiencies are accompanied by the lowest net power output and the worst economic performance. Furthermore, taking the net power output into account, a detailed investigation of the three-objective optimization for maximizing exergy efficiency, maximizing net power output and minimizing LEC is discussed.
Doerr, Timothy; Alves, Gelio; Yu, Yi-Kuo
2006-03-01
Typical combinatorial optimizations are NP-hard; however, for a particular class of cost functions the corresponding combinatorial optimizations can be solved in polynomial time. This suggests a way to efficiently find approximate solutions: find a transformation that makes the cost function as similar as possible to that of the solvable class. After keeping many high-ranking solutions under the approximate cost function, one may then re-assess these solutions with the full cost function to find the best approximate solution. Under this approach, it is important to be able to assess the quality of the solutions obtained, e.g., by finding the true ranking of the kth best approximate solution when all possible solutions are considered exhaustively. To tackle this statistical issue, we provide a systematic method starting with a scaling function generated from the finite number of high-ranking solutions, followed by a convergent iterative mapping. This method, useful in a variant of the directed paths in random media problem proposed here, can also provide a statistical significance assessment for one of the most important proteomic tasks: peptide sequencing using tandem mass spectrometry data.
Multi-objective optimization of inverse planning for accurate radiotherapy
International Nuclear Information System (INIS)
Cao Ruifen; Pei Xi; Cheng Mengyun; Li Gui; Hu Liqin; Wu Yican; Jing Jia; Li Guoli
2011-01-01
The multi-objective optimization of inverse planning based on the Pareto solution set, according to the multi-objective character of inverse planning in accurate radiotherapy, was studied in this paper. Firstly, the clinical requirements of a treatment plan were transformed into a multi-objective optimization problem with multiple constraints. Then, the fast and elitist multi-objective Non-dominated Sorting Genetic Algorithm (NSGA-II) was introduced to optimize the problem. A clinical example was tested using this method. The results show that the obtained set of non-dominated solutions was uniformly distributed and that the corresponding dose distribution of each solution not only approached the expected dose distribution, but also met the dose-volume constraints. This indicates that the clinical requirements were better satisfied using this method and that the planner can select the optimal treatment plan from the non-dominated solution set. (authors)
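NSGA-II, referenced above, keeps its non-dominated solutions well spread via the crowding distance. A compact sketch of that computation (illustrative, not the planning system's code):

```python
import numpy as np

def crowding_distance(front):
    """NSGA-II crowding distance for one non-dominated front
    (rows = solutions, columns = objectives)."""
    f = np.asarray(front, dtype=float)
    n, m = f.shape
    d = np.zeros(n)
    for j in range(m):
        order = np.argsort(f[:, j])
        d[order[0]] = d[order[-1]] = np.inf      # boundary points always survive
        span = f[order[-1], j] - f[order[0], j]
        if span == 0.0:
            continue
        for k in range(1, n - 1):                # interior points: normalized gap to neighbours
            d[order[k]] += (f[order[k + 1], j] - f[order[k - 1], j]) / span
    return d

d = crowding_distance([[1, 5], [2, 3], [4, 1]])
```

Within a front, solutions with larger crowding distance are preferred during selection, which is what produces the uniformly distributed solution sets reported here.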
A Pareto scale-inflated outlier model and its Bayesian analysis
Scollnik, David P. M.
2016-01-01
This paper develops a Pareto scale-inflated outlier model. This model is intended for use when data from some standard Pareto distribution of interest is suspected to have been contaminated with a relatively small number of outliers from a Pareto distribution with the same shape parameter but with an inflated scale parameter. The Bayesian analysis of this Pareto scale-inflated outlier model is considered and its implementation using the Gibbs sampler is discussed. The paper contains three wor...
Asymptotic Method of Solution for a Problem of Construction of Optimal Gas-Lift Process Modes
Directory of Open Access Journals (Sweden)
Fikrat A. Aliev
2010-01-01
Full Text Available A mathematical model of oil extraction by the gas-lift method is considered for the case when the reciprocal of the well's depth is a small parameter. The problem of optimal mode construction (i.e., construction of optimal program trajectories and controls) is reduced to a linear-quadratic optimal control problem with a small parameter. Analytic formulae for determining the solutions at the first-order approximation with respect to the small parameter are obtained. A comparison of the obtained results with known ones on a specific example is provided, which makes it possible, in particular, to use the obtained results in oil extraction problems solved by the gas-lift method.
Multi-agent Pareto appointment exchanging in hospital patient scheduling
I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)
2007-01-01
We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment
Multi-agent Pareto appointment exchanging in hospital patient scheduling
Vermeulen, I.B.; Bohté, S.M.; Somefun, D.J.A.; Poutré, La J.A.
2007-01-01
We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment exchanging algorithm:
Word frequencies: A comparison of Pareto type distributions
Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng
2018-03-01
Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred live languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.
Robustness analysis of bogie suspension components Pareto optimised values
Mousavi Bideleh, Seyed Milad
2017-08-01
The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of a bogie's dynamic response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamic response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters, and the probability of failure is small for parameter uncertainties with COV up to 0.1.
An Evolutionary Efficiency Alternative to the Notion of Pareto Efficiency
I.P. van Staveren (Irene)
2012-01-01
The paper argues that the notion of Pareto efficiency builds on two normative assumptions: the more general consequentialist norm of any efficiency criterion, and the strong no-harm principle of the prohibition of any redistribution during the economic process that hurts at least one
Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing
Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.
2006-01-01
The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval
Tsallis-Pareto like distributions in hadron-hadron collisions
International Nuclear Information System (INIS)
Barnafoeldi, G G; Uermoessy, K; Biro, T S
2011-01-01
Non-extensive thermodynamics is a novel approach in high energy physics. In high-energy heavy-ion, and especially in proton-proton collisions, we are far from a canonical thermal state described by Boltzmann-Gibbs statistics. In these reactions, low and intermediate transverse momentum spectra are extremely well reproduced by the Tsallis-Pareto distribution, but the physical origin of the Tsallis parameters is still an unsettled question. Here, we analyze whether the Tsallis-Pareto energy distribution overlaps with hadron spectra at high pT. We fitted data measured in proton-proton (proton-antiproton) collisions over a wide center-of-mass energy range, from 200 GeV RHIC up to 7 TeV LHC energies. Furthermore, our test is extended to an investigation of a possible √s-dependence of the power in the Tsallis-Pareto distribution, motivated by QCD evolution equations. We found that Tsallis-Pareto distributions fit the high-pT data well over the wide center-of-mass energy range. Deviations from the fits appear at pT > 20-30 GeV/c, especially in the CDF data. Introducing a pT-scaling ansatz, the fits at low and intermediate transverse momenta still remain good, and the deviations tend to disappear at the highest pT.
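One common parameterization of the Tsallis-Pareto spectrum is f(pT) = A(1 + pT/(nT))^(-n); several equivalent forms circulate in the literature. A hedged sketch of fitting it to a synthetic spectrum (parameters are hypothetical, SciPy is assumed available):

```python
import numpy as np
from scipy.optimize import curve_fit

def tsallis_pareto(pt, A, n, T):
    """One common Tsallis-Pareto parameterization of pT spectra:
    f(pT) = A * (1 + pT / (n*T))**(-n)."""
    return A * (1.0 + pt / (n * T)) ** (-n)

# Synthetic, noiseless 'spectrum' with hypothetical parameters (illustration only)
pt = np.linspace(0.5, 10.0, 40)
y = tsallis_pareto(pt, A=100.0, n=7.0, T=0.15)

popt, _ = curve_fit(tsallis_pareto, pt, y, p0=[50.0, 5.0, 0.1],
                    bounds=(1e-6, np.inf))
```

For large n the form approaches a Boltzmann exponential exp(-pT/T), which is why the power n (and its possible √s-dependence) is the physically interesting fit parameter.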
Energy Technology Data Exchange (ETDEWEB)
Anon.
2011-09-15
The l'Oreal CAP site has inaugurated its energy eco-efficiency installations, realized by EDF Optimal Solutions. This solution combines several techniques and makes it possible to halve the site's yearly CO2 releases. (O.M.)
Directory of Open Access Journals (Sweden)
S. Das
2013-12-01
Full Text Available In this article, the optimal homotopy-analysis method is used to obtain an approximate analytic solution of the time-fractional diffusion equation with a given initial condition. The fractional derivatives are considered in the Caputo sense. Unlike the usual homotopy analysis method, this method contains at most three convergence control parameters, which govern the faster convergence of the solution. The effects of the parameters on the convergence of the approximate series solution, obtained by minimizing the averaged residual error with proper choices of the parameters, are calculated numerically and presented through graphs and tables for different particular cases.
Directory of Open Access Journals (Sweden)
Jinmyoung Seok
2015-07-01
Full Text Available In this article, we are interested in singularly perturbed nonlinear elliptic problems involving a fractional Laplacian. Under a class of nonlinearity which is believed to be almost optimal, we construct a positive solution which exhibits multiple spikes near any given local minimum components of an exterior potential of the problem.
CSIR Research Space (South Africa)
Greeff, M
2010-09-01
Full Text Available Decision making - with the goal of finding the optimal solution - is an important part of modern life. For example: In the control room of an airport, the goals or objectives are to minimise the risk of airplanes colliding, minimise the time that a...
K-maps: a vehicle to an optimal solution in combinational logic ...
African Journals Online (AJOL)
K-maps: a vehicle to an optimal solution in combinational logic design problems using digital multiplexers. ... Abstract. Application of Karnaugh maps (K-Maps) for the design of combinational logic circuits and sequential logic circuits is a subject that has been widely discussed. However, the use of K-Maps in the design of ...
Tax solutions for optimal reduction of tobacco use in West Africa ...
International Development Research Centre (IDRC) Digital Library (Canada)
Tax solutions for optimal reduction of tobacco use in West Africa. During the first phase of this project, numerous decision-makers were engaged and involved in discussions with the goal of establishing a new taxation system to reduce tobacco use in West Africa. Although regional economic authorities (ECOWAS and ...
Efficient Solutions and Cost-Optimal Analysis for Existing School Buildings
Directory of Open Access Journals (Sweden)
Paolo Maria Congedo
2016-10-01
Full Text Available The recast of the energy performance of buildings directive (EPBD describes a comparative methodological framework to promote energy efficiency and establish minimum energy performance requirements in buildings at the lowest costs. The aim of the cost-optimal methodology is to foster the achievement of nearly zero energy buildings (nZEBs, the new target for all new buildings by 2020, characterized by a high performance with a low energy requirement almost covered by renewable sources. The paper presents the results of the application of the cost-optimal methodology in two existing buildings located in the Mediterranean area. These buildings are a kindergarten and a nursery school that differ in construction period, materials and systems. Several combinations of measures have been applied to derive cost-effective efficient solutions for retrofitting. The cost-optimal level has been identified for each building and the best performing solutions have been selected considering both a financial and a macroeconomic analysis. The results illustrate the suitability of the methodology to assess cost-optimality and energy efficiency in school building refurbishment. The research shows the variants providing the most cost-effective balance between costs and energy saving. The cost-optimal solution reduces primary energy consumption by 85% and gas emissions by 82%–83% in each reference building.
Directory of Open Access Journals (Sweden)
Ben Minnaert
2017-09-01
Full Text Available Wireless power transfer from one transmitter to multiple receivers through inductive coupling is slowly entering the market. However, for certain applications, capacitive wireless power transfer (CWPT) using electric coupling might be preferable. In this work, we determine closed-form expressions for a CWPT system with one transmitter and two receivers. We determine the optimal solution for two design requirements: (i) maximum power transfer, and (ii) maximum system efficiency. We derive the optimal loads and provide the analytical expressions for the efficiency and power. We show that the optimal load conductances for the maximum power configuration are always larger than for the maximum efficiency configuration. Furthermore, it is demonstrated that if the receivers are coupled, this can be compensated for by introducing susceptances that have the same value for both configurations. Finally, we numerically verify our results. We illustrate the similarities to the inductive wireless power transfer (IWPT) solution and find that the same, but dual, expressions apply.
Regulation of Renewable Energy Sources to Optimal Power Flow Solutions Using ADMM: Preprint
Energy Technology Data Exchange (ETDEWEB)
Zhang, Yijian; Hong, Mingyi; Dall' Anese, Emiliano; Dhople, Sairaj; Xu, Zi
2017-03-03
This paper considers power distribution systems featuring renewable energy sources (RESs), and develops a distributed optimization method to steer the RES output powers to solutions of AC optimal power flow (OPF) problems. The design of the proposed method leverages suitable linear approximations of the AC-power flow equations, and is based on the Alternating Direction Method of Multipliers (ADMM). Convergence of the RES-inverter output powers to solutions of the OPF problem is established under suitable conditions on the stepsize as well as mismatches between the commanded setpoints and actual RES output powers. In a broad sense, the methods and results proposed here are also applicable to other distributed optimization problem setups with ADMM and inexact dual updates.
International Nuclear Information System (INIS)
Chen, Gonggui; Liu, Lilan; Song, Peizhu; Du, Yangwei
2014-01-01
Highlights: • New method for MOORPD problem using MOCIPSO and MOIPSO approaches. • Constrain-prior Pareto-dominance method is proposed to meet the constraints. • The limits of the apparent power flow of transmission line are considered. • MOORPD model is built up for MOORPD problem. • The achieved results by MOCIPSO and MOIPSO approaches are better than MOPSO method. - Abstract: Multi-objective optimal reactive power dispatch (MOORPD) seeks to not only minimize power losses, but also improve the stability of power system simultaneously. In this paper, the static voltage stability enhancement is achieved through incorporating L index in MOORPD problem. Chaotic improved PSO-based multi-objective optimization (MOCIPSO) and improved PSO-based multi-objective optimization (MOIPSO) approaches are proposed for solving complex multi-objective, mixed integer nonlinear problems such as minimization of power losses and L index in power systems simultaneously. In MOCIPSO and MOIPSO based optimization approaches, crossover operator is proposed to enhance PSO diversity and improve their global searching capability, and for MOCIPSO based optimization approach, chaotic sequences based on logistic map instead of random sequences is introduced to PSO for enhancing exploitation capability. In the two approaches, constrain-prior Pareto-dominance method (CPM) is proposed to meet the inequality constraints on state variables, the sorting and crowding distance methods are considered to maintain a well distributed Pareto optimal solutions, and moreover, fuzzy set theory is employed to extract the best compromise solution over the Pareto optimal curve. The proposed approaches have been examined and tested in the IEEE 30 bus and the IEEE 57 bus power systems. The performances of MOCIPSO, MOIPSO, and multi-objective PSO (MOPSO) approaches are compared with respect to multi-objective performance measures. The simulation results are promising and confirm the ability of MOCIPSO and
Directory of Open Access Journals (Sweden)
Ion LUNGU
2012-01-01
Full Text Available In this paper, we research, analyze and develop optimization solutions for the parallel reduction function using graphics processing units (GPUs) that implement the Compute Unified Device Architecture (CUDA), a modern and novel approach for improving the software performance of data processing applications and algorithms. Many of these applications and algorithms make use of the reduction function in their computational steps. After having designed the function and its algorithmic steps in CUDA, we have progressively developed and implemented optimization solutions for the reduction function. In order to confirm, test and evaluate the solutions' efficiency, we have developed a custom tailored benchmark suite. We have analyzed the obtained experimental results regarding: the comparison of the execution time and bandwidth when using graphics processing units covering the main CUDA architectures (Tesla GT200, Fermi GF100, Kepler GK104) and a central processing unit; the influence of the data type; and the influence of the binary operator.
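The GPU reduction kernels studied in such work combine pairs of elements a stride apart, halving the active set each pass so that n values reduce in about log2(n) passes. A sequential Python emulation of that access pattern (illustrative only, not CUDA code and not the authors' implementation):

```python
import numpy as np

def tree_reduce(values, op=np.add):
    """Sequential emulation of the GPU tree-reduction access pattern:
    each pass combines elements a power-of-two stride apart, halving the
    active set, so n values reduce in ~log2(n) passes."""
    buf = np.array(list(values), dtype=float)
    n, stride = buf.size, 1
    while stride < n:
        idx = np.arange(0, n - stride, 2 * stride)   # "threads" active in this pass
        buf[idx] = op(buf[idx], buf[idx + stride])
        stride *= 2
    return buf[0]

total = tree_reduce(range(1, 9))  # 1 + 2 + ... + 8 = 36.0
```

On an actual GPU each pass maps to concurrent threads over shared memory, and the optimizations the paper benchmarks (memory coalescing, unrolling, operator choice) refine exactly this pattern.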
Monopoly, Pareto and Ramsey mark-ups
Ten Raa, T.
2009-01-01
Monopoly prices are too high. It is a price level problem, in the sense that the relative mark-ups have Ramsey optimal proportions, at least for independent constant elasticity demands. I show that this feature of monopoly prices breaks down the moment one demand is replaced by the textbook linear
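The "price level, not relative prices" point can be checked in a few lines for independent constant-elasticity demands, where the monopoly Lerner index is 1/ε_i while Ramsey pricing gives λ/ε_i for a common scalar λ (the elasticities and λ below are hypothetical):

```python
# Constant-elasticity demands: the monopoly Lerner index is (p - c)/p = 1/eps_i,
# while Ramsey pricing gives (p - c)/p = lam/eps_i for a common scalar lam.
# Relative mark-ups across goods therefore coincide; only the level differs.
eps = [2.0, 3.0, 5.0]                      # hypothetical demand elasticities
monopoly = [1.0 / e for e in eps]          # monopoly Lerner indices
lam = 0.4                                  # hypothetical Ramsey scale factor
ramsey = [lam / e for e in eps]            # Ramsey Lerner indices
ratios = [m / r for m, r in zip(monopoly, ramsey)]   # constant: 1/lam for every good
```

Every good's monopoly mark-up exceeds its Ramsey mark-up by the same factor 1/λ, which is the sense in which monopoly pricing is a price level problem.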
Optimal power system generation scheduling by multi-objective genetic algorithms with preferences
International Nuclear Information System (INIS)
Zio, E.; Baraldi, P.; Pedroni, N.
2009-01-01
Power system generation scheduling is an important issue from both the economic and environmental safety viewpoints. The scheduling involves decisions with regard to the units' start-up and shut-down times and to the assignment of the load demands to the committed generating units for minimizing the system operation costs and the emission of atmospheric pollutants. As many other real-world engineering problems, power system generation scheduling involves multiple, conflicting optimization criteria for which there exists no single best solution with respect to all criteria considered. Multi-objective optimization algorithms, based on the principle of Pareto optimality, can then be designed to search for the set of nondominated scheduling solutions from which the decision-maker (DM) must a posteriori choose the preferred alternative. Often, however, information is available a priori regarding the preference values of the DM with respect to the objectives. When possible, it is important to exploit this information during the search so as to focus it on the region of preference of the Pareto-optimal set. In this paper, ways are explored to use this preference information for driving a multi-objective genetic algorithm towards the preferential region of the Pareto-optimal front. Two methods are considered: the first one extends the concept of Pareto dominance by biasing the chromosome replacement step of the algorithm by means of numerical weights that express the DM's preferences; the second one drives the search algorithm by changing the shape of the dominance region according to linear trade-off functions specified by the DM. The effectiveness of the proposed approaches is first compared on a case study from the literature. Then, a nonlinear, constrained, two-objective power generation scheduling problem is effectively tackled
Directory of Open Access Journals (Sweden)
Martin M Gossner
Full Text Available There is a great demand for standardising biodiversity assessments in order to allow optimal comparison across research groups. For invertebrates, pitfall or flight-interception traps are commonly used, but the sampling solution differs widely between studies, which could influence the communities collected and affect sample processing (morphological or genetic). We assessed arthropod communities with flight-interception traps using three commonly used sampling solutions across two forest types and two vertical strata. We first considered the effect of sampling solution and its interaction with forest type, vertical stratum, and position of the sampling jar at the trap on sample condition and community composition. We found that samples collected in copper sulphate were more mouldy and fragmented relative to other solutions, which might impair morphological identification, but condition depended on forest type, trap type and the position of the jar. Community composition, based on order-level identification, did not differ across sampling solutions and only varied with forest type and vertical stratum. Species richness and species-level community composition, however, differed greatly among sampling solutions. Renner solution was highly attractive to beetles and repellent to true bugs. Secondly, we tested whether sampling solution affects subsequent molecular analyses and found that DNA barcoding success was species-specific. Samples from copper sulphate produced the fewest successful DNA sequences for genetic identification, and since DNA yield or quality was not particularly reduced in these samples, additional interactions between the solution and DNA must also be occurring. Our results show that the choice of sampling solution should be an important consideration in biodiversity studies. Due to the potential bias towards or against certain species by ethanol-containing sampling solutions, we suggest ethylene glycol as a suitable sampling solution when
Horst, Reto; Wüthrich, Kurt
2015-07-20
Reconstitution of integral membrane proteins (IMP) in aqueous solutions of detergent micelles has been extensively used in structural biology, using either X-ray crystallography or NMR in solution. Further progress could be achieved by establishing a rational basis for the selection of detergent and buffer conditions, since the stringent bottleneck that slows down the structural biology of IMPs is the preparation of diffracting crystals or concentrated solutions of stable isotope labeled IMPs. Here, we describe procedures to monitor the quality of aqueous solutions of [2H,15N]-labeled IMPs reconstituted in detergent micelles. This approach has been developed for studies of β-barrel IMPs, where it was successfully applied for numerous NMR structure determinations, and it has also been adapted for use with α-helical IMPs, in particular GPCRs, in guiding crystallization trials and optimizing samples for NMR studies (Horst et al., 2013). 2D [15N,1H]-correlation maps are used as "fingerprints" to assess the foldedness of the IMP in solution. For promising samples, these "inexpensive" data are then supplemented with measurements of the translational and rotational diffusion coefficients, which give information on the shape and size of the IMP/detergent mixed micelles. Using microcoil equipment for these NMR experiments enables data collection with only micrograms of protein and detergent. This makes serial screens of variable solution conditions viable, enabling the optimization of parameters such as the detergent concentration, sample temperature, pH and the composition of the buffer.
The optimal design of UAV wing structure
Długosz, Adam; Klimek, Wiktor
2018-01-01
The paper presents the optimal design of a UAV wing made of composite materials. The aim of the optimization is to improve strength and stiffness while reducing the weight of the structure. Three different types of functionals, which depend on stress, stiffness and the total mass, are defined. The paper presents an application of an in-house implementation of an evolutionary multi-objective algorithm to the optimization of the UAV wing structure. Values of the functionals are calculated on the basis of results obtained from numerical simulations. A numerical FEM model consisting of different composite materials is created. The adequacy of the numerical model is verified against results obtained from an experiment performed on a tensile testing machine. Examples of multi-objective optimization by means of a Pareto-optimal set of solutions are presented.
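As background to the Pareto-optimal sets discussed in several of these records, the following is a minimal sketch (not any of the authors' implementations) of extracting the non-dominated subset from a pool of candidate designs, assuming every objective is to be minimized:

```python
def pareto_front(points):
    """Return the non-dominated subset of `points`, where each point is a
    tuple of objective values and all objectives are minimized."""
    front = []
    for p in points:
        # p is dominated if some q is <= p in every objective
        # and strictly < in at least one objective
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p))
            and any(qi < pi for qi, pi in zip(q, p))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

This brute-force filter is O(n^2) in the population size; evolutionary algorithms such as NSGA-II use faster non-dominated sorting, but the dominance test itself is the same.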
Genetic search for an optimal power flow solution from a high density cluster
Energy Technology Data Exchange (ETDEWEB)
Amarnath, R.V. [Hi-Tech College of Engineering and Technology, Hyderabad (India); Ramana, N.V. [JNTU College of Engineering, Jagityala (India)
2008-07-01
This paper proposes a novel method to solve optimal power flow (OPF) problems, based on a genetic algorithm (GA) search from a high-density cluster (GAHDC). The algorithm includes three stages: (1) a suboptimal solution is obtained via a conventional analytical method; (2) a high-density cluster, consisting of suboptimal data points from the first stage, is formed using a density-based clustering algorithm; and (3) a genetic-algorithm-based search for the exact optimal solution is carried out on the small, high-density cluster population. The final optimal solution thoroughly satisfies the well-defined fitness function. A standard IEEE 30-bus test system was considered for the simulation study. Numerical results were presented and compared with the results of other approaches. It was concluded that although there is not much difference in numerical values, the proposed method has the advantages of minimal computational effort and reduced CPU time, making it suitable for online applications such as the present optimal power flow problem. 24 refs., 2 tabs., 4 figs.
Global shape optimization of airfoil using multi-objective genetic algorithm
International Nuclear Information System (INIS)
Lee, Ju Hee; Lee, Sang Hwan; Park, Kyoung Woo
2005-01-01
The shape optimization of an airfoil has been performed for an incompressible viscous flow. In this study, Pareto frontier sets, which are global and non-dominated solutions, are obtained without weighting factors by using a multi-objective genetic algorithm. An NACA0012 airfoil is taken as the baseline model, and its profile is parameterized and rebuilt with four Bezier curves. The two curves from the leading edge to the point of maximum thickness have five control points each, and the two from the maximum thickness to the trailing edge have four each. There are eighteen design variables and two objective functions, the lift and drag coefficients. A generation is made up of forty-five individuals. After fifteen generations, twenty Pareto individuals are obtained. One Pareto solution, the best with respect to drag reduction, lowers the drag by 13% and improves the lift-drag ratio by 2%. Another, focused on increasing the lift force, improves the lift by 61% while sustaining the drag force, compared with the baseline model.
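The Bezier parameterization described above can be illustrated with de Casteljau's algorithm, which evaluates a Bezier curve of any degree from its control points. This is a generic sketch, not the authors' code, and the control points below are hypothetical:

```python
def bezier(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using de Casteljau's
    algorithm: repeatedly interpolate adjacent control points until one
    point remains. Works for any degree (e.g. 5 or 4 control points,
    as in the airfoil parameterization above)."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        # one round of linear interpolation between neighbours
        pts = [
            tuple((1.0 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]
```

Moving the intermediate control points (the design variables of the optimization) reshapes the curve while the endpoints, here the leading edge, maximum-thickness point, and trailing edge, stay fixed.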
The optimal solution of a non-convex state-dependent LQR problem and its applications.
Directory of Open Access Journals (Sweden)
Xudan Xu
Full Text Available This paper studies a Non-convex State-dependent Linear Quadratic Regulator (NSLQR) problem, in which the control penalty weighting matrix [Formula: see text] in the performance index is state-dependent. A necessary and sufficient condition for the optimal solution is established, with a rigorous proof via the Euler-Lagrange equation. It is found that the optimal solution of the NSLQR problem can be obtained by solving a Pseudo-Differential Riccati Equation (PDRE) simultaneously with the closed-loop system equation. A comparison theorem for the PDRE is given to facilitate solution methods. A linear time-variant system is employed as a simulation example to verify the proposed optimal solution. As a non-trivial application, a goal pursuit process in psychology is modeled as an NSLQR problem, and two typical goal pursuit behaviors found in humans and animals are reproduced using different control weightings [Formula: see text]. These two behaviors save control energy and cause less stress than the Conventional Control Behavior typified by LQR control with a constant control weighting [Formula: see text], in situations where only the goal discrepancy at the terminal time is of concern, such as marathon races and target-hitting missions.
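For context, in the constant-weight case that the paper compares against, the optimal control follows from the standard finite-horizon LQR Riccati equation. This is a textbook sketch, not the paper's PDRE:

```latex
% Standard finite-horizon LQR with constant weighting R:
% minimize J subject to \dot x = Ax + Bu
J = \tfrac{1}{2}\int_{t_0}^{t_f} \bigl( x^{\top} Q x + u^{\top} R u \bigr)\, dt
% The optimal feedback follows from the Riccati differential equation
-\dot P = A^{\top} P + P A - P B R^{-1} B^{\top} P + Q, \qquad P(t_f) = P_f
% with the optimal control law
u^{*}(t) = -R^{-1} B^{\top} P(t)\, x(t)
```

Making R state-dependent, as in the NSLQR problem, is what turns this classical Riccati equation into the pseudo-differential form studied in the paper.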
DEFF Research Database (Denmark)
Wu, Guanglei; Bai, Shaoping; Hjørnet, Preben
2015-01-01
This paper deals with the parametric optimum design of a parallel Schoenflies-motion robot, named "Ragnar", designed for fast and flexible pick-and-place applications. The robot architecture admits a rectangular workspace, which can utilize shop-floor space efficiently. In this work, the parametric models of the transmission quality, elasto-statics, and dynamics are established. Taking into consideration the design requirements and the pick-and-place trajectory, a comprehensive multi-objective optimization problem is formulated to optimize both kinematic and dynamic performances. The Pareto front is obtained, which provides optimal solutions for the robot design. Robot prototyping work based on the optimal results is described.
International Nuclear Information System (INIS)
Cheung, Brian C.; Carriveau, Rupp; Ting, David S.K.
2014-01-01
This paper presents the findings from a multi-objective genetic algorithm optimization study on the design parameters of an underwater compressed air energy storage system (UWCAES). A 4 MWh UWCAES system was numerically simulated and its energy, exergy, and exergoeconomics were analysed. Optimal system configurations were determined that maximized the UWCAES system round-trip efficiency and operating profit, and minimized the cost rate of exergy destruction and capital expenditures. The optimal solutions obtained from the multi-objective optimization model formed a Pareto-optimal front, and a single preferred solution was selected using the pseudo-weight vector multi-criteria decision making approach. A sensitivity analysis was performed on interest rates to gauge their impact on preferred system designs. Results showed similar preferred system designs for all interest rates in the studied range. The round-trip efficiency and operating profit of the preferred system designs were approximately 68.5% and $53.5/cycle, respectively. The cost rate of the system increased with interest rates. - Highlights: • UWCAES system configurations were developed using multi-objective optimization. • System was optimized for energy efficiency, exergy, and exergoeconomics. • Pareto-optimal solution surfaces were developed at different interest rates. • Similar preferred system configurations were found at all interest rates studied.
Shape optimization of draft tubes for Agnew microhydro turbines
International Nuclear Information System (INIS)
Shojaeefard, Mohammad Hasan; Mirzaei, Ammar; Babaei, Ali
2014-01-01
Highlights: • The draft tube of the Agnew microhydro turbine was optimized. • Pareto optimal solutions were determined by neural networks and the NSGA-II algorithm. • The pressure recovery factor increases with height and angle over the design ranges. • The loss coefficient reaches its minimum values at angles of about 2°. • Swirl of the incoming flow has a great influence on the optimization results. - Abstract: In this study, the shape optimization of draft tubes utilized in Agnew type microhydro turbines is discussed. The design parameters of the draft tube, such as the cone angle and the height above the tailrace, are considered in defining an optimization problem whose goal is to maximize the pressure recovery factor and minimize the energy loss coefficient of the flow. The design space is determined by considering the experimental constraints and parameterized by the method of face-centered uniform ascertain distribution. The numerical simulations are performed using the boundary conditions found from laboratory tests, and the obtained results are analyzed to create and validate a feed-forward neural network model, which is implemented as a surrogate model. The Pareto-optimal solutions are finally determined using the NSGA-II evolutionary algorithm and compared for different inlet conditions. The results predict that high swirl of the incoming flow drastically reduces the performance of the draft tube.
Directory of Open Access Journals (Sweden)
Alexandr Victorovich Budylskiy
2014-06-01
Full Text Available This article considers a multicriteria optimization approach using a modified genetic algorithm to solve the project-scheduling problem under duration and cost constraints. The work lists the available choices for solving this problem and justifies the multicriteria optimization approach. The study describes the Pareto principles used in the modified genetic algorithm. We formulate the mathematical model of the project-scheduling problem and introduce the modified genetic algorithm, its ranking strategies, and its elitism approaches. The article concludes with an example.
Optimal solutions for a bio mathematical model for the evolution of smoking habit
Sikander, Waseem; Khan, Umar; Ahmed, Naveed; Mohyud-Din, Syed Tauseef
In this study, we apply the Variation of Parameters Method (VPM) coupled with an auxiliary parameter to obtain approximate solutions of the epidemic model for the evolution of the smoking habit in a constant population. Convergence of the developed algorithm, namely VPM with an auxiliary parameter, is studied. Furthermore, a simple procedure is used to obtain an optimal value of the auxiliary parameter by minimizing the total residual error over the domain of the problem. Comparison of the obtained results with standard VPM shows that the auxiliary parameter is effective and reliable in controlling the convergence of the approximate solutions.
Bartosz, Krzysztof; Denkowski, Zdzisław; Kalita, Piotr
In this paper the sensitivity of optimal solutions to control problems described by second order evolution subdifferential inclusions under perturbations of state relations and of cost functionals is investigated. First we establish a new existence result for a class of such inclusions. Then, based on the theory of sequential [Formula: see text]-convergence we recall the abstract scheme concerning convergence of minimal values and minimizers. The abstract scheme works provided we can establish two properties: the Kuratowski convergence of solution sets for the state relations and some complementary [Formula: see text]-convergence of the cost functionals. Then these two properties are implemented in the considered case.
A parallel optimization method for product configuration and supplier selection based on interval
Zheng, Jian; Zhang, Meng; Li, Guoxi
2017-06-01
In the process of design and manufacturing, product configuration is an important way of product development, and supplier selection is an essential component of supply chain management. To reduce procurement risk and maximize the profits of enterprises, this study combines product configuration and supplier selection, expressing the multiple uncertainties as interval numbers. An integrated optimization model of interval product configuration and supplier selection is established, and NSGA-II is employed to locate the Pareto-optimal solutions of the interval multiobjective optimization model.
Income inequality in Romania: The exponential-Pareto distribution
Oancea, Bogdan; Andrei, Tudorel; Pirjol, Dan
2017-03-01
We present a study of the distribution of gross personal income and income inequality in Romania, using individual tax income data and both non-parametric and parametric methods. Comparing with official results based on household budget surveys (the Family Budgets Survey and the EU-SILC data), we find that the latter underestimate the income share of the high-income region, and thus the overall income inequality. A parametric study shows that the income distribution is well described by an exponential distribution in the low- and middle-income region, and by a Pareto distribution in the high-income region with Pareto coefficient α = 2.53. We note an anomaly in the distribution in the low-income region (∼9,250 RON) and present a model which explains it in terms of partial income reporting.
Pareto-depth for multiple-query image retrieval.
Hsiao, Ko-Jen; Calder, Jeff; Hero, Alfred O
2015-02-01
Most content-based image retrieval systems consider either a single query, or multiple queries that include the same object or represent the same semantic information. In this paper, we consider the content-based image retrieval problem for multiple query images corresponding to different image semantics. We propose a novel multiple-query information retrieval algorithm that combines the Pareto front method with efficient manifold ranking. We show that the proposed algorithm outperforms state-of-the-art multiple-query retrieval algorithms on real-world image databases. We attribute this performance improvement to concavity properties of the Pareto fronts, and prove a theoretical result that characterizes their asymptotic concavity.
Decomposition and Simplification of Multivariate Data using Pareto Sets.
Huettenberger, Lars; Heine, Christian; Garth, Christoph
2014-12-01
Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.
[Origination of Pareto distribution in complex dynamic systems].
Chernavskiĭ, D S; Nikitin, A P; Chernavskaia, O D
2008-01-01
The Pareto distribution, whose probability density function can be approximated at sufficiently large x as ρ(x) ∝ x^(−α) with α ≥ 2, is of crucial importance from both the theoretical and the practical point of view. The main reason is its qualitative distinction from the normal (Gaussian) distribution: the probability of large deviations is significantly higher. The notion that the Gauss law is universally applicable remains widespread despite the lack of objective confirmation in a variety of application areas. The origin of the Pareto distribution in dynamic systems embedded in a Gaussian noise field is considered. A simple one-dimensional model is discussed in which the system response over a rather wide interval of the variable can be quite precisely approximated by this distribution.
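To make the tail behavior concrete, a small illustrative sketch (not the authors' model): sampling from a Pareto law by inverse-CDF and recovering the shape by maximum likelihood. With survival function (x_m/x)^a the density falls off as x^(−(a+1)), so the density exponent α = a + 1; a density exponent of 2.5 corresponds to shape a = 1.5.

```python
import math
import random

def sample_pareto(a, xm, n, seed=0):
    """Draw n samples with survival function (xm/x)^a via inverse-CDF:
    x = xm * (1 - U)^(-1/a) for U uniform on [0, 1)."""
    rng = random.Random(seed)
    return [xm * (1.0 - rng.random()) ** (-1.0 / a) for _ in range(n)]

def mle_shape(xs, xm):
    """Maximum-likelihood estimate of the Pareto shape a (the Hill
    estimator when xm is the smallest observation)."""
    return len(xs) / sum(math.log(x / xm) for x in xs)
```

Unlike for a Gaussian, the sample maximum of such draws keeps growing with n, which is exactly the "high deviations" property the abstract emphasizes.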
Optimal solution of full fuzzy transportation problems using total integral ranking
Sam’an, M.; Farikhin; Hariyanto, S.; Surarso, B.
2018-03-01
The full fuzzy transportation problem (FFTP) is a transportation problem in which transport costs, demand, supply, and decision variables are all expressed as fuzzy numbers. To solve a fuzzy transportation problem, each fuzzy parameter must be converted to a crisp number by a defuzzification method. Here, a new total integral ranking method is applied after converting trapezoidal fuzzy numbers to hexagonal fuzzy numbers, and the defuzzification is shown to be consistent for symmetric hexagonal fuzzy numbers and for non-symmetric type-2 fuzzy numbers with triangular fuzzy numbers. The optimal solution of the FTP is then computed with a fuzzy transportation algorithm based on the least-cost method. It is found that the total integral ranking with different indices of optimism gives different optimal values. In addition, the total integral ranking based on hexagonal fuzzy numbers yields a better optimal value than the one based on trapezoidal fuzzy numbers.
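As an illustration of the total-integral defuzzification idea, here is its common trapezoidal form following Liou and Wang's total integral value; the paper's hexagonal extension is not reproduced:

```python
def total_integral_trapezoidal(a, b, c, d, optimism=0.5):
    """Liou-Wang total integral value of a trapezoidal fuzzy number
    (a, b, c, d), where [b, c] is the core and [a, d] the support.

    optimism = 1 uses only the right integral (optimistic decision maker),
    optimism = 0 only the left (pessimistic); 0.5 is the neutral ranking."""
    left = 0.5 * (a + b)    # integral of the left inverse membership function
    right = 0.5 * (c + d)   # integral of the right inverse membership function
    return optimism * right + (1.0 - optimism) * left
```

Ranking the fuzzy costs by this crisp value is what lets a classical least-cost transportation routine run on the defuzzified problem.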
Reinforcement learning solution for HJB equation arising in constrained optimal control problem.
Luo, Biao; Wu, Huai-Ning; Huang, Tingwen; Liu, Derong
2015-11-01
The constrained optimal control problem depends on the solution of the complicated Hamilton-Jacobi-Bellman equation (HJBE). In this paper, a data-based off-policy reinforcement learning (RL) method is proposed, which learns the solution of the HJBE and the optimal control policy from real system data. One important feature of the off-policy RL is that its policy evaluation can be realized with data generated by other behavior policies, not necessarily the target policy, which solves the insufficient exploration problem. The convergence of the off-policy RL is proved by demonstrating its equivalence to the successive approximation approach. Its implementation procedure is based on the actor-critic neural networks structure, where the function approximation is conducted with linearly independent basis functions. Subsequently, the convergence of the implementation procedure with function approximation is also proved. Finally, its effectiveness is verified through computer simulations.
Using the Pareto Distribution to Improve Estimates of Topcoded Earnings
Philip Armour; Richard V. Burkhauser; Jeff Larrimore
2014-01-01
Inconsistent censoring in the public-use March Current Population Survey (CPS) limits its usefulness in measuring labor earnings trends. Using Pareto estimation methods with less-censored internal CPS data, we create an enhanced cell-mean series to capture top earnings in the public-use CPS. We find that previous approaches for imputing topcoded earnings systematically understate top earnings. Annual earnings inequality trends since 1963 using our series closely approximate those found by Kop...
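The Pareto cell-mean idea can be sketched with the standard formula (an illustration, not the authors' estimation pipeline): if earnings above a topcode t follow a Pareto law with shape α estimated from less-censored data, the conditional mean E[X | X > t] = αt/(α − 1) replaces each censored value.

```python
def pareto_mean_above(topcode, alpha):
    """Mean of a Pareto(alpha) variable conditional on exceeding `topcode`:
    E[X | X > t] = alpha * t / (alpha - 1), used as a cell mean for
    topcoded (right-censored) earnings. Requires alpha > 1."""
    if alpha <= 1:
        raise ValueError("conditional mean is infinite for alpha <= 1")
    return alpha * topcode / (alpha - 1)
```

A heavier tail (alpha closer to 1) pushes the imputed cell mean far above the topcode, which is why naive imputations that ignore the tail shape understate top earnings.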
Accelerated life testing design using geometric process for pareto distribution
Mustafa Kamal; Shazia Zarrin; Arif Ul Islam
2013-01-01
In this paper the geometric process is used for the analysis of accelerated life testing under constant stress for Pareto Distribution. Assuming that the lifetimes under increasing stress levels form a geometric process, estimates of the parameters are obtained by using the maximum likelihood method for complete data. In addition, asymptotic interval estimates of the parameters of the distribution using Fisher information matrix are also obtained. The statistical properties of the parameters ...
Small Sample Robust Testing for Normality against Pareto Tails
Czech Academy of Sciences Publication Activity Database
Stehlík, M.; Fabián, Zdeněk; Střelec, L.
2012-01-01
Roč. 41, č. 7 (2012), s. 1167-1194 ISSN 0361-0918 Grant - others:Aktion(CZ-AT) 51p7, 54p21, 50p14, 54p13 Institutional research plan: CEZ:AV0Z10300504 Keywords : consistency * Hill estimator * t-Hill estimator * location functional * Pareto tail * power comparison * returns * robust tests for normality Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.295, year: 2012
International Nuclear Information System (INIS)
Milickovic, N.; Lahanas, M.; Papagiannopoulou, M.; Zamboglou, N.; Baltas, D.
2002-01-01
In high dose rate (HDR) brachytherapy, conventional dose optimization algorithms consider multiple objectives in the form of an aggregate function that transforms the multiobjective problem into a single-objective one. As a result, there is a loss of information on the available alternative solutions. This method assumes that the treatment planner exactly understands the correlation between competing objectives and knows the physical constraints. Such knowledge is provided by the Pareto trade-off set, obtained by single-objective optimization algorithms through repeated optimization with different importance vectors. A mapping technique avoids non-feasible solutions with negative dwell weights and allows the use of constraint-free gradient-based deterministic algorithms. We compare various such algorithms and methods which could improve their performance. This finally allows us to generate a large number of solutions in a few minutes. We use objectives expressed in terms of dose variances obtained from a few hundred sampling points in the planning target volume (PTV) and in organs at risk (OAR). We compare two- to four-dimensional Pareto fronts obtained with the deterministic algorithms and with a fast simulated annealing algorithm. For PTV-based objectives, due to the convex objective functions, the obtained solutions are globally optimal. If OARs are included, the solutions found are also globally optimal, although local minima may be present, as suggested. (author)
Searching for optimal integer solutions to set partitioning problems using column generation
Bredström, David; Jörnsten, Kurt; Rönnqvist, Mikael
2007-01-01
We describe a new approach for producing integer-feasible columns for a set partitioning problem directly while solving the linear programming (LP) relaxation by column generation. Traditionally, column generation aims to solve the LP relaxation as quickly as possible, without any concern for the integer properties of the columns formed. In our approach we aim to generate the columns forming the optimal integer solution while simultaneously solving the LP relaxation. By this we can re...
International Nuclear Information System (INIS)
Pombo, A. Vieira; Murta-Pina, João; Pires, V. Fernão
2015-01-01
A multi-objective planning approach for the reliability of electric distribution networks using memetic optimization is presented. In this reliability optimization, the type of the equipment (switches or reclosers) and their locations are optimized. The multiple objectives considered in finding the optimal values for these planning variables are the minimization of the total equipment cost and, at the same time, the minimization of two distribution network reliability indexes. The reliability indexes are the system average interruption frequency index (SAIFI) and the system average interruption duration index (SAIDI). To solve this problem a memetic evolutionary algorithm is proposed, which combines the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) with a local search algorithm. The obtained Pareto-optimal front contains solutions with different trade-offs between the three objectives. A real distribution network is used to test the proposed algorithm. The obtained results show that this approach allows the utility to obtain the optimal type and location of the equipment to achieve the best reliability at the lowest cost. - Highlights: • Reliability indexes SAIFI and SAIDI and equipment cost are optimized. • Optimization of equipment type, number and location on an MV network. • A memetic evolutionary algorithm with a local search algorithm is proposed. • Pareto-optimal front solutions with respect to the three objective functions.
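The two reliability indexes being minimized have simple customer-weighted definitions (per IEEE Std 1366); a sketch with hypothetical per-load-point data, not the paper's network model:

```python
def saifi(failure_rates, customers):
    """System Average Interruption Frequency Index: sustained customer
    interruptions per customer served per year.
    SAIFI = sum(lambda_i * N_i) / sum(N_i)."""
    total = sum(customers)
    return sum(l * n for l, n in zip(failure_rates, customers)) / total

def saidi(outage_hours, customers):
    """System Average Interruption Duration Index: interruption hours
    per customer served per year.
    SAIDI = sum(U_i * N_i) / sum(N_i)."""
    total = sum(customers)
    return sum(u * n for u, n in zip(outage_hours, customers)) / total
```

Adding switches or reclosers changes the per-load-point failure rates and outage durations, which is how the equipment-placement variables feed into both indexes.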
Peralta, Richard C.; Forghani, Ali; Fayad, Hala
2014-04-01
Many real water resources optimization problems involve conflicting objectives, for which the main goal is to find a set of optimal solutions on, or near to, the Pareto front. The ε-constraint and weighting multiobjective optimization techniques have shortcomings, especially as the number of objectives increases. Multiobjective Genetic Algorithms (MGA) have been proposed to overcome these difficulties. Here, an MGA derives a set of optimal solutions for multiobjective, multiuser conjunctive use of reservoir, stream, and (un)confined groundwater resources. The proposed methodology is applied to a hydraulically and economically nonlinear system in which all significant flows, including stream-aquifer-reservoir-diversion-return flow interactions, are simulated and optimized simultaneously for multiple periods. Neural networks represent constrained state variables. The objectives that can be optimized simultaneously in the coupled simulation-optimization model are: (1) maximizing water provided from sources, (2) maximizing hydropower production, and (3) minimizing the operation costs of transporting water from sources to destinations. Results show the efficiency of multiobjective genetic algorithms in generating Pareto-optimal sets for complex nonlinear multiobjective optimization problems.
Generalized Pareto optimum and semi-classical spinors
Rouleux, M.
2018-02-01
In 1971, S. Smale presented a generalization of Pareto optimum he called the critical Pareto set. The underlying motivation was to extend Morse theory to several functions, i.e. to find a Morse theory for m differentiable functions defined on a manifold M of dimension ℓ. We use this framework to take a 2 × 2 Hamiltonian ℋ = ℋ(p) ∈ C∞(T*R²) to its normal form near a singular point of the Fresnel surface. Namely, we say that ℋ has the Pareto property if it decomposes, locally, up to conjugation with regular matrices, as ℋ(p) = u′(p) C(p) (u′(p))*, where u : R² → R² has singularities of codimension 1 or 2, and C(p) is a regular Hermitian matrix ("integrating factor"). In particular this applies in certain cases to the matrix Hamiltonian of elasticity theory and its (relative) perturbations of order 3 in momentum at the origin.
Getu, Rahel; Tola, Yetenayet B; Neela, Satheesh
2017-01-01
Soy milk-based beverages play an important role as a healthy food alternative for human consumption. However, the 'beany' flavor and chalky mouthfeel of soy milk often make it unpalatable to consumers. The objective of the present study was therefore to develop a soy milk-based beverage blended with mango nectar and sucrose solution, optimized for the best physicochemical and sensory properties. Fourteen formulation combinations were determined by a D-optimal mixture simplex lattice design using Design Expert. The blended beverages were prepared by mixing the three basic ingredients in the ranges 60−100% soy milk, 0−25% mango nectar, and 0−15% sucrose solution, and were analyzed for selected physicochemical and sensory properties. The statistical significance of the terms in the regression equations was examined by analysis of variance (ANOVA) for each response, with the significance level set at 5% (p < 0.05). As the proportions of mango nectar and sucrose solution increased, total color change, total soluble solids, gross energy, titratable acidity, and beta-carotene contents increased, while moisture, ash, protein, ether extract, mineral, and phytic acid contents decreased. Finally, numerical optimization determined that 81% soy milk, 16% mango nectar, and 3% sugar solution give a soy milk blended beverage with the best physicochemical and sensory properties, with a desirability of 0.564. Blending soy milk with a fruit juice such as mango is beneficial, as it improves sensory as well as selected nutritional parameters.
Directory of Open Access Journals (Sweden)
S. N. Syed Nasir
2018-01-01
Full Text Available This research focuses on the optimal placement and sizing of multiple variable passive filters (VPFs) to mitigate harmonic distortion due to charging stations (CSs) in a 449-bus distribution network. There are 132 CS units, scheduled according to user behaviour over 24 hours at 15-minute intervals. Considering the varying CS patterns and their harmonic impact, the Modified Lightning Search Algorithm (MLSA) is used to find a coordination of 22 VPF units, so that fewer harmonics are injected from the 415 V bus into the medium-voltage network and power loss is also reduced. The power system harmonic flow, VPFs, CSs, batteries, and the analysis are modelled in the MATLAB/m-file platform. High-Performance Computing (HPC) is used to speed up the simulation. A Pareto-fuzzy technique is used to obtain the VPF sizes from all nondominated solutions. The results show that the optimal placements and sizes of the VPFs reduce the maximum voltage THD, the maximum current THD, and the total apparent losses by up to 39.14%, 52.5%, and 2.96%, respectively. It can therefore be concluded that the MLSA is a suitable method for harmonic mitigation and is beneficial in minimizing the impact of aggressive CS installation in distribution networks.
Improved Solutions for the Optimal Coordination of DOCRs Using Firefly Algorithm
Directory of Open Access Journals (Sweden)
Muhammad Sulaiman
2018-01-01
Full Text Available Nature-inspired optimization techniques are useful tools for electrical engineering problems in which an objective function must be minimized or maximized. In this paper, we use the firefly algorithm to improve the optimal solution of the directional overcurrent relay (DOCR) coordination problem, a complex and highly nonlinear constrained optimization problem. The problem has two types of design variables: the plug settings (PSs) and the time dial settings (TDSs) of each relay in the circuit. The objective is to minimize the total operating time of all primary relays to avoid unnecessary delays. We consider four models in this paper: the IEEE 3-bus, 4-bus, 6-bus, and 8-bus models. The numerical results show that the firefly algorithm, with appropriate parameter settings, performs better than other state-of-the-art algorithms.
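As a sketch of the firefly mechanics on a toy objective (a generic minimizer, not the DOCR formulation with its PS/TDS variables and coordination constraints), each firefly moves toward every brighter one with an attractiveness that decays with distance, plus a small decaying random step:

```python
import math
import random

def firefly_minimize(f, dim, n=15, iters=100, beta0=1.0, gamma=1.0,
                     alpha=0.1, bounds=(-5.0, 5.0), seed=1):
    """Generic firefly algorithm for unconstrained minimization (toy sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best_x, best_val = None, float("inf")
    for _ in range(iters):
        vals = [f(x) for x in X]          # brightness recomputed each sweep
        for i, v in enumerate(vals):
            if v < best_val:
                best_x, best_val = X[i][:], v
        for i in range(n):
            for j in range(n):
                if vals[j] < vals[i]:     # firefly j is "brighter" (lower cost)
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    for k in range(dim):
                        step = beta * (X[j][k] - X[i][k]) + alpha * (rng.random() - 0.5)
                        X[i][k] = min(hi, max(lo, X[i][k] + step))
        alpha *= 0.97                     # shrink the random walk over time
    return best_x, best_val

# Minimize the 2-D sphere function; the best-ever value only improves.
x, val = firefly_minimize(lambda p: sum(c * c for c in p), dim=2)
```

For a constrained problem such as DOCR coordination, the objective would typically be augmented with penalty terms for violated coordination margins.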
A Decomposition Model for HPLC-DAD Data Set and Its Solution by Particle Swarm Optimization
Directory of Open Access Journals (Sweden)
Lizhi Cui
2014-01-01
Full Text Available This paper proposes a separation method for High Performance Liquid Chromatography with Diode Array Detection (HPLC-DAD) data sets, based on the Generalized Reference Curve Measurement model and the Particle Swarm Optimization algorithm (GRCM-PSO). First, initial parameters are generated to construct reference curves for the chromatogram peaks of the compounds, based on their physical principles. Second, the General Reference Curve Measurement (GRCM) model is designed to transform these parameters into scalar values that indicate the fitness of all parameters. Third, rough solutions are found by searching an individual target for every parameter, and reinitialization is executed only around these rough solutions. The Particle Swarm Optimization (PSO) algorithm is then adopted to obtain the optimal parameters by minimizing the fitness of the new parameters given by the GRCM model. Finally, spectra for the compounds are estimated from the optimal parameters and the HPLC-DAD data set. Simulations and experiments support the following conclusions: (1) the GRCM-PSO method can separate the chromatogram peaks and spectra from an HPLC-DAD data set without knowing the number of compounds in advance, even under severe overlap and white noise; (2) the GRCM-PSO method is able to handle real HPLC-DAD data sets.
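The PSO step can be sketched generically (a standard global-best PSO on a toy objective; the GRCM fitness model and its reference-curve parameters are not reproduced here):

```python
import random

def pso_minimize(f, dim, n=20, iters=200, w=0.7, c1=1.5, c2=1.5,
                 bounds=(-5.0, 5.0), seed=0):
    """Standard global-best particle swarm optimization (minimization sketch)."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                      # personal best positions
    pbest = [f(x) for x in X]                  # personal best values
    g = min(range(n), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]               # global best
    for _ in range(iters):
        for i in range(n):
            for k in range(dim):
                V[i][k] = (w * V[i][k]
                           + c1 * rng.random() * (P[i][k] - X[i][k])
                           + c2 * rng.random() * (G[k] - X[i][k]))
                X[i][k] = min(hi, max(lo, X[i][k] + V[i][k]))
            v = f(X[i])
            if v < pbest[i]:
                pbest[i], P[i] = v, X[i][:]
                if v < gbest:
                    gbest, G = v, X[i][:]
    return G, gbest

# Minimize the 2-D sphere function.
pos, val = pso_minimize(lambda p: sum(c * c for c in p), dim=2)
```

In the paper's scheme, `f` would be the GRCM fitness and the reinitialization around rough solutions would replace the uniform initialization used here.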
Directory of Open Access Journals (Sweden)
R. Manam
2017-12-01
Full Text Available In this paper, a sensitivity-constrained integer linear programming approach is formulated for the optimal allocation of Phasor Measurement Units (PMUs) in a power system network for state estimation. In this approach, sensitive buses along with zero injection buses (ZIBs) are considered for the optimal allocation of PMUs in the network to generate state estimation solutions. Sensitive buses are identified from the mean of the bus voltages as the load is increased consistently by up to 50%, and are ranked in order to place the PMUs. Sensitivity-constrained optimal PMU allocation under single-line and no-line contingencies is considered in the observability analysis to ensure protection and control of the power system against abnormal conditions. Modeling of the ZIB constraints is included to minimize the number of PMUs allocated in the network. This paper presents the optimal allocation of PMUs at sensitive buses with zero injection modeling, considering cost criteria and redundancy, to increase the accuracy of the state estimation solution without losing observability of the whole system. Simulations are carried out on the IEEE 14-, 30-, and 57-bus systems, and the results are compared with traditional and other state estimation methods available in the literature to demonstrate the effectiveness of the proposed method.
Optimal Solution for VLSI Physical Design Automation Using Hybrid Genetic Algorithm
Directory of Open Access Journals (Sweden)
I. Hameem Shanavas
2014-01-01
Full Text Available In VLSI physical design optimization, area minimization and interconnect length minimization are important objectives in the physical design automation of very large scale integration chips, since minimizing area and interconnect length scales down the size of integrated chips. To meet these objectives, it is necessary to find optimal solutions for the physical design components: partitioning, floorplanning, placement, and routing. This work optimizes the benchmark circuits across these physical design components using a hierarchical approach based on evolutionary algorithms. The goals of minimizing delay in partitioning, silicon area in floorplanning, layout area in placement, and wirelength in routing also influence other criteria such as power, clock, speed, and cost. A hybrid evolutionary algorithm, which includes one or more local search steps within each evolutionary cycle, is applied at each phase to minimize area and interconnect length. This approach combines a genetic algorithm and simulated annealing in a hierarchical design, and can quickly produce optimal solutions for the popular benchmarks.
Hanks, Brantley R.; Skelton, Robert E.
1991-01-01
Vibration in modern structural and mechanical systems can be reduced in amplitude by increasing stiffness, redistributing stiffness and mass, and/or adding damping, if design techniques are available to do so. Linear Quadratic Regulator (LQR) theory in modern multivariable control design attacks the general dissipative elastic system design problem in a global formulation. The optimal design, however, allows electronic connections and phase relations which are not physically practical or possible in passive structural-mechanical devices. The restriction of LQR solutions (to the Algebraic Riccati Equation) to design spaces which can be implemented as passive structural members and/or dampers is addressed. A general closed-form solution to the optimal free-decay control problem is presented which is tailored for structural-mechanical systems. The solution includes, as special cases, the Rayleigh Dissipation Function and total energy. Weighting matrix selection is a constrained choice among several parameters to obtain desired physical relationships. The closed-form solution is also applicable to active control design for systems where perfect, collocated actuator-sensor pairs exist.
Space-planning and structural solutions of low-rise buildings: Optimal selection methods
Gusakova, Natalya; Minaev, Nikolay; Filushina, Kristina; Dobrynina, Olga; Gusakov, Alexander
2017-11-01
The present study is devoted to elaborating a methodology for appropriately selecting the space-planning and structural solutions of low-rise buildings. The objective of the study is to work out the system of criteria influencing the selection of the space-planning and structural solutions that are most suitable for low-rise buildings and structures. Applying the defined criteria in practice aims to enhance the efficiency of capital investments, save energy and resources, and create comfortable conditions for the population, considering the climatic zoning of the construction site. The developments of the project can be applied when implementing investment-construction projects of low-rise housing in different kinds of territories based on local building materials. A system of criteria influencing the optimal selection of space-planning and structural solutions of low-rise buildings has been developed, and a methodological basis has been elaborated for assessing this selection against the requirements of energy efficiency, comfort, safety, and economic efficiency. The elaborated methodology makes it possible to intensify low-rise construction development for different types of territories, taking into account the climatic zoning of the construction site. Stimulation of low-rise construction should be based on a system of scientifically justified approaches, which allows the energy efficiency, comfort, safety, and economic effectiveness of low-rise buildings to be enhanced.
Inovenkov, I. N.; Echkina, E. Yu.; Nefedov, V. V.; Ponomarenko, L. S.
2017-12-01
In this paper we discuss how a particle-in-cell computation code can be combined with methods of multicriterion optimization (in particular, Pareto optimal solutions of the multicriterion optimization problem) and a hierarchy of computational models to create an efficient tool for solving a wide array of problems related to laser-plasma interaction. In a computational experiment, multicriterion optimization can be applied as follows: the researcher defines the objectives of the experiment as computable scalar values (e.g., high kinetic energy of the ions leaving the domain, the least possible number of electrons leaving the domain in a given direction), then chooses the parameters of the experiment that can be varied to achieve these objectives and the constraints on those parameters (e.g., amplitude and wavelength of the laser radiation, dimensions of the plasma slab(s)). Pareto optimality of the parameter vector can be stated as follows: x0 is Pareto optimal if there exists no vector which would improve some criterion without causing a simultaneous degradation in at least one other criterion. An efficient set of parameters and constraints can be selected based on preliminary calculations with simplified (one- or two-dimensional) models, either analytical or numerical. Multistage computation of the Pareto set radically reduces the number of variants that must be evaluated to achieve a given accuracy. In the final stage we further improve the results by recomputing some of the optimal variants on finer grids, with more particles, and/or within a more detailed model. As an example, we consider the ion acceleration caused by the interaction of very intense, ultra-short laser pulses with plasmas, and calculate the optimal set of experiment parameters for optimizing the number and average energy of high-energy ions leaving the domain in a given direction and minimizing the expulsion
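The Pareto-optimality condition quoted above translates directly into code (minimization of each criterion is assumed; purely illustrative):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all criteria minimized):
    a is no worse in every criterion and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def is_pareto_optimal(x, candidates):
    """x is Pareto optimal if no candidate improves some criterion
    without degrading at least one other."""
    return not any(dominates(y, x) for y in candidates)

front = [(1, 3), (2, 2), (3, 1)]
# (2, 4) is dominated by (2, 2): equal in the first criterion, worse in the second.
```

Because dominance is only a partial order, many mutually incomparable vectors can be optimal at once, which is exactly why the Pareto set rather than a single optimum is computed.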
Many-Objective Optimization Using Adaptive Differential Evolution with a New Ranking Method
Directory of Open Access Journals (Sweden)
Xiaoguang He
2014-01-01
Full Text Available Pareto dominance is an important concept, usually used in multiobjective evolutionary algorithms (MOEAs) to determine the nondominated solutions. However, for many-objective problems, when Pareto dominance is used to rank the solutions, most obtained solutions are nondominated even in early generations, which leaves MOEAs with little selection pressure toward the optimal solutions. In this paper, a new ranking method is proposed for many-objective optimization problems that identifies a relatively small number of representative nondominated solutions with a uniform and wide distribution, improving the selection pressure of MOEAs. A many-objective differential evolution algorithm with the new ranking method (MODER) is then designed for handling many-objective optimization problems. Finally, experiments are conducted and the proposed algorithm is compared with several well-known algorithms. The experimental results show that the proposed algorithm can guide the search to converge to the true Pareto front and maintain the diversity of solutions for many-objective problems.
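Ranking by Pareto dominance, as the abstract describes, is commonly implemented as nondominated sorting: peel off the nondominated front, remove it, and repeat. The sketch below is the straightforward O(n²)-per-front version, not the MODER ranking itself:

```python
def dominates(a, b):
    """a Pareto-dominates b under minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_fronts(pop):
    """Return indices grouped into fronts; front 0 holds the nondominated solutions."""
    fronts, remaining = [], set(range(len(pop)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(pop[j], pop[i]) for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

ranks = nondominated_fronts([(1, 3), (3, 1), (2, 2), (4, 4), (3, 3)])
```

With many objectives, front 0 tends to swallow most of the population, which is precisely the loss of selection pressure the paper's ranking method is designed to counter.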
Multiobjective hyper heuristic scheme for system design and optimization
Rafique, Amer Farhan
2012-11-01
As system design becomes more multifaceted, integrated, and complex, the traditional single-objective approach to optimal design is becoming less efficient and effective. Single-objective optimization methods yield a unique optimal solution, whereas multiobjective methods yield a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective is to improve the worth of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the leverage of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper-Heuristic Optimization Scheme based on low-level meta-heuristics, developed for application in engineering system design. We present a stochastic function to manage the low-level meta-heuristics so as to increase the certainty of reaching a globally optimal solution; a genetic algorithm, simulated annealing, and swarm intelligence are used as the low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis, yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require the simultaneous optimization of multiple conflicting objectives. Random decision making makes the implementation of this scheme attractive and easy, and injecting feasible solutions significantly alters the search direction and adds population diversity, helping to accomplish the pre-defined goals of the proposed scheme.
Multi-Objective Optimization of Start-up Strategy for Pumped Storage Units
Directory of Open Access Journals (Sweden)
Jinjiao Hou
2018-05-01
Full Text Available This paper proposes, for the first time, a multi-objective optimization method for the start-up strategy of pumped storage units (PSUs). In the multi-objective optimization, the speed rise time and the speed overshoot during start-up are taken as the objectives. A precise simulation platform is built to simulate the transient start-up process and to calculate the objectives from that process. The Multi-objective Particle Swarm Optimization algorithm (MOPSO) is adopted to optimize the widely applied start-up strategies based on one-stage and two-stage direct guide vane control (DGVC). Based on the Pareto front obtained, a multi-objective decision-making method based on relative objective proximity is used to rank the solutions in the Pareto front. Start-up strategy optimization is conducted for a PSU of a pumped storage power station in Jiangxi Province, China. The results show that: (1) compared with single-objective optimization, the proposed multi-objective optimization of the start-up strategy not only greatly shortens the speed rise time and reduces the speed overshoot, but also makes the speed curve stabilize quickly; (2) multi-objective optimization of the strategy based on two-stage DGVC achieves a quicker and smoother start-up of the PSU than that based on one-stage DGVC.
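One common way to realize a relative-proximity style decision over a Pareto front is to normalize each objective and pick the solution closest to the ideal (all-best) point. This is an illustrative stand-in for the idea, not the paper's exact proximity metric:

```python
def closest_to_ideal(front):
    """Index of the Pareto-front member nearest the normalized ideal point
    (all objectives minimized)."""
    m = len(front[0])
    fmin = [min(f[k] for f in front) for k in range(m)]
    fmax = [max(f[k] for f in front) for k in range(m)]
    span = [fmax[k] - fmin[k] if fmax[k] > fmin[k] else 1.0 for k in range(m)]

    def dist2(f):
        return sum(((f[k] - fmin[k]) / span[k]) ** 2 for k in range(m))

    return min(range(len(front)), key=lambda i: dist2(front[i]))

# Hypothetical trade-off between rise time and overshoot: the balanced solution wins.
pick = closest_to_ideal([(1.0, 5.0), (2.0, 2.0), (5.0, 1.0)])
```

Normalizing per objective matters here, since rise time (seconds) and overshoot (percent) live on incommensurate scales.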
Williams, Perry J.; Kendall, William L.
2017-01-01
Choices in ecological research and management result from balancing multiple, often competing, objectives. Multi-objective optimization (MOO) is a formal decision-theoretic framework for solving multiple-objective problems. MOO is used extensively in other fields, including engineering, economics, and operations research, but its application to ecological problems has been sparse, perhaps due to a lack of widespread understanding. Our objective was therefore to provide an accessible primer on MOO, including a review of methods common in other fields, a review of their application in ecology, and a demonstration on an applied resource management problem. A large class of methods for solving MOO problems can be separated into two strategies: modelling preferences pre-optimization (the a priori strategy) or modelling preferences post-optimization (the a posteriori strategy). The a priori strategy requires describing preferences among objectives without knowledge of how those preferences affect the resulting decision. In the a posteriori strategy, the decision maker simultaneously considers a set of solutions (the Pareto optimal set) and makes a choice based on the trade-offs observed in the set. We describe several methods for modelling preferences pre-optimization, including the bounded objective function method, the lexicographic method, and the weighted-sum method, and we discuss modelling preferences post-optimization through examination of the Pareto optimal set. We applied each MOO strategy to the natural resource management problem of selecting a population target for cackling goose (Branta hutchinsii minima) abundance. Cackling geese provide food security to Native Alaskan subsistence hunters in the goose's nesting area, but depredate crops on private agricultural fields in wintering areas. We developed objective functions to represent the competing objectives related to the cackling goose population target and identified an optimal solution
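The weighted-sum method mentioned above (an a priori strategy) collapses the objectives into one scalar before optimizing. A minimal sketch over a finite candidate set, with hypothetical objective values:

```python
def weighted_sum_choice(candidates, weights):
    """A priori scalarization: choose the candidate minimizing sum_k w_k * f_k
    (all objectives minimized; the weights encode pre-stated preferences)."""
    return min(candidates, key=lambda f: sum(w * v for w, v in zip(weights, f)))

candidates = [(1.0, 5.0), (2.0, 2.0), (5.0, 1.0)]
balanced = weighted_sum_choice(candidates, (0.5, 0.5))   # favors the compromise
first_obj = weighted_sum_choice(candidates, (0.9, 0.1))  # favors objective 1
```

A known caveat of this method is that, for non-convex Pareto fronts, some Pareto-optimal solutions cannot be reached by any choice of positive weights, which is one motivation for a posteriori strategies.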
Optimal design of cluster-based ad-hoc networks using probabilistic solution discovery
International Nuclear Information System (INIS)
Cook, Jason L.; Ramirez-Marquez, Jose Emmanuel
2009-01-01
The reliability of ad-hoc networks is gaining popularity in two areas: as a topic of academic interest and as a key performance parameter for defense systems employing this type of network. The ad-hoc network is dynamic and scalable, and these qualities are what attract its users. However, these qualities are also synonymous with undefined and unpredictable behavior when considering the impacts on the reliability of the system. The configuration of an ad-hoc network changes continuously, which implies that no single mathematical expression or graphical depiction can describe the system reliability-wise. Previous research has successfully addressed this challenge using mobility and stochastic models. In this paper, the authors leverage the stochastic approach and build upon it a probabilistic solution discovery (PSD) algorithm to optimize the topology of a cluster-based mobile ad-hoc wireless network (MAWN). Specifically, the membership of nodes within the backbone network or networks is assigned in such a way as to maximize reliability subject to a constraint on cost; the constraint may also be a non-monetary cost, such as weight, volume, or power. When a cost is assigned to each component, a maximum cost threshold is assigned to the network, and the method is run, the result is an optimized allocation of the radios enabling the backbone network(s), providing the most reliable network possible without exceeding the allowable cost. The method is intended for use directly as part of the architectural design process of a cluster-based MAWN, to efficiently determine an optimal or near-optimal design solution. It is capable of optimizing the topology based on all-terminal reliability (ATR), all-operating-terminal reliability (AoTR), or two-terminal reliability (2TR)
Risk finance for catastrophe losses with Pareto-calibrated Lévy-stable severities.
Powers, Michael R; Powers, Thomas Y; Gao, Siwei
2012-11-01
For catastrophe losses, the conventional risk finance paradigm of enterprise risk management identifies transfer, as opposed to pooling or avoidance, as the preferred solution. However, this analysis does not necessarily account for differences between light- and heavy-tailed characteristics of loss portfolios. Of particular concern are the decreasing benefits of diversification (through pooling) as the tails of severity distributions become heavier. In the present article, we study a loss portfolio characterized by nonstochastic frequency and a class of Lévy-stable severity distributions calibrated to match the parameters of the Pareto II distribution. We then propose a conservative risk finance paradigm that can be used to prepare the firm for worst-case scenarios with regard to both (1) the firm's intrinsic sensitivity to risk and (2) the heaviness of the severity's tail. © 2012 Society for Risk Analysis.
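For intuition on the heavy tails involved, Pareto II (Lomax) severities with survival function (1 + x/σ)^(−α) can be sampled by inverse transform. The σ and α below are illustrative choices, not the article's calibrated values:

```python
import random

def sample_pareto2(alpha, sigma, n, seed=0):
    """Inverse-transform sampling of the Pareto II (Lomax) distribution:
    F(x) = 1 - (1 + x/sigma) ** (-alpha), for x >= 0."""
    rng = random.Random(seed)
    return [sigma * ((1.0 - rng.random()) ** (-1.0 / alpha) - 1.0) for _ in range(n)]

# With alpha = 3 the mean exists and equals sigma / (alpha - 1) = 1.0.
losses = sample_pareto2(alpha=3.0, sigma=2.0, n=200_000)
mean = sum(losses) / len(losses)
```

For α ≤ 1 the mean diverges, which is the regime where pooling loses its diversification benefit, the central concern of the article.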
Optimizing an Investment Solution in Conditions of Uncertainty and Risk as a Multicriterial Task
Directory of Open Access Journals (Sweden)
Kotsyuba Oleksiy S.
2017-10-01
Full Text Available The article is concerned with methodology for optimizing investment decisions under uncertainty and risk; the subject area of the study relates, first of all, to real investment. The problem of modeling an optimal investment decision is treated as a multicriterial task. The constructive part of the publication builds on the position that the multicriteriality of investment-design objectives results, first, from the complex nature of the category of economic attractiveness (efficiency) of real investment and, second, from the need to take into account the risk factor, which is a vector measure, when preparing an investment decision. An attempt has been made to develop an instrumentarium for optimizing investment decisions in situations of uncertainty and the risk it engenders, based on the roll-up (aggregation) of local criteria. As a result, a model has been proposed whose advantage is that it takes into account, to a greater extent than standardized roll-up options do, the substantive and formal features of the local (detailed) criteria.
Nakata, Toshihiko; Ninomiya, Takanori
2006-10-10
A general solution of undersampling frequency conversion and its optimization for parallel photodisplacement imaging is presented. Phase-modulated heterodyne interference light generated by a linear region of periodic displacement is captured by a charge-coupled device image sensor, in which the interference light is sampled at a sampling rate lower than the Nyquist frequency. The frequencies of the components of the light, such as the sideband and carrier (which include photodisplacement and topography information, respectively), are downconverted and sampled simultaneously based on the integration and sampling effects of the sensor. A general solution of frequency and amplitude in this downconversion is derived by Fourier analysis of the sampling procedure. The optimal frequency condition for the heterodyne beat signal, modulation signal, and sensor gate pulse is derived such that undesirable components are eliminated and each information component is converted into an orthogonal function, allowing each to be discretely reproduced from the Fourier coefficients. The optimal frequency parameters that maximize the sideband-to-carrier amplitude ratio are determined, theoretically demonstrating its high selectivity over 80 dB. Preliminary experiments demonstrate that this technique is capable of simultaneous imaging of reflectivity, topography, and photodisplacement for the detection of subsurface lattice defects at a speed corresponding to an acquisition time of only 0.26 s per 256 x 256 pixel area.
Directory of Open Access Journals (Sweden)
Ricardo Faia
2017-06-01
Full Text Available The deregulation of the electricity sector has culminated in the introduction of competitive markets. In addition, the emergence of new forms of electric energy production, namely renewable energy, has brought additional changes to electricity market operation. Renewable energy has significant advantages, but at the cost of an intermittent character. This generation variability adds new challenges for negotiating players, as they have to deal with a new level of uncertainty, so decision support tools that assist players in their negotiations are crucial. Artificial intelligence techniques play an important role in this decision support, as they can provide valuable results in rather small execution times, namely for the problem of optimizing an electricity market participation portfolio. This paper proposes a heuristic method that provides an initial solution allowing metaheuristic techniques to improve their results through a good initialization of the optimization process. Results show that, by using the proposed heuristic, multiple metaheuristic optimization methods are able to improve their solutions in a faster execution time, thus providing a valuable contribution to player support in energy market negotiations.
Study of Research and Development Processes through Fuzzy Super FRM Model and Optimization Solutions
Directory of Open Access Journals (Sweden)
Flavius Aurelian Sârbu
2015-01-01
Full Text Available The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important for making the right decisions to increase competitiveness and development based on a knowledge economy. By using the Super Fuzzy FRM model, we want to determine the state of R&D processes at the regional level by a means different from the statistical survey, while the two optimization methods are meant to provide optimization solutions for the R&D actions of enterprises. To fulfill this aim, in this application-oriented paper we use a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model, which represents the main novelty of our paper: this theory provides a formalism based on matrix calculus, which allows large volumes of information to be processed and delivers results that are difficult or impossible to obtain through statistical processing. A further novelty of the paper is the optimization solutions submitted in this work, given for the situation when the sales price is variable and the quantity sold is constant in time, and for the reverse situation.
Directory of Open Access Journals (Sweden)
Nicolae Petrescu
2010-01-01
Full Text Available This paper is intended to develop a mathematical model for the optimal dimensioning of the number and heights of dams/thresholds during a downpour: a decrease in water flow rate is obtained and, through the solid material deposited behind the constructions, a new, smaller slope of the valley is obtained, changing the torrential character it had before construction. The choice of dam and its characteristic dimensions may be treated as an optimization issue, and the location of dams along the torrent is dictated by the capabilities of the foundation and restraint, so the chosen solutions must comply with these sites. Finally, the choice of the optimal solution to limit the torrential (rainfall) behaviour is based on a calculation in which the number of thresholds/dams is a variable, with their heights varying accordingly. The calculation method presented is an attempt to demonstrate the multiple opportunities that mathematics offers for solving a technical problem of soil erosion control, which is currently very topical in environmental protection.
International Nuclear Information System (INIS)
Hedayat, Afshin; Davilu, Hadi; Barfrosh, Ahmad Abdollahzadeh; Sepanloo, Kamran
2009-01-01
To successfully carry out material irradiation experiments and radioisotope production, a high thermal neutron flux at the irradiation box is needed over the desired lifetime of a core configuration. On the other hand, reactor safety and operational constraints must be preserved during core configuration selection. Two main objectives and two safety and operational constraints are proposed for optimizing the reactor core configuration design. The suggested parameters and conditions are formulated as two separate fitness functions composed of the two main objectives and two penalty functions; this is a constrained, combinatorial, multi-objective optimization problem. In this paper, a fast and effective hybrid artificial intelligence algorithm is introduced and developed to reach a Pareto optimal set. The hybrid algorithm is composed of a fast and elitist multi-objective genetic algorithm (GA) and a fast fitness evaluation system based on cascade feed-forward artificial neural networks (ANNs). A specific GA representation of the core configuration and special GA operators are introduced and used to overcome the combinatorial constraints of this optimization problem. A software package (Core Pattern Calculator 1) is developed to prepare and reform the data required for ANN training and to revise the optimization results. Some practical test parameters and conditions are suggested for adjusting the main parameters of the hybrid algorithm. Results show that the introduced ANNs can be trained to estimate selected core parameters of a research reactor very quickly, which effectively improves the optimization process. The final optimization results show that a uniform and dense diversity of Pareto fronts is gained over a wide range of fitness function values. To allow a more careful selection of Pareto optimal solutions, a revision system is introduced and used; the revision of the gained Pareto optimal set is performed using the developed software package. Also some secondary operational
Energy Technology Data Exchange (ETDEWEB)
Hedayat, Afshin, E-mail: ahedayat@aut.ac.i [Department of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Reactor Research and Development School, Nuclear Science and Technology Research Institute (NSTRI), End of North Karegar Street, P.O. Box 14395-836, Tehran (Iran, Islamic Republic of); Davilu, Hadi [Department of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Barfrosh, Ahmad Abdollahzadeh [Department of Computer Engineering, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Reactor Research and Development School, Nuclear Science and Technology Research Institute (NSTRI), End of North Karegar Street, P.O. Box 14395-836, Tehran (Iran, Islamic Republic of)
2009-12-15
Directory of Open Access Journals (Sweden)
Farzad Amirkhani
2017-03-01
The proposed method is implemented on classical job-shop problems with a makespan objective, and the results are compared with those of a mixed-integer programming model. Moreover, appropriate dispatching priorities are obtained for the dynamic job-shop problem by minimizing a multi-objective criterion. The results show that simulation-based optimization is highly capable of capturing the main characteristics of the shop and produces optimal or near-optimal solutions with a high degree of credibility.
Long, Kim Chenming
Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this
On the size distribution of cities: an economic interpretation of the Pareto coefficient.
Suh, S H
1987-01-01
"Both the hierarchy and the stochastic models of size distribution of cities are analyzed in order to explain the Pareto coefficient by economic variables. In hierarchy models, it is found that the rate of variation in the productivity of cities and that in the probability of emergence of cities can explain the Pareto coefficient. In stochastic models, the productivity of cities is found to explain the Pareto coefficient. New city-size distribution functions, in which the Pareto coefficient is decomposed by economic variables, are estimated." excerpt
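The rank-size regression behind a Pareto coefficient estimate can be sketched in a few lines (an illustrative ordinary-least-squares fit on synthetic Zipf data; this is not the estimator used in the paper):

```python
import math

def rank_size_alpha(sizes):
    """Estimate the Pareto coefficient alpha from the rank-size rule
    log(rank) = c - alpha * log(size), fitted by ordinary least squares."""
    ordered = sorted(sizes, reverse=True)
    xs = [math.log(s) for s in ordered]
    ys = [math.log(rank) for rank in range(1, len(ordered) + 1)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Exact Zipf sizes (size proportional to 1/rank) give alpha = 1.
cities = [1000.0 / r for r in range(1, 51)]
alpha = rank_size_alpha(cities)
```

Real city-size data would of course yield a noisier estimate than the exact Zipf input used here.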
Multiobjective Optimal Algorithm for Automatic Calibration of Daily Streamflow Forecasting Model
Directory of Open Access Journals (Sweden)
Yi Liu
2016-01-01
Full Text Available A single objective function cannot describe the characteristics of a complicated hydrologic system; consequently, multiple objective functions are needed for calibrating a hydrologic model. Multiobjective algorithms based on nondomination are employed to solve this multiobjective optimization problem. In this paper, a novel multiobjective optimization method based on differential evolution with adaptive Cauchy mutation and chaos searching (MODE-CMCS) is proposed to optimize a daily streamflow forecasting model. In addition, to enhance the diversity of the Pareto solutions, a more precise crowding-distance assigner is presented. Furthermore, because the traditional generalized spread metric (SP) is sensitive to the size of the Pareto set, a novel diversity metric that is independent of Pareto set size is put forward. The efficacy of the new algorithm is compared with that of the nondominated sorting genetic algorithm II (NSGA-II) on a daily streamflow forecasting model based on a support vector machine (SVM). The results verify that MODE-CMCS is superior to NSGA-II for automatic calibration of the hydrologic model.
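The crowding-distance mechanism that such algorithms refine can be illustrated with the standard NSGA-II-style assigner (a generic sketch on made-up objective values, not the paper's more precise variant):

```python
def crowding_distance(front):
    """Assign NSGA-II crowding distances to the points of a Pareto front.
    `front` is a list of objective-value tuples; boundary points in any
    objective get infinite distance so they are always retained."""
    n = len(front)
    dist = [0.0] * n
    if n < 3:
        return [float("inf")] * n
    for m in range(len(front[0])):
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = front[order[-1]][m] - front[order[0]][m]
        if span == 0.0:
            continue
        for k in range(1, n - 1):
            dist[order[k]] += (front[order[k + 1]][m]
                               - front[order[k - 1]][m]) / span
    return dist

# Points in the middle of a crowded region receive small distances.
d = crowding_distance([(0.0, 4.0), (1.0, 3.0), (1.1, 2.9), (4.0, 0.0)])
```

In selection, points with larger crowding distance are preferred among equal-rank solutions, which is what spreads the front out.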
Vector optimization theory, applications, and extensions
Jahn, Johannes
2011-01-01
This new edition of a key monograph adds fresh sections on the work of Edgeworth and Pareto to its presentation, in a general setting, of the fundamentals and important results of vector optimization. It covers background material, applications, and theory.
International Nuclear Information System (INIS)
Shao, Wei; Cui, Zheng; Cheng, Lin
2016-01-01
Highlights: • A multi-objective optimization model of the air distribution of a grate cooler, solved by a genetic algorithm, is proposed. • The Pareto front is obtained and validated by comparison with operating data. • Optimal schemes are compared and selected on engineering grounds. • Total power consumption after optimization decreases by 61.1%. • The clinker layers on the three grate plates are thinner. - Abstract: The cooling air distribution of a grate cooler exercises a great influence on the clinker cooling efficiency and the power consumption of the cooling fans. A multi-objective optimization model of the air distribution of a grate cooler, based on a cross-flow heat exchanger analogy, is proposed in this paper. First, thermodynamic and flow models of the clinker cooling process are developed. Then, based on entropy generation minimization analysis, modified entropy generation numbers caused by heat transfer and by pressure drop are chosen as the two objective functions, which are optimized by a genetic algorithm. The design variables are the superficial velocities of the air chambers and the thicknesses of the clinker layers on the different grate plates. A set of Pareto optimal solutions, in which the two objectives are optimized simultaneously, is obtained. The scattered distributions of the design variables reveal the conflict between the two objectives. The final air distribution and clinker layer thicknesses are selected from the Pareto optimal solutions by minimizing the power consumption of the cooling fans, and are validated by measurements. Compared with the actual operating scheme, the total air volume of the optimized scheme decreases by 2.4%, the total power consumption of the cooling fans decreases by 61.1%, and the outlet temperature of the clinker decreases by 122.9 °C, a remarkable energy-saving effect.
International Nuclear Information System (INIS)
Safari, Jalal
2012-01-01
This paper proposes a variant of the Non-dominated Sorting Genetic Algorithm (NSGA-II) to solve a novel mathematical model for multi-objective redundancy allocation problems (MORAP). Most research on the redundancy allocation problem (RAP) has focused on single-objective optimization, with only limited work addressing multi-objective optimization. Moreover, all mathematical multi-objective models of the general RAP assume that the redundancy strategy for each subsystem is predetermined and known a priori. Active redundancy has traditionally received greater attention; in practice, however, both active and cold-standby redundancy may be used within a particular system design, so the choice of redundancy strategy becomes an additional decision variable. The proposed model and solution method therefore select the best redundancy strategy, type of components, and level of redundancy for each subsystem so as to maximize system reliability and minimize total system cost under system-level constraints. This problem belongs to the NP-hard class. The paper applies a second-generation multiple-objective evolutionary algorithm (MOEA), NSGA-II, to find the best solutions for the given problem. The algorithm identifies a set of optimal solutions (the Pareto front), which provides the decision maker (DM) with a complete picture of the optimal solution space. After the Pareto front is found, a procedure is used to select the best solution from it. Finally, the advantages of the presented multi-objective model and of the proposed algorithm are illustrated by solving test problems taken from the literature, and the robustness of the proposed NSGA-II is discussed.
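The Pareto front such an algorithm reports can be illustrated by a brute-force dominance filter over a finite candidate set (a minimal sketch, not NSGA-II itself; the objectives are cost and negated reliability, both minimized, with made-up values):

```python
def dominates(a, b):
    """True if `a` is no worse than `b` in every minimized objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset of `points` (tuples of objective
    values), preserving input order."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (cost, -reliability): lower cost and higher reliability are both preferred.
candidates = [(10, -0.90), (12, -0.95), (15, -0.95), (11, -0.80)]
front = pareto_front(candidates)
```

The filter is O(n^2) in the number of candidates; NSGA-II's fast non-dominated sort organizes the same dominance relation into successive fronts more efficiently.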
Anti-predatory particle swarm optimization: Solution to nonconvex economic dispatch problems
Energy Technology Data Exchange (ETDEWEB)
Selvakumar, A. Immanuel [Department of Electrical and Electronics Engineering, Karunya Institute of Technology and Sciences, Coimbatore 641114, Tamilnadu (India); Thanushkodi, K. [Department of Electronics and Instrumentation Engineering, Government College of Technology, Coimbatore 641013, Tamilnadu (India)
2008-01-15
This paper proposes a new particle swarm optimization (PSO) strategy, namely anti-predatory particle swarm optimization (APSO), to solve nonconvex economic dispatch problems. In the classical PSO, the movement of a particle (bird) is governed by three behaviors: inertial, cognitive, and social. The cognitive and social behaviors are components of the foraging activity, which helps the swarm of birds locate food. Another activity observed in birds is the anti-predatory behavior, which helps the swarm escape from predators. In this work, the anti-predatory activity is modeled and embedded in the classical PSO to form APSO. This inclusion enhances the exploration capability of the swarm. To validate the proposed APSO model, it is applied to two test systems having nonconvex solution spaces. Satisfactory results are obtained when compared with previous approaches. (author)
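One plausible reading of such a velocity update adds a repulsive term to the classical inertial, cognitive, and social terms. The coefficient names and the exact form of the anti-predatory term below are assumptions for illustration, not the paper's formulation:

```python
import random

def apso_velocity(v, x, pbest, gbest, worst, w=0.7, c1=1.5, c2=1.5, c3=0.5):
    """One-dimensional illustrative APSO-style velocity update: the
    classical inertial, cognitive, and social terms plus a repulsive term
    that pushes the particle away from the worst known position (`worst`).
    Coefficient values are arbitrary illustrative choices."""
    r1, r2, r3 = (random.random() for _ in range(3))
    return (w * v
            + c1 * r1 * (pbest - x)   # cognitive: toward the particle's best
            + c2 * r2 * (gbest - x)   # social: toward the swarm's best
            - c3 * r3 * (worst - x))  # anti-predatory: away from the worst

random.seed(42)
v_new = apso_velocity(v=0.0, x=0.0, pbest=1.0, gbest=1.0, worst=-1.0)
```

With the best positions to the right and the worst to the left of the particle, every term pushes the velocity to the right, so the repulsive term reinforces rather than fights the attractive ones here.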
A Generalized Measure for the Optimal Portfolio Selection Problem and its Explicit Solution
Directory of Open Access Journals (Sweden)
Zinoviy Landsman
2018-03-01
Full Text Available In this paper, we offer a novel class of utility functions applied to optimal portfolio selection. This class incorporates as special cases important measures such as mean-variance, the Sharpe ratio, mean-standard deviation, and others. We provide an explicit solution to the problem of optimal portfolio selection based on this class. Furthermore, we show that each measure in this class generally leads to an efficient frontier that coincides with, or belongs to, the classical mean-variance efficient frontier. In addition, a condition is provided for the existence of a one-to-one correspondence between the parameter of this class of utility functions and the trade-off parameter λ in the mean-variance utility function. This correspondence provides insight into the choice of the parameter. We illustrate our results with a portfolio of stocks from the National Association of Securities Dealers Automated Quotations (NASDAQ).
The Pareto Analysis for Establishing Content Criteria in Surgical Training.
Kramp, Kelvin H; van Det, Marc J; Veeger, Nic J G M; Pierie, Jean-Pierre E N
2016-01-01
Current surgical training is still highly dependent on expensive operating room (OR) experience. Although there have been many attempts to transfer more training to the skills laboratory, little research is focused on which technical behaviors can lead to the highest profit when they are trained outside the OR. The Pareto principle states that in any population that contributes to a common effect, a few account for the bulk of the effect. This principle has been widely used in business management to increase company profits. This study uses the Pareto principle for establishing content criteria for more efficient surgical training. A retrospective study was conducted to assess verbal guidance provided by 9 supervising surgeons to 12 trainees performing 64 laparoscopic cholecystectomies in the OR. The verbal corrections were documented, tallied, and clustered according to the aimed change in novice behavior. The corrections were rank ordered, and a cumulative distribution curve was used to calculate which corrections accounted for 80% of the total number of verbal corrections. In total, 253 different verbal corrections were uttered 1587 times and were categorized into 40 different clusters of aimed changes in novice behaviors. The 35 highest-ranking verbal corrections (14%) and the 11 highest-ranking clusters (28%) accounted for 80% of the total number of given verbal corrections. Following the Pareto principle, we were able to identify the aspects of trainee behavior that account for most corrections given by supervisors during a laparoscopic cholecystectomy on humans. This strategy can be used for the development of new training programs to prepare the trainee in advance for the challenges encountered in the clinical setting in an OR. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
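The cumulative-distribution step of the analysis, finding the corrections that account for 80% of the total, can be sketched as follows (toy counts, not the study's data):

```python
def pareto_cutoff(counts, share=0.8):
    """Return the items that, rank-ordered by frequency, together account
    for at least `share` of the total count (the 'vital few')."""
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(counts.values())
    vital, running = [], 0
    for item, n in ranked:
        vital.append(item)
        running += n
        if running >= share * total:
            break
    return vital

# Hypothetical tallies of correction clusters (illustrative names only).
corrections = {"grasp": 50, "angle": 30, "traction": 10, "camera": 6, "other": 4}
vital_few = pareto_cutoff(corrections)
```

In this toy example, two of the five clusters already cover 80% of the corrections, which is exactly the kind of concentration the study found in its 40 real clusters.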
International Nuclear Information System (INIS)
Onishi, Viviani C.; Ravagnani, Mauro A.S.S.; Jiménez, Laureano; Caballero, José A.
2017-01-01
Highlights: • New multi-objective optimization model for simultaneous WHEN synthesis. • A multistage superstructure allows power and thermal integration of process streams. • Simultaneous minimization of environmental impacts and total annualized cost. • An alternative set of Pareto solutions is presented to support decision-makers. - Abstract: Sustainable and efficient energy use is crucial for lessening carbon dioxide emissions in industrial plants. This paper introduces a new multi-objective optimization model for the synthesis of work and heat exchange networks (WHENs), aiming at the optimal balance between economic and environmental performance. The proposed multistage superstructure allows power and thermal integration of gaseous process streams through the simultaneous minimization of total annualized cost (TAC) and environmental impact (EI). The latter objective is determined by environmental indicators that follow life cycle assessment (LCA) principles. The WHEN superstructure is formulated as a multi-objective mixed-integer nonlinear programming (moMINLP) model and solved with the GAMS software. Results show a decrease of ∼79% in heat transfer area and ∼32% in capital cost between the solutions of the single-objective optimizations; this corresponds to a reduction of ∼23.5% in TAC, while EI increases by ∼99.2%. Because the single-objective solutions can be impractical for economic or environmental reasons, we present a set of alternative Pareto-optimal solutions to support decision-makers in implementing more environmentally friendly and cost-effective WHENs.
Czech Academy of Sciences Publication Activity Database
Beremlijski, P.; Outrata, Jiří; Haslinger, Jaroslav; Pathó, R.
2014-01-01
Roč. 52, č. 5 (2014), s. 3371-3400 ISSN 0363-0129 R&D Projects: GA ČR(CZ) GAP201/12/0671 Grant - others:GA MŠK(CZ) CZ.1.05/1.1.00/02.0070; GA MŠK(CZ) CZ.1.07/2.3.00/20.0070 Institutional support: RVO:67985556 ; RVO:68145535 Keywords : shape optimization * contact problems * Coulomb friction * solution-dependent coefficient of friction * mathematical programs with equilibrium constraints Subject RIV: BA - General Mathematics Impact factor: 1.463, year: 2014 http://library.utia.cas.cz/separaty/2014/MTR/outrata-0434234.pdf
Global stability, periodic solutions, and optimal control in a nonlinear differential delay model
Directory of Open Access Journals (Sweden)
Anatoli F. Ivanov
2010-09-01
Full Text Available A nonlinear differential equation with delay serving as a mathematical model of several applied problems is considered. Sufficient conditions for the global asymptotic stability and for the existence of periodic solutions are given. Two particular applications are treated in detail. The first one is a blood cell production model by Mackey, for which new periodicity criteria are derived. The second application is a modified economic model with delay due to Ramsey. An optimization problem for a maximal consumption is stated and solved for the latter.
Dobrinskaya, Tatiana
2015-01-01
This paper suggests a new method for optimizing yaw maneuvers on the International Space Station (ISS). Yaw rotations are the most common large maneuvers on the ISS, often used for docking and undocking operations as well as for other activities. With maneuver optimization, large maneuvers that used to be performed on thrusters can be performed either using control moment gyroscopes (CMGs) or with significantly reduced thruster firings. Maneuver optimization helps to save expensive propellant and to reduce structural loads - an important factor for the ISS service life. In addition, optimized maneuvers reduce contamination of critical elements of the vehicle structure, such as the solar arrays. This paper presents an analytical solution for optimizing yaw attitude maneuvers. Equations describing the pitch and roll motion needed to counteract the major torques during a yaw maneuver are obtained, and a yaw rate profile is proposed. The paper also describes the physical basis of the suggested optimization approach. In the optimized case obtained, the torques are significantly reduced. This torque reduction was compared to the existing optimization method, which relies on a computational solution. It was shown that the attitude profiles and the torque reduction match well for the two methods, and simulations using the ISS flight software showed similar propellant consumption for both. The analytical solution proposed in this paper has major benefits over the computational approach. In contrast to the current computational solution, which can only be calculated on the ground, the analytical solution does not require extensive computational resources and can be implemented in the onboard software, thus making maneuver execution automatic. An automatic maneuver significantly simplifies operations and, if necessary, makes it possible to perform a maneuver without communication with the ground. It also reduces the probability of command
Income dynamics with a stationary double Pareto distribution.
Toda, Alexis Akira
2011-04-01
Once controlled for the trend, the distribution of personal income appears to be double Pareto, a distribution that obeys the power law exactly in both the upper and the lower tails. I propose a model of income dynamics with a stationary distribution that is consistent with this fact. Using US male wage data for 1970-1993, I estimate the power law exponent in two ways--(i) from each cross section, assuming that the distribution has converged to the stationary distribution, and (ii) from a panel directly estimating the parameters of the income dynamics model--and obtain the same value of 8.4.
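For a single cross section, the power-law exponent can be estimated with the standard Pareto maximum-likelihood (Hill) estimator, sketched below on deterministic synthetic data; the paper's panel estimator is more involved:

```python
import math

def pareto_mle_alpha(sample, x_min):
    """Hill / maximum-likelihood estimate of the Pareto tail exponent:
    alpha = n / sum(log(x_i / x_min)) over observations x_i >= x_min."""
    tail = [x for x in sample if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# Deterministic pseudo-sample: Pareto(alpha = 2) quantiles via the
# inverse CDF, x = x_min * (1 - u) ** (-1 / alpha).
x_min, true_alpha, n = 1.0, 2.0, 1000
sample = [x_min * (1 - (i + 0.5) / n) ** (-1 / true_alpha) for i in range(n)]
alpha_hat = pareto_mle_alpha(sample, x_min)
```

On a double Pareto distribution the same estimator would be applied separately to the upper tail and, after inverting the data, to the lower tail.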
Bayesian modeling to paired comparison data via the Pareto distribution
Directory of Open Access Journals (Sweden)
Nasir Abbas
2017-12-01
Full Text Available A probabilistic approach to building models for paired comparison experiments, based on the comparison of two Pareto variables, is considered. The proposed model is analyzed in both classical and Bayesian frameworks. Informative and uninformative priors are employed to accommodate the prior information. A simulation study is conducted to assess the suitability and performance of the model under theoretical conditions. Appropriateness of fit of the model is also assessed. The entire inferential procedure is illustrated by comparing certain cricket teams using a real dataset.
International Nuclear Information System (INIS)
Bouaziz, M.N.; Aziz, Abdul
2010-01-01
A novel concept of double optimal linearization is introduced and used to obtain a simple and accurate solution for the temperature distribution in a straight rectangular convective-radiative fin with temperature-dependent thermal conductivity. The solution is built from the classical solution for a pure-convection fin of constant thermal conductivity, which appears in terms of hyperbolic functions. When compared with the direct numerical solution, the double optimally linearized solution is found to be accurate within 4% for a range of radiation-conduction and thermal conductivity parameters likely to be encountered in practice. The present solution is simple and offers superior accuracy compared with the fairly complex approximate solutions based on the homotopy perturbation method, the variational iteration method, and the double-series regular perturbation method. The fin efficiency expression resembles the classical result for the constant-thermal-conductivity convecting fin. The present results are readily usable by practicing engineers in their thermal design and analysis work involving fins.
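The classical constant-conductivity solution that the linearization builds on is straightforward to evaluate (standard textbook formulas for an adiabatic-tip straight fin; the parameter values below are illustrative, not from the paper):

```python
import math

def fin_temperature(x, L, m):
    """Dimensionless excess temperature theta/theta_b along a straight fin
    with an adiabatic tip: cosh(m * (L - x)) / cosh(m * L)."""
    return math.cosh(m * (L - x)) / math.cosh(m * L)

def fin_efficiency(L, m):
    """Classical fin efficiency tanh(m L) / (m L)."""
    return math.tanh(m * L) / (m * L)

# m = sqrt(h P / (k A)); illustrative values m = 5 1/m, L = 0.1 m.
m, L = 5.0, 0.1
theta_tip = fin_temperature(L, L, m)  # coolest point, at the tip
eta = fin_efficiency(L, m)
```

The double optimal linearization in the paper replaces the constant m with optimally chosen values so that the same hyperbolic form approximates the temperature-dependent-conductivity, convective-radiative case.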
Optimization of well field management
DEFF Research Database (Denmark)
Hansen, Annette Kirstine
Groundwater is a limited but important resource for fresh water supply. Different conflicting objectives are important when operating a well field. This study investigates how the management of a well field can be improved with respect to different objectives simultaneously. A framework for optimizing well field management using multi-objective optimization is developed. The optimization uses the Strength Pareto Evolutionary Algorithm 2 (SPEA2) to find the Pareto front between the conflicting objectives. The Pareto front is a set of non-inferior optimal points and provides an important tool for the decision-makers. The optimization framework is tested on two case studies. Both abstract around 20,000 cubic meters of water per day, but are otherwise rather different. The first case study concerns the management of the Hardhof waterworks, Switzerland, where artificial infiltration of river water
International Nuclear Information System (INIS)
Sklarz, Shlomo E.; Tannor, David J.; Khaneja, Navin
2004-01-01
We study the problem of optimal control of dissipative quantum dynamics. Although under most circumstances dissipation leads to an increase in entropy (or a decrease in purity) of the system, there is an important class of problems for which dissipation with external control can decrease the entropy (or increase the purity) of the system. An important example is laser cooling. In such systems, there is an interplay of the Hamiltonian part of the dynamics, which is controllable, and the dissipative part of the dynamics, which is uncontrollable. The strategy is to control the Hamiltonian portion of the evolution in such a way that the dissipation causes the purity of the system to increase rather than decrease. The goal of this paper is to find the strategy that leads to maximal purity at the final time. Under the assumption that Hamiltonian control is complete and arbitrarily fast, we provide a general framework by which to calculate optimal cooling strategies. These assumptions lead to a great simplification, in which the control problem can be reformulated in terms of the spectrum of eigenvalues of ρ, rather than ρ itself. By combining this formulation with the Hamilton-Jacobi-Bellman theorem we are able to obtain an equation for the globally optimal cooling strategy in terms of the spectrum of the density matrix. For the three-level Λ system, we provide a complete analytic solution for the optimal cooling strategy. For this system it is found that the optimal strategy does not exploit system coherences and is a 'greedy' strategy, in which the purity is increased maximally at each instant.
Directory of Open Access Journals (Sweden)
M. Mourabet
2017-05-01
Full Text Available In the present study, response surface methodology (RSM) was employed for the removal of fluoride on Brushite, and the process parameters were optimized. Four important process parameters, including initial fluoride concentration (40–50 mg/L), pH (4–11), temperature (10–40 °C), and Brushite dose (0.05–0.15 g), were optimized to obtain the best fluoride-removal response using the statistical Box–Behnken design. The experimental data obtained were analyzed by analysis of variance (ANOVA) and fitted to a second-order polynomial equation using multiple regression analysis. Numerical optimization applying a desirability function was used to identify the optimum conditions for maximum removal of fluoride. The optimum conditions were found to be initial concentration = 49.06 mg/L, initial solution pH = 5.36, adsorbent dose = 0.15 g, and temperature = 31.96 °C. A confirmatory experiment was performed to evaluate the accuracy of the optimization procedure, and a maximum fluoride removal of 88.78% was achieved under the optimized conditions. Several error analysis equations were used to measure the goodness of fit. Kinetic studies showed that the adsorption followed a pseudo-second-order reaction. The equilibrium data were analyzed using the Langmuir, Freundlich, and Sips isotherm models at different temperatures. The Langmuir model was found to describe the data best. The adsorption capacity from the Langmuir isotherm (QL) was found to be 29.212, 35.952, and 36.260 mg/g at 298, 303, and 313 K, respectively.
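The Langmuir step of such an isotherm analysis can be reproduced with a linearized least-squares fit, Ce/qe = Ce/QL + 1/(QL·KL); the data below are synthetic, generated from assumed parameters rather than taken from the study:

```python
def langmuir_fit(ce, qe):
    """Fit the linearized Langmuir isotherm Ce/qe = Ce/QL + 1/(QL * KL)
    by ordinary least squares; returns (QL, KL)."""
    y = [c / q for c, q in zip(ce, qe)]
    n = len(ce)
    mean_x, mean_y = sum(ce) / n, sum(y) / n
    slope = (sum((c - mean_x) * (v - mean_y) for c, v in zip(ce, y))
             / sum((c - mean_x) ** 2 for c in ce))
    intercept = mean_y - slope * mean_x
    q_max = 1 / slope            # QL: maximum adsorption capacity, mg/g
    k_l = slope / intercept      # KL: Langmuir constant, L/mg
    return q_max, k_l

# Synthetic equilibrium data generated from assumed QL = 30 mg/g, KL = 0.2 L/mg.
QL, KL = 30.0, 0.2
ce = [2.0, 5.0, 10.0, 20.0, 40.0]
qe = [QL * KL * c / (1 + KL * c) for c in ce]
q_fit, k_fit = langmuir_fit(ce, qe)
```

Because the synthetic data lie exactly on the isotherm, the fit recovers the assumed parameters; experimental data would scatter around the line.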
International Nuclear Information System (INIS)
Dong, Feifei; Liu, Yong; Su, Han; Zou, Rui; Guo, Huaicheng
2015-01-01
Water quality management and load reduction are subject to inherent uncertainties in watershed systems and to competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore faces the following challenges: (a) it is difficult to obtain absolutely "optimal" solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions remain feasible under uncertainty is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach is proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study of optimal watershed nutrient load reduction to restore lake water quality. The study aimed to maximize reliability levels with respect to both cost and load reduction. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, yielding schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, indicating that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process using the back-propagation neural network (BPNN) method provides a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicate that the ROMO approach can offer decision makers great insight into reliability tradeoffs and can thus help them avoid ineffective decisions. - Highlights: • A reliability-oriented multi-objective (ROMO) optimal decision approach is proposed. • The approach avoids specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to
Energy Technology Data Exchange (ETDEWEB)
Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: yongliu@pku.edu.cn [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)
2015-05-15
Directory of Open Access Journals (Sweden)
A. P. Karpenko
2015-01-01
Full Text Available We consider a relatively new and rapidly developing class of methods for solving multi-objective optimization problems, based on a preliminarily built finite-dimensional approximation of the Pareto set, and thereby of the Pareto front, of the problem. The work investigates the efficiency of several modifications of the adaptive weighted sum (AWS) method. This method, proposed by Ryu, Kim, and Wan (J.-H. Ryu, S. Kim, H. Wan), is intended to build a Pareto approximation of the multi-objective optimization problem. The AWS method uses a quadratic approximation of the objective functions in the current sub-domain of the search space (the trust region), based on the gradients and Hessian matrices of the objective functions. To build the quadratic meta objective functions, this work uses methods from the theory of experimental design, which involve calculating the values of these functions at the nodes of a grid covering the trust region (a sensing method for the search domain). Two groups of sensing methods are considered: hypercube-based and hypersphere-based methods. For each group, a number of test multi-objective optimization problems have been used to study the efficiency of the following grids: the Latin hypercube; a grid that is uniformly random in each dimension; and a grid based on LP sequences.
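A basic Latin hypercube of the kind compared in such studies can be generated as follows (a minimal sketch; the study's exact designs and scalings may differ):

```python
import random

def latin_hypercube(n, dims, seed=0):
    """n points in [0, 1)^dims: each axis is split into n equal strata and
    each stratum is sampled exactly once, in independently shuffled order."""
    rng = random.Random(seed)
    cols = []
    for _ in range(dims):
        strata = list(range(n))
        rng.shuffle(strata)
        cols.append([(s + rng.random()) / n for s in strata])
    return list(zip(*cols))

pts = latin_hypercube(10, 2)
```

Unlike a plain uniform-random grid, every one of the n strata along every axis is hit exactly once, which is what gives the design its good one-dimensional projection properties.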
International Nuclear Information System (INIS)
Fenwick, John D.; Pardo-Montero, Juan
2010-01-01
Purpose: Homogenized blocked arcs are intuitively appealing as basis functions for multicriteria optimization of rotational radiotherapy. Such arcs avoid an organ-at-risk (OAR), spread dose out well over the rest-of-body (ROB), and deliver homogeneous doses to a planning target volume (PTV) using intensity-modulated fluence profiles, obtainable either from closed-form solutions or from iterative numerical calculations. Here, the analytic and iterative arcs are compared. Methods: Dose distributions have been calculated for nondivergent beams, both including and excluding scatter, beam penumbra, and attenuation effects, which are left out of the derivation of the analytic arcs. The most straightforward analytic arc is created by truncating the well-known Brahme, Roos, and Lax (BRL) solution, cutting its uniform dose region down from an annulus to a smaller nonconcave region lying beyond the OAR. However, the truncation leaves behind high-dose hot-spots immediately on either side of the OAR, generated by very high BRL fluence levels just beyond the OAR. These hot-spots can be eliminated using alternative analytical solutions "C" and "L," which, respectively, deliver constant and linearly rising fluences in the gap region between the OAR and PTV (before truncation). Results: Measured in terms of PTV dose homogeneity, ROB dose spread, and OAR avoidance, C solutions generate better arc dose distributions than L when scatter, penumbra, and attenuation are left out of the dose modeling. Including these factors, L becomes the best analytical solution. However, the iterative approach generates better dose distributions than any of the analytical solutions because it can account for and compensate for penumbra and scatter effects. Using the analytical solutions as starting points for the iterative methodology, dose distributions almost as good as those obtained using the conventional iterative approach can be calculated very rapidly. Conclusions: The iterative methodology is
Methanol Synthesis: Optimal Solution for a Better Efficiency of the Process
Directory of Open Access Journals (Sweden)
Grazia Leonzio
2018-02-01
Full Text Available In this research, an ANOVA analysis and a response surface methodology are applied to analyze the equilibrium of the methanol reaction from pure carbon dioxide and hydrogen. In the ANOVA analysis, carbon monoxide composition in the feed, reaction temperature, recycle and water removal through a zeolite membrane are the analyzed factors. Carbon conversion, methanol yield, methanol productivity and methanol selectivity are the analyzed responses. Results show that the main factors have the same effect on the responses, and no common significant interaction is present. Carbon monoxide composition and water removal have a positive effect, while temperature and recycle have a negative effect on the system. From the central composite design, an optimal solution is found that overcomes the thermodynamic limit: the reactor works with a membrane at lower temperature, with a carbon monoxide composition in the feed equal to 10 mol % and without recycle. In these conditions, carbon conversion, methanol yield, methanol selectivity, and methanol production are, respectively, higher than 60%, higher than 60%, between 90% and 95%, and higher than 0.15 mol/h for a feed flow rate of 1 mol/h. A comparison with a traditional reactor is also developed: the membrane reactor achieves a carbon conversion 29% higher and a methanol yield 34% higher. Future research should include an economic analysis of the optimal solution.
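The response-surface step can be illustrated with a minimal single-factor sketch (all numbers are invented for illustration; the paper's actual study uses four factors and a central composite design): fit a quadratic model to response data and locate its stationary point.

```python
import numpy as np

# invented single-factor data: methanol yield (%) at five temperature levels
T = np.array([200.0, 210.0, 220.0, 230.0, 240.0])
y = np.array([52.0, 58.0, 60.0, 58.5, 53.0])

# least-squares fit of the quadratic response surface y = b0 + b1*T + b2*T^2
X = np.column_stack([np.ones_like(T), T, T ** 2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

T_opt = -b1 / (2.0 * b2)   # stationary point of the fitted parabola
```

A negative quadratic coefficient confirms the stationary point is a maximum; in a real response-surface study the same idea extends to several factors and their interactions.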
International Nuclear Information System (INIS)
Piserchio, Andrea; Ghose, Ranajeet; Cowburn, David
2009-01-01
Progression of a host of human cancers is associated with elevated levels of expression and catalytic activity of the Src family of tyrosine kinases (SFKs), making them key therapeutic targets. Even with the availability of multiple crystal structures of active and inactive forms of the SFK catalytic domain (CD), a complete understanding of its catalytic regulation is unavailable. Also unavailable are atomic or near-atomic resolution information about their interactions, often weak or transient, with regulating phosphatases and downstream targets. Solution NMR, the biophysical method best suited to tackle this problem, was previously hindered by difficulties in bacterial expression and purification of sufficient quantities of soluble, properly folded protein for economically viable labeling with NMR-active isotopes. Through a choice of optimal constructs, co-expression with chaperones and optimization of the purification protocol, we have achieved the ability to bacterially produce large quantities of the isotopically-labeled CD of c-Src, the prototypical SFK, and of its activating Tyr-phosphorylated form. All constructs produce excellent spectra allowing solution NMR studies of this family in an efficient manner
International Nuclear Information System (INIS)
Ramirez-Marquez, Jose Emmanuel; Rocco S, Claudio M.
2009-01-01
This paper introduces an evolutionary optimization approach that can be readily applied to solve stochastic network interdiction problems (SNIP). The network interdiction problem solved considers the minimization of the cost associated with an interdiction strategy such that the maximum flow that can be transmitted between a source node and a sink node for a fixed network design is greater than or equal to a given reliability requirement. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with their interdiction can change from link to link and that such interdiction has a probability of being successful. This version of the SNIP is for the first time modeled as a capacitated network reliability problem allowing for the implementation of computation and solution techniques previously unavailable. The solution process is based on an evolutionary algorithm that implements: (1) Monte-Carlo simulation, to generate potential network interdiction strategies, (2) capacitated network reliability techniques to analyze strategies' source-sink flow reliability and, (3) an evolutionary optimization technique to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different sizes of networks are used throughout the paper to illustrate the approach
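The network-reliability evaluation inside such an approach can be sketched on a toy network by exact state enumeration instead of the paper's Monte-Carlo simulation (enumeration is only feasible for very small link counts; the bridge network and p = 0.9 are assumed for illustration):

```python
from itertools import product

def two_terminal_reliability(links, p, s, t, n_nodes):
    """Exact source-to-sink reliability by enumerating all 2^m link states;
    each link works independently with probability p."""
    total = 0.0
    for state in product((0, 1), repeat=len(links)):
        parent = list(range(n_nodes))           # union-find over working links
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for up, (u, v) in zip(state, links):
            if up:
                parent[find(u)] = find(v)
        if find(s) == find(t):
            k = sum(state)
            total += p ** k * (1 - p) ** (len(links) - k)
    return total

# classic 5-link "bridge" network between source 0 and sink 3
bridge = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
R = two_terminal_reliability(bridge, p=0.9, s=0, t=3, n_nodes=4)
```

For the bridge with p = 0.9 the enumeration reproduces the textbook value R = 0.97848; Monte-Carlo and capacitated-flow techniques take over when the link count makes enumeration intractable.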
International Nuclear Information System (INIS)
Luz, L.C.Q.P. da.
1984-01-01
The purpose of this work was the development of an instrumental method for the optimization of the indirect neutron activation analysis of boron in aqueous solutions. The optimization took into account the analytical parameters under laboratory conditions: activation carried out with a 241 Am/Be neutron source and detection of the activity induced in vanadium with two NaI(Tl) gamma spectrometers. A calibration curve was thus obtained for a concentration range of 0 to 5000 ppm B. Later on, experimental models were built in order to study the feasibility of automation. The analysis of boron was finally performed, under the previously established conditions, with an automated system comprising the operations of transport, irradiation and counting. An improvement in the quality of the analysis was observed, with boron concentrations as low as 5 ppm being determined with a precision level better than 0.4%. The experimental model features all basic design elements for an automated device for the analysis of boron in aqueous solutions wherever this is required, as in the operation of nuclear reactors. (Author) [pt
Effectiveness of meta-models for multi-objective optimization of centrifugal impeller
International Nuclear Information System (INIS)
Bellary, Sayed Ahmed Imran; Samad, Abdus; Husain, Afzal
2014-01-01
A major issue in multiple-fidelity analysis and optimization of fluid machinery systems is the proper construction of the low-fidelity model, or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by the evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can produce erroneous results if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness for application in turbomachinery design and optimization. A high-fidelity model, namely a CFD technique, along with the meta-models was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz., hydraulic efficiency and head. Design variables from the impeller geometry were chosen, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each meta-model is discussed in the context of its predictions in the entire design space in general and near the optimal region in particular. Exploitation of multiple meta-models enhances the quality of multi-objective optimization and provides information pertaining to the fidelity of the optimization model. It was observed that the Kriging meta-model was better suited to this type of problem, as it involved less approximation error in the Pareto-optimal front.
Effectiveness of meta-models for multi-objective optimization of centrifugal impeller
Energy Technology Data Exchange (ETDEWEB)
Bellary, Sayed Ahmed Imran; Samad, Abdus [Indian Institute of Technology Madras, Chennai (India); Husain, Afzal [Sultan Qaboos University, Al-Khoudh (Oman)
2014-12-15
A major issue in multiple-fidelity analysis and optimization of fluid machinery systems is the proper construction of the low-fidelity model, or meta-model. A low-fidelity model uses responses obtained from a high-fidelity model, and the meta-model is then used to produce the population of solutions required by the evolutionary algorithm for multi-objective optimization. The Pareto-optimal front, which shows the functional relationships among the multiple objectives, can produce erroneous results if the low-fidelity models are not well constructed. In the present research, response surface approximation and Kriging meta-models were evaluated for their effectiveness for application in turbomachinery design and optimization. A high-fidelity model, namely a CFD technique, along with the meta-models was used to obtain the Pareto-optimal front via a multi-objective genetic algorithm. A centrifugal impeller was considered as a case study to find the relationship between two conflicting objectives, viz., hydraulic efficiency and head. Design variables from the impeller geometry were chosen, and the responses of the objective functions were evaluated through CFD analysis. The fidelity of each meta-model is discussed in the context of its predictions in the entire design space in general and near the optimal region in particular. Exploitation of multiple meta-models enhances the quality of multi-objective optimization and provides information pertaining to the fidelity of the optimization model. It was observed that the Kriging meta-model was better suited to this type of problem, as it involved less approximation error in the Pareto-optimal front.
DEFF Research Database (Denmark)
Ren, Jingzheng; Liang, Hanwei; Dong, Liang
2016-01-01
approach for supporting decision-making in design for sustainability with the implementation of industrial symbiosis in a chemical complex. Through incorporating emergy theory, the model is formulated as a multi-objective approach that can optimize both the economic benefit and the sustainable...... performance of the integrated industrial system. A set of emergy-based evaluation indices is designed. A multi-objective particle swarm algorithm is proposed to solve the model, and the decision-makers are allowed to choose the suitable solutions from the Pareto solutions. An illustrative case has been studied......
Towards a seascape typology. I. Zipf versus Pareto laws
Seuront, Laurent; Mitchell, James G.
Two data analysis methods, referred to as the Zipf and Pareto methods, initially introduced in economics and linguistics two centuries ago and subsequently used in a wide range of fields (word frequency in languages and literature, human demographics, finance, city formation, genomics and physics), are described and proposed here as a potential tool to classify space-time patterns in marine ecology. The aim of this paper is, first, to present the theoretical bases of Zipf and Pareto laws, and to demonstrate that they are strictly equivalent. In that way, we provide a one-to-one correspondence between their characteristic exponents and argue that the choice of technique is a matter of convenience. Second, we argue that the appeal of this technique is that it is assumption-free for the distribution of the data and regularity of sampling interval, as well as being extremely easy to implement. Finally, in order to allow marine ecologists to identify and classify any structure in their data sets, we provide a step by step overview of the characteristic shapes expected for Zipf's law for the cases of randomness, power law behavior, power law behavior contaminated by internal and external noise, and competing power laws illustrated on the basis of typical ecological situations such as mixing processes involving non-interacting and interacting species, phytoplankton growth processes and differential grazing by zooplankton.
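The claimed one-to-one correspondence between the Zipf and Pareto exponents is easy to check numerically on synthetic power-law data: a rank-size (Zipf) slope of -b pairs with a CCDF (Pareto) slope of approximately -1/b. The sketch below assumes b = 0.5 purely for illustration.

```python
import numpy as np

b, n = 0.5, 1000                       # assumed Zipf exponent and sample size
ranks = np.arange(1, n + 1, dtype=float)
sizes = ranks ** (-b)                  # pure Zipf law: s(r) = r^(-b)

# Zipf method: slope of log size versus log rank
zipf_slope = np.polyfit(np.log(ranks), np.log(sizes), 1)[0]

# Pareto method: slope of the log empirical CCDF versus log size
x = np.sort(sizes)
ccdf = 1.0 - np.arange(1, n + 1) / n   # fraction of data strictly above x[i]
pareto_slope = np.polyfit(np.log(x[:-1]), np.log(ccdf[:-1]), 1)[0]
```

Here zipf_slope recovers -b exactly and the two slopes are approximately reciprocal, which is the one-to-one correspondence the paper formalizes; on real ecological data both fits would be noisy rather than exact.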
Qi, Xin; Ju, Guohao; Xu, Shuyan
2018-04-10
The phase diversity (PD) technique needs optimization algorithms to minimize the error metric and find the global minimum. Particle swarm optimization (PSO) is very suitable for PD due to its simple structure, fast convergence, and global searching ability. However, the traditional PSO algorithm for PD still suffers from the stagnation problem (premature convergence), which can result in a wrong solution. In this paper, the stagnation problem of the traditional PSO algorithm for PD is illustrated first. Then, an explicit strategy is proposed to solve this problem, based on an in-depth understanding of the inherent optimization mechanism of the PSO algorithm. Specifically, a criterion is proposed to detect premature convergence; then a redistributing mechanism is proposed to prevent premature convergence. To improve the efficiency of this redistributing mechanism, randomized Halton sequences are further introduced to ensure the uniform distribution and randomness of the redistributed particles in the search space. Simulation results show that this strategy can effectively solve the stagnation problem of the PSO algorithm for PD, especially for large-scale and high-dimension wavefront sensing and noisy conditions. This work is further verified by an experiment. This work can improve the robustness and performance of PD wavefront sensing.
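A minimal sketch of the proposed strategy, with a toy sphere function standing in for the PD error metric (the diversity threshold and PSO coefficients below are assumed values, not the paper's): detect collapse of the swarm, then redistribute the particles with low-discrepancy Halton points while keeping the global-best memory.

```python
import random

def halton(i, base):
    """Radical-inverse (van der Corput) value of index i in a given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def sphere(x):
    # toy error metric standing in for the PD sharpness metric
    return sum(v * v for v in x)

def pso(n=20, iters=200, lo=-5.0, hi=5.0, div_tol=1e-3, seed=1):
    random.seed(seed)
    bases = (2, 3)                                  # one prime base per dimension
    dim = len(bases)
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                           # personal bests
    g = min(P, key=sphere)[:]                       # global best survives restarts
    h = 1                                           # Halton index for restarts
    for _ in range(iters):
        # stagnation criterion: swarm has collapsed onto a tiny region
        spread = max(max(x[d] for x in X) - min(x[d] for x in X) for d in range(dim))
        if spread < div_tol:
            # redistribute particles uniformly with Halton points
            for i in range(n):
                X[i] = [lo + (hi - lo) * halton(h, b) for b in bases]
                V[i] = [0.0] * dim
                h += 1
        for i in range(n):
            for d in range(dim):
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * random.random() * (P[i][d] - X[i][d])
                           + 1.5 * random.random() * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]
    return g

best = pso()
```

The Halton sequence gives the redistributed particles the uniform coverage that plain random restarts lack, which is the role the randomized Halton sequences play in the paper.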
An n-material thresholding method for improving integerness of solutions in topology optimization
International Nuclear Information System (INIS)
Watts, Seth; Tortorelli, Daniel A.
2016-01-01
It is common in solving topology optimization problems to replace an integer-valued characteristic function design field with the material volume fraction field, a real-valued approximation of the design field that permits "fictitious" mixtures of materials during intermediate iterations in the optimization process. This is reasonable so long as one can interpolate properties for such materials and so long as the final design is integer valued. For this purpose, we present a method for smoothly thresholding the volume fractions of an arbitrary number of material phases which specify the design. This method is trivial for two-material design problems, for example, the canonical topology design problem of specifying the presence or absence of a single material within a domain, but it becomes more complex when three or more materials are used, as often occurs in material design problems. We take advantage of the similarity in properties between the volume fractions and the barycentric coordinates on a simplex to derive a thresholding method which is applicable to an arbitrary number of materials. As we show in a sensitivity analysis, this method has smooth derivatives, allowing it to be used in gradient-based optimization algorithms. Finally, we present results which show synergistic effects when used with Solid Isotropic Material with Penalty and Rational Approximation of Material Properties material interpolation functions, popular methods of ensuring integerness of solutions.
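One simple smooth thresholding map with the properties the abstract describes, sketched here as a generic power-law sharpening (the paper's exact barycentric construction may differ): it keeps the fractions non-negative, preserves their unit sum, and is differentiable for positive inputs, so it fits gradient-based optimizers.

```python
def threshold(v, p=8.0):
    """Push a volume-fraction vector smoothly toward the nearest simplex
    vertex. A generic power-law sharpening, NOT the paper's exact map:
    differentiable for positive fractions, output still sums to one."""
    powered = [vi ** p for vi in v]
    s = sum(powered)
    return [w / s for w in powered]

# three-material example: the dominant phase is driven toward 1
sharpened = threshold([0.5, 0.3, 0.2])
```

Increasing the sharpening exponent p over the course of the optimization drives intermediate mixtures toward integer (single-material) designs while keeping the derivatives needed for sensitivity analysis.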
International Nuclear Information System (INIS)
Shi, Zhongyuan; Dong, Tao
2015-01-01
Highlights: • A constructal thermohydraulic optimization was carried out. • The effect of the manufacturing limit on the Pareto solution set was discussed. • The suitable constraints may differ from those on a quasi-continuous basis. - Abstract: A synthetic optimization is presented for the Pareto layouts of discrete heat sources (with uniform heat flux) flush mounted on a flat plate over which laminar flow serves for cooling purposes. The peak temperatures and the flow drag loss are minimized simultaneously, provided that the total heat dissipation rate and the plate length are held constant. The impact of the manufacturing limit, i.e. the minimum length of the heated or the adiabatic patch, on the optimum layout is discussed. The results in general comply with analytical deductions based on constructal theory. However, in a finite-length scenario, the geometric constraints on the adiabatic spacing differ from those that fit the situation in which maximum heat transfer performance alone is to be achieved.
Setiawan, R.
2018-05-01
In this paper, the Economic Order Quantity (EOQ) of a vendor-buyer supply-chain model under probabilistic conditions with imperfect quality items is analysed. The analysis is delivered using two concepts from game theory: Stackelberg equilibrium and Pareto optimality, under non-cooperative and cooperative games, respectively. The optimal results of the integrated scheme and the game theory approach are then compared, based on analytical and numerical results using appropriate simulation data.
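As background, the deterministic EOQ (Wilson) lot size that such probabilistic vendor-buyer models generalize can be computed in one line; the numbers below are illustrative only.

```python
import math

def eoq(demand, order_cost, holding_cost):
    """Classical deterministic EOQ (Wilson) lot size: sqrt(2*D*K/h)."""
    return math.sqrt(2.0 * demand * order_cost / holding_cost)

# illustrative numbers only: annual demand D, setup cost K, unit holding cost h
Q = eoq(demand=1000.0, order_cost=100.0, holding_cost=2.0)   # ≈ 316.23 units
```

The paper's model replaces the deterministic demand with a probabilistic one, adds imperfect-quality items, and then splits the resulting cost between vendor and buyer under Stackelberg or Pareto (cooperative) schemes.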
Optimal wind-hydro solution for the Marmara region of Turkey to meet electricity demand
International Nuclear Information System (INIS)
Dursun, Bahtiyar; Alboyaci, Bora; Gokcol, Cihan
2011-01-01
Wind power technology is now a reliable electricity production system. It presents an economically attractive solution for the continuously increasing energy demand of the Marmara region located in Turkey. However, the stochastic behavior of wind speed in the Marmara region can lead to significant disharmony between wind energy production and electricity demand. Therefore, to overcome wind's variable nature, a more reliable solution would be to integrate hydropower with wind energy. In this study, a methodology to estimate an optimal wind-hydro solution is developed and subsequently applied to six representative site cases in the Marmara region in order to define the most beneficial configuration of the wind-hydro system. All numerical calculations are based on long-term wind speed measurements, electrical load demand and operational characteristics of the system components. -- Research highlights: → This study is the first application of a wind-hydro pumped storage system in Turkey. → The methodology developed in this study is applied to six sites in the Marmara region of Turkey. → A wind-hydro pumped storage system is proposed to meet the electric energy demand of the Marmara region.
Energy Technology Data Exchange (ETDEWEB)
Dall' Anese, Emiliano; Simonetto, Andrea; Dhople, Sairaj
2016-12-29
This paper focuses on power distribution networks featuring inverter-interfaced distributed energy resources (DERs), and develops feedback controllers that drive the DER output powers to solutions of time-varying AC optimal power flow (OPF) problems. Control synthesis is grounded on primal-dual-type methods for regularized Lagrangian functions, as well as linear approximations of the AC power-flow equations. Convergence and OPF-solution-tracking capabilities are established while acknowledging: i) communication-packet losses, and ii) partial updates of control signals. The latter case is particularly relevant since it enables asynchronous operation of the controllers where DER setpoints are updated at a fast time scale based on local voltage measurements, and information on the network state is utilized if and when available, based on communication constraints. As an application, the paper considers distribution systems with high photovoltaic integration, and demonstrates that the proposed framework provides fast voltage-regulation capabilities, while enabling the near real-time pursuit of solutions of AC OPF problems.
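The primal-dual flavor of such controllers can be sketched on a toy convex problem (not the paper's OPF model): gradient descent on the Lagrangian in the primal variable, projected gradient ascent in the dual.

```python
def primal_dual(alpha=0.05, iters=2000):
    """Projected primal-dual gradient iteration on the toy problem
        minimize (x - 3)^2  subject to  x <= 1,
    sketching the controller structure only (not the paper's OPF model).
    KKT solution: x* = 1, lambda* = 4."""
    x, lam = 0.0, 0.0
    for _ in range(iters):
        x -= alpha * (2.0 * (x - 3.0) + lam)        # primal descent on the Lagrangian
        lam = max(0.0, lam + alpha * (x - 1.0))     # dual ascent, projected onto lam >= 0
    return x, lam

x_star, lam_star = primal_dual()
```

In the paper this iteration is run online against a time-varying problem, with the constraint-violation term replaced by feedback measurements (local voltages), which is what gives the tracking and asynchrony properties described above.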
Optimal Thermal Unit Commitment Solution integrating Renewable Energy with Generator Outage
Directory of Open Access Journals (Sweden)
S. Sivasakthi
2017-06-01
Full Text Available With the increasing concern over global climate change, the promotion of renewable energy sources, primarily wind generation, is a welcome move to reduce the pollutant emissions from conventional power plants. Integration of wind power generation with the existing power network is an emerging research field. This paper presents a meta-heuristic algorithm based approach to determine a feasible dispatch solution for a wind integrated thermal power system. The Unit Commitment (UC) process aims to identify the best feasible generation schedule of the committed units such that the overall generation cost is reduced, subject to a variety of constraints at each time interval. As the UC formulation involves many variables and system and operational constraints, identifying the best solution is still a research task. Nowadays, it is inevitable to include power system reliability issues in the operation strategy. Generator failure and malfunction are the prime factors influencing reliability; hence, they are considered in the UC formulation of the wind integrated thermal power system. The modern evolutionary algorithm known as the Grey Wolf Optimization (GWO) algorithm is applied to solve the intended UC problem. The potential of the GWO algorithm is validated on standard test systems. Besides, the ramp rate limits are also incorporated in the UC formulation. The simulation results reveal that the GWO algorithm is capable of obtaining economical solutions of good quality.
Energy Technology Data Exchange (ETDEWEB)
Dall' Anese, Emiliano; Simonetto, Andrea; Dhople, Sairaj
2016-12-01
This paper focuses on power distribution networks featuring inverter-interfaced distributed energy resources (DERs), and develops feedback controllers that drive the DER output powers to solutions of time-varying AC optimal power flow (OPF) problems. Control synthesis is grounded on primal-dual-type methods for regularized Lagrangian functions, as well as linear approximations of the AC power-flow equations. Convergence and OPF-solution-tracking capabilities are established while acknowledging: i) communication-packet losses, and ii) partial updates of control signals. The latter case is particularly relevant since it enables asynchronous operation of the controllers where DER setpoints are updated at a fast time scale based on local voltage measurements, and information on the network state is utilized if and when available, based on communication constraints. As an application, the paper considers distribution systems with high photovoltaic integration, and demonstrates that the proposed framework provides fast voltage-regulation capabilities, while enabling the near real-time pursuit of solutions of AC OPF problems.
An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index
DEFF Research Database (Denmark)
Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle
2013-01-01
We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...
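For comparison, the classical (non-robust) Hill estimator of the extreme-value index gamma = 1/alpha is easy to sketch; the paper's contribution is a robust, bias-corrected alternative to exactly this kind of tail-index estimator. The sample below is a deterministic Pareto quantile grid, assumed purely for illustration.

```python
import math

def hill(data, k):
    """Classical Hill estimator of gamma = 1/alpha from the k largest
    order statistics (the non-robust baseline, unlike the MDPDE above)."""
    x = sorted(data)
    threshold = x[-k - 1]
    return sum(math.log(v / threshold) for v in x[-k:]) / k

# deterministic Pareto(alpha = 2) quantile grid, i.e. gamma = 0.5 (illustrative)
n, alpha = 10000, 2.0
sample = [(1.0 - (i + 1) / (n + 1.0)) ** (-1.0 / alpha) for i in range(n)]
gamma_hat = hill(sample, k=500)
```

On clean data the Hill estimate is close to the true gamma, but a few contaminating outliers can distort it badly; robustness to such contamination is precisely what the minimum density power divergence criterion buys.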
Strong Convergence Bound of the Pareto Index Estimator under Right Censoring
Directory of Open Access Journals (Sweden)
Peng Zuoxiang
2010-01-01
Full Text Available Let be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function as , where represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right censored Pareto index estimator under second-order regularly varying conditions.
DEFF Research Database (Denmark)
Ottosson, Rickard O; Engstrom, Per E; Sjöström, David
2008-01-01
constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics...
Directory of Open Access Journals (Sweden)
E. SCHNEIDER
2014-07-01
Full Text Available The article is part of a special issue on the occasion of the publication of the entire scientific correspondence of Vilfredo Pareto with Maffeo Pantaleoni. The author reconstructs the beginning of their correspondence, the debate in pure mathematical economics, and draws main conclusions on the different views of Pareto with respect to Marshall, Edgeworth and Fisher. JEL: B16, B31, C02, C60
Optimal Locations of Bus Stops Connecting Subways near Urban Intersections
Directory of Open Access Journals (Sweden)
Yuan Cui
2015-01-01
Full Text Available Unsuitable locations of bus stops that provide feeder transportation connecting subways near urban intersections usually lead to low efficiency of public transport and a low level of passenger service. A multiobjective optimization model to distribute such stop locations is proposed to attain the shortest total walking distance for passengers and the minimum delay time of cars through intersections and travel time of buses. The Pareto frontier and optimal solutions for the proposed model are given by distance-based and enumerative methods. The Xizhimen bus stop is selected for case studies to verify the validity of the proposed model. A sensitivity analysis of possible solutions is also carried out in the case studies. The results show that the proposed model is capable of optimizing the locations of bus stops connecting subways near intersections and is helpful in improving the level of passenger service and the operational efficiency of public transportation.
Moment-tensor solutions estimated using optimal filter theory: Global seismicity, 2001
Sipkin, S.A.; Bufe, C.G.; Zirbes, M.D.
2003-01-01
This paper is the 12th in a series published yearly containing moment-tensor solutions computed at the US Geological Survey using an algorithm based on the theory of optimal filter design (Sipkin, 1982 and Sipkin, 1986b). An inversion has been attempted for all earthquakes with a magnitude, mb or MS, of 5.5 or greater. Previous listings include solutions for earthquakes that occurred from 1981 to 2000 (Sipkin, 1986b; Sipkin and Needham, 1989, Sipkin and Needham, 1991, Sipkin and Needham, 1992, Sipkin and Needham, 1993, Sipkin and Needham, 1994a and Sipkin and Needham, 1994b; Sipkin and Zirbes, 1996 and Sipkin and Zirbes, 1997; Sipkin et al., 1998, Sipkin et al., 1999, Sipkin et al., 2000a, Sipkin et al., 2000b and Sipkin et al., 2002).The entire USGS moment-tensor catalog can be obtained via anonymous FTP at ftp://ghtftp.cr.usgs.gov. After logging on, change directory to “momten”. This directory contains two compressed ASCII files that contain the finalized solutions, “mt.lis.Z” and “fmech.lis.Z”. “mt.lis.Z” contains the elements of the moment tensors along with detailed event information; “fmech.lis.Z” contains the decompositions into the principal axes and best double-couples. The fast moment-tensor solutions for more recent events that have not yet been finalized and added to the catalog, are gathered by month in the files “jan01.lis.Z”, etc. “fmech.doc.Z” describes the various fields.
Optimizing nanodiscs and bicelles for solution NMR studies of two β-barrel membrane proteins
International Nuclear Information System (INIS)
Kucharska, Iga; Edrington, Thomas C.; Liang, Binyong; Tamm, Lukas K.
2015-01-01
Solution NMR spectroscopy has become a robust method to determine structures and explore the dynamics of integral membrane proteins. The vast majority of previous studies on membrane proteins by solution NMR have been conducted in lipid micelles. Contrary to the lipids that form a lipid bilayer in biological membranes, micellar lipids typically contain only a single hydrocarbon chain or two chains that are too short to form a bilayer. Therefore, there is a need to explore alternative more bilayer-like media to mimic the natural environment of membrane proteins. Lipid bicelles and lipid nanodiscs have emerged as two alternative membrane mimetics that are compatible with solution NMR spectroscopy. Here, we have conducted a comprehensive comparison of the physical and spectroscopic behavior of two outer membrane proteins from Pseudomonas aeruginosa, OprG and OprH, in lipid micelles, bicelles, and nanodiscs of five different sizes. Bicelles stabilized with a fraction of negatively charged lipids yielded spectra of almost comparable quality as in the best micellar solutions and the secondary structures were found to be almost indistinguishable in the two environments. Of the five nanodiscs tested, nanodiscs assembled from MSP1D1ΔH5 performed the best with both proteins in terms of sample stability and spectral resolution. Even in these optimal nanodiscs some broad signals from the membrane embedded barrel were severely overlapped with sharp signals from the flexible loops making their assignments difficult. A mutant OprH that had two of the flexible loops truncated yielded very promising spectra for further structural and dynamical analysis in MSP1D1ΔH5 nanodiscs
Optimization of ultrasonic arrays design and setting using a differential evolution
International Nuclear Information System (INIS)
Puel, B.; Chatillon, S.; Calmon, P.; Lesselier, D.
2011-01-01
Optimizing both the design and the settings of phased arrays can be difficult when performed manually via parametric studies. An optimization method based on an evolutionary algorithm and numerical simulation is proposed and evaluated. The Randomized Adaptive Differential Evolution has been adapted to meet the specificities of non-destructive testing applications. In particular, multi-objective problems are addressed through the implementation of the concept of Pareto-optimal sets of solutions. The algorithm has been implemented and connected to the ultrasonic simulation modules of the CIVA software, used as the forward model. The efficiency of the method is illustrated on two realistic cases of application: optimization of the position and delay laws of a flexible array inspecting a nozzle, treated as a mono-objective problem; and optimization of the design of a surrounded array and its delay laws, treated as a constrained bi-objective problem. (authors)
Optimization of constrained multiple-objective reliability problems using evolutionary algorithms
International Nuclear Information System (INIS)
Salazar, Daniel; Rocco, Claudio M.; Galvan, Blas J.
2006-01-01
This paper illustrates the use of multi-objective optimization to solve three types of reliability optimization problems: to find the optimal number of redundant components, find the reliability of components, and determine both their redundancy and reliability. In general, these problems have been formulated as single objective mixed-integer non-linear programming problems with one or several constraints and solved by using mathematical programming techniques or special heuristics. In this work, these problems are reformulated as multiple-objective problems (MOP) and then solved by using a second-generation Multiple-Objective Evolutionary Algorithm (MOEA) that allows handling constraints. The MOEA used in this paper (NSGA-II) demonstrates the ability to identify a set of optimal solutions (Pareto front), which provides the Decision Maker with a complete picture of the optimal solution space. Finally, the advantages of both MOP and MOEA approaches are illustrated by solving four redundancy problems taken from the literature
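The core ranking step of NSGA-II, non-dominated sorting, reduces to repeatedly extracting the Pareto front; a minimal sketch for minimization problems (the design points below are invented for illustration):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse everywhere, better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset of points -- the ranking step that NSGA-II
    applies repeatedly to peel off successive fronts."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# invented (cost, unreliability) pairs for candidate redundancy allocations
designs = [(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)]
front = pareto_front(designs)
```

The full NSGA-II additionally ranks the remaining points into second, third, etc. fronts and uses a crowding-distance measure to keep the retained front evenly spread, which is what gives the Decision Maker the complete picture mentioned above.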
Optimization of constrained multiple-objective reliability problems using evolutionary algorithms
Energy Technology Data Exchange (ETDEWEB)
Salazar, Daniel [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain) and Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: danielsalazaraponte@gmail.com; Rocco, Claudio M. [Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: crocco@reacciun.ve; Galvan, Blas J. [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain)]. E-mail: bgalvan@step.es
2006-09-15
This paper illustrates the use of multi-objective optimization to solve three types of reliability optimization problems: to find the optimal number of redundant components, find the reliability of components, and determine both their redundancy and reliability. In general, these problems have been formulated as single objective mixed-integer non-linear programming problems with one or several constraints and solved by using mathematical programming techniques or special heuristics. In this work, these problems are reformulated as multiple-objective problems (MOP) and then solved by using a second-generation Multiple-Objective Evolutionary Algorithm (MOEA) that allows handling constraints. The MOEA used in this paper (NSGA-II) demonstrates the ability to identify a set of optimal solutions (Pareto front), which provides the Decision Maker with a complete picture of the optimal solution space. Finally, the advantages of both MOP and MOEA approaches are illustrated by solving four redundancy problems taken from the literature.
GAO Hongying; WU Kangping
2007-01-01
This paper estimates the Pareto exponent of the city size distribution (by population size and by economic size) for all provinces and three regions of China in 1997, 2000 and 2003 by OLS, compares the Pareto exponents across sections and over time, and empirically analyzes the factors that influence the provincial Pareto exponents. Our analyses show that the size distributions of cities in China follow the Pareto distribution and exhibit structural features. Variations in the value of the P...
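The OLS estimation of a Pareto (Zipf) exponent from city sizes, as used in the abstract above, regresses log-rank on log-size. A minimal sketch on synthetic data (the city sizes below are illustrative, not the Chinese data):

```python
import math

def pareto_exponent_ols(sizes):
    """Estimate the Pareto (Zipf) exponent alpha by OLS on the rank-size rule
    log(rank) = c - alpha * log(size); returns alpha (negated slope)."""
    xs = sorted(sizes, reverse=True)
    logs = [(math.log(s), math.log(rank)) for rank, s in enumerate(xs, start=1)]
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    sxy = sum((x - mx) * (y - my) for x, y in logs)
    sxx = sum((x - mx) ** 2 for x, _ in logs)
    return -sxy / sxx

# Synthetic Zipf-like city sizes: size proportional to 1/rank, i.e. alpha = 1.
cities = [1000.0 / r for r in range(1, 101)]
print(round(pareto_exponent_ols(cities), 3))  # prints 1.0
```

A common refinement subtracts 1/2 from each rank (the Gabaix-Ibragimov correction) to reduce small-sample bias; the plain regression above matches the OLS approach named in the abstract.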
Directory of Open Access Journals (Sweden)
José Raúl Castro
2016-02-01
Full Text Available This paper presents an efficient algorithm to solve the multi-objective (MO) voltage control problem in distribution networks. The proposed algorithm minimizes the following three objectives: voltage variation on pilot buses, reactive power production ratio deviation, and generator voltage deviation. This work leverages two optimization techniques: fuzzy logic, to find the optimum value of the reactive power of the distributed generation (DG), and Pareto optimization, to find the optimal value of the pilot bus voltage so as to reduce losses while keeping the voltage within established limits. Variable loads and DGs are taken into account. The algorithm is tested on an IEEE 13-node test feeder, and the results show the effectiveness of the proposed model.
Nazemizadeh, M.; Rahimi, H. N.; Amini Khoiy, K.
2012-03-01
This paper presents an optimal control strategy for trajectory planning of mobile robots, considering the nonlinear dynamic model and nonholonomic constraints of the system. The nonholonomic constraints are introduced by a nonintegrable set of differential equations representing kinematic restrictions on the motion. Lagrange's principle is employed to derive the nonlinear equations of the system. The optimal path planning of the mobile robot is then formulated as an optimal control problem. To set up the problem, the nonlinear equations of the system are taken as constraints, and a minimum-energy objective function is defined. To solve the problem, an indirect solution of the optimal control method is employed, and the conditions of optimality are derived as a set of coupled nonlinear differential equations. The optimality equations are solved numerically, and various simulations are performed for a nonholonomic mobile robot to illustrate the effectiveness of the proposed method.
Directory of Open Access Journals (Sweden)
Xiang Yu
2016-06-01
Full Text Available Optimal operation of hydropower reservoir systems often needs to optimize multiple conflicting objectives simultaneously. The conflicting objectives result in a Pareto front, which is a set of non-dominated solutions; non-dominated solutions cannot outperform one another on all objectives. An optimization framework based on the multi-swarm comprehensive learning particle swarm optimization algorithm is proposed to solve the multi-objective operation of hydropower reservoir systems. By adopting search techniques such as decomposition, mutation and differential evolution, the algorithm aims to derive, in a single run, multiple non-dominated solutions reasonably distributed over the true Pareto front, thereby facilitating determination of the final tradeoff. The long-term sustainable planning of the Three Gorges cascaded hydropower system, consisting of the Three Gorges Dam and the Gezhouba Dam on the Yangtze River in China, is studied. Two conflicting objectives, i.e., maximizing hydropower generation and minimizing deviation from the outflow lower target in order to realize the system's economic, environmental and social benefits during the drought season, are optimized simultaneously. Experimental results demonstrate that, for the case studied, the optimization framework robustly derives multiple feasible non-dominated solutions with satisfactory convergence, diversity and extremity in a single run.
Optimal solutions for the evolution of a social obesity epidemic model
Sikander, Waseem; Khan, Umar; Mohyud-Din, Syed Tauseef
2017-06-01
In this work, a novel modification of the traditional homotopy perturbation method (HPM) is proposed by embedding an auxiliary parameter in the boundary condition. The scheme is used to carry out a mathematical evaluation of the social obesity epidemic model. The incidence of excess weight and obesity in the adult population, and the prediction of its behavior in the coming years, are analyzed using the modified algorithm. The proposed method increases the convergence of the approximate analytical solution over the domain of the problem. Furthermore, a convenient way of choosing an optimal value of the auxiliary parameter, by minimizing the total residual error, is considered. The graphical comparison of the obtained results with the standard HPM explicitly reveals the accuracy and efficiency of the developed scheme.
Directory of Open Access Journals (Sweden)
Stefanos Georganos
2018-02-01
Full Text Available In object-based image analysis (OBIA), the appropriate parametrization of segmentation algorithms is crucial for obtaining satisfactory image classification results. One way this can be done is by unsupervised segmentation parameter optimization (USPO). A popular USPO method does this through the optimization of a “global score” (GS), which minimizes intrasegment heterogeneity and maximizes intersegment heterogeneity. However, the calculated GS values are sensitive to the minimum and maximum ranges of the candidate segmentations. Previous research proposed the use of fixed minimum/maximum threshold values for the intrasegment/intersegment heterogeneity measures to deal with this sensitivity to user-defined ranges, but the performance of this approach has not been investigated in detail. In the context of a very-high-resolution remote sensing urban application, we show the limitations of the fixed-threshold approach, both theoretically and in an applied setting, and instead propose a novel solution that identifies the range of candidate segmentations using local regression trend analysis. We found that the proposed approach shows significant improvements over the use of fixed minimum/maximum values, is less subjective than user-defined threshold values and, thus, can be of merit for a fully automated procedure and big data applications.
Hybrid solution and pump-storage optimization in water supply system efficiency: A case study
International Nuclear Information System (INIS)
Vieira, F.; Ramos, H.M.
2008-01-01
Environmental targets and energy saving have become two of the world's main concerns in recent years, and they will become still more important in the near future. The world population growth rate is the major factor contributing to the increase in global pollution and in energy and water consumption. In 2005 the world population was approximately 6.5 billion, and this number is expected to reach 9 billion by 2050 [United Nations, 2008. (www.un.org), accessed on July]. Water supply systems use energy for pumping water, so new strategies must be developed and implemented to reduce this consumption. In addition, if there is an excess of hydraulic energy in a water system, some type of water power generation can be implemented. This paper presents an optimization model that determines the best hourly operation for one day, according to the electricity tariff, for a pumped storage system with water consumption and inlet discharge. Wind turbines are introduced into the system. The rules obtained as output of the optimization process are subsequently introduced into a hydraulic simulator in order to verify the system behaviour. A comparison with the normal water supply operating mode is made, and the energy cost savings of this hybrid solution are calculated.
Solving multiobjective optimal reactive power dispatch using modified NSGA-II
Energy Technology Data Exchange (ETDEWEB)
Jeyadevi, S.; Baskar, S.; Babulal, C.K.; Willjuice Iruthayarajan, M. [Department of Electrical and Electronics Engineering, Thiagarajar College of Engineering, Madurai, Tamilnadu 625 015 (India)
2011-02-15
This paper addresses an application of modified NSGA-II (MNSGA-II), which incorporates controlled elitism and dynamic crowding distance (DCD) strategies into NSGA-II, to the multiobjective optimal reactive power dispatch (ORPD) problem of minimizing real power loss and maximizing system voltage stability. To validate the Pareto front obtained using MNSGA-II, a reference Pareto front is generated using multiple runs of single-objective optimization with a weighted sum of objectives. For simulation purposes, the IEEE 30 and IEEE 118 bus test systems are considered. The performance of the MNSGA-II, NSGA-II and multiobjective particle swarm optimization (MOPSO) approaches is compared with respect to multiobjective performance measures. The TOPSIS technique is applied to the obtained non-dominated solutions to determine the best compromise solution (BCS). Karush-Kuhn-Tucker (KKT) conditions are also applied to the obtained non-dominated solutions to substantiate the claim of optimality. Simulation results are quite promising: MNSGA-II performs better than NSGA-II in maintaining diversity, authenticating its potential to solve multiobjective ORPD effectively. (author)
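The crowding-distance measure that NSGA-II uses to maintain diversity, and which the dynamic variant above modifies, can be sketched in its standard textbook form (this is the base formula, not the paper's DCD; the objective vectors are illustrative):

```python
def crowding_distance(front):
    """Standard NSGA-II crowding distance for a list of objective vectors.
    Boundary points get infinite distance; interior points accumulate the
    normalized gap between their neighbors along each objective."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue  # degenerate objective: all points identical
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k]
                               - front[order[j - 1]][k]) / (hi - lo)
    return dist

front = [(0.0, 1.0), (0.25, 0.6), (0.5, 0.4), (1.0, 0.0)]
print(crowding_distance(front))
```

Selection then prefers larger crowding distance among equally ranked solutions, which spreads the population along the front.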
Dictatorship, liberalism and the Pareto rule: Possible and impossible
Directory of Open Access Journals (Sweden)
Boričić Branislav
2009-01-01
Full Text Available The current economic crisis has shaken belief in the capacity of neoliberal 'free market' policies. Numerous supporters of state intervention have emerged, and interest in social choice theory has revived. In this paper we consider three standard properties for aggregating individual preferences into social preferences: dictatorship, liberalism and the Pareto rule, together with their formal negations. The context of pure first-order classical logic makes it possible to show how certain combinations of the above-mentioned conditions, under the hypothesis of unrestricted domain, form simple and reasonable examples of possible or impossible social choice systems. Owing to their simplicity, these examples, including the famous 'liberal paradox', could have particular didactic value.
Pareto analysis of critical factors affecting technical institution evaluation
Directory of Open Access Journals (Sweden)
Victor Gambhir
2012-08-01
Full Text Available With the change of education policy in 1991, more and more technical institutions are being set up in India. Some of these institutions provide quality education, but others merely concentrate on quantity. Stakeholders are left confused when deciding which institute to select for higher educational studies. Although various agencies, including the print media, publish rankings of these institutions every year, their results are controversial and biased. In this paper, the authors endeavour to identify, from a literature survey, the critical factors for technical institution evaluation. A Pareto analysis has also been performed to find the intensity of these critical factors in the evaluation. This will not only help stakeholders take the right decisions but will also help the management of institutions in benchmarking, by identifying the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
Origin of Pareto-like spatial distributions in ecosystems.
Manor, Alon; Shnerb, Nadav M
2008-12-31
Recent studies of cluster distribution in various ecosystems revealed Pareto statistics for the size of spatial colonies. These results were supported by cellular automata simulations that yield robust criticality for endogenous pattern formation based on positive feedback. We show that these patch statistics are a manifestation of the law of proportionate effect. Mapping the stochastic model to a Markov birth-death process, the transition rates are shown to scale linearly with cluster size. This mapping provides a connection between patch statistics and the dynamics of the ecosystem; the "first passage time" for different colonies emerges as a powerful tool that discriminates between endogenous and exogenous clustering mechanisms. Imminent catastrophic shifts (such as desertification) manifest themselves in a drastic change in the stability properties of spatial colonies.
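The law of proportionate effect invoked above — growth increments proportional to current size, so log-sizes perform a random walk — can be illustrated with a minimal simulation. This is a generic sketch of the mechanism, not the paper's birth-death model; the cluster count, step count and volatility are assumed values:

```python
import random

def proportionate_growth(n_clusters=500, steps=200, seed=1):
    """Law of proportionate effect: each cluster's size is multiplied by an
    i.i.d. random factor every step, independent of its current size.  The
    log-sizes then random-walk apart, producing a heavy-tailed
    (lognormal/Pareto-like) size distribution."""
    rng = random.Random(seed)
    sizes = [1.0] * n_clusters
    for _ in range(steps):
        sizes = [s * rng.lognormvariate(0.0, 0.3) for s in sizes]
    return sizes

sizes = sorted(proportionate_growth())
median, largest = sizes[len(sizes) // 2], sizes[-1]
print(round(largest / median, 1))  # the top cluster dwarfs the typical one
```

By contrast, additive (size-independent) growth would keep the distribution narrow; the multiplicative rule is what generates the Pareto-like tail.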
Optimizing the recovery of copper from electroplating rinse bath solution by hollow fiber membrane.
Oskay, Kürşad Oğuz; Kul, Mehmet
2015-01-01
This study aimed to recover and remove copper from an industrial model wastewater solution by non-dispersive solvent extraction (NDSX). Two mathematical models were developed, using the response surface method, to simulate the performance of an integrated extraction-stripping process based on hollow fiber contactors. The models allow one to predict the time-dependent efficiencies of the two phases involved in the individual extraction or stripping processes. The optimal recovery parameters were determined by central composite design (CCD) as 227 g/L H2SO4 concentration, a 1.22 feed/strip ratio, 450 mL/min flow rate (115.9 cm/min flow velocity) and 15 vol% LIX 84-I concentration in 270 min. At these optimum conditions, the experimental recovery efficiency was 95.88%, in close agreement with the 97.75% predicted by the model. At the end of the process, almost all the copper in the model wastewater solution was removed and recovered as CuSO4·5H2O salt, which can be reused in the copper electroplating industry.
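The response-surface workflow behind a central composite design — fit a second-order model to a small set of designed runs, then locate its stationary point — can be sketched as follows. The design points and response function below are synthetic, chosen with a known optimum; they are not the copper-recovery data:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Six coded design points (a corner-plus-axial subset of a CCD) and a
# synthetic quadratic response with a known optimum at (0.5, -0.25).
pts = [(-1, -1), (1, -1), (-1, 1), (1, 1), (1.41, 0), (0, 1.41)]
def response(x1, x2):
    return 90 - 4 * (x1 - 0.5) ** 2 - 2 * (x2 + 0.25) ** 2

# Fit the full second-order model y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
A = [[1, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in pts]
b0, b1, b2, b11, b22, b12 = solve(A, [response(*p) for p in pts])
# Stationary point of the fitted surface (the cross term b12 is ~0 here,
# so the two coded factors decouple).
x1_opt, x2_opt = -b1 / (2 * b11), -b2 / (2 * b22)
print(round(x1_opt, 3), round(x2_opt, 3))
```

In a real CCD the fit would use least squares over replicated runs rather than exact interpolation, but the recovery of the stationary point works the same way.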
Optimization of strontium adsorption from aqueous solution using (mn-Zr) oxide-pan composite spheres
International Nuclear Information System (INIS)
Inan, S.; Altas, Y.
2009-01-01
Processes based on adsorption and ion exchange play a major role in the pre-concentration and separation of toxic, long-lived radionuclides from liquid waste. In nuclear waste management, the removal of long-lived, radiotoxic isotopes such as strontium from radioactive waste reduces storage problems and facilitates disposal of the waste. Depending on the waste type, a variety of adsorbents and/or ion exchangers are used. Owing to their amorphous structure, hydrous oxides and their mixtures do not have reproducible properties. Moreover, the powders obtained are very fine particles that can cause operational problems such as pressure drop and filtration difficulties, so they are not suitable for column applications. These reasons have recently motivated studies on the preparation of organic-inorganic composite adsorbent beads for industrial applications. PAN, as a stable and porous support for fine particles, enables the use of ion exchangers in large-scale column applications. The use of PAN as a support material for many inorganic ion exchangers was first achieved by Sebesta in the early 1990s. Later on, PAN-based composite ion exchangers were prepared and used for the removal of radionuclides and heavy metal ions from aqueous solutions and waste waters. In this study, spherical (Mn-Zr)oxide-PAN composites were prepared for the separation of strontium from aqueous solution over a wide pH range. The Sr2+ adsorption of the composite adsorbent was optimized using a central composite experimental design.
Solving Multiobjective Optimization Problems Using Artificial Bee Colony Algorithm
Directory of Open Access Journals (Sweden)
Wenping Zou
2011-01-01
Full Text Available Multiobjective optimization has been a difficult problem and a research focus in science and engineering. This paper presents a novel algorithm based on the artificial bee colony (ABC) to deal with multi-objective optimization problems. ABC is one of the most recently introduced algorithms based on the intelligent foraging behavior of a honey bee swarm. It uses fewer control parameters, and it can be efficiently used for solving multimodal and multidimensional optimization problems. Our algorithm uses the concept of Pareto dominance to determine the flight direction of a bee, and it maintains the nondominated solution vectors found so far in an external archive. The proposed algorithm is validated on standard test problems, and simulation results show that the proposed approach is highly competitive and can be considered a viable alternative for solving multi-objective optimization problems.
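The Pareto-dominance test and the external nondominated archive mentioned above are generic building blocks shared by such swarm and bee algorithms. A minimal sketch for two minimized objectives (the candidate points are illustrative):

```python
def dominates(a, b):
    """a dominates b (minimization): no worse in every objective,
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def update_archive(archive, candidate):
    """External nondominated archive update: reject the candidate if any
    archived point dominates it; otherwise add it and evict every archived
    point it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]

archive = []
for p in [(3, 5), (4, 4), (2, 6), (3, 3), (5, 1)]:
    archive = update_archive(archive, p)
print(archive)  # → [(2, 6), (3, 3), (5, 1)]
```

In a full MOEA the archive is usually also bounded in size, with crowding or grid measures deciding which nondominated members to drop; that pruning step is omitted here.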
Efficiency enhancement of a gas turbine cycle using an optimized tubular recuperative heat exchanger
International Nuclear Information System (INIS)
Sayyaadi, Hoseyn; Mehrabipour, Reza
2012-01-01
A simple gas turbine cycle, namely the Kraftwerk Union AG unit comprising a Siemens V93.1 gas turbine with 60 MW nominal power and 26.0% thermal efficiency, utilized in the Fars power plant, is considered for efficiency enhancement. A typical tubular vertical recuperative heat exchanger is designed to be integrated into the cycle as an air pre-heater for thermal efficiency improvement. Thermal and geometric specifications of the recuperative heat exchanger are obtained in a multi-objective optimization process. The exergetic efficiency of the gas cycle is maximized while the payback time for the capital investment of the recuperator is minimized. The combination of these objectives and decision variables with suitable engineering and physical constraints forms a mixed-integer nonlinear programming (MINLP) optimization problem. The optimization is performed using the NSGA-II algorithm, and Pareto optimal frontiers are obtained for three cases: the minimum, average and maximum ambient air temperatures. In each case, the final optimal solution is selected using three decision-making approaches: the fuzzy Bellman-Zadeh, LINMAP and TOPSIS methods. It is shown that the TOPSIS and LINMAP decision-makers, when applied to the Pareto frontier obtained at the average ambient air temperature, yield the best results. -- Highlights: ► A simple Brayton gas cycle is considered for efficiency improvement by integrating a recuperator. ► Objective functions based on thermodynamic and economic analysis are obtained. ► The payback time for the capital investment is minimized and the exergetic efficiency of the system is maximized. ► Pareto optimal frontiers at various site conditions are obtained. ► A final optimal configuration is found using various decision-making approaches.
Multiobjective Optimization of a Counterrotating Type Pump-Turbine Unit Operated at Turbine Mode
Directory of Open Access Journals (Sweden)
Jin-Hyuk Kim
2014-05-01
Full Text Available A multiobjective optimization for improving the turbine output and efficiency of a counterrotating type pump-turbine unit operated at turbine mode was carried out in this work. The blade geometry of both the runners was optimized using a hybrid multiobjective evolutionary algorithm coupled with a surrogate model. Three-dimensional Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model were discretized by finite volume approximations and solved on hexahedral grids to analyze the flow in the pump-turbine unit. As major hydrodynamic performance parameters, the turbine output and efficiency were selected as objective functions with two design variables related to the hub profiles of both the runner blades. These objectives were numerically assessed at twelve design points selected by Latin hypercube sampling in the design space. Response surface approximation models for the objectives were constructed based on the objective function values at the design points. A fast nondominated sorting genetic algorithm for the local search coupled with the response surface approximation models was applied to determine the global Pareto-optimal solutions. The trade-off between the two objectives was determined and described with respect to the Pareto-optimal solutions. The results of this work showed that the turbine outputs and efficiencies of optimized pump-turbine units were simultaneously improved in comparison to the reference unit.
Improved multi-objective clustering algorithm using particle swarm optimization.
Directory of Open Access Journals (Sweden)
Congcong Gong
Full Text Available Multi-objective clustering has received widespread attention recently, as it can obtain more accurate and reasonable solutions. In this paper, an improved multi-objective clustering framework using particle swarm optimization (IMCPSO) is proposed. Firstly, a novel particle representation for the clustering problem is designed to help PSO search for clustering solutions in continuous space. Secondly, the distribution of the Pareto set is analyzed. The analysis results are applied to the leader selection strategy, helping the algorithm avoid becoming trapped in local optima. Moreover, a method for improving clustering solutions is proposed, which greatly increases the efficiency of searching for clustering solutions. In the experiments, 28 datasets are used and nine state-of-the-art clustering algorithms are compared; the proposed method is superior to the other approaches in the ARI evaluation index.
Improved multi-objective clustering algorithm using particle swarm optimization.
Gong, Congcong; Chen, Haisong; He, Weixiong; Zhang, Zhanliang
2017-01-01
Multi-objective clustering has received widespread attention recently, as it can obtain more accurate and reasonable solutions. In this paper, an improved multi-objective clustering framework using particle swarm optimization (IMCPSO) is proposed. Firstly, a novel particle representation for the clustering problem is designed to help PSO search for clustering solutions in continuous space. Secondly, the distribution of the Pareto set is analyzed. The analysis results are applied to the leader selection strategy, helping the algorithm avoid becoming trapped in local optima. Moreover, a method for improving clustering solutions is proposed, which greatly increases the efficiency of searching for clustering solutions. In the experiments, 28 datasets are used and nine state-of-the-art clustering algorithms are compared; the proposed method is superior to the other approaches in the ARI evaluation index.
International Nuclear Information System (INIS)
Salari, Ehsan; Craft, David; Wala, Jeremiah
2012-01-01
To formulate and solve the fluence-map merging procedure of the recently-published VMAT treatment-plan optimization method, called vmerge, as a bi-criteria optimization problem. Using an exact merging method rather than the previously-used heuristic, we are able to better characterize the trade-off between the delivery efficiency and dose quality. vmerge begins with a solution of the fluence-map optimization problem with 180 equi-spaced beams that yields the ‘ideal’ dose distribution. Neighboring fluence maps are then successively merged, meaning that they are added together and delivered as a single map. The merging process improves the delivery efficiency at the expense of deviating from the initial high-quality dose distribution. We replace the original merging heuristic by considering the merging problem as a discrete bi-criteria optimization problem with the objectives of maximizing the treatment efficiency and minimizing the deviation from the ideal dose. We formulate this using a network-flow model that represents the merging problem. Since the problem is discrete and thus non-convex, we employ a customized box algorithm to characterize the Pareto frontier. The Pareto frontier is then used as a benchmark to evaluate the performance of the standard vmerge algorithm as well as two other similar heuristics. We test the exact and heuristic merging approaches on a pancreas and a prostate cancer case. For both cases, the shape of the Pareto frontier suggests that starting from a high-quality plan, we can obtain efficient VMAT plans through merging neighboring fluence maps without substantially deviating from the initial dose distribution. The trade-off curves obtained by the various heuristics are contrasted and shown to all be equally capable of initial plan simplifications, but to deviate in quality for more drastic efficiency improvements. This work presents a network optimization approach to the merging problem. Contrasting the trade-off curves of the
Salari, Ehsan; Wala, Jeremiah; Craft, David
2012-09-07
To formulate and solve the fluence-map merging procedure of the recently-published VMAT treatment-plan optimization method, called VMERGE, as a bi-criteria optimization problem. Using an exact merging method rather than the previously-used heuristic, we are able to better characterize the trade-off between the delivery efficiency and dose quality. VMERGE begins with a solution of the fluence-map optimization problem with 180 equi-spaced beams that yields the 'ideal' dose distribution. Neighboring fluence maps are then successively merged, meaning that they are added together and delivered as a single map. The merging process improves the delivery efficiency at the expense of deviating from the initial high-quality dose distribution. We replace the original merging heuristic by considering the merging problem as a discrete bi-criteria optimization problem with the objectives of maximizing the treatment efficiency and minimizing the deviation from the ideal dose. We formulate this using a network-flow model that represents the merging problem. Since the problem is discrete and thus non-convex, we employ a customized box algorithm to characterize the Pareto frontier. The Pareto frontier is then used as a benchmark to evaluate the performance of the standard VMERGE algorithm as well as two other similar heuristics. We test the exact and heuristic merging approaches on a pancreas and a prostate cancer case. For both cases, the shape of the Pareto frontier suggests that starting from a high-quality plan, we can obtain efficient VMAT plans through merging neighboring fluence maps without substantially deviating from the initial dose distribution. The trade-off curves obtained by the various heuristics are contrasted and shown to all be equally capable of initial plan simplifications, but to deviate in quality for more drastic efficiency improvements. This work presents a network optimization approach to the merging problem. Contrasting the trade-off curves of the merging
Design of a centrifugal compressor impeller using multi-objective optimization algorithm
International Nuclear Information System (INIS)
Kim, Jin Hyuk; Husain, Afzal; Kim, Kwang Yong; Choi, Jae Ho
2009-01-01
This paper presents a design optimization of a centrifugal compressor impeller with a hybrid multi-objective evolutionary algorithm (hybrid MOEA). Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model are discretized by finite volume approximations and solved on hexahedral grids for the flow analyses. Two objectives, i.e., isentropic efficiency and total pressure ratio, are selected, with four design variables defining the impeller hub and shroud contours in the meridional plane. The Non-dominated Sorting Genetic Algorithm (NSGA-II) with an ε-constraint strategy for local search, coupled with a Radial Basis Neural Network model, is used for the multi-objective optimization. The optimization results show that the isentropic efficiencies and total pressure ratios of the five cluster points among the Pareto-optimal solutions are enhanced by the multi-objective optimization.
Design of a centrifugal compressor impeller using multi-objective optimization algorithm
Energy Technology Data Exchange (ETDEWEB)
Kim, Jin Hyuk; Husain, Afzal; Kim, Kwang Yong [Inha University, Incheon (Korea, Republic of); Choi, Jae Ho [Samsung Techwin Co., Ltd., Changwon (Korea, Republic of)
2009-07-01
This paper presents a design optimization of a centrifugal compressor impeller with a hybrid multi-objective evolutionary algorithm (hybrid MOEA). Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model are discretized by finite volume approximations and solved on hexahedral grids for the flow analyses. Two objectives, i.e., isentropic efficiency and total pressure ratio, are selected, with four design variables defining the impeller hub and shroud contours in the meridional plane. The Non-dominated Sorting Genetic Algorithm (NSGA-II) with an ε-constraint strategy for local search, coupled with a Radial Basis Neural Network model, is used for the multi-objective optimization. The optimization results show that the isentropic efficiencies and total pressure ratios of the five cluster points among the Pareto-optimal solutions are enhanced by the multi-objective optimization.
International Nuclear Information System (INIS)
Shao, Wei; Cui, Zheng; Cheng, Lin
2017-01-01
Highlights: • A multi-objective optimization model of the air distribution of a grate cooler by genetic algorithm is proposed. • Optimal air distributions for different conditions are obtained and validated by measurements. • The most economic average diameter of clinker particles is 0.02 m. • The most economic number of air chambers is 9. - Abstract: The paper proposes a multi-objective optimization model of the cooling air distribution of a grate cooler in a cement plant, based on the convective heat transfer principle and entropy generation minimization analysis. The heat transfer and flow models of the clinker cooling process are presented first. Then the modified entropy generation numbers caused by heat transfer and by viscous dissipation are taken as the objective functions, which are optimized simultaneously by a genetic algorithm. The design variables are the superficial velocities of the air chambers and the thicknesses of the clinker layer on the different grate plates. The model is verified by a set of Pareto optimal solutions and the scattered distributions of the design variables. Sensitivity analyses of the average diameter of the clinker particles and the number of air chambers are carried out based on the optimization model. The optimal cooling air distributions are compared in terms of heat recovered, energy consumption of the cooling fans and heat efficiency of the grate cooler, all selected from the Pareto optimal solutions on the basis of minimizing the energy consumption of the cooling fans. The results show that the most effective and economic average diameter of the clinker particles is 0.02 m and the optimal number of air chambers is 9.
Walkowiak-Tomczak, Dorota; Czapski, Janusz; Młynarczyk, Karolina
2016-01-01
Elderberries are a source of dietary supplements and bioactive compounds, such as anthocyanins; these dyes are used in food technology. The aim of the study was to assess the changes in colour parameters, anthocyanin contents and sensory attributes of elderberry juice concentrate solutions during storage in a model system, and to determine the predictability of the sensory attributes of colour in the solutions from regression equations using the response surface methodology. The experiment was carried out according to a 3-level factorial design for three factors. The independent variables were pH, storage time and temperature. The dependent variables were the components and colour parameters in the CIE L*a*b* system, pigment contents and sensory attributes. Changes in the colour components X, Y, Z and the colour parameters L*, a*, b*, C* and h* depended most strongly on pH. Colour lightness L* and tone h* increased with an increase in the experimental factors, while the share of the red colour a* and colour saturation C* decreased. The greatest effect on the anthocyanin concentration was that of storage time. Sensory attributes deteriorated during storage. The highest correlation coefficients were found between the colour tone h* and anthocyanin contents on the one hand and the assessed naturalness and desirability of colour on the other. A high goodness-of-fit of the model to the data and high values of R2 for the regression equations were obtained for all responses. The response surface method facilitates optimization of the experimental factor values in order to obtain a specific attribute of the product, though not in all cases of the experiment. Within the tested range of factors, it is possible, on the basis of the lack-of-fit test, to predict changes in the anthocyanin content and the sensory attributes of elderberry juice concentrate solutions as a food dye. The highest stability of the dyes and colour of the elderberry solutions was found in the samples at pH 3.0, which confirms
International Nuclear Information System (INIS)
Toffolo, A.; Lazzaretto, A.
2002-01-01
Thermoeconomic analyses in thermal system design are always focused on the economic objective. However, knowledge of only the economic minimum may not be sufficient in the decision-making process, since solutions with a higher thermodynamic efficiency, in spite of small increases in total cost, may result in much more attractive designs if energy market prices or energy policies change. This paper suggests how to perform a multi-objective optimization in order to find solutions that simultaneously satisfy exergetic and economic objectives. This corresponds to a search for the set of Pareto optimal solutions with respect to the two competing objectives. The optimization is carried out by an evolutionary algorithm that features a new diversity-preserving mechanism, using the well-known CGAM problem as a test case. (author)
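The Pareto optimal set sought here can be illustrated with a minimal dominance filter. The (cost, exergy-loss) pairs below are invented, and both objectives are treated as minimized; this is a generic sketch, not the paper's evolutionary algorithm.

```python
def dominates(p, q):
    """p dominates q when p is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# invented (total cost, exergy loss) pairs for four candidate designs
designs = [(100.0, 0.40), (120.0, 0.30), (110.0, 0.35), (130.0, 0.45)]
front = pareto_front(designs)
# (130.0, 0.45) is dominated by (100.0, 0.40); the other three form the front
```

An evolutionary algorithm applies the same dominance test generation after generation, with a diversity-preserving mechanism keeping the surviving points spread along the front rather than clustered.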
Czech Academy of Sciences Publication Activity Database
Haslinger, J.; Outrata, Jiří; Pathó, R.
2012-01-01
Roč. 20, č. 1 (2012), s. 31-59 ISSN 1877-0533 R&D Projects: GA AV ČR IAA100750802 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : shape optimization * Signorini problem * model with given friction * solution-dependent coefficient of friction * mathematical programs with equilibrium constraints Subject RIV: BA - General Mathematics Impact factor: 1.036, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/outrata-shape optimization in 2d contact problems with given friction and a solution-dependent coefficient of friction .pdf
Directory of Open Access Journals (Sweden)
Nadjla Hariri
2013-03-01
Full Text Available This study aimed to determine the status of the features of Persian professional web social networks and to provide a suitable solution for the optimization of these networks in Iran. The research methods were library research and the evaluative method, and the study population consisted of 10 Persian professional web social networks. For data collection, a checklist of important social network tools and features was used. According to the results, “Cloob”, “IR Experts” and “Doreh” were the networks most compatible with the social network criteria. Finally, solutions were presented for optimizing the capabilities of Persian professional web social networks.
Zhou, Bao-Rong; Liu, Si-Liang; Zhang, Yong-Jun; Yi, Ying-Qi; Lin, Xiao-Ming
2017-05-01
To mitigate the impact on distribution networks caused by the stochastic characteristics and high penetration of photovoltaics, a multi-objective optimal power flow model is proposed in this paper. The regulation capabilities of capacitors, photovoltaic inverters and energy storage systems embedded in the active distribution network are considered in this model to minimize the expected value of active power loss and the probability of voltage violation. Firstly, a probabilistic power flow based on the cumulant method is introduced to calculate the values of the objectives. Secondly, the NSGA-II algorithm is adopted to obtain the Pareto optimal solutions. Finally, the best compromise solution is selected through the fuzzy membership degree method. Multi-objective optimization of the IEEE 34-node distribution network shows that the model can effectively improve the voltage security and economy of the distribution network at different levels of photovoltaic penetration.
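The fuzzy membership degree selection mentioned above is commonly implemented with a linear membership function per objective; a sketch of that common form follows, assuming both objectives (expected active power loss, voltage-violation probability) are minimized. The Pareto front values are invented for illustration.

```python
import numpy as np

def best_compromise(front):
    """Pick the best compromise solution from a Pareto front (minimization).

    A linear fuzzy membership mu = (f_max - f) / (f_max - f_min) is computed
    per objective; the solution with the largest sum of memberships wins.
    """
    F = np.asarray(front, dtype=float)
    f_min, f_max = F.min(axis=0), F.max(axis=0)
    span = np.where(f_max > f_min, f_max - f_min, 1.0)  # guard against /0
    mu = (f_max - F) / span
    return int(np.argmax(mu.sum(axis=1)))

# invented (expected active power loss in MW, voltage-violation probability)
front = [(0.02, 0.10), (0.05, 0.04), (0.08, 0.02)]
idx = best_compromise(front)  # the middle solution balances both objectives
```

Each extreme solution is best in one objective and worst in the other, so its memberships sum to 1.0, while the balanced middle solution scores higher.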
Optimal design and planning of glycerol-based biorefinery supply chains under uncertainty
DEFF Research Database (Denmark)
Loureiro da Costa Lira Gargalo, Carina; Carvalho, Ana; Gernaey, Krist V.
2017-01-01
The optimal design and planning of glycerol-based biorefinery supply chains is critical for the development and implementation of this concept in a sustainable manner. To achieve this, a decision-making framework is proposed in this work, to holistically optimize the design and planning...... -echelon mixed integer linear programming problem is proposed based upon a previous model, GlyThink. In the new formulation, market uncertainties are taken into account at the strategic planning level. The robustness of the supply chain structures is analyzed based on statistical data provided by the implementation of the Monte Carlo method, where a deterministic optimization problem is solved for each scenario. Furthermore, the solution of the stochastic multi-objective optimization model points to the Pareto set of trade-off solutions obtained when maximizing the NPV and minimizing environmental......
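The Monte Carlo approach described in this abstract, solving a deterministic problem for each sampled scenario and collecting statistics over the results, can be sketched as follows. The closed-form "solver" and every parameter value are invented stand-ins, not the GlyThink model.

```python
import random
import statistics

def solve_deterministic(price):
    """Placeholder for solving one deterministic planning problem given a
    realized market price; a toy closed-form profit stands in for the MILP."""
    capacity = 100.0   # invented plant capacity
    unit_cost = 2.0    # invented production cost per unit
    return capacity * max(price - unit_cost, 0.0)

random.seed(42)  # reproducible scenario sampling
# sample 1000 price scenarios (invented distribution) and solve each one
npvs = [solve_deterministic(random.gauss(3.0, 0.5)) for _ in range(1000)]

mean_npv = statistics.mean(npvs)
spread = statistics.stdev(npvs)  # robustness indicator across scenarios
```

In the actual framework each scenario would require a full MILP solve; the statistics over the resulting objective values are what support the robustness analysis.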
Multi-objective optimization of a plate and frame heat exchanger via genetic algorithm
Energy Technology Data Exchange (ETDEWEB)
Najafi, Hamidreza; Najafi, Behzad [K. N. Toosi University of Technology, Department of Mechanical Engineering, Tehran (Iran)
2010-06-15
In the present paper, a plate and frame heat exchanger is considered. A multi-objective optimization using a genetic algorithm is developed in order to obtain a set of geometric design parameters that lead to minimum pressure drop and maximum overall heat transfer coefficient. Clearly, the considered objective functions are conflicting, and no single solution can satisfy both objectives simultaneously. The multi-objective optimization procedure yields a set of optimal solutions, called the Pareto front, each of which is a trade-off between the objectives and can be selected by the user depending on the application and the project's limits. The presented work accounts for numerous geometric parameters in the presence of logical constraints. A sensitivity analysis is also carried out to study the effects of the different geometric parameters on the considered objective functions. Modeling the system and implementing the multi-objective optimization via the genetic algorithm were performed in MATLAB. (orig.)
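The conflict between minimizing pressure drop and maximizing the heat transfer coefficient can be illustrated with a toy single-parameter model (invented for this sketch, not the paper's heat exchanger model). The maximized objective is negated so a single minimization dominance rule applies to both.

```python
import random

# Toy model: one geometric parameter, the channel gap g, trades pressure
# drop against heat transfer (all functional forms invented).
def objectives(g):
    pressure_drop = 1.0 / g**3     # narrower gap -> higher pressure drop
    heat_transfer = 1.0 / g**0.5   # narrower gap -> higher coefficient
    return (pressure_drop, -heat_transfer)  # negate to minimize both

random.seed(0)
samples = sorted({round(random.uniform(0.5, 2.0), 3) for _ in range(200)})
points = {g: objectives(g) for g in samples}

def dominated(p, q):
    """True when q dominates p (minimization)."""
    return all(a <= b for a, b in zip(q, p)) and any(a < b for a, b in zip(q, p))

front = [g for g, p in points.items()
         if not any(dominated(p, q) for q in points.values() if q != p)]
# with one parameter and strictly opposed objectives, every sampled design
# is non-dominated: the whole range is a trade-off curve
```

A genetic algorithm explores a much larger multi-parameter space the same way, keeping the non-dominated designs as the evolving Pareto front.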