WorldWideScience

Sample records for pareto evolutionary algorithm

  1. Sensitivity versus accuracy in multiclass problems using memetic Pareto evolutionary neural networks.

    Science.gov (United States)

    Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio

    2010-05-01

    This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It seeks to balance two conflicting objectives of multiclassifiers: a high correct classification rate and a high classification rate for each class. This last objective is not usually optimized in classification, but it is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (the extremes of the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.
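
    The two conflicting objectives named above are commonly formalized as the overall correct classification rate (accuracy) and the minimum per-class classification rate (sensitivity), both computable from a confusion matrix. The following Python sketch illustrates that pair of objectives under this reading; it is not the authors' implementation, and the function name is hypothetical.

        def accuracy_and_min_sensitivity(conf):
            """conf[i][j] = number of class-i samples predicted as class j."""
            total = sum(sum(row) for row in conf)
            correct = sum(conf[i][i] for i in range(len(conf)))
            sensitivities = [conf[i][i] / sum(conf[i]) for i in range(len(conf))]
            return correct / total, min(sensitivities)   # the two conflicting objectives

        # Toy 3-class confusion matrix: overall accuracy is high, but class 2 is poorly recognized.
        C, MS = accuracy_and_min_sensitivity([[50, 0, 0],
                                              [2, 45, 3],
                                              [6, 4, 10]])
        print(C, MS)   # 0.875 0.5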

  2. Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach.

    Science.gov (United States)

    Huang, Adam; Li, Jiang; Summers, Ronald M; Petrick, Nicholas; Hara, Amy K

    2010-03-21

    We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was lower for the two-step algorithm, and the results demonstrate that the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms.

  3. Optimization of externalities using DTM measures: a Pareto optimal multi objective optimization using the evolutionary algorithm SPEA2+

    NARCIS (Netherlands)

    Wismans, Luc Johannes Josephus; van Berkum, Eric C.; Bliemer, Michiel; Allkim, T.P.; van Arem, Bart

    2010-01-01

    Multi objective optimization of externalities of traffic is performed solving a network design problem in which Dynamic Traffic Management measures are used. The resulting Pareto optimal set is determined by employing the SPEA2+ evolutionary algorithm.

  4. Global WASF-GA: An Evolutionary Algorithm in Multiobjective Optimization to Approximate the Whole Pareto Optimal Front.

    Science.gov (United States)

    Saborido, Rubén; Ruiz, Ana B; Luque, Mariano

    2017-01-01

    In this article, we propose a new evolutionary algorithm for multiobjective optimization called Global WASF-GA (global weighting achievement scalarizing function genetic algorithm), which falls within the family of aggregation-based evolutionary algorithms. The main purpose of Global WASF-GA is to approximate the whole Pareto optimal front. Its fitness function is defined by an achievement scalarizing function (ASF) based on the Tchebychev distance, in which two reference points are considered (the utopian and the nadir objective vectors) and the weight vector used is taken from a set of weight vectors whose inverses are well-distributed. At each iteration, all individuals are classified into different fronts. Each front is formed by the solutions with the lowest values of the ASF for the different weight vectors in the set, using the utopian vector and the nadir vector as reference points simultaneously. Varying the weight vector in the ASF while considering the utopian and the nadir vectors at the same time enables the algorithm to obtain a final set of nondominated solutions that approximates the whole Pareto optimal front. We compared Global WASF-GA to MOEA/D (different versions) and NSGA-II on two-, three-, and five-objective problems. The computational results permit us to conclude that Global WASF-GA achieves better performance than the other two algorithms in many cases, with regard to the hypervolume metric and the epsilon indicator, especially on three- and five-objective problems.
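
    The fitness described above can be read as an achievement scalarizing function built on the weighted Tchebychev distance between an objective vector and a reference point. The sketch below is a minimal illustration of that idea, evaluated against both the utopian and the nadir vectors as in Global WASF-GA; the augmentation coefficient rho and the function name are assumptions for illustration, not values taken from the paper.

        def asf(f, q, mu, rho=1e-4):
            """Achievement scalarizing function: weighted Tchebychev distance from the
            objective vector f (minimization) to the reference point q, lightly augmented."""
            terms = [m * (fi - qi) for fi, qi, m in zip(f, q, mu)]
            return max(terms) + rho * sum(terms)

        # Each individual is scored with the same weight vector against both reference points.
        f = [0.4, 0.7]                            # objective vector of one individual
        utopian, nadir = [0.0, 0.0], [1.0, 1.0]
        mu = [0.5, 0.5]                           # one weight vector from the well-distributed set
        print(asf(f, utopian, mu), asf(f, nadir, mu))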

  5. Test scheduling optimization for 3D network-on-chip based on cloud evolutionary algorithm of Pareto multi-objective

    Science.gov (United States)

    Xu, Chuanpei; Niu, Junhao; Ling, Jing; Wang, Suyan

    2018-03-01

    In this paper, we present a parallel test strategy for bandwidth division multiplexing under the test access mechanism bandwidth constraint. The Pareto solution set is combined with a cloud evolutionary algorithm to optimize the test time and power consumption of a three-dimensional network-on-chip (3D NoC). In the proposed method, all individuals in the population are sorted in non-dominated order and allocated to the corresponding level. Individuals with extreme and similar characteristics are then removed. To increase the diversity of the population and prevent the algorithm from becoming stuck around local optima, a competition strategy is designed for the individuals. Finally, we adopt an elite reservation strategy and update the individuals according to the cloud model. Experimental results show that the proposed algorithm converges to the optimal Pareto solution set rapidly and accurately. This not only obtains the shortest test time, but also optimizes the power consumption of the 3D NoC.
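
    Allocating individuals to nondominated levels, as described above, amounts to repeatedly extracting the currently nondominated subset of the population. The Python sketch below shows that levelling step for minimization objectives; it is an illustrative reading, not the authors' implementation.

        def dominates(a, b):
            """True if objective vector a Pareto-dominates b (minimization)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def nondominated_levels(points):
            """Level 0 is the nondominated set; level 1 is nondominated once level 0
            is removed, and so on."""
            remaining, levels = list(points), []
            while remaining:
                front = [p for p in remaining
                         if not any(dominates(q, p) for q in remaining if q is not p)]
                levels.append(front)
                remaining = [p for p in remaining if p not in front]
            return levels

        print(nondominated_levels([(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]))
        # [[(1, 5), (2, 2), (4, 1)], [(3, 3)], [(5, 5)]]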

  6. A New DG Multiobjective Optimization Method Based on an Improved Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Wanxing Sheng

    2013-01-01

    A distributed generation (DG) multiobjective optimization method based on an improved Pareto evolutionary algorithm is investigated in this paper. The improved Pareto evolutionary algorithm, which introduces a penalty factor into the objective function constraints, uses an adaptive crossover and mutation operator in the evolutionary process and combines a simulated annealing iterative process. The proposed algorithm is utilized to optimize DG injection models to maximize DG utilization while minimizing system loss and environmental pollution. A revised IEEE 33-bus system with multiple DG units was used to test the multiobjective optimization algorithm in a distribution power system. The proposed algorithm was implemented and compared with the strength Pareto evolutionary algorithm 2 (SPEA2), a particle swarm optimization (PSO) algorithm, and the nondominated sorting genetic algorithm II (NSGA-II). The comparison of the results demonstrates the validity and practicality of utilizing DG units in terms of economic dispatch and optimal operation in a distribution power system.

  7. An Improved Multiobjective Optimization Evolutionary Algorithm Based on Decomposition for Complex Pareto Fronts.

    Science.gov (United States)

    Jiang, Shouyong; Yang, Shengxiang

    2016-02-01

    The multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been shown to be very efficient in solving multiobjective optimization problems (MOPs). In practice, the Pareto-optimal front (POF) of many MOPs has complex characteristics. For example, the POF may have a long tail, a sharp peak, and disconnected regions, which significantly degrades the performance of MOEA/D. This paper proposes an improved MOEA/D for handling such complex problems. In the proposed algorithm, a two-phase strategy (TP) is employed to divide the whole optimization procedure into two phases. Based on the crowdedness of solutions found in the first phase, the algorithm decides whether or not to dedicate computational resources to handling unsolved subproblems in the second phase. In addition, a new niche scheme is introduced into the improved MOEA/D to guide the selection of mating parents and avoid producing duplicate solutions, which is very helpful for maintaining population diversity when the POF of the MOP being optimized is discontinuous. The performance of the proposed algorithm is investigated on existing benchmark MOPs and newly designed MOPs with complex POF shapes, in comparison with several MOEA/D variants and other approaches. The experimental results show that the proposed algorithm produces promising performance on these complex problems.

  8. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array.

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-20

    This research proposes several versions of a modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Specifically, the MCS algorithm is proposed by incorporating a roulette wheel selection operator to choose the initial host nests (individuals) that give better results, an adaptive inertia weight to control the position exploration of the potential best host nests (solutions), and a dynamic discovery rate to manage the fraction probability of finding the best host nests in a 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto optimum trade-offs are made to generate a set of three non-dominated solutions, which are the locations, excitation amplitudes, and excitation phases of the array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  9. An encoding technique for multiobjective evolutionary algorithms applied to power distribution system reconfiguration.

    Science.gov (United States)

    Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  10. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  11. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning.

    Science.gov (United States)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-09-01

    In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows promise in optimizing the number

  12. Comprehensive preference optimization of an irreversible thermal engine using pareto based mutable smart bee algorithm and generalized regression neural network

    DEFF Research Database (Denmark)

    Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar

    2013-01-01

    Optimizing and controlling complex engineering systems is a phenomenon that has attracted increasing interest from numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques such as neural networks, fuzzy logic, game theory, support vector machines... and stochastic algorithms have been proposed to facilitate the control of engineering systems. In this study, an extended version of the mutable smart bee algorithm (MSBA) called Pareto based mutable smart bee (PBMSB) is proposed to cope with multi-objective problems. Besides, a set of benchmark problems and four... well-known Pareto based optimizing algorithms, i.e. the multi-objective bee algorithm (MOBA), multi-objective particle swarm optimization (MOPSO) algorithm, non-dominated sorting genetic algorithm (NSGA-II), and strength Pareto evolutionary algorithm (SPEA 2), are utilized to confirm the acceptable...

  13. PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning

    International Nuclear Information System (INIS)

    Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew

    2011-01-01

    Purpose: In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. Methods: pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. Results: pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows

  14. Strength Pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization.

    Science.gov (United States)

    Elhossini, Ahmed; Areibi, Shawki; Dony, Robert

    2010-01-01

    This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.
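
    The "strength Pareto" fitness that the modified particle swarm borrows from SPEA2 assigns each solution a strength (how many population members it dominates) and a raw fitness (the summed strengths of everything that dominates it, 0 meaning nondominated). A minimal sketch of those two quantities, omitting SPEA2's density term, is given below; it is illustrative rather than the authors' code.

        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def spea2_strength_and_raw(objs):
            """Strength S(i): number of solutions dominated by i.
            Raw fitness R(i): sum of strengths of the solutions dominating i (lower is better)."""
            n = len(objs)
            strength = [sum(dominates(objs[i], objs[j]) for j in range(n)) for i in range(n)]
            raw = [sum(strength[j] for j in range(n) if dominates(objs[j], objs[i]))
                   for i in range(n)]
            return strength, raw

        S, R = spea2_strength_and_raw([(1, 4), (2, 2), (3, 3), (4, 4)])
        print(S, R)   # S = [1, 2, 1, 0], R = [0, 0, 2, 4]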

  15. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    Science.gov (United States)

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severe burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA), and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient's data set from an intensive care burn unit and a standard machine learning data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case

  16. Strength Pareto Evolutionary Algorithm using Self-Organizing Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Ionut Balan

    2015-03-01

    Multiobjective optimization is widely used to solve problems from a variety of areas. A set of algorithms, most of them based on evolutionary techniques, has been developed to solve such problems. One algorithm in this class that gives quite good results is SPEA2, the method on which the algorithm proposed in this paper is based. The results in this paper are obtained by running these two algorithms on a flow-shop problem.

  17. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    The definition of an investment plan to implement in a distribution power system is a task constantly faced by utilities. This work presents a methodology for determining a short-term investment plan for a distribution power system, using as criteria for evaluating investment projects their associated costs and the benefit customers obtain from their implementation. Given the number of projects carried out annually on the system, defining an investment plan requires computational tools to evaluate, among a set of possibilities, the one that best suits the present needs of the system and gives better results. For this reason, this work implements a multi-objective evolutionary algorithm, SPEA (Strength Pareto Evolutionary Algorithm), which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of the operators on the performance of the evolutionary algorithm and the results.

  18. Development of antibiotic regimens using graph based evolutionary algorithms.

    Science.gov (United States)

    Corns, Steven M; Ashlock, Daniel A; Bryden, Kenneth M

    2013-12-01

    This paper examines the use of evolutionary algorithms in the development of antibiotic regimens given to production animals. A model is constructed that combines the lifespan of the animal and the bacteria living in the animal's gastro-intestinal tract from the early finishing stage until the animal reaches market weight. This model is used as the fitness evaluation for a set of graph based evolutionary algorithms to assess the impact of diversity control on the evolving antibiotic regimens. The graph based evolutionary algorithms have two objectives: to find an antibiotic treatment regimen that maintains the weight gain and health benefits of antibiotic use and to reduce the risk of spreading antibiotic resistant bacteria. This study examines different regimens of tylosin phosphate use on bacteria populations divided into Gram positive and Gram negative types, with a focus on Campylobacter spp. Treatment regimens were found that provided decreased antibiotic resistance relative to conventional methods while providing nearly the same benefits as conventional antibiotic regimes. By using a graph to control the information flow in the evolutionary algorithm, a variety of solutions along the Pareto front can be found automatically for this and other multi-objective problems. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Hybrid Pareto artificial bee colony algorithm for multi-objective single machine group scheduling problem with sequence-dependent setup times and learning effects.

    Science.gov (United States)

    Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao

    2016-01-01

    Group scheduling is significant for efficient and cost-effective production systems. However, there exist setup times between the groups, which need to be reduced by sequencing the groups in an efficient way. The current research focuses on a sequence-dependent group scheduling problem with the aim of minimizing the makespan and the total weighted tardiness simultaneously. In most production scheduling problems, the processing time of jobs is assumed to be fixed. However, the actual processing time of jobs may be reduced due to the "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, the current research considers a single machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC) with some steps of a genetic algorithm is proposed for the current problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small medium, medium, large medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII), and a particle swarm optimization algorithm (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all the instances of the different sizes of problems.
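
    One common way to model the "learning effect" mentioned above is position-based: a job's actual processing time shrinks as p * r**a when it is scheduled in position r, with a negative learning index a. The sketch below combines that model with sequence-dependent setup times to evaluate the two objectives (makespan and total weighted tardiness) for one job sequence on a single machine; the learning model, parameter values and function name are assumptions for illustration, not the authors' exact formulation.

        def evaluate_sequence(seq, p, d, w, setup, a=-0.2):
            """seq: job indices in processing order; p, d, w: base processing times, due dates,
            weights; setup[i][j]: setup time incurred when job j directly follows job i.
            Returns (makespan, total weighted tardiness) under a position-based learning effect."""
            t, twt = 0.0, 0.0
            for r, j in enumerate(seq, start=1):
                if r > 1:
                    t += setup[seq[r - 2]][j]        # sequence-dependent setup time
                t += p[j] * r ** a                   # actual (learning-reduced) processing time
                twt += w[j] * max(0.0, t - d[j])     # weighted tardiness of job j
            return t, twt

        p, d, w = [10.0, 8.0, 6.0], [12.0, 15.0, 20.0], [1.0, 2.0, 1.0]
        setup = [[0, 2, 3], [2, 0, 1], [3, 1, 0]]
        print(evaluate_sequence([0, 1, 2], p, d, w, setup))   # approximately (24.78, 12.71)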

  20. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    International Nuclear Information System (INIS)

    Gharari, Rahman; Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi

    2016-01-01

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K-eff) for gaining possible longer operation cycles along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, i.e., SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.

  1. Implementation of strength pareto evolutionary algorithm II in the multiobjective burnable poison placement optimization of KWU pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Gharari, Rahman [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi [Nuclear Engineering Dept, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)

    2016-10-15

    In this research, for the first time, a new optimization method, i.e., the strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K-eff) for gaining possible longer operation cycles along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, the results reveal the acceptable performance and high strength of SPEA, particularly its new version, i.e., SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.

  2. δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms

    Science.gov (United States)

    Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi

    In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four elimination methods, analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
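
    A minimal reading of δ-similar elimination as described above: scan the population and discard any individual whose objective vector lies within a distance δ of one already kept, so that selection pressure is spread over distinct regions of the instantaneous Pareto front. The sketch below illustrates this with Euclidean distance; it is one plausible variant, not a reproduction of the four elimination methods studied in the paper.

        import math

        def delta_similar_elimination(objs, delta):
            """Keep only individuals whose objective vectors are at least delta apart;
            returns the indices of the survivors (first occurrence wins)."""
            kept = []
            for i, f in enumerate(objs):
                if all(math.dist(f, objs[j]) >= delta for j in kept):
                    kept.append(i)
            return kept

        objs = [(0.10, 0.90), (0.11, 0.89), (0.50, 0.50), (0.52, 0.49), (0.90, 0.10)]
        print(delta_similar_elimination(objs, delta=0.05))   # [0, 2, 4]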

  3. An Evolutionary Algorithm for Multiobjective Fuzzy Portfolio Selection Models with Transaction Cost and Liquidity

    Directory of Open Access Journals (Sweden)

    Wei Yue

    2015-01-01

    The major issues for mean-variance-skewness models are the errors in estimation that cause corner solutions and low diversity in the portfolio. In this paper, a multiobjective fuzzy portfolio selection model with transaction cost and liquidity is proposed to maintain the diversity of the portfolio. In addition, we have designed a multiobjective evolutionary algorithm based on decomposition of the objective space to maintain the diversity of the obtained solutions. The algorithm is used to obtain a set of Pareto-optimal portfolios with good diversity and convergence. To demonstrate the effectiveness of the proposed model and algorithm, the performance of the proposed algorithm is compared with the classic MOEA/D and NSGA-II through some numerical examples based on data from the Shanghai Stock Exchange Market. Simulation results show that our proposed algorithm is able to obtain better diversity and a more evenly distributed Pareto front than the other two algorithms, and that the proposed model can maintain the diversity of the portfolio quite well. The purpose of this paper is to deal with portfolio problems in the weighted possibilistic mean-variance-skewness (MVS) and possibilistic mean-variance-skewness-entropy (MVS-E) frameworks with transaction cost and liquidity, and to provide investors with a range of Pareto-optimal investment strategies, as diversified as possible, rather than a single strategy at a time.

  4. A Knowledge-Informed and Pareto-Based Artificial Bee Colony Optimization Algorithm for Multi-Objective Land-Use Allocation

    Directory of Open Access Journals (Sweden)

    Lina Yang

    2018-02-01

    Land-use allocation is of great significance in urban development. This type of allocation is usually considered to be a complex multi-objective spatial optimization problem, whose optimized result is a set of Pareto-optimal solutions (Pareto front) reflecting different tradeoffs in several objectives. However, obtaining a Pareto front is a challenging task, and the Pareto front obtained by state-of-the-art algorithms is still not sufficient. To achieve better Pareto solutions, taking the grid-representative land-use allocation problem with two objectives as an example, an artificial bee colony optimization algorithm for multi-objective land-use allocation (ABC-MOLA) is proposed. In this algorithm, the traditional ABC’s search direction guiding scheme and solution maintaining process are modified. In addition, a knowledge-informed neighborhood search strategy, which utilizes the auxiliary knowledge of natural geography and spatial structures to facilitate the neighborhood spatial search around each solution, is developed to further improve the Pareto front’s quality. A series of comparison experiments (a simulated experiment with small data volume and a real-world data experiment for a large area) shows that all the Pareto fronts obtained by ABC-MOLA totally dominate the Pareto fronts by other algorithms, which demonstrates ABC-MOLA’s effectiveness in achieving Pareto fronts of high quality.

  5. Use of multiple objective evolutionary algorithms in optimizing surveillance requirements

    International Nuclear Information System (INIS)

    Martorell, S.; Carlos, S.; Villanueva, J.F.; Sanchez, A.I; Galvan, B.; Salazar, D.; Cepin, M.

    2006-01-01

    This paper presents the development and application of a double-loop Multiple Objective Evolutionary Algorithm that uses a Multiple Objective Genetic Algorithm to perform the simultaneous optimization of periodic Test Intervals (TI) and Test Planning (TP). It takes into account the time-dependent effect of TP performed on stand-by safety-related equipment. TI and TP are part of the Surveillance Requirements within Technical Specifications at Nuclear Power Plants. It addresses the problem of multi-objective optimization in the space of decision variables, i.e. TI and TP, using a novel flexible structure of the optimization algorithm. Lessons learnt from the cases of application of the methodology to optimize TI and TP for the High-Pressure Injection System are given. The results show that the double-loop Multiple Objective Evolutionary Algorithm is able to find the Pareto set of solutions, which represents a surface of non-dominated solutions that satisfy all the constraints imposed on the objective functions and decision variables. Decision makers can then adopt the best solution found depending on their particular preference, e.g. minimum cost or minimum unavailability.

  6. Pareto optimal pairwise sequence alignment.

    Science.gov (United States)

    DeRonne, Kevin W; Karypis, George

    2013-01-01

    Sequence alignment using evolutionary profiles is a commonly employed tool when investigating a protein. Many profile-profile scoring functions have been developed for use in such alignments, but there has not yet been a comprehensive study of Pareto optimal pairwise alignments for combining multiple such functions. We show that the problem of generating Pareto optimal pairwise alignments has an optimal substructure property, and develop an efficient algorithm for generating Pareto optimal frontiers of pairwise alignments. All possible sets of two, three, and four profile scoring functions are used from a pool of 11 functions and applied to 588 pairs of proteins in the ce_ref data set. The performance of the best objective combinations on ce_ref is also evaluated on an independent set of 913 protein pairs extracted from the BAliBASE RV11 data set. Our dynamic-programming-based heuristic approach produces approximated Pareto optimal frontiers of pairwise alignments that contain comparable alignments to those on the exact frontier, but on average in less than 1/58th the time in the case of four objectives. Our results show that the Pareto frontiers contain alignments whose quality is better than the alignments obtained by single objectives. However, the task of identifying a single high-quality alignment among those in the Pareto frontier remains challenging.

  7. Evolutionary tradeoffs, Pareto optimality and the morphology of ammonite shells.

    Science.gov (United States)

    Tendler, Avichai; Mayo, Avraham; Alon, Uri

    2015-03-07

    Organisms that need to perform multiple tasks face a fundamental tradeoff: no design can be optimal at all tasks at once. Recent theory based on Pareto optimality showed that such tradeoffs lead to a highly defined range of phenotypes, which lie in low-dimensional polyhedra in the space of traits. The vertices of these polyhedra are called archetypes: the phenotypes that are optimal at a single task. To rigorously test this theory requires measurements of thousands of species over hundreds of millions of years of evolution. Ammonoid fossil shells provide an excellent model system for this purpose. Ammonoids have a well-defined geometry that can be parameterized using three dimensionless features of their logarithmic-spiral-shaped shells. Their evolutionary history includes repeated mass extinctions. We find that ammonoids fill out a pyramid in morphospace, suggesting five specific tasks, one for each vertex of the pyramid. After mass extinctions, surviving species evolve to refill essentially the same pyramid, suggesting that the tasks are unchanging. We infer putative tasks for each archetype, related to economy of shell material, rapid shell growth, hydrodynamics and compactness. These results support Pareto optimality theory as an approach to study evolutionary tradeoffs, and demonstrate how this approach can be used to infer the putative tasks that may shape the natural selection of phenotypes.

  8. Energy-Efficient Scheduling Problem Using an Effective Hybrid Multi-Objective Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Lvjiang Yin

    2016-12-01

    Nowadays, manufacturing enterprises face the challenge of just-in-time (JIT) production and energy saving. Therefore, the study of JIT production and energy consumption is necessary and important in manufacturing sectors. Moreover, energy saving can be attained by the operational method and by turning idle machines off/on, which also increases the complexity of problem solving. Thus, most researchers still focus on small-scale problems with one objective in a single machine environment. However, the scheduling problem is a multi-objective optimization problem in real applications. In this paper, a single machine scheduling model with controllable processing and sequence-dependent setup times is developed for minimizing the total earliness/tardiness (E/T), cost, and energy consumption simultaneously. An effective multi-objective evolutionary algorithm called the local multi-objective evolutionary algorithm (LMOEA) is presented to tackle this multi-objective scheduling problem. To accommodate the characteristics of the problem, a new solution representation is proposed, which can convert discrete combinational problems into continuous problems. Additionally, a multiple local search strategy with a self-adaptive mechanism is introduced into the proposed algorithm to enhance the exploitation ability. The performance of the proposed algorithm is evaluated on instances in comparison with other multi-objective meta-heuristics such as the Nondominated Sorting Genetic Algorithm II (NSGA-II), Strength Pareto Evolutionary Algorithm 2 (SPEA2), Multiobjective Particle Swarm Optimization (OMOPSO), and Multiobjective Evolutionary Algorithm Based on Decomposition (MOEA/D). Experimental results demonstrate that the proposed LMOEA algorithm outperforms its counterparts for this kind of scheduling problem.

  9. Evaluating and Improving Automatic Sleep Spindle Detection by Using Multi-Objective Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Min-Yin Liu

    2017-05-01

    Sleep spindles are brief bursts of brain activity in the sigma frequency range (11–16 Hz) measured by electroencephalography (EEG), mostly during non-rapid eye movement (NREM) stage 2 sleep. These oscillations are of great biological and clinical interest because they potentially play an important role in identifying and characterizing the processes of various neurological disorders. Conventionally, sleep spindles are identified by expert sleep clinicians via visual inspection of EEG signals. The process is laborious and the results are inconsistent among different experts. To resolve the problem, numerous computerized methods have been developed to automate the process of sleep spindle identification. Still, the performance of these automated sleep spindle detection methods varies inconsistently from study to study. There are two reasons: (1) the lack of common benchmark databases, and (2) the lack of commonly accepted evaluation metrics. In this study, we focus on tackling the second problem by proposing to evaluate the performance of a spindle detector in a multi-objective optimization context, and hypothesize that using the resultant Pareto fronts for deriving evaluation metrics will improve automatic sleep spindle detection. We use a popular multi-objective evolutionary algorithm (MOEA), the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize six existing frequency-based sleep spindle detection algorithms. They include three Fourier-based, one continuous wavelet transform (CWT)-based, and two Hilbert-Huang transform (HHT)-based algorithms. We also explore three hybrid approaches. Trained and tested on the open-access DREAMS and MASS databases, two new hybrid methods combining Fourier with HHT algorithms show significant performance improvement, with F1-scores of 0.726–0.737.

  10. Computing gap free Pareto front approximations with stochastic search algorithms.

    Science.gov (United States)

    Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali

    2010-01-01

    Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation (which are entirely determined by the archiving strategy and the value of epsilon) have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we are aiming in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy (multi-objective continuation methods) by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
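
    Epsilon-dominance, on which the archiving strategies above are built, relaxes ordinary Pareto dominance by a tolerance epsilon per objective, so that a bounded archive can still cover the whole front. The sketch below shows one common additive form of the relation and a bare-bones archive update; the gap-free strategies proposed in the paper are considerably more elaborate.

        def eps_dominates(a, b, eps):
            """a epsilon-dominates b (minimization) if shifting a by eps still makes it
            no worse than b everywhere and strictly better somewhere."""
            return (all(x - eps <= y for x, y in zip(a, b))
                    and any(x - eps < y for x, y in zip(a, b)))

        def update_archive(archive, f, eps):
            """Insert f unless an archived point epsilon-dominates it; drop archived
            points that f epsilon-dominates."""
            if any(eps_dominates(a, f, eps) for a in archive):
                return archive
            return [a for a in archive if not eps_dominates(f, a, eps)] + [f]

        A = []
        for f in [(0.9, 0.2), (0.5, 0.5), (0.89, 0.21), (0.2, 0.9)]:
            A = update_archive(A, f, eps=0.05)
        print(A)   # [(0.9, 0.2), (0.5, 0.5), (0.2, 0.9)] -- (0.89, 0.21) was eps-dominated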

  11. Multi-objective thermodynamic optimization of an irreversible regenerative Brayton cycle using evolutionary algorithm and decision making

    OpenAIRE

    Rajesh Kumar; S.C. Kaushik; Raj Kumar; Ranjana Hans

    2016-01-01

    A Brayton heat engine model is developed in the MATLAB Simulink environment and thermodynamic optimization based on finite-time thermodynamic analysis along with multiple criteria is implemented. The proposed work investigates optimal values of various decision variables that simultaneously optimize power output, thermal efficiency and ecological function using an evolutionary algorithm based on NSGA-II. The Pareto optimal frontier between triple and dual objectives is obtained and the best optimal value is s...

  12. Pareto Optimization of a Half Car Passive Suspension Model Using a Novel Multiobjective Heat Transfer Search Algorithm

    Directory of Open Access Journals (Sweden)

    Vimal Savsani

    2017-01-01

    Most modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however, the search techniques of other recently developed metaheuristics are emerging topics among researchers. This paper proposes a novel multiobjective optimization algorithm named the multiobjective heat transfer search (MOHTS) algorithm, which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs the elitist nondominated sorting and crowding distance approach of the elitist-based nondominated sorting genetic algorithm-II (NSGA-II) for obtaining different nondomination levels and for preserving the diversity among the optimal set of solutions, respectively. Its capability in yielding a Pareto front as close as possible to the true Pareto front has been tested on the multiobjective optimization problem of vehicle suspension design, which has a set of five second-order linear ordinary differential equations. A half car passive ride model with two different sets of five objectives is employed for optimizing the suspension parameters using MOHTS and NSGA-II. The optimization studies demonstrate that MOHTS achieves a better nondominated Pareto front with a widespread (diverse) set of optimal solutions as compared to NSGA-II, and further the comparison of the extreme points of the obtained Pareto front reveals the dominance of MOHTS over NSGA-II, the multiobjective uniform diversity genetic algorithm (MUGA), and a combined PSO-GA based MOEA.
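
    The crowding-distance mechanism that MOHTS borrows from NSGA-II scores how isolated each solution is within its nondomination level, so that well-spread solutions are preferred when the front is truncated. A standard sketch of that computation is shown below; it illustrates the usual NSGA-II definition rather than the authors' code.

        def crowding_distance(front):
            """Crowding distance for a list of objective vectors in one nondomination level."""
            n, m = len(front), len(front[0])
            dist = [0.0] * n
            for k in range(m):
                order = sorted(range(n), key=lambda i: front[i][k])
                lo, hi = front[order[0]][k], front[order[-1]][k]
                dist[order[0]] = dist[order[-1]] = float("inf")   # keep boundary solutions
                if hi == lo:
                    continue
                for a in range(1, n - 1):
                    i = order[a]
                    dist[i] += (front[order[a + 1]][k] - front[order[a - 1]][k]) / (hi - lo)
            return dist

        print(crowding_distance([(1.0, 5.0), (2.0, 3.0), (4.0, 2.0), (6.0, 1.0)]))
        # [inf, 1.35, 1.3, inf] -- interior solutions ranked by how uncrowded they are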

  13. Active learning of Pareto fronts.

    Science.gov (United States)

    Campigotto, Paolo; Passerini, Andrea; Battiti, Roberto

    2014-03-01

    This paper introduces the active learning of Pareto fronts (ALP) algorithm, a novel approach to recover the Pareto front of a multiobjective optimization problem. ALP casts the identification of the Pareto front into a supervised machine learning task. This approach enables an analytical model of the Pareto front to be built. The computational effort in generating the supervised information is reduced by an active learning strategy. In particular, the model is learned from a set of informative training objective vectors. The training objective vectors are approximated Pareto-optimal vectors obtained by solving different scalarized problem instances. The experimental results show that ALP achieves an accurate Pareto front approximation with a lower computational effort than state-of-the-art estimation of distribution algorithms and widely known genetic techniques.

  14. Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms

    Science.gov (United States)

    Lopez, Nicolas

    This dissertation explores the Renewable Energy Integration Problem and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid, and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective optimization version of the problem are presented, in order to minimize cost and to minimize global warming potential (GWP), followed by a multi-objective implementation of the offered methodology utilizing a non-sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization, and approximating the true Pareto front in the case of multiple-objective optimization of the Renewable Energy Integration Problem.

  15. A Pareto Algorithm for Efficient De Novo Design of Multi-functional Molecules.

    Science.gov (United States)

    Daeyaert, Frits; Deem, Micheal W

    2017-01-01

    We have introduced a Pareto sorting algorithm into Synopsis, a de novo design program that generates synthesizable molecules with desirable properties. We give a detailed description of the algorithm and illustrate its working in 2 different de novo design settings: the design of putative dual and selective FGFR and VEGFR inhibitors, and the successful design of organic structure determining agents (OSDAs) for the synthesis of zeolites. We show that the introduction of Pareto sorting not only enables the simultaneous optimization of multiple properties but also greatly improves the performance of the algorithm to generate molecules with hard-to-meet constraints. This in turn allows us to suggest approaches to address the problem of false positive hits in de novo structure based drug design by introducing structural and physicochemical constraints in the designed molecules, and by forcing essential interactions between these molecules and their target receptor. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Comparison of Multiobjective Evolutionary Algorithms for Operations Scheduling under Machine Availability Constraints

    Directory of Open Access Journals (Sweden)

    M. Frutos

    2013-01-01

    Many of the problems that arise in production systems can be handled with multiobjective techniques. One of those problems is that of scheduling operations subject to constraints on the availability of machines and buffer capacity. In this paper we analyze different multiobjective evolutionary algorithms (MOEAs) for this kind of problem. We consider an experimental framework in which we schedule production operations for four real-world job-shop contexts using three algorithms: NSGA-II, SPEA2, and IBEA. Using two performance indexes, hypervolume and R2, we found that SPEA2 and IBEA are the most efficient for the tasks at hand. On the other hand, IBEA seems to be a better choice of tool since it yields more solutions in the approximate Pareto frontier.

  17. Multi-objective flexible job shop scheduling problem using variable neighborhood evolutionary algorithm

    Science.gov (United States)

    Wang, Chun; Ji, Zhicheng; Wang, Yan

    2017-07-01

    In this paper, the multi-objective flexible job shop scheduling problem (MOFJSP) was studied with the objectives of minimizing makespan, total workload and critical workload. A variable neighborhood evolutionary algorithm (VNEA) was proposed to obtain a set of Pareto optimal solutions. First, two novel crowding operators in terms of the decision space and objective space were proposed, and they were respectively used in mating selection and environmental selection. Then, two well-designed neighborhood structures were used in local search, which take the problem characteristics into account and allow fast convergence. Finally, an extensive comparison was carried out with state-of-the-art methods specially presented for solving MOFJSP on well-known benchmark instances. The results show that the proposed VNEA is more effective than other algorithms in solving MOFJSP.

  18. Chaotic Multiobjective Evolutionary Algorithm Based on Decomposition for Test Task Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Hui Lu

    2014-01-01

    The test task scheduling problem (TTSP) is a complex optimization problem with many local optima. In this paper, a hybrid chaotic multiobjective evolutionary algorithm based on decomposition (CMOEA/D) is presented to avoid becoming trapped in local optima and to obtain high quality solutions. First, we propose an improved integrated encoding scheme (IES) to increase efficiency. Then ten chaotic maps are applied to the multiobjective evolutionary algorithm based on decomposition (MOEA/D) in three phases, that is, the initial population and the crossover and mutation operators. To identify a good approach for hybridizing MOEA/D and chaos, and to indicate the effectiveness of the improved IES, several experiments are performed. The Pareto front and the statistical results demonstrate that different chaotic maps in different phases have different effects for solving the TTSP, especially the circle map and the ICMIC map. The similarity degree of distribution between chaotic maps and the problem is a very essential factor for the application of chaotic maps. In addition, the experiments comparing CMOEA/D and variable neighborhood MOEA/D (VNM) indicate that our algorithm has the best performance in solving the TTSP.

  19. Solving multi-objective job shop problem using nature-based algorithms: new Pareto approximation features

    Directory of Open Access Journals (Sweden)

    Jarosław Rudy

    2015-01-01

    In this paper the job shop scheduling problem (JSP) with two criteria minimized simultaneously is considered. JSP is a frequently used model in real-world applications of combinatorial optimization. Multi-objective job shop problems (MOJSP) have rarely been studied. We implement and compare two multi-agent nature-based methods, namely ant colony optimization (ACO) and a genetic algorithm (GA), for MOJSP. Both methods employ a technique taken from multi-criteria decision analysis in order to establish a ranking of solutions. ACO and GA differ in the method of keeping information about previously found solutions and their quality, which affects the course of the search. As a result, new features of the Pareto approximations provided by said algorithms are observed: aside from the slight superiority of the ACO method, the Pareto frontier approximations provided by both methods are disjoint sets. Thus, both methods can be used to search mutually exclusive areas of the Pareto frontier.

  20. AMOBH: Adaptive Multiobjective Black Hole Algorithm.

    Science.gov (United States)

    Wu, Chong; Wu, Tao; Fu, Kaiyuan; Zhu, Yuan; Li, Yongbo; He, Wangyong; Tang, Shengwen

    2017-01-01

    This paper proposes a new multiobjective evolutionary algorithm based on the black hole algorithm with a new individual density assessment (cell density), called the "adaptive multiobjective black hole algorithm" (AMOBH). Cell density has the characteristic of low computational complexity and maintains a good balance between convergence and diversity of the Pareto front. The framework of AMOBH can be divided into three steps. First, the Pareto front is mapped to a new objective space called the parallel cell coordinate system. Then, to adjust the evolutionary strategies adaptively, Shannon entropy is employed to estimate the evolution status. Finally, the cell density is combined with a dominance strength assessment called cell dominance to evaluate the fitness of solutions. Compared with the state-of-the-art methods SPEA-II, PESA-II, NSGA-II, and MOEA/D, experimental results show that AMOBH achieves good performance in terms of convergence rate, population diversity, population convergence, subpopulation obtention of different Pareto regions, and time complexity, and is superior to the latter in most cases.

  1. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    Directory of Open Access Journals (Sweden)

    Zhiming Song

    2015-01-01

    Full Text Available As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m-1)-dimensional manifold in the decision space under some mild conditions. How to utilize this regularity to design multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the promising area in the decision space is modelled by a probability distribution whose centroid is an (m-1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The results show that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA are also identified and discussed in this paper.

  2. Safety management in NPPs using evolutionary algorithm

    International Nuclear Information System (INIS)

    Mishra, A.; Patwardhan, A.; Chauhan, A.; Verma, A.K.

    2005-01-01

    Technical specification and maintenance (TS and M) activities in a plant are associated with controlling risk or with satisfying requirements, and are candidates to be evaluated for their resource effectiveness in risk-informed applications. The general goal of safety management in Nuclear Power Plants (NPPs) is to make requirements and activities more risk effective and less costly. Accordingly, the risk-based analysis of Technical Specification (RBTS) is being considered in evaluating current TS. The multi-objective optimization of the TS and M requirements of an NPP based on risk and cost gives the Pareto-optimal solutions, from which the utility can pick the decision variables that suit its interests. In this paper, a multi-objective evolutionary algorithm technique has been used to make a trade-off between risk and cost, both at the system level and at the plant level, for Loss of Coolant Accident (LOCA) and Main Steam Line Break (MSLB) as initiating events. (authors)

  3. Pareto-optimal phylogenetic tree reconciliation.

    Science.gov (United States)

    Libeskind-Hadas, Ran; Wu, Yi-Chieh; Bansal, Mukul S; Kellis, Manolis

    2014-06-15

    Phylogenetic tree reconciliation is a widely used method for reconstructing the evolutionary histories of gene families and species, hosts and parasites, and other dependent pairs of entities. Reconciliation is typically performed using maximum parsimony, in which each evolutionary event type is assigned a cost and the objective is to find a reconciliation of minimum total cost. It is generally understood that reconciliations are sensitive to event costs, but little is understood about the relationship between event costs and solutions. Moreover, choosing appropriate event costs is a notoriously difficult problem. We address this problem by giving an efficient algorithm for computing Pareto-optimal sets of reconciliations, thus providing the first systematic method for understanding the relationship between event costs and reconciliations. This, in turn, results in new techniques for computing event support values and, for cophylogenetic analyses, performing robust statistical tests. We provide new software tools and demonstrate their use on a number of datasets from evolutionary genomic and cophylogenetic studies. Our Python tools are freely available at www.cs.hmc.edu/∼hadas/xscape. © The Author 2014. Published by Oxford University Press.

  4. Optimization of Wind Turbine Airfoil Using Nondominated Sorting Genetic Algorithm and Pareto Optimal Front

    Directory of Open Access Journals (Sweden)

    Ziaul Huque

    2012-01-01

    Full Text Available A Computational Fluid Dynamics (CFD) and response surface-based multiobjective design optimization was performed for six different 2D airfoil profiles, and the Pareto optimal front of each airfoil is presented. FLUENT, a commercial CFD simulation code, was used to determine the relevant aerodynamic loads. The Lift Coefficient (CL) and Drag Coefficient (CD) data over a range of 0° to 12° angles of attack (α) and at three different Reynolds numbers (Re = 68,459; 479,210; and 958,422) were obtained for all six airfoils. The realizable k-ε turbulence model with a second-order upwind solution method was used in the simulations. The standard least squares method was used to generate the response surfaces with the statistical code JMP. The elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) was used to determine the Pareto optimal set based on the response surfaces. Each Pareto optimal solution represents a different compromise between design objectives. This gives the designer a choice to select a design compromise that best suits the requirements from a set of optimal solutions. The Pareto solution set is presented in the form of a Pareto optimal front.
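
    Under simplifying assumptions, the response-surface step can be sketched as an ordinary least-squares fit of a low-order polynomial in the design variables to the sampled CFD responses (the study itself used the JMP statistical code; the function form, variable scaling, and sample values below are made up for illustration).

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Ordinary least-squares fit of y ~ c0 + sum(bi*xi) + sum(qi*xi^2), no cross terms."""
    design = lambda Z: np.column_stack([np.ones(len(Z)), Z, np.asarray(Z) ** 2])
    X = np.asarray(X, dtype=float)
    coeffs, *_ = np.linalg.lstsq(design(X), np.asarray(y, dtype=float), rcond=None)
    return lambda Z: design(np.atleast_2d(np.asarray(Z, dtype=float))) @ coeffs

# Example: fictitious CL samples over (angle of attack in degrees, Re in units of 1e5).
X = [[0, 0.68], [4, 0.68], [8, 0.68], [4, 4.79], [8, 4.79], [12, 9.58]]
cl = [0.10, 0.45, 0.78, 0.50, 0.82, 1.05]
cl_surface = fit_quadratic_surface(X, cl)
print(cl_surface([6, 4.79]))   # predicted CL at an unsampled design point
```

    A genetic algorithm such as NSGA-II can then be run against such cheap surrogate evaluations instead of full CFD runs.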

  5. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Qianwang Deng

    2017-01-01

    Full Text Available Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the job-shop scheduling problem (JSP) characteristics. This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives to minimize the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid the premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. What is more, numerical simulations are carried out which are based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing the experimental results with the results of some well-known existing algorithms.

  6. A Bee Evolutionary Guiding Nondominated Sorting Genetic Algorithm II for Multiobjective Flexible Job-Shop Scheduling.

    Science.gov (United States)

    Deng, Qianwang; Gong, Guiliang; Gong, Xuran; Zhang, Like; Liu, Wei; Ren, Qinghua

    2017-01-01

    Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the job-shop scheduling problem (JSP) characteristics. This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives to minimize the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is first used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid the premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, and each of them changes with the iteration times. What is more, numerical simulations are carried out which are based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing the experimental results with the results of some well-known existing algorithms.

  7. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired and preset objective(s). As a case study, we propose a formal framework on parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspect by taking this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as Nash equilibrium) in an evolutionary computation algorithm.

  8. Diversity-Guided Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    2002-01-01

    Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results...

  9. Pareto optimization in algebraic dynamic programming.

    Science.gov (United States)

    Saule, Cédric; Giegerich, Robert

    2015-01-01

    Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization so far has been used only with a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator [Formula: see text] on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, the scoring scheme [Formula: see text] correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
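
    To make the flavour of the Pareto product concrete, the sketch below shows the reduction it relies on: from the candidate (A, B) score pairs produced for a dynamic programming cell, keep only those not dominated under both objectives. This is a generic illustration under a maximization convention, not the paper's implementation.

```python
def pareto_merge(candidates):
    """Keep only non-dominated (a, b) score pairs, maximizing both objectives.

    Instead of storing one optimal score per DP cell, a cell stores a small
    front of incomparable answers; sorting by A descending and sweeping keeps
    a candidate only if it improves the best B seen so far.
    """
    best_b = float("-inf")
    front = []
    for a, b in sorted(set(candidates), reverse=True):
        if b > best_b:
            front.append((a, b))
            best_b = b
    return front

# Example: merging candidate answers from two sub-derivations of a DP cell.
print(pareto_merge([(10, 0.2), (9, 0.5), (9, 0.4), (8, 0.7), (7, 0.6)]))
```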

  10. Convex hull ranking algorithm for multi-objective evolutionary algorithms

    NARCIS (Netherlands)

    Davoodi Monfrared, M.; Mohades, A.; Rezaei, J.

    2012-01-01

    Due to many applications of multi-objective evolutionary algorithms in real world optimization problems, several studies have been done to improve these algorithms in recent years. Since most multi-objective evolutionary algorithms are based on the non-dominated principle, and their complexity

  11. An EM Algorithm for Double-Pareto-Lognormal Generalized Linear Model Applied to Heavy-Tailed Insurance Claims

    Directory of Open Access Journals (Sweden)

    Enrique Calderín-Ojeda

    2017-11-01

    Full Text Available Generalized linear models might not be appropriate when the probability of extreme events is higher than that implied by the normal distribution. Extending the method for estimating the parameters of a double Pareto lognormal distribution (DPLN) in Reed and Jorgensen (2004), we develop an EM algorithm for the heavy-tailed Double-Pareto-lognormal generalized linear model. The DPLN distribution is obtained as a mixture of a lognormal distribution with a double Pareto distribution. In this paper the associated generalized linear model has the location parameter equal to a linear predictor, which is used to model insurance claim amounts for various data sets. The performance is compared with those of the generalized beta (of the second kind) and lognormal distributions.

  12. A New Methodology to Select the Preferred Solutions from the Pareto-optimal Set: Application to Polymer Extrusion

    International Nuclear Information System (INIS)

    Ferreira, Jose C.; Gaspar-Cunha, Antonio; Fonseca, Carlos M.

    2007-01-01

    Most real-world optimization problems involve multiple, usually conflicting, optimization criteria. Generating Pareto optimal solutions plays an important role in multi-objective optimization, and the problem is considered to be solved when the Pareto optimal set is found, i.e., the set of non-dominated solutions. Multi-Objective Evolutionary Algorithms based on the principle of Pareto optimality are designed to produce the complete set of non-dominated solutions. However, this is not always enough, since the aim is not only to know the Pareto set but also to obtain one solution from this Pareto set. Thus, it is necessary to define a methodology able to select a single solution from the set of non-dominated solutions (or a region of the Pareto frontier), taking into account the preferences of a Decision Maker (DM). A different method, based on a weighted stress function, is proposed. It is able to integrate the user's preferences in order to find the best region of the Pareto frontier in accordance with these preferences. This method was tested on some benchmark test problems, with two and three criteria, and on a polymer extrusion problem. The methodology is able to efficiently select the best Pareto-frontier region for the specified relative importance of the criteria

  13. Safety management in NPPs using an evolutionary algorithm technique

    International Nuclear Information System (INIS)

    Mishra, Alok; Patwardhan, Anand; Verma, A.K.

    2007-01-01

    The general goal of safety management in Nuclear Power Plants (NPPs) is to make requirements and activities more risk effective and less costly. The technical specification and maintenance (TS and M) activities in a plant are associated with controlling risk or with satisfying requirements, and are candidates to be evaluated for their resource effectiveness in risk-informed applications. Accordingly, the risk-based analysis of technical specification (RBTS) is being considered in evaluating current TS. The multi-objective optimization of the TS and M requirements of a NPP based on risk and cost gives the Pareto-optimal solutions, from which the utility can pick the decision variables that suit its interests. In this paper, a multi-objective evolutionary algorithm technique has been used to make a trade-off between risk and cost both at the system level and at the plant level for loss of coolant accident (LOCA) and main steam line break (MSLB) as initiating events.

  14. Pareto design of state feedback tracking control of a biped robot via multiobjective PSO in comparison with sigma method and genetic algorithms: modified NSGAII and MATLAB's toolbox.

    Science.gov (United States)

    Mahmoodabadi, M J; Taherkhorsandi, M; Bagheri, A

    2014-01-01

    An optimal robust state feedback tracking controller is introduced to control a biped robot. In the literature, the parameters of the controller are usually determined by a tedious trial-and-error process. To eliminate this process and design the parameters of the proposed controller, multiobjective evolutionary algorithms, namely the proposed method, modified NSGAII, the Sigma method, and MATLAB's Toolbox MOGA, are employed in this study. Among the evolutionary optimization algorithms used to design the controller for biped robots, the proposed method performs better in designing the controller, since it provides ample opportunities for designers to choose the most appropriate point based upon the design criteria. Three points are chosen from the nondominated solutions of the obtained Pareto front based on two conflicting objective functions, that is, the normalized summation of angle errors and the normalized summation of control effort. The obtained results elucidate the efficiency of the proposed controller in controlling a biped robot.

  15. MULTI-OBJECTIVE OPTIMAL DESIGN OF GROUNDWATER REMEDIATION SYSTEMS: APPLICATION OF THE NICHED PARETO GENETIC ALGORITHM (NPGA). (R826614)

    Science.gov (United States)

    A multiobjective optimization algorithm is applied to a groundwater quality management problem involving remediation by pump-and-treat (PAT). The multiobjective optimization framework uses the niched Pareto genetic algorithm (NPGA) and is applied to simultaneously minimize the...

  16. Multi-objective exergy-based optimization of a polygeneration energy system using an evolutionary algorithm

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Rosen, Marc A.; Dincer, Ibrahim

    2012-01-01

    A comprehensive thermodynamic modeling and optimization of a polygeneration energy system is reported for the simultaneous production of heating, cooling, electricity and hot water from a common energy source. This polygeneration system is composed of four major parts: gas turbine (GT) cycle, Rankine cycle, absorption cooling cycle and domestic hot water heater. A multi-objective optimization method based on an evolutionary algorithm is applied to determine the best design parameters for the system. The two objective functions utilized in the analysis are the total cost rate of the system, which is the cost associated with fuel, component purchasing and environmental impact, and the system exergy efficiency. The total cost rate of the system is minimized while the cycle exergy efficiency is maximized by using an evolutionary algorithm. To provide a deeper insight, the Pareto frontier is shown for the multi-objective optimization. In addition, a closed-form equation for the relationship between exergy efficiency and total cost rate is derived. Finally, a sensitivity analysis is performed to assess the effects of several design parameters on the system total exergy destruction rate, CO2 emission and exergy efficiency.

  17. Optimization of constrained multiple-objective reliability problems using evolutionary algorithms

    International Nuclear Information System (INIS)

    Salazar, Daniel; Rocco, Claudio M.; Galvan, Blas J.

    2006-01-01

    This paper illustrates the use of multi-objective optimization to solve three types of reliability optimization problems: to find the optimal number of redundant components, find the reliability of components, and determine both their redundancy and reliability. In general, these problems have been formulated as single objective mixed-integer non-linear programming problems with one or several constraints and solved by using mathematical programming techniques or special heuristics. In this work, these problems are reformulated as multiple-objective problems (MOP) and then solved by using a second-generation Multiple-Objective Evolutionary Algorithm (MOEA) that allows handling constraints. The MOEA used in this paper (NSGA-II) demonstrates the ability to identify a set of optimal solutions (Pareto front), which provides the Decision Maker with a complete picture of the optimal solution space. Finally, the advantages of both MOP and MOEA approaches are illustrated by solving four redundancy problems taken from the literature

  18. Optimization of constrained multiple-objective reliability problems using evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Salazar, Daniel [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain) and Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: danielsalazaraponte@gmail.com; Rocco, Claudio M. [Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: crocco@reacciun.ve; Galvan, Blas J. [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain)]. E-mail: bgalvan@step.es

    2006-09-15

    This paper illustrates the use of multi-objective optimization to solve three types of reliability optimization problems: to find the optimal number of redundant components, find the reliability of components, and determine both their redundancy and reliability. In general, these problems have been formulated as single objective mixed-integer non-linear programming problems with one or several constraints and solved by using mathematical programming techniques or special heuristics. In this work, these problems are reformulated as multiple-objective problems (MOP) and then solved by using a second-generation Multiple-Objective Evolutionary Algorithm (MOEA) that allows handling constraints. The MOEA used in this paper (NSGA-II) demonstrates the ability to identify a set of optimal solutions (Pareto front), which provides the Decision Maker with a complete picture of the optimal solution space. Finally, the advantages of both MOP and MOEA approaches are illustrated by solving four redundancy problems taken from the literature.

  19. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain premature phenomena of the quantum evolutionary algorithm. The proposed algorithm adopts the chaotic initialization method to generate initial population which will form a pe...... tests. The presented algorithm is applied to urban traffic signal timing optimization and the effect is satisfactory....

  20. Species co-evolutionary algorithm: a novel evolutionary algorithm based on the ecology and environments for optimization

    DEFF Research Database (Denmark)

    Li, Wuzhao; Wang, Lei; Cai, Xingjuan

    2015-01-01

    and affect each other in many ways. The relationships include competition, predation, parasitism, mutualism and pythogenesis. In this paper, we consider the five relationships between solutions to propose a co-evolutionary algorithm termed species co-evolutionary algorithm (SCEA). In SCEA, five operators...

  1. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  2. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is "are there any algorithms that can design evolutionary algorithms automatically?" A more complete statement of the question is "can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?" In this paper, a novel evolutionary algorithm based on automatic designing of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space like most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which shows that algorithms designed automatically by computers can compete with algorithms designed by human beings.

  3. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    failure probability. Multiobjective Evolutionary Computation algorithms (MOEAs) are well-suited for multiobjective task scheduling in a heterogeneous environment. The two Multi-Objective Evolutionary Algorithms, namely the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with ...

  4. A heuristic ranking approach on capacity benefit margin determination using Pareto-based evolutionary programming technique.

    Science.gov (United States)

    Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas

    2015-01-01

    This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that a reliable access to generation through interconnected system could be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies are conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is in terms of flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.

  5. A Heuristic Ranking Approach on Capacity Benefit Margin Determination Using Pareto-Based Evolutionary Programming Technique

    Directory of Open Access Journals (Sweden)

    Muhammad Murtadha Othman

    2015-01-01

    Full Text Available This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that a reliable access to generation through interconnected system could be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies are conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is in terms of flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.

  6. A Note on Evolutionary Algorithms and Its Applications

    Science.gov (United States)

    Bhargava, Shifali

    2013-01-01

    This paper introduces evolutionary algorithms with their applications in multi-objective optimization. Here elitist and non-elitist multiobjective evolutionary algorithms are discussed with their advantages and disadvantages. We also discuss constrained multiobjective evolutionary algorithms and their applications in various areas.

  7. Evolutionary algorithms for multi-objective energetic and economic optimization in thermal system design

    International Nuclear Information System (INIS)

    Toffolo, A.; Lazzaretto, A.

    2002-01-01

    Thermoeconomic analyses in thermal system design are always focused on the economic objective. However, knowledge of only the economic minimum may not be sufficient in the decision making process, since solutions with a higher thermodynamic efficiency, in spite of small increases in total costs, may result in much more interesting designs due to changes in energy market prices or in energy policies. This paper suggests how to perform a multi-objective optimization in order to find solutions that simultaneously satisfy exergetic and economic objectives. This corresponds to a search for the set of Pareto optimal solutions with respect to the two competing objectives. The optimization process is carried out by an evolutionary algorithm that features a new diversity-preserving mechanism, using the well-known CGAM problem as a test case. (author)

  8. A tabu search evolutionary algorithm for multiobjective optimization: Application to a bi-criterion aircraft structural reliability problem

    Science.gov (United States)

    Long, Kim Chenming

    Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this

  9. Multi-objective thermodynamic optimization of an irreversible regenerative Brayton cycle using evolutionary algorithm and decision making

    Directory of Open Access Journals (Sweden)

    Rajesh Kumar

    2016-06-01

    Full Text Available A Brayton heat engine model is developed in the MATLAB Simulink environment and thermodynamic optimization based on finite-time thermodynamic analysis along with multiple criteria is implemented. The proposed work investigates optimal values of various decision variables that simultaneously optimize power output, thermal efficiency and ecological function using an evolutionary algorithm based on NSGA-II. The Pareto optimal frontier between triple and dual objectives is obtained and the best optimal value is selected using the Fuzzy, TOPSIS, LINMAP and Shannon's entropy decision-making methods. The triple-objective evolutionary approach applied to the proposed model gives power output, thermal efficiency and ecological function as (53.89 kW, 0.1611, −142 kW), which are 29.78%, 25.86% and 21.13% lower in comparison with the reversible system. Furthermore, the present study reflects the effect of various heat capacitance rates and component efficiencies on the triple objectives in graphical form. Finally, with the aim of error investigation, average and maximum errors of the obtained results are computed.
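
    One of the decision-making steps named above, TOPSIS, can be sketched as ranking the Pareto points by relative closeness to an ideal point. The vector normalization, equal weighting and sample numbers below are illustrative assumptions, not values from the study.

```python
import numpy as np

def topsis_pick(front, maximize):
    """Return the index of the Pareto point closest (in TOPSIS terms) to the ideal.

    front:    (n_points, n_objectives) array of objective values.
    maximize: booleans, True where larger is better for that objective.
    """
    F = np.asarray(front, dtype=float)
    R = F / np.linalg.norm(F, axis=0)                     # vector-normalize each column
    ideal = np.where(maximize, R.max(axis=0), R.min(axis=0))
    nadir = np.where(maximize, R.min(axis=0), R.max(axis=0))
    d_best = np.linalg.norm(R - ideal, axis=1)
    d_worst = np.linalg.norm(R - nadir, axis=1)
    closeness = d_worst / (d_best + d_worst)
    return int(np.argmax(closeness))

# Example: fictitious (power output kW, thermal efficiency, ecological function) rows.
front = [[50.0, 0.150, -150.0], [60.0, 0.140, -170.0], [45.0, 0.170, -130.0]]
print(topsis_pick(front, maximize=[True, True, True]))
```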

  10. Approximating convex Pareto surfaces in multiobjective radiotherapy planning

    International Nuclear Information System (INIS)

    Craft, David L.; Halabi, Tarek F.; Shih, Helen A.; Bortfeld, Thomas R.

    2006-01-01

    Radiotherapy planning involves inherent tradeoffs: the primary mission, to treat the tumor with a high, uniform dose, is in conflict with normal tissue sparing. We seek to understand these tradeoffs on a case-to-case basis, by computing for each patient a database of Pareto optimal plans. A treatment plan is Pareto optimal if there does not exist another plan which is better in every measurable dimension. The set of all such plans is called the Pareto optimal surface. This article presents an algorithm for computing well distributed points on the (convex) Pareto optimal surface of a multiobjective programming problem. The algorithm is applied to intensity-modulated radiation therapy inverse planning problems, and results of a prostate case and a skull base case are presented, in three and four dimensions, investigating tradeoffs between tumor coverage and critical organ sparing

  11. Evolutionary algorithms for mobile ad hoc networks

    CERN Document Server

    Dorronsoro, Bernabé; Danoy, Grégoire; Pigné, Yoann; Bouvry, Pascal

    2014-01-01

    Describes how evolutionary algorithms (EAs) can be used to identify, model, and minimize day-to-day problems that arise for researchers in optimization and mobile networking. Mobile ad hoc networks (MANETs), vehicular networks (VANETs), sensor networks (SNs), and hybrid networks—each of these require a designer’s keen sense and knowledge of evolutionary algorithms in order to help with the common issues that plague professionals involved in optimization and mobile networking. This book introduces readers to both mobile ad hoc networks and evolutionary algorithms, presenting basic concepts as well as detailed descriptions of each. It demonstrates how metaheuristics and evolutionary algorithms (EAs) can be used to help provide low-cost operations in the optimization process—allowing designers to put some “intelligence” or sophistication into the design. It also offers efficient and accurate information on dissemination algorithms topology management, and mobility models to address challenges in the ...

  12. Multi-objective genetic algorithm optimization of 2D- and 3D-Pareto fronts for vibrational quantum processes

    International Nuclear Information System (INIS)

    Gollub, C; De Vivie-Riedle, R

    2009-01-01

    A multi-objective genetic algorithm is applied to optimize picosecond laser fields, driving vibrational quantum processes. Our examples are state-to-state transitions and unitary transformations. The approach allows features of the shaped laser fields and of the excitation mechanisms to be controlled simultaneously with the quantum yield. Within the parameter range accessible to the experiment, we focus on short pulse durations and low pulse energies to optimize preferably robust laser fields. Multidimensional Pareto fronts for these conflicting objectives could be constructed. Comparison with previous work showed that the solutions from Pareto optimizations and from optimal control theory match very well.

  13. Comparing Evolutionary Strategies on a Biobjective Cultural Algorithm

    Directory of Open Access Journals (Sweden)

    Carolina Lagos

    2014-01-01

    Full Text Available Evolutionary algorithms have been widely used to solve large and complex optimisation problems. Cultural algorithms (CAs) are evolutionary algorithms that have been used to solve both single and, to a lesser extent, multiobjective optimisation problems. In order to solve these optimisation problems, CAs make use of different strategies such as normative knowledge, historical knowledge, and circumstantial knowledge, among others. In this paper we present a comparison among CAs that make use of different evolutionary strategies; the first one implements historical knowledge, the second one considers circumstantial knowledge, and the third one implements normative knowledge. These CAs are applied to a biobjective uncapacitated facility location problem (BOUFLP), the biobjective version of the well-known uncapacitated facility location problem. To the best of our knowledge, only a few articles have applied evolutionary multiobjective algorithms to the BOUFLP and none of those has focused on the impact of the evolutionary strategy on the algorithm performance. Our biobjective cultural algorithm, called BOCA, obtains important improvements when compared to other well-known evolutionary biobjective optimisation algorithms such as PAES and NSGA-II. The conflicting objective functions considered in this study are cost minimisation and coverage maximisation. Solutions obtained by each algorithm are compared using a hypervolume S metric.

  14. Dynamic Uniform Scaling for Multiobjective Genetic Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Goldberg, David E.

    2004-01-01

    Before Multiobjective Evolutionary Algorithms (MOEAs) can be used as a widespread tool for solving arbitrary real world problems there are some salient issues which require further investigation. One of these issues is how a uniform distribution of solutions along the Pareto non-dominated front c...

  15. Dynamic Uniform Scaling for Multiobjective Genetic Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Goldberg, D.E.

    2004-01-01

    Before Multiobjective Evolutionary Algorithms (MOEAs) can be used as a widespread tool for solving arbitrary real world problems there are some salient issues which require further investigation. One of these issues is how a uniform distribution of solutions along the Pareto non-dominated front can...

  16. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm opti

  17. Pareto-depth for multiple-query image retrieval.

    Science.gov (United States)

    Hsiao, Ko-Jen; Calder, Jeff; Hero, Alfred O

    2015-02-01

    Most content-based image retrieval systems consider either one single query, or multiple queries that include the same object or represent the same semantic information. In this paper, we consider the content-based image retrieval problem for multiple query images corresponding to different image semantics. We propose a novel multiple-query information retrieval algorithm that combines the Pareto front method with efficient manifold ranking. We show that our proposed algorithm outperforms state-of-the-art multiple-query retrieval algorithms on real-world image databases. We attribute this performance improvement to concavity properties of the Pareto fronts, and prove a theoretical result that characterizes the asymptotic concavity of the fronts.
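
    The Pareto front method referred to above can be sketched as repeated front peeling over per-query dissimilarities: each database item receives one dissimilarity per query, the non-dominated items form depth 1 and are retrieved first, removing them exposes depth 2, and so on. This is a simplified illustration; the paper's algorithm computes the dissimilarities with efficient manifold ranking, which is not shown here.

```python
def pareto_depths(points):
    """Assign each 2D dissimilarity point a Pareto depth by front peeling.

    Depth 1 is the first non-dominated (minimizing) front; removing it and
    repeating yields depth 2, and so on. Lower depth means retrieved earlier.
    """
    remaining = dict(enumerate(points))
    depths, depth = {}, 1
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                            for q in remaining.values())]
        for i in front:
            depths[i] = depth
            del remaining[i]
        depth += 1
    return depths

# Example: dissimilarities of five items to two query images.
print(pareto_depths([(0.1, 0.9), (0.5, 0.5), (0.9, 0.1), (0.6, 0.6), (0.8, 0.8)]))
```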

  18. Using Coevolution Genetic Algorithm with Pareto Principles to Solve Project Scheduling Problem under Duration and Cost Constraints

    Directory of Open Access Journals (Sweden)

    Alexandr Victorovich Budylskiy

    2014-06-01

    Full Text Available This article considers a multicriteria optimization approach that uses a modified genetic algorithm to solve the project-scheduling problem under duration and cost constraints. The work lists the available options for solving this problem and justifies the multicriteria optimization approach. The study describes the Pareto principles used in the modified genetic algorithm, identifies the mathematical model of the project-scheduling problem, and introduces the modified genetic algorithm, the ranking strategies, and the elitism approaches. The article includes an example.

  19. Attribute Index and Uniform Design Based Multiobjective Association Rule Mining with Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Zhang

    2013-01-01

    Full Text Available In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent of a rule, and the whole rule. In order to decrease the number of comparisons and the time consumed, we present an attribute index strategy. It only needs to scan the database once to create the attribute index of each attribute. Then all metric values needed to evaluate an association rule do not require any further database scans, but are acquired only by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm, abbreviated as IUARMMEA. It does not require the user-specified minimum support and minimum confidence anymore, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance, and it can significantly reduce the number of comparisons and the time consumption.

  20. Attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm.

    Science.gov (United States)

    Zhang, Jie; Wang, Yuping; Feng, Junhong

    2013-01-01

    In association rule mining, evaluating an association rule requires repeatedly scanning the database to compare the whole database with the antecedent, the consequent of a rule, and the whole rule. In order to decrease the number of comparisons and the time consumed, we present an attribute index strategy. It only needs to scan the database once to create the attribute index of each attribute. Then all metric values needed to evaluate an association rule do not require any further database scans, but are acquired only by means of the attribute indices. The paper treats association rule mining as a multiobjective problem rather than a single-objective one. In order to make the acquired solutions scatter uniformly toward the Pareto frontier in the objective space, an elitism policy and uniform design are introduced. The paper presents the algorithm of attribute index and uniform design based multiobjective association rule mining with evolutionary algorithm, abbreviated as IUARMMEA. It does not require the user-specified minimum support and minimum confidence anymore, but uses a simple attribute index. It uses a well-designed real encoding so as to extend its application scope. Experiments performed on several databases demonstrate that the proposed algorithm has excellent performance, and it can significantly reduce the number of comparisons and the time consumption.
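
    A rough sketch of the attribute-index idea described above: a single pass over the database records, for each attribute value, the set of transaction ids containing it, after which a rule's support and confidence come from set intersections rather than database rescans. The data structures and names are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

def build_attribute_index(transactions):
    """One database scan: map each attribute value to the set of transaction ids containing it."""
    index = defaultdict(set)
    for tid, items in enumerate(transactions):
        for item in items:
            index[item].add(tid)
    return index

def rule_metrics(index, antecedent, consequent, n_transactions):
    """Support and confidence of 'antecedent -> consequent' computed purely from the index."""
    cover = lambda items: set.intersection(*(index[i] for i in items))
    ante = cover(antecedent)
    both = ante & cover(consequent)
    support = len(both) / n_transactions
    confidence = len(both) / len(ante) if ante else 0.0
    return support, confidence

# Example: four transactions; evaluate {milk} -> {bread} without rescanning them.
db = [{"milk", "bread"}, {"milk"}, {"bread", "eggs"}, {"milk", "bread", "eggs"}]
idx = build_attribute_index(db)
print(rule_metrics(idx, {"milk"}, {"bread"}, len(db)))   # (0.5, 0.666...)
```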

  1. Pareto-Optimization of HTS CICC for High-Current Applications in Self-Field

    Directory of Open Access Journals (Sweden)

    Giordano Tomassetti

    2018-01-01

    Full Text Available The ENEA superconductivity laboratory developed a novel design for Cable-in-Conduit Conductors (CICCs) comprised of stacks of 2nd-generation REBCO coated conductors. In its original version, the cable was made up of 150 HTS tapes distributed in five slots, twisted along an aluminum core. In this work, taking advantage of a 2D finite element model able to estimate the cable's current distribution in the cross-section, a multiobjective optimization procedure was implemented. The aim of the optimization was to simultaneously maximize both the engineering current density and the total current flowing inside the tapes when operating in self-field, by varying the cross-section layout. Since the optimization process involved both integer and real geometrical variables, the choice of an evolutionary search algorithm was strictly necessary. The use of an evolutionary algorithm within a multiple-objective optimization framework made a nonstandard, fast-converging optimization algorithm the natural numerical choice for the problem. By means of this algorithm, the Pareto frontiers for the different configurations were calculated, providing a powerful tool for the designer to achieve the desired preliminary operating conditions in terms of engineering current density and/or total current, depending on the specific application field, that is, power transmission cable and bus bar systems.

  2. Pareto front estimation for decision making.

    Science.gov (United States)

    Giagkiozis, Ioannis; Fleming, Peter J

    2014-01-01

    The set of available multi-objective optimisation algorithms continues to grow. This fact can be partially attributed to their widespread use and applicability. However, this increase also suggests several issues remain to be addressed satisfactorily. One such issue is the diversity and the number of solutions available to the decision maker (DM). Even for algorithms very well suited for a particular problem, it is difficult, mainly due to the computational cost, to use a population large enough to ensure the likelihood of obtaining a solution close to the DM's preferences. In this paper we present a novel methodology that produces additional Pareto optimal solutions from a Pareto optimal set obtained at the end of a run of any multi-objective optimisation algorithm, for two-objective and three-objective problem instances.

  3. An Evolutionary Multi-objective Approach for Speed Tuning Optimization with Energy Saving in Railway Management

    OpenAIRE

    Chevrier , Rémy

    2010-01-01

    An approach for speed tuning in railway management is presented for optimizing both travel duration and energy saving. This approach is based on a state-of-the-art evolutionary algorithm with a Pareto approach. This algorithm provides a set of diversified non-dominated solutions to the decision-maker. A case study on the Gonesse connection (France) is also reported and analyzed.

  4. Modelling and Pareto optimization of heat transfer and flow coefficients in microchannels using GMDH type neural networks and genetic algorithms

    International Nuclear Information System (INIS)

    Amanifard, N.; Nariman-Zadeh, N.; Borji, M.; Khalkhali, A.; Habibdoust, A.

    2008-01-01

    Three-dimensional heat transfer characteristics and pressure drop of water flow in a set of rectangular microchannels are numerically investigated using Fluent and compared with experimental results. Two metamodels based on the evolved group method of data handling (GMDH) type neural networks are then obtained for modelling both the pressure drop (ΔP) and the Nusselt number (Nu) with respect to design variables such as the geometrical parameters of the microchannels, the amount of heat flux and the Reynolds number. Using the obtained polynomial neural networks, a multi-objective genetic algorithm (the non-dominated sorting genetic algorithm, NSGA-II) with a new diversity-preserving mechanism is then used for Pareto-based optimization of the microchannels, considering two conflicting objectives, namely (ΔP) and (Nu). It is shown that some interesting and important relationships, serving as useful optimal design principles involved in the performance of microchannels, can be discovered by Pareto-based multi-objective optimization of the obtained polynomial metamodels representing their heat transfer and flow characteristics. Such important optimal principles would not have been obtained without the use of both GMDH-type neural network modelling and the Pareto optimization approach.

  5. Infrastructure system restoration planning using evolutionary algorithms

    Science.gov (United States)

    Corns, Steven; Long, Suzanna K.; Shoberg, Thomas G.

    2016-01-01

    This paper presents an evolutionary algorithm to address restoration issues for supply chain interdependent critical infrastructure. Rapid restoration of infrastructure after a large-scale disaster is necessary to sustaining a nation's economy and security, but such long-term restoration has not been investigated as thoroughly as initial rescue and recovery efforts. A model of the Greater Saint Louis Missouri area was created and a disaster scenario simulated. An evolutionary algorithm is used to determine the order in which the bridges should be repaired based on indirect costs. Solutions were evaluated based on the reduction of indirect costs and the restoration of transportation capacity. When compared to a greedy algorithm, the evolutionary algorithm solution reduced indirect costs by approximately 12.4% by restoring automotive travel routes for workers and re-establishing the flow of commodities across the three rivers in the Saint Louis area.

  6. Multi-Objective Optimization of Hybrid Renewable Energy System Using an Enhanced Multi-Objective Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Mengjun Ming

    2017-05-01

    Full Text Available Due to the scarcity of conventional energy resources and the greenhouse effect, renewable energies have gained more attention. This paper proposes methods for the multi-objective optimal design of a hybrid renewable energy system (HRES) in both isolated-island and grid-connected modes. In each mode, the optimal design aims to find suitable configurations of photovoltaic (PV) panels, wind turbines, batteries and diesel generators in the HRES such that the system cost and the fuel emission are minimized, and the system reliability/renewable ability (corresponding to the different modes) is maximized. To effectively solve this multi-objective problem (MOP), the multi-objective evolutionary algorithm based on decomposition (MOEA/D) using the localized penalty-based boundary intersection (LPBI) method is proposed. The algorithm, denoted as MOEA/D-LPBI, is demonstrated to outperform its competitors on the HRES model as well as a set of benchmarks. Moreover, it effectively obtains a good approximation of the Pareto optimal HRES configurations. By further considering a decision maker's preference, the most satisfactory configuration of the HRES can be identified.
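
    For context, the standard penalty-based boundary intersection (PBI) scalarization that decomposition algorithms such as MOEA/D build on can be sketched as below; the localized variant (LPBI) used in the paper adds further machinery that is not reproduced here, and the penalty value and example numbers are assumptions.

```python
import numpy as np

def pbi(f, weight, ideal, theta=5.0):
    """Standard PBI scalarization of objective vector f for one weight vector.

    d1 is the projection of (f - ideal) onto the weight direction (convergence);
    d2 is the distance from f to that direction (diversity); minimize d1 + theta*d2.
    """
    f, w, z = (np.asarray(v, dtype=float) for v in (f, weight, ideal))
    direction = w / np.linalg.norm(w)
    d1 = np.dot(f - z, direction)
    d2 = np.linalg.norm(f - z - d1 * direction)
    return d1 + theta * d2

# Example: two candidate designs evaluated on normalized (cost, emission), both minimized.
ideal = [0.0, 0.0]
print(pbi([0.4, 0.6], weight=[0.5, 0.5], ideal=ideal))   # closer to the weight direction
print(pbi([0.3, 0.9], weight=[0.5, 0.5], ideal=ideal))   # farther off, larger penalty
```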

  7. Multi-objective mixture-based iterated density estimation evolutionary algorithms

    NARCIS (Netherlands)

    Thierens, D.; Bosman, P.A.N.

    2001-01-01

    We propose an algorithm for multi-objective optimization using a mixture-based iterated density estimation evolutionary algorithm (MIDEA). The MIDEA algorithm is a probabilistic model building evolutionary algorithm that constructs at each generation a mixture of factorized probability

  8. Can we reach Pareto optimal outcomes using bottom-up approaches?

    NARCIS (Netherlands)

    V. Sanchez-Anguix (Victor); R. Aydoğan (Reyhan); T. Baarslag (Tim); C.M. Jonker (Catholijn)

    2016-01-01

    Classically, disciplines like negotiation and decision making have focused on reaching Pareto optimal solutions due to their stability and efficiency properties. Despite the fact that many practical and theoretical algorithms have successfully attempted to provide Pareto optimal solutions,

  9. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Apr 2, 2016 ... Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. .... discovered and optimised using a large-scale parallel computational device and relevant soft-.

  10. Application of a rule extraction algorithm family based on the Re-RX algorithm to financial credit risk assessment from a Pareto optimal perspective

    Directory of Open Access Journals (Sweden)

    Yoichi Hayashi

    2016-01-01

    Full Text Available Historically, the assessment of credit risk has proved to be both highly important and extremely difficult. Currently, financial institutions rely on the use of computer-generated credit scores for risk assessment. However, automated risk evaluations are currently imperfect, and the loss of vast amounts of capital could be prevented by improving the performance of computerized credit assessments. A number of approaches have been developed for the computation of credit scores over the last several decades, but these methods have been considered too complex without good interpretability and have therefore not been widely adopted. Therefore, in this study, we provide the first comprehensive comparison of results regarding the assessment of credit risk obtained using 10 runs of 10-fold cross validation of the Re-RX algorithm family, including the Re-RX algorithm, the Re-RX algorithm with both discrete and continuous attributes (Continuous Re-RX), the Re-RX algorithm with J48graft, the Re-RX algorithm with a trained neural network (Sampling Re-RX), NeuroLinear, NeuroLinear+GRG, and three unique rule extraction techniques involving support vector machines and Minerva from four real-life, two-class mixed credit-risk datasets. We also discuss the roles of various newly-extended types of the Re-RX algorithm and high performance classifiers from a Pareto optimal perspective. Our findings suggest that Continuous Re-RX, Re-RX with J48graft, and Sampling Re-RX comprise a powerful management tool that allows the creation of advanced, accurate, concise and interpretable decision support systems for credit risk evaluation. In addition, from a Pareto optimal perspective, the Re-RX algorithm family has superior features in relation to the comprehensibility of extracted rules and the potential for credit scoring with Big Data.

  11. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    Science.gov (United States)

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A possibilistic approach to rotorcraft design through a multi-objective evolutionary algorithm

    Science.gov (United States)

    Chae, Han Gil

    Most of the engineering design processes in use today in the field may be considered as a series of successive decision making steps. The decision maker uses information at hand, determines the direction of the procedure, and generates information for the next step and/or other decision makers. However, the information is often incomplete, especially in the early stages of the design process of a complex system. As the complexity of the system increases, uncertainties eventually become unmanageable using traditional tools. In such a case, the tools and analysis values need to be "softened" to account for the designer's intuition. One of the methods that deals with issues of intuition and incompleteness is possibility theory. Through the use of possibility theory coupled with fuzzy inference, the uncertainties estimated by the intuition of the designer are quantified for design problems. By involving quantified uncertainties in the tools, the solutions can represent a possible set, instead of a crisp spot, for predefined levels of certainty. From a different point of view, it is a well known fact that engineering design is a multi-objective problem or a set of such problems. The decision maker aims to find satisfactory solutions, sometimes compromising the objectives that conflict with each other. Once the candidates of possible solutions are generated, a satisfactory solution can be found by various decision-making techniques. A number of multi-objective evolutionary algorithms (MOEAs) have been developed, and can be found in the literature, which are capable of generating alternative solutions and evaluating multiple sets of solutions in one single execution of an algorithm. One of the MOEA techniques that has been proven to be very successful for this class of problems is the strength Pareto evolutionary algorithm (SPEA) which falls under the dominance-based category of methods. The Pareto dominance that is used in SPEA, however, is not enough to account for the

  13. On the construction of experimental designs for a given task by jointly optimizing several quality criteria: Pareto-optimal experimental designs.

    Science.gov (United States)

    Sánchez, M S; Sarabia, L A; Ortiz, M C

    2012-11-19

    Experimental designs for a given task should be selected on the basis of the problem being solved and of some criteria that measure their quality. There are several such criteria because there are several aspects to be taken into account when making a choice. The most used criteria are probably the so-called alphabetical optimality criteria (for example, the A-, E- and D-criteria related to the joint estimation of the coefficients, or the I- and G-criteria related to the prediction variance). Selecting a proper design to solve a problem implies finding a balance among these several criteria that measure the performance of the design in different aspects. Technically this is a problem of multi-criteria optimization, which can be tackled from different views. The approach presented here addresses the problem in its real vector nature, so that ad hoc experimental designs are generated with an algorithm based on evolutionary algorithms to find the Pareto-optimal front. There is no theoretical limit to the number of criteria that can be studied and, contrary to other approaches, not just one experimental design is computed but a set of experimental designs, all of them with the property of being Pareto-optimal in the criteria needed by the user. Besides, the use of an evolutionary algorithm makes it possible to search in both continuous and discrete domains and avoids the need for a set of candidate points, which is usual in exchange algorithms. Copyright © 2012 Elsevier B.V. All rights reserved.
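
    The record above evaluates candidate designs on several alphabetical criteria and keeps the Pareto-optimal set. A minimal sketch of that idea, assuming D- and A-optimality as the two criteria and small random two-factor designs as candidates; the paper's evolutionary search itself is not reproduced.

```python
import numpy as np

def d_criterion(X):
    # D-optimality: maximize det(X'X); negate so that smaller is better.
    return -np.linalg.det(X.T @ X)

def a_criterion(X):
    # A-optimality: minimize trace((X'X)^-1).
    return np.trace(np.linalg.inv(X.T @ X))

def pareto_filter(points):
    """Indices of the non-dominated rows of `points` (all objectives minimized)."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(pts) if j != i)
        if not dominated:
            keep.append(i)
    return keep

# A few random 6-run, two-factor designs with an intercept column (illustrative only).
rng = np.random.default_rng(0)
designs = [np.hstack([np.ones((6, 1)), rng.uniform(-1, 1, (6, 2))]) for _ in range(20)]
scores = [(d_criterion(X), a_criterion(X)) for X in designs]
print("Pareto-optimal design indices:", pareto_filter(scores))
```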

  14. Pareto navigation-algorithmic foundation of interactive multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Monz, M; Kuefer, K H; Bortfeld, T R; Thieke, C

    2008-01-01

    Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle, a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.

  15. Pareto navigation: algorithmic foundation of interactive multi-criteria IMRT planning.

    Science.gov (United States)

    Monz, M; Küfer, K H; Bortfeld, T R; Thieke, C

    2008-02-21

    Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle -- a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.

  16. A Double Evolutionary Pool Memetic Algorithm for Examination Timetabling Problems

    Directory of Open Access Journals (Sweden)

    Yu Lei

    2014-01-01

    Full Text Available A double evolutionary pool memetic algorithm is proposed to solve the examination timetabling problem. To improve the performance of the proposed algorithm, two evolutionary pools, that is, the main evolutionary pool and the secondary evolutionary pool, are employed. The genetic operators have been specially designed to fit the examination timetabling problem. A simplified version of the simulated annealing strategy is designed to speed the convergence of the algorithm. A clonal mechanism is introduced to preserve population diversity. Extensive experiments carried out on 12 benchmark examination timetabling instances show that the proposed algorithm is able to produce promising results for the uncapacitated examination timetabling problem.

  17. Modelling and multi-objective optimization of a variable valve-timing spark-ignition engine using polynomial neural networks and evolutionary algorithms

    International Nuclear Information System (INIS)

    Atashkari, K.; Nariman-Zadeh, N.; Goelcue, M.; Khalkhali, A.; Jamali, A.

    2007-01-01

    The main reason for the efficiency decrease at part load conditions for four-stroke spark-ignition (SI) engines is the flow restriction at the cross-sectional area of the intake system. Traditionally, valve-timing has been designed to optimize operation at high engine-speed and wide open throttle conditions. Several investigations have demonstrated that improvements in engine performance at part load conditions can be accomplished if the valve-timing is variable. Controlling valve-timing can be used to improve the torque and power curve as well as to reduce fuel consumption and emissions. In this paper, a group method of data handling (GMDH) type neural network and evolutionary algorithms (EAs) are first used for modelling the effects of intake valve-timing (Vt) and engine speed (N) of a spark-ignition engine on both developed engine torque (T) and fuel consumption (Fc) using some experimentally obtained training and test data. Using the polynomial neural network models thus obtained, a multi-objective EA (non-dominated sorting genetic algorithm, NSGA-II) with a new diversity preserving mechanism is then used for Pareto-based optimization of the variable valve-timing engine considering two conflicting objectives, torque (T) and fuel consumption (Fc). The comparison results demonstrate the superiority of the GMDH type models over feedforward neural network models in terms of the statistical measures in the training data, testing data and the number of hidden neurons. Further, it is shown that some interesting and important relationships, as useful optimal design principles, involved in the performance of the variable valve-timing four-stroke spark-ignition engine can be discovered by the Pareto-based multi-objective optimization of the polynomial models. Such important optimal principles would not have been obtained without the use of both the GMDH type neural network modelling and the multi-objective Pareto optimization approach.

  18. Parallel Evolutionary Optimization Algorithms for Peptide-Protein Docking

    Science.gov (United States)

    Poluyan, Sergey; Ershov, Nikolay

    2018-02-01

    In this study we examine the possibility of using evolutionary optimization algorithms in protein-peptide docking. We present the main assumptions that reduce the docking problem to a continuous global optimization problem and provide a way of using evolutionary optimization algorithms. The Rosetta all-atom force field was used for structural representation and energy scoring. We describe the parallelization scheme and MPI/OpenMP realization of the considered algorithms. We demonstrate the efficiency and performance of several algorithms applied to a set of benchmark tests.

  19. ADAPTIVE SELECTION OF AUXILIARY OBJECTIVES IN MULTIOBJECTIVE EVOLUTIONARY ALGORITHMS

    Directory of Open Access Journals (Sweden)

    I. A. Petrova

    2016-05-01

    Full Text Available Subject of Research. We propose to modify the EA+RL method, which increases the efficiency of evolutionary algorithms by means of auxiliary objectives. The proposed modification is compared to the existing objective selection methods on the example of the travelling salesman problem. Method. In the EA+RL method a reinforcement learning algorithm is used to select an objective – the target objective or one of the auxiliary objectives – at each iteration of the single-objective evolutionary algorithm. The proposed modification of the EA+RL method adopts this approach for use with a multiobjective evolutionary algorithm. As opposed to the EA+RL method, in this modification one of the auxiliary objectives is selected by reinforcement learning and optimized together with the target objective at each step of the multiobjective evolutionary algorithm. Main Results. The proposed modification of the EA+RL method was compared to the existing objective selection methods on the example of the travelling salesman problem. In the EA+RL method and its proposed modification, reinforcement learning algorithms for stationary and non-stationary environments were used. The proposed modification of the EA+RL method applied with reinforcement learning for a non-stationary environment outperformed the considered objective selection algorithms on most problem instances. Practical Significance. The proposed approach increases the efficiency of evolutionary algorithms, which may be used for solving discrete NP-hard optimization problems. They are, in particular, combinatorial path search problems and scheduling problems.
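
    A minimal sketch of the reinforcement-learning flavour of objective selection described above, assuming an epsilon-greedy bandit over the auxiliary objectives and a simulated reward equal to the observed improvement of the target objective; the actual EA+RL method and its non-stationary learning variant are not reproduced here.

```python
import random

class EpsilonGreedyObjectiveSelector:
    """Minimal sketch of RL-style auxiliary-objective selection (EA+RL flavour).

    Each 'arm' is one auxiliary objective; the reward is the improvement of
    the target objective observed after evolving with that auxiliary objective.
    """
    def __init__(self, n_auxiliary, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_auxiliary
        self.values = [0.0] * n_auxiliary       # running mean rewards

    def select(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.values))                        # explore
        return max(range(len(self.values)), key=lambda i: self.values[i])    # exploit

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Usage inside an (assumed) evolutionary loop; the evolution step is faked here.
selector = EpsilonGreedyObjectiveSelector(n_auxiliary=3)
best_target = 100.0                              # current best target value (assumed)
for generation in range(50):
    aux = selector.select()
    # A real implementation would evolve one generation using the chosen auxiliary
    # objective; we only simulate the resulting target value for illustration.
    new_target = best_target - random.random() * (aux + 1) * 0.1
    selector.update(aux, reward=best_target - new_target)
    best_target = min(best_target, new_target)
print("estimated usefulness of auxiliary objectives:", selector.values)
```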

  20. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed till date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.

  1. Evolutionary Algorithms for Boolean Queries Optimization

    Czech Academy of Sciences Publication Activity Database

    Húsek, Dušan; Snášel, Václav; Neruda, Roman; Owais, S.S.J.; Krömer, P.

    2006-01-01

    Vol. 3, No. 1 (2006), pp. 15-20 ISSN 1790-0832 R&D Projects: GA AV ČR 1ET100300414 Institutional research plan: CEZ:AV0Z10300504 Keywords: evolutionary algorithms * genetic algorithms * information retrieval * Boolean query Subject RIV: BA - General Mathematics

  2. An Evolutionary Efficiency Alternative to the Notion of Pareto Efficiency

    NARCIS (Netherlands)

    I.P. van Staveren (Irene)

    2012-01-01

    textabstractThe paper argues that the notion of Pareto efficiency builds on two normative assumptions: the more general consequentialist norm of any efficiency criterion, and the strong no-harm principle of the prohibition of any redistribution during the economic process that hurts at least one

  3. Pareto-Optimal Estimates of California Precipitation Change

    Science.gov (United States)

    Langenbrunner, Baird; Neelin, J. David

    2017-12-01

    In seeking constraints on global climate model projections under global warming, one commonly finds that different subsets of models perform well under different objective functions, and these trade-offs are difficult to weigh. Here a multiobjective approach is applied to a large set of subensembles generated from the Climate Model Intercomparison Project phase 5 ensemble. We use observations and reanalyses to constrain tropical Pacific sea surface temperatures, upper level zonal winds in the midlatitude Pacific, and California precipitation. An evolutionary algorithm identifies the set of Pareto-optimal subensembles across these three measures, and these subensembles are used to constrain end-of-century California wet season precipitation change. This methodology narrows the range of projections throughout California, increasing confidence in estimates of positive mean precipitation change. Finally, we show how this technique complements and generalizes emergent constraint approaches for restricting uncertainty in end-of-century projections within multimodel ensembles using multiple criteria for observational constraints.

  4. Pareto-optimal multi-objective design of airplane control systems

    Science.gov (United States)

    Schy, A. A.; Johnson, K. G.; Giesy, D. P.

    1980-01-01

    A constrained minimization algorithm for the computer aided design of airplane control systems to meet many requirements over a set of flight conditions is generalized using the concept of Pareto-optimization. The new algorithm yields solutions on the boundary of the achievable domain in objective space in a single run, whereas the older method required a sequence of runs to approximate such a limiting solution. However, Pareto-optimality does not guarantee a satisfactory design, since such solutions may emphasize some objectives at the expense of others. The designer must still interact with the program to obtain a well-balanced set of objectives. Using the example of a fighter lateral stability augmentation system (SAS) design over five flight conditions, several effective techniques are developed for obtaining well-balanced Pareto-optimal solutions. For comparison, one of these techniques is also used in a recently developed algorithm of Kreisselmeier and Steinhauser, which replaces the hard constraints with soft constraints, using a special penalty function. It is shown that comparable results can be obtained.

  5. EvAg: A Scalable Peer-to-Peer Evolutionary Algorithm

    NARCIS (Netherlands)

    Laredo, J.L.J.; Eiben, A.E.; van Steen, M.R.; Merelo, J.J.

    2010-01-01

    This paper studies the scalability of an Evolutionary Algorithm (EA) whose population is structured by means of a gossiping protocol and where the evolutionary operators act exclusively within the local neighborhoods. This makes the algorithm inherently suited for parallel execution in a

  6. A Pareto-based multi-objective optimization algorithm to design energy-efficient shading devices

    International Nuclear Information System (INIS)

    Khoroshiltseva, Marina; Slanzi, Debora; Poli, Irene

    2016-01-01

    Highlights: • We present a multi-objective optimization algorithm for shading design. • We combine Harmony search and Pareto-based procedures. • Thermal and daylighting performances of external shading were considered. • We applied the optimization process to a residential social housing in Madrid. - Abstract: In this paper we address the problem of designing new energy-efficient static daylight devices that will surround the external windows of a residential building in Madrid. Shading devices can in fact largely influence solar gains in a building and improve thermal and lighting comforts by selectively intercepting the solar radiation and by reducing the undesirable glare. A proper shading device can therefore significantly increase the thermal performance of a building by reducing its energy demand in different climate conditions. In order to identify the set of optimal shading devices that allow a low energy consumption of the dwelling while maintaining high levels of thermal and lighting comfort for the inhabitants we derive a multi-objective optimization methodology based on Harmony Search and Pareto front approaches. The results show that the multi-objective approach here proposed is an effective procedure in designing energy efficient shading devices when a large set of conflicting objectives characterizes the performance of the proposed solutions.

  7. Calculating complete and exact Pareto front for multiobjective optimization: a new deterministic approach for discrete problems.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel

    2013-06-01

    Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.

  8. Spectral-Efficiency - Illumination Pareto Front for Energy Harvesting Enabled VLC System

    KAUST Repository

    Abdelhady, Amr Mohamed Abdelaziz

    2017-12-13

    The continuous improvement in optical energy harvesting devices motivates visible light communication (VLC) system developers to utilize such available free energy sources. An outdoor VLC system is considered where an optical base station sends data to multiple users that are capable of harvesting the optical energy. The proposed VLC system serves multiple users using time division multiple access (TDMA) with unequal time and power allocation, which are allocated to improve the system performance. The adopted optical system provides users with illumination and data communication services. The outdoor optical design objective is to maximize the illumination, while the communication design objective is to maximize the spectral efficiency (SE). The design objectives are shown to be conflicting, therefore, a multiobjective optimization problem is formulated to obtain the Pareto front performance curve for the proposed system. To this end, the marginal optimization problems are solved first using low complexity algorithms. Then, based on the proposed algorithms, a low complexity algorithm is developed to obtain an inner bound of the Pareto front for the illumination-SE tradeoff. The inner bound for the Pareto-front is shown to be close to the optimal Pareto-frontier via several simulation scenarios for different system parameters.

  9. Synthesis of logic circuits with evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    JONES,JAKE S.; DAVIDSON,GEORGE S.

    2000-01-26

    In the last decade there has been interest and research in the area of designing circuits with genetic algorithms, evolutionary algorithms, and genetic programming. However, the ability to design circuits of the size and complexity required by modern engineering design problems, simply by specifying required outputs for given inputs has as yet eluded researchers. This paper describes current research in the area of designing logic circuits using an evolutionary algorithm. The goal of the research is to improve the effectiveness of this method and make it a practical aid for design engineers. A novel method of implementing the algorithm is introduced, and results are presented for various multiprocessing systems. In addition to evolving standard arithmetic circuits, work in the area of evolving circuits that perform digital signal processing tasks is described.

  10. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    Science.gov (United States)

    Hoffmann, Aswin L.; Siem, Alex Y. D.; den Hertog, Dick; Kaanders, Johannes H. A. M.; Huizenga, Henk

    2006-12-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.

  11. Derivative-free generation and interpolation of convex Pareto optimal IMRT plans

    International Nuclear Information System (INIS)

    Hoffmann, Aswin L; Siem, Alex Y D; Hertog, Dick den; Kaanders, Johannes H A M; Huizenga, Henk

    2006-01-01

    In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning

  12. Analog Circuit Design Optimization Based on Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Mansour Barari

    2014-01-01

    Full Text Available This paper investigates an evolutionary-based design system for automated sizing of analog integrated circuits (ICs). Two evolutionary algorithms, a genetic algorithm and a particle swarm optimization (PSO) algorithm, are proposed to design analog ICs with practical user-defined specifications. On the basis of the combination of HSPICE and MATLAB, the system links circuit performances, evaluated through specific electrical simulation, to the optimization system in the MATLAB environment, for the selected topology. The system has been tested on typical and hard-to-design cases, such as complex analog blocks with stringent design requirements. The results show that the design specifications are closely met. Comparisons with available methods like genetic algorithms show that the proposed algorithm offers important advantages in terms of optimization quality and robustness. Moreover, the algorithm is shown to be efficient.

  13. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist acts prevention measures and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability and efficiency related issues such as reference database search speed, errors while recognizing biometric information or automating biometric feature extraction. Current scientific investigations show that application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done in the sphere of evolutionary algorithm application for biometric system parameter improvement.

  14. Hybridizing Evolutionary Algorithms with Opportunistic Local Search

    DEFF Research Database (Denmark)

    Gießen, Christian

    2013-01-01

    There is empirical evidence that memetic algorithms (MAs) can outperform plain evolutionary algorithms (EAs). Recently the first runtime analyses have been presented proving the aforementioned conjecture rigorously by investigating Variable-Depth Search, VDS for short (Sudholt, 2008). Sudholt...

  15. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been...

  16. TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification

    International Nuclear Information System (INIS)

    Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D

    2014-01-01

    Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposes an alternative data-driven approach that implicitly incorporates the practical limitations, and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested on 2D and 3D cases where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck amongst 4 competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup with k being the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed superiority of tomotherapy plans for the lower head-and-neck, and revealed a potential nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been

  17. TU-C-17A-01: A Data-Based Development for Practical Pareto Optimality Assessment and Identification

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, D; Qi, S; DeMarco, J; Kupelian, P; Low, D [UCLA Department of Radiation Oncology, Los Angeles, CA (United States)

    2014-06-15

    Purpose: To develop an efficient Pareto optimality assessment scheme to support plan comparison and practical determination of best-achievable treatment plan goals. Methods: Pareto efficiency reflects the tradeoffs among competing target coverage and normal tissue sparing in multi-criterion optimization (MCO) based treatment planning. Assessing and understanding Pareto optimality provides insightful guidance for future planning. However, current MCO-driven Pareto estimation makes relaxed assumptions about the Pareto structure and insufficiently accounts for practical limitations in beam complexity, leading to performance upper bounds that may be unachievable. This work proposes an alternative data-driven approach that implicitly incorporates the practical limitations, and identifies the Pareto frontier subset by eliminating dominated plans incrementally using the Edgeworth Pareto hull (EPH). The exactness of this elimination process also permits the development of a hierarchical procedure for speedup when the plan cohort size is large, by partitioning the cohort and performing elimination in each subset before a final aggregated elimination. The developed algorithm was first tested on 2D and 3D cases where accuracy can be reliably assessed. As a specific application, the algorithm was applied to compare systematic plan quality for the lower head-and-neck amongst 4 competing treatment modalities. Results: The algorithm agrees exactly with brute-force pairwise comparison and visual inspection in low dimensions. The hierarchical algorithm shows a sqrt(k)-fold speedup with k being the number of data points in the plan cohort, demonstrating good efficiency enhancement for heavy testing tasks. Application to plan performance comparison showed superiority of tomotherapy plans for the lower head-and-neck, and revealed a potential nonconvex Pareto frontier structure. Conclusion: An accurate and efficient scheme to identify the Pareto frontier from a plan cohort has been
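
    A minimal sketch of the elimination idea described in the two records above, assuming plain Pareto-dominance tests in place of the Edgeworth Pareto hull machinery: dominated plans are removed within each partition of the cohort before a final elimination on the union, which yields the same non-dominated set as a single global pass.

```python
import numpy as np

def dominates(a, b):
    """True if plan a dominates plan b (all criteria to be minimized)."""
    return np.all(a <= b) and np.any(a < b)

def eliminate_dominated(points):
    """Return the non-dominated subset of a cohort of plans."""
    keep = []
    for i, p in enumerate(points):
        if not any(dominates(q, p) for j, q in enumerate(points) if j != i):
            keep.append(p)
    return keep

def hierarchical_frontier(points, n_parts=4):
    """Partition, eliminate within each part, then eliminate on the union.

    A plan dominated inside its partition is also dominated in the full cohort,
    so the partition-wise pass never discards a frontier plan.
    """
    parts = np.array_split(np.asarray(points, dtype=float), n_parts)
    survivors = [p for part in parts for p in eliminate_dominated(list(part))]
    return eliminate_dominated(survivors)

# Example cohort: hypothetical (normal-tissue dose, 1 - target coverage) per plan.
rng = np.random.default_rng(1)
cohort = rng.random((200, 2))
front = hierarchical_frontier(cohort)
print(len(front), "non-dominated plans identified")
```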

  18. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
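
    A minimal sketch of the Pareto-depth notion used above, assuming each sample is represented by a vector of dissimilarity criteria and that depth is the index of the non-dominated front obtained by successive peeling; the full PDA score built on dyads of training/test dissimilarities is not reproduced here.

```python
import numpy as np

def pareto_depths(points):
    """Assign each point the index (1, 2, ...) of the Pareto front it lies on.

    Points are vectors of dissimilarity criteria (smaller means more similar to
    the training data), so deeper fronts contain the larger, potentially
    anomalous dissimilarities.
    """
    pts = np.asarray(points, dtype=float)
    remaining = list(range(len(pts)))
    depth = np.zeros(len(pts), dtype=int)
    level = 0
    while remaining:
        level += 1
        front = [i for i in remaining
                 if not any(np.all(pts[j] <= pts[i]) and np.any(pts[j] < pts[i])
                            for j in remaining if j != i)]
        for i in front:
            depth[i] = level
        remaining = [i for i in remaining if i not in front]
    return depth

# Two toy dissimilarity criteria for a handful of test samples (illustrative only).
criteria = [[0.1, 0.2], [0.3, 0.1], [0.5, 0.6], [0.9, 0.8], [0.2, 0.9]]
print(pareto_depths(criteria))
```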

  19. Evolutionary Algorithm for Optimal Vaccination Scheme

    International Nuclear Information System (INIS)

    Parousis-Orthodoxou, K J; Vlachos, D S

    2014-01-01

    The following work uses the dynamic capabilities of an evolutionary algorithm in order to obtain an optimal immunization strategy in a user-specified network. The produced algorithm uses a basic genetic algorithm with crossover and mutation techniques in order to locate certain nodes in the input network. These nodes are immunized in an SIR epidemic spreading process, and the performance of each immunization scheme is evaluated by the level of containment that it provides for the spreading of the disease.
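
    A minimal sketch of the scheme described above, assuming a toy contact network, a simple discrete-time SIR simulation and a basic genetic algorithm with crossover and mutation over candidate sets of immunized nodes; the network, epidemic parameters and GA settings are illustrative assumptions, not values from the paper.

```python
import random

def sir_outbreak_size(adj, immune, beta=0.3, seeds=1, trials=20):
    """Average final epidemic size on graph `adj` when `immune` nodes are vaccinated."""
    n = len(adj)
    total = 0
    for _ in range(trials):
        state = ['S'] * n                              # S, I or R per node
        for v in immune:
            state[v] = 'R'                             # vaccinated == removed
        candidates = [v for v in range(n) if state[v] == 'S']
        for v in random.sample(candidates, min(seeds, len(candidates))):
            state[v] = 'I'
        infected = [v for v in range(n) if state[v] == 'I']
        while infected:
            new_infected = []
            for v in infected:
                for u in adj[v]:
                    if state[u] == 'S' and random.random() < beta:
                        state[u] = 'I'
                        new_infected.append(u)
                state[v] = 'R'
            infected = new_infected
        total += sum(1 for s in state if s == 'R') - len(immune)
    return total / trials

def genetic_immunization(adj, budget, pop_size=30, generations=40):
    """Toy GA: individuals are sets of `budget` nodes; smaller outbreaks win."""
    n = len(adj)
    pop = [random.sample(range(n), budget) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: sir_outbreak_size(adj, ind))
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = list(set(a[:budget // 2] + b[budget // 2:]))   # crossover
            while len(child) < budget:                             # repair + mutation
                child.append(random.randrange(n))
                child = list(set(child))
            children.append(child[:budget])
        pop = survivors + children
    return min(pop, key=lambda ind: sir_outbreak_size(adj, ind))

# Tiny ring-with-chords network for illustration.
n = 30
adj = {v: {(v - 1) % n, (v + 1) % n, (v + 7) % n, (v - 7) % n} for v in range(n)}
print("nodes to immunize:", genetic_immunization(adj, budget=4))
```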

  20. Economic modeling using evolutionary algorithms : the effect of binary encoding of strategies

    NARCIS (Netherlands)

    Waltman, L.R.; Eck, van N.J.; Dekker, Rommert; Kaymak, U.

    2011-01-01

    We are concerned with evolutionary algorithms that are employed for economic modeling purposes. We focus in particular on evolutionary algorithms that use a binary encoding of strategies. These algorithms, commonly referred to as genetic algorithms, are popular in agent-based computational economics

  1. A Clustal Alignment Improver Using Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Thomsen, Rene; Fogel, Gary B.; Krink, Thimo

    2002-01-01

    Multiple sequence alignment (MSA) is a crucial task in bioinformatics. In this paper we extended previous work with evolutionary algorithms (EA) by using MSA solutions obtained from the wellknown Clustal V algorithm as a candidate solution seed of the initial EA population. Our results clearly show...

  2. Pareto law and Pareto index in the income distribution of Japanese companies

    OpenAIRE

    Ishikawa, Atushi

    2004-01-01

    In order to study in detail the phenomenon that income distribution follows the Pareto law, we analyze a database of high-income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index. The larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain why the Pareto index of company income distribution hardly changes, while the Pareto index of personal income distribution chang...

  3. Hybrid Projected Gradient-Evolutionary Search Algorithm for Mixed Integer Nonlinear Optimization Problems

    National Research Council Canada - National Science Library

    Homaifar, Abdollah; Esterline, Albert; Kimiaghalam, Bahram

    2005-01-01

    The Hybrid Projected Gradient-Evolutionary Search Algorithm (HPGES) uses a specially designed evolutionary-based global search strategy to efficiently create candidate solutions in the solution space...

  4. Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-09-01

    Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.

  5. The concept of ageing in evolutionary algorithms: Discussion and inspirations for human ageing.

    Science.gov (United States)

    Dimopoulos, Christos; Papageorgis, Panagiotis; Boustras, George; Efstathiades, Christodoulos

    2017-04-01

    This paper discusses the concept of ageing as this applies to the operation of Evolutionary Algorithms, and examines its relationship to the concept of ageing as this is understood for human beings. Evolutionary Algorithms constitute a family of search algorithms which base their operation on an analogy from the evolution of species in nature. The paper initially provides the necessary knowledge on the operation of Evolutionary Algorithms, focusing on the use of ageing strategies during the implementation of the evolutionary process. Background knowledge on the concept of ageing, as this is defined scientifically for biological systems, is subsequently presented. Based on this information, the paper provides a comparison between the two ageing concepts, and discusses the philosophical inspirations which can be drawn for human ageing based on the operation of Evolutionary Algorithms. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Evolutionary optimization and game strategies for advanced multi-disciplinary design applications to aeronautics and UAV design

    CERN Document Server

    Periaux, Jacques; Lee, Dong Seop Chris

    2015-01-01

    Many complex aeronautical design problems can be formulated with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolution tools in finding the solution, or the set of solutions, of single- or multi-disciplinary optimization problems. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models (precise, intermediate and approximate), with each node belonging to a different hierarchical layer handled by a different Evolutionary Algorithm. The efficiency of evolutionary algorithms for both single- and multi-objective optimization problems is significantly improved by the coupling of EAs with games, and in particular by a new dynamic methodology named “Hybridized Nash-Pareto games”. Multi-objective optimization techniques and robust design problems taking into account uncertainties are introduced and explained in detail. Several applications dealing with c...

  7. An Agent-Based Co-Evolutionary Multi-Objective Algorithm for Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    Rafał Dreżewski

    2017-08-01

    Full Text Available Algorithms based on the process of natural evolution are widely used to solve multi-objective optimization problems. In this paper we propose the agent-based co-evolutionary algorithm for multi-objective portfolio optimization. The proposed technique is compared experimentally to the genetic algorithm, co-evolutionary algorithm and a more classical approach—the trend-following algorithm. During the experiments historical data from the Warsaw Stock Exchange is used in order to assess the performance of the compared algorithms. Finally, we draw some conclusions from these experiments, showing the strong and weak points of all the techniques.

  8. Pareto printsiip

    Index Scriptorium Estoniae

    2011-01-01

    On how the Italian economist Vilfredo Pareto arrived at his famous principle, and on the influence of this principle on present-day management. According to the Pareto principle, the greater part of our activity does not help us reach the result but is a waste of time. Diagram

  9. Multiobjective Optimization of Linear Cooperative Spectrum Sensing: Pareto Solutions and Refinement.

    Science.gov (United States)

    Yuan, Wei; You, Xinge; Xu, Jing; Leung, Henry; Zhang, Tianhang; Chen, Chun Lung Philip

    2016-01-01

    In linear cooperative spectrum sensing, the weights of secondary users and detection threshold should be optimally chosen to minimize missed detection probability and to maximize secondary network throughput. Since these two objectives are not completely compatible, we study this problem from the viewpoint of multiple-objective optimization. We aim to obtain a set of evenly distributed Pareto solutions. To this end, here, we introduce the normal constraint (NC) method to transform the problem into a set of single-objective optimization (SOO) problems. Each SOO problem usually results in a Pareto solution. However, NC does not provide any solution method to these SOO problems, nor any indication on the optimal number of Pareto solutions. Furthermore, NC has no preference over all Pareto solutions, while a designer may be only interested in some of them. In this paper, we employ a stochastic global optimization algorithm to solve the SOO problems, and then propose a simple method to determine the optimal number of Pareto solutions under a computational complexity constraint. In addition, we extend NC to refine the Pareto solutions and select the ones of interest. Finally, we verify the effectiveness and efficiency of the proposed methods through computer simulations.

  10. Exploring the Environment/Energy Pareto Optimal Front of an Office Room Using Computational Fluid Dynamics-Based Interactive Optimization Method

    Directory of Open Access Journals (Sweden)

    Kangji Li

    2017-02-01

    Full Text Available This paper is concerned with the development of a high-resolution and control-friendly optimization framework in enclosed environments that helps improve thermal comfort, indoor air quality (IAQ) and energy costs of the heating, ventilation and air conditioning (HVAC) system simultaneously. A computational fluid dynamics (CFD)-based optimization method which couples algorithms implemented in Matlab with CFD simulation is proposed. The key part of this method is a data interactive mechanism which efficiently passes parameters between CFD simulations and optimization functions. A two-person office room is modeled for the numerical optimization. The multi-objective evolutionary algorithm, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), is realized to explore the environment/energy Pareto front of the enclosed space. Performance analysis will demonstrate the effectiveness of the presented optimization method.

  11. Exploitation of linkage learning in evolutionary algorithms

    CERN Document Server

    Chen, Ying-ping

    2010-01-01

    The exploitation of linkage learning is enhancing the performance of evolutionary algorithms. This monograph examines recent progress in linkage learning, with a series of focused technical chapters that cover developments and trends in the field.

  12. A FAST AND ELITIST BI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR SCHEDULING INDEPENDENT TASKS ON HETEROGENEOUS SYSTEMS

    Directory of Open Access Journals (Sweden)

    G.Subashini

    2010-07-01

    Full Text Available To meet increasing computational demands, geographically distributed resources need to be logically coupled to make them work as a unified resource. In analyzing the performance of such distributed heterogeneous computing systems, scheduling a set of tasks to the available set of resources for execution is highly important. Task scheduling being an NP-complete problem, the use of metaheuristics is more appropriate in obtaining optimal solutions. Schedules thus obtained can be evaluated using several criteria that may conflict with one another, which requires a multi-objective problem formulation. This paper investigates the application of an elitist Nondominated Sorting Genetic Algorithm (NSGA-II) to efficiently schedule a set of independent tasks in a heterogeneous distributed computing system. The objectives considered in this paper include minimizing makespan and average flowtime simultaneously. The implementations of the NSGA-II algorithm and the Weighted-Sum Genetic Algorithm (WSGA) have been tested on benchmark instances for distributed heterogeneous systems. As NSGA-II generates a set of Pareto optimal solutions, to verify the effectiveness of NSGA-II over WSGA, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto solution set.
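
    The record above selects a best compromise solution from the Pareto set via fuzzy membership values but does not give the formula; the following minimal sketch assumes the commonly used linear membership function over minimized objectives (here makespan and mean flowtime), which may differ from the authors' exact scheme.

```python
import numpy as np

def best_compromise(pareto_objectives):
    """Pick a compromise solution from a Pareto set via fuzzy membership values.

    For each (minimized) objective i of solution k:
        mu[k, i] = (f_i_max - f[k, i]) / (f_i_max - f_i_min)
    The solution with the largest normalized total membership is returned.
    """
    f = np.asarray(pareto_objectives, dtype=float)
    f_min, f_max = f.min(axis=0), f.max(axis=0)
    spread = np.where(f_max > f_min, f_max - f_min, 1.0)   # guard constant objectives
    mu = (f_max - f) / spread
    score = mu.sum(axis=1) / mu.sum()
    return int(np.argmax(score))

# Example Pareto set: hypothetical (makespan, mean flowtime) of four schedules.
front = [(120.0, 45.0), (130.0, 40.0), (150.0, 33.0), (180.0, 30.0)]
print("best compromise schedule index:", best_compromise(front))
```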

  13. Feasibility of identification of gamma knife planning strategies by identification of pareto optimal gamma knife plans.

    Science.gov (United States)

    Giller, C A

    2011-12-01

    The use of conformity indices to optimize Gamma Knife planning is common, but does not address important tradeoffs between dose to tumor and normal tissue. Pareto analysis has been used for this purpose in other applications, but not for Gamma Knife (GK) planning. The goal of this work is to use computer models to show that Pareto analysis may be feasible for GK planning to identify dosimetric tradeoffs. We define a GK plan A to be Pareto dominant to B if the prescription isodose volume of A covers more tumor but not more normal tissue than B, or if A covers less normal tissue but not less tumor than B. A plan is Pareto optimal if it is not dominated by any other plan. Two different Pareto optimal plans represent different tradeoffs between dose to tumor and normal tissue, because neither plan dominates the other. 'GK simulator' software calculated dose distributions for GK plans, and was called repetitively by a genetic algorithm to calculate Pareto dominant plans. Three irregular tumor shapes were tested in 17 trials using various combinations of shots. The mean number of Pareto dominant plans/trial was 59 ± 17 (sd). Different planning strategies were identified by large differences in shot positions, and 70 of the 153 coordinate plots (46%) showed differences of 5mm or more. The Pareto dominant plans dominated other nearby plans. Pareto dominant plans represent different dosimetric tradeoffs and can be systematically calculated using genetic algorithms. Automatic identification of non-intuitive planning strategies may be feasible with these methods.
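
    The dominance rule is stated explicitly in the record above; below is a minimal sketch of that check and of the resulting Pareto-optimal filter, using hypothetical coverage numbers rather than data from the study.

```python
def gk_dominates(plan_a, plan_b):
    """Dominance as defined in the abstract above.

    Each plan is (tumour_volume_covered, normal_tissue_volume_covered) by the
    prescription isodose.  A dominates B if it covers more tumour without
    covering more normal tissue, or less normal tissue without covering less tumour.
    """
    tumour_a, normal_a = plan_a
    tumour_b, normal_b = plan_b
    return ((tumour_a > tumour_b and normal_a <= normal_b) or
            (normal_a < normal_b and tumour_a >= tumour_b))

def pareto_optimal_plans(plans):
    """Plans not dominated by any other plan in the collection."""
    return [p for p in plans
            if not any(gk_dominates(q, p) for q in plans if q is not p)]

# Hypothetical (tumour cc, normal tissue cc) covered by four candidate GK plans.
plans = [(9.5, 1.2), (9.8, 1.6), (9.2, 0.9), (9.5, 1.5)]
print(pareto_optimal_plans(plans))
```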

  14. Pareto-Optimal Model Selection via SPRINT-Race.

    Science.gov (United States)

    Zhang, Tiantian; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2018-02-01

    In machine learning, the notion of multi-objective model selection (MOMS) refers to the problem of identifying the set of Pareto-optimal models that optimize by compromising more than one predefined objective simultaneously. This paper introduces SPRINT-Race, the first multi-objective racing algorithm in a fixed-confidence setting, which is based on the sequential probability ratio with indifference zone test. SPRINT-Race addresses the problem of MOMS with multiple stochastic optimization objectives in the proper Pareto-optimality sense. In SPRINT-Race, a pairwise dominance or non-dominance relationship is statistically inferred via a non-parametric, ternary-decision, dual-sequential probability ratio test. The overall probability of falsely eliminating any Pareto-optimal models or mistakenly returning any clearly dominated models is strictly controlled by a sequential Holm's step-down family-wise error rate control method. As a fixed-confidence model selection algorithm, the objective of SPRINT-Race is to minimize the computational effort required to achieve a prescribed confidence level about the quality of the returned models. The performance of SPRINT-Race is first examined via an artificially constructed MOMS problem with known ground truth. Subsequently, SPRINT-Race is applied to two real-world applications: 1) hybrid recommender system design and 2) multi-criteria stock selection. The experimental results verify that SPRINT-Race is an effective and efficient tool for such MOMS problems. The code of SPRINT-Race is available at https://github.com/watera427/SPRINT-Race.

  15. Multi-objective optimization of a vertical ground source heat pump using evolutionary algorithm

    International Nuclear Information System (INIS)

    Sayyaadi, Hoseyn; Amlashi, Emad Hadaddi; Amidpour, Majid

    2009-01-01

    Thermodynamic and thermoeconomic optimization of a vertical ground source heat pump system has been studied. A model based on the energy and exergy analysis is presented here. An economic model of the system is developed according to the Total Revenue Requirement (TRR) method. The objective functions based on the thermodynamic and thermoeconomic analysis are developed. The proposed vertical ground source heat pump system including eight decision variables is considered for optimization. An artificial intelligence technique known as the evolutionary algorithm (EA) has been utilized as an optimization method. This approach has been applied to minimize either the total levelized cost of the system product or the exergy destruction of the system. Three levels of optimization including thermodynamic single objective, thermoeconomic single objective and multi-objective optimizations are performed. In the multi-objective optimization, both thermodynamic and thermoeconomic objectives are considered simultaneously. In the case of multi-objective optimization, an example of the decision-making process for selection of the final solution from the available optimal points on the Pareto frontier is presented. The results obtained using the various optimization approaches are compared and discussed. Further, the sensitivity of the optimized systems to the interest rate, to the annual number of operating hours and to the electricity cost is studied in detail.

  16. Evolutionary algorithms applied to Landau-gauge fixing

    International Nuclear Information System (INIS)

    Markham, J.F.

    1998-01-01

    Current algorithms used to put a lattice gauge configuration into Landau gauge either suffer from the problem of critical slowing-down or involve additional computational expense to overcome it. Evolutionary Algorithms (EAs), which have been widely applied to other global optimisation problems, may be of use in gauge fixing. Also, being global, they should not suffer from critical slowing-down as do local gradient-based algorithms. We apply EAs and also a Steepest Descent (SD) based method to the problem of Landau gauge fixing and compare their performance. (authors)

  17. Pareto utility

    NARCIS (Netherlands)

    Ikefuji, M.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.H.M.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  18. Variants of Evolutionary Algorithms for Real-World Applications

    CERN Document Server

    Weise, Thomas; Michalewicz, Zbigniew

    2012-01-01

    Evolutionary Algorithms (EAs) are population-based, stochastic search algorithms that mimic natural evolution. Due to their ability to find excellent solutions for conventionally hard and dynamic problems within acceptable time, EAs have attracted interest from many researchers and practitioners in recent years. This book “Variants of Evolutionary Algorithms for Real-World Applications” aims to promote the practitioner’s view on EAs by providing a comprehensive discussion of how EAs can be adapted to the requirements of various applications in the real-world domains. It comprises 14 chapters, including an introductory chapter re-visiting the fundamental question of what an EA is and other chapters addressing a range of real-world problems such as production process planning, inventory system and supply chain network optimisation, task-based jobs assignment, planning for CNC-based work piece construction, mechanical/ship design tasks that involve runtime-intense simulations, data mining for the predictio...

  19. Multi-Objective Optimization of the Hedging Model for reservoir Operation Using Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    sadegh sadeghitabas

    2015-12-01

    Full Text Available Multi-objective problems rarely ever provide a single optimal solution, rather they yield an optimal set of outputs (Pareto fronts). Solving these problems was previously accomplished by using simplifying methods such as the weighting coefficient method used for converting a multi-objective problem to a single objective function. However, such robust tools as multi-objective meta-heuristic algorithms have been recently developed for solving these problems. The hedging model is one of the classic problems for reservoir operation that is generally employed for mitigating drought impacts in water resources management. According to this method, although it is possible to supply the total planned demands, only portions of the demands are met to save water by allowing small deficits in the current conditions in order to avoid or reduce severe deficits in the future. The approach heavily depends on economic and social considerations. In the present study, the meta-heuristic algorithms of NSGA-II, MOPSO, SPEA-II, and AMALGAM are used toward the multi-objective optimization of the hedging model. For this purpose, the rationing factors involved in Taleghan dam operation are optimized over a 35-year statistical period of inflow. There are two objective functions: (a) minimizing the modified shortage index, and (b) maximizing the reliability index (i.e., two opposing objectives). The results show that the above algorithms are capable of generating a wide range of optimal solutions. Among the algorithms, AMALGAM is found to produce a better Pareto front for the values of the objective function, indicating its more satisfactory performance.
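
    To make the two competing objectives concrete, the following is an illustrative sketch, not the paper's model: a single-reservoir hedging rule controlled by one rationing factor, scored by a simplified shortage index (to be minimized) and a reliability index (to be maximized). The mass balance, both objective formulas and all numbers are simplifying assumptions.

        # Illustrative only: a single-reservoir hedging rule with one rationing
        # factor alpha, scored by two competing objectives.  The mass balance,
        # both objective formulas and all numbers are simplifying assumptions.
        def simulate(inflows, demand, capacity, alpha):
            storage, shortages = 0.5 * capacity, []
            for q in inflows:
                available = min(storage + q, capacity)
                # hedging: when water is scarce, release only alpha * demand
                target = demand if available >= 2 * demand else alpha * demand
                release = min(target, available)
                storage = available - release
                shortages.append(demand - release)
            return shortages

        def objectives(shortages, demand):
            # (a) shortage index: mean squared relative deficit (minimize)
            # (b) reliability: fraction of periods with no deficit (maximize)
            shortage_index = sum((s / demand) ** 2 for s in shortages) / len(shortages)
            reliability = sum(1 for s in shortages if s <= 0) / len(shortages)
            return shortage_index, reliability

        inflows = [6, 1, 0, 0, 0, 8, 2, 1]          # includes a short drought period
        for alpha in (1.0, 0.7, 0.5):
            print(alpha, objectives(simulate(inflows, 5, 20, alpha), 5))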

  20. An Improved SPEA2 Algorithm with Adaptive Selection of Evolutionary Operators Scheme for Multiobjective Optimization Problems

    Directory of Open Access Journals (Sweden)

    Fuqing Zhao

    2016-01-01

    Full Text Available A fixed evolutionary mechanism is usually adopted in multiobjective evolutionary algorithms and their operators are static during the evolutionary process, which prevents the algorithm from fully exploiting the search space and makes it prone to becoming trapped in local optima. In this paper, an SPEA2 algorithm based on adaptive selection of evolutionary operators (AOSPEA) is proposed. The proposed algorithm can adaptively select the simulated binary crossover, polynomial mutation, and differential evolution operators during the evolutionary process according to their contribution to the external archive. Meanwhile, the convergence performance of the proposed algorithm is analyzed with a Markov chain model. Simulation results on the standard benchmark functions reveal that the proposed algorithm outperforms the other classical multiobjective evolutionary algorithms.
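
    A hedged sketch of the adaptive-operator-selection idea follows; the credit-assignment rule (selection probability proportional to recent archive entries, with decay) is an assumption made for illustration, not the exact AOSPEA formula.

        # Sketch of adaptive operator selection (not the exact AOSPEA rule): each
        # variation operator is chosen with probability proportional to how often
        # its offspring recently entered the external archive.
        import random

        operators = ["sbx_crossover", "polynomial_mutation", "differential_evolution"]
        success = {op: 1.0 for op in operators}        # uniform pseudo-counts

        def pick_operator():
            weights = [success[op] for op in operators]
            return random.choices(operators, weights=weights)[0]

        def report(op, entered_archive, decay=0.99):
            """Credit an operator whenever its offspring survives into the archive."""
            for key in success:                        # slowly forget old information
                success[key] *= decay
            if entered_archive:
                success[op] += 1.0

        # usage inside an (imagined) evolutionary loop:
        for _ in range(1000):
            op = pick_operator()
            entered = random.random() < 0.3            # placeholder for a real archive update
            report(op, entered)
        print(success)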

  1. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience, but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is, does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a "virtual" student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in the science classroom developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks after cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.

  2. When do evolutionary algorithms optimize separable functions in parallel?

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Sudholt, Dirk; Witt, Carsten

    2013-01-01

    is that evolutionary algorithms make progress on all subfunctions in parallel, so that optimizing a separable function does not take much longer than optimizing the hardest subfunction; subfunctions are optimized "in parallel." We show that this is only partially true, already for the simple (1+1) evolutionary...... algorithm ((1+1) EA). For separable functions composed of k Boolean functions, indeed the optimization time is the maximum optimization time of these functions times a small O(log k) overhead. More generally, for sums of weighted subfunctions that each attain non-negative integer values less than r = o(log1...
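
    For concreteness, a bare-bones (1+1) EA on a separable pseudo-Boolean function, written as a sum of subfunctions over disjoint blocks of bits, might look as follows (a sketch of the general setting, not code from the paper).

        # A bare-bones (1+1) EA on a separable function: the fitness is a sum of
        # subfunctions, each defined on its own block of bits (here plain OneMax).
        import random

        n, k = 30, 3                                   # 3 blocks of 10 bits each

        def fitness(x):
            size = n // k
            return sum(sum(x[i * size:(i + 1) * size]) for i in range(k))

        x = [random.randint(0, 1) for _ in range(n)]
        evaluations = 0
        while fitness(x) < n:
            # standard bit mutation: flip each bit independently with prob 1/n
            y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
            evaluations += 1
            if fitness(y) >= fitness(x):               # accept if not worse
                x = y
        print("optimum found after", evaluations, "mutations")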

  3. An Evolutionary Approach for Bilevel Multi-objective Problems

    Science.gov (United States)

    Deb, Kalyanmoy; Sinha, Ankur

    Evolutionary multi-objective optimization (EMO) algorithms have been extensively applied to find multiple near Pareto-optimal solutions over the past 15 years or so. However, EMO algorithms for solving bilevel multi-objective optimization problems have not received adequate attention yet. These problems appear in many applications in practice and involve two levels, each comprising multiple conflicting objectives. These problems require every feasible upper-level solution to satisfy optimality of a lower-level optimization problem, thereby making them difficult to solve. In this paper, we discuss a recently proposed bilevel EMO procedure and show its working principle on a couple of test problems and on a business decision-making problem. This paper should motivate other EMO researchers to engage more in this important optimization task of practical importance.

  4. Multi-agent Pareto appointment exchanging in hospital patient scheduling

    NARCIS (Netherlands)

    Vermeulen, I.B.; Bohté, S.M.; Somefun, D.J.A.; Poutré, La J.A.

    2007-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment exchanging algorithm:

  5. Analysis for Performance of Symbiosis Co-evolutionary Algorithm

    OpenAIRE

    根路銘, もえ子; 遠藤, 聡志; 山田, 孝治; 宮城, 隼夫; Nerome, Moeko; Endo, Satoshi; Yamada, Koji; Miyagi, Hayao

    2000-01-01

    In this paper, we analyze the behavior of the symbiotic evolution algorithm for the N-Queens problem as a benchmark problem for search methods in the field of artificial intelligence. It is shown that this algorithm improves the ability of the evolutionary search method. When the problem is solved by Genetic Algorithms (GAs), an ordinal representation is often used as one of the gene conversion methods that convert from phenotype to genotype and back. The representation can hinder the occurrence of leth...

  6. Hybrid Robust Multi-Objective Evolutionary Optimization Algorithm

    Science.gov (United States)

    2009-03-10

    xfar by xint. Else, generate a new individual, using the Sobol pseudo-random sequence generator within the upper and lower bounds of the variables... 12. Deb, K., Multi-Objective Optimization Using Evolutionary Algorithms, John Wiley & Sons, 2002. 13. Sobol, I. M., "Uniformly Distributed Sequences

  7. Sounds unheard of evolutionary algorithms as creative tools for the contemporary composer

    DEFF Research Database (Denmark)

    Dahlstedt, Palle

    2004-01-01

    Evolutionary algorithms are studied as tools for generating novel musical material in the form of musical scores and synthesized sounds. The choice of genetic representation defines a space of potential music. This space is explored using evolutionary algorithms, in search of useful musical mater...... composed with the tools described in the thesis are presented....

  8. Constrained Optimization Based on Hybrid Evolutionary Algorithm and Adaptive Constraint-Handling Technique

    DEFF Research Database (Denmark)

    Wang, Yong; Cai, Zixing; Zhou, Yuren

    2009-01-01

    A novel approach to deal with numerical and engineering constrained optimization problems, which incorporates a hybrid evolutionary algorithm and an adaptive constraint-handling technique, is presented in this paper. The hybrid evolutionary algorithm simultaneously uses simplex crossover and two...... mutation operators to generate the offspring population. Additionally, the adaptive constraint-handling technique consists of three main situations. In detail, at each situation, one constraint-handling mechanism is designed based on current population state. Experiments on 13 benchmark test functions...... and four well-known constrained design problems verify the effectiveness and efficiency of the proposed method. The experimental results show that integrating the hybrid evolutionary algorithm with the adaptive constraint-handling technique is beneficial, and the proposed method achieves competitive...

  9. Food processing optimization using evolutionary algorithms | Enitan ...

    African Journals Online (AJOL)

    Evolutionary algorithms are widely used in single and multi-objective optimization. They are easy to use and provide solution(s) in one simulation run. They are used in food processing industries for decision making. Food processing presents constrained and unconstrained optimization problems. This paper reviews the ...

  10. Pareto Optimal Design for Synthetic Biology.

    Science.gov (United States)

    Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe

    2015-08-01

    Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing a deeper biological insight to the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach comparing it with state-of-the-art heuristics in the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically-relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform results previously obtained, in terms of 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction percentage is +662.7%, from 1.425 mmolh⁻¹gDW⁻¹ (wild type) to 10.869 mmolh⁻¹gDW⁻¹, with a knockout cost of 6. The Pareto-optimal designs we found in the fatty acid optimizations strictly dominate the ones obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvement of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmolh⁻¹gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm capabilities.

  11. Evaluation of Preanalytical Quality Indicators by Six Sigma and Pareto's Principle.

    Science.gov (United States)

    Kulkarni, Sweta; Ramesh, R; Srinivasan, A R; Silvia, C R Wilma Delphine

    2018-01-01

    Preanalytical steps are the major sources of error in the clinical laboratory. Analytical errors can be corrected by quality control procedures, but there is a need for stringent quality checks in the preanalytical area, as these processes are done outside the laboratory. The sigma value depicts the performance of the laboratory and its quality measures. Hence, in the present study, Six Sigma and the Pareto principle were applied to preanalytical quality indicators to evaluate the clinical biochemistry laboratory performance. This observational study was carried out over a period of 1 year, from November 2015 to 2016. A total of 144,208 samples and 54,265 test requisition forms were screened for preanalytical errors such as missing patient information, missing sample collection details in forms, and hemolysed, lipemic, inappropriate or insufficient samples; the total number of errors was calculated and converted into defects per million and onto the sigma scale. A Pareto chart was drawn using the total number of errors and the cumulative percentage. In 75% of test requisition forms the diagnosis was not mentioned, giving a sigma value of 0.9; for other errors such as sample receiving time, stat and type of sample, the sigma values were 2.9, 2.6, and 2.8, respectively. For insufficient sample and improper ratio of blood to anticoagulant the sigma value was 4.3. The Pareto chart depicts that about 80% of the errors in requisition forms are contributed by 20% of the causes, such as missing diagnosis information. The development of quality indicators and the application of Six Sigma and the Pareto principle are quality measures by which not only the preanalytical phase but the total testing process can be improved.
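
    A small sketch of the two tools named above: converting error counts to defects per million and a sigma level (using the common 1.5-sigma-shift convention, which the study may or may not follow), and forming the cumulative percentages of a Pareto chart. The error categories and counts are invented placeholders, not the study's data.

        # Invented counts, not the study's data.  The 1.5-sigma shift below is the
        # common short-term/long-term convention and may differ from the paper's.
        from statistics import NormalDist

        def sigma_level(defects, opportunities):
            dpmo = 1_000_000 * defects / opportunities
            return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

        errors = {"missing diagnosis": 40700, "sample receiving time": 3100,
                  "hemolysed sample": 900, "insufficient sample": 300}

        # Pareto chart backbone: sort causes by count and accumulate percentages
        total, cumulative = sum(errors.values()), 0.0
        for cause, count in sorted(errors.items(), key=lambda kv: -kv[1]):
            cumulative += 100 * count / total
            print(f"{cause:22s} {count:6d}   cumulative {cumulative:5.1f}%")

        print("sigma for missing diagnosis:", round(sigma_level(40700, 54265), 2))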

  12. The (1+λ) evolutionary algorithm with self-adjusting mutation rate

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Witt, Carsten; Gießen, Christian

    2017-01-01

    We propose a new way to self-adjust the mutation rate in population-based evolutionary algorithms. Roughly speaking, it consists of creating half the offspring with a mutation rate that is twice the current mutation rate and the other half with half the current rate. The mutation rate is then updated to the rate used in that subpopulation which contains the best offspring. We analyze how the (1 + λ) evolutionary algorithm with this self-adjusting mutation rate optimizes the OneMax test function. We prove that this dynamic version of the (1 + λ) EA finds the optimum in an expected optimization time (number of fitness evaluations) of O(nλ/log λ + n log n). This time is asymptotically smaller than the optimization time of the classic (1 + λ) EA. Previous work shows that this performance is best-possible among all λ-parallel mutation-based unbiased black-box algorithms. This result shows...
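
    A minimal sketch of the two-rate scheme described above, applied to OneMax; the clamping of the rate and other implementation details are assumptions, not taken from the paper.

        # Two-rate self-adjusting (1+lambda) EA on OneMax: half the offspring use
        # rate 2r/n, half use r/(2n); r then becomes the rate of the best offspring.
        # The clamping of r is an implementation assumption.
        import random

        def onemax(x):
            return sum(x)

        def mutate(x, p):
            return [b ^ 1 if random.random() < p else b for b in x]

        n, lam, r = 100, 10, 2.0
        x = [random.randint(0, 1) for _ in range(n)]
        while onemax(x) < n:
            offspring = []
            for i in range(lam):
                rate = 2 * r if i < lam // 2 else r / 2
                offspring.append((mutate(x, rate / n), rate))
            best, best_rate = max(offspring, key=lambda pair: onemax(pair[0]))
            if onemax(best) >= onemax(x):
                x = best
            r = min(max(best_rate, 1.0), n / 4)        # keep r in a sensible range
        print("optimum reached, final rate", r)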

  13. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus

    2013-01-01

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid parts of the Pareto surface being incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained. (paper)

  14. Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning.

    Science.gov (United States)

    Bokrantz, Rasmus

    2013-06-07

    We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. Current state of the art for calculation of a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid parts of the Pareto surface being incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained.

  15. TÉCNICAS EVOLUTIVAS EN PROBLEMAS MULTI-OBJETIVOS EN EL PROCESO DE PLANIFICACIÓN DE LA PRODUCCIÓN / EVOLUTIONARY TECHNIQUES FOR MULTI-OBJECTIVE PROBLEMS IN PRODUCTION PLANNING

    Directory of Open Access Journals (Sweden)

    Mariano Frutos-Alazard

    2012-01-01

    Full Text Available Planning, in the production domain, is in charge of designing, coordinating, managing and controlling all the operations present in the exploitation of production systems. Within this framework, numerous Multi-Objective Optimization Problems (MOPs) appear. These consist of several functions that tend to be complex and whose evaluation can be very costly. Multi-objective optimization is the discipline that tries to find the solutions, called Pareto optimal, to this type of problem. The difficulty of solving MOPs is due to the dimensions of the problem itself, the combinatorial character of the algorithms and the nature of the objectives, which are linked to the efficiency of the system. In recent decades many production-related MOPs have been successfully treated with solution techniques based on Genetic Algorithms. In this work, NSGA-II (Non-dominated Sorting Genetic Algorithm II), SPEA-II (Strength Pareto Evolutionary Algorithm II) and their predecessors, NSGA and SPEA, are evaluated in the planning process of non-standardized production. After the experiments carried out, the NSGA-II algorithm showed the greatest efficiency.

  16. Performance comparison of some evolutionary algorithms on job shop scheduling problems

    Science.gov (United States)

    Mishra, S. K.; Rao, C. S. P.

    2016-09-01

    Job shop scheduling is a state-space search problem belonging to the NP-hard category due to its complexity and the combinatorial explosion of states. Several naturally inspired evolutionary methods have been developed to solve job shop scheduling problems. In this paper the evolutionary methods, namely Particle Swarm Optimization, Artificial Intelligence, Invasive Weed Optimization, Bacterial Foraging Optimization, and Music Based Harmony Search algorithms, are applied and fine-tuned to model and solve job shop scheduling problems. About 250 benchmark instances have been used to compare and evaluate the performance of these algorithms. The capabilities of each of these algorithms in solving job shop scheduling problems are outlined.

  17. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  18. Academic Training: Evolutionary Heuristic Optimization: Genetic Algorithms and Estimation of Distribution Algorithms - Lecture series

    CERN Multimedia

    Françoise Benz

    2004-01-01

    ACADEMIC TRAINING LECTURE REGULAR PROGRAMME 1, 2, 3 and 4 June From 11:00 hrs to 12:00 hrs - Main Auditorium bldg. 500 Evolutionary Heuristic Optimization: Genetic Algorithms and Estimation of Distribution Algorithms V. Robles Forcada and M. Perez Hernandez / Univ. de Madrid, Spain In the real world, there exist a huge number of problems that require getting an optimum or near-to-optimum solution. Optimization can be used to solve a lot of different problems such as network design, sets and partitions, storage and retrieval or scheduling. On the other hand, in nature, there exist many processes that seek a stable state. These processes can be seen as natural optimization processes. Over the last 30 years several attempts have been made to develop optimization algorithms, which simulate these natural optimization processes. These attempts have resulted in methods such as Simulated Annealing, based on natural annealing processes or Evolutionary Computation, based on biological evolution processes. Geneti...

  19. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms for solving real engineering problems has increased in recent years, where intense computational calculations are required, especially when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts into evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  20. Sci-Thur AM: Planning - 04: Evaluation of the fluence complexity, solution quality, and run efficiency produced by five fluence parameterizations implemented in PARETO multiobjective radiotherapy treatment planning software.

    Science.gov (United States)

    Champion, H; Fiege, J; McCurdy, B; Potrebko, P; Cull, A

    2012-07-01

    PARETO (Pareto-Aware Radiotherapy Evolutionary Treatment Optimization) is a novel multiobjective treatment planning system that performs beam orientation and fluence optimization simultaneously using an advanced evolutionary algorithm. In order to reduce the number of parameters involved in this enormous search space, we present several methods for modeling the beam fluence. The parameterizations are compared using innovative tools that evaluate fluence complexity, solution quality, and run efficiency. A PARETO run is performed using the basic weight (BW), linear gradient (LG), cosine transform (CT), beam group (BG), and isodose-projection (IP) methods for applying fluence modulation over the projection of the Planning Target Volume in the beam's-eye-view plane. The solutions of each run are non-dominated with respect to other trial solutions encountered during the run. However, to compare the solution quality of independent runs, each run competes against every other run in a round robin fashion. Score is assigned based on the fraction of solutions that survive when a tournament selection operator is applied to the solutions of the two competitors. To compare fluence complexity, a modulation index, fractal dimension, and image gradient entropy are calculated for the fluence maps of each optimal plan. We have found that the LG method results in superior solution quality for a spine phantom, lung patient, and cauda equina patient. The BG method produces solutions with the highest degree of fluence complexity. Most methods result in comparable run times. The LG method produces superior solution quality using a moderate degree of fluence modulation. © 2012 American Association of Physicists in Medicine.
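
    One plausible reading of the round-robin scoring described above is sketched below: pool the solutions of two runs and score each run by the fraction of its solutions that remain non-dominated in the pooled set. This is an illustrative stand-in, not PARETO's actual tournament operator, and the objective vectors are invented.

        # One possible run-versus-run score: pool both solution sets and report the
        # fraction of a run's solutions that stay non-dominated in the pooled set.
        # Objective vectors are invented; minimization is assumed on both axes.
        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def score(run_a, run_b):
            pooled = run_a + run_b
            survivors = [s for s in run_a if not any(dominates(t, s) for t in pooled)]
            return len(survivors) / len(run_a)

        run_lg = [(1.0, 4.0), (2.0, 2.5), (3.5, 1.0)]
        run_bw = [(1.5, 4.5), (2.5, 3.0), (4.0, 1.2)]
        print("LG vs BW:", score(run_lg, run_bw), "  BW vs LG:", score(run_bw, run_lg))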

  1. Packets Distributing Evolutionary Algorithm Based on PSO for Ad Hoc Network

    Science.gov (United States)

    Xu, Xiao-Feng

    2018-03-01

    Wireless communication networks have features such as limited bandwidth, time-varying channels and dynamic topology. Ad hoc networks face many difficulties in access control, bandwidth distribution, resource assignment and congestion control. Therefore, a wireless packet-distributing evolutionary algorithm based on PSO (DPSO) for ad hoc networks is proposed. First, the parameters that impact network performance are analyzed in order to obtain an effective network performance function. Second, the improved PSO evolutionary algorithm is used to solve the packet-distribution optimization problem from local to global. The simulation results show that the algorithm can ensure fairness and timeliness of network transmission, as well as improve the integrated utilization efficiency of ad hoc network resources.

  2. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives

    International Nuclear Information System (INIS)

    Warmflash, Aryeh; Siggia, Eric D; Francois, Paul

    2012-01-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input–output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria. (paper)

  3. Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives.

    Science.gov (United States)

    Warmflash, Aryeh; Francois, Paul; Siggia, Eric D

    2012-10-01

    The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input-output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria.

  4. International Conference on Artificial Intelligence and Evolutionary Algorithms in Engineering Systems

    CERN Document Server

    Dash, Subhransu; Panigrahi, Bijaya

    2015-01-01

      The book is a collection of high-quality peer-reviewed research papers presented in Proceedings of International Conference on Artificial Intelligence and Evolutionary Algorithms in Engineering Systems (ICAEES 2014) held at Noorul Islam Centre for Higher Education, Kumaracoil, India. These research papers provide the latest developments in the broad area of use of artificial intelligence and evolutionary algorithms in engineering systems. The book discusses wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  5. Optimal Design of a Centrifugal Compressor Impeller Using Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Soo-Yong Cho

    2012-01-01

    Full Text Available An optimization study was conducted on a centrifugal compressor. Eight design variables were chosen from the control points for the Bezier curves which widely influenced the geometric variation; four design variables were selected to optimize the flow passage between the hub and the shroud, and the other four design variables were used to improve the performance of the impeller blade. As an optimization algorithm, an artificial neural network (ANN) was adopted. Initially, the design of experiments was applied to set up the initial data space of the ANN, which was improved during the optimization process using a genetic algorithm. If a result of the ANN reached a higher level, that result was re-calculated by computational fluid dynamics (CFD) and was applied to develop a new ANN. The prediction difference between the ANN and CFD was consequently less than 1% after the 6th generation. Using this optimization technique, the computational time for the optimization was greatly reduced and the accuracy of the optimization algorithm was increased. The efficiency was improved by 1.4% without losing the pressure ratio, and Pareto-optimal solutions of the efficiency versus the pressure ratio were obtained through the 21st generation.
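
    The surrogate-assisted loop described above can be sketched roughly as follows (not the authors' code): evaluate an initial design of experiments with the expensive solver, fit an ANN surrogate, let a simple variation step search the surrogate, re-check the most promising candidate with the solver and retrain. Here expensive_cfd is a toy stand-in for the CFD solver, and the eight variables, their bounds and the variation step are placeholders.

        # Rough sketch of a surrogate-assisted loop (not the authors' code).
        # expensive_cfd is a toy stand-in for the CFD solver; the eight design
        # variables, their bounds and the variation step are placeholders.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        dim, lower, upper = 8, 0.0, 1.0

        def expensive_cfd(x):                           # placeholder objective to minimize
            return float(np.sum((x - 0.3) ** 2))

        X = rng.uniform(lower, upper, size=(40, dim))   # initial design of experiments
        y = np.array([expensive_cfd(x) for x in X])

        for generation in range(6):
            surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                                     random_state=0).fit(X, y)
            parents = X[np.argsort(y)[:10]]             # current best designs
            children = np.clip(parents + rng.normal(0, 0.1, parents.shape), lower, upper)
            promising = children[np.argmin(surrogate.predict(children))]
            X = np.vstack([X, promising])               # re-check with the expensive solver
            y = np.append(y, expensive_cfd(promising))
            print(f"generation {generation}: best value so far {y.min():.4f}")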

  6. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    International Nuclear Information System (INIS)

    Ottosson, Rickard O.; Sjoestroem, David; Behrens, Claus F.; Karlsson, Anna; Engstroem, Per E.; Knoeoes, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head and neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.

  7. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques.

    Science.gov (United States)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David; Behrens, Claus F; Karlsson, Anna; Knöös, Tommy; Ceberg, Crister

    2009-01-01

    Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.

  8. A hybrid pareto mixture for conditional asymmetric fat-tailed distributions.

    Science.gov (United States)

    Carreau, Julie; Bengio, Yoshua

    2009-07-01

    In many cases, we observe some variables X that contain predictive information over a scalar variable of interest Y, with (X,Y) pairs observed in a training set. We can take advantage of this information to estimate the conditional density p(Y|X = x). In this paper, we propose a conditional mixture model with hybrid Pareto components to estimate p(Y|X = x). The hybrid Pareto is a Gaussian whose upper tail has been replaced by a generalized Pareto tail. A third parameter, in addition to the location and spread parameters of the Gaussian, controls the heaviness of the upper tail. Using the hybrid Pareto in a mixture model results in a nonparametric estimator that can adapt to multimodality, asymmetry, and heavy tails. A conditional density estimator is built by modeling the parameters of the mixture estimator as functions of X. We use a neural network to implement these functions. Such conditional density estimators have important applications in many domains such as finance and insurance. We show experimentally that this novel approach better models the conditional density in terms of likelihood, compared to competing algorithms: conditional mixture models with other types of components and a classical kernel-based nonparametric model.

  9. An Evolutionary Algorithm to Mine High-Utility Itemsets

    Directory of Open Access Journals (Sweden)

    Jerry Chun-Wei Lin

    2015-01-01

    Full Text Available High-utility itemset mining (HUIM) has been a critical issue in recent years since it can be used to reveal the profitable products by considering both the quantity and profit factors, instead of frequent itemset mining (FIM) of association rules (ARs). In this paper, an evolutionary algorithm is presented to efficiently mine high-utility itemsets (HUIs) based on binary particle swarm optimization. A maximal pattern tree (MP-tree) structure is further designed to solve the combinatorial problem in the evolution process. Substantial experiments on real-life datasets show that the proposed binary PSO-based algorithm has better results compared to the state-of-the-art GA-based algorithm.
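
    The utility measure that HUIM is built on is easy to state in code: the utility of an itemset in a transaction is quantity times unit profit summed over its items, and its total utility is summed over the transactions that contain it. The data and threshold below are invented for illustration.

        # Utility of an itemset: quantity times unit profit, summed over the items
        # and over the transactions containing the itemset (data invented).
        transactions = [
            {"a": 2, "b": 1, "d": 3},
            {"a": 1, "c": 4},
            {"b": 2, "c": 1, "d": 1},
        ]
        profit = {"a": 5, "b": 2, "c": 1, "d": 4}

        def utility(itemset, db):
            total = 0
            for t in db:
                if all(item in t for item in itemset):
                    total += sum(t[item] * profit[item] for item in itemset)
            return total

        min_util = 20
        candidates = [("a",), ("a", "d"), ("b", "d"), ("c",)]
        print([(c, utility(c, transactions)) for c in candidates
               if utility(c, transactions) >= min_util])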

  10. Reinforcement Learning for Online Control of Evolutionary Algorithms

    NARCIS (Netherlands)

    Eiben, A.; Horvath, Mark; Kowalczyk, Wojtek; Schut, Martijn

    2007-01-01

    The research reported in this paper is concerned with assessing the usefulness of reinforcement learning (RL) for on-line calibration of parameters in evolutionary algorithms (EA). We are running an RL procedure and the EA simultaneously and the RL is changing the EA parameters on-the-fly. We

  11. GENERALIZED DOUBLE PARETO SHRINKAGE.

    Science.gov (United States)

    Armagan, Artin; Dunson, David B; Lee, Jaeyong

    2013-01-01

    We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inferences in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and Normal-Jeffreys' priors. While it has a spike at zero like the Laplace density, it also has a Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. We investigate the properties of the maximum a posteriori estimator, as sparse estimation plays an important role in many problems, reveal connections with some well-established regularization procedures, and show some asymptotic results. The performance of the prior is tested through simulations and an application.

  12. Designing synthetic networks in silico: a generalised evolutionary algorithm approach.

    Science.gov (United States)

    Smith, Robert W; van Sluijs, Bob; Fleck, Christian

    2017-12-02

    Evolution has led to the development of biological networks that are shaped by environmental signals. Elucidating, understanding and then reconstructing important network motifs is one of the principal aims of Systems & Synthetic Biology. Consequently, previous research has focused on finding optimal network structures and reaction rates that respond to pulses or produce stable oscillations. In this work we present a generalised in silico evolutionary algorithm that simultaneously finds network structures and reaction rates (genotypes) that can satisfy multiple defined objectives (phenotypes). The key step to our approach is to translate a schema/binary-based description of biological networks into systems of ordinary differential equations (ODEs). The ODEs can then be solved numerically to provide dynamic information about an evolved network's functionality. Initially, we benchmark algorithm performance by finding optimal networks that can recapitulate concentration time-series data and perform parameter optimisation on oscillatory dynamics of the Repressilator. We go on to show the utility of our algorithm by finding new designs for robust synthetic oscillators, and by performing multi-objective optimisation to find a set of oscillators and feed-forward loops that are optimal at balancing different system properties. In sum, our results not only confirm and build on previous observations but also provide new designs of synthetic oscillators for experimental construction. In this work we have presented and tested an evolutionary algorithm that can design a biological network to produce a desired output. Given that previous designs of synthetic networks have been limited to subregions of network- and parameter-space, the use of our evolutionary optimisation algorithm will enable Synthetic Biologists to construct new systems with the potential to display a wider range of complex responses.

  13. Physical Mapping Using Simulated Annealing and Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Vesterstrøm, Jacob Svaneborg

    2003-01-01

    optimization method when searching for an ordering of the fragments in PM. In this paper, we applied an evolutionary algorithm to the problem, and compared its performance to that of SA and local search on simulated PM data, in order to determine the important factors in finding a good ordering of the segments....... The analysis highlights the importance of a good PM model, a well-correlated fitness function, and high quality hybridization data. We suggest that future work in PM should focus on design of more reliable fitness functions and on developing error-screening algorithms....

  14. An Extensible Component-Based Multi-Objective Evolutionary Algorithm Framework

    DEFF Research Database (Denmark)

    Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard

    2017-01-01

    The ability to easily modify the problem definition is currently missing in Multi-Objective Evolutionary Algorithms (MOEA). Existing MOEA frameworks do not support dynamic addition and extension of the problem formulation. The existing frameworks require a re-specification of the problem definition...

  15. A Pareto archive floating search procedure for solving multi-objective flexible job shop scheduling problem

    Directory of Open Access Journals (Sweden)

    J. S. Sadaghiani

    2014-04-01

    Full Text Available The flexible job shop scheduling problem is a key factor in using production systems efficiently. This paper attempts to simultaneously optimize three objectives: minimization of the makespan, the total workload and the maximum workload of jobs. Since the multi-objective flexible job shop scheduling problem is strongly NP-hard, an integrated heuristic approach has been used to solve it. The proposed approach is based on a floating search procedure that uses local heuristic algorithms; it decomposes the considered problem into two sub-problems, assignment and sequencing. First, the search is performed over the assignment space until an acceptable solution is reached, and then the search continues over the sequencing space based on a heuristic algorithm. This paper uses a multi-objective approach for producing Pareto solutions; thus the proposed approach was adapted to the NSGA-II algorithm and evaluated using Pareto archives. The elements and parameters of the proposed algorithms were adjusted based on preliminary experiments. Finally, computational results were used to analyze the efficiency of the proposed algorithm, and these results showed that the proposed algorithm is capable of producing efficient solutions.

  16. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.

  17. Classification as clustering: a Pareto cooperative-competitive GP approach.

    Science.gov (United States)

    McIntyre, Andrew R; Heywood, Malcolm I

    2011-01-01

    Intuitively, population-based algorithms such as genetic programming provide a natural environment for supporting solutions that learn to decompose the overall task between multiple individuals, or a team. This work presents a framework for evolving teams without recourse to prespecifying the number of cooperating individuals. To do so, each individual evolves a mapping to a distribution of outcomes that, following clustering, establishes the parameterization of a (Gaussian) local membership function. This gives individuals the opportunity to represent subsets of tasks, where the overall task is that of classification under the supervised learning domain. Thus, rather than each team member representing an entire class, individuals are free to identify unique subsets of the overall classification task. The framework is supported by techniques from evolutionary multiobjective optimization (EMO) and Pareto competitive coevolution. EMO establishes the basis for encouraging individuals to provide accurate yet nonoverlapping behaviors; whereas competitive coevolution provides the mechanism for scaling to potentially large unbalanced datasets. Benchmarking is performed against recent examples of nonlinear SVM classifiers over 12 UCI datasets with between 150 and 200,000 training instances. Solutions from the proposed coevolutionary multiobjective GP framework appear to provide a good balance between classification performance and model complexity, especially as the dataset instance count increases.

  18. Estimation of the shape parameter of a generalized Pareto distribution based on a transformation to Pareto distributed variables

    OpenAIRE

    van Zyl, J. Martin

    2012-01-01

    Random variables of the generalized Pareto distribution can be transformed to Pareto distributed random variables. Explicit expressions exist for the maximum likelihood estimators of the parameters of the Pareto distribution. The performance of the estimation of the shape parameter of the generalized Pareto distribution using transformed observations, based on the probability weighted method, is tested. It was found to improve the performance of the probability weighted estimator and performs well wit...
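
    A hedged sketch of the transformation idea: for shape xi > 0 and scale sigma, W = 1 + xi*X/sigma turns a generalized Pareto variable X into a standard Pareto variable with index 1/xi, whose maximum likelihood shape estimate has a closed form. The preliminary (xi, sigma) values plugged into the transformation are assumed to come from another estimator; here they are simply posited.

        # If X ~ GPD(shape xi > 0, scale sigma), then W = 1 + xi * X / sigma is
        # standard Pareto with index 1/xi, so the ML estimate of the GPD shape is
        # simply the mean of log W.  Preliminary (xi0, sigma0) are assumed given.
        import numpy as np

        rng = np.random.default_rng(1)
        xi_true, sigma_true, n = 0.4, 2.0, 5000

        u = rng.uniform(size=n)
        x = sigma_true / xi_true * (u ** (-xi_true) - 1)    # GPD sample via inverse CDF

        xi0, sigma0 = 0.38, 2.1            # stand-ins for preliminary estimates
        w = 1 + xi0 * x / sigma0           # transformed, approximately Pareto
        xi_hat = np.mean(np.log(w))        # closed-form ML shape estimate
        print(round(float(xi_hat), 3))     # close to 0.4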

  19. Fuzzy ranking based non-dominated sorting genetic algorithm-II for network overload alleviation

    Directory of Open Access Journals (Sweden)

    Pandiarajan K.

    2014-09-01

    Full Text Available This paper presents an effective method of network overload management in power systems. The three competing objectives, (1) generation cost, (2) transmission line overload and (3) real power loss, are optimized to provide Pareto-optimal solutions. A fuzzy ranking based non-dominated sorting genetic algorithm-II (NSGA-II) is used to solve this complex nonlinear optimization problem. The minimization of the competing objectives is done by generation rescheduling. A fuzzy ranking method is employed to extract the best compromise solution out of the available non-dominated solutions depending upon its highest rank. N-1 contingency analysis is carried out to identify the most severe lines, and those lines are selected for outage. The effectiveness of the proposed approach is demonstrated for different contingency cases in the IEEE 30 and IEEE 118 bus systems with smooth cost functions, and the results are compared with other single-objective evolutionary algorithms such as Particle Swarm Optimization (PSO) and Differential Evolution (DE). Simulation results show the effectiveness of the proposed approach in generating well-distributed Pareto-optimal non-dominated solutions of the multi-objective problem.
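
    A common form of the fuzzy ranking step in such studies assigns each objective of each non-dominated solution a linear membership between the best and worst values on the front and picks the solution with the largest normalized membership sum; whether this is exactly the paper's rule is not confirmed, and the objective values below are invented.

        # A common fuzzy-ranking step for picking a best-compromise solution from a
        # Pareto front (all objectives minimized; values invented).
        front = [
            [805.0, 0.12, 9.8],     # cost, line overload, real power loss
            [820.0, 0.05, 10.4],
            [840.0, 0.02, 11.9],
        ]

        k = len(front[0])
        lo = [min(row[j] for row in front) for j in range(k)]
        hi = [max(row[j] for row in front) for j in range(k)]
        # linear membership: 1 at the best value of an objective, 0 at the worst
        mu = [[(hi[j] - row[j]) / (hi[j] - lo[j]) if hi[j] > lo[j] else 1.0
               for j in range(k)] for row in front]

        total = sum(sum(row) for row in mu)
        rank = [sum(row) / total for row in mu]
        best = max(range(len(front)), key=lambda i: rank[i])
        print("best compromise:", front[best], "rank", round(rank[best], 3))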

  20. A Hybrid Multiobjective Evolutionary Approach for Flexible Job-Shop Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Jian Xiong

    2012-01-01

    Full Text Available This paper addresses the multiobjective flexible job-shop scheduling problem (FJSP) with three simultaneously considered objectives: minimizing makespan, minimizing total workload, and minimizing maximal workload. A hybrid multiobjective evolutionary approach (H-MOEA) is developed to solve the problem. According to the characteristics of FJSP, a modified crowding distance measure is introduced to maintain the diversity of individuals. In the proposed H-MOEA, well-designed chromosome representation and genetic operators are developed for FJSP. Moreover, a local search procedure based on critical path theory is incorporated in H-MOEA to improve the convergence ability of the algorithm. Experimental results on several well-known benchmark instances demonstrate the efficiency and stability of the proposed algorithm. The comparison with other recently published approaches validates that H-MOEA can obtain Pareto-optimal solutions with better quality and/or diversity.

  1. Fast stochastic algorithm for simulating evolutionary population dynamics

    Science.gov (United States)

    Tsimring, Lev; Hasty, Jeff; Mather, William

    2012-02-01

    Evolution and co-evolution of ecological communities are stochastic processes often characterized by vastly different rates of reproduction and mutation and a coexistence of very large and very small sub-populations of co-evolving species. This creates serious difficulties for accurate statistical modeling of evolutionary dynamics. In this talk, we introduce a new exact algorithm for fast fully stochastic simulations of birth/death/mutation processes. It produces a significant speedup compared to the direct stochastic simulation algorithm in a typical case when the total population size is large and the mutation rates are much smaller than birth/death rates. We illustrate the performance of the algorithm on several representative examples: evolution on a smooth fitness landscape, NK model, and stochastic predator-prey system.
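
    For contrast with the accelerated method, the baseline direct (Gillespie) stochastic simulation algorithm for a two-type birth/death/mutation process can be written compactly; the rates and initial counts below are toy values, not from the talk.

        # Baseline direct (Gillespie) SSA for a two-type birth/death/mutation
        # process; rates and initial counts are toy values.
        import random

        birth, death, mu = 1.0, 0.9, 1e-4          # per-individual rates
        n = {"wild": 1000, "mutant": 0}
        t, t_end = 0.0, 5.0

        while t < t_end and n["wild"] + n["mutant"] > 0:
            rates = {
                ("birth", "wild"): birth * n["wild"] * (1 - mu),
                ("birth", "mutant"): birth * n["mutant"] + birth * n["wild"] * mu,
                ("death", "wild"): death * n["wild"],
                ("death", "mutant"): death * n["mutant"],
            }
            total = sum(rates.values())
            t += random.expovariate(total)          # time to the next event
            pick, acc = random.uniform(0, total), 0.0
            for (event, species), rate in rates.items():
                acc += rate
                if pick <= acc:
                    n[species] += 1 if event == "birth" else -1
                    break
        print(t, n)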

  2. Estimating the ratios of the stationary distribution values for Markov chains modeling evolutionary algorithms.

    Science.gov (United States)

    Mitavskiy, Boris; Cannings, Chris

    2009-01-01

    The evolutionary algorithm stochastic process is well-known to be Markovian. These have been under investigation in much of the theoretical evolutionary computing research. When the mutation rate is positive, the Markov chain modeling of an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution. Rather little is known about the stationary distribution. In fact, the only quantitative facts established so far tell us that the stationary distributions of Markov chains modeling evolutionary algorithms concentrate on uniform populations (i.e., those populations consisting of a repeated copy of the same individual). At the same time, knowing the stationary distribution may provide some information about the expected time it takes for the algorithm to reach a certain solution, assessment of the biases due to recombination and selection, and is of importance in population genetics to assess what is called a "genetic load" (see the introduction for more details). In the recent joint works of the first author, some bounds have been established on the rates at which the stationary distribution concentrates on the uniform populations. The primary tool used in these papers is the "quotient construction" method. It turns out that the quotient construction method can be exploited to derive much more informative bounds on ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all the three main stages of an evolutionary algorithm: namely, selection, recombination, and mutation.

  3. Record Values of a Pareto Distribution.

    Science.gov (United States)

    Ahsanullah, M.

    The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (s>m) record values is obtained. A classical Pareto distribution provides reasonably…

  4. An External Archive-Guided Multiobjective Particle Swarm Optimization Algorithm.

    Science.gov (United States)

    Zhu, Qingling; Lin, Qiuzhen; Chen, Weineng; Wong, Ka-Chun; Coello Coello, Carlos A; Li, Jianqiang; Chen, Jianyong; Zhang, Jun

    2017-09-01

    The selection of swarm leaders (i.e., the personal best and global best), is important in the design of a multiobjective particle swarm optimization (MOPSO) algorithm. Such leaders are expected to effectively guide the swarm to approach the true Pareto optimal front. In this paper, we present a novel external archive-guided MOPSO algorithm (AgMOPSO), where the leaders for velocity update are all selected from the external archive. In our algorithm, multiobjective optimization problems (MOPs) are transformed into a set of subproblems using a decomposition approach, and then each particle is assigned accordingly to optimize each subproblem. A novel archive-guided velocity update method is designed to guide the swarm for exploration, and the external archive is also evolved using an immune-based evolutionary strategy. These proposed approaches speed up the convergence of AgMOPSO. The experimental results fully demonstrate the superiority of our proposed AgMOPSO in solving most of the test problems adopted, in terms of two commonly used performance measures. Moreover, the effectiveness of our proposed archive-guided velocity update method and immune-based evolutionary strategy is also experimentally validated on more than 30 test MOPs.

  5. Fixed Parameter Evolutionary Algorithms and Maximum Leaf Spanning Trees: A Matter of Mutations

    DEFF Research Database (Denmark)

    Kratsch, Stefan; Lehre, Per Kristian; Neumann, Frank

    2011-01-01

    Evolutionary algorithms have been shown to be very successful for a wide range of NP-hard combinatorial optimization problems. We investigate the NP-hard problem of computing a spanning tree that has a maximal number of leaves by evolutionary algorithms in the context of fixed parameter tractabil...... two common mutation operators, we show that an operator related to spanning tree problems leads to an FPT running time in contrast to a general mutation operator that does not have this property....

  6. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is an important problem that several studies have tried to address. In recent years, probability-based evolutionary algorithms have been proposed as a new, efficient approach to learning Bayesian networks. In this paper, we focus on one of the probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
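
    The sketch below is a minimal PBIL loop showing where a mutation operator acts on the probability vector; the specific operator proposed in the paper is not reproduced, and the shift-towards-random mutation, the parameter values and the toy fitness function are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def pbil(fitness, n_bits, pop_size=50, lr=0.1, mut_prob=0.02, mut_shift=0.05,
                 generations=200):
            """Minimal PBIL: evolve a probability vector instead of a population."""
            p = np.full(n_bits, 0.5)
            for _ in range(generations):
                pop = (rng.random((pop_size, n_bits)) < p).astype(int)
                best = pop[np.argmax([fitness(ind) for ind in pop])]
                p = (1.0 - lr) * p + lr * best            # learn towards the best sample
                mutate = rng.random(n_bits) < mut_prob    # mutation acts on p itself
                p[mutate] = (1.0 - mut_shift) * p[mutate] + mut_shift * rng.random(mutate.sum())
                p = p.clip(0.01, 0.99)
            return p

        # Toy use: maximise the number of ones, a stand-in for a network-structure score.
        print(pbil(lambda ind: ind.sum(), n_bits=20).round(2))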

  7. Pareto Optimal Solutions for Network Defense Strategy Selection Simulator in Multi-Objective Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Yang Sun

    2018-01-01

    Full Text Available Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions and compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system for assisting network administrators in decision-making, specifically in defense strategy selection, and the experimental results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.
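
    To make the scalarization comparison concrete, the sketch below contrasts linear scalarization with a weighted Tchebycheff-style achievement function of the kind that satisficing trade-off style methods build on; the weights, the ideal point and the toy reward vectors are assumptions, and this is not the paper's STOM implementation.

        import numpy as np

        def linear_scalarize(rewards, weights):
            """Linear scalarization of a multi-objective reward vector (to be maximised)."""
            return float(np.dot(weights, rewards))

        def tchebycheff_scalarize(rewards, weights, ideal, rho=1e-3):
            """Weighted Tchebycheff achievement value (to be minimised); ideal holds the
            per-objective best rewards."""
            diff = np.asarray(weights, float) * (np.asarray(ideal, float) - np.asarray(rewards, float))
            return float(diff.max() + rho * diff.sum())

        # Example: two defence actions scored on (attacks blocked, bandwidth preserved).
        ideal, weights = [10.0, 1.0], [0.7, 0.3]
        for r in ([8.0, 0.9], [9.5, 0.4]):
            print(r, linear_scalarize(r, weights), tchebycheff_scalarize(r, weights, ideal))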

  8. Evolutionary Algorithms For Neural Networks Binary And Real Data Classification

    Directory of Open Access Journals (Sweden)

    Dr. Hanan A.R. Akkar

    2015-08-01

    Full Text Available Artificial neural networks are complex networks emulating the way neurons in the human brain process data. They have been widely used in prediction, clustering, classification and association. The training algorithms used to determine the network weights are among the most important factors influencing neural network performance. Recently, many meta-heuristic and evolutionary algorithms have been employed to optimize neural network weights and achieve better performance. This paper aims to use recently proposed algorithms for optimizing neural network weights and to compare their performance with that of classical meta-heuristic algorithms used for the same purpose. To evaluate the performance of these algorithms for training neural networks, we apply them to the classification of four opposite binary XOR clusters and of continuous real data sets such as Iris and Ecoli.

  9. The Genetic-Algorithm-Based Normal Boundary Intersection (GANBI) Method; An Efficient Approach to Pareto Multiobjective Optimization for Engineering Design

    Science.gov (United States)

    2006-05-15

    Overviews of different evolutionary approaches to multiobjective optimal design are given by Van Veldhuizen [7], Van Veldhuizen and Lamont [8], and Zitzler and Thiele. (The indexed record contains only search-snippet fragments of the report's references, including Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, Boston, 1989; D. A. Van Veldhuizen, "Multiobjective Evolutionary Algorithms: Classifications, Analyses, and New Innovations," Ph.D. Dissertation, Air Force Institute of Technology, 1999; and D. A. Van Veldhuizen and G. B. Lamont, "Multiobjective…".)

  10. Analog Group Delay Equalizers Design Based on Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    M. Laipert

    2006-04-01

    Full Text Available This paper deals with a design method for the analog all-pass filter intended for equalization of the group delay frequency response of an analog filter. The method is based on the use of an evolutionary algorithm, the Differential Evolution algorithm in particular. We are able to design such equalizers so as to obtain an equal-ripple group delay frequency response in the pass-band of the low-pass filter. The procedure works automatically, without an initial estimate being supplied. The method is demonstrated on practical examples.
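
    The sketch below is a minimal DE/rand/1/bin loop of the kind such a design procedure can be built on; the toy quadratic objective merely stands in for the actual equal-ripple group delay criterion, and the population size, F and CR values are assumptions rather than the paper's settings.

        import numpy as np

        rng = np.random.default_rng(1)

        def differential_evolution(cost, bounds, pop_size=30, F=0.7, CR=0.9, generations=300):
            """Minimal DE/rand/1/bin minimiser over box bounds [(lo, hi), ...]."""
            lo, hi = np.array(bounds, dtype=float).T
            dim = len(bounds)
            pop = lo + rng.random((pop_size, dim)) * (hi - lo)
            costs = np.array([cost(x) for x in pop])
            for _ in range(generations):
                for i in range(pop_size):
                    others = [j for j in range(pop_size) if j != i]
                    a, b, c = pop[rng.choice(others, 3, replace=False)]
                    mutant = np.clip(a + F * (b - c), lo, hi)        # differential mutation
                    cross = rng.random(dim) < CR
                    cross[rng.integers(dim)] = True                  # guarantee one mutated gene
                    trial = np.where(cross, mutant, pop[i])          # binomial crossover
                    trial_cost = cost(trial)
                    if trial_cost <= costs[i]:                       # greedy selection
                        pop[i], costs[i] = trial, trial_cost
            return pop[np.argmin(costs)], costs.min()

        # Toy use: tune two parameters; a stand-in for all-pass section coefficients.
        best, best_cost = differential_evolution(lambda x: (x[0] - 1.5) ** 2 + (x[1] + 0.5) ** 2,
                                                 bounds=[(-5, 5), (-5, 5)])
        print(best.round(3), round(best_cost, 6))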

  11. On the Truncated Pareto Distribution with applications

    OpenAIRE

    Zaninetti, Lorenzo; Ferraro, Mario

    2008-01-01

    The Pareto probability distribution is widely applied in different fields such as finance, physics, hydrology, geology and astronomy. This note deals with an application of the Pareto distribution to astrophysics, and more precisely to the statistical analysis of the masses of stars and the diameters of asteroids. In particular, a comparison between the usual Pareto distribution and its truncated version is presented. Finally a possible physical mechanism that produces Pareto tails for the distributio...

  12. Pareto joint inversion of 2D magnetotelluric and gravity data

    Science.gov (United States)

    Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek

    2015-04-01

    In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project are described. At this stage of development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. The Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on modified Particle Swarm Optimization (PSO). This algorithm was adapted to handle two or more target functions at once. An additional algorithm was used to eliminate non-realistic solution proposals. Because PSO is a method of stochastic global optimization, it requires many proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. There are many advantages of the proposed treatment of joint inversion problems. First of all, the Pareto scheme eliminates cumbersome rescaling of the target functions, which can highly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data, which are usually very hard to incorporate into a regular inversion scheme. The SBI parameterisation not only limits the problem of dimensionality, but also makes constraining of the solution easier. At this stage of work, the decision to test the approach using MT and gravity data was made, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. The presented results were obtained for synthetic models, imitating real geological conditions, where

  13. Bidirectional Dynamic Diversity Evolutionary Algorithm for Constrained Optimization

    Directory of Open Access Journals (Sweden)

    Weishang Gao

    2013-01-01

    Full Text Available Evolutionary algorithms (EAs) have been shown to be effective for complex constrained optimization problems. However, inflexible exploration-exploitation and an improper penalty in EAs with penalty functions can lead to missing a global optimum that lies near or on the constraint boundary. Determining an appropriate penalty coefficient is also difficult in most studies. In this paper, we propose a bidirectional dynamic diversity evolutionary algorithm (Bi-DDEA) with multiple agents guiding exploration-exploitation through local extrema to the global optimum in suitable steps. In Bi-DDEA, potential advantage is detected by three kinds of agents. The scale and the density of the agents change dynamically according to the emergence of potentially optimal areas, which plays an important role in flexible exploration-exploitation. Meanwhile, a novel double optimum estimation strategy with objective fitness and penalty fitness is suggested to compute, respectively, the dominance trend of agents in the feasible region and the forbidden region. This bidirectional evolution with multiple agents not only effectively avoids the problem of determining the penalty coefficient but also converges quickly to a global optimum near or on the constraint boundary. By examining the rapidity and veracity of Bi-DDEA across benchmark functions, the proposed method is shown to be effective.

  14. Sum-of-squares-based fuzzy controller design using quantum-inspired evolutionary algorithm

    Science.gov (United States)

    Yu, Gwo-Ruey; Huang, Yu-Chia; Cheng, Chih-Yung

    2016-07-01

    In the field of fuzzy control, control gains are obtained by solving stabilisation conditions in the linear-matrix-inequality-based Takagi-Sugeno fuzzy control method and the sum-of-squares-based polynomial fuzzy control method. However, optimal performance requirements are not considered under those stabilisation conditions. In order to handle specific performance problems, this paper proposes a novel design procedure for polynomial fuzzy controllers using quantum-inspired evolutionary algorithms. The first contribution of this paper is the combination of polynomial fuzzy control and quantum-inspired evolutionary algorithms to undertake an optimal-performance controller design. The second contribution is the proposed stability condition derived from the polynomial Lyapunov function. The proposed design approach differs from the traditional approach, in which control gains are obtained by solving the stabilisation conditions. The first step of the controller design uses the quantum-inspired evolutionary algorithms to determine the control gains with the best performance. Then, the stability of the closed-loop system is analysed under the proposed stability conditions. To illustrate effectiveness and validity, the problem of balancing and up-swinging an inverted pendulum on a cart is used.

  15. EFFICIENT MULTI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR JOB SHOP SCHEDULING

    Institute of Scientific and Technical Information of China (English)

    Lei Deming; Wu Zhiming

    2005-01-01

    A new representation method is first presented based on priority rules. According to this method, each entry in the chromosome indicates that, in the procedure of the Giffler and Thompson (GT) algorithm, the conflict occurring on the corresponding machine is resolved by the corresponding priority rule. Then a crowding-measure multi-objective evolutionary algorithm (CMOEA) is designed, in which both archive maintenance and fitness assignment use the crowding measure. Finally, comparisons between CMOEA and SPEA in solving 15 scheduling problems demonstrate that CMOEA is suitable for job shop scheduling.

  16. Evolutionary algorithm for optimization of nonimaging Fresnel lens geometry.

    Science.gov (United States)

    Yamada, N; Nishikawa, T

    2010-06-21

    In this study, an evolutionary algorithm (EA), which consists of genetic and immune algorithms, is introduced to design the optical geometry of a nonimaging Fresnel lens; this lens generates the uniform flux concentration required for a photovoltaic cell. Herein, a design procedure that incorporates a ray-tracing technique in the EA is described, and the validity of the design is demonstrated. The results show that the EA automatically generated a unique geometry of the Fresnel lens; the use of this geometry resulted in better uniform flux concentration with high optical efficiency.

  17. Optimal PMU Placement with Uncertainty Using Pareto Method

    Directory of Open Access Journals (Sweden)

    A. Ketabi

    2012-01-01

    Full Text Available This paper proposes a method for optimal placement of Phasor Measurement Units (PMUs) in state estimation considering uncertainty. State estimation is first turned into an optimization exercise in which the objective function is selected to be the number of unobservable buses, determined based on Singular Value Decomposition (SVD). For the normal condition, the Differential Evolution (DE) algorithm is used to find the optimal placement of PMUs. By considering uncertainty, a multiobjective optimization exercise is then formulated. To achieve this, a DE algorithm based on the Pareto optimum method is proposed here. The suggested strategy is applied to the IEEE 30-bus test system in several case studies to evaluate the optimal PMU placement.
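
    A Pareto-based DE of this kind needs a dominance test and a non-dominated filter; the sketch below shows minimal versions of both, with the candidate objective vectors (e.g. counts of unobservable buses under the normal and an uncertain condition) chosen purely for illustration.

        import numpy as np

        def dominates(a, b):
            """True if objective vector a Pareto-dominates b (all objectives minimised)."""
            a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
            return bool(np.all(a <= b) and np.any(a < b))

        def non_dominated(points):
            """Indices of the Pareto-optimal members of a finite set of objective vectors."""
            return [i for i, p in enumerate(points)
                    if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

        # Example: PMU placements scored by unobservable buses (normal case, uncertain case).
        candidates = [(0, 3), (1, 1), (2, 0), (1, 2)]
        print(non_dominated(candidates))   # -> [0, 1, 2]; (1, 2) is dominated by (1, 1)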

  18. Existence of pareto equilibria for multiobjective games without compactness

    OpenAIRE

    Shiraishi, Yuya; Kuroiwa, Daishi

    2013-01-01

    In this paper, we investigate the existence of Pareto and weak Pareto equilibria for multiobjective games without compactness. By employing an existence theorem of Pareto equilibria due to Yu and Yuan ([10]), several existence theorems of Pareto and weak Pareto equilibria for the multiobjective games are established in a similar way to Flores-Bazán.

  19. Expert-guided evolutionary algorithm for layout design of complex space stations

    Science.gov (United States)

    Qian, Zhiqin; Bi, Zhuming; Cao, Qun; Ju, Weiguo; Teng, Hongfei; Zheng, Yang; Zheng, Siyu

    2017-08-01

    The layout of a space station should be designed in such a way that different equipment and instruments are placed for the station as a whole to achieve the best overall performance. The station layout design is a typical nondeterministic polynomial problem. In particular, how to manage the design complexity to achieve an acceptable solution within a reasonable timeframe poses a great challenge. In this article, a new evolutionary algorithm is proposed to meet this challenge. It is called the expert-guided evolutionary algorithm with a tree-like structure decomposition (EGEA-TSD). The two innovations in EGEA-TSD are: (i) to deal with the design complexity, the entire design space is divided into subspaces with a tree-like structure, which reduces the computation and facilitates experts' involvement in the solving process; (ii) a human-intervention interface is developed to allow experts' involvement in avoiding local optima and accelerating convergence. To validate the proposed algorithm, the layout design of one space station is formulated as a multi-disciplinary design problem, the developed algorithm is programmed and executed, and the result is compared with those from two other algorithms, illustrating the superior performance of the proposed EGEA-TSD.

  20. AN EVOLUTIONARY ALGORITHM FOR FAST INTENSITY BASED IMAGE MATCHING BETWEEN OPTICAL AND SAR SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    P. Fischer

    2018-04-01

    Full Text Available This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity-based matching. Extensions are drawn using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.

  1. Harmonic elimination in diode-clamped multilevel inverter using evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Barkati, Said [Laboratoire d'analyse des Signaux et Systemes (LASS), Universite de M'sila, BP. 166, rue Ichbilia 28000 M'sila (Algeria); Baghli, Lotfi [Groupe de Recherche en Electrotechnique et Electronique de Nancy (GREEN), CNRS UMR 7030, Universite Henri Poincare Nancy 1, BP. 239, 54506 Vandoeuvre-les-Nancy (France); Berkouk, El Madjid; Boucherit, Mohamed-Seghir [Laboratoire de Commande des Processus (LCP), Ecole Nationale Polytechnique, BP. 182, 10 Avenue Hassen Badi, 16200 El Harrach, Alger (Algeria)

    2008-10-15

    This paper describes two evolutionary algorithms for the optimized harmonic stepped-waveform technique. Genetic algorithms and particle swarm optimization are applied to compute the switching angles in a three-phase seven-level inverter to produce the required fundamental voltage while, at the same time, specified harmonics are eliminated. Furthermore, these algorithms are also used to solve the starting point problem of the conventional Newton-Raphson method. This combination provides a very effective method for the harmonic elimination technique. This strategy is useful for different structures of seven-level inverters. The diode-clamped topology is considered in this study. (author)

  2. General upper bounds on the runtime of parallel evolutionary algorithms.

    Science.gov (United States)

    Lässig, Jörg; Sudholt, Dirk

    2014-01-01

    We present a general method for analyzing the runtime of parallel evolutionary algorithms with spatially structured populations. Based on the fitness-level method, it yields upper bounds on the expected parallel runtime. This allows for a rigorous estimate of the speedup gained by parallelization. Tailored results are given for common migration topologies: ring graphs, torus graphs, hypercubes, and the complete graph. Example applications for pseudo-Boolean optimization show that our method is easy to apply and that it gives powerful results. In our examples the performance guarantees improve with the density of the topology. Surprisingly, even sparse topologies such as ring graphs lead to a significant speedup for many functions while not increasing the total number of function evaluations by more than a constant factor. We also identify which number of processors leads to the best guaranteed speedups, thus giving hints on how to parameterize parallel evolutionary algorithms.

  3. RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.

    Science.gov (United States)

    Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A

    2013-12-01

    Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.

  4. Comparison of evolutionary algorithms in gene regulatory network model inference.

    LENUS (Irish Health Repository)

    2010-01-01

    ABSTRACT: BACKGROUND: The evolution of high throughput technologies that measure gene expression levels has created a database for inferring gene regulatory networks (GRNs), a process also known as reverse engineering of GRNs. However, the nature of these data has made this process very difficult. At the moment, several methods of discovering qualitative causal relationships between genes with high accuracy from microarray data exist, but large scale quantitative analysis on real biological datasets cannot be performed, to date, as existing approaches are not suitable for real microarray data, which are noisy and insufficient. RESULTS: This paper performs an analysis of several existing evolutionary algorithms for quantitative gene regulatory network modelling. The aim is to present the techniques used and offer a comprehensive comparison of approaches, under a common framework. Algorithms are applied to both synthetic and real gene expression data from DNA microarrays, and their ability to reproduce biological behaviour, scalability and robustness to noise are assessed and compared. CONCLUSIONS: Presented is a comparison framework for assessment of evolutionary algorithms used to infer gene regulatory networks. Promising methods are identified and a platform for development of appropriate model formalisms is established.

  5. A divide and conquer approach to determine the Pareto frontier for optimization of protein engineering experiments

    Science.gov (United States)

    He, Lu; Friedman, Alan M.; Bailey-Kellogg, Chris

    2016-01-01

    In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability vs. novelty, affinity vs. specificity, activity vs. immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not “dominated”; i.e., no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), in order to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, PEPFR (Protein Engineering Pareto FRontier), that hierarchically subdivides the objective space, employing appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. PMID:22180081

  6. A divide-and-conquer approach to determine the Pareto frontier for optimization of protein engineering experiments.

    Science.gov (United States)

    He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris

    2012-03-01

    In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability versus novelty, affinity versus specificity, activity versus immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not "dominated"; that is, no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, Protein Engineering Pareto FRontier (PEPFR), that hierarchically subdivides the objective space, using appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. Copyright © 2011 Wiley Periodicals, Inc.

  7. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To avoid the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high

  8. A kNN method that uses a non-natural evolutionary algorithm for ...

    African Journals Online (AJOL)

    We used this algorithm for component selection of a kNN (k Nearest Neighbor) method for breast cancer prognosis. Results with the UCI prognosis data set show that we can find components that help improve the accuracy of kNN by almost 3%, raising it above 79%. Keywords: kNN; classification; evolutionary algorithm; ...

  9. Low emittance lattice optimization using a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Gao Weiwei; Wang Lin; Li Weimin; He Duohui

    2011-01-01

    A low emittance lattice design and optimization procedure are systematically studied with a non-dominated sorting-based multi-objective evolutionary algorithm which not only globally searches the low emittance lattice, but also optimizes some beam quantities such as betatron tunes, momentum compaction factor and dispersion function simultaneously. In this paper the detailed algorithm and lattice design procedure are presented. The Hefei light source upgrade project storage ring lattice, with fixed magnet layout, is designed to illustrate this optimization procedure. (authors)

  10. Parameterless evolutionary algorithm applied to the nuclear reload problem

    International Nuclear Information System (INIS)

    Caldas, Gustavo Henrique Flores; Schirru, Roberto

    2008-01-01

    In this work, an evolutionary algorithm with no parameters called FPBIL (parameter-free PBIL) is developed based on PBIL (population-based incremental learning). The analysis reveals how the parameters of PBIL can be replaced by self-adaptive mechanisms that emerge from the radically different way in which the evolution is processed. Despite these advantages, FPBIL proves to be compact and relatively modest in its use of computational resources. FPBIL is then applied to the nuclear reload problem. The experimental results are compared with those of other works and corroborate the superiority of the new algorithm.

  11. A Survey on Evolutionary Algorithm Based Hybrid Intelligence in Bioinformatics

    Directory of Open Access Journals (Sweden)

    Shan Li

    2014-01-01

    Full Text Available With the rapid advance in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for bioinformaticians to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, the hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction to the applications of hybrid intelligent methods, in particular those based on evolutionary algorithms, in bioinformatics. In particular, we focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks.

  12. Identification of Water Diffusivity of Inorganic Porous Materials Using Evolutionary Algorithms

    Czech Academy of Sciences Publication Activity Database

    Kočí, J.; Maděra, J.; Jerman, M.; Keppert, M.; Svora, Petr; Černý, R.

    2016-01-01

    Roč. 113, č. 1 (2016), s. 51-66 ISSN 0169-3913 Institutional support: RVO:61388980 Keywords : Evolutionary algorithms * Water transport * Inorganic porous materials * Inverse analysis Subject RIV: CA - Inorganic Chemistry Impact factor: 2.205, year: 2016

  13. Comparison of some evolutionary algorithms for optimization of the path synthesis problem

    Science.gov (United States)

    Grabski, Jakub Krzysztof; Walczak, Tomasz; Buśkiewicz, Jacek; Michałowska, Martyna

    2018-01-01

    The paper presents comparison of the results obtained in a mechanism synthesis by means of some selected evolutionary algorithms. The optimization problem considered in the paper as an example is the dimensional synthesis of the path generating four-bar mechanism. In order to solve this problem, three different artificial intelligence algorithms are employed in this study.

  14. Modelling and Pareto optimization of mechanical properties of friction stir welded AA7075/AA5083 butt joints using neural network and particle swarm algorithm

    International Nuclear Information System (INIS)

    Shojaeefard, Mohammad Hasan; Behnagh, Reza Abdi; Akbari, Mostafa; Givi, Mohammad Kazem Besharati; Farhani, Foad

    2013-01-01

    Highlights: ► Defect-free friction stir welds have been produced for AA5083-O/AA7075-O. ► Back-propagation was sufficient for predicting hardness and tensile strength. ► A hybrid multi-objective algorithm is proposed to deal with this MOP. ► Multi-objective particle swarm optimization was used to find the Pareto solutions. ► TOPSIS is used to rank the given alternatives of the Pareto solutions. -- Abstract: Friction Stir Welding (FSW) has been successfully used to weld similar and dissimilar cast and wrought aluminium alloys, especially aircraft aluminium alloys, which generally exhibit low weldability with the traditional fusion welding process. This paper focuses on the microstructural and mechanical properties of the Friction Stir Welding (FSW) of AA7075-O to AA5083-O aluminium alloys. Weld microstructures, hardness and tensile properties were evaluated in the as-welded condition. Tensile tests indicated that the mechanical properties of the joint were better than those of the base metals. An Artificial Neural Network (ANN) model was developed to simulate the correlation between the Friction Stir Welding parameters and mechanical properties. The performance of the ANN model was excellent and the model was employed to predict the ultimate tensile strength and hardness of the AA7075–AA5083 butt joint as functions of weld and rotational speeds. Multi-objective particle swarm optimization was used to obtain the Pareto-optimal set. Finally, the Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS) was applied to determine the best compromise solution.
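
    The sketch below is a minimal TOPSIS ranking of the kind used to pick a compromise from a Pareto set; the criterion weights and the toy strength/hardness values are assumptions for illustration, not data from the paper.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """Rank alternatives with TOPSIS; the highest closeness value is the best compromise.

            matrix:  (n_alternatives, n_criteria) decision matrix
            weights: criterion weights summing to 1
            benefit: True where larger is better, False where smaller is better
            """
            m = np.asarray(matrix, dtype=float)
            v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
            benefit = np.asarray(benefit)
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_plus = np.linalg.norm(v - ideal, axis=1)
            d_minus = np.linalg.norm(v - anti, axis=1)
            return d_minus / (d_plus + d_minus)

        # Toy Pareto solutions scored on (tensile strength, hardness); both are benefits.
        scores = topsis([[310, 95], [330, 88], [295, 102]], weights=[0.5, 0.5], benefit=[True, True])
        print(scores.round(3), "best:", int(scores.argmax()))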

  15. The mixing evolutionary algorithm : indepedent selection and allocation of trials

    NARCIS (Netherlands)

    C.H.M. van Kemenade

    1997-01-01

    When using an evolutionary algorithm to solve a problem involving building blocks we have to grow the building blocks and then mix these building blocks to obtain the (optimal) solution. Finding a good balance between the growing and the mixing process is a prerequisite to get a reliable

  16. DNA evolutionary algorithm (DNAEA) for source term identification in convection-diffusion equation

    International Nuclear Information System (INIS)

    Yang, X-H; Hu, X-X; Shen, Z-Y

    2008-01-01

    The source identification problem is recast as an optimization problem in this paper. This is a complicated nonlinear optimization problem that is largely intractable with traditional optimization methods. Therefore, a DNA evolutionary algorithm (DNAEA) is presented to solve it. In this algorithm, an initial population is generated by a chaos algorithm. As the search range shrinks, DNAEA gradually converges towards an optimal result using the excellent individuals it obtains. The position and intensity of the pollution source are identified accurately with DNAEA. Compared with a Gray-coded genetic algorithm and a pure random search algorithm, DNAEA has a faster convergence speed and higher calculation precision.

  17. Large-Scale Portfolio Optimization Using Multiobjective Evolutionary Algorithms and Preselection Methods

    Directory of Open Access Journals (Sweden)

    B. Y. Qu

    2017-01-01

    Full Text Available Portfolio optimization problems involve selecting different assets to invest in so as to maximize the overall return and minimize the overall risk simultaneously. The complexity of the optimal asset allocation problem increases with an increase in the number of assets available to select from for investing. The optimization problem becomes computationally challenging when there are more than a few hundred assets to select from. To reduce the complexity of large-scale portfolio optimization, two asset preselection procedures that consider the return and risk of each individual asset and pairwise correlations to remove assets that may not potentially be selected into any portfolio are proposed in this paper. With these asset preselection methods, the number of assets considered for inclusion in a portfolio can be increased to thousands. To test the effectiveness of the proposed methods, a Normalized Multiobjective Evolutionary Algorithm based on Decomposition (NMOEA/D) algorithm and several other commonly used multiobjective evolutionary algorithms are applied and compared. Six experiments with different settings are carried out. The experimental results show that with the proposed methods the simulation time is reduced while return-risk trade-off performances are significantly improved. Meanwhile, NMOEA/D is able to outperform the other compared algorithms in all experiments according to the comparative analysis.
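
    One plausible reading of such a preselection filter is sketched below: an asset is dropped when another, highly correlated asset offers at least as much return with no more risk, so the dropped asset is unlikely to enter any portfolio. The rule, the correlation threshold and the toy data are assumptions for illustration, not the procedures proposed in the paper.

        import numpy as np

        def preselect_assets(returns, risks, corr, corr_threshold=0.95):
            """Indices of assets kept after a simple correlation-aware dominance filter."""
            returns, risks = np.asarray(returns, float), np.asarray(risks, float)
            corr = np.asarray(corr, float)
            keep = []
            for i in range(len(returns)):
                dominated = False
                for j in range(len(returns)):
                    if j == i or corr[i, j] < corr_threshold:
                        continue   # only compare against highly correlated substitutes
                    if (returns[j] >= returns[i] and risks[j] <= risks[i]
                            and (returns[j] > returns[i] or risks[j] < risks[i])):
                        dominated = True
                        break
                if not dominated:
                    keep.append(i)
            return keep

        # Toy universe of four assets; assets 0 and 1 are near-duplicates.
        rets = [0.08, 0.075, 0.12, 0.05]
        risk = [0.10, 0.11, 0.25, 0.04]
        corr = [[1.0, 0.97, 0.3, 0.1], [0.97, 1.0, 0.2, 0.1],
                [0.3, 0.2, 1.0, 0.0], [0.1, 0.1, 0.0, 1.0]]
        print(preselect_assets(rets, risk, corr))   # asset 1 is filtered out -> [0, 2, 3]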

  18. Evolutionary algorithms for the Vehicle Routing Problem with Time Windows

    NARCIS (Netherlands)

    Bräysy, Olli; Dullaert, Wout; Gendreau, Michel

    2004-01-01

    This paper surveys the research on evolutionary algorithms for the Vehicle Routing Problem with Time Windows (VRPTW). The VRPTW can be described as the problem of designing least cost routes from a single depot to a set of geographically scattered points. The routes must be designed in such a way

  19. A Novel Evolutionary Algorithm Inspired by Beans Dispersal

    Directory of Open Access Journals (Sweden)

    Xiaoming Zhang

    2013-02-01

    Full Text Available Inspired by the dispersal of beans in nature, a novel evolutionary algorithm, the Bean Optimization Algorithm (BOA), is proposed in this paper. BOA is mainly based on the normal distribution, which is an important continuous probability distribution of quantitative phenomena. By simulating the self-adaptive phenomena of plants, BOA is designed for solving continuous optimization problems. We also analyze the global convergence of BOA by using Solis and Wets' research results. The conclusion is that BOA can converge to the global optimal solution with probability one. In order to validate its effectiveness, BOA is tested against benchmark functions, and its performance is also compared with that of the particle swarm optimization (PSO) algorithm. The experimental results show that BOA is competitive with PSO in terms of accuracy and convergence speed on the explored tests and stands out as a promising alternative to existing optimization methods for engineering designs or applications.

  20. XTALOPT: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Lonie, David C.; Zurek, Eva

    2011-02-01

    The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface.
    Program summary. Program title: XTALOPT. Catalogue identifier: AEGX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPL v2.1 or later [1]. No. of lines in distributed program, including test data, etc.: 36 849. No. of bytes in distributed program, including test data, etc.: 1 149 399. Distribution format: tar.gz. Programming language: C++. Computer: PCs, workstations, or clusters. Operating system: Linux. Classification: 7.7. External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. Our evolutionary algorithm, XTALOPT, is freely

  1. An efficient non-dominated sorting method for evolutionary algorithms.

    Science.gov (United States)

    Fang, Hongbing; Wang, Qian; Tu, Yi-Cheng; Horstemeyer, Mark F

    2008-01-01

    We present a new non-dominated sorting algorithm to generate the non-dominated fronts in multi-objective optimization with evolutionary algorithms, particularly the NSGA-II. The non-dominated sorting algorithm used by NSGA-II has a time complexity of O(MN²) in generating non-dominated fronts in one generation (iteration) for a population size N and M objective functions. Since generating non-dominated fronts takes the majority of total computational time (excluding the cost of fitness evaluations) of NSGA-II, making this algorithm faster will significantly improve the overall efficiency of NSGA-II and other genetic algorithms using non-dominated sorting. The new non-dominated sorting algorithm proposed in this study reduces the number of redundant comparisons existing in the algorithm of NSGA-II by recording the dominance information among solutions from their first comparisons. By utilizing a new data structure called the dominance tree and the divide-and-conquer mechanism, the new algorithm is faster than NSGA-II for different numbers of objective functions. Although the number of solution comparisons by the proposed algorithm is close to that of NSGA-II when the number of objectives becomes large, the total computational time shows that the proposed algorithm still has better efficiency because of the adoption of the dominance tree structure and the divide-and-conquer mechanism.
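
    For reference, the sketch below implements the baseline fast non-dominated sort that the paper sets out to accelerate, splitting a set of objective vectors (all minimised) into successive fronts; it is the standard procedure, not the proposed dominance-tree algorithm, and the toy points are illustrative.

        def fast_non_dominated_sort(points):
            """Return a list of fronts, each a list of indices into points."""
            n = len(points)
            dominated_by = [[] for _ in range(n)]   # indices that solution i dominates
            counts = [0] * n                        # how many solutions dominate i
            fronts = [[]]
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    i_le = all(a <= b for a, b in zip(points[i], points[j]))
                    j_le = all(b <= a for a, b in zip(points[i], points[j]))
                    if i_le and not j_le:
                        dominated_by[i].append(j)   # i dominates j
                    elif j_le and not i_le:
                        counts[i] += 1              # j dominates i
                if counts[i] == 0:
                    fronts[0].append(i)
            k = 0
            while fronts[k]:
                nxt = []
                for i in fronts[k]:
                    for j in dominated_by[i]:
                        counts[j] -= 1
                        if counts[j] == 0:
                            nxt.append(j)
                fronts.append(nxt)
                k += 1
            return fronts[:-1]

        print(fast_non_dominated_sort([(1, 4), (2, 2), (3, 1), (3, 3), (4, 4)]))
        # -> [[0, 1, 2], [3], [4]]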

  2. A New Evolutionary Algorithm Based on Bacterial Evolution and Its Application for Scheduling A Flexible Manufacturing System

    Directory of Open Access Journals (Sweden)

    Chandramouli Anandaraman

    2012-01-01

    Full Text Available A new evolutionary computation algorithm, the Superbug algorithm, which simulates the evolution of bacteria in a culture, is proposed. The algorithm is developed for solving large-scale optimization problems such as scheduling, transportation and assignment problems. In this work, the algorithm optimizes machine schedules in a Flexible Manufacturing System (FMS) by minimizing makespan. The FMS comprises four machines and two identical Automated Guided Vehicles (AGVs). The AGVs are used for carrying jobs between the Load/Unload (L/U) station and the machines. Experimental results indicate that the optimization performance of the proposed algorithm in scheduling is noticeably superior to that of other evolutionary algorithms, when compared with the best results reported in the literature for FMS scheduling.

  3. Design of a centrifugal compressor impeller using multi-objective optimization algorithm

    International Nuclear Information System (INIS)

    Kim, Jin Hyuk; Husain, Afzal; Kim, Kwang Yong; Choi, Jae Ho

    2009-01-01

    This paper presents a design optimization of a centrifugal compressor impeller with a hybrid multi-objective evolutionary algorithm (hybrid MOEA). Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model are discretized by finite volume approximations and solved on hexahedral grids for flow analyses. Two objectives, i.e., isentropic efficiency and total pressure ratio, are selected, with four design variables defining the impeller hub and shroud contours in meridional contours, to optimize the system. The Non-dominated Sorting Genetic Algorithm (NSGA-II) with an ε-constraint strategy for local search, coupled with a Radial Basis Neural Network model, is used for multi-objective optimization. The optimization results show that the isentropic efficiencies and total pressure ratios of the five cluster points at the Pareto-optimal solutions are enhanced by multi-objective optimization.

  4. Design of a centrifugal compressor impeller using multi-objective optimization algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Hyuk; Husain, Afzal; Kim, Kwang Yong [Inha University, Incheon (Korea, Republic of); Choi, Jae Ho [Samsung Techwin Co., Ltd., Changwon (Korea, Republic of)

    2009-07-01

    This paper presents a design optimization of a centrifugal compressor impeller with a hybrid multi-objective evolutionary algorithm (hybrid MOEA). Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model are discretized by finite volume approximations and solved on hexahedral grids for flow analyses. Two objectives, i.e., isentropic efficiency and total pressure ratio, are selected, with four design variables defining the impeller hub and shroud contours in meridional contours, to optimize the system. The Non-dominated Sorting Genetic Algorithm (NSGA-II) with an ε-constraint strategy for local search, coupled with a Radial Basis Neural Network model, is used for multi-objective optimization. The optimization results show that the isentropic efficiencies and total pressure ratios of the five cluster points at the Pareto-optimal solutions are enhanced by multi-objective optimization.

  5. Optimal design of a spherical parallel manipulator based on kinetostatic performance using evolutionary techniques

    Energy Technology Data Exchange (ETDEWEB)

    Daneshmand, Morteza [University of Tartu, Tartu (Estonia); Saadatzi, Mohammad Hossein [Colorado School of Mines, Golden (United States); Kaloorazi, Mohammad Hadi [École de Technologie Supérieure, Montréal (Canada); Masouleh, Mehdi Tale [University of Tehran, Tehran (Iran, Islamic Republic of); Anbarjafari, Gholamreza [Hasan Kalyoncu University, Gaziantep (Turkey)

    2016-03-15

    This study aims to provide an optimal design for a spherical parallel manipulator (SPM), namely the Agile Eye. This aim is approached by investigating kinetostatic performance and workspace and searching for the most promising design. Previously recommended designs are examined to determine whether they provide acceptable kinetostatic performance and workspace. Optimal designs are provided according to different kinetostatic performance indices, especially kinematic sensitivity. The optimization process is launched based on the concept of the genetic algorithm. A single-objective process is implemented in accordance with the guidelines of an evolutionary algorithm called differential evolution. A multi-objective procedure is then provided following the reasoning of the nondominated sorting genetic algorithm-II. This process results in several sets of Pareto points for reconciliation between kinetostatic performance indices and workspace. The concept of the various kinetostatic performance indices and the results of the optimization algorithms are elaborated. The conclusions give hints on the resulting set of designs and their ability to deliver a well-conditioned workspace and acceptable kinetostatic performance for the SPM under study, and the approach can be readily extended to other types of SPMs.

  6. Multi-objective hierarchical genetic algorithms for multilevel redundancy allocation optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Ranjan [Department of Aeronautics and Astronautics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto 606-8501 (Japan)], E-mail: ranjan.k@ks3.ecs.kyoto-u.ac.jp; Izui, Kazuhiro [Department of Aeronautics and Astronautics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto 606-8501 (Japan)], E-mail: izui@prec.kyoto-u.ac.jp; Yoshimura, Masataka [Department of Aeronautics and Astronautics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto 606-8501 (Japan)], E-mail: yoshimura@prec.kyoto-u.ac.jp; Nishiwaki, Shinji [Department of Aeronautics and Astronautics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto 606-8501 (Japan)], E-mail: shinji@prec.kyoto-u.ac.jp

    2009-04-15

    Multilevel redundancy allocation optimization problems (MRAOPs) occur frequently when attempting to maximize the system reliability of a hierarchical system, and almost all complex engineering systems are hierarchical. Despite their practical significance, limited research has been done concerning the solving of simple MRAOPs. These problems are not only NP hard but also involve hierarchical design variables. Genetic algorithms (GAs) have been applied in solving MRAOPs, since they are computationally efficient in solving such problems, unlike exact methods, but their application has been confined to single-objective formulations of MRAOPs. This paper proposes a multi-objective formulation of MRAOPs and a methodology for solving such problems. In this methodology, a hierarchical GA framework for multi-objective optimization is proposed by introducing hierarchical genotype encoding for design variables. In addition, we implement the proposed approach by integrating the hierarchical genotype encoding scheme with two popular multi-objective genetic algorithms (MOGAs): the strength Pareto evolutionary algorithm (SPEA2) and the non-dominated sorting genetic algorithm (NSGA-II). In the provided numerical examples, the proposed multi-objective hierarchical approach is applied to solve two hierarchical MRAOPs, a 4-level and a 3-level problem. The proposed method is compared with a single-objective optimization method that uses a hierarchical genetic algorithm (HGA), also applied to solve the 3- and 4-level problems. The results show that a multi-objective hierarchical GA (MOHGA) that includes elitism and a diversity-preserving mechanism performed better than a single-objective GA that only uses elitism, when solving large-scale MRAOPs. Additionally, the experimental results show that the proposed method with NSGA-II outperformed the proposed method with SPEA2 in finding useful Pareto optimal solution sets.

  7. Multi-objective hierarchical genetic algorithms for multilevel redundancy allocation optimization

    International Nuclear Information System (INIS)

    Kumar, Ranjan; Izui, Kazuhiro; Yoshimura, Masataka; Nishiwaki, Shinji

    2009-01-01

    Multilevel redundancy allocation optimization problems (MRAOPs) occur frequently when attempting to maximize the system reliability of a hierarchical system, and almost all complex engineering systems are hierarchical. Despite their practical significance, limited research has been done concerning the solving of simple MRAOPs. These problems are not only NP hard but also involve hierarchical design variables. Genetic algorithms (GAs) have been applied in solving MRAOPs, since they are computationally efficient in solving such problems, unlike exact methods, but their application has been confined to single-objective formulations of MRAOPs. This paper proposes a multi-objective formulation of MRAOPs and a methodology for solving such problems. In this methodology, a hierarchical GA framework for multi-objective optimization is proposed by introducing hierarchical genotype encoding for design variables. In addition, we implement the proposed approach by integrating the hierarchical genotype encoding scheme with two popular multi-objective genetic algorithms (MOGAs): the strength Pareto evolutionary algorithm (SPEA2) and the non-dominated sorting genetic algorithm (NSGA-II). In the provided numerical examples, the proposed multi-objective hierarchical approach is applied to solve two hierarchical MRAOPs, a 4-level and a 3-level problem. The proposed method is compared with a single-objective optimization method that uses a hierarchical genetic algorithm (HGA), also applied to solve the 3- and 4-level problems. The results show that a multi-objective hierarchical GA (MOHGA) that includes elitism and a diversity-preserving mechanism performed better than a single-objective GA that only uses elitism, when solving large-scale MRAOPs. Additionally, the experimental results show that the proposed method with NSGA-II outperformed the proposed method with SPEA2 in finding useful Pareto optimal solution sets.

  8. Predicting patchy particle crystals: variable box shape simulations and evolutionary algorithms.

    Science.gov (United States)

    Bianchi, Emanuela; Doppelbauer, Günther; Filion, Laura; Dijkstra, Marjolein; Kahl, Gerhard

    2012-06-07

    We consider several patchy particle models that have been proposed in the literature and investigate their candidate crystal structures in a systematic way. We compare two different algorithms for predicting crystal structures: (i) an approach based on Monte Carlo simulations in the isobaric-isothermal ensemble and (ii) an optimization technique based on ideas of evolutionary algorithms. We show that the two methods are equally successful and provide consistent results on crystalline phases of patchy particle systems.

  9. Modelling Evolutionary Algorithms with Stochastic Differential Equations.

    Science.gov (United States)

    Heredia, Jorge Pérez

    2017-11-20

    There has been renewed interest in modelling the behaviour of evolutionary algorithms (EAs) by more traditional mathematical objects, such as ordinary differential equations or Markov chains. The advantage is that the analysis becomes greatly facilitated due to the existence of well established methods. However, this typically comes at the cost of disregarding information about the process. Here, we introduce the use of stochastic differential equations (SDEs) for the study of EAs. SDEs can produce simple analytical results for the dynamics of stochastic processes, unlike Markov chains which can produce rigorous but unwieldy expressions about the dynamics. On the other hand, unlike ordinary differential equations (ODEs), they do not discard information about the stochasticity of the process. We show that these are especially suitable for the analysis of fixed budget scenarios and present analogues of the additive and multiplicative drift theorems from runtime analysis. In addition, we derive a new more general multiplicative drift theorem that also covers non-elitist EAs. This theorem simultaneously allows for positive and negative results, providing information on the algorithm's progress even when the problem cannot be optimised efficiently. Finally, we provide results for some well-known heuristics namely Random Walk (RW), Random Local Search (RLS), the (1+1) EA, the Metropolis Algorithm (MA), and the Strong Selection Weak Mutation (SSWM) algorithm.
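
    The flavour of such runtime statements can be illustrated with a small experiment that is not taken from the thesis: the classical multiplicative drift upper bound for the (1+1) EA on OneMax, E[T] <= e*n*(1 + ln n), compared against empirical runtimes. The problem size and number of repetitions below are arbitrary.

        import random, math

        def one_plus_one_ea_runtime(n, rng=random):
            """Run the (1+1) EA on OneMax (maximize the number of ones) and return
            the number of iterations until the all-ones string is found."""
            x = [rng.randint(0, 1) for _ in range(n)]
            fit = sum(x)
            t = 0
            while fit < n:
                y = [b ^ (rng.random() < 1.0 / n) for b in x]  # flip each bit w.p. 1/n
                fy = sum(y)
                if fy >= fit:                                   # elitist acceptance
                    x, fit = y, fy
                t += 1
            return t

        n, runs = 100, 20
        avg = sum(one_plus_one_ea_runtime(n) for _ in range(runs)) / runs
        # Multiplicative drift bound: with X_t = number of zero bits the expected
        # one-step decrease is at least X_t/(e*n), giving E[T] <= e*n*(1 + ln n).
        bound = math.e * n * (1 + math.log(n))
        print(f"empirical mean runtime: {avg:.0f}, drift upper bound: {bound:.0f}")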

  10. The Application of Fitness Sharing Method in Evolutionary Algorithm to Optimizing the Travelling Salesman Problem (TSP)

    Directory of Open Access Journals (Sweden)

    Nurmaulidar Nurmaulidar

    2015-04-01

    Full Text Available The Travelling Salesman Problem (TSP) is a complex optimization problem that is difficult to solve and requires quite a long time for a large number of cities. Evolutionary algorithms are heuristic methods well suited to such complex optimization problems. Like many other algorithms, an evolutionary algorithm can also experience premature convergence, whereby variation is eliminated from a population of fairly fit individuals before a complete solution is achieved. It therefore requires a method to delay convergence. A specific form of fitness sharing called phenotype fitness sharing has been used in this research. The aim of this research is to find out whether fitness sharing in an evolutionary algorithm is able to optimize the TSP. Two concepts of the evolutionary algorithm are used in this research: the first uses single elitism and the other uses a federated solution. The two concepts were tested with the fitness sharing method using thresholds of 0.25, 0.50 and 0.75, and the results were then compared to a method without fitness sharing. The results indicate that with the single elitism concept, fitness sharing gives a more optimal result for data of 100-1000 cities, whereas with the federated solution concept, fitness sharing yields a more optimal result for data above 1000 cities, as well as a better spread of solutions compared to the method without fitness sharing.
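
    A minimal sketch of the fitness sharing mechanism mentioned above (raw fitness divided by a niche count computed from pairwise phenotype distances and a sharing threshold such as 0.25, 0.50 or 0.75) might look as follows; the distance values and fitnesses in the usage example are made up, and this is not the authors' code.

        def shared_fitness(fitnesses, distances, sigma=0.25, alpha=1.0):
            """Classic fitness sharing: an individual's raw fitness is divided by its
            niche count, so clusters of similar individuals are penalised and
            diversity is preserved.  distances[i][j] is a (normalised) phenotype
            distance; sigma is the sharing threshold (e.g. 0.25, 0.50, 0.75)."""
            n = len(fitnesses)
            shared = []
            for i in range(n):
                niche = 0.0
                for j in range(n):
                    d = distances[i][j]
                    if d < sigma:
                        niche += 1.0 - (d / sigma) ** alpha
                shared.append(fitnesses[i] / niche)   # niche >= 1 because d(i,i) = 0
            return shared

        # Tiny usage example: two near-identical tours and one distinct tour.
        fit = [10.0, 10.0, 9.0]
        dist = [[0.0, 0.1, 0.6],
                [0.1, 0.0, 0.6],
                [0.6, 0.6, 0.0]]
        print(shared_fitness(fit, dist, sigma=0.25))   # the distinct tour is favoured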

  11. Post Pareto optimization-A case

    Science.gov (United States)

    Popov, Stoyan; Baeva, Silvia; Marinova, Daniela

    2017-12-01

    Simulation performance may be evaluated according to multiple quality measures that are in competition and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training.

  12. Multiobjective RFID Network Optimization Using Multiobjective Evolutionary and Swarm Intelligence Approaches

    Directory of Open Access Journals (Sweden)

    Hanning Chen

    2014-01-01

    Full Text Available The development of radio frequency identification (RFID) technology gives rise to the challenging RFID network planning (RNP) problem, which needs to be solved in order to operate a large-scale RFID network in an optimal fashion. RNP involves many objectives and constraints and has been proven to be an NP-hard multi-objective problem. The application of evolutionary algorithms (EA) and swarm intelligence (SI) for solving multiobjective RNP (MORNP) has gained significant attention in the literature, but these algorithms usually transform the multiple objectives into a single objective by a weighted coefficient approach. In this paper, we use multiobjective EA and SI algorithms to find all the Pareto optimal solutions and to achieve the optimal planning solutions by simultaneously optimizing four conflicting objectives in MORNP, instead of transforming the multiobjective functions into a single objective function. The experiment presents an exhaustive comparison of three successful multiobjective EA and SI algorithms, namely, the recently developed multiobjective artificial bee colony algorithm (MOABC), the nondominated sorting genetic algorithm II (NSGA-II), and multiobjective particle swarm optimization (MOPSO), on MORNP instances of different nature, namely, the two-objective and three-objective MORNP. Simulation results show that MOABC proves to be superior to NSGA-II and MOPSO for planning RFID networks in terms of optimization accuracy and computation robustness.

  13. A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.

    Science.gov (United States)

    Brusco, Michael J; Steinley, Douglas

    2012-02-01

    There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set. © 2011 The British Psychological Society.
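
    The supported/unsupported distinction can be made concrete with a tiny numerical example (the values are invented): a dominance filter recovers the full Pareto efficient set, while scanning convex weight combinations only ever returns the supported solutions.

        def pareto_set(points):
            """All nondominated points (both objectives minimised)."""
            dom = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
            return [p for p in points if not any(dom(q, p) for q in points)]

        def weighted_sum_winners(points, n_weights=101):
            """Points that are optimal for at least one convex weighted sum; these
            are exactly the supported Pareto efficient solutions."""
            winners = set()
            for k in range(n_weights):
                w = k / (n_weights - 1)
                winners.add(min(points, key=lambda p: w * p[0] + (1 - w) * p[1]))
            return winners

        # Made-up bi-criterion values: (2.5, 2.5) is Pareto efficient but unsupported,
        # so no weighted sum ever selects it.
        pts = [(0.0, 4.0), (4.0, 0.0), (2.5, 2.5), (3.0, 3.5)]
        print("Pareto efficient:", pareto_set(pts))
        print("weighted-sum (supported) only:", weighted_sum_winners(pts))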

  14. A new mechanism for maintaining diversity of Pareto archive in multi-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Hájek, J.; Szöllös, A.; Šístek, Jakub

    2010-01-01

    Roč. 41, 7-8 (2010), s. 1031-1057 ISSN 0965-9978 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords : multi-objective optimization * micro-genetic algorithm * diversity * Pareto archive Subject RIV: BA - General Mathematics Impact factor: 1.004, year: 2010 http://www.sciencedirect.com/science/article/pii/S0965997810000451

  15. A new mechanism for maintaining diversity of Pareto archive in multi-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Hájek, J.; Szöllös, A.; Šístek, Jakub

    2010-01-01

    Roč. 41, 7-8 (2010), s. 1031-1057 ISSN 0965-9978 R&D Projects: GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords : multi-objective optimization * micro-genetic algorithm * diversity * Pareto archive Subject RIV: BA - General Mathematics Impact factor: 1.004, year: 2010 http://www.sciencedirect.com/science/article/pii/S0965997810000451

  16. A chaos-based evolutionary algorithm for general nonlinear programming problems

    International Nuclear Information System (INIS)

    El-Shorbagy, M.A.; Mousa, A.A.; Nasr, S.M.

    2016-01-01

    In this paper we present a chaos-based evolutionary algorithm (EA) for solving nonlinear programming problems, named the chaotic genetic algorithm (CGA). CGA integrates a genetic algorithm (GA) with a chaotic local search (CLS) strategy to accelerate the search for the optimum and to speed up convergence to the global solution. The integration of the global search represented by the genetic algorithm with the CLS procedure should offer the advantages of both optimization methods while offsetting their disadvantages. In this way, the method is intended to enhance global convergence and to prevent the search from becoming stuck at a local solution. The inherent characteristics of chaos can enhance optimization algorithms by enabling them to escape from local solutions and accelerate convergence to the global solution. Twelve chaotic maps have been analyzed in the proposed approach. The simulation results on the CEC'2005 benchmark set show that chaotic mapping can be an effective strategy for improving the performance of EAs.
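
    A chaotic local search step of the kind combined with the GA above can be sketched as follows, here using the logistic map as one possible chaotic map; the neighbourhood radius, step count, and test function are assumptions for illustration only.

        import random

        def logistic_map(z, mu=4.0):
            """One step of the logistic map; for mu = 4 the orbit is chaotic on (0, 1)."""
            return mu * z * (1.0 - z)

        def chaotic_local_search(x, f, bounds, steps=50, radius=0.1):
            """Perturb the current best solution with a chaotic sequence mapped into a
            small neighbourhood and keep any improvement (minimisation assumed)."""
            best, best_f = list(x), f(x)
            z = [random.uniform(0.01, 0.99) for _ in x]       # chaotic state per variable
            for _ in range(steps):
                z = [logistic_map(zi) for zi in z]
                cand = [min(hi, max(lo, xi + radius * (hi - lo) * (2.0 * zi - 1.0)))
                        for xi, zi, (lo, hi) in zip(best, z, bounds)]
                cf = f(cand)
                if cf < best_f:
                    best, best_f = cand, cf
            return best, best_f

        sphere = lambda v: sum(t * t for t in v)
        print(chaotic_local_search([0.8, -0.5], sphere, [(-1, 1), (-1, 1)]))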

  17. Swarm, genetic and evolutionary programming algorithms applied to multiuser detection

    Directory of Open Access Journals (Sweden)

    Paul Jean Etienne Jeszensky

    2005-02-01

    Full Text Available In this paper, the particle swarm optimization technique, recently published in the literature, applied to Direct Sequence/Code Division Multiple Access (DS/CDMA) systems with multiuser detection (MuD) is analyzed, evaluated and compared. The efficiency of the Swarm algorithm when applied to DS-CDMA multiuser detection (Swarm-MuD) is compared through the tradeoff between performance and computational complexity, with the complexity expressed in terms of the number of operations necessary to reach the performance obtained through the optimum detector, i.e., the Maximum Likelihood (ML) detector. The comparison is carried out among the genetic algorithm, evolutionary programming with cloning, and the Swarm algorithm under the same simulation basis. Additionally, a heuristics-based MuD complexity analysis through the number of computational operations is proposed. Finally, an analysis is carried out of the input parameters of the Swarm algorithm in an attempt to find the optimum (or near-optimum) parameters for the algorithm applied to the MuD problem.

  18. Research and Setting the Modified Algorithm "Predator-Prey" in the Problem of the Multi-Objective Optimization

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2016-01-01

    Full Text Available We consider a class of algorithms for multi-objective optimization - Pareto-approximation algorithms, which presuppose a preliminary construction of a finite-dimensional approximation of the Pareto set, and thereby also of the Pareto front of the problem. The article gives an overview of population and non-population Pareto-approximation algorithms, identifies their strengths and weaknesses, and presents the canonical "predator-prey" algorithm, showing its shortcomings. We offer a number of modifications of the canonical "predator-prey" algorithm with the aim of overcoming its drawbacks, and present the results of a broad study of the efficiency of these modifications. A peculiarity of the study is the use of quality indicators of the Pareto-approximation that previous publications have not used. In addition, we present the results of meta-optimization of the modified algorithm, i.e. determining the optimal values of some of its free parameters. The study of the efficiency of the modified "predator-prey" algorithm has shown that the proposed modifications improve the following indicators of the basic algorithm: the cardinality of the set of archive solutions, the uniformity of the archive solutions, and the computation time. By and large, the research results have shown that the modified and meta-optimized algorithm achieves the same quality of approximation as the basic algorithm, but with the number of preys reduced by an order of magnitude. Computational costs are reduced proportionally.

  19. Virus evolutionary genetic algorithm for task collaboration of logistics distribution

    Science.gov (United States)

    Ning, Fanghua; Chen, Zichen; Xiong, Li

    2005-12-01

    In order to achieve JIT (Just-In-Time) performance and maximum client satisfaction in logistics collaboration, a Virus Evolutionary Genetic Algorithm (VEGA) was put forward under the double constraints of logistics resources and operation sequence. Based on a mathematical description of the multi-objective function, the algorithm was designed to schedule logistics tasks with different due dates and allocate them to network members. By introducing a penalty term, makespan and customer satisfaction were expressed in the fitness function, and a dynamic adaptive probability of infection was used to improve the performance of the local search. Compared to a standard Genetic Algorithm (GA), the experimental results illustrate the performance superiority of VEGA. The VEGA can thus provide a powerful decision-making technique for optimizing resource configuration in a logistics network.

  20. Pareto-optimal estimates that constrain mean California precipitation change

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-12-01

    Global climate model (GCM) projections of greenhouse gas-induced precipitation change can exhibit notable uncertainty at the regional scale, particularly in regions where the mean change is small compared to internal variability. This is especially true for California, which is located in a transition zone between robust precipitation increases to the north and decreases to the south, and where GCMs from the Climate Model Intercomparison Project phase 5 (CMIP5) archive show no consensus on mean change (in either magnitude or sign) across the central and southern parts of the state. With the goal of constraining this uncertainty, we apply a multiobjective approach to a large set of subensembles (subsets of models from the full CMIP5 ensemble). These constraints are based on subensemble performance in three fields important to California precipitation: tropical Pacific sea surface temperatures, upper-level zonal winds in the midlatitude Pacific, and precipitation over the state. An evolutionary algorithm is used to sort through and identify the set of Pareto-optimal subensembles across these three measures in the historical climatology, and we use this information to constrain end-of-century California wet season precipitation change. This technique narrows the range of projections throughout the state and increases confidence in estimates of positive mean change. Furthermore, these methods complement and generalize emergent constraint approaches that aim to restrict uncertainty in end-of-century projections, and they have applications to even broader aspects of uncertainty quantification, including parameter sensitivity and model calibration.

  1. Pareto fronts in clinical practice for pinnacle.

    Science.gov (United States)

    Janssen, Tomas; van Kesteren, Zdenko; Franssen, Gijs; Damen, Eugène; van Vliet, Corine

    2013-03-01

    Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. To generate the Pareto fronts, we used the native scripting language of Pinnacle(3) (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 hours for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI(95%)) by 0.02 (P=.005), and the rectal wall V(65 Gy) by 1.1% (P=.008). We showed the feasibility of automatically generating Pareto fronts with Pinnacle(3). Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Pareto Fronts in Clinical Practice for Pinnacle

    International Nuclear Information System (INIS)

    Janssen, Tomas; Kesteren, Zdenko van; Franssen, Gijs; Damen, Eugène; Vliet, Corine van

    2013-01-01

    Purpose: Our aim was to develop a framework to objectively perform treatment planning studies using Pareto fronts. The Pareto front represents all optimal possible tradeoffs among several conflicting criteria and is an ideal tool with which to study the possibilities of a given treatment technique. The framework should require minimal user interaction and should resemble and be applicable to daily clinical practice. Methods and Materials: To generate the Pareto fronts, we used the native scripting language of Pinnacle 3 (Philips Healthcare, Andover, MA). The framework generates thousands of plans automatically from which the Pareto front is generated. As an example, the framework is applied to compare intensity modulated radiation therapy (IMRT) with volumetric modulated arc therapy (VMAT) for prostate cancer patients. For each patient and each technique, 3000 plans are generated, resulting in a total of 60,000 plans. The comparison is based on 5-dimensional Pareto fronts. Results: Generating 3000 plans for 10 patients in parallel requires on average 96 h for IMRT and 483 hours for VMAT. Using VMAT, compared to IMRT, the maximum dose of the boost PTV was reduced by 0.4 Gy (P=.074), the mean dose in the anal sphincter by 1.6 Gy (P=.055), the conformity index of the 95% isodose (CI 95% ) by 0.02 (P=.005), and the rectal wall V 65 Gy by 1.1% (P=.008). Conclusions: We showed the feasibility of automatically generating Pareto fronts with Pinnacle 3 . Pareto fronts provide a valuable tool for performing objective comparative treatment planning studies. We compared VMAT with IMRT in prostate patients and found VMAT had a dosimetric advantage over IMRT

  3. Self-organized modularization in evolutionary algorithms.

    Science.gov (United States)

    Dauscher, Peter; Uthmann, Thomas

    2005-01-01

    The principle of modularization has proven to be extremely successful in the field of technical applications and particularly for Software Engineering purposes. The question to be answered within the present article is whether mechanisms can also be identified within the framework of Evolutionary Computation that cause a modularization of solutions. We will concentrate on processes, where modularization results only from the typical evolutionary operators, i.e. selection and variation by recombination and mutation (and not, e.g., from special modularization operators). This is what we call Self-Organized Modularization. Based on a combination of two formalizations by Radcliffe and Altenberg, some quantitative measures of modularity are introduced. Particularly, we distinguish Built-in Modularity as an inherent property of a genotype and Effective Modularity, which depends on the rest of the population. These measures can easily be applied to a wide range of present Evolutionary Computation models. It will be shown, both theoretically and by simulation, that under certain conditions, Effective Modularity (as defined within this paper) can be a selection factor. This causes Self-Organized Modularization to take place. The experimental observations emphasize the importance of Effective Modularity in comparison with Built-in Modularity. Although the experimental results have been obtained using a minimalist toy model, they can lead to a number of consequences for existing models as well as for future approaches. Furthermore, the results suggest a complex self-amplification of highly modular equivalence classes in the case of respected relations. Since the well-known Holland schemata are just the equivalence classes of respected relations in most Simple Genetic Algorithms, this observation emphasizes the role of schemata as Building Blocks (in comparison with arbitrary subsets of the search space).

  4. A Comparison of Evolutionary Algorithms for Tracking Time-Varying Recursive Systems

    Directory of Open Access Journals (Sweden)

    White Michael S

    2003-01-01

    Full Text Available A comparison is made of the behaviour of some evolutionary algorithms in time-varying adaptive recursive filter systems. Simulations show that an algorithm including random immigrants outperforms a more conventional algorithm using the breeder genetic algorithm as the mutation operator when the time variation is discontinuous, but neither algorithm performs well when the time variation is rapid but smooth. To meet this deficit, a new hybrid algorithm is introduced which uses a hill climber as an additional genetic operator, applied for several steps at each generation. A comparison is made of the effect of applying the hill-climbing operator a few times to all members of the population or a larger number of times solely to the best individual; it is found that applying it to the whole population yields the better results, substantially improved compared with those obtained using earlier methods.
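
    A minimal sketch of using a hill climber as an additional genetic operator, applied either a few times to every member of the population or many times to the best individual only, is given below; the real-coded representation, step sizes, and toy error function are assumptions, not the filter model used in the paper.

        import random

        def hill_climb(ind, fitness, steps=5, step_size=0.05, rng=random):
            """Greedy hill climber used as an extra genetic operator: nudge one
            coefficient at a time and keep the change only if fitness improves
            (here, lower error = better)."""
            best, best_f = list(ind), fitness(ind)
            for _ in range(steps):
                cand = list(best)
                i = rng.randrange(len(cand))
                cand[i] += rng.gauss(0.0, step_size)
                cf = fitness(cand)
                if cf < best_f:
                    best, best_f = cand, cf
            return best

        def apply_hill_climbing(population, fitness, whole_population=True, steps=5):
            """Apply the operator either to every member (a few steps each) or, more
            cheaply, many steps to the current best individual only."""
            if whole_population:
                return [hill_climb(ind, fitness, steps) for ind in population]
            best = min(population, key=fitness)
            polished = hill_climb(best, fitness, steps * len(population))
            return [polished if ind is best else ind for ind in population]

        # Toy fitness: squared error of filter coefficients against a hidden target.
        target = [0.5, -0.3, 0.8]
        err = lambda w: sum((a - b) ** 2 for a, b in zip(w, target))
        pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(6)]
        pop = apply_hill_climbing(pop, err, whole_population=True)
        print(min(err(ind) for ind in pop))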

  5. A Novel Evolutionary Algorithm for Designing Robust Analog Filters

    Directory of Open Access Journals (Sweden)

    Shaobo Li

    2018-03-01

    Full Text Available Designing robust circuits that withstand environmental perturbation and device degradation is critical for many applications. Traditional robust circuit design is mainly done by tuning parameters to improve system robustness. However, the topological structure of a system may set a limit on the robustness achievable through parameter tuning. This paper proposes a new evolutionary algorithm for robust design that exploits the open-ended topological search capability of genetic programming (GP coupled with bond graph modeling. We applied our GP-based robust design (GPRD algorithm to evolve robust lowpass and highpass analog filters. Compared with a traditional robust design approach based on a state-of-the-art real-parameter genetic algorithm (GA, our GPRD algorithm with a fitness criterion rewarding robustness, with respect to parameter perturbations, can evolve more robust filters than what was achieved through parameter tuning alone. We also find that inappropriate GA tuning may mislead the search process and that multiple-simulation and perturbed fitness evaluation methods for evolving robustness have complementary behaviors with no absolute advantage of one over the other.
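
    The perturbed fitness evaluation mentioned above can be sketched generically as follows: a candidate is scored by its worst performance over randomly perturbed copies of its parameters, so robustness enters the selection pressure directly. The response function, perturbation level, and candidate values are assumptions, and the bond-graph and GP specifics of the paper are omitted.

        import random

        def robust_fitness(params, performance, n_perturb=10, rel_sd=0.05):
            """Perturbed fitness evaluation: score a candidate by its worst performance
            over randomly perturbed parameter sets (lower = better), so that selection
            rewards robustness rather than nominal behaviour only."""
            worst = performance(params)
            for _ in range(n_perturb):
                noisy = [p * (1.0 + random.gauss(0.0, rel_sd)) for p in params]
                worst = max(worst, performance(noisy))
            return worst

        # Toy "performance": deviation of an invented response from a target value.
        target = 1.0
        response = lambda p: abs(p[0] * p[1] + 0.5 * p[2] - target)
        candidate = [0.8, 1.1, 0.3]
        print("nominal:", response(candidate),
              "robust score:", robust_fitness(candidate, response))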

  6. A Guiding Evolutionary Algorithm with Greedy Strategy for Global Optimization Problems

    Directory of Open Access Journals (Sweden)

    Leilei Cao

    2016-01-01

    Full Text Available A Guiding Evolutionary Algorithm (GEA) with a greedy strategy for global optimization problems is proposed. Inspired by Particle Swarm Optimization, the Genetic Algorithm, and the Bat Algorithm, the GEA was designed to retain some advantages of each method while avoiding some of their disadvantages. In contrast to the usual Genetic Algorithm, each individual in the GEA is crossed with the current global best individual instead of a randomly selected one. The current best individual serves as a guide that attracts offspring to its region of genotype space. Mutation is added to offspring according to a dynamic mutation probability. To increase the capability of exploitation, a local search mechanism is applied to new individuals according to a dynamic probability of local search. Experimental results show that the GEA outperformed the other three typical global optimization algorithms with which it was compared.
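
    The guiding idea (crossover with the current global best plus a decaying mutation probability) can be sketched roughly as follows; the crossover form, mutation schedule, bounds, and test function are assumptions for illustration, and the local search step is omitted.

        import random

        def guided_generation(population, fitness, t, t_max, lo=-5.0, hi=5.0):
            """One generation of the guiding idea: every individual is crossed with the
            current global best (not a random mate), and mutation is applied with a
            probability that decays over time.  Minimisation is assumed."""
            best = min(population, key=fitness)
            p_mut = 0.3 * (1.0 - t / t_max) + 0.01          # assumed decaying schedule
            offspring = []
            for ind in population:
                child = [b if random.random() < 0.5 else x   # uniform crossover with best
                         for x, b in zip(ind, best)]
                child = [min(hi, max(lo, g + random.gauss(0.0, 0.1)))
                         if random.random() < p_mut else g for g in child]
                offspring.append(child)
            # elitist replacement: keep the better of parent and child pairwise
            return [c if fitness(c) < fitness(p) else p for p, c in zip(population, offspring)]

        sphere = lambda v: sum(x * x for x in v)
        pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
        for t in range(100):
            pop = guided_generation(pop, sphere, t, 100)
        print(min(sphere(ind) for ind in pop))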

  7. Comparing the Robustness of Evolutionary Algorithms on the Basis of Benchmark Functions

    Directory of Open Access Journals (Sweden)

    DENIZ ULKER, E.

    2013-05-01

    Full Text Available In real-world optimization problems, even though the solution quality is of great importance, the robustness of the solution is also an important aspect. This paper investigates how sensitive optimization algorithms are to variations of their control parameters and to the random initialization of the solution set for fixed control parameters. The comparison covers three well-known evolutionary algorithms: the Particle Swarm Optimization (PSO) algorithm, the Differential Evolution (DE) algorithm, and the Harmony Search (HS) algorithm. Various benchmark functions with different characteristics are used for the evaluation of these algorithms. The experimental results show that the solution quality of the algorithms is not directly related to their robustness. In particular, an algorithm that is highly robust can have a low solution quality, and an algorithm that has a high solution quality can be quite sensitive to parameter variations.

  8. An approach to multiobjective optimization of rotational therapy. II. Pareto optimal surfaces and linear combinations of modulated blocked arcs for a prostate geometry.

    Science.gov (United States)

    Pardo-Montero, Juan; Fenwick, John D

    2010-06-01

    The purpose of this work is twofold: To further develop an approach to multiobjective optimization of rotational therapy treatments recently introduced by the authors [J. Pardo-Montero and J. D. Fenwick, "An approach to multiobjective optimization of rotational therapy," Med. Phys. 36, 3292-3303 (2009)], especially regarding its application to realistic geometries, and to study the quality (Pareto optimality) of plans obtained using such an approach by comparing them with Pareto optimal plans obtained through inverse planning. In the previous work of the authors, a methodology is proposed for constructing a large number of plans, with different compromises between the objectives involved, from a small number of geometrically based arcs, each arc prioritizing different objectives. Here, this method has been further developed and studied. Two different techniques for constructing these arcs are investigated, one based on image-reconstruction algorithms and the other based on more common gradient-descent algorithms. The difficulty of dealing with organs abutting the target, briefly reported in previous work of the authors, has been investigated using partial OAR unblocking. Optimality of the solutions has been investigated by comparison with a Pareto front obtained from inverse planning. A relative Euclidean distance has been used to measure the distance of these plans to the Pareto front, and dose volume histogram comparisons have been used to gauge the clinical impact of these distances. A prostate geometry has been used for the study. For geometries where a blocked OAR abuts the target, moderate OAR unblocking can substantially improve target dose distribution and minimize hot spots while not overly compromising dose sparing of the organ. Image-reconstruction type and gradient-descent blocked-arc computations generate similar results. The Pareto front for the prostate geometry, reconstructed using a large number of inverse plans, presents a hockey-stick shape

  9. A Multiagent Evolutionary Algorithm for the Resource-Constrained Project Portfolio Selection and Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Yongyi Shou

    2014-01-01

    Full Text Available A multiagent evolutionary algorithm is proposed to solve the resource-constrained project portfolio selection and scheduling problem. The proposed algorithm has a dual level structure. In the upper level a set of agents make decisions to select appropriate project portfolios. Each agent selects its project portfolio independently. The neighborhood competition operator and self-learning operator are designed to improve the agent’s energy, that is, the portfolio profit. In the lower level the selected projects are scheduled simultaneously and completion times are computed to estimate the expected portfolio profit. A priority rule-based heuristic is used by each agent to solve the multiproject scheduling problem. A set of instances were generated systematically from the widely used Patterson set. Computational experiments confirmed that the proposed evolutionary algorithm is effective for the resource-constrained project portfolio selection and scheduling problem.

  10. Multi-objective optimization of HVAC system with an evolutionary computation algorithm

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Tang, Fan; Xu, Guanglin

    2011-01-01

    A data-mining approach for the optimization of a HVAC (heating, ventilation, and air conditioning) system is presented. A predictive model of the HVAC system is derived by data-mining algorithms, using a dataset collected from an experiment conducted at a research facility. To minimize the energy while maintaining the corresponding IAQ (indoor air quality) within a user-defined range, a multi-objective optimization model is developed. The solutions of this model are set points of the control system derived with an evolutionary computation algorithm. The controllable input variables - supply air temperature and supply air duct static pressure set points - are generated to reduce the energy use. The results produced by the evolutionary computation algorithm show that the control strategy saves energy by optimizing operations of an HVAC system. -- Highlights: → A data-mining approach for the optimization of a heating, ventilation, and air conditioning (HVAC) system is presented. → The data used in the project has been collected from an experiment conducted at an energy research facility. → The approach presented in the paper leads to accomplishing significant energy savings without compromising the indoor air quality. → The energy savings are accomplished by computing set points for the supply air temperature and the supply air duct static pressure.

  11. Academic Training: Evolutionary Heuristic Optimization: Genetic Algorithms and Estimation of Distribution Algorithms - Lecture serie

    CERN Multimedia

    Françoise Benz

    2004-01-01

    ENSEIGNEMENT ACADEMIQUE ACADEMIC TRAINING Françoise Benz 73127 academic.training@cern.ch ACADEMIC TRAINING LECTURE REGULAR PROGRAMME 1, 2, 3 and 4 June From 11:00 hrs to 12:00 hrs - Main Auditorium bldg. 500 Evolutionary Heuristic Optimization: Genetic Algorithms and Estimation of Distribution Algorithms V. Robles Forcada and M. Perez Hernandez / Univ. de Madrid, Spain In the real world, there exist a huge number of problems that require getting an optimum or near-to-optimum solution. Optimization can be used to solve a lot of different problems such as network design, sets and partitions, storage and retrieval or scheduling. On the other hand, in nature, there exist many processes that seek a stable state. These processes can be seen as natural optimization processes. Over the last 30 years several attempts have been made to develop optimization algorithms, which simulate these natural optimization processes. These attempts have resulted in methods such as Simulated Annealing, based on nat...

  12. An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints

    Directory of Open Access Journals (Sweden)

    Jinmo Sung

    2014-01-01

    Full Text Available The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments.

  13. Kullback-Leibler divergence and the Pareto-Exponential approximation.

    Science.gov (United States)

    Weinberg, G V

    2016-01-01

    Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model assumption, simpler radar detection schemes can be applied. An information theory approach is introduced to investigate the Pareto-Exponential approximation. By analysing the Kullback-Leibler divergence between the two distributions it is possible to not only assess when the approximation is valid, but to determine, for a given Pareto model, the optimal Exponential approximation.
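
    The Kullback-Leibler comparison can be reproduced numerically along the following lines, here for a Pareto Type II (Lomax) model against a moment-matched Exponential; SciPy is assumed to be available, the shape and scale values are illustrative only, and the closed-form expressions of the paper are not used.

        import numpy as np
        from scipy import stats, integrate

        def kl_pareto_exponential(shape, scale):
            """Numerically evaluate KL(Pareto || Exponential) for a Pareto Type II
            (Lomax) clutter model against an Exponential with the same mean.
            Small values indicate the simpler Exponential model is an adequate
            approximation."""
            pareto = stats.lomax(shape, scale=scale)
            expo = stats.expon(scale=pareto.mean())        # moment-matched Exponential
            def integrand(x):
                p = pareto.pdf(x)
                return p * (pareto.logpdf(x) - expo.logpdf(x)) if p > 0 else 0.0
            val, _ = integrate.quad(integrand, 0.0, np.inf)
            return val

        # The divergence shrinks as the Pareto shape grows, i.e. the Exponential
        # approximation becomes increasingly justified.
        for shape in (3.0, 10.0, 50.0):
            print(shape, kl_pareto_exponential(shape, scale=1.0))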

  14. Comparative Study of Evolutionary Multi-objective Optimization Algorithms for a Non-linear Greenhouse Climate Control Problem

    DEFF Research Database (Denmark)

    Ghoreishi, Newsha; Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard

    2015-01-01

    Non-trivial real-world decision-making processes usually involve multiple parties having potentially conflicting interests over a set of issues. State-of-the-art multi-objective evolutionary algorithms (MOEA) are well known to solve this class of complex real-world problems. In this paper, we compare the performance of state-of-the-art multi-objective evolutionary algorithms on a non-linear multi-objective multi-issue optimisation problem found in greenhouse climate control. The algorithms chosen for the study include NSGAII, eNSGAII, eMOEA, PAES, PESAII and SPEAII. The performance of all the aforementioned algorithms is assessed and compared using performance indicators that evaluate proximity, diversity and consistency. The insights from this comparative study enhanced our understanding of MOEA performance in solving a non-linear complex climate control problem. The empirical...

  15. An Analytical Framework for Runtime of a Class of Continuous Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Yushan Zhang

    2015-01-01

    Full Text Available Although there have been many studies on the runtime of evolutionary algorithms in discrete optimization, relatively few theoretical results have been proposed for continuous optimization, such as evolutionary programming (EP). This paper proposes an analysis of the runtime of two EP algorithms based on Gaussian and Cauchy mutations, using an absorbing Markov chain. Given a constant variation, we calculate the runtime upper bounds of the special Gaussian mutation EP and the Cauchy mutation EP. Our analysis reveals that the upper bounds are affected by the number of individuals, the problem dimension n, the search range, and the Lebesgue measure of the optimal neighborhood. Furthermore, we provide conditions under which the average runtime of the considered EP can be no more than a polynomial of n. The condition is that the Lebesgue measure of the optimal neighborhood is larger than a combinatorial calculation of an exponential and the given polynomial of n.

  16. The exponentiated generalized Pareto distribution | Adeyemi | Ife ...

    African Journals Online (AJOL)

    Recently Gupta et al. (1998) introduced the exponentiated exponential distribution as a generalization of the standard exponential distribution. In this paper, we introduce a three-parameter generalized Pareto distribution, the exponentiated generalized Pareto distribution (EGP). We present a comprehensive treatment of the ...

  17. Optimal Scheduling for Retrieval Jobs in Double-Deep AS/RS by Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Kuo-Yang Wu

    2013-01-01

    Full Text Available We investigate the optimal scheduling of retrieval jobs for double-deep type Automated Storage and Retrieval Systems (AS/RS) in the Flexible Manufacturing Systems (FMS) used in modern industrial production. Three types of evolutionary algorithms, the Genetic Algorithm (GA), the Immune Genetic Algorithm (IGA), and the Particle Swarm Optimization (PSO) algorithm, are implemented to obtain the optimal assignments. The objective is to minimize the working distance travelled by the Storage and Retrieval (S/R) machine, that is, to achieve the shortest retrieval time. Simulation results and comparisons show the advantages and feasibility of the proposed methods.

  18. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    Science.gov (United States)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for simpler and more efficient models that produce better predictions. In this paper, an evolutionary framework is proposed that uses an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets, USD/CAD, USD/CHF, and USD/JPY, accumulated over the same period of time. The model performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and the particle swarm optimization algorithm. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor of currency exchange rates compared to the other models included in the study.

  19. A review and experimental study on the application of classifiers and evolutionary algorithms in EEG-based brain-machine interface systems

    Science.gov (United States)

    Tahernezhad-Javazm, Farajollah; Azimirad, Vahid; Shoaran, Maryam

    2018-04-01

    Objective. Considering the importance and the near-future development of noninvasive brain-machine interface (BMI) systems, this paper presents a comprehensive theoretical-experimental survey on the classification and evolutionary methods for BMI-based systems in which EEG signals are used. Approach. The paper is divided into two main parts. In the first part, a wide range of different types of the base and combinatorial classifiers including boosting and bagging classifiers and evolutionary algorithms are reviewed and investigated. In the second part, these classifiers and evolutionary algorithms are assessed and compared based on two types of relatively widely used BMI systems, sensory motor rhythm-BMI and event-related potentials-BMI. Moreover, in the second part, some of the improved evolutionary algorithms as well as bi-objective algorithms are experimentally assessed and compared. Main results. In this study two databases are used, and cross-validation accuracy (CVA) and stability to data volume (SDV) are considered as the evaluation criteria for the classifiers. According to the experimental results on both databases, regarding the base classifiers, linear discriminant analysis and support vector machines with respect to CVA evaluation metric, and naive Bayes with respect to SDV demonstrated the best performances. Among the combinatorial classifiers, four classifiers, Bagg-DT (bagging decision tree), LogitBoost, and GentleBoost with respect to CVA, and Bagging-LR (bagging logistic regression) and AdaBoost (adaptive boosting) with respect to SDV had the best performances. Finally, regarding the evolutionary algorithms, single-objective invasive weed optimization (IWO) and bi-objective nondominated sorting IWO algorithms demonstrated the best performances. Significance. We present a general survey on the base and the combinatorial classification methods for EEG signals (sensory motor rhythm and event-related potentials) as well as their optimization methods

  20. NSGA-II Algorithm with a Local Search Strategy for Multiobjective Optimal Design of Dry-Type Air-Core Reactor

    Directory of Open Access Journals (Sweden)

    Chengfen Zhang

    2015-01-01

    Full Text Available Dry-type air-core reactors are now widely applied in electrical power distribution systems, for which optimization of the design is a crucial issue. In the optimization design problem of a dry-type air-core reactor, the objectives of minimizing the production cost and minimizing the operation cost are both important. In this paper, a multiobjective optimization model is established that considers simultaneously the two objectives of minimizing the production cost and minimizing the operation cost. To solve the multiobjective optimization problem, a memetic evolutionary algorithm is proposed, which combines the elitist nondominated sorting genetic algorithm version II (NSGA-II) with a local search strategy based on the covariance matrix adaptation evolution strategy (CMA-ES). NSGA-II can provide the decision maker with flexible choices among the different trade-off solutions, while the local search strategy, which is applied to a given number of nondominated individuals randomly selected from the current population in each generation, can accelerate the convergence speed. Furthermore, an external archive is added to the proposed algorithm to increase the evolutionary efficiency. The proposed algorithm is tested on a dry-type air-core reactor made of rectangular cross-section litz wire. Simulation results show that the proposed algorithm has high efficiency and that it converges to a better Pareto front.
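
    The memetic step (local search applied to a few randomly chosen nondominated individuals before NSGA-II re-sorts the enlarged pool) can be sketched as below; a plain Gaussian (1+1) local search is used here as a stand-in for CMA-ES, and the bi-objective toy problem and all parameters are assumptions.

        import random

        def dominates(a, b):
            """Pareto dominance for minimisation of both objectives."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def nondominated(pop, f):
            vals = [f(p) for p in pop]
            return [p for p, v in zip(pop, vals)
                    if not any(dominates(w, v) for w in vals if w is not v)]

        def local_search(x, f, iters=30, sigma=0.05):
            """Simple Gaussian (1+1) local search standing in for the CMA-ES step:
            accept a perturbed point only if it dominates the current one."""
            best, bf = list(x), f(x)
            for _ in range(iters):
                cand = [g + random.gauss(0.0, sigma) for g in best]
                cf = f(cand)
                if dominates(cf, bf):
                    best, bf = cand, cf
            return best

        def memetic_step(pop, f, n_selected=3):
            """Polish a few randomly chosen nondominated individuals, as in the
            NSGA-II + local-search hybrid described above."""
            front = nondominated(pop, f)
            chosen = random.sample(front, min(n_selected, len(front)))
            polished = [local_search(p, f) for p in chosen]
            return pop + polished   # the enlarged pool would then be re-sorted by NSGA-II

        # Toy bi-objective problem (two costs to minimise).
        f = lambda x: (sum(g * g for g in x), sum((g - 1.0) ** 2 for g in x))
        pop = [[random.uniform(-1, 2) for _ in range(3)] for _ in range(20)]
        pop = memetic_step(pop, f)
        print(len(nondominated(pop, f)))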

  1. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For this purpose, a complex testing and analysis software suite was developed. Several conclusions and hypotheses were also drawn for further research.

  2. TopN-Pareto Front Search

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-21

    The JMP Add-In TopN-PFS provides an automated tool for finding layered Pareto fronts to identify the top N solutions from an enumerated list of candidates, subject to optimizing multiple criteria. The approach constructs N layers of Pareto fronts and then provides a suite of graphical tools to explore the alternatives based on different prioritizations of the criteria. The tool is designed to provide a set of alternatives from which the decision-maker can select the best option for their study goals.
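
    The layered-front construction can be sketched by repeatedly peeling nondominated candidates off an enumerated list, roughly as below; this is a generic illustration, not the JMP add-in's implementation, and the candidate scores are made up.

        def dominates(a, b):
            """a dominates b when a is at least as good in every criterion and strictly
            better in at least one (all criteria to be minimised)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def layered_pareto_fronts(candidates, n_layers):
            """Peel off successive Pareto fronts from an enumerated candidate list until
            n_layers fronts have been extracted (the 'top N' alternatives)."""
            remaining = list(candidates)
            layers = []
            while remaining and len(layers) < n_layers:
                front = [c for c in remaining
                         if not any(dominates(o, c) for o in remaining if o is not c)]
                layers.append(front)
                remaining = [c for c in remaining if c not in front]
            return layers

        # Illustrative candidates scored on two criteria (cost, risk), both minimised.
        cands = [(1, 9), (2, 7), (3, 8), (4, 4), (5, 5), (6, 2), (7, 3), (8, 1)]
        for i, layer in enumerate(layered_pareto_fronts(cands, 2), start=1):
            print(f"front {i}: {sorted(layer)}")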

  3. Determination of Pareto frontier in multi-objective maintenance optimization

    International Nuclear Information System (INIS)

    Certa, Antonella; Galante, Giacomo; Lupo, Toni; Passannanti, Gianfranco

    2011-01-01

    The objective of a maintenance policy is generally the minimization of the global maintenance cost, which involves not only the direct costs of the maintenance actions and the spare parts, but also those due to system stops for preventive maintenance and downtime after failure. For some operating systems, failure can be dangerous, so they are required to operate with a very high reliability level between two consecutive fixed stops. The present paper attempts to identify the set of elements on which to perform maintenance actions so that the system can assure the required reliability level until the next fixed stop for maintenance, while minimizing both the global maintenance cost and the total maintenance time. In order to solve this constrained multi-objective optimization problem, an effective approach is proposed to obtain the best solutions (that is, the Pareto optimal frontier) among which the decision maker will choose the most suitable one. As is well known, describing the whole Pareto optimal frontier is generally a troublesome task. The paper proposes an algorithm able to rapidly overcome this problem, and its effectiveness is shown by an application to a case study regarding a complex series-parallel system.

  4. Mapping the Pareto optimal design space for a functionally deimmunized biotherapeutic candidate.

    Science.gov (United States)

    Salvat, Regina S; Parker, Andrew S; Choi, Yoonjoo; Bailey-Kellogg, Chris; Griswold, Karl E

    2015-01-01

    The immunogenicity of biotherapeutics can bottleneck development pipelines and poses a barrier to widespread clinical application. As a result, there is a growing need for improved deimmunization technologies. We have recently described algorithms that simultaneously optimize proteins for both reduced T cell epitope content and high-level function. In silico analysis of this dual objective design space reveals that there is no single global optimum with respect to protein deimmunization. Instead, mutagenic epitope deletion yields a spectrum of designs that exhibit tradeoffs between immunogenic potential and molecular function. The leading edge of this design space is the Pareto frontier, i.e. the undominated variants for which no other single design exhibits better performance in both criteria. Here, the Pareto frontier of a therapeutic enzyme has been designed, constructed, and evaluated experimentally. Various measures of protein performance were found to map a functional sequence space that correlated well with computational predictions. These results represent the first systematic and rigorous assessment of the functional penalty that must be paid for pursuing progressively more deimmunized biotherapeutic candidates. Given this capacity to rapidly assess and design for tradeoffs between protein immunogenicity and functionality, these algorithms may prove useful in augmenting, accelerating, and de-risking experimental deimmunization efforts.

  5. Pareto optimality in organelle energy metabolism analysis.

    Science.gov (United States)

    Angione, Claudio; Carapezza, Giovanni; Costanza, Jole; Lió, Pietro; Nicosia, Giuseppe

    2013-01-01

    In low and high eukaryotes, energy is collected or transformed in compartments, the organelles. The rich variety of size, characteristics, and density of the organelles makes it difficult to build a general picture. In this paper, we make use of the Pareto-front analysis to investigate the optimization of energy metabolism in mitochondria and chloroplasts. Using the Pareto optimality principle, we compare models of organelle metabolism on the basis of single- and multiobjective optimization, approximation techniques (the Bayesian Automatic Relevance Determination), robustness, and pathway sensitivity analysis. Finally, we report the first analysis of the metabolic model for the hydrogenosome of Trichomonas vaginalis, which is found in several protozoan parasites. Our analysis has shown the importance of the Pareto optimality for such comparison and for insights into the evolution of the metabolism from cytoplasmic to organelle bound, involving a model order reduction. We report that Pareto fronts represent an asymptotic analysis useful to describe the metabolism of an organism aimed at maximizing concurrently two or more metabolite concentrations.

  6. Multi objective optimization of horizontal axis tidal current turbines, using Meta heuristics algorithms

    International Nuclear Information System (INIS)

    Tahani, Mojtaba; Babayan, Narek; Astaraei, Fatemeh Razi; Moghadam, Ali

    2015-01-01

    Highlights: • The performance of four different Meta heuristic optimization algorithms was studied. • Power coefficient and produced torque on stationary blade were selected as objective functions. • Chord and twist distributions were selected as decision variables. • All optimization algorithms were combined with blade element momentum theory. • The best Pareto front was obtained by multi objective flower pollination algorithm for HATCTs. - Abstract: The performance of horizontal axis tidal current turbines (HATCT) strongly depends on their geometry. According to this fact, the optimum performance will be achieved by optimized geometry. In this research study, the multi objective optimization of the HATCT is carried out by using four different multi objective optimization algorithms and their performance is evaluated in combination with blade element momentum theory (BEM). The second version of non-dominated sorting genetic algorithm (NSGA-II), multi objective particle swarm optimization algorithm (MOPSO), multi objective cuckoo search algorithm (MOCS) and multi objective flower pollination algorithm (MOFPA) are the selected algorithms. The power coefficient and the produced torque on stationary blade are selected as objective functions and chord and twist distributions along the blade span are selected as decision variables. These algorithms are combined with the blade element momentum (BEM) theory for the purpose of achieving the best Pareto front. The obtained Pareto fronts are compared with each other. Different sets of experiments are carried out by considering different numbers of iterations, population size and tip speed ratios. The Pareto fronts which are achieved by MOFPA and NSGA-II have better quality in comparison to MOCS and MOPSO, but on the other hand a detail comparison between the first fronts of MOFPA and NSGA-II indicated that MOFPA algorithm can obtain the best Pareto front and can maximize the power coefficient up to 4.3% and the

  7. Robust Design in Multiobjective Systems using Taguchi’s Parameter Design Approach and a Pareto Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Enrique Canessa

    2014-01-01

    Full Text Available A Pareto Genetic Algorithm (AGP, by its Spanish acronym) is presented that finds the Pareto frontier in robust design problems for multiobjective systems. The AGP was designed to be applied with Taguchi's Parameter Design method, which is the method most frequently employed by practitioners to carry out robust design. The AGP was tested with data obtained from a real single-response system and from a multiobjective process simulator with many control and noise factors. In all cases, the AGP delivered optimal solutions that meet the objectives of robust design. Moreover, the discussion of results shows that having such solutions helps in selecting the best ones to implement in the system under study, especially when the system has many control factors and outputs.

  8. Synthesis of Steered Flat-top Beam Pattern Using Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    D. Mandal

    2016-12-01

    Full Text Available In this paper a pattern synthesis method based on an evolutionary algorithm is presented. A flat-top beam pattern is generated from a concentric ring array of isotropic elements by finding the optimum set of element amplitudes and phases using the Differential Evolution algorithm. The pattern is generated in three predefined azimuth planes instead of a single phi plane and is also verified over a range of azimuth planes for the same optimum excitations. The main beam is steered to an elevation angle of 30 degrees with low peak SLL and ripple. The dynamic range ratio (DRR) is also improved by eliminating the weakly excited array elements, which simplifies the design of the feed networks.
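
    A bare-bones DE/rand/1/bin loop of the kind used for such excitation synthesis is sketched below; in the synthesis problem the decision vector would hold the element amplitudes and phases and the cost would be computed from the array factor (ripple, sidelobe level), but here a toy cost and arbitrary control parameters are used.

        import random

        def differential_evolution(cost, bounds, pop_size=30, gens=200, F=0.6, CR=0.9):
            """Plain DE/rand/1/bin minimiser over box-constrained real variables."""
            dim = len(bounds)
            pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
            fit = [cost(ind) for ind in pop]
            for _ in range(gens):
                for i in range(pop_size):
                    a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                    jr = random.randrange(dim)
                    trial = []
                    for j in range(dim):
                        if random.random() < CR or j == jr:
                            v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                            lo, hi = bounds[j]
                            trial.append(min(hi, max(lo, v)))
                        else:
                            trial.append(pop[i][j])
                    tf = cost(trial)
                    if tf <= fit[i]:
                        pop[i], fit[i] = trial, tf
                # (elements whose amplitude falls below a threshold could be dropped
                #  here to improve the dynamic range ratio, as the abstract suggests)
            best = min(range(pop_size), key=lambda i: fit[i])
            return pop[best], fit[best]

        toy_cost = lambda x: sum((xi - 0.5) ** 2 for xi in x)
        print(differential_evolution(toy_cost, [(0.0, 1.0)] * 8, gens=100))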

  9. A Pareto scale-inflated outlier model and its Bayesian analysis

    OpenAIRE

    Scollnik, David P. M.

    2016-01-01

    This paper develops a Pareto scale-inflated outlier model. This model is intended for use when data from some standard Pareto distribution of interest is suspected to have been contaminated with a relatively small number of outliers from a Pareto distribution with the same shape parameter but with an inflated scale parameter. The Bayesian analysis of this Pareto scale-inflated outlier model is considered and its implementation using the Gibbs sampler is discussed. The paper contains three wor...

  10. Finding the Pareto Optimal Equitable Allocation of Homogeneous Divisible Goods Among Three Players

    Directory of Open Access Journals (Sweden)

    Marco Dall'Aglio

    2017-01-01

    Full Text Available We consider the allocation of a finite number of homogeneous divisible items among three players. Under the assumption that each player assigns a positive value to every item, we develop a simple algorithm that returns a Pareto optimal and equitable allocation. This is based on the tight relationship between two geometric objects of fair division: The Individual Pieces Set (IPS and the Radon-Nykodim Set (RNS. The algorithm can be considered as an extension of the Adjusted Winner procedure by Brams and Taylor to the three-player case, without the guarantee of envy-freeness. (original abstract

  11. An evolutionary algorithm technique for intelligence, surveillance, and reconnaissance plan optimization

    Science.gov (United States)

    Langton, John T.; Caroli, Joseph A.; Rosenberg, Brad

    2008-04-01

    To support an Effects Based Approach to Operations (EBAO), Intelligence, Surveillance, and Reconnaissance (ISR) planners must optimize collection plans within an evolving battlespace. A need exists for a decision support tool that allows ISR planners to rapidly generate and rehearse high-performing ISR plans that balance multiple objectives and constraints to address dynamic collection requirements for assessment. To meet this need we have designed an evolutionary algorithm (EA)-based "Integrated ISR Plan Analysis and Rehearsal System" (I2PARS) to support Effects-based Assessment (EBA). I2PARS supports ISR mission planning and dynamic replanning to coordinate assets and optimize their routes, allocation and tasking. It uses an evolutionary algorithm to address the large parametric space of route-finding problems which is sometimes discontinuous in the ISR domain because of conflicting objectives such as minimizing asset utilization yet maximizing ISR coverage. EAs are uniquely suited for generating solutions in dynamic environments and also allow user feedback. They are therefore ideal for "streaming optimization" and dynamic replanning of ISR mission plans. I2PARS uses the Non-dominated Sorting Genetic Algorithm (NSGA-II) to automatically generate a diverse set of high performing collection plans given multiple objectives, constraints, and assets. Intended end users of I2PARS include ISR planners in the Combined Air Operations Centers and Joint Intelligence Centers. Here we show the feasibility of applying the NSGA-II algorithm and EAs in general to the ISR planning domain. Unique genetic representations and operators for optimization within the ISR domain are presented along with multi-objective optimization criteria for ISR planning. Promising results of the I2PARS architecture design, early software prototype, and limited domain testing of the new algorithm are discussed. We also present plans for future research and development, as well as technology

  12. A new ARMAX model based on evolutionary algorithm and particle swarm optimization for short-term load forecasting

    International Nuclear Information System (INIS)

    Wang, Bo; Tai, Neng-ling; Zhai, Hai-qing; Ye, Jian; Zhu, Jia-dong; Qi, Liang-bo

    2008-01-01

    In this paper, a new ARMAX model based on an evolutionary algorithm and particle swarm optimization for short-term load forecasting is proposed. Auto-regressive (AR) and moving average (MA) models with exogenous variables (ARMAX) have been widely applied in the load forecasting area. Because of the nonlinear characteristics of power system loads, the forecasting function has many local optima, and traditional gradient-based search methods may become trapped in them, leading to large errors. The hybrid method based on an evolutionary algorithm and particle swarm optimization solves this problem more efficiently than the traditional approaches: it takes advantage of an evolutionary strategy to speed up the convergence of particle swarm optimization (PSO), and applies the crossover operation of genetic algorithms to enhance the global search ability. The new ARMAX model for short-term load forecasting has been tested on load data from the Eastern China electricity market, and the results indicate that the proposed approach achieves good accuracy. (author)
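
    The hybrid scheme sketched in this abstract can be illustrated in a few lines: standard PSO velocity and position updates, with a GA-style crossover between randomly paired personal bests injected each generation to widen the global search. This is a minimal sketch only; the quadratic placeholder objective, parameter bounds and operator settings below are illustrative assumptions, not the authors' ARMAX implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitting_error(theta):
    """Placeholder objective: in practice this would be the forecast error of the
    ARMAX model parameterized by theta on historical load data (assumption)."""
    return float(np.sum((theta - np.linspace(-1.0, 1.0, theta.size)) ** 2))

def hybrid_pso(dim=6, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5, pc=0.3):
    x = rng.uniform(-5, 5, (swarm, dim))          # candidate parameter vectors
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # personal bests
    pbest_f = np.apply_along_axis(fitting_error, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()      # global best

    for _ in range(iters):
        # standard PSO update
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v

        # GA-style arithmetic crossover between random pairs of personal bests,
        # injected as replacement candidates (the global-search ingredient)
        for _ in range(int(pc * swarm)):
            i, j = rng.integers(swarm, size=2)
            child = rng.random() * pbest[i] + (1 - rng.random()) * pbest[j]
            k = rng.integers(swarm)
            if fitting_error(child) < fitting_error(x[k]):
                x[k] = child

        f = np.apply_along_axis(fitting_error, 1, x)
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, float(pbest_f.min())

best_theta, best_err = hybrid_pso()
print(best_theta, best_err)
```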

  13. Multi-objective optimization with estimation of distribution algorithm in a noisy environment.

    Science.gov (United States)

    Shim, Vui Ann; Tan, Kay Chen; Chia, Jun Yong; Al Mamun, Abdullah

    2013-01-01

    Many real-world optimization problems are subjected to uncertainties that may be characterized by the presence of noise in the objective functions. The estimation of distribution algorithm (EDA), which models the global distribution of the population for searching tasks, is one of the evolutionary computation techniques that deals with noisy information. This paper studies the potential of EDAs, particularly an EDA based on restricted Boltzmann machines that handles multi-objective optimization problems in a noisy environment. Noise is introduced to the objective functions in the form of a Gaussian distribution. In order to reduce the detrimental effect of noise, a likelihood correction feature is proposed to tune the marginal probability distribution of each decision variable. The EDA is subsequently hybridized with a particle swarm optimization algorithm in a discrete domain to improve its search ability. The effectiveness of the proposed algorithm is examined via eight benchmark instances with different characteristics and shapes of the Pareto optimal front. The scalability, hybridization, and computational time are rigorously studied. Comparative studies show that the proposed approach outperforms other state-of-the-art algorithms.

  14. ANTQ evolutionary algorithm applied to nuclear fuel reload problem

    International Nuclear Information System (INIS)

    Machado, Liana; Schirru, Roberto

    2000-01-01

    Nuclear fuel reload optimization is an NP-complete combinatorial optimization problem whose aim is to find the fuel-rod configuration that maximizes burnup or minimizes the power peak factor. For decades this problem was solved exclusively using an expert's knowledge. Since the eighties, however, there have been efforts to automate fuel reload. The first relevant effort used Simulated Annealing, but more recent publications show the efficiency of Genetic Algorithms (GA) on this problem. Following this direction, our aim is to optimize nuclear fuel reload using Ant-Q, a reinforcement learning algorithm based on the Cellular Computing paradigm. Ant-Q's results on the Travelling Salesman Problem, which is conceptually similar to fuel reload, are better than the GA's. Ant-Q was tested on fuel reload by simulating the first-cycle in-out reload of Biblis, a 193-fuel-element PWR. Comparing Ant-Q's results with the GA's, it can be seen that even without a local heuristic the former evolutionary algorithm can be used to solve the nuclear fuel reload problem. (author)
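
    Ant-Q, as originally described by Gambardella and Dorigo, keeps a table of AQ-values that are updated with a Q-learning-like rule while artificial ants build solutions. The sketch below applies that idea to a generic permutation problem with a placeholder cost function; the parameter values, the toy cost, and the omission of the local heuristic term are assumptions for illustration and not the reload study's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8                                   # number of positions/items (toy size)
ALPHA, GAMMA, Q0, W = 0.1, 0.3, 0.9, 10.0

def tour_cost(perm):
    """Placeholder cost; a reload study would call a core simulator here."""
    return float(np.sum(np.abs(np.diff(perm))))

AQ = np.ones((N, N))                    # AQ(r, s): value of choosing item s after r

def build_tour():
    tour = [int(rng.integers(N))]
    while len(tour) < N:
        r = tour[-1]
        remaining = [s for s in range(N) if s not in tour]
        if rng.random() < Q0:           # exploitation: greedy on AQ
            s = max(remaining, key=lambda u: AQ[r, u])
        else:                           # exploration: AQ-proportional choice
            p = np.array([AQ[r, u] for u in remaining])
            s = int(rng.choice(remaining, p=p / p.sum()))
        # local (online) update with the discounted best value of the next state
        AQ[r, s] = (1 - ALPHA) * AQ[r, s] + ALPHA * GAMMA * AQ[s].max()
        tour.append(s)
    return tour

best, best_cost = None, np.inf
for _ in range(200):                    # agents and iterations collapsed for brevity
    t = build_tour()
    c = tour_cost(t)
    if c < best_cost:
        best, best_cost = t, c
    # delayed reinforcement along the best tour found so far
    for r, s in zip(best, best[1:]):
        AQ[r, s] = (1 - ALPHA) * AQ[r, s] + ALPHA * (W / best_cost + GAMMA * AQ[s].max())

print(best, best_cost)
```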

  15. An evolutionary algorithm for model selection

    Energy Technology Data Exchange (ETDEWEB)

    Bicker, Karl [CERN, Geneva (Switzerland); Chung, Suh-Urk; Friedrich, Jan; Grube, Boris; Haas, Florian; Ketzer, Bernhard; Neubert, Sebastian; Paul, Stephan; Ryabchikov, Dimitry [Technische Univ. Muenchen (Germany)

    2013-07-01

    When performing partial-wave analyses of multi-body final states, the choice of the fit model, i.e. the set of waves to be used in the fit, can significantly alter the results of the partial wave fit. Traditionally, the models were chosen based on physical arguments and by observing the changes in log-likelihood of the fits. To reduce possible bias in the model selection process, an evolutionary algorithm was developed based on a Bayesian goodness-of-fit criterion which takes into account the model complexity. Starting from systematically constructed pools of waves which contain significantly more waves than the typical fit model, the algorithm yields a model with an optimal log-likelihood and with a number of partial waves which is appropriate for the number of events in the data. Partial waves with small contributions to the total intensity are penalized and likely to be dropped during the selection process, as are models where excessive correlations between single waves occur. Due to the automated nature of the model selection, a much larger part of the model space can be explored than would be possible in a manual selection. In addition, the method makes it possible to assess the dependence of the fit result on the fit model, which is an important contribution to the systematic uncertainty.

  16. Evolutionary Pseudo-Relaxation Learning Algorithm for Bidirectional Associative Memory

    Institute of Scientific and Technical Information of China (English)

    Sheng-Zhi Du; Zeng-Qiang Chen; Zhu-Zhi Yuan

    2005-01-01

    This paper analyzes the sensitivity to noise in BAM (Bidirectional Associative Memory), and then proves that the noise immunity of BAM relates not only to the minimum absolute value of the net inputs (MAV) but also to the variance of the weights associated with the synapse connections. In fact, it is a positive monotonically increasing function of the quotient of MAV divided by the variance of the weights. Besides, the performance of the pseudo-relaxation method depends on the learning parameters (λ and ζ), but their relation is not linear, so it is hard to find a combination of λ and ζ which leads to the best BAM performance. Moreover, pseudo-relaxation is a local optimization method, so it cannot guarantee a globally optimal solution. In this paper, a novel learning algorithm, EPRBAM (evolutionary pseudo-relaxation learning algorithm for bidirectional associative memory), employing a genetic algorithm and the pseudo-relaxation method, is proposed to obtain a feasible BAM weight matrix. This algorithm uses the quotient as the fitness of each individual and employs the pseudo-relaxation method to adjust an individual solution whenever it no longer satisfies the constraining condition after a genetic operation. Experimental results show this algorithm improves the noise immunity of BAM greatly. At the same time, EPRBAM does not depend on the learning parameters and can reach the globally optimal solution.
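
    The fitness measure named in the abstract, the minimum absolute value of the net inputs (MAV) over all stored pattern pairs divided by the variance of the weights, can be written down directly. A minimal sketch follows; the bipolar toy patterns and the Hebbian starting weights are illustrative assumptions.

```python
import numpy as np

def bam_fitness(W, X, Y):
    """Fitness of a candidate BAM weight matrix W (n_y x n_x):
    quotient of the minimum absolute net input over all stored pairs (MAV)
    and the variance of the weights, per the noise-immunity result."""
    net_xy = X @ W.T                     # net inputs into the Y layer for each stored X pattern
    net_yx = Y @ W                       # net inputs into the X layer for each stored Y pattern
    mav = min(np.abs(net_xy).min(), np.abs(net_yx).min())
    return mav / np.var(W)

# toy bipolar pattern pairs (illustrative only)
X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
Y = np.array([[1, -1, 1], [-1, 1, 1]])
W = Y.T @ X                              # Hebbian starting point a GA individual might encode
print(bam_fitness(W, X, Y))
```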

  17. A Gaze-Driven Evolutionary Algorithm to Study Aesthetic Evaluation of Visual Symmetry

    Directory of Open Access Journals (Sweden)

    Alexis D. J. Makin

    2016-03-01

    Full Text Available Empirical work has shown that people like visual symmetry. We used a gaze-driven evolutionary algorithm technique to answer three questions about symmetry preference. First, do people automatically evaluate symmetry without explicit instruction? Second, is perfect symmetry the best stimulus, or do people prefer a degree of imperfection? Third, does initial preference for symmetry diminish after familiarity sets in? Stimuli were generated as phenotypes from an algorithmic genotype, with genes for symmetry (coded as deviation from a symmetrical template; deviation–symmetry, DS gene) and orientation (0° to 90°; orientation, ORI gene). An eye tracker identified phenotypes that were good at attracting and retaining the gaze of the observer. Resulting fitness scores determined the genotypes that passed to the next generation. We recorded changes to the distribution of DS and ORI genes over 20 generations. When participants looked for symmetry, there was an increase in high-symmetry genes. When participants looked for the patterns they preferred, there was a smaller increase in symmetry, indicating that people tolerated some imperfection. Conversely, there was no increase in symmetry during free viewing, and no effect of familiarity or orientation. This work demonstrates the viability of the evolutionary algorithm approach as a quantitative measure of aesthetic preference.

  18. Projections onto the Pareto surface in multicriteria radiation therapy optimization

    International Nuclear Information System (INIS)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-01-01

    Purpose: To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. Methods: The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose–volume histogram constraints are used to prevent the projection from causing a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. Results: The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose–volume histogram constraints were used. No consistent improvements in target homogeneity were observed. Conclusions: There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.

  19. Projections onto the Pareto surface in multicriteria radiation therapy optimization.

    Science.gov (United States)

    Bokrantz, Rasmus; Miettinen, Kaisa

    2015-10-01

    To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose-volume histogram constraints are used to prevent the projection from causing a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose-volume histogram constraints were used. No consistent improvements in target homogeneity were observed. There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.
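
    A minimal way to write the projection subproblem described in these two records, assuming all objectives f_i are to be minimized over the feasible set X and x^nav denotes the navigated plan (the augmented form would add the dose-volume histogram constraints to X), is

```latex
\min_{x \in X} \; \sum_{i=1}^{n} f_i(x)
\quad \text{subject to} \quad
f_i(x) \le f_i\!\left(x^{\mathrm{nav}}\right), \qquad i = 1, \dots, n .
```

    Any optimal solution of this subproblem is Pareto optimal and at least as good as the navigated plan in every objective: a plan dominating it would itself be feasible for the subproblem and would have a strictly smaller objective sum, a contradiction.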

  20. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    International Nuclear Information System (INIS)

    Meenachi, N. Madurai; Baba, M. Sai

    2017-01-01

    This article describes the need for ontology matching and the methods to achieve it. Efforts have been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the development of ontology matching methods. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching the ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms like the Jaro Winkler distance, the Needleman Wunsch algorithm, Bigram, Kullback-Leibler and Cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same is illustrated.

  1. Application of Pareto optimization method for ontology matching in nuclear reactor domain

    Energy Technology Data Exchange (ETDEWEB)

    Meenachi, N. Madurai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Planning and Human Resource Management Div.; Baba, M. Sai [Indira Gandhi Centre for Atomic Research, HBNI, Tamil Nadu (India). Resources Management Group

    2017-12-15

    This article describes the need for ontology matching and the methods to achieve it. Efforts have been put into the implementation of a semantic web based knowledge management system for the nuclear domain, which necessitated the development of ontology matching methods. In order to exchange information in a distributed environment, ontology mapping has been used. The constraints in matching the ontologies are also discussed. A Pareto based ontology matching algorithm is used to find the similarity between two ontologies in the nuclear reactor domain. Algorithms like the Jaro Winkler distance, the Needleman Wunsch algorithm, Bigram, Kullback-Leibler and Cosine divergence are employed to demonstrate ontology matching. A case study was carried out to analyse ontology matching on diversity in the nuclear reactor domain, and the same is illustrated.

  2. A Pareto Optimal Auction Mechanism for Carbon Emission Rights

    Directory of Open Access Journals (Sweden)

    Mingxi Wang

    2014-01-01

    Full Text Available The carbon emission rights do not fit well into the framework of existing multi-item auction mechanisms because of their own unique features. This paper proposes a new auction mechanism which converges to a unique Pareto optimal equilibrium in a finite number of periods. In the proposed auction mechanism, the assignment outcome is Pareto efficient and the carbon emission rights’ resources are efficiently used. For commercial application and theoretical completeness, both discrete and continuous markets—represented by discrete and continuous bid prices, respectively—are examined, and the results show the existence of a Pareto optimal equilibrium under the constraint of individual rationality. With no ties, the Pareto optimal equilibrium can be further proven to be unique.

  3. A Regionalization Approach to select the final watershed parameter set among the Pareto solutions

    Science.gov (United States)

    Park, G. H.; Micheletty, P. D.; Carney, S.; Quebbeman, J.; Day, G. N.

    2017-12-01

    The calibration of hydrological models often results in model parameters that are inconsistent with those from neighboring basins. Considering that physical similarity exists within neighboring basins, some of the physically related parameters should be consistent among them. Traditional manual calibration techniques require an iterative process to make the parameters consistent, which takes additional effort in model calibration. We developed a multi-objective optimization procedure to calibrate the National Weather Service (NWS) Research Distributed Hydrological Model (RDHM), using the Non-dominated Sorting Genetic Algorithm (NSGA-II) with expert knowledge of the model parameter interrelationships as one objective function. The multi-objective algorithm enables us to obtain diverse parameter sets that are equally acceptable with respect to the objective functions and to choose one from the pool of parameter sets during a subsequent regionalization step. Although all Pareto solutions are non-inferior, we exclude some of the parameter sets that show extreme values for any of the objective functions to expedite the selection process. We use an a priori model parameter set derived from the physical properties of the watershed (Koren et al., 2000) to assess the similarity of a given parameter across basins. Each parameter is assigned a weight based on its assumed similarity, such that parameters that are similar across basins are given higher weights. The parameter weights are used to compute a closeness measure between Pareto sets of nearby basins. The regionalization approach chooses the Pareto parameter set that minimizes the closeness measure of the basin being regionalized. The presentation will describe the results of applying the regionalization approach to a set of pilot basins in the Upper Colorado basin as part of a NASA-funded project.
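
    A hedged reading of the closeness measure described above: each parameter receives a weight reflecting how similar its a priori value is across basins, and the Pareto member chosen for a basin is the one with the smallest weighted distance to the parameter sets already selected for its neighbors. The distance form, the averaging and the toy numbers below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def closeness(candidate, neighbor_sets, weights):
    """Weighted average distance from one Pareto candidate parameter set
    to the parameter sets selected for neighboring basins."""
    candidate = np.asarray(candidate, dtype=float)
    dists = [np.sum(weights * np.abs(candidate - np.asarray(n, dtype=float)))
             for n in neighbor_sets]
    return float(np.mean(dists))

def pick_regionalized_set(pareto_sets, neighbor_sets, weights):
    """Choose the Pareto solution that minimizes the closeness measure."""
    scores = [closeness(p, neighbor_sets, weights) for p in pareto_sets]
    best = int(np.argmin(scores))
    return pareto_sets[best], scores[best]

# toy example: 3 Pareto-optimal parameter sets, 2 neighboring basins, 4 parameters
pareto_sets = [[0.2, 1.5, 0.8, 3.0], [0.4, 1.2, 0.9, 2.5], [0.9, 0.7, 1.4, 2.0]]
neighbors = [[0.35, 1.3, 0.85, 2.6], [0.45, 1.1, 0.95, 2.4]]
w = np.array([1.0, 0.5, 1.0, 0.2])   # higher weight = parameter assumed more similar across basins
print(pick_regionalized_set(pareto_sets, neighbors, w))
```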

  4. A new evolutionary algorithm with LQV learning for combinatorial problems optimization

    International Nuclear Information System (INIS)

    Machado, Marcelo Dornellas; Schirru, Roberto

    2000-01-01

    Genetic algorithms are biologically motivated adaptive systems which have been used, with good results, for combinatorial optimization problems. In this work, a new learning mode for the population-based incremental learning algorithm is introduced, with the aim of building a new evolutionary algorithm for the optimization of numerical and combinatorial problems. This new learning mode uses a variable learning rate during the optimization process, constituting a process known as proportional reward. The new algorithm is developed with a view to its application in the optimization of the reload problem of PWR nuclear reactors, in order to increase the useful life of the nuclear fuel. For the tests, two classes of problems are used: numerical problems and combinatorial problems. Since the reload problem is a combinatorial problem, the major interest lies in the latter class. The results achieved in the tests indicate the applicability of the new learning mode, showing its potential as a development tool for the solution of the reload problem. (author)

  5. Improved multilayer OLED architecture using evolutionary genetic algorithm

    International Nuclear Information System (INIS)

    Quirino, W.G.; Teixeira, K.C.; Legnani, C.; Calil, V.L.; Messer, B.; Neto, O.P. Vilela; Pacheco, M.A.C.; Cremona, M.

    2009-01-01

    Organic light-emitting diodes (OLEDs) constitute a new class of emissive devices, which present high efficiency and low voltage operation, among other advantages over current technology. Multilayer architecture (M-OLED) is generally used to optimize these devices, especially to overcome the suppression of light emission due to exciton recombination near the metal layers. However, improvements in recombination, transport and charge injection can also be achieved by blending electron- and hole-transporting layers into the same one. Graded emissive region devices can provide promising results regarding quantum and power efficiency and brightness as well. The massive number of possible model configurations, however, suggests that a search algorithm would be more suitable for this task. In this work, multilayer OLEDs were simulated and fabricated using Genetic Algorithms (GAs) as the evolutionary strategy to improve their efficiency. Genetic Algorithms are stochastic algorithms based on genetic inheritance and the Darwinian strife for survival. In our simulations, a 50 nm wide graded region was assumed, divided into five equally sized layers. The relative concentrations of the materials within each layer were optimized to obtain the lowest V/J^0.5 ratio, where V is the applied voltage and J the current density. The best M-OLED architecture obtained by the genetic algorithm presented a V/J^0.5 ratio nearly 7% lower than the value reported in the literature. In order to check the experimental validity of the improved results obtained in the simulations, two M-OLEDs with different architectures were fabricated by thermal deposition in a high vacuum environment. The results of the comparison between simulation and experiment are presented and discussed.

  6. A hybrid evolutionary algorithm for multi-objective anatomy-based dose optimization in high-dose-rate brachytherapy

    International Nuclear Information System (INIS)

    Lahanas, M; Baltas, D; Zamboglou, N

    2003-01-01

    Multiple objectives must be considered in anatomy-based dose optimization for high-dose-rate brachytherapy and a large number of parameters must be optimized to satisfy often competing objectives. For objectives expressed solely in terms of dose variances, deterministic gradient-based algorithms can be applied and a weighted sum approach is able to produce a representative set of non-dominated solutions. As the number of objectives increases, or non-convex objectives are used, local minima can be present and deterministic or stochastic algorithms such as simulated annealing either cannot be used or are not efficient. In this case we employ a modified hybrid version of the multi-objective optimization algorithm NSGA-II. This, in combination with the deterministic optimization algorithm, produces a representative sample of the Pareto set. This algorithm can be used with any kind of objectives, including non-convex, and does not require artificial importance factors. A representation of the trade-off surface can be obtained with more than 1000 non-dominated solutions in 2-5 min. An analysis of the solutions provides information on the possibilities available using these objectives. Simple decision-making tools allow the selection of a solution that provides a best fit for the clinical goals. We show an example with a prostate implant and compare results obtained by variance and dose-volume histogram (DVH) based objectives.

  7. Evolutionary Bi-objective Optimization for Bulldozer and Its Blade in Soil Cutting

    Science.gov (United States)

    Sharma, Deepak; Barakat, Nada

    2018-02-01

    An evolutionary optimization approach is adopted in this paper for simultaneously achieving economic and productive soil cutting. The economic aspect is defined by minimizing the power requirement from the bulldozer, and the soil cutting is made productive by minimizing the time of soil cutting. For determining the power requirement, two force models are adopted from the literature to quantify the cutting force on the blade. Three domain-specific constraints are also proposed, which limit the power from the bulldozer, limit the maximum force on the bulldozer blade and achieve the desired production rate. The bi-objective optimization problem is solved using five benchmark multi-objective evolutionary algorithms and one classical optimization technique using the ɛ-constraint method. The Pareto-optimal solutions are obtained together with the knee region. Further, a post-optimal analysis is performed on the obtained solutions to decipher relationships among the objectives and decision variables. Such relationships are later used to formulate guidelines for selecting the optimal set of input parameters. The obtained results are then compared with experimental results from the literature, which show close agreement.

  8. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Directory of Open Access Journals (Sweden)

    Afnizanfaizal Abdullah

    Full Text Available The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.

  9. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Science.gov (United States)

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
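
    A minimal sketch of the hybrid idea, assuming a generic minimization objective: firefly attraction moves followed by a differential-evolution style trial vector for each firefly as the neighbourhood search. The placeholder objective, the parameter values and the exact interleaving of the two operators are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(x):
    """Placeholder for the model-vs-data fitting error (e.g., sum of squared residuals)."""
    return float(np.sum(x ** 2))

def hybrid_firefly_de(dim=5, n=20, iters=100, beta0=1.0, gamma=1.0, alpha=0.2, F=0.5, CR=0.9):
    lo, hi = -5.0, 5.0
    X = rng.uniform(lo, hi, (n, dim))
    f = np.array([objective(x) for x in X])
    for _ in range(iters):
        # firefly attraction: each firefly moves toward every brighter (lower-cost) one
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:
                    r2 = float(np.sum((X[i] - X[j]) ** 2))
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] = np.clip(X[i] + beta * (X[j] - X[i])
                                   + alpha * (rng.random(dim) - 0.5), lo, hi)
                    f[i] = objective(X[i])
        # DE/rand/1/bin neighbourhood search applied to each firefly
        for i in range(n):
            a, b, c = rng.choice(n, 3, replace=False)
            mutant = X[a] + F * (X[b] - X[c])
            trial = np.clip(np.where(rng.random(dim) < CR, mutant, X[i]), lo, hi)
            ft = objective(trial)
            if ft < f[i]:
                X[i], f[i] = trial, ft
    k = int(np.argmin(f))
    return X[k], f[k]

print(hybrid_firefly_de())
```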

  10. Preventive maintenance scheduling by variable dimension evolutionary algorithms

    International Nuclear Information System (INIS)

    Limbourg, Philipp; Kochs, Hans-Dieter

    2006-01-01

    Black box optimization strategies have proven to be useful tools for solving complex maintenance optimization problems. There has been a considerable amount of research on the right choice of optimization strategies for finding optimal preventive maintenance schedules. Much less attention has been paid to how the schedule is represented to the algorithm. Either the search space is represented as a binary string, leading to a highly complex combinatorial problem, or maintenance operations are defined by regular intervals, which may restrict the search space to suboptimal solutions. An adequate representation, however, is vitally important for result quality. This work presents several nonstandard input representations and compares them to the standard binary representation. An evolutionary algorithm with extensions to handle variable-length genomes is used for the comparison. The results demonstrate that two new representations perform better than the binary representation scheme. A second analysis shows that the performance may be increased even further using modified genetic operators. Thus, the choice of alternative representations leads to better results in the same amount of time and without any loss of accuracy.

  11. Tractable Pareto Optimization of Temporal Preferences

    Science.gov (United States)

    Morris, Robert; Morris, Paul; Khatib, Lina; Venable, Brent

    2003-01-01

    This paper focuses on temporal constraint problems where the objective is to optimize a set of local preferences for when events occur. In previous work, a subclass of these problems has been formalized as a generalization of Temporal CSPs, and a tractable strategy for optimization has been proposed, where global optimality is defined as maximizing the minimum of the component preference values. This criterion for optimality, which we call 'Weakest Link Optimization' (WLO), is known to have limited practical usefulness because solutions are compared only on the basis of their worst value; thus, there is no requirement to improve the other values. To address this limitation, we introduce a new algorithm that re-applies WLO iteratively in a way that leads to improvement of all the values. We show the value of this strategy by proving that, with suitable preference functions, the resulting solutions are Pareto Optimal.
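
    Over a finite pool of candidate solutions, repeatedly re-applying weakest-link optimization — fix the worst preference value, then optimize the next-worst, and so on — amounts to comparing the sorted preference vectors lexicographically (leximin). The toy sketch below illustrates only that comparison on made-up preference vectors; the paper's temporal-CSP machinery is not reproduced.

```python
def wlo_iterated_best(candidates):
    """Return the candidate whose ascending-sorted preference vector is
    lexicographically largest: the worst value is maximized first, then the
    second-worst, and so on (leximin)."""
    return max(candidates, key=lambda prefs: tuple(sorted(prefs)))

# plain WLO would consider the first two solutions equally good (same minimum = 2),
# but the iterated version prefers the one whose remaining values are better
solutions = [
    (2, 9, 9),    # weakest link 2, strong elsewhere
    (2, 3, 3),    # same weakest link, weaker elsewhere
    (1, 10, 10),  # better elsewhere but worse weakest link
]
print(wlo_iterated_best(solutions))   # -> (2, 9, 9)
```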

  12. Short-term economic environmental hydrothermal scheduling using improved multi-objective gravitational search algorithm

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Lu, Peng; Wang, Chao

    2015-01-01

    Highlights: • Improved multi-objective gravitational search algorithm. • An elite archive set is proposed to guide the evolutionary process. • Neighborhood searching mechanism to improve local search ability. • Chaotic mutation is adopted to avoid premature convergence. • A feasible space method is proposed to handle hydro plant constraints. - Abstract: With growing concerns about energy and environment, short-term economic environmental hydrothermal scheduling (SEEHS) plays a more and more important role in power systems. Because of the two objectives and various constraints, SEEHS is a complex multi-objective optimization problem (MOOP). In order to solve the problem, we propose an improved multi-objective gravitational search algorithm (IMOGSA) in this paper. In IMOGSA, the mass of the agent is redefined by multiple objectives to make it suitable for MOOP. An elite archive set is proposed to keep Pareto optimal solutions and guide the evolutionary process. For balancing exploration and exploitation, a neighborhood searching mechanism is presented to cooperate with chaotic mutation. Moreover, a novel method based on feasible space is proposed to handle hydro plant constraints during SEEHS, and a violation adjustment method is adopted to handle the power balance constraint. For verifying its effectiveness, the proposed IMOGSA is applied to a hydrothermal system in two different case studies. The simulation results show that IMOGSA has a competitive performance in SEEHS when compared with other established algorithms.

  13. Learning and anticipation in online dynamic optimization with evolutionary algorithms: The stochastic case

    NARCIS (Netherlands)

    P.A.N. Bosman (Peter); J.A. La Poutré (Han); D. Thierens (Dirk)

    2007-01-01

    The focus of this paper is on how to design evolutionary algorithms (EAs) for solving stochastic dynamic optimization problems online, i.e. as time goes by. For a proper design, the EA must not only be capable of tracking shifting optima, it must also take into account the future

  14. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general...... kinds of evolutionary algorithms, have been prudently analyzed. This analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been...... topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most...

  15. An efficient hybrid evolutionary algorithm based on PSO and HBMO algorithms for multi-objective Distribution Feeder Reconfiguration

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher [Electronic and Electrical Engineering Department, Shiraz University of Technology, Shiraz (Iran)

    2009-08-15

    This paper introduces a robust hybrid evolutionary search algorithm to solve the multi-objective Distribution Feeder Reconfiguration (DFR) problem. The main objectives of the DFR are to minimize the real power loss, the deviation of the nodes' voltage and the number of switching operations, and to balance the loads on the feeders. Because the objectives are different and non-commensurable, it is difficult to solve the problem by conventional approaches that optimize a single objective. This paper presents a new approach based on norm3 for the DFR problem. In the proposed method, the objective functions are considered as a vector and the aim is to maximize the distance (norm2) between the objective function vector and the worst objective function vector while the constraints are met. Since the proposed DFR is a multi-objective and non-differentiable optimization problem, a new hybrid evolutionary algorithm (EA) based on the combination of Honey Bee Mating Optimization (HBMO) and Discrete Particle Swarm Optimization (DPSO), called DPSO-HBMO, is employed to solve it. The results of the proposed reconfiguration method are compared with the solutions obtained by other approaches, the original DPSO and HBMO, over different distribution test systems. (author)

  16. Synthesizing mixed H2/H-infinity dynamic controller using evolutionary algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Langballe, A.S.; Wisniewski, Rafal

    2001-01-01

    This paper covers the design of an Evolutionary Algorithm (EA) which should be able to synthesize a mixed H2/H-infinity controller. It will be shown how a system can be expressed as Matrix Inequalities (MIs) and how these are then used in the design of the EA. The main objective is to examine whether a mixed H2/H-infinity controller is feasible, and if so, how the optimal mixed controller might be found.

  17. Multi-objective genetic algorithm for solving N-version program design problem

    Energy Technology Data Exchange (ETDEWEB)

    Yamachi, Hidemi [Department of Computer and Information Engineering, Nippon Institute of Technology, Miyashiro, Saitama 345-8501 (Japan) and Department of Production and Information Systems Engineering, Tokyo Metropolitan Institute of Technology, Hino, Tokyo 191-0065 (Japan)]. E-mail: yamachi@nit.ac.jp; Tsujimura, Yasuhiro [Department of Computer and Information Engineering, Nippon Institute of Technology, Miyashiro, Saitama 345-8501 (Japan)]. E-mail: tujimr@nit.ac.jp; Kambayashi, Yasushi [Department of Computer and Information Engineering, Nippon Institute of Technology, Miyashiro, Saitama 345-8501 (Japan)]. E-mail: yasushi@nit.ac.jp; Yamamoto, Hisashi [Department of Production and Information Systems Engineering, Tokyo Metropolitan Institute of Technology, Hino, Tokyo 191-0065 (Japan)]. E-mail: yamamoto@cc.tmit.ac.jp

    2006-09-15

    N-version programming (NVP) is a programming approach for constructing fault tolerant software systems. Generally, an optimization model utilized in NVP selects the optimal set of versions for each module to maximize the system reliability while constraining the total cost to remain within a given budget. In such a model, while the number of versions included in the obtained solution is generally reduced, the budget restriction may be so rigid that it may fail to find the optimal solution. In order to ameliorate this problem, this paper proposes a novel bi-objective optimization model that maximizes the system reliability and minimizes the system total cost for designing N-version software systems. When solving a multi-objective optimization problem, it is crucial to find Pareto solutions; it is, however, not easy to obtain them. In this paper, we propose a novel bi-objective optimization model that obtains many Pareto solutions efficiently. We formulate the optimal design problem of NVP as a bi-objective 0-1 nonlinear integer programming problem, and in order to solve it we propose a Multi-objective genetic algorithm (MOGA), which is a powerful, though time-consuming, method for multi-objective optimization problems. When implementing a genetic algorithm (GA), the use of an appropriate genetic representation scheme is one of the most important issues for obtaining good performance. We employ a random-key representation in our MOGA to find many Pareto solutions spaced as evenly as possible along the Pareto frontier. To pursue further performance improvement, we introduce elitism and the Pareto-insertion and Pareto-deletion operations, based on the distance between Pareto solutions, in the selection process. The proposed MOGA obtains many Pareto solutions evenly spread along the Pareto frontier. The user of the MOGA can select the best compromise solution among the candidates by controlling the balance between the system reliability and the total cost.
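
    A hedged sketch of the random-key idea in this setting: each module carries one real-valued key per candidate version, keys above a threshold switch that version on, and only non-dominated (reliability, cost) pairs are retained. The decoding rule, the threshold, the series/parallel reliability model and the toy data are assumptions for illustration; the paper's actual encoding and its Pareto-insertion/deletion operators are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# toy data: 2 modules, 3 candidate versions each, (reliability, cost) per version
reliability = np.array([[0.90, 0.95, 0.85], [0.80, 0.92, 0.88]])
cost = np.array([[3.0, 5.0, 2.0], [2.5, 6.0, 4.0]])

def decode(keys, threshold=0.5):
    """Random keys -> version selection; guarantee at least one version per module."""
    chosen = keys > threshold
    for m in range(keys.shape[0]):
        if not chosen[m].any():
            chosen[m, int(np.argmax(keys[m]))] = True
    return chosen

def evaluate(chosen):
    """System reliability (modules in series, versions in parallel) and total cost."""
    module_rel = 1.0 - np.prod(np.where(chosen, 1.0 - reliability, 1.0), axis=1)
    return float(np.prod(module_rel)), float(np.sum(np.where(chosen, cost, 0.0)))

def dominates(a, b):
    """a dominates b: reliability no worse, cost no higher, and not identical."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b

population = [rng.random(reliability.shape) for _ in range(50)]
points = [evaluate(decode(k)) for k in population]
pareto = [p for p in points if not any(dominates(q, p) for q in points)]
print(sorted(set(pareto)))
```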

  18. Multi-objective genetic algorithm for solving N-version program design problem

    International Nuclear Information System (INIS)

    Yamachi, Hidemi; Tsujimura, Yasuhiro; Kambayashi, Yasushi; Yamamoto, Hisashi

    2006-01-01

    N-version programming (NVP) is a programming approach for constructing fault tolerant software systems. Generally, an optimization model utilized in NVP selects the optimal set of versions for each module to maximize the system reliability while constraining the total cost to remain within a given budget. In such a model, while the number of versions included in the obtained solution is generally reduced, the budget restriction may be so rigid that it may fail to find the optimal solution. In order to ameliorate this problem, this paper proposes a novel bi-objective optimization model that maximizes the system reliability and minimizes the system total cost for designing N-version software systems. When solving a multi-objective optimization problem, it is crucial to find Pareto solutions; it is, however, not easy to obtain them. In this paper, we propose a novel bi-objective optimization model that obtains many Pareto solutions efficiently. We formulate the optimal design problem of NVP as a bi-objective 0-1 nonlinear integer programming problem, and in order to solve it we propose a Multi-objective genetic algorithm (MOGA), which is a powerful, though time-consuming, method for multi-objective optimization problems. When implementing a genetic algorithm (GA), the use of an appropriate genetic representation scheme is one of the most important issues for obtaining good performance. We employ a random-key representation in our MOGA to find many Pareto solutions spaced as evenly as possible along the Pareto frontier. To pursue further performance improvement, we introduce elitism and the Pareto-insertion and Pareto-deletion operations, based on the distance between Pareto solutions, in the selection process. The proposed MOGA obtains many Pareto solutions evenly spread along the Pareto frontier. The user of the MOGA can select the best compromise solution among the candidates by controlling the balance between the system reliability and the total cost.

  19. THE APPLICATION OF AN EVOLUTIONARY ALGORITHM TO THE OPTIMIZATION OF A MESOSCALE METEOROLOGICAL MODEL

    Energy Technology Data Exchange (ETDEWEB)

    Werth, D.; O' Steen, L.

    2008-02-11

    We show that a simple evolutionary algorithm can optimize a set of mesoscale atmospheric model parameters with respect to agreement between the mesoscale simulation and a limited set of synthetic observations. This is illustrated using the Regional Atmospheric Modeling System (RAMS). A set of 23 RAMS parameters is optimized by minimizing a cost function based on the root mean square (rms) error between the RAMS simulation and synthetic data (observations derived from a separate RAMS simulation). We find that the optimization can be efficient with relatively modest computer resources, so operational implementation is possible. The optimization efficiency, however, is found to depend strongly on the procedure used to perturb the 'child' parameters relative to their 'parents' within the evolutionary algorithm. In addition, the meteorological variables included in the rms error and their weighting are found to be an important factor with respect to finding the global optimum.

  20. Pareto versus lognormal: a maximum entropy test.

    Science.gov (United States)

    Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano

    2011-08-01

    It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.
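
    For reference, the two candidate models being discriminated are the lognormal and the Pareto (power-law) distributions, with densities

```latex
f_{\mathrm{LN}}(x) = \frac{1}{x \sigma \sqrt{2\pi}}
  \exp\!\left( -\frac{(\ln x - \mu)^{2}}{2\sigma^{2}} \right), \quad x > 0,
\qquad
f_{\mathrm{Pareto}}(x) = \frac{\alpha\, x_{m}^{\alpha}}{x^{\alpha + 1}}, \quad x \ge x_{m},
```

    so a distribution with a lognormal body and a Pareto tail follows f_LN over most of its range and crosses over to the power-law decay proportional to x^{-(α+1)} in the last few percentiles.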

  1. Pareto optimality in infinite horizon linear quadratic differential games

    NARCIS (Netherlands)

    Reddy, P.V.; Engwerda, J.C.

    2013-01-01

    In this article we derive conditions for the existence of Pareto optimal solutions for linear quadratic infinite horizon cooperative differential games. First, we present a necessary and sufficient characterization for Pareto optimality which translates to solving a set of constrained optimal

  2. Irrigation water allocation optimization using multi-objective evolutionary algorithm (MOEA) - a review

    Science.gov (United States)

    Fanuel, Ibrahim Mwita; Mushi, Allen; Kajunguri, Damian

    2018-03-01

    This paper analyzes more than 40 papers within a restricted area of application of the Multi-Objective Genetic Algorithm, the Non-Dominated Sorting Genetic Algorithm-II and Multi-Objective Differential Evolution (MODE) to solve multi-objective problems in agricultural water management. The paper focuses on different application aspects, which include water allocation, irrigation planning, crop pattern and allocation of available land. The performance and results of these techniques are discussed. The review finds that there is potential to use MODE to analyze multi-objective problems; the application is all the more significant given its advantage of being a simple and powerful technique compared with other evolutionary algorithms. The paper concludes with a hopeful new trend of research that demands effective use of MODE: the inclusion of benefits derived from farm byproducts and production costs into the model.

  3. A standard deviation selection in evolutionary algorithm for grouper fish feed formulation

    Science.gov (United States)

    Cai-Juan, Soong; Ramli, Razamin; Rahman, Rosshairy Abdul

    2016-10-01

    Malaysia is one of the major fishery-producing countries due to its location in the equatorial environment. Grouper fish is one of the potential markets contributing to the income of the country due to its desirable taste, high demand and high price. However, the demand for grouper fish still cannot be met from the wild catch alone. Therefore, there is a need to farm grouper fish to cater to the market demand. In order to farm grouper fish, prior knowledge of the proper nutrients needed is required, because no exact data are available. Therefore, in this study, primary and secondary data are collected, despite the limited number of related papers, and 30 samples are investigated using standard deviation selection in an evolutionary algorithm. Thus, this study would unlock frontiers for extensive research on grouper fish feed formulation. Results show that standard deviation selection in an evolutionary algorithm is applicable: feasible, low-fitness solutions can be obtained quickly. These fitness values can further be exploited to minimize the cost of farming grouper fish.

  4. Minimizing the symbol-error-rate for amplify-and-forward relaying systems using evolutionary algorithms

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-02-01

    In this paper, a new detector is proposed for an amplify-and-forward (AF) relaying system. The detector is designed to minimize the symbol-error-rate (SER) of the system. The SER surface is non-linear and may have multiple minima; therefore, designing an SER detector for cooperative communications becomes an optimization problem. Evolutionary algorithms have the capability to find the global minimum; therefore, evolutionary algorithms such as particle swarm optimization (PSO) and differential evolution (DE) are exploited to solve this optimization problem. The performance of the proposed detectors is compared with conventional detectors such as the maximum likelihood (ML) and minimum mean square error (MMSE) detectors. In the simulation results, it can be observed that the SER performance of the proposed detectors is less than 2 dB away from the ML detector. A significant improvement in SER performance is also observed when comparing with the MMSE detector. The computational complexity of the proposed detectors is much less than that of the ML and MMSE algorithms. Moreover, in contrast to the ML and MMSE detectors, the computational complexity of the proposed detectors increases linearly with respect to the number of relays.
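
    As one concrete way to run such an evolutionary search, differential evolution over the detector coefficients can be set up in a few lines with SciPy. The surrogate error-rate objective below is a stand-in for the simulated SER of the AF relaying link, and the bounds and settings are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
# toy "received signal" statistics standing in for the relaying-channel model
H = rng.normal(size=(200, 4))          # per-symbol observation features
s = np.sign(rng.normal(size=200))      # transmitted BPSK symbols

def surrogate_ser(w):
    """Fraction of wrongly detected symbols for a linear detector w
    (placeholder for a full AF-relaying SER simulation)."""
    decisions = np.sign(H @ w)
    return float(np.mean(decisions != s))

# DE needs no gradients, so the step-like SER surface is not a problem
result = differential_evolution(surrogate_ser, bounds=[(-2.0, 2.0)] * 4,
                                maxiter=100, popsize=20, seed=1, polish=False)
print(result.x, result.fun)
```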

  5. Pareto 80/20 Law: Derivation via Random Partitioning

    Science.gov (United States)

    Lipovetsky, Stan

    2009-01-01

    The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not been theoretically based. The article considers derivation of this 80/20 rule and some other standard…
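
    A standard textbook calculation (independent of the article's random-partitioning derivation) connects the rule to a specific Pareto shape parameter: for a Pareto distribution with tail index α > 1, the share of the total held by the top fraction p of units is p^{(α-1)/α}, so requiring the top 20% to account for 80% gives

```latex
0.8 = 0.2^{(\alpha - 1)/\alpha}
\;\Longrightarrow\;
\frac{\alpha - 1}{\alpha} = \frac{\ln 0.8}{\ln 0.2} \approx 0.139
\;\Longrightarrow\;
\alpha \approx 1.16 .
```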

  6. A DVH-guided IMRT optimization algorithm for automatic treatment planning and adaptive radiotherapy replanning

    International Nuclear Information System (INIS)

    Zarepisheh, Masoud; Li, Nan; Long, Troy; Romeijn, H. Edwin; Tian, Zhen; Jia, Xun; Jiang, Steve B.

    2014-01-01

    Purpose: To develop a novel algorithm that incorporates prior treatment knowledge into intensity modulated radiation therapy optimization to facilitate automatic treatment planning and adaptive radiotherapy (ART) replanning. Methods: The algorithm automatically creates a treatment plan guided by the DVH curves of a reference plan that contains information on the clinician-approved dose-volume trade-offs among different targets/organs and among different portions of a DVH curve for an organ. In ART, the reference plan is the initial plan for the same patient, while for automatic treatment planning the reference plan is selected from a library of clinically approved and delivered plans of previously treated patients with similar medical conditions and geometry. The proposed algorithm employs a voxel-based optimization model and navigates the large voxel-based Pareto surface. The voxel weights are iteratively adjusted to approach a plan that is similar to the reference plan in terms of the DVHs. If the reference plan is feasible but not Pareto optimal, the algorithm generates a Pareto optimal plan with the DVHs better than the reference ones. If the reference plan is too restricting for the new geometry, the algorithm generates a Pareto plan with DVHs close to the reference ones. In both cases, the new plans have similar DVH trade-offs as the reference plans. Results: The algorithm was tested using three patient cases and found to be able to automatically adjust the voxel-weighting factors in order to generate a Pareto plan with similar DVH trade-offs as the reference plan. The algorithm has also been implemented on a GPU for high efficiency. Conclusions: A novel prior-knowledge-based optimization algorithm has been developed that automatically adjusts the voxel weights and generates a clinically optimal plan at high efficiency. It is found that the new algorithm can significantly improve the plan quality and planning efficiency in ART replanning and automatic treatment
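
    One hedged reading of the voxel-weight adjustment loop: compare the current doses against the reference DVH at the same volume rank and increase the weights of voxels that receive more dose than the reference curve allows. The update rule, step size and toy numbers below are assumptions for illustration only; the paper's actual voxel-based optimization model is not reproduced.

```python
import numpy as np

def reference_dose_at_rank(ref_doses):
    """Reference DVH as a sorted dose array: index k gives the dose of the
    (k+1)-th highest-dose voxel of the reference plan (same voxel count assumed)."""
    return np.sort(ref_doses)[::-1]

def update_voxel_weights(weights, current_doses, ref_doses, step=0.1):
    """Increase weights of voxels whose dose exceeds the reference DVH value
    at the same volume rank; leave the rest unchanged."""
    ref_sorted = reference_dose_at_rank(ref_doses)
    order = np.argsort(current_doses)[::-1]          # current voxels ranked by dose
    new_w = weights.copy()
    for rank, voxel in enumerate(order):
        excess = current_doses[voxel] - ref_sorted[rank]
        if excess > 0:
            new_w[voxel] *= (1.0 + step * excess)    # push the optimizer to cool this voxel
    return new_w

# toy organ with 6 voxels
weights = np.ones(6)
current = np.array([10.0, 22.0, 35.0, 18.0, 27.0, 40.0])
reference = np.array([12.0, 20.0, 30.0, 15.0, 25.0, 33.0])
print(update_voxel_weights(weights, current, reference))
```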

  7. Optimal design and management of chlorination in drinking water networks: a multi-objective approach using Genetic Algorithms and the Pareto optimality concept

    Science.gov (United States)

    Nouiri, Issam

    2017-11-01

    This paper presents the development of multi-objective Genetic Algorithms to optimize chlorination design and management in drinking water networks (DWN). Three objectives have been considered: the improvement of chlorination uniformity (healthy objective), and the minimization of the number of chlorine booster stations and of the injected chlorine mass (economic objectives). The problem has been split into a medium-term and a short-term one. The proposed methodology was tested on hypothetical and real DWNs. Results proved the ability of the developed optimization tool to identify relationships between the healthy and economic objectives as Pareto fronts. The proposed approach was efficient in computing solutions ensuring better chlorination uniformity while requiring the smallest injected chlorine mass when compared to other approaches. For the real DWN studied, chlorination optimization brought a great improvement in free-chlorine-dosing uniformity and a meaningful reduction in chlorine mass, in comparison with conventional chlorination.

  8. A New Multiobjective Evolutionary Algorithm for Community Detection in Dynamic Complex Networks

    Directory of Open Access Journals (Sweden)

    Guoqiang Chen

    2013-01-01

    Full Text Available Community detection in dynamic networks is an important research topic and has received an enormous amount of attention in recent years. Modularity was selected as the measure to quantify the quality of the community partition in previous detection methods, but modularity has been shown to suffer from resolution limits. In this paper, we propose a novel multiobjective evolutionary algorithm for community detection in dynamic networks based on the framework of the nondominated sorting genetic algorithm. Modularity density, which can address the limitations of the modularity function, is adopted to measure the snapshot cost, and normalized mutual information is selected to measure the temporal cost. Knowledge of the problem's characteristics is used in designing the genetic operators. Furthermore, a local search operator is designed, which improves the effectiveness and efficiency of community detection. Experimental studies based on synthetic datasets show that the proposed algorithm obtains better performance than the compared algorithms.

  9. PARETO OPTIMAL SOLUTIONS FOR MULTI-OBJECTIVE GENERALIZED ASSIGNMENT PROBLEM

    Directory of Open Access Journals (Sweden)

    S. Prakash

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: The Multi-Objective Generalized Assignment Problem (MGAP with two objectives, where one objective is linear and the other one is non-linear, has been considered, with the constraints that a job is assigned to only one worker – though he may be assigned more than one job, depending upon the time available to him. An algorithm is proposed to find the set of Pareto optimal solutions of the problem, determining assignments of jobs to workers with two objectives without setting priorities for them. The two objectives are to minimise the total cost of the assignment and to reduce the time taken to complete all the jobs.


  10. Multi-objective component sizing of a power-split plug-in hybrid electric vehicle powertrain using Pareto-based natural optimization machines

    Science.gov (United States)

    Mozaffari, Ahmad; Vajedi, Mahyar; Chehresaz, Maryyeh; Azad, Nasser L.

    2016-03-01

    The urgent need to meet increasingly tight environmental regulations and new fuel economy requirements has motivated system science researchers and automotive engineers to take advantage of emerging computational techniques to further advance hybrid electric vehicle and plug-in hybrid electric vehicle (PHEV) designs. In particular, research has focused on vehicle powertrain system design optimization, to reduce the fuel consumption and total energy cost while improving the vehicle's driving performance. In this work, two different natural optimization machines, namely the synchronous self-learning Pareto strategy and the elitism non-dominated sorting genetic algorithm, are implemented for component sizing of a specific power-split PHEV platform with a Toyota plug-in Prius as the baseline vehicle. To do this, a high-fidelity model of the Toyota plug-in Prius is employed for the numerical experiments using the Autonomie simulation software. Based on the simulation results, it is demonstrated that Pareto-based algorithms can successfully optimize the design parameters of the vehicle powertrain.

  11. MO-G-304-04: Generating Well-Dispersed Representations of the Pareto Front for Multi-Criteria Optimization in Radiation Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Kirlik, G; Zhang, H [University of Maryland School of Medicine, Baltimore, MD (United States)

    2015-06-15

    Purpose: To present a novel multi-criteria optimization (MCO) solution approach that generates a well-dispersed representation of the Pareto front for radiation treatment planning. Methods: Different algorithms have been proposed and implemented in commercial planning software to generate MCO plans for external-beam radiation therapy. These algorithms consider convex optimization problems. We propose a grid-based algorithm to generate well-dispersed treatment plans over the Pareto front. Our method is able to handle nonconvexity in the problem in order to deal with dose-volume objectives/constraints and biological objectives, such as equivalent uniform dose (EUD), tumor control probability (TCP), normal tissue complication probability (NTCP), etc. In addition, our algorithm is able to provide a single MCO plan when clinicians are targeting narrow bounds of objectives for patients. In this situation, usually none of the generated plans is within the bounds and a solution is difficult to identify via manual navigation. We use the subproblem formulation utilized in the grid-based algorithm to obtain a plan within the specified bounds. The subproblem aims to generate a solution that maps into the rectangle defined by the bounds. If such a solution does not exist, it generates the solution closest to the rectangle. We tested our method with 10 locally advanced head and neck cancer cases. Results: 8 objectives were used, including 3 different objectives for the primary, high-risk and low-risk target volumes, and 5 objectives for the organs-at-risk (OARs) (two parotids, spinal cord, brain stem and oral cavity). Given tight bounds, uniform dose was achieved for all targets while as much as 26% improvement was achieved in OAR sparing compared to clinical plans without MCO and a previously proposed MCO method. Conclusion: Our method is able to obtain well-dispersed treatment plans to attain a better approximation of convex and nonconvex Pareto fronts. Single treatment plan can

  12. Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.

    Science.gov (United States)

    Otero-Muras, Irene; Banga, Julio R

    2017-07-21

    In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.

  13. Nuclear fuel management optimization using adaptive evolutionary algorithms with heuristics

    International Nuclear Information System (INIS)

    Axmann, J.K.; Van de Velde, A.

    1996-01-01

    Adaptive Evolutionary Algorithms in combination with expert knowledge encoded in heuristics have proved to be a robust and powerful optimization method for the design of optimized PWR fuel loading patterns. Simple parallel algorithmic structures, coupled with a low amount of communication between the computer processor units in use, make it possible to employ workstation clusters efficiently. The extension of classic evolution strategies not only by new and alternative methods but also by the inclusion of heuristics that affect the exchange probabilities of the fuel assemblies at specific core positions leads to the RELOPAT optimization code of the Technical University of Braunschweig. In combination with PRISM, the new neutron-physics 3D nodal core simulator developed by SIEMENS, the PRIMO loading pattern optimization system has been designed. Highly promising results in the recalculation of known reload plans for German PWRs now lead to a commercially usable program. (author)

  14. An Endosymbiotic Evolutionary Algorithm for the Hub Location-Routing Problem

    Directory of Open Access Journals (Sweden)

    Ji Ung Sun

    2015-01-01

    Full Text Available We consider a capacitated hub location-routing problem (HLRP) which combines the hub location problem and multihub vehicle routing decisions. The HLRP not only determines the locations of the capacitated p-hubs within a set of potential hubs but also deals with the routes of the vehicles to meet the demands of customers. This problem is formulated as a 0-1 mixed integer programming model with the objective of minimizing the total cost, including routing cost, fixed hub cost, and fixed vehicle cost. As the HLRP becomes computationally impractical for large-sized problems, we develop a solution method based on the endosymbiotic evolutionary algorithm (EEA) which solves the hub location and vehicle routing problems simultaneously. The performance of the proposed algorithm is examined through a comparative study. The experimental results show that the proposed EEA can be a viable solution method for supply chain network planning.

  15. Generalized Pareto optimum and semi-classical spinors

    Science.gov (United States)

    Rouleux, M.

    2018-02-01

    In 1971, S. Smale presented a generalization of Pareto optimum he called the critical Pareto set. The underlying motivation was to extend Morse theory to several functions, i.e. to find a Morse theory for m differentiable functions defined on a manifold M of dimension ℓ. We use this framework to take a 2 × 2 Hamiltonian ℋ = ℋ(p) ∈ C∞(T*R²) to its normal form near a singular point of the Fresnel surface. Namely, we say that ℋ has the Pareto property if it decomposes, locally, up to a conjugation with regular matrices, as ℋ(p) = u′(p) C(p) (u′(p))*, where u : R² → R² has singularities of codimension 1 or 2, and C(p) is a regular Hermitian matrix (“integrating factor”). In particular this applies in certain cases to the matrix Hamiltonian of Elasticity theory and its (relative) perturbations of order 3 in momentum at the origin.

  16. The application of analytical methods to the study of Pareto - optimal control systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2014-01-01

    Full Text Available The subject of this article is methods of multicriteria optimization and their application to the parametric synthesis of double-circuit control systems when the individual criteria are inconsistent. The basis for solving multicriteria problems is the fundamental principle of multi-criteria choice, the Edgeworth-Pareto principle. Obtaining Pareto-optimal variants under inconsistent individual criteria does not by itself mean reaching a final decision; the set of such options is only offered to the designer (the decision maker). An important issue with traditional numerical methods is their computational cost. Examples are methods of probing the parameter space, including those using uniform grids and uniformly distributed sequences; computational approximation of the Pareto boundary is a particularly demanding task. The purpose of this work is to develop fairly simple methods for finding Pareto-optimal solutions for the case where the criteria are given in analytical form. The proposed solution is based on studying the properties of the analytical dependences of the criteria. A case not covered so far in the literature is considered, namely the topology of the problem in which the indifference curves (level lines) do not touch. It is shown that compromise solutions can be identified for such problems. The angular position of the antigradient to the indifference curves in the parameter space, relative to the coordinate axes, is used. Propositions are formulated on the comonotonicity and contramonotonicity characteristics and on the angular characteristics of the antigradient for determining Pareto-optimal solutions. A general calculation algorithm is considered: determine the admissible range of parameter values; investigate the comonotonicity and contramonotonicity properties; build the level (indifference) curves; determine the type of touching: one-sided (the problem is not strictly multicriteria) or bilateral (the problem relates to the Pareto

  17. Diversity comparison of Pareto front approximations in many-objective optimization.

    Science.gov (United States)

    Li, Miqing; Yang, Shengxiang; Liu, Xiaohui

    2014-12-01

    Diversity assessment of Pareto front approximations is an important issue in the stochastic multiobjective optimization community. Most of the diversity indicators in the literature were designed to work for any number of objectives of Pareto front approximations in principle, but in practice many of these indicators are infeasible or not workable when the number of objectives is large. In this paper, we propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimization. DCI evaluates the relative quality of different Pareto front approximations rather than providing an absolute measure of distribution for a single approximation. In DCI, all the concerned approximations are put into a grid environment so that there are some hyperboxes containing one or more solutions. The proposed indicator only considers the contribution of different approximations to nonempty hyperboxes. Therefore, the computational cost does not increase exponentially with the number of objectives. In fact, the implementation of DCI is of quadratic time complexity, which is fully independent of the number of divisions used in the grid. Systematic experiments are conducted using three groups of artificial Pareto front approximations and seven groups of real Pareto front approximations with different numbers of objectives to verify the effectiveness of DCI. Moreover, a comparison with two diversity indicators widely used in many-objective optimization is made analytically and empirically. Finally, a parametric investigation reveals interesting insights into the number of divisions used in the grid and offers some suggested settings to users with different preferences.
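
    The grid-based bookkeeping behind such an indicator is easy to illustrate. The sketch below is an illustration only, not the published DCI (whose per-hyperbox contribution measure is more refined): it places several Pareto front approximations into one shared grid and reports, for each approximation, the share of nonempty hyperboxes it reaches. All function and variable names here are ours.

```python
import numpy as np

def grid_coverage(approximations, divisions=10):
    """Place all approximations into one shared grid and report, per
    approximation, the fraction of nonempty hyperboxes it occupies."""
    all_points = np.vstack(approximations)
    lo, hi = all_points.min(axis=0), all_points.max(axis=0)
    width = (hi - lo) / divisions + 1e-12           # guard against zero-width ranges
    boxes = [set(map(tuple, np.floor((a - lo) / width).astype(int)))
             for a in approximations]
    nonempty = set().union(*boxes)                  # hyperboxes occupied by any approximation
    return [len(b) / len(nonempty) for b in boxes]

# Two random 3-objective point sets, used only to exercise the function
rng = np.random.default_rng(0)
print(grid_coverage([rng.random((50, 3)), rng.random((20, 3))]))
```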

  18. The wind power prediction research based on mind evolutionary algorithm

    Science.gov (United States)

    Zhuang, Ling; Zhao, Xinjian; Ji, Tianming; Miao, Jingwen; Cui, Haina

    2018-04-01

    When wind power is connected to the power grid, its fluctuating, intermittent and random characteristics affect the stability of the power system. Wind power prediction can guarantee power quality and reduce the operating cost of the power system. Several traditional wind power prediction methods have notable limitations. On this basis, a wind power prediction method based on the Mind Evolutionary Algorithm (MEA) is put forward and a prediction model is provided. The experimental results demonstrate that MEA performs efficiently in terms of wind power prediction. The MEA method has broad prospects for engineering application.

  19. Culture belief based multi-objective hybrid differential evolutionary algorithm in short term hydrothermal scheduling

    International Nuclear Information System (INIS)

    Zhang Huifeng; Zhou Jianzhong; Zhang Yongchuan; Lu Youlin; Wang Yongqiang

    2013-01-01

    Highlights: ► Culture belief is integrated into multi-objective differential evolution. ► A chaotic sequence is imported to improve evolutionary population diversity. ► The priority of convergence rate is proved in solving the hydrothermal problem. ► The results show the quality and potential of the proposed algorithm. - Abstract: A culture belief based multi-objective hybrid differential evolution (CB-MOHDE) is presented to solve the short term hydrothermal optimal scheduling with economic emission (SHOSEE) problem. This problem is formulated to compromise between thermal cost and emissions while considering its complicated non-linear constraints with non-smooth and non-convex characteristics. The proposed algorithm integrates a modified multi-objective differential evolutionary algorithm into the computation model of the culture algorithm (CA), together with communication protocols between the population space and the belief space; three knowledge structures in the belief space are redefined according to the problem-solving characteristics, and in the differential evolution a chaotic factor is embedded into the mutation operator to avoid premature convergence by enlarging the search scale when the search trajectory reaches local optima. Furthermore, a new heuristic constraint-handling technique is utilized to handle the complex equality and inequality constraints of the SHOSEE problem. After application to a hydrothermal scheduling system, the efficiency and stability of the proposed CB-MOHDE are verified by its more desirable results in comparison to other methods established recently, and the simulation results also reveal that CB-MOHDE can be a promising alternative for solving SHOSEE.

  20. Analysis of a Pareto Mixture Distribution for Maritime Surveillance Radar

    Directory of Open Access Journals (Sweden)

    Graham V. Weinberg

    2012-01-01

    Full Text Available The Pareto distribution has been shown to be an excellent model for X-band high-resolution maritime surveillance radar clutter returns. Given the success of mixture distributions in radar, it is thus of interest to consider the effect of Pareto mixture models. This paper introduces a formulation of a Pareto intensity mixture distribution and investigates coherent multilook radar detector performance using this new clutter model. Clutter parameter estimates are derived from data sets produced by the Defence Science and Technology Organisation's Ingara maritime surveillance radar.
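
    For concreteness, a generic two-component Pareto (Lomax-type) intensity mixture can be written down in a few lines. This is only a sketch of the kind of clutter model discussed above, with made-up parameter values; it does not reproduce the paper's exact parametrisation or its estimators.

```python
import numpy as np

def pareto_mixture_pdf(x, w, alphas, betas):
    """Density of a Pareto (Lomax-type) intensity mixture:
    f(x) = sum_i w_i * a_i * b_i**a_i / (x + b_i)**(a_i + 1),  x >= 0."""
    x = np.asarray(x, dtype=float)
    return sum(w_i * a * b**a / (x + b) ** (a + 1.0)
               for w_i, a, b in zip(w, alphas, betas))

x = np.linspace(0.0, 10.0, 5)
print(pareto_mixture_pdf(x, w=[0.7, 0.3], alphas=[3.0, 1.5], betas=[1.0, 2.0]))
```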

  1. A Problem-Reduction Evolutionary Algorithm for Solving the Capacitated Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Wanfeng Liu

    2015-01-01

    Full Text Available Assessment of the components of a solution helps provide useful information for an optimization problem. This paper presents a new population-based problem-reduction evolutionary algorithm (PREA) based on the assessment of solution components. An individual solution is regarded as being constructed from basic elements, and the concept of acceptability is introduced to evaluate them. The PREA consists of a searching phase and an evaluation phase. The acceptability of basic elements is calculated in the evaluation phase and passed to the searching phase. In the searching phase, for each individual solution, the original optimization problem is reduced to a new smaller-size problem. With the evolution of the algorithm, the number of common basic elements in the population increases until all individual solutions are exactly the same, which is taken to be the near-optimal solution of the optimization problem. The new algorithm is applied to a large variety of capacitated vehicle routing problems (CVRP) with up to nearly 500 customers. Experimental results show that the proposed algorithm has the advantages of fast convergence and robustness in solution quality over the comparative algorithms.

  2. The exponential age distribution and the Pareto firm size distribution

    OpenAIRE

    Coad, Alex

    2008-01-01

    Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firm’s age, to obtain the empirical Pareto distribution.
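
    The mechanism is simple enough to check numerically. The Monte Carlo sketch below uses illustrative parameter values of our own choosing: it draws exponentially distributed firm ages, lets log-size follow a Gibrat-type random walk over that age, and estimates the resulting tail exponent with a crude Hill estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
n_firms = 100_000
ages = rng.exponential(scale=20.0, size=n_firms)        # exponentially distributed firm ages
drift, vol = 0.02, 0.25                                  # assumed Gibrat growth parameters
log_size = drift * ages + vol * np.sqrt(ages) * rng.standard_normal(n_firms)
sizes = np.exp(log_size)                                 # firm sizes after Gibrat-type growth

# Crude Hill estimate of the Pareto tail exponent from the largest 1% of firms
tail = np.sort(sizes)[-n_firms // 100:]
alpha_hat = 1.0 / np.mean(np.log(tail / tail[0]))
print(f"estimated Pareto tail exponent ~ {alpha_hat:.2f}")
```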

  3. Synthesizing multi-objective H2/H-infinity dynamic controller using evolutionary algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Langballe, A.S.; Wisniewski, Rafal

    This paper covers the design of an Evolutionary Algorithm (EA) that should be able to synthesize a mixed H2/H-infinity controller. It will be shown how a system can be expressed as Matrix Inequalities (MI), and these will then be used in the design of the EA. The main objective is to examine whether a mixed H2/H-infinity controller is feasible, and if so, how the optimal mixed controller might be found.

  4. Efficient fractal-based mutation in evolutionary algorithms from iterated function systems

    Science.gov (United States)

    Salcedo-Sanz, S.; Aybar-Ruíz, A.; Camacho-Gómez, C.; Pereira, E.

    2018-03-01

    In this paper we present a new mutation procedure for Evolutionary Programming (EP) approaches, based on Iterated Function Systems (IFSs). The new mutation procedure proposed consists of considering a set of IFSs which are able to generate fractal structures in a two-dimensional phase space, and using them to modify a current individual of the EP algorithm, instead of using random numbers from different probability density functions. We test this new proposal on a set of benchmark functions for continuous optimization problems. In this case, we compare the proposed mutation against classical Evolutionary Programming approaches, with mutations based on Gaussian, Cauchy and chaotic maps. We also include a discussion of the IFS-based mutation in a real application of Tuned Mass Damper (TMD) location and optimization for vibration cancellation in buildings. In both practical cases, the proposed EP with the IFS-based mutation obtained extremely competitive results compared to alternative classical mutation operators.

  5. Tsallis-Pareto like distributions in hadron-hadron collisions

    International Nuclear Information System (INIS)

    Barnafoeldi, G G; Uermoessy, K; Biro, T S

    2011-01-01

    Non-extensive thermodynamics is a novel approach in high energy physics. In high-energy heavy-ion, and especially in proton-proton collisions, we are far from a canonical thermal state described by the Boltzmann-Gibbs statistics. In these reactions low and intermediate transverse momentum spectra are extremely well reproduced by the Tsallis-Pareto distribution, but the physical origin of the Tsallis parameters is still an unsettled question. Here, we analyze whether the Tsallis-Pareto energy distribution overlaps with hadron spectra at high pT. We fitted data measured in proton-proton (proton-antiproton) collisions over a wide center-of-mass energy range, from 200 GeV RHIC up to 7 TeV LHC energies. Furthermore, our test is extended to an investigation of a possible √s-dependence of the power in the Tsallis-Pareto distribution, motivated by QCD evolution equations. We found that Tsallis-Pareto distributions fit the high-pT data well over the wide center-of-mass energy range. Deviations from the fits appear at pT > 20-30 GeV/c, especially in the CDF data. Introducing a pT-scaling ansatz, the fits at low and intermediate transverse momenta still remain good, and the deviations tend to disappear at the highest-pT data.
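
    One common Tsallis-Pareto parametrisation of the transverse-momentum spectrum, and a least-squares fit of it in log space, can be sketched as follows. The pT values and yields below are synthetic stand-ins, not the RHIC/LHC data analysed in the record, and this parametrisation is only one of several used in the literature.

```python
import numpy as np
from scipy.optimize import curve_fit

def tsallis_pareto(pT, A, q, T):
    """dN/dpT ~ A * pT * (1 + (q - 1) * pT / T) ** (-1 / (q - 1))."""
    return A * pT * (1.0 + (q - 1.0) * pT / T) ** (-1.0 / (q - 1.0))

def log_spectrum(pT, logA, q, T):
    # Fit in log space so the steeply falling tail still carries weight.
    return logA + np.log(pT) - np.log1p((q - 1.0) * pT / T) / (q - 1.0)

rng = np.random.default_rng(2)
pT = np.linspace(0.5, 10.0, 20)                                    # GeV/c, synthetic grid
yields = tsallis_pareto(pT, 100.0, 1.10, 0.15) * rng.normal(1.0, 0.05, pT.size)

popt, _ = curve_fit(log_spectrum, pT, np.log(yields), p0=(np.log(50.0), 1.05, 0.2))
print("fitted (A, q, T):", np.exp(popt[0]), popt[1], popt[2])
```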

  6. Pareto vs Simmel: residui ed emozioni

    Directory of Open Access Journals (Sweden)

    Silvia Fornari

    2017-08-01

    Full Text Available One hundred years after the publication of the Trattato di sociologia generale (Pareto 1988), we keep the study of Pareto alive and current through a contemporary rereading of his thought. Remembered by economists for his great intellectual versatility, he remains the rigorous and analytical scientist whose contributions are still discussed at the international level. We analyse the aspects that led him to approach sociology, with the introduction of his well-known distinction of social action: logical and non-logical. This dichotomy was used to account for social changes concerning the modes of action of men and women. As is well known, logical actions are those concerning behaviour driven by logic and reasoning, in which there is a direct cause-effect relationship; such actions are the object of study of economists and are not dealt with by sociologists. Non-logical actions cover all the types of human action that fall within the scope of the social sciences and represent the largest part of social action. They are actions guided by feelings, emotions, superstition, etc., illustrated by Pareto in the Trattato di sociologia generale and in later essays, where he also takes up the concept of the heterogenesis of ends, first formulated by Giambattista Vico. According to this concept, human history, while retaining in potential the realization of certain ends, is not linear, and along its evolutionary path it may happen that man, in attempting to reach one end, arrives at opposite conclusions. Pareto links the Neapolitan philosopher's definition to the types of social action and their distinction (logical, non-logical). For Pareto, the heterogenesis of ends is thus the outcome of a particular type of non-logical action of the human being and of the collectivity.

  7. Multiobjective Optimal Algorithm for Automatic Calibration of Daily Streamflow Forecasting Model

    Directory of Open Access Journals (Sweden)

    Yi Liu

    2016-01-01

    Full Text Available A single objective function cannot describe the characteristics of a complicated hydrologic system. Consequently, it stands to reason that multiobjective functions are needed for the calibration of a hydrologic model. Multiobjective algorithms based on the theory of nondominance are employed to solve this multiobjective optimal problem. In this paper, a novel multiobjective optimization method based on differential evolution with adaptive Cauchy mutation and chaos searching (MODE-CMCS) is proposed to optimize the daily streamflow forecasting model. Besides, to enhance the diversity performance of the Pareto solutions, a more precise crowding distance assigner is presented in this paper. Furthermore, the traditional generalized spread metric (SP) is sensitive to the size of the Pareto set. A novel diversity performance metric, which is independent of the Pareto set size, is put forward in this research. The efficacy of the new algorithm MODE-CMCS is compared with the nondominated sorting genetic algorithm II (NSGA-II) on a daily streamflow forecasting model based on a support vector machine (SVM). The results verify that the performance of MODE-CMCS is superior to NSGA-II for automatic calibration of the hydrologic model.
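
    As a point of reference for the "more precise" diversity assigner mentioned above, the standard NSGA-II crowding distance (which the paper refines, and which is not the paper's own assigner) can be computed as follows.

```python
import numpy as np

def crowding_distance(front):
    """NSGA-II-style crowding distance for a set of mutually nondominated
    objective vectors (rows of `front`); boundary points get infinite distance."""
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf       # keep boundary points
        span = front[order[-1], j] - front[order[0], j]
        if span == 0:
            continue
        dist[order[1:-1]] += (front[order[2:], j] - front[order[:-2], j]) / span
    return dist

print(crowding_distance(np.array([[1.0, 5.0], [2.0, 3.0], [4.0, 1.0]])))
```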

  8. Pareto-Optimal Multi-objective Inversion of Geophysical Data

    Science.gov (United States)

    Schnaidt, Sebastian; Conway, Dennis; Krieger, Lars; Heinson, Graham

    2018-01-01

    In the process of modelling geophysical properties, jointly inverting different data sets can greatly improve model results, provided that the data sets are compatible, i.e., sensitive to similar features. Such a joint inversion requires a relationship between the different data sets, which can either be analytic or structural. Classically, the joint problem is expressed as a scalar objective function that combines the misfit functions of multiple data sets and a joint term which accounts for the assumed connection between the data sets. This approach suffers from two major disadvantages: first, it can be difficult to assess the compatibility of the data sets and second, the aggregation of misfit terms introduces a weighting of the data sets. We present a pareto-optimal multi-objective joint inversion approach based on an existing genetic algorithm. The algorithm treats each data set as a separate objective, avoiding forced weighting and generating curves of the trade-off between the different objectives. These curves are analysed by their shape and evolution to evaluate data set compatibility. Furthermore, the statistical analysis of the generated solution population provides valuable estimates of model uncertainty.

  9. Pareto Improving Price Regulation when the Asset Market is Incomplete

    NARCIS (Netherlands)

    Herings, P.J.J.; Polemarchakis, H.M.

    1999-01-01

    When the asset market is incomplete, competitive equilibria are constrained suboptimal, which provides a scope for Pareto improving interventions. Price regulation can be such a Pareto improving policy, even when the welfare effects of rationing are taken into account. An appealing aspect of price

  10. On the Runtime of Randomized Local Search and Simple Evolutionary Algorithms for Dynamic Makespan Scheduling

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2015-01-01

    This work studies a classical combinatorial optimization problem, namely makespan scheduling, in a dynamic setting. We study the model of a strong adversary which is allowed to change one job at regular intervals. Furthermore, we investigate the setting of random changes. Our results show that randomized local search and a simple evolutionary algorithm are very...

  11. Analysis of Ant Colony Optimization and Population-Based Evolutionary Algorithms on Dynamic Problems

    DEFF Research Database (Denmark)

    Lissovoi, Andrei

    This thesis presents new running time analyses of nature-inspired algorithms on various dynamic problems. It aims to identify and analyse the features of algorithms and problem classes which allow efficient optimization to occur in the presence of dynamic behaviour. We consider the following settings. λ-MMAS on Dynamic Shortest Path Problems: we investigate how increasing the number of ants simulated per iteration may help an ACO algorithm to track the optimum in a dynamic problem. It is shown that while a constant number of ants per vertex is sufficient to track some oscillations, there also ... the dynamic optimum for finite alphabets up to size μ, while MMAS is able to do so for any finite alphabet size. Parallel Evolutionary Algorithms on Maze: we prove that while a (1 + λ) EA is unable to track the optimum of the dynamic fitness function Maze for offspring population sizes up to λ = O(n^(1-ε)) ...

  12. Pareto optimization of an industrial ecosystem: sustainability maximization

    Directory of Open Access Journals (Sweden)

    J. G. M.-S. Monteiro

    2010-09-01

    Full Text Available This work investigates a procedure to design an Industrial Ecosystem for sequestering CO2 and consuming glycerol in a Chemical Complex with 15 integrated processes. The Complex is responsible for the production of methanol, ethylene oxide, ammonia, urea, dimethyl carbonate, ethylene glycol, glycerol carbonate, β-carotene, 1,2-propanediol and olefins, and is simulated using UNISIM Design (Honeywell). The process environmental impact (EI) is calculated using the Waste Reduction Algorithm, while Profit (P) is estimated using classic cost correlations. MATLAB (The MathWorks Inc.) is connected to UNISIM to enable optimization. The objective is to attain maximum process sustainability, which involves finding a compromise between high profitability and low environmental impact. Sustainability maximization is therefore understood as a multi-criteria optimization problem, addressed by means of the Pareto optimization methodology for trading off P vs. EI.

  13. Finding a pareto-optimal solution for multi-region models subject to capital trade and spillover externalities

    Energy Technology Data Exchange (ETDEWEB)

    Leimbach, Marian [Potsdam-Institut fuer Klimafolgenforschung e.V., Potsdam (Germany); Eisenack, Klaus [Oldenburg Univ. (Germany). Dept. of Economics and Statistics

    2008-11-15

    In this paper we present an algorithm that deals with trade interactions within a multi-region model. In contrast to traditional approaches this algorithm is able to handle spillover externalities. Technological spillovers are expected to foster the diffusion of new technologies, which helps to lower the cost of climate change mitigation. We focus on technological spillovers which are due to capital trade. The algorithm of finding a pareto-optimal solution in an intertemporal framework is embedded in a decomposed optimization process. The paper analyzes convergence and equilibrium properties of this algorithm. In the final part of the paper, we apply the algorithm to investigate possible impacts of technological spillovers. While benefits of technological spillovers are significant for the capital-importing region, benefits for the capital-exporting region depend on the type of regional disparities and the resulting specialization and terms-of-trade effects. (orig.)

  14. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    International Nuclear Information System (INIS)

    Vianna Neto, Julio Xavier; Andrade Bernert, Diego Luis de; Santos Coelho, Leandro dos

    2011-01-01

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.

  15. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier, E-mail: julio.neto@onda.com.b [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Andrade Bernert, Diego Luis de, E-mail: dbernert@gmail.co [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Santos Coelho, Leandro dos, E-mail: leandro.coelho@pucpr.b [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.

  16. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Julio Xavier Vianna [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature. (author)

  17. A multi-objective improved teaching-learning based optimization algorithm for unconstrained and constrained optimization problems

    Directory of Open Access Journals (Sweden)

    R. Venkata Rao

    2014-01-01

    Full Text Available The present work proposes a multi-objective improved teaching-learning based optimization (MO-ITLBO) algorithm for unconstrained and constrained multi-objective function optimization. The MO-ITLBO algorithm is an improved version of the basic teaching-learning based optimization (TLBO) algorithm adapted for multi-objective problems. The basic TLBO algorithm is improved to enhance its exploration and exploitation capacities by introducing the concept of number of teachers, an adaptive teaching factor, tutorial training and self-motivated learning. The MO-ITLBO algorithm uses a grid-based approach to adaptively assess the non-dominated solutions (i.e., the Pareto front) maintained in an external archive. The performance of the MO-ITLBO algorithm is assessed by implementing it on unconstrained and constrained test problems proposed for the Congress on Evolutionary Computation 2009 (CEC 2009) competition. The performance assessment is done by using the inverted generational distance (IGD) measure. The IGD measures obtained by using the MO-ITLBO algorithm are compared with the IGD measures of the other state-of-the-art algorithms available in the literature. Finally, lexicographic ordering is used to assess the overall performance of the competitive algorithms. Results show that the proposed MO-ITLBO algorithm obtained the 1st rank in the optimization of unconstrained test functions and the 3rd rank in the optimization of constrained test functions.
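
    The inverted generational distance (IGD) used for the performance assessment is straightforward to compute; a minimal sketch with toy fronts (ours, not the CEC 2009 data) is given below.

```python
import numpy as np

def igd(reference_front, approximation):
    """Inverted generational distance: mean distance from each point of the
    reference (true) Pareto front to its nearest point in the approximation;
    lower is better."""
    d = np.linalg.norm(reference_front[:, None, :] - approximation[None, :, :], axis=2)
    return d.min(axis=1).mean()

ref = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
approx = np.array([[0.1, 0.95], [0.55, 0.52], [1.05, 0.05]])
print(igd(ref, approx))
```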

  18. A new hybrid evolutionary algorithm based on new fuzzy adaptive PSO and NM algorithms for Distribution Feeder Reconfiguration

    International Nuclear Information System (INIS)

    Niknam, Taher; Azadfarsani, Ehsan; Jabbari, Masoud

    2012-01-01

    Highlights: ► Network reconfiguration is a very important way to save electrical energy. ► This paper proposes a new algorithm to solve the DFR problem. ► The algorithm combines NFAPSO with NM. ► The proposed algorithm is tested on two distribution test feeders. - Abstract: Network reconfiguration for loss reduction in distribution systems is a very important way to save electrical energy. This paper proposes a new hybrid evolutionary algorithm to solve the Distribution Feeder Reconfiguration (DFR) problem. The algorithm is based on the combination of a new Fuzzy Adaptive Particle Swarm Optimization (NFAPSO) and the Nelder–Mead simplex search method (NM), called NFAPSO–NM. In the proposed algorithm, the new fuzzy adaptive particle swarm optimization consists of two parts. The first part is Fuzzy Adaptive Binary Particle Swarm Optimization (FABPSO), which determines the status of the tie switches (open or closed), and the second part is Fuzzy Adaptive Discrete Particle Swarm Optimization (FADPSO), which determines the sectionalizing switch numbers. On the other hand, because the results of the binary PSO (BPSO) and discrete PSO (DPSO) algorithms depend highly on the values of their parameters, such as the inertia weight and learning factors, a fuzzy system is employed to adaptively adjust these parameters during the search process. Moreover, the Nelder–Mead simplex search method is combined with the NFAPSO algorithm to improve its performance. Finally, the proposed algorithm is tested on two distribution test feeders. The simulation results show that the proposed method is very powerful and guarantees obtaining the global optimum.

  19. Creating ensembles of oblique decision trees with evolutionary algorithms and sampling

    Science.gov (United States)

    Cantu-Paz, Erick [Oakland, CA; Kamath, Chandrika [Tracy, CA

    2006-06-13

    A decision tree system that is part of a parallel object-oriented pattern recognition system, which in turn is part of an object oriented data mining system. A decision tree process includes the step of reading the data. If necessary, the data is sorted. A potential split of the data is evaluated according to some criterion. An initial split of the data is determined. The final split of the data is determined using evolutionary algorithms and statistical sampling techniques. The data is split. Multiple decision trees are combined in ensembles.

  20. Scheduling for the National Hockey League Using a Multi-objective Evolutionary Algorithm

    Science.gov (United States)

    Craig, Sam; While, Lyndon; Barone, Luigi

    We describe a multi-objective evolutionary algorithm that derives schedules for the National Hockey League according to three objectives: minimising the teams' total travel, promoting equity in rest time between games, and minimising long streaks of home or away games. Experiments show that the system is able to derive schedules that beat the 2008-9 NHL schedule in all objectives simultaneously, and that it returns a set of schedules that offer a range of trade-offs across the objectives.

  1. On the size distribution of cities: an economic interpretation of the Pareto coefficient.

    Science.gov (United States)

    Suh, S H

    1987-01-01

    "Both the hierarchy and the stochastic models of size distribution of cities are analyzed in order to explain the Pareto coefficient by economic variables. In hierarchy models, it is found that the rate of variation in the productivity of cities and that in the probability of emergence of cities can explain the Pareto coefficient. In stochastic models, the productivity of cities is found to explain the Pareto coefficient. New city-size distribution functions, in which the Pareto coefficient is decomposed by economic variables, are estimated." excerpt

  2. An extension of the directed search domain algorithm to bilevel optimization

    Science.gov (United States)

    Wang, Kaiqiang; Utyuzhnikov, Sergey V.

    2017-08-01

    A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.

  3. CCS Site Optimization by Applying a Multi-objective Evolutionary Algorithm to Semi-Analytical Leakage Models

    Science.gov (United States)

    Cody, B. M.; Gonzalez-Nicolas, A.; Bau, D. A.

    2011-12-01

    Carbon capture and storage (CCS) has been proposed as a method of reducing global carbon dioxide (CO2) emissions. Although CCS has the potential to greatly retard greenhouse gas loading to the atmosphere while cleaner, more sustainable energy solutions are developed, there is a possibility that sequestered CO2 may leak and intrude into and adversely affect groundwater resources. It has been reported [1] that, while CO2 intrusion typically does not directly threaten underground drinking water resources, it may cause secondary effects, such as the mobilization of hazardous inorganic constituents present in aquifer minerals and changes in pH values. These risks must be fully understood and minimized before CCS project implementation. Combined management of project resources and leakage risk is crucial for the implementation of CCS. In this work, we present a method of: (a) minimizing the total CCS cost, the summation of major project costs with the cost associated with CO2 leakage; and (b) maximizing the mass of injected CO2, for a given proposed sequestration site. Optimization decision variables include the number of CO2 injection wells, injection rates, and injection well locations. The capital and operational costs of injection wells are directly related to injection well depth, location, injection flow rate, and injection duration. The cost of leakage is directly related to the mass of CO2 leaked through weak areas, such as abandoned oil wells, in the cap rock layers overlying the injected formation. Additional constraints on fluid overpressure caused by CO2 injection are imposed to maintain predefined effective stress levels that prevent cap rock fracturing. Here, both mass leakage and fluid overpressure are estimated using two semi-analytical models based upon work by [2,3]. A multi-objective evolutionary algorithm coupled with these semi-analytical leakage flow models is used to determine Pareto-optimal trade-off sets giving minimum total cost vs. maximum mass

  4. Combining evolutionary algorithms with oblique decision trees to detect bent-double galaxies

    Science.gov (United States)

    Cantu-Paz, Erick; Kamath, Chandrika

    2000-10-01

    Decision trees have long been popular in classification as they use simple and easy-to-understand tests at each node. Most variants of decision trees test a single attribute at a node, leading to axis-parallel trees, where the test results in a hyperplane which is parallel to one of the dimensions in the attribute space. These trees can be rather large and inaccurate in cases where the concept to be learned is best approximated by oblique hyperplanes. In such cases, it may be more appropriate to use an oblique decision tree, where the decision at each node is a linear combination of the attributes. Oblique decision trees have not gained wide popularity in part due to the complexity of constructing good oblique splits and the tendency of existing splitting algorithms to get stuck in local minima. Several alternatives have been proposed to handle these problems, including randomization in conjunction with deterministic hill-climbing and the use of simulated annealing. In this paper, we use evolutionary algorithms (EAs) to determine the split. EAs are well suited for this problem because of their global search properties, their tolerance to noisy fitness evaluations, and their scalability to large dimensional search spaces. We demonstrate our technique on a synthetic data set, and then we apply it to a practical problem from astronomy, namely, the classification of galaxies with a bent-double morphology. In addition, we describe our experiences with several split evaluation criteria. Our results suggest that, in some cases, the evolutionary approach is faster and more accurate than existing oblique decision tree algorithms. However, for our astronomical data, the accuracy is not significantly different than the axis-parallel trees.

  5. Pareto-path multitask multiple kernel learning.

    Science.gov (United States)

    Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2015-01-01

    A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.

  6. Performance-based Pareto optimal design

    NARCIS (Netherlands)

    Sariyildiz, I.S.; Bittermann, M.S.; Ciftcioglu, O.

    2008-01-01

    A novel approach for performance-based design is presented, where Pareto optimality is pursued. Design requirements may contain linguistic information, which is difficult to bring into computation or make consistent their impartial estimations from case to case. Fuzzy logic and soft computing are

  7. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    Energy Technology Data Exchange (ETDEWEB)

    Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Fraunhofer Platz 1, 67663 Kaiserslautern (Germany); Thieke, C, E-mail: katrin.teichert@itwm.fhg.de [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.

  8. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning.

    Science.gov (United States)

    Teichert, K; Süss, P; Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2011-06-21

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g., photons versus protons) than with the classical method of comparing single treatment plans.

  9. Comparative analysis of Pareto surfaces in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2011-01-01

    In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.
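
    A minimal version of the dominance bookkeeping used when comparing two such Pareto sets, identifying which points of each set are dominated by the competing configuration, might look as follows. The toy fronts are ours; the paper additionally plots distances between the sets in objective space, which is not reproduced here.

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return np.all(a <= b) and np.any(a < b)

def cross_domination(front_A, front_B):
    """For each point of A, is it dominated by some point of B, and vice versa?"""
    A_dom = np.array([any(dominates(b, a) for b in front_B) for a in front_A])
    B_dom = np.array([any(dominates(a, b) for a in front_A) for b in front_B])
    return A_dom, B_dom

A = np.array([[1.0, 4.0], [2.0, 2.0], [4.0, 1.0]])   # e.g. seven-beam configuration
B = np.array([[1.5, 3.5], [1.8, 1.8]])               # e.g. nine-beam configuration
print(cross_domination(A, B))
```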

  10. Computing the Distribution of Pareto Sums Using Laplace Transformation and Stehfest Inversion

    Science.gov (United States)

    Harris, C. K.; Bourne, S. J.

    2017-05-01

    that is shared by the sum of an arbitrary number of such variables. The technique involves applying the Laplace transform to the normalized sum (which is simply the product of the Laplace transforms of the densities of the individual variables, with a suitable scaling of the Laplace variable), and then inverting it numerically using the Gaver-Stehfest algorithm. After validating the method using a number of test cases, it was applied to address the distribution of total seismic moment, and the quantiles computed for various numbers of seismic events were compared with those obtained in the literature using Monte Carlo simulation. Excellent agreement was obtained. As an application, the method was applied to the evolution of total seismic moment released by tremors due to gas production in the Groningen gas field in the northeastern Netherlands. The speed, accuracy and ease of implementation of the method allows the development of accurate correlations for constraining statistical seismological models using, for example, the maximum-likelihood method. It should also be of value in other natural processes governed by Pareto distributions with exponent less than unity.
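
    The Gaver-Stehfest inversion at the heart of this approach is compact enough to sketch. Below it is validated on a textbook transform pair rather than on the Laplace transform of a Pareto density (which the paper derives), so this is a sketch of the numerical inversion step only.

```python
import numpy as np
from math import factorial, log

def stehfest_weights(N=12):
    """Gaver-Stehfest weights V_k for an even number of terms N."""
    V = np.zeros(N)
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * factorial(2 * j)
                  / (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                     * factorial(k - j) * factorial(2 * j - k)))
        V[k - 1] = (-1) ** (k + N // 2) * s
    return V

def stehfest_invert(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) by Gaver-Stehfest."""
    V = stehfest_weights(N)
    k = np.arange(1, N + 1)
    return log(2) / t * np.sum(V * F(k * log(2) / t))

# Sanity check on a known pair: F(s) = 1/(s+1)  <->  f(t) = exp(-t)
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), 2.0), np.exp(-2.0))
```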

  11. Models for Evolutionary Algorithms and Their Applications in System Identification and Control Optimization

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    Evolutionary algorithms (EAs) simulate an evolutionary process where the goal is to evolve solutions by means of crossover, mutation, and selection based on their quality (fitness) with respect to the optimization problem at hand. EAs are highly relevant for industrial applications, because they are capable of handling problems with non-linear constraints, multiple objectives, and dynamic components – properties that frequently appear in real-world problems. This thesis presents research in three fundamental areas of EC: fitness function design, methods for parameter control, and techniques for multimodal optimization ... a population and many generations, which essentially turns the problem into a series of related static problems. To our surprise, the control problem could easily be solved when optimized like this. To further examine this, we compared the EA with a particle swarm and a local search approach, which we ...

  12. Level Diagrams analysis of Pareto Front for multiobjective system redundancy allocation

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2011-01-01

    Reliability-based and risk-informed design, operation, maintenance and regulation lead to multiobjective (multicriteria) optimization problems. In this context, the Pareto Front and Set found in a multiobjective optimality search provide a family of solutions among which the decision maker has to look for the best choice according to his or her preferences. Efficient visualization techniques for Pareto Front and Set analyses are needed for helping decision makers in the selection task. In this paper, we consider the multiobjective optimization of system redundancy allocation and use the recently introduced Level Diagrams technique for graphically representing the resulting Pareto Front and Set. Each objective and decision variable is represented on separate diagrams where the points of the Pareto Front and Set are positioned according to their proximity to ideally optimal points, as measured by a metric of normalized objective values. All diagrams are synchronized across all objectives and decision variables. On the basis of the analysis of the Level Diagrams, we introduce a procedure for reducing the number of solutions in the Pareto Front; from the reduced set of solutions, the decision maker can more easily identify his or her preferred solution.
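
    The y-axis value that synchronises all Level Diagrams, namely the distance of each normalised objective vector to the ideal point, can be sketched as below. This is a simplified illustration with a toy front; the published technique also builds one diagram per decision variable and supports several norms.

```python
import numpy as np

def level_diagram_norms(front, p=2):
    """Normalise each objective to [0, 1] over the Pareto front and score every
    point by the p-norm of its normalised objective vector (distance to the
    ideal point); all per-objective diagrams share these values on the y-axis."""
    f_min, f_max = front.min(axis=0), front.max(axis=0)
    normed = (front - f_min) / np.where(f_max > f_min, f_max - f_min, 1.0)
    return np.linalg.norm(normed, ord=p, axis=1)

front = np.array([[0.1, 0.9], [0.4, 0.5], [0.9, 0.1]])
print(level_diagram_norms(front))   # plot each objective (or variable) against these values
```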

  13. An Improved Multi-Objective Artificial Bee Colony Optimization Algorithm with Regulation Operators

    Directory of Open Access Journals (Sweden)

    Jiuyuan Huo

    2017-02-01

    Full Text Available To achieve effective and accurate optimization for multi-objective optimization problems, a multi-objective artificial bee colony algorithm with regulation operators (RMOABC), inspired by the intelligent foraging behavior of honey bees, is proposed in this paper. The proposed algorithm utilizes the Pareto dominance theory and takes advantage of adaptive grid and regulation operator mechanisms. The adaptive grid technique is used to adaptively assess the Pareto front maintained in an external archive, and the regulation operator is used to balance the weights of the local search and the global search in the evolution of the algorithm. The performance of RMOABC was evaluated in comparison with other nature inspired algorithms, including NSGA-II and MOEA/D. The experimental results demonstrate that the RMOABC approach has better accuracy and minimal execution time.

  14. Comparing multi-objective non-evolutionary NLPQL and evolutionary genetic algorithm optimization of a DI diesel engine: DoE estimation and creating surrogate model

    International Nuclear Information System (INIS)

    Navid, Ali; Khalilarya, Shahram; Taghavifar, Hadi

    2016-01-01

    Highlights: • The NLPQL algorithm with Latin hypercube sampling and a multi-objective GA were applied to the engine. • NLPQL converges to the best solution at Run ID 41; MOGA finds it at Run ID 84. • A deeper, more encircled design gives the lowest NOx; a greater radius and deeper bowl give the highest IMEP. • The maximum IMEP and minimum ISFC were obtained with NLPQL, the lowest NOx with MOGA. - Abstract: This study is concerned with the application of two major kinds of optimization algorithms, evolutionary and non-evolutionary, to a baseline diesel engine. The multi-objective genetic algorithm and the non-linear programming by quadratic Lagrangian (NLPQL) method work in completely different ways in optimizing and finding the global optimal design. The design variables are injection angle, half spray cone angle, inner distance of the bowl wall, and bowl radius, while the objectives include NOx emission, spray droplet diameter, indicated mean effective pressure (IMEP), and indicated specific fuel consumption (ISFC). Restrictions were set on the objectives to distinguish feasible from infeasible designs and to sort out cases that cannot fulfill the demands of diesel engine designers and emission control measures. It is found that a design with a deeper bowl and a more encircled shape (higher swirl motion) is more suitable for NOx emission control, whereas designs with a bigger bowl radius and a closer inner wall distance of the bowl (Di) may lead to higher engine efficiency indices. Moreover, it was revealed that NLPQL could rapidly find the best design at Run ID 41, compared to the genetic algorithm, which is able to find the global optimum only in the last runs (ID 84). Both techniques produce almost the same geometrical shape of the combustion chamber, with a negligible contrast in the injection system.

  15. The Research of Disease Spots Extraction Based on Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Kangshun Li

    2017-01-01

    Full Text Available According to the characteristics of maize disease spots in images, this paper designs a two-dimensional histogram segmentation method based on an evolutionary algorithm. Combined with the analysis of images of maize diseases and insect pests, and taking full account of the color and texture characteristics of the lesions, the chroma and gray images are composed into two-tuples to build a two-dimensional histogram. This solves the problem that a one-dimensional histogram cannot clearly separate the bimodal distribution of target and background, and improves the application of the traditional two-dimensional histogram to pest damage lesion extraction. A chromosome coding suited to the characteristics of lesion images is designed, based on a second segmentation with the genetic algorithm and the Otsu criterion. The initial population is determined from the analysis of the lesion image, and parallel selection, an optimal preservation strategy, and an adaptive mutation operator are used to improve search efficiency. Finally, by setting a fluctuation threshold, the search for the best threshold continues within the fluctuation range, implementing both global and local search.
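
    A rough sketch of this kind of evolutionary two-dimensional threshold search, using a toy (gray, chroma) histogram and a simplified Otsu-style between-class criterion as fitness; the histogram data, the GA settings and the fitness definition are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 2-D histogram over (gray level, chroma level), 64x64 bins.
hist = rng.random((64, 64)); hist /= hist.sum()

def fitness(thresholds):
    """Simplified between-class criterion for a candidate (gray, chroma) threshold pair."""
    s, t = thresholds
    background = hist[:s, :t]; target = hist[s:, t:]
    w0, w1 = background.sum(), target.sum()
    if w0 < 1e-9 or w1 < 1e-9:
        return 0.0
    gi, ci = np.indices(hist.shape)
    mu0 = np.array([(gi[:s, :t] * background).sum(), (ci[:s, :t] * background).sum()]) / w0
    mu1 = np.array([(gi[s:, t:] * target).sum(), (ci[s:, t:] * target).sum()]) / w1
    mu = np.array([(gi * hist).sum(), (ci * hist).sum()])
    return w0 * np.sum((mu0 - mu) ** 2) + w1 * np.sum((mu1 - mu) ** 2)

# Tiny genetic loop: population of (s, t) pairs, elitist selection, random mutation.
pop = rng.integers(1, 63, size=(20, 2))
for _ in range(50):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-5:]]                      # keep the best 5
    children = elite[rng.integers(0, 5, 15)] + rng.integers(-3, 4, size=(15, 2))
    pop = np.vstack([elite, np.clip(children, 1, 63)])
best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best (gray, chroma) threshold pair:", best)
```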

  16. Regular Network Class Features Enhancement Using an Evolutionary Synthesis Algorithm

    Directory of Open Access Journals (Sweden)

    O. G. Monahov

    2014-01-01

    Full Text Available This paper investigates a solution of the optimization problem concerning the construction of diameter-optimal regular networks (graphs). Regular networks are of practical interest as graph-theoretical models of reliable communication networks in parallel supercomputer systems and as a structural basis for small-world models in optical and neural networks. It presents a new class of parametrically described regular networks - hypercirculant networks (graphs). An approach that uses evolutionary algorithms for the automatic generation of parametric descriptions of optimal hypercirculant networks is developed. The synthesis of optimal hypercirculant networks is based on optimal circulant networks with a smaller node degree: a template circulant network is taken from the known optimal families of circulant networks with the desired number of nodes and a smaller node degree. Thus, the generating set of the circulant network is used as a generating subset of the hypercirculant network, and the missing generators are synthesized by means of the evolutionary algorithm, which minimizes the diameter (average diameter) of the networks. A comparative analysis of the structural characteristics of hypercirculant, toroidal, and circulant networks is conducted. The advantage of hypercirculant networks in structural characteristics such as diameter, average diameter, and bisection width, at comparable costs in the number of nodes and the number of connections, is demonstrated. Noteworthy is the advantage of hypercirculant networks of dimension three over higher-dimensional tori: the optimization of hypercirculant networks of dimension three is more efficient than the introduction of an additional dimension in the corresponding toroidal structures. The paper also notes the better structural parameters of hypercirculant networks in comparison with iBT-networks previously

  17. Multiobjective Optimization Involving Quadratic Functions

    Directory of Open Access Journals (Sweden)

    Oscar Brito Augusto

    2014-01-01

    Full Text Available Multiobjective optimization is nowadays the order of the day in engineering projects. Although the idea involved is simple, implementing a procedure to solve a general problem is not an easy task. Evolutionary algorithms are widespread as a satisfactory technique for finding a candidate solution set. Usually they supply a discrete picture of the Pareto front even when this front is continuous. In this paper we propose three methods for solving unconstrained multiobjective optimization problems involving quadratic functions. In the first, for biobjective optimization defined in the bidimensional space, a continuous Pareto set is found analytically. In the second, applicable to multiobjective optimization, a condition test is proposed to check whether a point in the decision space is a Pareto optimum or not, and in the third, with functions defined in n-dimensional space, a direct noniterative algorithm is proposed to find the Pareto set. Simple problems highlight the suitability of the proposed methods.
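
    The second method (the condition test) can be illustrated by the standard first-order Pareto-criticality check for smooth objectives: a point is a candidate Pareto optimum when no single direction decreases every objective at once. The sketch below, based on a small linear program, is an assumed generic formulation and not necessarily the exact test proposed in the paper:

```python
import numpy as np
from scipy.optimize import linprog

def pareto_critical(gradients, tol=1e-9):
    """First-order check: no direction decreases every objective simultaneously.

    gradients: (m, n) array, one objective gradient per row, evaluated at the tested point.
    Returns True if the point is Pareto-critical (a necessary optimality condition).
    """
    m, n = gradients.shape
    # Variables: direction d (n entries) and margin sigma; maximize sigma subject to
    # grad_i . d <= -sigma for every objective i, with d bounded to keep the LP finite.
    c = np.zeros(n + 1); c[-1] = -1.0
    A_ub = np.hstack([gradients, np.ones((m, 1))])
    b_ub = np.zeros(m)
    bounds = [(-1.0, 1.0)] * n + [(0.0, 1.0)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[-1] <= tol  # no common descent direction found

# Two convex quadratics f1 = ||x - a||^2, f2 = ||x - b||^2: points on the segment
# between a and b are Pareto optimal, points off the segment are not.
a, b = np.array([0.0, 0.0]), np.array([1.0, 0.0])
for x in (np.array([0.5, 0.0]), np.array([0.5, 0.5])):
    grads = np.vstack([2 * (x - a), 2 * (x - b)])
    print(x, "Pareto-critical:", pareto_critical(grads))
```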

  18. Application of evolutionary algorithms for multi-objective optimization in VLSI and embedded systems

    CERN Document Server

    2015-01-01

    This book describes how evolutionary algorithms (EA), including genetic algorithms (GA) and particle swarm optimization (PSO) can be utilized for solving multi-objective optimization problems in the area of embedded and VLSI system design. Many complex engineering optimization problems can be modelled as multi-objective formulations. This book provides an introduction to multi-objective optimization using meta-heuristic algorithms, GA and PSO, and how they can be applied to problems like hardware/software partitioning in embedded systems, circuit partitioning in VLSI, design of operational amplifiers in analog VLSI, design space exploration in high-level synthesis, delay fault testing in VLSI testing, and scheduling in heterogeneous distributed systems. It is shown how, in each case, the various aspects of the EA, namely its representation, and operators like crossover, mutation, etc. can be separately formulated to solve these problems. This book is intended for design engineers and researchers in the field ...

  19. Designing Pareto-superior demand-response rate options

    International Nuclear Information System (INIS)

    Horowitz, I.; Woo, C.K.

    2006-01-01

    We explore three voluntary service options-real-time pricing, time-of-use pricing, and curtailable/interruptible service-that a local distribution company might offer its customers in order to encourage them to alter their electricity usage in response to changes in the electricity-spot-market price. These options are simple and practical, and make minimal information demands. We show that each of the options is Pareto-superior ex ante, in that it benefits both the participants and the company offering it, while not affecting the non-participants. The options are shown to be Pareto-superior ex post as well, except under certain exceptional circumstances. (author)

  20. Searching for the Pareto frontier in multi-objective protein design.

    Science.gov (United States)

    Nanda, Vikas; Belure, Sandeep V; Shir, Ofer M

    2017-08-01

    The goal of protein engineering and design is to identify sequences that adopt three-dimensional structures of desired function. Often, this is treated as a single-objective optimization problem, identifying the sequence-structure solution with the lowest computed free energy of folding. However, many design problems are multi-state, multi-specificity, or otherwise require concurrent optimization of multiple objectives. There may be tradeoffs among objectives, where improving one feature requires compromising another. The challenge lies in determining solutions that are part of the Pareto optimal set: designs where no further improvement can be achieved in any of the objectives without degrading one of the others. Pareto optimality problems are found in all areas of study, from economics to engineering to biology, and computational methods have been developed specifically to identify the Pareto frontier. We review progress in multi-objective protein design, the development of Pareto optimization methods, and present a specific case study using multi-objective optimization methods to model the tradeoff among three parameters (stability, specificity, and complexity) of a set of interacting synthetic collagen peptides.

  1. Rayleigh Pareto Distribution

    Directory of Open Access Journals (Sweden)

    Kareema ‎ Abed Al-Kadim

    2017-12-01

    Full Text Available In this paper the Rayleigh Pareto distribution is introduced, denoted by R_PD. We state some useful functions and give some of its properties, such as the entropy function, mean, mode, median, variance, the r-th moment about the mean, the r-th moment about the origin, reliability, hazard functions, and the coefficients of variation, skewness and kurtosis. Finally, we estimate the parameters; the aim of this study is to introduce a new distribution.

  2. Multi-objective optimization of an organic Rankine cycle (ORC) for low grade waste heat recovery using evolutionary algorithm

    International Nuclear Information System (INIS)

    Wang, Jiangfeng; Yan, Zhequan; Wang, Man; Li, Maoqing; Dai, Yiping

    2013-01-01

    Highlights: • Multi-objective optimization of an ORC is conducted to obtain optimum performance. • NSGA-II is employed to solve this multi-objective optimization problem. • The effects of parameters on the exergy efficiency and capital cost are examined. - Abstract: The organic Rankine cycle (ORC) can effectively recover low grade waste heat due to its excellent thermodynamic performance. Based on examinations of the effects of key thermodynamic parameters on the exergy efficiency and overall capital cost, multi-objective optimization of the ORC with R134a as working fluid is conducted to achieve a system design optimized from both thermodynamic and economic aspects, using the Non-dominated Sorting Genetic Algorithm-II (NSGA-II). The exergy efficiency and overall capital cost are selected as the two objective functions, so as to maximize the exergy efficiency and minimize the overall capital cost under the given waste heat conditions. Turbine inlet pressure, turbine inlet temperature, pinch temperature difference, approach temperature difference and condenser temperature difference are selected as the decision variables owing to their significant effects on the exergy efficiency and overall capital cost. The obtained Pareto frontier shows that an increase in the exergy efficiency increases the overall capital cost of the ORC system. The optimum design solution, with its corresponding decision variables, is selected from the Pareto frontier. The optimum exergy efficiency and overall capital cost are 13.98% and 129.28 × 10^4 USD, respectively, under the given waste heat conditions.

  3. A Pareto upper tail for capital income distribution

    Science.gov (United States)

    Oancea, Bogdan; Pirjol, Dan; Andrei, Tudorel

    2018-02-01

    We present a study of the capital income distribution and of its contribution to the total income (capital income share) using individual tax income data in Romania, for 2013 and 2014. Using a parametric representation we show that the capital income is Pareto distributed in the upper tail, with a Pareto coefficient α ∼ 1.44 which is much smaller than the corresponding coefficient for wage- and non-wage-income (excluding capital income), of α ∼ 2.53. Including the capital income contribution has the effect of increasing the overall inequality measures.
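
    To reproduce this kind of tail analysis on other data, a standard estimator of the Pareto tail exponent is the Hill estimator; the sketch below uses synthetic data, since the Romanian tax records are not available here, and the sample size and the choice of k are arbitrary assumptions:

```python
import numpy as np

def hill_alpha(sample, k):
    """Hill estimator of the Pareto tail exponent alpha from the k largest observations."""
    x = np.sort(np.asarray(sample))[::-1]         # descending order
    logs = np.log(x[:k]) - np.log(x[k])           # log-excesses over the (k+1)-th largest value
    return 1.0 / logs.mean()

rng = np.random.default_rng(42)
alpha_true = 1.44
data = rng.pareto(alpha_true, size=100_000) + 1.0   # Pareto(alpha) sample with x_min = 1
print("Hill estimate:", hill_alpha(data, k=2_000))  # should be close to 1.44
```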

  4. A new stellar spectrum interpolation algorithm and its application to Yunnan-III evolutionary population synthesis models

    Science.gov (United States)

    Cheng, Liantao; Zhang, Fenghui; Kang, Xiaoyu; Wang, Lang

    2018-05-01

    In evolutionary population synthesis (EPS) models, we need to convert stellar evolutionary parameters into spectra via interpolation in a stellar spectral library. For theoretical stellar spectral libraries, the spectrum grid is homogeneous on the effective-temperature and gravity plane for a given metallicity. It is relatively easy to derive stellar spectra. For empirical stellar spectral libraries, stellar parameters are irregularly distributed and the interpolation algorithm is relatively complicated. In those EPS models that use empirical stellar spectral libraries, different algorithms are used and the codes are often not released. Moreover, these algorithms are often complicated. In this work, based on a radial basis function (RBF) network, we present a new spectrum interpolation algorithm and its code. Compared with the other interpolation algorithms that are used in EPS models, it can be easily understood and is highly efficient in terms of computation. The code is written in MATLAB scripts and can be used on any computer system. Using it, we can obtain the interpolated spectra from a library or a combination of libraries. We apply this algorithm to several stellar spectral libraries (such as MILES, ELODIE-3.1 and STELIB-3.2) and give the integrated spectral energy distributions (ISEDs) of stellar populations (with ages from 1 Myr to 14 Gyr) by combining them with Yunnan-III isochrones. Our results show that the differences caused by the adoption of different EPS model components are less than 0.2 dex. All data about the stellar population ISEDs in this work and the RBF spectrum interpolation code can be obtained by request from the first author or downloaded from http://www1.ynao.ac.cn/˜zhangfh.
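
    A minimal sketch of RBF interpolation of spectra over irregularly distributed stellar parameters, using SciPy's generic RBFInterpolator as a stand-in for the authors' MATLAB RBF-network code; the toy parameter ranges, the fake spectra and the kernel choice are assumptions made only for illustration:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(1)
# Irregularly distributed stellar parameters: columns are (log Teff, log g).
params = np.column_stack([rng.uniform(3.5, 3.9, 200), rng.uniform(0.0, 5.0, 200)])
# Fake "spectra": 300 flux points per star, here just a smooth function of the parameters.
wavelengths = np.linspace(400.0, 700.0, 300)
spectra = np.sin(np.outer(params[:, 0], wavelengths / 100.0)) + params[:, 1:2] * 0.01

# Fit one RBF interpolant mapping stellar parameters to the full flux vector.
interp = RBFInterpolator(params, spectra, kernel="thin_plate_spline")

# Interpolated spectrum for a star that is not in the library.
new_star = np.array([[3.76, 4.4]])
flux = interp(new_star)[0]
print(flux.shape)  # (300,) interpolated flux values
```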

  5. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.

  6. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279

  7. Geomagnetic Navigation of Autonomous Underwater Vehicle Based on Multi-objective Evolutionary Algorithm.

    Science.gov (United States)

    Li, Hong; Liu, Mingyong; Zhang, Feihu

    2017-01-01

    This paper presents a multi-objective evolutionary algorithm for bio-inspired geomagnetic navigation of an Autonomous Underwater Vehicle (AUV). Inspired by biological navigation behavior, the solution is obtained without using a priori information, simply by magnetotaxis searching. However, geomagnetic anomalies have a significant influence on the geomagnetic navigation system, as they often disrupt the distribution of the geomagnetic field. An extreme value region may easily appear in abnormal regions, which can leave the AUV lost during the navigation phase. This paper proposes an improved bio-inspired algorithm with behavior constraints that allows the AUV to escape from the abnormal region. First, the navigation problem is formulated as an optimization problem. Second, an environmental monitoring operator is introduced to determine whether the algorithm has fallen into a geomagnetic anomaly region. Then, a behavior constraint operator is employed to get out of the abnormal region. Finally, the termination condition is triggered. Compared to the state of the art, the proposed approach effectively overcomes the disturbance of geomagnetic anomalies. Simulation results demonstrate the reliability and feasibility of the proposed approach in complex environments.

  8. Analysis of Parametric Optimization of Field-Oriented Control of 3-Phase Induction Motor with Using Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Wiktor HUDY

    2013-12-01

    Full Text Available In this paper, the impact of the set of regulators and their types on the rotational speed characteristic of an induction motor was researched. An evolutionary algorithm was used as the optimization tool. Results were verified using MATLAB/Simulink.

  9. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Full Text Available Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem which is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may easily be trapped at a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolution strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, the immune algorithm, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are also discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptation of fuzzy systems is also detailed.

  10. Vilfredo Pareto. L'economista alla luce delle lettere a Maffeo Pantaleoni. (Vilfredo Pareto. The economist in the light of his letters to Maffeo Pantaleoni

    Directory of Open Access Journals (Sweden)

    E. SCHNEIDER

    2014-07-01

    Full Text Available The article is part of a special issue on the occasion of the publication of the entire scientific correspondence of Vilfredo Pareto with Maffeo Pantaleoni. The author reconstructs the beginning of their correspondence and the debate in pure mathematical economics, and draws the main conclusions on the different views of Pareto with respect to Marshall, Edgeworth and Fisher. JEL: B16, B31, C02, C60

  11. Pareto Distribution of Firm Size and Knowledge Spillover Process as a Network

    OpenAIRE

    Tomohiko Konno

    2013-01-01

    The firm size distribution is considered to follow a Pareto distribution. In the present paper, we show that the Pareto distribution of firm size results from the spillover network model introduced in Konno (2010).

  12. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Sihem SLATNIA

    2011-01-01

    Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules, to be used by Cellular Automata (CA) for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This is the best set of local rules that determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the best CA rules that realize the best edge detection and noise filtering.

  13. Evolutionary Cellular Automata for Image Segmentation and Noise Filtering Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Okba Kazar

    2011-01-01

    Full Text Available We use an evolutionary process to seek a specialized set of rules, among a wide range of rules, to be used by Cellular Automata (CA) for a range of tasks: extracting edges in a given gray or colour image, and noise filtering applied to black-and-white images. This is the best set of local rules that determines the future state of the CA in an asynchronous way. The Genetic Algorithm (GA) is applied to search for the best CA rules that realize the best edge detection and noise filtering.

  14. Evolutionary algorithm for vehicle driving cycle generation.

    Science.gov (United States)

    Perhinschi, Mario G; Marlowe, Christopher; Tamayo, Sergio; Tu, Jun; Wayne, W Scott

    2011-09-01

    Modeling transit bus emissions and fuel economy requires a large amount of experimental data over wide ranges of operational conditions. Chassis dynamometer tests are typically performed using representative driving cycles defined based on vehicle instantaneous speed as sequences of "microtrips", which are intervals between consecutive vehicle stops. Overall significant parameters of the driving cycle, such as average speed, stops per mile, kinetic intensity, and others, are used as independent variables in the modeling process. Performing tests at all the necessary combinations of parameters is expensive and time consuming. In this paper, a methodology is proposed for building driving cycles at prescribed independent variable values using experimental data through the concatenation of "microtrips" isolated from a limited number of standard chassis dynamometer test cycles. The selection of the adequate "microtrips" is achieved through a customized evolutionary algorithm. The genetic representation uses microtrip definitions as genes. Specific mutation, crossover, and karyotype alteration operators have been defined. The Roulette-Wheel selection technique with elitist strategy drives the optimization process, which consists of minimizing the errors to desired overall cycle parameters. This utility is part of the Integrated Bus Information System developed at West Virginia University.
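
    A compact sketch of the idea of assembling a cycle from microtrip genes and minimizing the error to target cycle parameters; the microtrip library, the target values and the simple elitist truncation selection used here (standing in for the paper's Roulette-Wheel selection with elitist strategy) are all illustrative assumptions:

```python
import random

# Hypothetical library of microtrips: (duration in s, distance in km), one stop each.
random.seed(3)
microtrips = [(random.randint(30, 300), random.uniform(0.05, 2.5)) for _ in range(80)]
TARGET_AVG_SPEED = 20.0      # km/h, desired overall cycle parameter
TARGET_STOPS_PER_KM = 1.2

def cycle_error(genes):
    """Squared error between the concatenated cycle's parameters and the targets."""
    duration = sum(microtrips[g][0] for g in genes)
    distance = sum(microtrips[g][1] for g in genes)
    avg_speed = 3600.0 * distance / duration
    stops_per_km = len(genes) / distance
    return (avg_speed - TARGET_AVG_SPEED) ** 2 + (stops_per_km - TARGET_STOPS_PER_KM) ** 2

def mutate(genes):
    child = genes[:]
    child[random.randrange(len(child))] = random.randrange(len(microtrips))
    return child

# Simple elitist GA: each chromosome is a fixed-length sequence of microtrip indices.
population = [[random.randrange(len(microtrips)) for _ in range(15)] for _ in range(40)]
for _ in range(200):
    population.sort(key=cycle_error)
    parents = population[:10]                              # elitism
    population = parents + [mutate(random.choice(parents)) for _ in range(30)]
best = min(population, key=cycle_error)
print("best cycle error:", round(cycle_error(best), 4))
```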

  15. Multi-objective optimal power flow with FACTS devices

    International Nuclear Information System (INIS)

    Basu, M.

    2011-01-01

    This paper presents multi-objective differential evolution to optimize the cost of generation, emission and active power transmission loss of power systems equipped with flexible ac transmission system (FACTS) devices. In the proposed approach, the optimal power flow problem is formulated as a multi-objective optimization problem. The FACTS devices considered include the thyristor controlled series capacitor (TCSC) and the thyristor controlled phase shifter (TCPS). The proposed approach has been examined and tested on the modified IEEE 30-bus and 57-bus test systems. The results obtained from the proposed approach have been compared with those obtained from the nondominated sorting genetic algorithm-II, the strength Pareto evolutionary algorithm 2 and Pareto differential evolution.

  16. Efficiently approximating the Pareto frontier: Hydropower dam placement in the Amazon basin

    Science.gov (United States)

    Wu, Xiaojian; Gomes-Selman, Jonathan; Shi, Qinru; Xue, Yexiang; Garcia-Villacorta, Roosevelt; Anderson, Elizabeth; Sethi, Suresh; Steinschneider, Scott; Flecker, Alexander; Gomes, Carla P.

    2018-01-01

    Real-world problems are often not fully characterized by a single optimal solution, as they frequently involve multiple competing objectives; it is therefore important to identify the so-called Pareto frontier, which captures solution trade-offs. We propose a fully polynomial-time approximation scheme based on Dynamic Programming (DP) for computing a polynomially succinct curve that approximates the Pareto frontier to within an arbitrarily small ε > 0 on tree-structured networks. Given a set of objectives, our approximation scheme runs in time polynomial in the size of the instance and 1/ε. We also propose a Mixed Integer Programming (MIP) scheme to approximate the Pareto frontier. The DP and MIP Pareto frontier approaches have complementary strengths and are surprisingly effective. We provide empirical results showing that our methods outperform other approaches in efficiency and accuracy. Our work is motivated by a problem in computational sustainability concerning the proliferation of hydropower dams throughout the Amazon basin. Our goal is to support decision-makers in evaluating impacted ecosystem services on the full scale of the Amazon basin. Our work is general and can be applied to approximate the Pareto frontier of a variety of multiobjective problems on tree-structured networks.
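
    The rounding idea behind such approximation schemes can be sketched briefly: objective vectors are snapped to a multiplicative (1+ε) grid while the Pareto curves of subtrees are merged, which keeps the curve polynomially small. The sketch below is a simplified illustration under the assumption that objectives simply add when subtrees are combined; it is not the paper's full dam-placement model:

```python
import math

def eps_prune(points, eps):
    """Keep one representative per cell of a multiplicative (1+eps) grid.

    points: list of tuples of positive objective values (to be minimized).
    This is the rounding step that keeps an eps-approximate Pareto curve
    polynomially small in FPTAS-style constructions.
    """
    cells = {}
    for p in points:
        key = tuple(int(math.log(v, 1.0 + eps)) for v in p)
        if key not in cells or sum(p) < sum(cells[key]):
            cells[key] = p
    return list(cells.values())

def merge(front_a, front_b, eps):
    """Combine the Pareto curves of two child subtrees, assuming objectives add up."""
    combined = [tuple(x + y for x, y in zip(a, b)) for a in front_a for b in front_b]
    return eps_prune(combined, eps)

# Toy example with two objectives (both positive, both minimized).
left = [(1.0, 8.0), (2.0, 5.0), (4.0, 2.0)]
right = [(1.5, 6.0), (3.0, 3.0)]
print(merge(left, right, eps=0.1))
```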

  17. Pareto Optimization Identifies Diverse Set of Phosphorylation Signatures Predicting Response to Treatment with Dasatinib.

    Science.gov (United States)

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2015-01-01

    Multivariate biomarkers that can predict the effectiveness of targeted therapy in individual patients are highly desired. Previous biomarker discovery studies have largely focused on the identification of single biomarker signatures, aimed at maximizing prediction accuracy. Here, we present a different approach that identifies multiple biomarkers by simultaneously optimizing their predictive power, number of features, and proximity to the drug target in a protein-protein interaction network. To this end, we incorporated NSGA-II, a fast and elitist multi-objective optimization algorithm that is based on the principle of Pareto optimality, into the biomarker discovery workflow. The method was applied to quantitative phosphoproteome data of 19 non-small cell lung cancer (NSCLC) cell lines from a previous biomarker study. The algorithm successfully identified a total of 77 candidate biomarker signatures predicting response to treatment with dasatinib. Through filtering and similarity clustering, this set was trimmed to four final biomarker signatures, which then were validated on an independent set of breast cancer cell lines. All four candidates reached the same good prediction accuracy (83%) as the originally published biomarker. Although the newly discovered signatures were diverse in their composition and in their size, the central protein of the originally published signature - integrin β4 (ITGB4) - was also present in all four Pareto signatures, confirming its pivotal role in predicting dasatinib response in NSCLC cell lines. In summary, the method presented here allows for a robust and simultaneous identification of multiple multivariate biomarkers that are optimized for prediction performance, size, and relevance.

  18. Energy demand forecasting in Iranian metal industry using linear and nonlinear models based on evolutionary algorithms

    International Nuclear Information System (INIS)

    Piltan, Mehdi; Shiri, Hiva; Ghaderi, S.F.

    2012-01-01

    Highlights: ► Investigating different fitness functions for evolutionary algorithms in energy forecasting. ► Energy forecasting of the Iranian metal industry by value added, energy prices, investment and employees. ► Using a real-coded instead of a binary-coded genetic algorithm decreases the energy forecasting error. - Abstract: Developing energy-forecasting models is known as one of the most important steps in long-term planning. In order to achieve a sustainable energy supply toward economic development and social welfare, precise forecasting models are required. The application of artificial intelligence models for estimating complex economic and social functions has grown considerably in recent research. In this paper, energy consumption in the industrial sector, one of the critical sectors in the consumption of energy, has been investigated. Two linear and three nonlinear functions have been used in order to forecast and analyze energy in the Iranian metal industry; Particle Swarm Optimization (PSO) and Genetic Algorithms (GAs) are applied to obtain the parameters of the models. The Real-Coded Genetic Algorithm (RCGA) has been developed based on real numbers, which is introduced as a new approach in the field of energy forecasting. In the proposed model, electricity consumption has been considered as a function of different variables such as electricity tariff, manufacturing value added, prevailing fuel prices, the number of employees, the investment in equipment and consumption in the previous years. Mean Square Error (MSE), Root Mean Square Error (RMSE), Mean Absolute Deviation (MAD) and Mean Absolute Percent Error (MAPE) are the four functions which have been used as the fitness function in the evolutionary algorithms. The results show that the logarithmic nonlinear model using the PSO algorithm, with a 1.91 error percentage, gives the best answer. Furthermore, the prediction of electricity consumption in the industrial sector of Turkey and also the Turkish industrial sector

  19. Diversity shrinkage: Cross-validating pareto-optimal weights to enhance diversity via hiring practices.

    Science.gov (United States)

    Song, Q Chelsea; Wee, Serena; Newman, Daniel A

    2017-12-01

    To reduce adverse impact potential and improve diversity outcomes from personnel selection, one promising technique is De Corte, Lievens, and Sackett's (2007) Pareto-optimal weighting strategy. De Corte et al.'s strategy has been demonstrated on (a) a composite of cognitive and noncognitive (e.g., personality) tests (De Corte, Lievens, & Sackett, 2008) and (b) a composite of specific cognitive ability subtests (Wee, Newman, & Joseph, 2014). Both studies illustrated how Pareto-weighting (in contrast to unit weighting) could lead to substantial improvement in diversity outcomes (i.e., diversity improvement), sometimes more than doubling the number of job offers for minority applicants. The current work addresses a key limitation of the technique-the possibility of shrinkage, especially diversity shrinkage, in the Pareto-optimal solutions. Using Monte Carlo simulations, sample size and predictor combinations were varied and cross-validated Pareto-optimal solutions were obtained. Although diversity shrinkage was sizable for a composite of cognitive and noncognitive predictors when sample size was at or below 500, diversity shrinkage was typically negligible for a composite of specific cognitive subtest predictors when sample size was at least 100. Diversity shrinkage was larger when the Pareto-optimal solution suggested substantial diversity improvement. When sample size was at least 100, cross-validated Pareto-optimal weights typically outperformed unit weights-suggesting that diversity improvement is often possible, despite diversity shrinkage. Implications for Pareto-optimal weighting, adverse impact, sample size of validation studies, and optimizing the diversity-job performance tradeoff are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances

    DEFF Research Database (Denmark)

    Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder

    1992-01-01

    As a generalization of the common assumption of exponential distribution of the exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using estimation by both the method of moments and probability-weighted moments. Maintaining the generalized Pareto distribution as the parent exceedance distribution, the T-year event is estimated assuming the exceedances to be exponentially distributed. For moderately long-tailed exceedance distributions and small to moderate sample sizes it is found, by comparing mean square errors of the T-year event estimators, that the exponential distribution is preferable to the correct generalized Pareto distribution despite the introduced model error and despite a possible rejection of the exponential hypothesis by a test of significance. For moderately short-tailed exceedance
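
    For concreteness, the T-year event under the two competing exceedance models has a simple closed form in the peaks-over-threshold setting; the sketch below uses made-up threshold, arrival-rate and distribution parameters, and writes the generalized Pareto shape with the xi convention (some hydrological texts use k = -xi):

```python
import math

# Illustrative partial-duration-series parameters (not taken from the paper).
u = 50.0        # threshold (e.g. m^3/s)
lam = 3.0       # average number of exceedances per year
sigma = 12.0    # scale of the exceedance distribution
xi = -0.10      # generalized Pareto shape (xi -> 0 recovers the exponential model)
T = 100.0       # return period in years

x_exp = u + sigma * math.log(lam * T)                     # exponential exceedances
x_gpd = u + (sigma / xi) * ((lam * T) ** xi - 1.0)        # generalized Pareto exceedances
print(f"{T:.0f}-year event: exponential {x_exp:.1f}, GPD {x_gpd:.1f}")
```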

  1. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Peng Zuoxiang

    2010-01-01

    Full Text Available Let X1, X2, ... be a sequence of positive independent and identically distributed random variables with a common Pareto-type distribution function, whose tail behaves as a power law multiplied by a function that is slowly varying at infinity. In this note we study the strong convergence bound of a kind of right-censored Pareto index estimator under second-order regularly varying conditions.

  2. Multiclass gene selection using Pareto-fronts.

    Science.gov (United States)

    Rajapakse, Jagath C; Mundra, Piyushkumar A

    2013-01-01

    Filter methods are often used for selection of genes in multiclass sample classification by using microarray data. Such techniques usually tend to bias toward a few classes that are easily distinguishable from other classes due to imbalances of strong features and sample sizes of different classes. It could therefore lead to selection of redundant genes while missing the relevant genes, leading to poor classification of tissue samples. In this manuscript, we propose to decompose multiclass ranking statistics into class-specific statistics and then use Pareto-front analysis for selection of genes. This alleviates the bias induced by class intrinsic characteristics of dominating classes. The use of Pareto-front analysis is demonstrated on two filter criteria commonly used for gene selection: F-score and KW-score. A significant improvement in classification performance and reduction in redundancy among top-ranked genes were achieved in experiments with both synthetic and real-benchmark data sets.
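
    The core of the proposed selection step is a non-dominated (Pareto-front) filter over class-specific scores; a minimal sketch with random stand-in scores follows (the score matrix is a random placeholder, and only the first front is extracted here):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical class-specific relevance scores for 1000 genes in a 3-class problem
# (e.g. one F-score per class per gene); higher is better.
scores = rng.random((1000, 3))

def pareto_front(scores):
    """Indices of genes not dominated in all class-specific scores simultaneously."""
    n = scores.shape[0]
    keep = []
    for i in range(n):
        dominated = np.any(
            np.all(scores >= scores[i], axis=1) & np.any(scores > scores[i], axis=1)
        )
        if not dominated:
            keep.append(i)
    return keep

selected = pareto_front(scores)
print(f"{len(selected)} genes on the first Pareto front out of {scores.shape[0]}")
```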

  3. Multi-objective Reactive Power Optimization Based on Improved Particle Swarm Algorithm

    Science.gov (United States)

    Cui, Xue; Gao, Jian; Feng, Yunbin; Zou, Chenlu; Liu, Huanlei

    2018-01-01

    In this paper, an optimization model with the minimum active power loss and minimum voltage deviation of node and maximum static voltage stability margin as the optimization objective is proposed for the reactive power optimization problems. By defining the index value of reactive power compensation, the optimal reactive power compensation node was selected. The particle swarm optimization algorithm was improved, and the selection pool of global optimal and the global optimal of probability (p-gbest) were introduced. A set of Pareto optimal solution sets is obtained by this algorithm. And by calculating the fuzzy membership value of the pareto optimal solution sets, individuals with the smallest fuzzy membership value were selected as the final optimization results. The above improved algorithm is used to optimize the reactive power of IEEE14 standard node system. Through the comparison and analysis of the results, it has been proven that the optimization effect of this algorithm was very good.

  4. The Burr X Pareto Distribution: Properties, Applications and VaR Estimation

    Directory of Open Access Journals (Sweden)

    Mustafa Ç. Korkmaz

    2017-12-01

    Full Text Available In this paper, a new three-parameter Pareto distribution is introduced and studied. We discuss various mathematical and statistical properties of the new model. Some estimation methods for the model parameters are performed. Moreover, the peaks-over-threshold method is used to estimate Value-at-Risk (VaR) by means of the proposed distribution. We compare the distribution with a few other models to show its versatility in modelling data with heavy tails. VaR estimation with the Burr X Pareto distribution is presented using time series data, and the new model could be considered as an alternative VaR model against the generalized Pareto model for financial institutions.

  5. Pareto-Zipf law in growing systems with multiplicative interactions

    Science.gov (United States)

    Ohtsuki, Toshiya; Tanimoto, Satoshi; Sekiyama, Makoto; Fujihara, Akihiro; Yamamoto, Hiroshi

    2018-06-01

    Numerical simulations of multiplicatively interacting stochastic processes with weighted selections were conducted. A feedback mechanism to control the weight w of selections was proposed. It becomes evident that when w is moderately controlled around 0, such systems spontaneously exhibit the Pareto-Zipf distribution. The simulation results are universal in the sense that microscopic details, such as parameter values and the type of control and weight, are irrelevant. The central ingredient of the Pareto-Zipf law is argued to be the mild control of interactions.

  6. Support Vector Machines Trained with Evolutionary Algorithms Employing Kernel Adatron for Large Scale Classification of Protein Structures.

    Science.gov (United States)

    Arana-Daniel, Nancy; Gallegos, Alberto A; López-Franco, Carlos; Alanís, Alma Y; Morales, Jacob; López-Franco, Adriana

    2016-01-01

    With the increasing power of computers, the amount of data that can be processed in small periods of time has grown exponentially, as has the importance of classifying large-scale data efficiently. Support vector machines have shown good results classifying large amounts of high-dimensional data, such as data generated by protein structure prediction, spam recognition, medical diagnosis, optical character recognition and text classification, etc. Most state of the art approaches for large-scale learning use traditional optimization methods, such as quadratic programming or gradient descent, which makes the use of evolutionary algorithms for training support vector machines an area to be explored. The present paper proposes an approach that is simple to implement based on evolutionary algorithms and Kernel-Adatron for solving large-scale classification problems, focusing on protein structure prediction. The functional properties of proteins depend upon their three-dimensional structures. Knowing the structures of proteins is crucial for biology and can lead to improvements in areas such as medicine, agriculture and biofuels.

  7. Development of a multi-objective PBIL evolutionary algorithm applied to a nuclear reactor core reload optimization problem

    International Nuclear Information System (INIS)

    Machado, Marcelo D.; Dchirru, Roberto

    2005-01-01

    The nuclear reactor core reload optimization problem consists in finding a pattern of partially burned-up and fresh fuels that optimizes the plant's next operation cycle. This optimization problem has been traditionally solved using an expert's knowledge, but recently artificial intelligence techniques have also been applied successfully. The artificial intelligence optimization techniques generally have a single objective. However, most real-world engineering problems, including nuclear core reload optimization, have more than one objective (multi-objective) and these objectives are usually conflicting. The aim of this work is to develop a tool to solve multi-objective problems based on the Population-Based Incremental Learning (PBIL) algorithm. The new tool is applied to solve the Angra 1 PWR core reload optimization problem with the purpose of creating a Pareto surface, so that a pattern selected from this surface can be applied for the plant's next operation cycle. (author)
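
    The single-objective PBIL building block that the multi-objective tool extends is compact enough to sketch; the OneMax toy fitness and the parameter values below are illustrative assumptions and have nothing to do with the actual Angra 1 loading-pattern encoding:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits, pop_size, learn_rate = 40, 50, 0.1
prob = np.full(n_bits, 0.5)              # probability vector over binary genes

for generation in range(200):
    population = (rng.random((pop_size, n_bits)) < prob).astype(int)
    fitness = population.sum(axis=1)     # OneMax stand-in for a core-reload evaluator
    best = population[fitness.argmax()]
    # Move the probability vector toward the best individual of this generation.
    prob = (1.0 - learn_rate) * prob + learn_rate * best

print("final probability vector (rounded):", prob.round(2))
print("best fitness reached:", int(fitness.max()))
```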

  8. Performance Analysis of Evolutionary Algorithms for Steiner Tree Problems.

    Science.gov (United States)

    Lai, Xinsheng; Zhou, Yuren; Xia, Xiaoyun; Zhang, Qingfu

    2017-01-01

    The Steiner tree problem (STP) aims to determine some Steiner nodes such that the minimum spanning tree over these Steiner nodes and a given set of special nodes has the minimum weight, which is NP-hard. STP includes several important cases. The Steiner tree problem in graphs (GSTP) is one of them. Many heuristics have been proposed for STP, and some of them have proved to be performance guarantee approximation algorithms for this problem. Since evolutionary algorithms (EAs) are general and popular randomized heuristics, it is significant to investigate the performance of EAs for STP. Several empirical investigations have shown that EAs are efficient for STP. However, up to now, there is no theoretical work on the performance of EAs for STP. In this article, we reveal that the (1+1) EA achieves 3/2-approximation ratio for STP in a special class of quasi-bipartite graphs in expected runtime [Formula: see text], where [Formula: see text], [Formula: see text], and [Formula: see text] are, respectively, the number of Steiner nodes, the number of special nodes, and the largest weight among all edges in the input graph. We also show that the (1+1) EA is better than two other heuristics on two GSTP instances, and the (1+1) EA may be inefficient on a constructed GSTP instance.

  9. Coordinated Voltage Control in Distribution Network with the Presence of DGs and Variable Loads Using Pareto and Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    José Raúl Castro

    2016-02-01

    Full Text Available This paper presents an efficient algorithm to solve the multi-objective (MO) voltage control problem in distribution networks. The proposed algorithm minimizes the following three objectives: voltage variation on pilot buses, reactive power production ratio deviation, and generator voltage deviation. This work leverages two optimization techniques: fuzzy logic to find the optimum value of the reactive power of the distributed generation (DG), and Pareto optimization to find the optimal value of the pilot bus voltage so as to produce lower losses under the constraint that the voltage remains within established limits. Variable loads and DGs are taken into account in this paper. The algorithm is tested on an IEEE 13-node test feeder and the results show the effectiveness of the proposed model.

  10. Robustness analysis of bogie suspension components Pareto optimised values

    Science.gov (United States)

    Mousavi Bideleh, Seyed Milad

    2017-08-01

    The bogie suspension system of high speed trains can significantly affect vehicle performance. Multiobjective optimisation problems are often formulated and solved to find the Pareto optimised values of the suspension components and improve cost efficiency in railway operations from different perspectives. Uncertainties in the design parameters of the suspension system can negatively influence the dynamic behaviour of railway vehicles. In this regard, robustness analysis of a bogie dynamics response with respect to uncertainties in the suspension design parameters is considered. A one-car railway vehicle model with 50 degrees of freedom and wear/comfort Pareto optimised values of bogie suspension components is chosen for the analysis. Longitudinal and lateral primary stiffnesses, longitudinal and vertical secondary stiffnesses, as well as yaw damping are considered as five design parameters. The effects of parameter uncertainties on wear, ride comfort, track shift force, stability, and risk of derailment are studied by varying the design parameters around their respective Pareto optimised values according to a lognormal distribution with different coefficients of variation (COVs). The robustness analysis is carried out based on the maximum entropy concept. The multiplicative dimensional reduction method is utilised to simplify the calculation of fractional moments and improve the computational efficiency. The results showed that the dynamics response of the vehicle with wear/comfort Pareto optimised values of bogie suspension is robust against uncertainties in the design parameters and the probability of failure is small for parameter uncertainties with COV up to 0.1.

  11. System optimization for HVAC energy management using the robust evolutionary algorithm

    International Nuclear Information System (INIS)

    Fong, K.F.; Hanby, V.I.; Chow, T.T.

    2009-01-01

    For an installed centralized heating, ventilating and air conditioning (HVAC) system, appropriate energy management measures can achieve energy conservation targets through optimal control and operation. The performance optimization of conventional HVAC systems may be handled by operating experience, but that may not cover the different optimization scenarios and parameters arising in response to the variety of load and weather conditions. In this regard, it is common to apply a suitable simulation-optimization technique to model the system and then determine the required operation parameters. The plant simulation models can be built up either by using available simulation programs or as a system of mathematical expressions. To handle the simulation models, iterations are involved in the numerical solution methods. Since gradient information is not easily available due to the complex nature of the equations, traditional gradient-based optimization methods are not applicable to this kind of system model. For heuristic optimization methods, a continual search is commonly necessary, and a system function call is required for each search. The frequency of simulation function calls is then the time-determining step, and an efficient optimization method is crucial in order to find the solution with a number of function calls in a reasonable computational period. In this paper, the robust evolutionary algorithm (REA) is presented to tackle this nature of HVAC simulation models. REA is based on one of the paradigms of evolutionary algorithms, the evolution strategy, which is a stochastic population-based search technique with an emphasis on mutation. The REA, which incorporates Cauchy deterministic mutation, tournament selection and arithmetic recombination, provides a synergetic effect for the optimal search. The REA is effective in coping with complex simulation models, as well as those represented by explicit mathematical expressions of
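
    A minimal sketch of an evolution strategy with Cauchy mutation, tournament selection and arithmetic recombination, in the spirit of the REA described above; the toy objective standing in for the HVAC plant simulation, the bounds and all parameter values are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(5)

def plant_energy(x):
    """Toy stand-in for an HVAC simulation call returning energy use to be minimized."""
    return np.sum((x - 1.5) ** 2) + 0.1 * np.sum(np.abs(x))

dim, mu, lam, scale = 6, 10, 40, 0.3
pop = rng.uniform(0.0, 3.0, size=(mu, dim))

for generation in range(100):
    children = []
    for _ in range(lam):
        # Tournament selection of two parents, arithmetic recombination, Cauchy mutation.
        idx = rng.integers(0, mu, size=4)
        p1 = pop[idx[:2][np.argmin([plant_energy(pop[i]) for i in idx[:2]])]]
        p2 = pop[idx[2:][np.argmin([plant_energy(pop[i]) for i in idx[2:]])]]
        child = 0.5 * (p1 + p2) + scale * rng.standard_cauchy(dim)
        children.append(np.clip(child, 0.0, 3.0))
    # (mu, lambda)-style survival: the best mu children form the next population.
    children = np.array(children)
    pop = children[np.argsort([plant_energy(c) for c in children])[:mu]]

best = pop[np.argmin([plant_energy(p) for p in pop])]
print("best setting:", best.round(3), "energy:", round(float(plant_energy(best)), 4))
```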

  12. Spatial multiobjective optimization of agricultural conservation practices using a SWAT model and an evolutionary algorithm.

    Science.gov (United States)

    Rabotyagov, Sergey; Campbell, Todd; Valcu, Adriana; Gassman, Philip; Jha, Manoj; Schilling, Keith; Wolter, Calvin; Kling, Catherine

    2012-12-09

    Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economic methods of finding the lowest-cost solution in the watershed context (e.g.,(5,12,20)) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of the optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods(3,4,9,10,13-15,17-19,22,23,25). In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates a modern and commonly used SWAT water quality model(7) with a

  13. Optimal routes scheduling for municipal waste disposal garbage trucks using evolutionary algorithm and artificial immune system

    Directory of Open Access Journals (Sweden)

    Bogna MRÓWCZYŃSKA

    2011-01-01

    Full Text Available This paper describes an application of an evolutionary algorithm and an artificial immune system to solve the problem of scheduling optimal routes for municipal waste disposal garbage trucks in daily operation. The optimisation problem is formulated and solved using both methods. The results are presented for an area in one of the Polish cities.

  14. Optimal power system generation scheduling by multi-objective genetic algorithms with preferences

    International Nuclear Information System (INIS)

    Zio, E.; Baraldi, P.; Pedroni, N.

    2009-01-01

    Power system generation scheduling is an important issue from both the economic and environmental safety viewpoints. The scheduling involves decisions with regard to the units' start-up and shut-down times and to the assignment of the load demands to the committed generating units for minimizing the system operation costs and the emission of atmospheric pollutants. As many other real-world engineering problems, power system generation scheduling involves multiple, conflicting optimization criteria for which there exists no single best solution with respect to all the criteria considered. Multi-objective optimization algorithms, based on the principle of Pareto optimality, can then be designed to search for the set of nondominated scheduling solutions from which the decision-maker (DM) must a posteriori choose the preferred alternative. On the other hand, information is often available a priori regarding the preference values of the DM with respect to the objectives. When possible, it is important to exploit this information during the search so as to focus it on the region of preference of the Pareto-optimal set. In this paper, ways are explored to use this preference information for driving a multi-objective genetic algorithm towards the preferential region of the Pareto-optimal front. Two methods are considered: the first one extends the concept of Pareto dominance by biasing the chromosome replacement step of the algorithm by means of numerical weights that express the DM's preferences; the second one drives the search algorithm by changing the shape of the dominance region according to linear trade-off functions specified by the DM. The effectiveness of the proposed approaches is first compared on a case study from the literature. Then, a nonlinear, constrained, two-objective power generation scheduling problem is effectively tackled.

  15. Improved Shape Parameter Estimation in Pareto Distributed Clutter with Neural Networks

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2016-12-01

    Full Text Available The main problem faced by naval radars is the elimination of clutter, a distortion signal that appears mixed with the target reflections. Recently, the Pareto distribution has been related to sea clutter measurements, suggesting that it may provide a better fit than other traditional distributions. The authors propose a new method for estimating the Pareto shape parameter based on artificial neural networks. The solution achieves a precise estimation of the parameter, has a low computational cost, and outperforms the classic method based on Maximum Likelihood Estimates (MLE). The presented scheme contributes to the development of the NATE detector for Pareto clutter, which uses the knowledge of clutter statistics to improve the stability of the detection, among other applications.

  16. A Comparative Study of Differential Evolution, Particle Swarm Optimization, and Evolutionary Algorithms on Numerical Benchmark Problems

    DEFF Research Database (Denmark)

    Vesterstrøm, Jacob Svaneborg; Thomsen, Rene

    2004-01-01

    Several extensions to evolutionary algorithms (EAs) and particle swarm optimization (PSO) have been suggested during the last decades offering improved performance on selected benchmark problems. Recently, another search heuristic termed differential evolution (DE) has shown superior performance in several real-world applications. In this paper, we evaluate the performance of DE, PSO, and EAs regarding their general applicability as numerical optimization techniques. The comparison is performed on a suite of 34 widely used benchmark problems. The results from our study show that DE generally outperforms the other algorithms. However, on two noisy functions, both DE and PSO were outperformed by the EA.

  17. Application of multi-objective controller to optimal tuning of PID gains for a hydraulic turbine regulating system using adaptive grid particle swam optimization.

    Science.gov (United States)

    Chen, Zhihuan; Yuan, Yanbin; Yuan, Xiaohui; Huang, Yuehua; Li, Xianshan; Li, Wenwu

    2015-05-01

    A hydraulic turbine regulating system (HTRS) is one of the most important components of a hydropower plant, playing a key role in maintaining the safety, stability and economical operation of hydro-electrical installations. At present, the conventional PID controller is widely applied in HTRS systems for its practicality and robustness, and the primary problem with this control law is how to optimally tune the parameters, i.e. how to determine PID controller gains that yield satisfactory performance. In this paper, a multi-objective evolutionary algorithm named adaptive grid particle swarm optimization (AGPSO) is applied to solve the PID gain tuning problem of the HTRS system. This AGPSO-optimized method, which differs from traditional single-objective optimization methods, is designed to take care of settling time and overshoot level simultaneously, generating a set of non-inferior alternative solutions (i.e. a Pareto set). Furthermore, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto set. An illustrative example of the best compromise solution for parameter tuning of the nonlinear HTRS system is introduced to verify the feasibility and effectiveness of the proposed AGPSO-based optimization approach, as compared with two other prominent multi-objective algorithms, i.e. the Non-dominated Sorting Genetic Algorithm II (NSGAII) and the Strength Pareto Evolutionary Algorithm II (SPEAII), in terms of the quality and diversity of the obtained Pareto solution sets. Simulation results show that this AGPSO-optimized approach outperforms the compared methods, with higher efficiency and better quality, whether the HTRS system works under unload or load conditions. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
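
    The fuzzy membership step used to pick the best compromise from a Pareto set is easy to sketch; the small example front and the standard linear membership formula below are illustrative assumptions rather than the paper's exact implementation:

```python
import numpy as np

# Illustrative Pareto front for (settling time, overshoot), both to be minimized.
front = np.array([
    [2.0, 0.30],
    [2.5, 0.18],
    [3.2, 0.10],
    [4.5, 0.05],
])

f_min, f_max = front.min(axis=0), front.max(axis=0)
# Linear membership: 1 at the best value of an objective, 0 at its worst value.
membership = (f_max - front) / (f_max - f_min)
# Normalized membership of each solution; the largest one is taken as the best compromise.
mu = membership.sum(axis=1) / membership.sum()
best = int(np.argmax(mu))
print("normalized memberships:", mu.round(3))
print("best compromise solution:", front[best])
```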

  18. Data envelopment analysis and Pareto genetic algorithm applied to robust design in multiresponse systems

    Directory of Open Access Journals (Sweden)

    Enrique Carlos Canessa-Terrazas

    2016-01-01

    Full Text Available The use of Data Envelopment Analysis (DEA) is presented to prioritize and select solutions found by a Pareto Genetic Algorithm (PGA) to robust design problems in multiresponse systems with many control and noise factors. The efficiency analysis of the solutions with DEA shows that the PGA finds a good approximation to the efficient frontier. In addition, DEA is used to determine the combination of mean adjustment level and variation level of the system responses that minimizes the economic cost of reaching those targets. By combining this cost with other technical and/or economic considerations, the solution that best fits a predetermined quality level can be selected more appropriately.

  19. Genetic algorithm based separation cascade optimization

    International Nuclear Information System (INIS)

    Mahendra, A.K.; Sanyal, A.; Gouthaman, G.; Bera, T.K.

    2008-01-01

    The conventional separation cascade design procedure does not give an optimum design because of squaring-off and the variation of flow rates and of the separation factor of the element with respect to stage location. Multi-component isotope separation further complicates the design procedure. Cascade design can be stated as a constrained multi-objective optimization. The cascade's requirements on the separating element are themselves multi-objective, i.e. overall separation factor, cut, optimum feed and separative power. The decision maker may aspire to more comprehensive multi-objective goals in which the optimization of the cascade is coupled with exploration of the separating element's optimization vector space. In real life there are many issues which make it important to understand the decision maker's perception of the cost-quality-speed trade-off and the consistency of preferences. The genetic algorithm (GA) is one such evolutionary technique that can be used for cascade design optimization. This paper addresses various issues involved in the GA-based multi-objective optimization of the separation cascade. A reference-point-based optimization methodology combined with a GA-based Pareto optimality concept was found pragmatic and promising for separation cascades. This method should be explored, tested, examined and further developed for binary as well as multi-component separations. (author)

  20. An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index

    DEFF Research Database (Denmark)

    Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle

    2013-01-01

    We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...

  1. Optimization of the core configuration design using a hybrid artificial intelligence algorithm for research reactors

    International Nuclear Information System (INIS)

    Hedayat, Afshin; Davilu, Hadi; Barfrosh, Ahmad Abdollahzadeh; Sepanloo, Kamran

    2009-01-01

    To successfully carry out material irradiation experiments and radioisotope production, a high thermal neutron flux at the irradiation box over the desired lifetime of a core configuration is needed. On the other hand, reactor safety and operational constraints must be preserved during core configuration selection. Two main objectives and two safety and operational constraints are suggested to optimize the reactor core configuration design. The suggested parameters and conditions are considered as two separate fitness functions composed of two main objectives and two penalty functions. This is a constrained and combinatorial type of multi-objective optimization problem. In this paper, a fast and effective hybrid artificial intelligence algorithm is introduced and developed to reach a Pareto optimal set. The hybrid algorithm is composed of a fast and elitist multi-objective genetic algorithm (GA) and a fast fitness function evaluation system based on cascade feed-forward artificial neural networks (ANNs). A specific GA representation of the core configuration and special GA operators are introduced and used to overcome the combinatorial constraints of this optimization problem. A software package (Core Pattern Calculator 1) is developed to prepare and reform the data required for ANN training and also to revise the optimization results. Some practical test parameters and conditions are suggested to adjust the main parameters of the hybrid algorithm. Results show that the introduced ANNs can be trained to estimate selected core parameters of a research reactor very quickly, which effectively speeds up the optimization process. Final optimization results show that a uniform and dense diversity of Pareto fronts is obtained over a wide range of fitness function values. To allow a more careful selection of Pareto optimal solutions, a revision system is introduced and used. The revision of the obtained Pareto optimal set is performed using the developed software package. Also some secondary operational

  2. Optimization of the core configuration design using a hybrid artificial intelligence algorithm for research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hedayat, Afshin, E-mail: ahedayat@aut.ac.i [Department of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Reactor Research and Development School, Nuclear Science and Technology Research Institute (NSTRI), End of North Karegar Street, P.O. Box 14395-836, Tehran (Iran, Islamic Republic of); Davilu, Hadi [Department of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Barfrosh, Ahmad Abdollahzadeh [Department of Computer Engineering, Amirkabir University of Technology (Tehran Polytechnic), 424 Hafez Avenue, P.O. Box 15875-4413, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Reactor Research and Development School, Nuclear Science and Technology Research Institute (NSTRI), End of North Karegar Street, P.O. Box 14395-836, Tehran (Iran, Islamic Republic of)

    2009-12-15

    To successfully carry out material irradiation experiments and radioisotope production, a high thermal neutron flux at the irradiation box over the desired lifetime of a core configuration is needed. On the other hand, reactor safety and operational constraints must be preserved during core configuration selection. Two main objectives and two safety and operational constraints are suggested to optimize the reactor core configuration design. The suggested parameters and conditions are considered as two separate fitness functions composed of two main objectives and two penalty functions. This is a constrained and combinatorial type of multi-objective optimization problem. In this paper, a fast and effective hybrid artificial intelligence algorithm is introduced and developed to reach a Pareto optimal set. The hybrid algorithm is composed of a fast and elitist multi-objective genetic algorithm (GA) and a fast fitness function evaluation system based on cascade feed-forward artificial neural networks (ANNs). A specific GA representation of the core configuration and special GA operators are introduced and used to overcome the combinatorial constraints of this optimization problem. A software package (Core Pattern Calculator 1) is developed to prepare and reform the data required for ANN training and also to revise the optimization results. Some practical test parameters and conditions are suggested to adjust the main parameters of the hybrid algorithm. Results show that the introduced ANNs can be trained to estimate selected core parameters of a research reactor very quickly, which effectively speeds up the optimization process. Final optimization results show that a uniform and dense diversity of Pareto fronts is obtained over a wide range of fitness function values. To allow a more careful selection of Pareto optimal solutions, a revision system is introduced and used. The revision of the obtained Pareto optimal set is performed using the developed software package. Also some secondary operational

  3. Optimum oil production planning using infeasibility driven evolutionary algorithm.

    Science.gov (United States)

    Singh, Hemant Kumar; Ray, Tapabrata; Sarker, Ruhul

    2013-01-01

    In this paper, we discuss a practical oil production planning optimization problem. For oil wells with insufficient reservoir pressure, gas is usually injected to artificially lift oil, a practice commonly referred to as enhanced oil recovery (EOR). The total gas that can be used for oil extraction is constrained by daily availability limits. The oil extracted from each well is known to be a nonlinear function of the gas injected into the well and varies between wells. The problem is to identify the optimal amount of gas that needs to be injected into each well to maximize the amount of oil extracted subject to the constraint on the total daily gas availability. The problem has long been of practical interest to all major oil exploration companies as it has the potential to derive large financial benefit. In this paper, an infeasibility driven evolutionary algorithm is used to solve a 56 well reservoir problem which demonstrates its efficiency in solving constrained optimization problems. Furthermore, a multi-objective formulation of the problem is posed and solved using a number of algorithms, which eliminates the need for solving the (single objective) problem on a regular basis. Lastly, a modified single objective formulation of the problem is also proposed, which aims to maximize the profit instead of the quantity of oil. It is shown that even with a lesser amount of oil extracted, more economic benefits can be achieved through the modified formulation.
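    The single-objective form of the problem described above is a small constrained nonlinear program: maximize total oil over per-well gas injection rates subject to the daily gas limit. The sketch below illustrates it with hypothetical saturating response curves and a generic SLSQP solver; the coefficients and curve shape are placeholders, not the well models used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical per-well response curves: oil rate as a saturating function of
# injected gas (the real curves in the paper come from well data).
a = np.array([120.0, 90.0, 150.0])   # asymptotic oil rate per well
b = np.array([0.8, 1.2, 0.5])        # curvature of each response
G_TOTAL = 4.0                        # daily gas availability (same units as g)

def oil(g):
    return np.sum(a * (1.0 - np.exp(-b * g)))

res = minimize(
    lambda g: -oil(g),                               # maximise total oil
    x0=np.full(3, G_TOTAL / 3),                      # start from an even split
    bounds=[(0.0, G_TOTAL)] * 3,
    constraints=[{"type": "ineq", "fun": lambda g: G_TOTAL - g.sum()}],
    method="SLSQP",
)
print(res.x, -res.fun)   # per-well gas allocation and total oil produced
```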

  4. Genetic Algorithm Optimizes Q-LAW Control Parameters

    Science.gov (United States)

    Lee, Seungwon; von Allmen, Paul; Petropoulos, Anastassios; Terrile, Richard

    2008-01-01

    A document discusses a multi-objective genetic algorithm designed to optimize Lyapunov feedback control law (Q-law) parameters in order to efficiently find Pareto-optimal solutions for low-thrust trajectories for electric propulsion systems. These would be propellant-optimal solutions for a given flight time, or flight-time-optimal solutions for a given propellant requirement. The approximate solutions are used as good initial solutions for high-fidelity optimization tools. When good initial solutions are used, the high-fidelity optimization tools quickly converge to a locally optimal solution near the initial solution. Q-law control parameters are represented as real-valued genes in the genetic algorithm. The performance of the Q-law control parameters is evaluated in the multi-objective space (flight time vs. propellant mass) and sorted by the non-dominated sorting method, which assigns a better fitness value to solutions that are dominated by fewer other solutions. With the ranking result, the genetic algorithm encourages the solutions with higher fitness values to participate in the reproduction process, improving the solutions over the course of evolution. The population of solutions converges to the Pareto front that is permitted within the Q-law control parameter space.
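    The ranking rule mentioned above, fitness improving as the number of dominating solutions decreases, can be written in a few lines. The sketch below counts dominators for a toy population of (flight time, propellant mass) pairs, both to be minimized; the numbers are illustrative only.

```python
import numpy as np

def domination_counts(objs):
    """For each solution, count how many others dominate it
    (both objectives minimised, e.g. flight time and propellant mass).
    Fewer dominators -> better fitness."""
    objs = np.asarray(objs, dtype=float)
    n = len(objs)
    counts = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(objs[j] <= objs[i]) and np.any(objs[j] < objs[i]):
                counts[i] += 1
    return counts

pop = [[300.0, 120.0], [280.0, 150.0], [320.0, 110.0], [310.0, 160.0]]
print(domination_counts(pop))   # [0 0 0 2]: only the last point is dominated
```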

  5. Zipf's law and influential factors of the Pareto exponent of the city size distribution: Evidence from China

    OpenAIRE

    GAO Hongying; WU Kangping

    2007-01-01

    This paper estimates the Pareto exponent of the city size (population size and economy size) distribution for all provinces and three regions in China in 1997, 2000 and 2003 by OLS, comparatively analyzes the Pareto exponent across sections and over time, and empirically analyzes the factors which impact the Pareto exponents of the provinces. Our analyses show that the size distributions of cities in China follow the Pareto distribution and exhibit structural features. Variations in the value of the P...
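    The OLS estimation referred to above is typically a rank-size regression: sort sizes in decreasing order and regress log rank on log size, the negative slope being the Pareto (Zipf) exponent. A minimal sketch, with a synthetic Pareto sample standing in for the Chinese city-size data:

```python
import numpy as np

def pareto_exponent_ols(sizes):
    """OLS estimate of the Pareto (Zipf) exponent from a sample of city sizes:
    regress log(rank) on log(size); the negative slope is the exponent."""
    sizes = np.sort(np.asarray(sizes, dtype=float))[::-1]
    ranks = np.arange(1, len(sizes) + 1)
    slope, intercept = np.polyfit(np.log(sizes), np.log(ranks), 1)
    return -slope

# Synthetic check: sample from a pure Pareto law with exponent 1 (Zipf's law).
rng = np.random.default_rng(0)
sample = (1.0 - rng.random(2000)) ** (-1.0 / 1.0)   # inverse-CDF sampling
print(pareto_exponent_ols(sample))                   # should be close to 1
```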

  6. Global shape optimization of airfoil using multi-objective genetic algorithm

    International Nuclear Information System (INIS)

    Lee, Ju Hee; Lee, Sang Hwan; Park, Kyoung Woo

    2005-01-01

    The shape optimization of an airfoil has been performed for an incompressible viscous flow. In this study, Pareto frontier sets, which are global and non-dominated solutions, can be obtained without various weighting factors by using the multi-objective genetic algorithm. An NACA0012 airfoil is considered as the baseline model, and the profile of the airfoil is parameterized and rebuilt with four Bezier curves. The two curves from the leading edge to the maximum thickness are composed of five control points each, and the rest, from the maximum thickness to the trailing edge, are composed of four control points each. There are eighteen design variables and two objective functions, the lift and drag coefficients. A generation is made up of forty-five individuals. After fifteen generations of evolution, twenty Pareto individuals are obtained. One Pareto solution, the best in terms of drag reduction, improves the drag by 13% and the lift-drag ratio by 2%. Another Pareto solution, focused on increasing the lift force, improves the lift force by 61% while sustaining its drag force, compared with the baseline model.

  7. Global shape optimization of airfoil using multi-objective genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ju Hee; Lee, Sang Hwan [Hanyang Univ., Seoul (Korea, Republic of); Park, Kyoung Woo [Hoseo Univ., Asan (Korea, Republic of)

    2005-10-01

    The shape optimization of an airfoil has been performed for an incompressible viscous flow. In this study, Pareto frontier sets, which are global and non-dominated solutions, can be obtained without various weighting factors by using the multi-objective genetic algorithm. An NACA0012 airfoil is considered as the baseline model, and the profile of the airfoil is parameterized and rebuilt with four Bezier curves. The two curves from the leading edge to the maximum thickness are composed of five control points each, and the rest, from the maximum thickness to the trailing edge, are composed of four control points each. There are eighteen design variables and two objective functions, the lift and drag coefficients. A generation is made up of forty-five individuals. After fifteen generations of evolution, twenty Pareto individuals are obtained. One Pareto solution, the best in terms of drag reduction, improves the drag by 13% and the lift-drag ratio by 2%. Another Pareto solution, focused on increasing the lift force, improves the lift force by 61% while sustaining its drag force, compared with the baseline model.

  8. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Science.gov (United States)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    Evolutionary algorithms (EAs) are algorithms inspired by nature. Within little more than a decade, hundreds of papers have reported successful applications of EAs. This paper considers the Selfish Gene Algorithm (SFGA), one of the more recent evolutionary algorithms, inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas by the biologist Richard Dawkins in 1989. Following a brief introduction to the SFGA, the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the SFGA as well as its opportunities and challenges. Accordingly, the history and the steps involved in the algorithm are discussed, and its different applications, together with an analysis of these applications, are evaluated.

  9. A Hybrid Quantum Evolutionary Algorithm with Improved Decoding Scheme for a Robotic Flow Shop Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Weidong Lei

    2017-01-01

    Full Text Available We aim at solving the cyclic scheduling problem with a single robot and flexible processing times in a robotic flow shop, which is a well-known optimization problem in advanced manufacturing systems. The objective of the problem is to find an optimal robot move sequence such that the throughput rate is maximized. We propose a hybrid algorithm based on the Quantum-Inspired Evolutionary Algorithm (QEA) and genetic operators for solving the problem. The algorithm integrates three different decoding strategies to convert quantum individuals into robot move sequences. The Q-gate is applied to update the states of Q-bits in each individual. Besides, crossover and mutation operators with adaptive probabilities are used to increase the population diversity. A repairing procedure is proposed to deal with infeasible individuals. Comparison results on both benchmark and randomly generated instances demonstrate that the proposed algorithm is more effective in solving the studied problem in terms of solution quality and computational time.
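    The paper's decoding strategies and rotation-angle policy are not reproduced here; the sketch below is only a generic illustration of the Q-bit machinery the abstract mentions: an individual is a string of (alpha, beta) amplitude pairs, observation collapses it to a binary string, and a Q-gate rotation pulls the amplitudes toward the best solution found so far. The fixed rotation angle is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

def init_qbits(n):
    """Each Q-bit is a pair (alpha, beta) with alpha^2 + beta^2 = 1;
    start in equal superposition."""
    return np.full((n, 2), 1.0 / np.sqrt(2.0))

def observe(q):
    """Collapse each Q-bit to 0/1: P(bit = 1) = beta^2."""
    return (rng.random(len(q)) < q[:, 1] ** 2).astype(int)

def rotate(q, bits, best_bits, delta=0.05 * np.pi):
    """Q-gate: rotate each Q-bit toward the corresponding bit of the best
    solution found so far (a simplified rotation-angle policy)."""
    theta = np.where(best_bits > bits, delta, np.where(best_bits < bits, -delta, 0.0))
    c, s = np.cos(theta), np.sin(theta)
    alpha, beta = q[:, 0].copy(), q[:, 1].copy()
    q[:, 0] = c * alpha - s * beta
    q[:, 1] = s * alpha + c * beta
    return q

q = init_qbits(8)
bits = observe(q)
best = np.ones(8, dtype=int)    # pretend the all-ones string is the incumbent
q = rotate(q, bits, best)
print(bits, q[:3])
```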

  10. A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.

    Science.gov (United States)

    To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine

    2015-11-14

    Given a gene and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values or because of the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of the reliability for predicted events. It has been recently proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate the costs slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. However, often, the number of regions is not large enough to provide reliable supports. We present here a method to compute efficiently event supports via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the over cost of a reconciliation. Our methods are faster than the region based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on

  11. Robust bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Using an exhaustive Monte Carlo study, we prove that, given a suitable generalized loss function, a robust Bayesian estimator of the model can be constructed. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...

  12. Strong Convergence Bound of the Pareto Index Estimator under Right Censoring

    Directory of Open Access Journals (Sweden)

    Bao Tao

    2010-01-01

    Full Text Available Let {X_n, n ≥ 1} be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function F(x) = 1 − x^(−1/γ) l_F(x) with γ > 0, where l_F(x) represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right-censored Pareto index estimator under second-order regularly varying conditions.

  13. A backtracking evolutionary algorithm for power systems

    Directory of Open Access Journals (Sweden)

    Chiou Ji-Pyng

    2017-01-01

    Full Text Available This paper presents a backtracking variable scaling hybrid differential evolution, called backtracking VSHDE, for solving the optimal network reconfiguration problem for power loss reduction in distribution systems. The concepts of backtracking, a variable scaling factor, migrating and accelerated operations, and a boundary control mechanism are embedded in the original differential evolution (DE) to form the backtracking VSHDE. The backtracking and boundary control mechanisms increase the population diversity, and, according to the convergence property of the population, the scaling factor is adjusted based on the 1/5 success rule of evolution strategies (ESs). A larger population size must be used in evolutionary algorithms (EAs) to maintain population diversity; to overcome this drawback, two operations, the acceleration operation and the migrating operation, are embedded into the proposed method. The feeder reconfiguration of distribution systems is modelled as an optimization problem which aims at achieving the minimum loss subject to voltage and current constraints, so the proper system topology that reduces the power loss for a given load pattern is an important issue. Mathematically, the network reconfiguration problem is a nonlinear programming problem with integer variables. One three-feeder network reconfiguration system from the literature is studied using the proposed backtracking VSHDE method and simulated annealing (SA). Numerical results show that the proposed method outperforms the SA method.
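    The 1/5 success rule cited above adapts the scaling factor from the recent success rate of trial vectors. A textbook sketch is shown below; the multiplier c and the bookkeeping of successes are illustrative assumptions rather than the paper's exact implementation.

```python
def adjust_scaling_factor(F, successes, trials, c=0.82):
    """1/5 success rule from evolution strategies, as referenced in the abstract:
    if more than 1/5 of recent trial vectors improved on their parents, enlarge
    the scaling factor to explore; if fewer, shrink it to exploit."""
    rate = successes / max(trials, 1)
    if rate > 0.2:
        return F / c      # expand the search
    if rate < 0.2:
        return F * c      # contract the search
    return F

F = 0.8
F = adjust_scaling_factor(F, successes=3, trials=30)   # low success -> smaller F
print(F)
```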

  14. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm for multivariate probability distribution random generation, rather than as a function optimizer. Finally, some relevant applications of genetic algorithm to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, design of experiments.

  15. WH-EA: An Evolutionary Algorithm for Wiener-Hammerstein System Identification

    Directory of Open Access Journals (Sweden)

    J. Zambrano

    2018-01-01

    Full Text Available Current methods to identify Wiener-Hammerstein systems using the Best Linear Approximation (BLA) involve at least two steps. First, the BLA is split into the front and back linear dynamics of the Wiener-Hammerstein model. Second, a refitting procedure of all parameters is carried out to reduce modelling errors. In this paper, a novel approach to identify Wiener-Hammerstein systems in a single step is proposed. This approach is based on a customized evolutionary algorithm (WH-EA) able to look for the best BLA split, capturing at the same time the process static nonlinearity with high precision. Furthermore, to correct possible errors in the BLA estimation, the locations of poles and zeros are subtly modified within an adequate search space to allow fine-tuning of the model. The performance of the proposed approach is analysed by using a demonstration example and a nonlinear system identification benchmark.

  16. Axiomatizations of Pareto Equilibria in Multicriteria Games

    NARCIS (Netherlands)

    Voorneveld, M.; Vermeulen, D.; Borm, P.E.M.

    1997-01-01

    We focus on axiomatizations of the Pareto equilibrium concept in multicriteria games based on consistency. Axiomatizations of the Nash equilibrium concept by Peleg and Tijs (1996) and Peleg, Potters, and Tijs (1996) have immediate generalizations. The axiomatization of Norde et al. (1996) cannot be

  17. Support vector machines and evolutionary algorithms for classification single or together?

    CERN Document Server

    Stoean, Catalin

    2014-01-01

    When discussing classification, support vector machines are known to be a capable and efficient technique for learning and predicting with high accuracy within a quick time frame. Yet, the black-box way in which they do so makes practical users quite circumspect about relying on them without much understanding of the how and why of their predictions. The question raised in this book is how this 'masked hero' can be made more comprehensible and friendly to the public: provide a surrogate model for its hidden optimization engine, replace the method completely, or appoint a more friendly approach to tag along and offer the much desired explanations? Evolutionary algorithms can do all of these, and this book presents such possibilities of achieving high accuracy, comprehensibility, reasonable runtime as well as unconstrained performance.

  18. Ranking of microRNA target prediction scores by Pareto front analysis.

    Science.gov (United States)

    Sahoo, Sudhakar; Albrecht, Andreas A

    2010-12-01

    Over the past ten years, a variety of microRNA target prediction methods have been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and - unlike standard thresholding methods - utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score, with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts. Copyright © 2010 Elsevier Ltd. All rights reserved.
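    A minimal sketch of the recurrent Pareto-front ranking described above is given below. The two prediction scores and the tie-breaking score are illustrative stand-ins for the PITA, RNAhybrid and STarMir scores, and lower values are assumed to be better for both objectives.

```python
def pareto_front(points):
    """Indices of non-dominated points, assuming lower scores are better
    for both prediction methods."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(q[k] <= p[k] for k in range(len(p))) and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

def rank_targets(scores, tie_break):
    """Iteratively take the best element of the current Pareto front
    (ties broken by a secondary score) and recompute the front,
    mimicking the recurrent procedure described in the abstract."""
    remaining = list(range(len(scores)))
    ranking = []
    while remaining:
        front = [remaining[i] for i in pareto_front([scores[i] for i in remaining])]
        best = min(front, key=lambda i: tie_break[i])
        ranking.append(best)
        remaining.remove(best)
    return ranking

scores = [(-12.0, -20.1), (-9.5, -22.3), (-7.0, -15.0), (-11.0, -18.0)]
tie_break = [0.8, 0.6, 0.9, 0.7]     # a STarMir-like secondary score (invented)
print(rank_targets(scores, tie_break))
```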

  19. Multiobjective planning of distribution networks incorporating switches and protective devices using a memetic optimization

    International Nuclear Information System (INIS)

    Pombo, A. Vieira; Murta-Pina, João; Pires, V. Fernão

    2015-01-01

    A multi-objective planning approach for the reliability of electric distribution networks using a memetic optimization is presented. In this reliability optimization, the type of the equipment (switches or reclosers) and their location are optimized. The multiple objectives considered to find the optimal values for these planning variables are the minimization of the total equipment cost and, at the same time, the minimization of two distribution network reliability indexes. The reliability indexes are the system average interruption frequency index (SAIFI) and the system average interruption duration index (SAIDI). To solve this problem a memetic evolutionary algorithm is proposed, which combines the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) with a local search algorithm. The obtained Pareto-optimal front contains solutions with different trade-offs with respect to the three objectives. A real distribution network is used to test the proposed algorithm. The obtained results show that this approach allows the utility to obtain the optimal type and location of the equipment to achieve the best reliability at the lowest cost. - Highlights: • Reliability indexes SAIFI and SAIDI and equipment cost are optimized. • Optimization of equipment type, number and location on a MV network. • Memetic evolutionary algorithm with a local search algorithm is proposed. • Pareto optimal front solutions with respect to the three objective functions

  20. Income inequality in Romania: The exponential-Pareto distribution

    Science.gov (United States)

    Oancea, Bogdan; Andrei, Tudorel; Pirjol, Dan

    2017-03-01

    We present a study of the distribution of the gross personal income and income inequality in Romania, using individual tax income data, and both non-parametric and parametric methods. Comparing with official results based on household budget surveys (the Family Budgets Survey and the EU-SILC data), we find that the latter underestimate the income share of the high income region, and the overall income inequality. A parametric study shows that the income distribution is well described by an exponential distribution in the low and middle incomes region, and by a Pareto distribution in the high income region with Pareto coefficient α = 2.53. We note an anomaly in the distribution in the low incomes region (∼9,250 RON), and present a model which explains it in terms of partial income reporting.
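    A hedged sketch of the two-branch density implied by the abstract, an exponential body matched continuously to a Pareto tail at a knot point, is shown below; the threshold, temperature and normalisation are illustrative choices, and only the tail exponent α = 2.53 is taken from the abstract.

```python
import numpy as np

def exp_pareto_pdf(x, T=1.0, x_t=4.0, alpha=2.53):
    """Hypothetical two-branch income density: exponential below a threshold
    x_t and a Pareto power law above it, with the two branches matched at x_t
    so that the density is continuous (overall normalisation constant omitted
    for clarity; parameter values are illustrative, not the paper's fit)."""
    x = np.asarray(x, dtype=float)
    body = np.exp(-x / T) / T
    tail_scale = np.exp(-x_t / T) / T            # value of the body at the knot
    tail = tail_scale * (x / x_t) ** (-(alpha + 1.0))
    return np.where(x <= x_t, body, tail)

xs = np.linspace(0.1, 10.0, 5)
print(exp_pareto_pdf(xs))
```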

  1. [Origination of Pareto distribution in complex dynamic systems].

    Science.gov (United States)

    Chernavskiĭ, D S; Nikitin, A P; Chernavskaia, O D

    2008-01-01

    The Pareto distribution, whose probability density function can be approximated at sufficiently large x as ρ(x) ~ x^(−α) with α ≥ 2, is of crucial importance from both the theoretical and practical point of view. The main reason is its qualitative distinction from the normal (Gaussian) distribution: the probability of large deviations is significantly higher. The notion that the Gauss law is universally applicable remains widespread despite the lack of objective confirmation in a variety of application areas. The origin of the Pareto distribution in dynamic systems located in a Gaussian noise field is considered. A simple one-dimensional model is discussed in which the system response over a rather wide interval of the variable can be quite precisely approximated by this distribution.

  2. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    Science.gov (United States)

    Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok

    2014-10-01

    A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and of the overall torus size or volume, which determine a significant part of the cost of realizing the SMES. The best trade-off between the necessary conductor length for the winding and the overall magnet size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, and vice versa. The final choice among the Pareto-optimal configurations can be made in relation to other issues such as the AC loss during transient operation, the stray magnetic field outside the coil assembly, and the available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low-temperature niobium-titanium based Rutherford-type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.

  3. Optimal configuration of power grid sources based on optimal particle swarm algorithm

    Science.gov (United States)

    Wen, Yuanhua

    2018-04-01

    In order to optimize the configuration of power grid sources, an improved particle swarm optimization algorithm is proposed. First, the concepts of multi-objective optimization and the Pareto solution set are introduced. Then, the performance of the classical genetic algorithm, the classical particle swarm optimization algorithm and the improved particle swarm optimization algorithm is analyzed, and the three algorithms are simulated respectively. Comparison of the test results of each algorithm demonstrates the superiority of the proposed algorithm in convergence and optimization performance, which lays the foundation for the subsequent solution of the micro-grid power optimization configuration problem.

  4. How Well Do We Know Pareto Optimality?

    Science.gov (United States)

    Mathur, Vijay K.

    1991-01-01

    Identifies sources of ambiguity in economics textbooks' discussion of the condition for an efficient output mix. Points out that diverse statements without accompanying explanations create confusion among students. Argues that conflicting views concerning the concept of Pareto optimality are one source of ambiguity. Suggests clarifying additions to…

  5. A practical multi-objective PSO algorithm for optimal operation management of distribution network with regard to fuel cell power plants

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher; Meymand, Hamed Zeinoddini; Mojarrad, Hasan Doagou [Department of Electrical and Electronics Engineering, Shiraz University of Technology, Shiraz, P.O. 71555-313 (Iran, Islamic Republic of)

    2011-05-15

    In this paper a novel Multi-objective fuzzy self adaptive hybrid particle swarm optimization (MFSAHPSO) evolutionary algorithm to solve the Multi-objective optimal operation management (MOOM) is presented. The purposes of the MOOM problem are to decrease the total electrical energy losses, the total electrical energy cost and the total pollutant emission produced by fuel cells and substation bus. Conventional algorithms used to solve the multi-objective optimization problems convert the multiple objectives into a single objective, using a vector of the user-predefined weights. In this conversion several deficiencies can be detected. For instance, the optimal solution of the algorithms depends greatly on the values of the weights and also some of the information may be lost in the conversion process and so this strategy is not expected to provide a robust solution. This paper presents a new MFSAHPSO algorithm for the MOOM problem. The proposed algorithm maintains a finite-sized repository of non-dominated solutions which gets iteratively updated in the presence of new solutions. Since the objective functions are not the same, a fuzzy clustering technique is used to control the size of the repository, within the limits. The proposed algorithm is tested on a distribution test feeder and the results demonstrate the capabilities of the proposed approach, to generate true and well-distributed Pareto-optimal non-dominated solutions of the MOOM problem. (author)

  6. Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.

    Science.gov (United States)

    Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin

    2015-02-01

    To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets. © The Author(s) 2014.

  7. Spatial redistribution of irregularly-spaced Pareto fronts for more intuitive navigation and solution selection

    NARCIS (Netherlands)

    A. Bouter (Anton); K. Pirpinia (Kleopatra); T. Alderliesten (Tanja); P.A.N. Bosman (Peter)

    2017-01-01

    A multi-objective optimization approach is often followed by an a posteriori decision-making process, during which the most appropriate solution of the Pareto set is selected by a professional in the field. Conventional visualization methods do not correct for Pareto fronts with

  8. Multi-objective trajectory optimization of Space Manoeuvre Vehicle using adaptive differential evolution and modified game theory

    Science.gov (United States)

    Chai, Runqi; Savvaris, Al; Tsourdos, Antonios; Chai, Senchun

    2017-07-01

    Highly constrained trajectory optimization for Space Manoeuvre Vehicles (SMV) is a challenging problem. In practice, this problem becomes more difficult when multiple mission requirements are taken into account. Because of the nonlinearity in the dynamic model and even in the objectives, it is usually hard for designers to generate a compromise trajectory without violating strict path and box constraints. In this paper, a new multi-objective SMV optimal control model is formulated and parameterized using a combined shooting-collocation technique. A modified game theory approach, coupled with an adaptive differential evolution algorithm, is designed in order to generate the Pareto front of the multi-objective trajectory optimization problem. In addition, to improve the quality of the obtained solutions, a control logic is embedded in the framework of the proposed approach. Several existing multi-objective evolutionary algorithms are studied and compared with the proposed method. Simulation results indicate that, without driving the solution out of the feasible region, the proposed method can perform better in terms of convergence ability and convergence speed than its counterparts. Moreover, the quality of the Pareto set generated using the proposed method is higher than that of other multi-objective evolutionary algorithms, which means the newly proposed algorithm is more attractive for solving the multi-criteria SMV trajectory planning problem.

  9. Optimal operational strategies for a day-ahead electricity market in the presence of market power using multi-objective evolutionary algorithms

    Science.gov (United States)

    Rodrigo, Deepal

    2007-12-01

    This dissertation introduces a novel approach for optimally operating a day-ahead electricity market not only by economically dispatching the generation resources but also by minimizing the influences of market manipulation attempts by the individual generator-owning companies while ensuring that the power system constraints are not violated. Since economic operation of the market conflicts with the individual profit maximization tactics such as market manipulation by generator-owning companies, a methodology that is capable of simultaneously optimizing these two competing objectives has to be selected. Although numerous previous studies have been undertaken on the economic operation of day-ahead markets and other independent studies have been conducted on the mitigation of market power, the operation of a day-ahead electricity market considering these two conflicting objectives simultaneously has not been undertaken previously. These facts provided the incentive and the novelty for this study. A literature survey revealed that many of the traditional solution algorithms convert multi-objective functions into either a single-objective function using weighting schemas or undertake optimization of one function at a time. Hence, these approaches do not truly optimize the multi-objectives concurrently. Due to these inherent deficiencies of the traditional algorithms, the use of alternative non-traditional solution algorithms for such problems has become popular and widely used. Of these, multi-objective evolutionary algorithms (MOEA) have received wide acceptance due to their solution quality and robustness. In the present research, three distinct algorithms were considered: a non-dominated sorting genetic algorithm II (NSGA II), a multi-objective tabu search algorithm (MOTS) and a hybrid of multi-objective tabu search and genetic algorithm (MOTS/GA). The accuracy and quality of the results from these algorithms for applications similar to the problem investigated here

  10. Thermo-economic multi-objective optimization of solar dish-Stirling engine by implementing evolutionary algorithm

    International Nuclear Information System (INIS)

    Ahmadi, Mohammad H.; Sayyaadi, Hoseyn; Mohammadi, Amir H.; Barranco-Jimenez, Marco A.

    2013-01-01

    Highlights: • Thermo-economic multi-objective optimization of solar dish-Stirling engine is studied. • Application of the evolutionary algorithm is investigated. • Error analysis is done to find out the error through investigation. - Abstract: In recent years, remarkable attention has been drawn to the Stirling engine due to its noticeable advantages; for instance, many resources such as biomass, fossil fuels and solar energy can be applied as its heat source. A great number of studies have been conducted on the Stirling engine, and finite-time thermo-economics is one of them. In the present study, the dimensionless thermo-economic objective function, thermal efficiency and dimensionless power output are optimized for a dish-Stirling system using finite-time thermo-economic analysis and the NSGA-II algorithm. Optimized answers are chosen from the results using three decision-making methods. Error analysis is done to find out the error through the investigation.

  11. Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions

    Science.gov (United States)

    Liu, C.; Charpentier, R.R.; Su, J.

    2011-01-01

    Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters. ?? 2011 International Association for Mathematical Geosciences.
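    As a rough illustration of the second (log-geometric) method, the sketch below bins a synthetic Pareto sample into geometrically growing size classes and recovers the shape parameter from the ratios of adjacent bin counts; the bin ratio of 2 and the simple averaging of count ratios are simplifying assumptions, not the exact procedure of the paper.

```python
import numpy as np

def log_geometric_shape(sizes, ratio=2.0):
    """Rough sketch of a log-geometric estimate: bin sizes into classes whose
    boundaries grow geometrically, then estimate the Pareto shape parameter k
    from adjacent bin counts. For an ideal Pareto sample the counts drop by
    ratio**(-k) from one bin to the next, so k = -log(count ratio)/log(ratio)."""
    sizes = np.asarray(sizes, dtype=float)
    lo, hi = sizes.min(), sizes.max()
    n_bins = int(np.ceil(np.log(hi / lo) / np.log(ratio)))
    edges = lo * ratio ** np.arange(n_bins + 1)
    counts, _ = np.histogram(sizes, bins=edges)
    valid = (counts[:-1] > 0) & (counts[1:] > 0)      # keep adjacent non-empty bins
    count_ratios = counts[1:][valid] / counts[:-1][valid]
    return -np.log(count_ratios.mean()) / np.log(ratio)

rng = np.random.default_rng(7)
sample = (1.0 - rng.random(5000)) ** (-1.0 / 1.5)   # Pareto sample with shape 1.5
print(log_geometric_shape(sample))                   # should recover a value near 1.5
```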

  12. Using Pareto points for model identification in predictive toxicology

    Science.gov (United States)

    2013-01-01

    Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649

  13. Multiobjective memetic estimation of distribution algorithm based on an incremental tournament local searcher.

    Science.gov (United States)

    Yang, Kaifeng; Mu, Li; Yang, Dongdong; Zou, Feng; Wang, Lei; Jiang, Qiaoyong

    2014-01-01

    A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Regularity-based approaches that rely on this manifold property build the probability distribution model only from global statistical information of the population, so the potential of promising individuals is not well exploited, which is not beneficial to the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Furthermore, since ε-dominance is a strategy that allows a multiobjective algorithm to obtain well-distributed solutions and has low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.
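    The ε-dominance relation used above relaxes ordinary Pareto dominance by a tolerance ε, which is what lets an archive keep only well-spread solutions cheaply. A minimal additive-form check (the paper may use a grid/box variant) looks like this:

```python
import numpy as np

def epsilon_dominates(a, b, eps=0.05):
    """a epsilon-dominates b (minimisation) if, after relaxing a by the
    tolerance eps in every objective, it is still at least as good as b
    everywhere and strictly better somewhere."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return bool(np.all(a - eps <= b) and np.any(a - eps < b))

print(epsilon_dominates([1.00, 2.00], [1.03, 2.10]))   # True with eps = 0.05
print(epsilon_dominates([1.00, 2.00], [0.90, 2.10]))   # False: b is better in f1
```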

  14. Multiobjective Memetic Estimation of Distribution Algorithm Based on an Incremental Tournament Local Searcher

    Directory of Open Access Journals (Sweden)

    Kaifeng Yang

    2014-01-01

    Full Text Available A novel hybrid multiobjective algorithm is presented in this paper, which combines a new multiobjective estimation of distribution algorithm, an efficient local searcher and ε-dominance. In addition, two multiobjective problems with variable linkages strictly based on manifold distribution are proposed. The Pareto set of a continuous multiobjective optimization problem is, in the decision space, a piecewise low-dimensional continuous manifold. Regularity-based approaches that rely on this manifold property build the probability distribution model only from global statistical information of the population, so the potential of promising individuals is not well exploited, which is not beneficial to the search and optimization process. Hence, an incremental tournament local searcher is designed to exploit local information efficiently and accelerate convergence to the true Pareto-optimal front. Furthermore, since ε-dominance is a strategy that allows a multiobjective algorithm to obtain well-distributed solutions and has low computational complexity, ε-dominance and the incremental tournament local searcher are combined here. The resulting memetic multiobjective estimation of distribution algorithm, MMEDA, is proposed accordingly. The algorithm is validated by experiments on twenty-two test problems, with and without variable linkages, of diverse complexities. Compared with three state-of-the-art multiobjective optimization algorithms, our algorithm achieves comparable results in terms of convergence and diversity metrics.

  15. Minimizing Harmonic Distortion Impact at Distribution System with Considering Large-Scale EV Load Behaviour Using Modified Lightning Search Algorithm and Pareto-Fuzzy Approach

    Directory of Open Access Journals (Sweden)

    S. N. Syed Nasir

    2018-01-01

    Full Text Available This research focuses on the optimal placement and sizing of multiple variable passive filters (VPF) to mitigate harmonic distortion due to charging stations (CS) in a 449-bus distribution network. There are 132 CS units, which are scheduled based on user behaviour within 24 hours at 15-minute intervals. By considering the varying CS patterns and their harmonic impact, a Modified Lightning Search Algorithm (MLSA) is used to find the coordination of 22 VPF units, so that fewer harmonics are injected from the 415 V bus into the medium voltage network and power loss is also reduced. The power system harmonic flow, VPF, CS, battery, and the analysis are modelled in the MATLAB/m-file platform. High Performance Computing (HPC) is used to make the simulation faster. A Pareto-Fuzzy technique is used to obtain the sizing of the VPF from all nondominated solutions. From the results, the optimal placements and sizes of the VPF are able to reduce the maximum THD for voltage and current and also the total apparent losses by up to 39.14%, 52.5%, and 2.96%, respectively. Therefore, it can be concluded that the MLSA is a suitable method to mitigate harmonics and is beneficial in minimizing the impact of aggressive CS installation in the distribution network.

  16. The Primary Experiments of an Analysis of Pareto Solutions for Conceptual Design Optimization Problem of Hybrid Rocket Engine

    Science.gov (United States)

    Kudo, Fumiya; Yoshikawa, Tomohiro; Furuhashi, Takeshi

    Recently, the Multi-objective Genetic Algorithm, which is the application of the Genetic Algorithm to multi-objective optimization problems, has attracted attention in the engineering design field. In this field, the analysis of the design variables of the acquired Pareto solutions, which gives designers useful knowledge about the applied problem, is as important as the acquisition of advanced solutions. This paper proposes a new visualization method using Isomap which visualizes the geometric distances of solutions in the design variable space while considering their distances in the objective space. The proposed method enables a user to analyze the design variables of the acquired solutions considering their relationship in the objective space. This paper applies the proposed method to the conceptual design optimization problem of a hybrid rocket engine and studies the effectiveness of the proposed method.
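    The proposed method is a modified form of Isomap; the sketch below only shows a plain off-the-shelf Isomap embedding of Pareto-optimal design vectors into two dimensions for inspection, omitting the objective-space weighting of distances that the paper introduces. The data are random placeholders.

```python
import numpy as np
from sklearn.manifold import Isomap

# Stand-in Pareto-optimal designs: rows are solutions, columns design variables.
rng = np.random.default_rng(3)
designs = rng.random((50, 6))

# Plain Isomap embedding of the design variables into 2-D for visual analysis.
embedding = Isomap(n_neighbors=8, n_components=2).fit_transform(designs)
print(embedding.shape)   # (50, 2): coordinates to scatter-plot and inspect
```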

  17. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    Full Text Available The Pareto-Nash equilibrium set (PNES) is described as the intersection of graphs of efficient response mappings. The problem of PNES computing in finite multi-objective mixed-strategy games (Pareto-Nash games) is considered. A method for PNES computing is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.

  18. Taxon ordering in phylogenetic trees by means of evolutionary algorithms

    Directory of Open Access Journals (Sweden)

    Cerutti Francesco

    2011-07-01

    Full Text Available Abstract Background In a typical "left-to-right" phylogenetic tree, the vertical order of taxa is meaningless, as only the branch path between them reflects their degree of similarity. To make unresolved trees more informative, here we propose an innovative Evolutionary Algorithm (EA) method to search for the best graphical representation of unresolved trees, in order to give a biological meaning to the vertical order of taxa. Methods Starting from a West Nile virus phylogenetic tree, in a (1 + 1)-EA we evolved it by randomly rotating the internal nodes and selecting the tree with better fitness every generation. The fitness is a sum of genetic distances between the considered taxon and the r (radius) next taxa. After having set the radius to the best performance, we evolved the trees with (λ + μ)-EAs to study the influence of the population on the algorithm. Results The (1 + 1)-EA consistently outperformed a random search, and better results were obtained setting the radius to 8. The (λ + μ)-EAs performed as well as the (1 + 1)-EA, except for the largest population (1000 + 1000). Conclusions The trees after the evolution showed an improvement both of the fitness (based on a genetic distance matrix, so that close taxa are actually genetically close) and of the biological interpretation. Samples collected in the same state or year moved closer to each other, making the tree easier to interpret. Biological relationships between samples are also easier to observe.
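    A toy version of the fitness and mutation described above is sketched below: the fitness sums genetic distances from each taxon to the r taxa drawn directly below it, and mutation randomly flips the children of internal nodes, which reorders the taxa without changing the topology. The four-taxon distance matrix is invented for illustration, and the fitness is assumed to be minimised.

```python
import random

# Toy symmetric distance matrix between four taxa (the paper uses genetic distances).
D = {("A", "B"): 1, ("A", "C"): 4, ("A", "D"): 5,
     ("B", "C"): 3, ("B", "D"): 4, ("C", "D"): 2}

def dist(x, y):
    return 0 if x == y else D.get((x, y), D.get((y, x)))

def leaves(tree):
    """Left-to-right taxon order induced by the current drawing of the tree."""
    return [tree] if isinstance(tree, str) else leaves(tree[0]) + leaves(tree[1])

def fitness(tree, r=2):
    """Sum of distances from each taxon to the r taxa drawn directly below it;
    smaller is better, so vertically adjacent taxa end up genetically close."""
    order = leaves(tree)
    return sum(dist(order[i], order[j])
               for i in range(len(order))
               for j in range(i + 1, min(i + 1 + r, len(order))))

def flip_random_nodes(tree, p=0.5):
    """Randomly swap the children of internal nodes (each with probability p);
    this changes the drawn order of taxa without changing the tree topology."""
    if isinstance(tree, str):
        return tree
    left = flip_random_nodes(tree[0], p)
    right = flip_random_nodes(tree[1], p)
    return (right, left) if random.random() < p else (left, right)

random.seed(0)
tree = (("A", "C"), ("B", "D"))
for _ in range(100):                      # (1 + 1)-EA: keep the child if not worse
    child = flip_random_nodes(tree)
    if fitness(child) <= fitness(tree):
        tree = child
print(leaves(tree), fitness(tree))
```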

  19. An evolutionary algorithm for port-of-entry security optimization considering sensor thresholds

    International Nuclear Information System (INIS)

    Concho, Ana Lisbeth; Ramirez-Marquez, Jose Emmanuel

    2010-01-01

    According to the US Customs and Border Protection (CBP), the number of offloaded ship cargo containers arriving at US seaports each year amounts to more than 11 million. The costs of locating an undetonated terrorist weapon at one US port, or even worst, the cost caused by a detonated weapon of mass destruction, would amount to billions of dollars. These costs do not yet account for the devastating consequences that it would cause in the ability to keep the supply chain operating and the sociological and psychological effects. As such, this paper is concerned with developing a container inspection strategy that minimizes the total cost of inspection while maintaining a user specified detection rate for 'suspicious' containers. In this respect and based on a general decision-tree model, this paper presents a holistic evolutionary algorithm for finding the following: (1) optimal threshold values for every sensor and (2) the optimal configuration of the inspection strategy. The algorithm is under the assumption that different sensors with different reliability and cost characteristics can be used. Testing and experimentation show the proposed approach consistently finds high quality solutions in a reduced computational time.

  20. Improvements in seismic event locations in a deep western U.S. coal mine using tomographic velocity models and an evolutionary search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Adam Lurka; Peter Swanson [Central Mining Institute, Katowice (Poland)

    2009-09-15

    Methods of improving seismic event locations were investigated as part of a research study aimed at reducing ground control safety hazards. Seismic event waveforms collected with a 23-station three-dimensional sensor array during longwall coal mining provide the data set used in the analyses. A spatially variable seismic velocity model is constructed using seismic event sources in a passive tomographic method. The resulting three-dimensional velocity model is used to relocate seismic event positions. An evolutionary optimization algorithm is implemented and used in both the velocity model development and in seeking improved event location solutions. Results obtained using the different velocity models are compared. The combination of the tomographic velocity model development and evolutionary search algorithm provides improvement to the event locations. 13 refs., 5 figs., 4 tabs.

  1. Using Self-Adaptive Evolutionary Algorithms to Evolve Dynamism-Oriented Maps for a Real Time Strategy Game

    OpenAIRE

    Lara-Cabrera, Raúl; Cotta, Carlos; Fernández Leiva, Antonio J.

    2013-01-01

    This work presents a procedural content generation system that uses an evolutionary algorithm in order to generate interesting maps for a real-time strategy game, called Planet Wars. Interestingness is here captured by the dynamism of games (i.e., the extent to which they are action-packed). We consider two different approaches to measure the dynamism of the games resulting from these generated maps, one based on fluctuations in the resources controlled by either player and another one based ...

  2. Pareto Principle in Datamining: an Above-Average Fencing Algorithm

    Directory of Open Access Journals (Sweden)

    K. Macek

    2008-01-01

    Full Text Available This paper formulates a new datamining problem: which subset of the input space has the relatively highest output, given a minimal size for this subset. This can be useful where usual datamining methods fail because of error distribution asymmetry. The paper provides a novel algorithm for this datamining problem and compares it with clustering of above-average individuals.

  3. Many-Objective Optimization Using Adaptive Differential Evolution with a New Ranking Method

    Directory of Open Access Journals (Sweden)

    Xiaoguang He

    2014-01-01

    Full Text Available Pareto dominance is an important concept and is usually used in multiobjective evolutionary algorithms (MOEAs) to determine the nondominated solutions. However, for many-objective problems, when Pareto dominance is used to rank solutions, most of the obtained solutions are nondominated even in early generations, which leaves MOEAs with little selection pressure toward the optimal solutions. In this paper, a new ranking method is proposed for many-objective optimization problems to identify a relatively smaller number of representative nondominated solutions with a uniform and wide distribution and to improve the selection pressure of MOEAs. After that, a many-objective differential evolution with the new ranking method (MODER) for handling many-objective optimization problems is designed. Finally, experiments are conducted and the proposed algorithm is compared with several well-known algorithms. The experimental results show that the proposed algorithm can guide the search to converge to the true PF and maintain the diversity of solutions for many-objective problems.
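
    For reference, the Pareto-dominance ranking discussed above can be written in a few lines. The sketch below, assuming all objectives are minimized, filters a population down to its nondominated members and illustrates why nearly every solution survives this filter as the number of objectives grows; it is a generic illustration, not the MODER ranking method itself.

    import random

    def dominates(a, b):
        # a dominates b if a is no worse in every objective and strictly better in at least one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def nondominated(population):
        # Keep the objective vectors that no other member of the population dominates.
        return [p for p in population
                if not any(dominates(q, p) for q in population if q is not p)]

    # With ten objectives, most random points end up mutually nondominated.
    pop = [[random.random() for _ in range(10)] for _ in range(100)]
    print(len(nondominated(pop)))   # typically close to 100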

  4. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    International Nuclear Information System (INIS)

    Agterberg, Frits

    2017-01-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that
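
    The Pareto-coefficient estimation mentioned above, fitting a straight line on a log rank-log size plot, can be sketched as follows. The synthetic deposit sizes and the tail cut-off fraction are illustrative assumptions, not the worldwide deposit data used in the paper.

    import numpy as np

    def pareto_coefficient(sizes, tail_fraction=0.1):
        # Fit log(rank) = a - k * log(size) over the largest `tail_fraction` of observations;
        # the slope magnitude k is the upper-tail Pareto coefficient.
        sizes = np.sort(np.asarray(sizes, dtype=float))[::-1]     # descending order
        ranks = np.arange(1, len(sizes) + 1)
        n_tail = max(2, int(tail_fraction * len(sizes)))
        slope, _ = np.polyfit(np.log(sizes[:n_tail]), np.log(ranks[:n_tail]), 1)
        return -slope

    # Synthetic check: sizes drawn from a Pareto distribution with shape parameter 1.5.
    rng = np.random.default_rng(0)
    sizes = rng.pareto(1.5, 5000) + 1.0
    print(pareto_coefficient(sizes))   # roughly 1.5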

  5. Pareto-Lognormal Modeling of Known and Unknown Metal Resources. II. Method Refinement and Further Applications

    Energy Technology Data Exchange (ETDEWEB)

    Agterberg, Frits, E-mail: agterber@nrcan.gc.ca [Geological Survey of Canada (Canada)

    2017-07-01

    Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that

  6. Beam configuration selection for robust intensity-modulated proton therapy in cervical cancer using Pareto front comparison.

    Science.gov (United States)

    van de Schoot, A J A J; Visser, J; van Kesteren, Z; Janssen, T M; Rasch, C R N; Bel, A

    2016-02-21

    The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D(99%)) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D(99%), rectum V(30Gy) and bladder V(40Gy) to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configuration. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D(99%) on average by 0.2 Gy and decreased the median rectum V(30Gy) and median bladder V(40Gy) on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal

  7. Beam configuration selection for robust intensity-modulated proton therapy in cervical cancer using Pareto front comparison

    International Nuclear Information System (INIS)

    Van de Schoot, A J A J; Visser, J; Van Kesteren, Z; Rasch, C R N; Bel, A; Janssen, T M

    2016-01-01

    The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D99%) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D99%, rectum V30Gy and bladder V40Gy to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configuration. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D99% on average by 0.2 Gy and decreased the median rectum V30Gy and median bladder V40Gy on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal in

  8. A Pareto-Improving Minimum Wage

    OpenAIRE

    Eliav Danziger; Leif Danziger

    2014-01-01

    This paper shows that a graduated minimum wage, in contrast to a constant minimum wage, can provide a strict Pareto improvement over what can be achieved with an optimal income tax. The reason is that a graduated minimum wage requires high-productivity workers to work more to earn the same income as low-productivity workers, which makes it more difficult for the former to mimic the latter. In effect, a graduated minimum wage allows the low-productivity workers to benefit from second-degree pr...

  9. Bi-objective optimization for multi-modal transportation routing planning problem based on Pareto optimality

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2015-09-01

    Full Text Available Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network. The optimization is performed from two viewpoints: cost and time. Design/methodology/approach: In this study, a bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. Minimizing the total transportation cost and the total transportation time are set as the optimization objectives of the model. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by gaining its Pareto frontier. The Pareto frontier of the model can provide the multi-modal transportation operator (MTO) and customers with better decision support and it is gained by the normalized normal constraint method. Then, an experimental case study is designed to verify the feasibility of the model and Pareto optimality by using the mathematical programming software Lingo. Finally, the sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality have good performance in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of variations in demand and supply on the multi-modal transportation organization. Therefore, this method can be further promoted in practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. The Pareto frontier based sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case.

  10. Calculation of Pareto-optimal solutions to multiple-objective problems using threshold-of-acceptability constraints

    Science.gov (United States)

    Giesy, D. P.

    1978-01-01

    A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
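
    A minimal sketch of the idea, assuming two convex toy objectives and an off-the-shelf solver: one objective is minimized while a threshold-of-acceptability constraint bounds the other, and each single-objective solve contributes one Pareto-optimal point as the threshold is tightened. The objectives and the SLSQP solver choice are assumptions for illustration, not the original formulation.

    import numpy as np
    from scipy.optimize import minimize

    # Two convex objectives of a single decision variable x in [0, 1].
    f1 = lambda x: (x[0] - 0.0) ** 2
    f2 = lambda x: (x[0] - 1.0) ** 2

    pareto_points = []
    for threshold in np.linspace(0.0, 1.0, 11):
        # Minimize f1 subject to the acceptability threshold f2(x) <= threshold.
        res = minimize(f1, x0=[0.5], bounds=[(0.0, 1.0)], method="SLSQP",
                       constraints=[{"type": "ineq", "fun": lambda x, t=threshold: t - f2(x)}])
        if res.success:
            pareto_points.append((f1(res.x), f2(res.x)))

    for point in pareto_points:
        print(point)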

  11. XTALOPT version r11: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Avery, Patrick; Falls, Zackary; Zurek, Eva

    2018-01-01

    Version 11 of XTALOPT, an evolutionary algorithm for crystal structure prediction, has now been made available for download from the CPC library or the XTALOPT website, http://xtalopt.github.io. Whereas the previous versions of XTALOPT were published under the Gnu Public License (GPL), the current version is made available under the 3-Clause BSD License, which is an open source license that is recognized by the Open Source Initiative. Importantly, the new version can be executed via a command line interface (i.e., it does not require the use of a Graphical User Interface). Moreover, the new version is written as a stand-alone program, rather than an extension to AVOGADRO.

  12. The feasibility of using Pareto fronts for comparison of treatment planning systems and delivery techniques

    DEFF Research Database (Denmark)

    Ottosson, Rickard O; Engstrom, Per E; Sjöström, David

    2008-01-01

    constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics...

  13. MULTIOBJECTIVE EVOLUTIONARY ALGORITHMS APPLIED TO MICROSTRIP ANTENNAS DESIGN ALGORITMOS EVOLUTIVOS MULTIOBJETIVO APLICADOS A LOS PROYECTOS DE ANTENAS MICROSTRIP

    Directory of Open Access Journals (Sweden)

    Juliano Rodrigues Brianeze

    2009-12-01

    Full Text Available This work presents three of the main evolutionary algorithms: Genetic Algorithm, Evolution Strategy and Evolutionary Programming, applied to microstrip antennas design. Efficiency tests were performed, considering the analysis of key physical and geometrical parameters, evolution type, numerical random generators effects, evolution operators and selection criteria. These algorithms were validated through design of microstrip antennas based on the Resonant Cavity Method, and allow multiobjective optimizations, considering bandwidth, standing wave ratio and relative material permittivity. The optimal results obtained with these optimization processes, were confirmed by CST Microwave Studio commercial package.

  14. A modified gravitational search algorithm based on a non-dominated sorting genetic approach for hydro-thermal-wind economic emission dispatching

    International Nuclear Information System (INIS)

    Chen, Fang; Zhou, Jianzhong; Wang, Chao; Li, Chunlong; Lu, Peng

    2017-01-01

    Wind power is a type of clean and renewable energy, and reasonable utilization of wind power is beneficial to environmental protection and economic development. Therefore, a short-term hydro-thermal-wind economic emission dispatching (SHTW-EED) problem is presented in this paper. The proposed problem aims to distribute the load among hydro, thermal and wind power units to simultaneously minimize economic cost and pollutant emission. To solve the SHTW-EED problem with complex constraints, a modified gravitational search algorithm based on the non-dominated sorting genetic algorithm-III (MGSA-NSGA-III) is proposed. In the proposed MGSA-NSGA-III, a non-dominated sorting approach, reference-point based selection mechanism and chaotic mutation strategy are applied to improve the evolutionary process of the original gravitational search algorithm (GSA) and maintain the distribution diversity of Pareto optimal solutions. Moreover, a parallel computing strategy is introduced to improve the computational efficiency. Finally, the proposed MGSA-NSGA-III is applied to a typical hydro-thermal-wind system to verify its feasibility and effectiveness. The simulation results indicate that the proposed algorithm can obtain low economic cost and small pollutant emission when dealing with the SHTW-EED problem. - Highlights: • A hybrid algorithm is proposed to handle hydro-thermal-wind power dispatching. • Several improvement strategies are applied to the algorithm. • A parallel computing strategy is applied to improve computational efficiency. • Two cases are analyzed to verify the efficiency of the optimization model.
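
    As a rough illustration of the chaotic mutation strategy mentioned above, the sketch below perturbs a candidate solution using a logistic-map sequence. The map seed, perturbation scale and bound handling are assumptions for illustration, not the exact operator used in MGSA-NSGA-III.

    import numpy as np

    def chaotic_mutation(x, lower, upper, scale=0.1, seed=0.37):
        # Generate one chaotic value per decision variable with the logistic map
        # z_{k+1} = 4 * z_k * (1 - z_k), then perturb x within its bounds.
        x = np.asarray(x, dtype=float)
        z = np.empty_like(x)
        value = seed
        for i in range(len(x)):
            value = 4.0 * value * (1.0 - value)
            z[i] = value
        step = scale * (upper - lower) * (2.0 * z - 1.0)   # map chaos onto [-scale, +scale] of the range
        return np.clip(x + step, lower, upper)

    print(chaotic_mutation([0.5, 0.2, 0.8], lower=0.0, upper=1.0))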

  15. Study on Parameter Optimization Design of Drum Brake Based on Hybrid Cellular Multiobjective Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Yi Zhang

    2012-01-01

    Full Text Available In consideration of the significant role the brake plays in ensuring the fast and safe running of vehicles, and since the present parameter optimization design models of the brake are far from practical application, this paper proposes a multiobjective optimization model of a drum brake, aiming at maximizing the braking efficiency and minimizing the volume and temperature rise of the drum brake. As commonly used optimization algorithms have certain deficiencies, we present a differential evolution cellular multiobjective genetic algorithm (DECell) by introducing a differential evolution strategy into the canonical cellular genetic algorithm for tackling this problem. For DECell, the gained Pareto front could be as close as possible to the exact Pareto front, and the diversity of nondominated individuals could also be better maintained. The experiments on the test functions reveal that DECell performs well in solving high-dimension nonlinear multiobjective problems. The results of optimizing the new brake model indicate that DECell clearly outperforms the popular compared algorithm NSGA-II concerning the number of obtained brake design parameter sets and the speed and stability of finding them.

  16. An evolutionary algorithm for tomographic reconstructions in limited data sets problems

    International Nuclear Information System (INIS)

    Turcanu, Catrinel; Craciunescu, Teddy

    2000-01-01

    The paper proposes a new method for tomographic reconstructions. Unlike nuclear medicine applications, in physical science problems we are often confronted with limited data sets: constraints in the number of projections or limited angle views. The problem of image reconstruction from projections may be considered as a problem of finding an image (solution) having projections that match the experimental ones. In our approach, we choose a statistical correlation coefficient to evaluate the fitness of any potential solution. The optimization process is carried out by an evolutionary algorithm. Our algorithm has some problem-oriented characteristics. One of them is that a chromosome, representing a potential solution, is not linear but is coded as a matrix of pixels corresponding to a two-dimensional image. This internal representation reflects the genuine structure of the problem: slight differences between two points in the original problem space give rise to similarly slight differences once they are coded. Another particular feature is a newly built crossover operator: the grid-based crossover, suitable for high-dimension two-dimensional chromosomes. Except for the population size and the dimension of the cutting grid for the grid-based crossover, all the other parameters of the algorithm are independent of the geometry of the tomographic reconstruction. The performance of the method is evaluated in comparison with a traditional tomographic method, based on maximization of the image entropy, that has proved to work well with limited data sets. The test phantom is typical for an application with limited data sets: the determination of neutron energy spectra with time resolution in the case of short-pulsed neutron emission. Both the qualitative judgement and the quantitative one, based on several figures of merit, indicate that the proposed method ensures an improved reconstruction of shapes, sizes and resolution in the image, even in the presence of noise
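
    A minimal sketch of a grid-based crossover for two-dimensional image chromosomes, in the spirit of the operator described above: the images are partitioned by a cutting grid and one randomly chosen cell is exchanged between the parents. The grid size and the single-cell exchange are illustrative assumptions rather than the exact operator of the paper.

    import numpy as np

    def grid_crossover(parent_a, parent_b, grid=(4, 4), rng=None):
        # Split both images into grid cells and swap one randomly chosen cell between them.
        rng = rng or np.random.default_rng()
        height, width = parent_a.shape
        cell_h, cell_w = height // grid[0], width // grid[1]
        i, j = rng.integers(grid[0]), rng.integers(grid[1])
        rows = slice(i * cell_h, (i + 1) * cell_h)
        cols = slice(j * cell_w, (j + 1) * cell_w)
        child_a, child_b = parent_a.copy(), parent_b.copy()
        child_a[rows, cols] = parent_b[rows, cols]
        child_b[rows, cols] = parent_a[rows, cols]
        return child_a, child_b

    a, b = np.zeros((64, 64)), np.ones((64, 64))
    child_a, child_b = grid_crossover(a, b)
    print(child_a.sum(), child_b.sum())   # one 16x16 cell exchanged: 256.0 and 3840.0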

  17. A robust controller design method for feedback substitution schemes using genetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, Mirsha M; Hadjiloucas, Sillas; Becerra, Victor M, E-mail: s.hadjiloucas@reading.ac.uk [Cybernetics, School of Systems Engineering, University of Reading, RG6 6AY (United Kingdom)

    2011-08-17

    Controllers for feedback substitution schemes demonstrate a trade-off between noise power gain and normalized response time. Using as an example the design of a controller for a radiometric transduction process subjected to arbitrary noise power gain and robustness constraints, a Pareto-front of optimal controller solutions fulfilling a range of time-domain design objectives can be derived. In this work, we consider designs using a loop shaping design procedure (LSDP). The approach uses linear matrix inequalities to specify a range of objectives and a genetic algorithm (GA) to perform a multi-objective optimization for the controller weights (MOGA). A clonal selection algorithm is used to further provide a directed search of the GA towards the Pareto front. We demonstrate that with the proposed methodology, it is possible to design higher order controllers with superior performance in terms of response time, noise power gain and robustness.

  18. Decomposition and Simplification of Multivariate Data using Pareto Sets.

    Science.gov (United States)

    Huettenberger, Lars; Heine, Christian; Garth, Christoph

    2014-12-01

    Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.

  19. Identifying irregularly shaped crime hot-spots using a multiobjective evolutionary algorithm

    Science.gov (United States)

    Wu, Xiaolan; Grubesic, Tony H.

    2010-12-01

    Spatial cluster detection techniques are widely used in criminology, geography, epidemiology, and other fields. In particular, spatial scan statistics are popular and efficient techniques for detecting areas of elevated crime or disease events. The majority of spatial scan approaches attempt to delineate geographic zones by evaluating the significance of clusters using likelihood ratio statistics tested with the Poisson distribution. While this can be effective, many scan statistics give preference to circular clusters, diminishing their ability to identify elongated and/or irregular shaped clusters. Although adjusting the shape of the scan window can mitigate some of these problems, both the significance of irregular clusters and their spatial structure must be accounted for in a meaningful way. This paper utilizes a multiobjective evolutionary algorithm to find clusters with maximum significance while quantitatively tracking their geographic structure. Crime data for the city of Cincinnati are utilized to demonstrate the advantages of the new approach and highlight its benefits versus more traditional scan statistics.
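
    For context, the Poisson likelihood-ratio statistic that such scan methods evaluate for a candidate zone can be computed from the observed count in the zone, the expected count in the zone and the total observed count. The sketch below uses the standard Kulldorff-style form as an assumed illustration, not necessarily the exact statistic of the cited work.

    import math

    def poisson_llr(c, e, C):
        # c = observed events in the candidate zone, e = expected events in the zone (e > 0),
        # C = total observed events in the study area (expected total equals C under the null).
        if c <= e:
            return 0.0
        llr = c * math.log(c / e)
        if C - c > 0:
            llr += (C - c) * math.log((C - c) / (C - e))
        return llr

    print(poisson_llr(c=30, e=12.5, C=200))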

  20. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    Science.gov (United States)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The input to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm performs the global search of the search space, and the back-propagation algorithm performs the local search. Results have been computed by applying this approach to the recognition of 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying accuracy with relatively scarce input data, and the hybrid evolutionary back-propagation algorithm exhibits a promising improvement in convergence.

  1. Monopoly, Pareto and Ramsey mark-ups

    OpenAIRE

    Ten Raa, T.

    2009-01-01

    Monopoly prices are too high. It is a price level problem, in the sense that the relative mark-ups have Ramsey optimal proportions, at least for independent constant elasticity demands. I show that this feature of monopoly prices breaks down the moment one demand is replaced by the textbook linear demand or, even within the constant elasticity framework, dependence is introduced. The analysis provides a single Generalized Inverse Elasticity Rule for the problems of monopoly, Pareto and Ramsey.

  2. COMPROMISE, OPTIMAL AND TRACTIONAL ACCOUNTS ON PARETO SET

    Directory of Open Access Journals (Sweden)

    V. V. Lahuta

    2010-11-01

    Full Text Available The problem of optimum traction calculations is considered as a problem of optimal distribution of a resource. The dynamic programming solution is based on a step-by-step calculation of the set of Pareto-optimal values of a criterion function (energy expenses) and a resource (time).

  3. Investigating multi-objective fluence and beam orientation IMRT optimization

    Science.gov (United States)

    Potrebko, Peter S.; Fiege, Jason; Biagioli, Matthew; Poleszczuk, Jan

    2017-07-01

    Radiation Oncology treatment planning requires compromises to be made between clinical objectives that are invariably in conflict. It would be beneficial to have a ‘bird’s-eye-view’ perspective of the full spectrum of treatment plans that represent the possible trade-offs between delivering the intended dose to the planning target volume (PTV) while optimally sparing the organs-at-risk (OARs). In this work, the authors demonstrate Pareto-aware radiotherapy evolutionary treatment optimization (PARETO), a multi-objective tool featuring such bird’s-eye-view functionality, which optimizes fluence patterns and beam angles for intensity-modulated radiation therapy (IMRT) treatment planning. The problem of IMRT treatment plan optimization is managed as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. To achieve this, PARETO is built around a powerful multi-objective evolutionary algorithm, called Ferret, which simultaneously optimizes multiple fitness functions that encode the attributes of the desired dose distribution for the PTV and OARs. The graphical interfaces within PARETO provide useful information such as: the convergence behavior during optimization, trade-off plots between the competing objectives, and a graphical representation of the optimal solution database allowing for the rapid exploration of treatment plan quality through the evaluation of dose-volume histograms and isodose distributions. PARETO was evaluated for two relatively complex clinical cases, a paranasal sinus and a pancreas case. The end result of each PARETO run was a database of optimal (non-dominated) treatment plans that demonstrated trade-offs between the OAR and PTV fitness functions, which were all equally good in the Pareto-optimal sense (where no one objective can be improved without worsening at least one other). Ferret was able to produce high quality solutions even though a large number of parameters

  4. Calculating and controlling the error of discrete representations of Pareto surfaces in convex multi-criteria optimization.

    Science.gov (United States)

    Craft, David

    2010-10-01

    A discrete set of points and their convex combinations can serve as a sparse representation of the Pareto surface in multiple objective convex optimization. We develop a method to evaluate the quality of such a representation, and show by example that in multiple objective radiotherapy planning, the number of Pareto optimal solutions needed to represent Pareto surfaces of up to five dimensions grows at most linearly with the number of objectives. The method described is also applicable to the representation of convex sets. Copyright © 2009 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning

    International Nuclear Information System (INIS)

    Serna, J I; Monz, M; Kuefer, K H; Thieke, C

    2009-01-01

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  6. Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning.

    Science.gov (United States)

    Serna, J I; Monz, M; Küfer, K H; Thieke, C

    2009-10-21

    One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.

  7. Optimization of heat transfer utilizing graph based evolutionary algorithms

    International Nuclear Information System (INIS)

    Bryden, Kenneth M.; Ashlock, Daniel A.; McCorkle, Douglas S.; Urban, Gregory L.

    2003-01-01

    This paper examines the use of graph based evolutionary algorithms (GBEAs) for optimization of heat transfer in a complex system. The specific case examined in this paper is the optimization of heat transfer in a biomass cookstove utilizing three-dimensional computational fluid dynamics to generate the fitness function. In this stove hot combustion gases are used to heat a cooking surface. The goal is to provide an even spatial temperature distribution on the cooking surface by redirecting the flow of combustion gases with baffles. The variables in the optimization are the position and size of the baffles, which are described by integer values. GBEAs are a novel type of EA in which a topology or geography is imposed on an evolving population of solutions. The choice of graph controls the rate at which solutions can spread within the population, impacting the diversity of solutions and convergence rate of the EAs. In this study, the choice of graph in the GBEAs changes the number of mating events required for convergence by a factor of approximately 2.25 and the diversity of the population by a factor of 2. These results confirm that by tuning the graph and parameters in GBEAs, computational time can be significantly reduced

  8. An efficient dynamic load balancing algorithm

    Science.gov (United States)

    Lagaros, Nikos D.

    2014-01-01

    In engineering problems, randomness and uncertainties are inherent. Robust design procedures, formulated in the framework of multi-objective optimization, have been proposed in order to take into account sources of randomness and uncertainty. These design procedures require orders of magnitude more computational effort than conventional analysis or optimum design processes, since a very large number of finite element analyses must be performed. It is therefore imperative to exploit the capabilities of computing resources in order to deal with this kind of problem. In particular, parallel computing can be implemented at the level of metaheuristic optimization, by exploiting the physical parallelization feature of the nondominated sorting evolution strategies method, as well as at the level of the repeated structural analyses required for assessing the behavioural constraints and for calculating the objective functions. In this study an efficient dynamic load balancing algorithm for optimum exploitation of available computing resources is proposed and, without loss of generality, is applied for computing the desired Pareto front. In such problems the computation of the complete Pareto front with feasible designs only constitutes a very challenging task. The proposed algorithm achieves linear speedup factors and almost 100% speedup factor values with reference to the sequential procedure.
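
    A generic sketch of dynamic load balancing for the repeated analyses described above, assuming a pool of worker processes that pull new tasks as they finish rather than receiving a fixed static share; the placeholder analysis function and the pool size are illustrative assumptions, not the algorithm proposed in the paper.

    import random
    import time
    from multiprocessing import Pool

    def structural_analysis(design_id):
        # Placeholder for a finite element analysis whose run time varies strongly between designs.
        time.sleep(random.uniform(0.01, 0.1))
        return design_id, random.random()   # pretend objective value

    if __name__ == "__main__":
        designs = range(100)
        with Pool(processes=4) as pool:
            # imap_unordered hands a new task to whichever worker becomes free,
            # which balances the load when run times differ between designs.
            results = list(pool.imap_unordered(structural_analysis, designs))
        print(len(results), "analyses completed")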

  9. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  10. Multiobjective anatomy-based dose optimization for HDR-brachytherapy with constraint free deterministic algorithms

    International Nuclear Information System (INIS)

    Milickovic, N.; Lahanas, M.; Papagiannopoulou, M.; Zamboglou, N.; Baltas, D.

    2002-01-01

    In high dose rate (HDR) brachytherapy, conventional dose optimization algorithms consider multiple objectives in the form of an aggregate function that transforms the multiobjective problem into a single-objective problem. As a result, there is a loss of information on the available alternative possible solutions. This method assumes that the treatment planner exactly understands the correlation between competing objectives and knows the physical constraints. This knowledge is provided by the Pareto trade-off set obtained by single-objective optimization algorithms with a repeated optimization with different importance vectors. A mapping technique avoids non-feasible solutions with negative dwell weights and allows the use of constraint free gradient-based deterministic algorithms. We compare various such algorithms and methods which could improve their performance. This finally allows us to generate a large number of solutions in a few minutes. We use objectives expressed in terms of dose variances obtained from a few hundred sampling points in the planning target volume (PTV) and in organs at risk (OAR). We compare two- to four-dimensional Pareto fronts obtained with the deterministic algorithms and with a fast-simulated annealing algorithm. For PTV-based objectives, due to the convex objective functions, the obtained solutions are global optimal. If OARs are included, then the solutions found are also global optimal, although local minima may be present as suggested. (author)
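
    The repeated single-objective optimization with different importance vectors mentioned above can be illustrated on a convex toy problem: sweeping the weight of an aggregate objective traces out points of the Pareto trade-off set. The quadratic surrogate objectives and the solver choice are assumptions for illustration, not the dose-variance objectives of the paper.

    import numpy as np
    from scipy.optimize import minimize

    # Convex surrogates of two competing objectives of a two-dimensional dwell-weight vector x >= 0.
    f_target = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 1.0) ** 2   # target-coverage surrogate
    f_oar = lambda x: x[0] ** 2 + 0.5 * x[1] ** 2                # organ-at-risk surrogate

    trade_off_set = []
    for w in np.linspace(0.05, 0.95, 10):
        aggregate = lambda x, w=w: w * f_target(x) + (1.0 - w) * f_oar(x)
        res = minimize(aggregate, x0=[0.5, 0.5], bounds=[(0.0, None), (0.0, None)])
        trade_off_set.append((f_target(res.x), f_oar(res.x)))

    for point in trade_off_set:
        print(point)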

  11. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    Energy Technology Data Exchange (ETDEWEB)

    Bhunia, Uttam, E-mail: ubhunia@vecc.gov.in; Saha, Subimal; Chakrabarti, Alok

    2014-10-15

    Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suited to low temperature superconducting cable for a medium size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering the practical engineering constraints. The objectives include the minimization of the necessary superconductor length and the torus overall size or volume, which determine a significant part of the cost of realizing the SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, or vice versa. The final choice among Pareto-optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium–titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.

  12. Pareto optimal design of sectored toroidal superconducting magnet for SMES

    International Nuclear Information System (INIS)

    Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok

    2014-01-01

    Highlights: • The optimization approach minimizes both the magnet size and the necessary cable length of a sectored toroidal SMES unit. • The design approach is suited to low temperature superconducting cable for a medium size SMES unit. • It investigates coil parameters with respect to practical engineering aspects. - Abstract: A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage coil has been developed considering the practical engineering constraints. The objectives include the minimization of the necessary superconductor length and the torus overall size or volume, which determine a significant part of the cost of realizing the SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, or vice versa. The final choice among Pareto-optimal configurations can be made in relation to other issues such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low temperature niobium–titanium based Rutherford type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy

  13. Multi-agent Pareto appointment exchanging in hospital patient scheduling

    NARCIS (Netherlands)

    I.B. Vermeulen (Ivan); S.M. Bohte (Sander); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2007-01-01

    We present a dynamic and distributed approach to the hospital patient scheduling problem, in which patients can have multiple appointments that have to be scheduled to different resources. To efficiently solve this problem we develop a multi-agent Pareto-improvement appointment

  14. Pareto-Efficiency, Hayek’s Marvel, and the Invisible Executor

    OpenAIRE

    Kakarot-Handtke, Egmont

    2014-01-01

    This non-technical contribution to the RWER-Blog deals with the interrelations of market clearing, efficient information processing through the price system, and distribution. The point of entry is a transparent example of Pareto-efficiency taken from the popular book How Markets Fail.

  15. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    Science.gov (United States)

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.

  16. Optimization of well field management

    DEFF Research Database (Denmark)

    Hansen, Annette Kirstine

    Groundwater is a limited but important resource for fresh water supply. Different conflicting objectives are important when operating a well field. This study investigates how the management of a well field can be improved with respect to different objectives simultaneously. A framework... for optimizing well field management using multi-objective optimization is developed. The optimization uses the Strength Pareto Evolutionary Algorithm 2 (SPEA2) to find the Pareto front between the conflicting objectives. The Pareto front is a set of non-inferior optimal points and provides an important tool... for the decision-makers. The optimization framework is tested on two case studies. Both abstract around 20,000 cubic meters of water per day, but are otherwise rather different. The first case study concerns the management of Hardhof waterworks, Switzerland, where artificial infiltration of river water...

  17. Improving predicted protein loop structure ranking using a Pareto-optimality consensus method.

    Science.gov (United States)

    Li, Yaohang; Rata, Ionel; Chiu, See-wing; Jakobsson, Eric

    2010-07-20

    Accurate protein loop structure models are important to understand functions of many proteins. Identifying the native or near-native models by distinguishing them from the misfolded ones is a critical step in protein loop structure prediction. We have developed a Pareto Optimal Consensus (POC) method, which is a consensus model ranking approach to integrate multiple knowledge- or physics-based scoring functions. The procedure of identifying the models of best quality in a model set includes: 1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and 2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4- to 12-residue in length using a functional space composed of several carefully-selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of approximately 20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation, identifying a near-native model (RMSD Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.

  18. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval

  19. Bi and tri-objective optimization in the deterministic network interdiction problem

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Emmanuel Ramirez-Marquez, Jose; Salazar A, Daniel E.

    2010-01-01

    Solution approaches to the deterministic network interdiction problem have previously been developed for optimizing a single figure-of-merit of the network configuration (i.e. flow that can be transmitted between a source node and a sink node for a fixed network design) under constraints related to limited amount of resources available to interdict network links. These approaches work under the assumption that: (1) nominal capacity of each link is completely reduced when interdicted and (2) there is a single criterion to optimize. This paper presents a newly developed evolutionary algorithm that for the first time allows solving multi-objective optimization models for the design of network interdiction strategies that take into account a variety of figures-of-merit. The algorithm provides an approximation to the optimal Pareto frontier using: (a) techniques in Monte Carlo simulation to generate potential network interdiction strategies, (b) graph theory to analyze strategies' maximum source-sink flow and (c) an evolutionary search that is driven by the probability that a link will belong to the optimal Pareto set. Examples for different sizes of networks and network behavior are used throughout the paper to illustrate and validate the approach.
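
    The evaluation step described above, computing the maximum source-sink flow once a candidate set of links has been interdicted, can be sketched with a standard graph library. The small example network is invented for illustration; setting interdicted capacities to zero follows the paper's stated assumption (1).

    import networkx as nx

    def residual_max_flow(edges, interdicted, source, sink):
        # edges: {(u, v): capacity}; interdicted links have their capacity set to zero.
        G = nx.DiGraph()
        for (u, v), cap in edges.items():
            G.add_edge(u, v, capacity=0.0 if (u, v) in interdicted else cap)
        return nx.maximum_flow_value(G, source, sink)

    edges = {("s", "a"): 10, ("s", "b"): 5, ("a", "t"): 7, ("b", "t"): 8, ("a", "b"): 4}
    print(residual_max_flow(edges, interdicted=set(), source="s", sink="t"))          # 15
    print(residual_max_flow(edges, interdicted={("a", "t")}, source="s", sink="t"))   # 8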

  20. Statement of Problem of Pareto Frontier Management and Its Solution in the Analysis and Synthesis of Optimal Systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2015-01-01

    Full Text Available The article concerns multi-criteria optimization (MCO), which assumes that the operation quality criteria of the system are independent and specifies a way to improve the values of these criteria. Mutual contradiction of some criteria is a major problem in MCO. One of the most important areas of research is to obtain the so-called Pareto-optimal options. The subject of research is the Pareto front, also called the Pareto frontier. The article discusses classifications of the front by its geometric representation for the case of a two-criterion task. It presents a mathematical description of the front characteristics using the gradients and their projections. A review of current domestic and foreign literature has revealed that the aim of works on constructing the Pareto frontier is to conduct research under conditions of uncertainty, in the stochastic statement, with no restrictions. The topology both in the two- and in the three-dimensional case is under consideration. The targets of modern applications are multi-agent systems and groups of players in differential games. However, none of the considered works sets the task of providing active management of the front. The objective of this article is to discuss the research problem of the Pareto frontier in a new formulation, namely, with the active participation of the co-developers of the systems and/or the decision makers (DM) in the management of the Pareto frontier. It notes that such a formulation differs from the traditionally accepted approach based on the analysis of already existing solutions. The article discusses three ways to describe the quality of the object management system. The first way is to use direct quality criteria for the model of a closed system, such as the vibration level in general form. The second is to study a specific two-loop aircraft control system using the angular velocity and normal acceleration loops. The third is the use of integrated quality criteria. In all three cases, the selected criteria are

  1. Using Pareto optimality to explore the topology and dynamics of the human connectome.

    Science.gov (United States)

    Avena-Koenigsberger, Andrea; Goñi, Joaquín; Betzel, Richard F; van den Heuvel, Martijn P; Griffa, Alessandra; Hagmann, Patric; Thiran, Jean-Philippe; Sporns, Olaf

    2014-10-05

    Graph theory has provided a key mathematical framework to analyse the architecture of human brain networks. This architecture embodies an inherently complex relationship between connection topology, the spatial arrangement of network elements, and the resulting network cost and functional performance. An exploration of these interacting factors and driving forces may reveal salient network features that are critically important for shaping and constraining the brain's topological organization and its evolvability. Several studies have pointed to an economic balance between network cost and network efficiency with networks organized in an 'economical' small-world favouring high communication efficiency at a low wiring cost. In this study, we define and explore a network morphospace in order to characterize different aspects of communication efficiency in human brain networks. Using a multi-objective evolutionary approach that approximates a Pareto-optimal set within the morphospace, we investigate the capacity of anatomical brain networks to evolve towards topologies that exhibit optimal information processing features while preserving network cost. This approach allows us to investigate network topologies that emerge under specific selection pressures, thus providing some insight into the selectional forces that may have shaped the network architecture of existing human brains.

  2. Bio-inspired algorithms applied to molecular docking simulations.

    Science.gov (United States)

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.

  3. Optimization of operating schedule of machines in granite industry using evolutionary algorithms

    International Nuclear Information System (INIS)

    Loganthurai, P.; Rajasekaran, V.; Gnanambal, K.

    2014-01-01

    Highlights: • Operating time of machines in granite industries was studied. • Operating time has been optimized using evolutionary algorithms such as PSO and DE. • The maximum demand has been reduced. • Hence the electricity cost of the industry and feeder stress have been reduced. - Abstract: Electrical energy consumption cost plays an important role in the production cost of any industry. The electrical energy cost is calculated as a two-part tariff: the first part is the maximum demand cost and the second part is the energy consumption cost or unit cost (kW h). The maximum demand cost can be reduced without affecting production. This paper focuses on the reduction of maximum demand through a proper operating schedule of the major equipment. For this analysis, various granite industries are considered. The major equipment in granite industries comprises cutting machines, polishing machines and compressors. To reduce the maximum demand, the operating time of the polishing machine is rescheduled using optimization techniques such as Differential Evolution (DE) and particle swarm optimization (PSO). The maximum demand costs are calculated before and after rescheduling. The results show that if the machines are optimally operated, the cost is reduced. Both the DE and PSO algorithms reduce the maximum demand cost at the same rate for all the granite industries. However, the optimum schedule obtained by DE reduces the feeder power flow more than the PSO schedule does.
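
    The rescheduling idea can be illustrated with a toy differential evolution loop. The sketch below is a hypothetical illustration, not the paper's model: the machine power ratings, run lengths and the half-hour planning grid are invented, and the algorithm simply shifts start times to minimize the peak of the aggregate load curve.

        import random

        # Invented data: (power in kW, run length in half-hour slots) for each machine.
        MACHINES = [(30, 8), (30, 8), (45, 6), (45, 6), (60, 4)]
        SLOTS = 48  # one day in half-hour steps

        def peak_demand(starts):
            """Build the aggregate load curve for the given start slots and return its maximum."""
            load = [0.0] * SLOTS
            for (power, length), start in zip(MACHINES, starts):
                s = int(round(start)) % SLOTS
                for t in range(s, s + length):
                    load[t % SLOTS] += power
            return max(load)

        def differential_evolution(pop_size=20, generations=200, F=0.7, CR=0.9):
            dim = len(MACHINES)
            pop = [[random.uniform(0, SLOTS - 1) for _ in range(dim)] for _ in range(pop_size)]
            fitness = [peak_demand(ind) for ind in pop]
            for _ in range(generations):
                for i in range(pop_size):
                    a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
                    trial = [pop[a][d] + F * (pop[b][d] - pop[c][d]) if random.random() < CR else pop[i][d]
                             for d in range(dim)]
                    trial = [min(max(x, 0), SLOTS - 1) for x in trial]   # keep start times in the day
                    f = peak_demand(trial)
                    if f <= fitness[i]:                                  # greedy selection, as in classic DE
                        pop[i], fitness[i] = trial, f
            best = min(range(pop_size), key=lambda k: fitness[k])
            return pop[best], fitness[best]

        starts, peak = differential_evolution()
        print("optimized start slots:", [int(round(s)) for s in starts], "peak demand:", peak, "kW")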

  4. Efficient approximation of black-box functions and Pareto sets

    NARCIS (Netherlands)

    Rennen, G.

    2009-01-01

    In the case of time-consuming simulation models or other so-called black-box functions, we determine a metamodel which approximates the relation between the input- and output-variables of the simulation model. To solve multi-objective optimization problems, we approximate the Pareto set, i.e. the

  5. Optimization of single channel glazed photovoltaic thermal (PVT) array using Evolutionary Algorithm (EA) and carbon credit earned by the optimized array

    International Nuclear Information System (INIS)

    Singh, Sonveer; Agrawal, Sanjay; Gadh, Rajit

    2015-01-01

    Highlights: • Optimization of an SCGPVT array using an Evolutionary Algorithm. • The overall exergy gain is maximized with an Evolutionary Algorithm. • Annual performance has been evaluated for New Delhi (India). • The results improve on the model given in the literature. • Carbon credit analysis has been done. - Abstract: In this paper, the work is carried out in three steps. In the first step, optimization of a single channel glazed photovoltaic thermal (SCGPVT) array has been done with an Evolutionary Algorithm (EA), taking the overall exergy gain as the objective function of the SCGPVT array. For maximization of the overall exergy gain, a total of seven design variables have been optimized: length of the channel (L), mass flow rate of the flowing fluid (m_F), velocity of the flowing fluid (V_F), convective heat transfer coefficient through the tedlar (U_T), overall heat transfer coefficient between solar cell and ambient through the glass cover (U_S_C_A_G), overall back loss heat transfer coefficient from flowing fluid to ambient (U_F_A) and convective heat transfer coefficient of tedlar (h_T). It has been observed that the instantaneous overall exergy gain obtained from the optimized system is 1.42 kW h, which is 87.86% more than the overall exergy gain of the un-optimized system given in the literature. In the second step, the overall exergy gain and overall thermal gain of the SCGPVT array have been evaluated annually, and there are improvements of 69.52% and 88.05% in annual overall exergy gain and annual overall thermal gain, respectively, over the un-optimized system for the same input irradiance and ambient temperature. In the third step, the carbon credit earned by the optimized SCGPVT array has also been evaluated as per the norms of the Kyoto Protocol for Bangalore climatic conditions.

  6. Random phenotypic variation of yeast (Saccharomyces cerevisiae) single-gene knockouts fits a double pareto-lognormal distribution.

    Science.gov (United States)

    Graham, John H; Robb, Daniel T; Poe, Amy R

    2012-01-01

    Distributed robustness is thought to influence the buffering of random phenotypic variation through the scale-free topology of gene regulatory, metabolic, and protein-protein interaction networks. If this hypothesis is true, then the phenotypic response to the perturbation of particular nodes in such a network should be proportional to the number of links those nodes make with neighboring nodes. This suggests a probability distribution approximating an inverse power-law of random phenotypic variation. Zero phenotypic variation, however, is impossible, because random molecular and cellular processes are essential to normal development. Consequently, a more realistic distribution should have a y-intercept close to zero in the lower tail, a mode greater than zero, and a long (fat) upper tail. The double Pareto-lognormal (DPLN) distribution is an ideal candidate distribution. It consists of a mixture of a lognormal body and upper and lower power-law tails. If our assumptions are true, the DPLN distribution should provide a better fit to random phenotypic variation in a large series of single-gene knockout lines than other skewed or symmetrical distributions. We fit a large published data set of single-gene knockout lines in Saccharomyces cerevisiae to seven different probability distributions: DPLN, right Pareto-lognormal (RPLN), left Pareto-lognormal (LPLN), normal, lognormal, exponential, and Pareto. The best model was judged by the Akaike Information Criterion (AIC). Phenotypic variation among gene knockouts in S. cerevisiae fits a double Pareto-lognormal (DPLN) distribution better than any of the alternative distributions, including the right Pareto-lognormal and lognormal distributions. A DPLN distribution is consistent with the hypothesis that developmental stability is mediated, in part, by distributed robustness, the resilience of gene regulatory, metabolic, and protein-protein interaction networks. Alternatively, multiplicative cell growth, and the mixing of
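
    The model-selection step can be reproduced in outline with SciPy. The sketch below is a hedged illustration: it fits a few of the simpler candidate distributions (lognormal, Pareto, exponential) to a vector of variation values and ranks them by AIC; the double Pareto-lognormal itself is not available in scipy.stats and would require a custom likelihood, and the data here are synthetic, not the published knockout data.

        import numpy as np
        from scipy import stats

        def aic(logl, k):
            """Akaike Information Criterion: 2k - 2 log L."""
            return 2 * k - 2 * logl

        def compare_fits(x):
            candidates = {
                "lognormal": stats.lognorm,
                "pareto": stats.pareto,
                "exponential": stats.expon,
            }
            results = {}
            for name, dist in candidates.items():
                params = dist.fit(x)                      # maximum-likelihood fit
                logl = np.sum(dist.logpdf(x, *params))    # log-likelihood at the fitted parameters
                results[name] = aic(logl, len(params))
            return sorted(results.items(), key=lambda kv: kv[1])  # lowest AIC first

        # Illustrative data only: lognormal "phenotypic variation" values.
        x = stats.lognorm.rvs(s=0.8, scale=1.0, size=2000, random_state=0)
        for name, score in compare_fits(x):
            print(f"{name:12s} AIC = {score:.1f}")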

  7. Pareto-optimal alloys

    DEFF Research Database (Denmark)

    Bligaard, Thomas; Johannesson, Gisli Holmar; Ruban, Andrei

    2003-01-01

    Large databases that can be used in the search for new materials with specific properties remain an elusive goal in materials science. The problem is complicated by the fact that the optimal material for a given application is usually a compromise between a number of materials properties and the cost. In this letter we present a database consisting of the lattice parameters, bulk moduli, and heats of formation for over 64 000 ordered metallic alloys, which has been established by direct first-principles density-functional-theory calculations. Furthermore, we use a concept from economic theory, the Pareto-optimal set, to determine optimal alloy solutions for the compromise between low compressibility, high stability, and cost.
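
    The Pareto-optimal-set screening used on the alloy database can be written down in a few lines. A minimal sketch, with invented candidate tuples standing in for (bulk modulus, heat of formation, cost): a candidate is kept if no other candidate is at least as good in every objective and strictly better in at least one. The bulk modulus, which is to be maximized, enters with a minus sign so that every objective is minimized.

        def dominates(a, b):
            """True if a is no worse than b in every objective and strictly better in one (minimization)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_set(points):
            return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

        # Invented alloy candidates: (-bulk modulus, heat of formation, cost), all to be minimized.
        alloys = {
            "A": (-180.0, -0.35, 4.0),
            "B": (-150.0, -0.50, 2.5),
            "C": (-120.0, -0.10, 1.0),
            "D": (-170.0, -0.20, 6.0),
        }
        front = pareto_set(list(alloys.values()))
        print("Pareto-optimal alloys:", [name for name, p in alloys.items() if p in front])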

  8. A New Generalization of the Pareto Distribution and Its Application to Insurance Data

    Directory of Open Access Journals (Sweden)

    Mohamed E. Ghitany

    2018-02-01

    Full Text Available The classical Pareto distribution is one of the most attractive in statistics, particularly in the setting of actuarial statistics and finance. For example, it is widely used when calculating reinsurance premiums. In recent years, many alternative distributions have been proposed to obtain better fits, especially when the tail of the empirical distribution of the data is very long. In this work, an alternative generalization of the Pareto distribution is proposed and its properties are studied. Finally, an application of the proposed model to an earthquake insurance data set is presented.
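
    As an illustration of why the classical Pareto is popular for reinsurance pricing, the sketch below uses its closed-form survival function S(x) = (x_m / x)^alpha to estimate the expected loss in an excess-of-loss layer. The parameter values and layer bounds are invented for illustration and are not taken from the paper.

        def pareto_sf(x, xm, alpha):
            """Survival function of the classical Pareto: P(X > x) = (xm / x)^alpha for x >= xm."""
            return (xm / x) ** alpha if x >= xm else 1.0

        def expected_layer_loss(d, u, xm, alpha):
            """Expected loss in the reinsurance layer [d, u], i.e. E[min(X, u) - min(X, d)],
            obtained by integrating the survival function from d to u (requires alpha > 1, d >= xm)."""
            return xm ** alpha * (d ** (1 - alpha) - u ** (1 - alpha)) / (alpha - 1)

        # Illustrative numbers only: claims in millions, retention 5, cover up to 20.
        xm, alpha = 1.0, 1.8
        print("P(claim > 5) =", round(pareto_sf(5.0, xm, alpha), 4))
        print("expected layer loss [5, 20] =", round(expected_layer_loss(5.0, 20.0, xm, alpha), 4))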

  9. Dynamic Power Dispatch Considering Electric Vehicles and Wind Power Using Decomposition Based Multi-Objective Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Boyang Qu

    2017-12-01

    Full Text Available The intermittency of wind power and the large-scale integration of electric vehicles (EVs) bring new challenges to the reliability and economy of power system dispatching. In this paper, a novel multi-objective dynamic economic emission dispatch (DEED) model is proposed considering EVs and the uncertainties of wind power. The total fuel cost and pollutant emission are considered as the optimization objectives, and the vehicle-to-grid (V2G) power and the conventional generator output power are set as the decision variables. The stochastic wind power is derived from a Weibull probability distribution function. Under the premise of meeting the system energy demand and the users' travel demand, the charging and discharging behavior of the EVs is dynamically managed. Moreover, we propose a two-step dynamic constraint processing strategy for the decision variables based on a penalty function and, on this basis, improve the Multi-Objective Evolutionary Algorithm Based on Decomposition (MOEA/D). The proposed model and approach are verified on a 10-generator system. The results demonstrate that the proposed DEED model and the improved MOEA/D algorithm are effective and reasonable.
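
    The decomposition step of MOEA/D can be sketched as follows: each evenly spread weight vector turns the (cost, emission) pair into a single scalar value, here with the commonly used Tchebycheff aggregation relative to an ideal point, and constraint violations are folded in with a penalty term. The names, numbers and the simple additive penalty below are illustrative assumptions, not the paper's exact two-step strategy.

        import numpy as np

        def weight_vectors(n):
            """n evenly spread weight vectors for a bi-objective problem."""
            w1 = np.linspace(0.0, 1.0, n)
            return np.stack([w1, 1.0 - w1], axis=1)

        def tchebycheff(f, w, z_ideal):
            """Tchebycheff aggregation: max_i w_i * |f_i - z*_i| (to be minimized)."""
            return np.max(w * np.abs(np.asarray(f) - np.asarray(z_ideal)))

        def penalized_scalar(f, violation, w, z_ideal, rho=1e3):
            """Fold a constraint violation (e.g. unmet power balance) into the scalarized value."""
            return tchebycheff(f, w, z_ideal) + rho * violation

        # Illustrative use: two candidate dispatch solutions with (fuel cost, emission) objectives.
        z_ideal = np.array([0.0, 0.0])
        candidates = [((1.00, 0.40), 0.0), ((0.80, 0.70), 0.02)]   # (objectives, violation)
        for w in weight_vectors(5):
            scores = [penalized_scalar(f, v, w, z_ideal) for f, v in candidates]
            print("w =", np.round(w, 2), "-> preferred candidate:", int(np.argmin(scores)))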

  10. Optimization of multi-objective micro-grid based on improved particle swarm optimization algorithm

    Science.gov (United States)

    Zhang, Jian; Gan, Yang

    2018-04-01

    The paper presents a multi-objective optimal configuration model for an independent micro-grid with the aims of economy and environmental protection. The Pareto solution set can be obtained by solving the multi-objective optimization configuration model of the micro-grid with an improved particle swarm algorithm. The feasibility of the improved particle swarm optimization algorithm for the multi-objective model is verified, which provides an important reference for the multi-objective optimization of independent micro-grids.

  11. A new evolutionary algorithm with LVQ learning for the optimization of combinatory problems as a reload of nuclear reactors

    International Nuclear Information System (INIS)

    Machado, Marcelo Dornellas

    1999-04-01

    Genetic algorithms are biologically motivated adaptive systems which have been used, with good results, for function optimization. In this work, a new learning mode for the Population-Based Incremental Learning (PBIL) algorithm, which combines mechanisms of a standard genetic algorithm with simple competitive learning, is developed with the aim of building a new evolutionary algorithm for the optimization of numerical and combinatorial problems. This new learning mode uses a variable learning rate during the optimization process, a scheme known as proportional reward. The new algorithm is developed with a view to its application in the optimization of the reload problem of PWR nuclear reactors. This problem can be interpreted as the search for a loading pattern to be used in the reactor core in order to increase the useful life of the nuclear fuel. For the tests, two classes of problems are used: numerical problems and combinatorial problems, with the major interest resting on the latter class. The results achieved in the tests indicate the applicability of the new learning mode, showing its potential as a development tool for the solution of the reload problem. (author)
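
    The PBIL mechanism described above can be outlined in a few lines: a probability vector is sampled to create binary candidates and is then pulled toward the best candidate, with a learning rate that, in the proportional-reward spirit, grows when the best candidate clearly beats the population average. The toy objective and the exact scaling rule below are illustrative assumptions, not the thesis' reload model.

        import random

        def sample(prob):
            return [1 if random.random() < p else 0 for p in prob]

        def onemax(bits):                 # toy objective standing in for a core-reload evaluation
            return sum(bits)

        def pbil(n_bits=30, pop_size=40, generations=100, base_rate=0.1):
            prob = [0.5] * n_bits
            for _ in range(generations):
                pop = [sample(prob) for _ in range(pop_size)]
                scores = [onemax(ind) for ind in pop]
                best = pop[scores.index(max(scores))]
                avg = sum(scores) / pop_size
                # "proportional reward": learn faster when the best clearly beats the average
                gap = (max(scores) - avg) / n_bits
                rate = min(base_rate * (1.0 + gap), 1.0)
                prob = [(1 - rate) * p + rate * b for p, b in zip(prob, best)]
            return prob

        final = pbil()
        print("learned probabilities (rounded):", [round(p, 2) for p in final])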

  12. Design and selection of load control strategies using a multiple objective model and evolutionary algorithms

    International Nuclear Information System (INIS)

    Gomes, Alvaro; Antunes, Carlos Henggeler; Martins, Antonio Gomes

    2005-01-01

    This paper presents a multiple objective model to evaluate the attractiveness of the use of demand resources (through load management control actions) by different stakeholders and in diverse structure scenarios in electricity systems. For the sake of model flexibility, the multiple (and conflicting) objective functions of technical, economic and quality-of-service nature are able to capture distinct market scenarios and operating entities that may be interested in promoting load management activities. The computation of compromise solutions is done by resorting to evolutionary algorithms, which are well suited to tackle multiobjective problems of a combinatorial nature, herein involving the identification and selection of control actions to be applied to groups of loads. (Author)

  13. Word frequencies: A comparison of Pareto type distributions

    Science.gov (United States)

    Wiegand, Martin; Nadarajah, Saralees; Si, Yuancheng

    2018-03-01

    Mehri and Jamaati (2017) [18] used Zipf's law to model word frequencies in Holy Bible translations for one hundred live languages. We compare the fit of Zipf's law to a number of Pareto type distributions. The latter distributions are shown to provide the best fit, as judged by a number of comparative plots and error measures. The fit of Zipf's law appears generally poor.
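
    A quick way to see how well Zipf's law describes a word-frequency list is to regress log frequency on log rank. The sketch below uses a synthetic frequency vector purely for illustration and reports the fitted exponent and the mean squared error on the log-log scale, the kind of error measure used when comparing against Pareto-type alternatives; it is not the comparison protocol of the paper.

        import numpy as np

        def zipf_fit(freqs):
            """Fit f(r) ~ C * r^(-s) by least squares on the log-log scale; return (s, mse)."""
            freqs = np.sort(np.asarray(freqs, dtype=float))[::-1]
            ranks = np.arange(1, len(freqs) + 1)
            slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
            predicted = np.exp(intercept + slope * np.log(ranks))
            mse = np.mean((np.log(freqs) - np.log(predicted)) ** 2)
            return -slope, mse

        # Synthetic word counts roughly following a power law (illustration only).
        rng = np.random.default_rng(0)
        counts = np.round(1000 / np.arange(1, 201) ** 1.1 + rng.poisson(1, 200)).astype(int) + 1
        s, mse = zipf_fit(counts)
        print(f"fitted Zipf exponent s = {s:.2f}, log-log MSE = {mse:.4f}")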

  14. Multi-objective Optimization of Pulsed Gas Metal Arc Welding Process Using Neuro NSGA-II

    Science.gov (United States)

    Pal, Kamal; Pal, Surjya K.

    2018-05-01

    Weld quality is a critical issue in fabrication industries where products are custom-designed. Multi-objective optimization yields a number of solutions on the Pareto-optimal front. Optimization methods based on mathematical regression models are often found to be inadequate for highly non-linear arc welding processes. Thus, various global evolutionary approaches such as artificial neural networks and genetic algorithms (GA) have been developed. The present work applies the elitist non-dominated sorting GA (NSGA-II) to the optimization of the pulsed gas metal arc welding process using back-propagation neural network (BPNN) based weld quality feature models. The primary objective in maintaining butt joint weld quality is the maximization of tensile strength with minimum plate distortion. The BPNN has been used to compute the fitness of each solution after adequate training, whereas the NSGA-II algorithm generates the optimum solutions for the two conflicting objectives. Welding experiments have been conducted on low carbon steel using response surface methodology. The Pareto-optimal front with three ranked solutions after 20 generations was considered the best, with no further improvement. The joint strength as well as the transverse shrinkage was found to be drastically improved over the design-of-experiments results, as confirmed by the validated Pareto-optimal solutions obtained.
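
    The coupling between the neural surrogate and the genetic search can be sketched as follows, with invented training data, hypothetical parameter names and a plain dominance filter standing in for the full NSGA-II ranking: a regression network is trained on welding experiments, then candidate parameter sets are scored by the network and only the non-dominated ones (strength up, distortion down) are kept.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Invented training data: [current, voltage, speed] -> [tensile strength, distortion].
        X = np.random.default_rng(1).uniform([80, 18, 3], [160, 26, 8], size=(40, 3))
        y = np.column_stack([
            200 + 0.5 * X[:, 0] - 10 * np.abs(X[:, 2] - 5),   # strength (to maximize)
            0.05 * X[:, 1] + 0.02 * X[:, 0] / X[:, 2],        # distortion (to minimize)
        ])

        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0).fit(X, y)

        def dominated(a, b):
            """b dominates a if b has >= strength, <= distortion, and is strictly better in one."""
            return b[0] >= a[0] and b[1] <= a[1] and (b[0] > a[0] or b[1] < a[1])

        candidates = np.random.default_rng(2).uniform([80, 18, 3], [160, 26, 8], size=(200, 3))
        pred = surrogate.predict(candidates)
        front = [i for i, p in enumerate(pred)
                 if not any(dominated(p, q) for j, q in enumerate(pred) if j != i)]
        print(f"{len(front)} non-dominated parameter sets out of {len(candidates)}")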

  15. An efficient algorithm for bi-objective combined heat and power production planning under the emission trading scheme

    International Nuclear Information System (INIS)

    Rong, Aiying; Figueira, José Rui; Lahdelma, Risto

    2014-01-01

    Highlights: • Define fuel mix setting for the bi-objective CHP environmental/economic dispatch. • Develop an efficient algorithm for constructing the Pareto frontier for the problem. • Time complexity analysis is conducted for the proposed algorithm. • The algorithm is theoretically compared against a traditional algorithm. • The efficiency of the algorithm is justified by numerical results. - Abstract: The growing environmental awareness and the apparent conflicts between economic and environmental objectives turn energy planning problems naturally into multi-objective optimization problems. In the current study, mixed fuel combustion is considered as an option to achieve a tradeoff between the economic objective (associated with fuel cost) and the emission objective (measured as CO2 emission cost according to the fuels and the emission allowance price), because a fuel with higher emissions is usually cheaper than one with lower emissions. Combined heat and power (CHP) production is an important high-efficiency technology to promote under the emission trading scheme. In CHP production, the production planning of both commodities must be done in coordination. A long-term planning problem decomposes into thousands of hourly subproblems. In this paper, a bi-objective multi-period linear programming CHP planning model is presented first. Then, an efficient specialized merging algorithm for constructing the exact Pareto frontier (PF) of the problem is presented. The algorithm is theoretically and empirically compared against a modified dichotomic search algorithm. The efficiency and effectiveness of the algorithm are justified by numerical results.
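
    The reference method that the merging algorithm is compared against, dichotomic (weighted-sum) search, is easy to state: solve the single-objective problem for the two extreme weightings, then recursively search between each adjacent pair of solutions with a weight vector orthogonal to the segment joining them. The sketch below assumes a hypothetical solve_weighted(w1, w2) routine returning the (cost, emission) objective vector of an optimal solution of the weighted single-objective problem; the finite candidate set in the usage example merely stands in for the LP solver.

        def dichotomic_search(solve_weighted, tol=1e-6):
            """Return supported Pareto-optimal objective vectors of a bi-objective (min, min) problem."""
            left = solve_weighted(1.0, 0.0)      # best for objective 1
            right = solve_weighted(0.0, 1.0)     # best for objective 2
            frontier = [left, right]

            def recurse(a, b):
                w1, w2 = a[1] - b[1], b[0] - a[0]        # weights orthogonal to segment a-b
                c = solve_weighted(w1, w2)
                # stop if c does not improve on the line through a and b
                if w1 * c[0] + w2 * c[1] >= w1 * a[0] + w2 * a[1] - tol:
                    return
                frontier.append(c)
                recurse(a, c)
                recurse(c, b)

            recurse(left, right)
            return sorted(set(frontier))

        # Toy stand-in for the hourly subproblem: pick the best of a finite candidate set per weighting.
        points = [(1, 9), (2, 6), (4, 4), (7, 2), (9, 1), (6, 6)]
        best = dichotomic_search(lambda w1, w2: min(points, key=lambda p: w1 * p[0] + w2 * p[1]))
        print("supported Pareto frontier:", best)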

  16. Double-layer evolutionary algorithm for distributed optimization of particle detection on the Grid

    International Nuclear Information System (INIS)

    Padée, Adam; Zaremba, Krzysztof; Kurek, Krzysztof

    2013-01-01

    Reconstruction of particle tracks from information collected by position-sensitive detectors is an important procedure in HEP experiments. It is usually controlled by a set of numerical parameters which have to be manually optimized. This paper proposes an automatic approach to this task by utilizing an evolutionary algorithm (EA) operating on both real-valued and binary representations. Because of the computational complexity of the task, a special distributed architecture of the algorithm is proposed, designed to run in a grid environment. It is a two-level hierarchical hybrid utilizing an asynchronous master-slave EA at the cluster level and an island-model EA at the grid level. The technical aspects of using a production grid infrastructure are covered, including the communication protocols on both levels. The paper also deals with the problem of resource heterogeneity, presenting efficiency tests on a benchmark function. These tests confirm that even relatively small islands (clusters) can be beneficial to the optimization process when connected to larger ones. Finally, a real-life usage example is presented: the optimization of track reconstruction in the Large Angle Spectrometer of the NA-58 COMPASS experiment at CERN, using a sample of Monte Carlo simulated data. The overall reconstruction efficiency gain achieved by the proposed method is more than 4% compared to the manually optimized parameters.
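
    The two-level structure described here, islands that evolve independently and occasionally exchange their best individuals, can be sketched in serial form. In the paper the exchange runs over grid middleware; the toy below just loops over island populations and migrates the best solution to the next island in a ring every few generations. The objective function and all parameters are placeholders, not the track-reconstruction setup.

        import random

        def fitness(x):                        # placeholder for the track-reconstruction quality score
            return -sum((xi - 0.5) ** 2 for xi in x)

        def evolve_island(pop, mut=0.1):
            """One generation of a simple (mu + mu) evolution step on a single island."""
            children = [[xi + random.gauss(0, mut) for xi in ind] for ind in pop]
            merged = sorted(pop + children, key=fitness, reverse=True)
            return merged[:len(pop)]

        def island_model(n_islands=4, island_size=10, dim=5, generations=50, migrate_every=10):
            islands = [[[random.random() for _ in range(dim)] for _ in range(island_size)]
                       for _ in range(n_islands)]
            for g in range(1, generations + 1):
                islands = [evolve_island(pop) for pop in islands]
                if g % migrate_every == 0:                          # ring migration of the best individual
                    bests = [max(pop, key=fitness) for pop in islands]
                    for i, pop in enumerate(islands):
                        pop[-1] = list(bests[(i - 1) % n_islands])   # replace the worst individual
            return max((ind for pop in islands for ind in pop), key=fitness)

        best = island_model()
        print("best fitness found:", round(fitness(best), 6))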

  17. Combining Environment-Driven Adaptation and Task-Driven Optimisation in Evolutionary Robotics

    NARCIS (Netherlands)

    Haasdijk, E.W.; Bredeche, Nicolas; Eiben, A.E.

    2014-01-01

    Embodied evolutionary robotics is a sub-field of evolutionary robotics that employs evolutionary algorithms on the robotic hardware itself, during the operational period, i.e., in an on-line fashion. This enables robotic systems that continuously adapt, and are therefore capable of (re-)adjusting

  18. Comparison of parameter estimation algorithms in hydrological modelling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2006-01-01

    Local search methods have been applied successfully in the calibration of simple groundwater models, but might fail in locating the optimum for models of increased complexity, due to the more complex shape of the response surface. Global search algorithms have been demonstrated to perform well... -Marquardt-Levenberg algorithm (implemented in the PEST software), when applied to a steady-state and a transient groundwater model. The results show that PEST can have severe problems in locating the global optimum and can be trapped in local regions of attraction. The global SCE procedure is, in general, more effective... and provides a better coverage of the Pareto optimal solutions at a lower computational cost.

  19. Application of the Pareto chart and Ishikawa diagram for the identification of major defects in metal composite castings

    OpenAIRE

    K. Gawdzińska

    2011-01-01

    The author discusses the use of selected quality management tools, i.e. the Pareto chart and the Ishikawa fishbone diagram, for the description of composite casting defects. The Pareto chart allows determining the priority of defects related to metallic composite castings, while the Ishikawa diagram indicates the causes of defect formation and enables the calculation of defect weights.
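
    The Pareto-chart calculation itself is simple enough to show: sort defect categories by count, accumulate their share of the total, and flag the categories that together account for roughly 80% of all defects. The defect names and counts below are invented for illustration, not taken from the article.

        def pareto_chart(defect_counts, threshold=0.8):
            """Return categories sorted by frequency with cumulative shares, flagging the 'vital few'."""
            total = sum(defect_counts.values())
            cumulative, rows = 0.0, []
            for name, count in sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True):
                cumulative += count / total
                rows.append((name, count, round(cumulative, 3), cumulative <= threshold))
            return rows

        # Invented defect tallies for a batch of metal composite castings.
        defects = {"porosity": 42, "inclusions": 23, "cold shuts": 11, "misruns": 7, "cracks": 4}
        for name, count, cum, vital in pareto_chart(defects):
            print(f"{name:12s} {count:3d}  cumulative {cum:5.1%}  {'<- vital few' if vital else ''}")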

  20. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,