Subbaraj Potti
2011-01-01
Full Text Available Problem statement: A new multi-objective approach, the Strength Pareto Evolutionary Algorithm (SPEA), is presented in this paper to solve the shortest path routing problem. The routing problem is formulated as a multi-objective mathematical programming problem which attempts to minimize both cost and delay objectives simultaneously. Approach: SPEA handles the shortest path routing problem as a true multi-objective optimization problem with competing and noncommensurable objectives. Results: SPEA combines several features of previous multi-objective evolutionary algorithms in a unique manner. SPEA stores nondominated solutions externally in another continuously updated population and uses a hierarchical clustering algorithm to provide the decision maker with a manageable Pareto-optimal set. SPEA is applied to a 20-node network as well as to larger networks ranging from 50 to 200 nodes. Conclusion: The results demonstrate the capability of the proposed approach to generate true and well-distributed Pareto-optimal nondominated solutions.
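The external-archive idea at the heart of SPEA can be sketched compactly. The following is a minimal illustration, not the authors' implementation: a nondominated archive is updated one candidate at a time under the usual minimization convention, and the `(cost, delay)` tuples are hypothetical route objectives.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into an external archive of nondominated points,
    discarding any archive members the candidate dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive  # candidate is dominated; archive unchanged
    kept = [a for a in archive if not dominates(candidate, a)]
    kept.append(candidate)
    return kept

# Example: minimize (cost, delay) for hypothetical candidate routes.
archive = []
for point in [(4, 9), (6, 5), (3, 11), (5, 5), (4, 8)]:
    archive = update_archive(archive, point)
print(sorted(archive))  # -> [(3, 11), (4, 8), (5, 5)]
```

SPEA additionally assigns strength-based fitness and prunes the archive by clustering; this sketch covers only the dominance bookkeeping.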
Gharari, Rahman [Nuclear Science and Technology Research Institute (NSTRI), Tehran (Iran, Islamic Republic of); Poursalehi, Navid; Abbasi, Mohmmadreza; Aghale, Mahdi [Nuclear Engineering Dept, Shahid Beheshti University, Tehran (Iran, Islamic Republic of)
2016-10-15
In this research, for the first time, a new optimization method, i.e., strength Pareto evolutionary algorithm II (SPEA-II), is developed for the burnable poison placement (BPP) optimization of a nuclear reactor core. In the BPP problem, an optimized placement map of fuel assemblies with burnable poison is searched for a given core loading pattern according to defined objectives. In this work, SPEA-II coupled with a nodal expansion code is used for solving the BPP problem of a Kraftwerk Union AG (KWU) pressurized water reactor. Our optimization goal for the BPP is to achieve a greater multiplication factor (K-eff) to possibly gain longer operation cycles along with more flattening of the fuel assembly relative power distribution, considering a safety constraint on the radial power peaking factor. For appraising the proposed methodology, the basic approach, i.e., SPEA, is also developed in order to compare the obtained results. In general, results reveal the acceptable performance and high strength of SPEA, particularly its new version, i.e., SPEA-II, in achieving a semioptimized loading pattern for the BPP optimization of the KWU pressurized water reactor.
Strength Pareto Evolutionary Algorithm using Self-Organizing Data Analysis Techniques
Ionut Balan
2015-03-01
Full Text Available Multiobjective optimization is widely used to solve problems from a variety of areas. To solve such problems, a set of algorithms has been developed, most of them based on evolutionary techniques. One algorithm from this class that gives quite good results is SPEA2, the method on which the algorithm proposed in this paper is based. Results in this paper are obtained by running these two algorithms on a flow-shop problem.
Pareto analysis of evolutionary and learning systems
Yaochu JIN; Robin GRUNA; Bernhard SENDHOFF
2009-01-01
This paper attempts to argue that most adaptive systems, such as evolutionary or learning systems, have inherently multiple objectives to deal with. Very often, there is no single solution that can optimize all the objectives. In this case, the concept of Pareto optimality is key to analyzing these systems. To support this argument, we first present an example that considers the robustness and evolvability trade-off in a redundant genetic representation for simulated evolution. It is well known that robustness is critical for biological evolution, since without a sufficient degree of mutational robustness, it is impossible for evolution to create new functionalities. On the other hand, the genetic representation should also provide the chance to find new phenotypes, i.e., the ability to innovate. This example shows quantitatively that a trade-off between robustness and innovation does exist in the studied redundant representation. Interesting results will also be given to show that new insights into learning problems can be gained when the concept of Pareto optimality is applied to machine learning. In the first example, a Pareto-based multi-objective approach is employed to alleviate catastrophic forgetting in neural network learning. We show that learning new information and memorizing learned knowledge are two conflicting objectives, and that a major part of both can be memorized when the multi-objective learning approach is adopted. In the second example, we demonstrate that a Pareto-based approach can address neural network regularization more elegantly. By analyzing the Pareto-optimal solutions, it is possible to identify interesting solutions on the Pareto front.
Strength Pareto particle swarm optimization and hybrid EA-PSO for multi-objective optimization.
Elhossini, Ahmed; Areibi, Shawki; Dony, Robert
2010-01-01
This paper proposes an efficient particle swarm optimization (PSO) technique that can handle multi-objective optimization problems. It is based on the strength Pareto approach originally used in evolutionary algorithms (EA). The proposed modified particle swarm algorithm is used to build three hybrid EA-PSO algorithms to solve different multi-objective optimization problems. This algorithm and its hybrid forms are tested using seven benchmarks from the literature and the results are compared to the strength Pareto evolutionary algorithm (SPEA2) and a competitive multi-objective PSO using several metrics. The proposed algorithm shows a slower convergence, compared to the other algorithms, but requires less CPU time. Combining PSO and evolutionary algorithms leads to superior hybrid algorithms that outperform SPEA2, the competitive multi-objective PSO (MO-PSO), and the proposed strength Pareto PSO based on different metrics.
PARETO: A novel evolutionary optimization approach to multiobjective IMRT planning.
Fiege, Jason; McCurdy, Boyd; Potrebko, Peter; Champion, Heather; Cull, Andrew
2011-09-01
In radiation therapy treatment planning, the clinical objectives of uniform high dose to the planning target volume (PTV) and low dose to the organs-at-risk (OARs) are invariably in conflict, often requiring compromises to be made between them when selecting the best treatment plan for a particular patient. In this work, the authors introduce Pareto-Aware Radiotherapy Evolutionary Treatment Optimization (pareto), a multiobjective optimization tool to solve for beam angles and fluence patterns in intensity-modulated radiation therapy (IMRT) treatment planning. pareto is built around a powerful multiobjective genetic algorithm (GA), which allows us to treat the problem of IMRT treatment plan optimization as a combined monolithic problem, where all beam fluence and angle parameters are treated equally during the optimization. We have employed a simple parameterized beam fluence representation with a realistic dose calculation approach, incorporating patient scatter effects, to demonstrate feasibility of the proposed approach on two phantoms. The first phantom is a simple cylindrical phantom containing a target surrounded by three OARs, while the second phantom is more complex and represents a paraspinal patient. pareto results in a large database of Pareto nondominated solutions that represent the necessary trade-offs between objectives. The solution quality was examined for several PTV and OAR fitness functions. The combination of a conformity-based PTV fitness function and a dose-volume histogram (DVH) or equivalent uniform dose (EUD) -based fitness function for the OAR produced relatively uniform and conformal PTV doses, with well-spaced beams. A penalty function added to the fitness functions eliminates hotspots. Comparison of resulting DVHs to those from treatment plans developed with a single-objective fluence optimizer (from a commercial treatment planning system) showed good correlation. Results also indicated that pareto shows promise in optimizing the number
Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A. Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah
2017-04-01
This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm, deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, an adaptive inertia weight to control the position exploration of the potential best host nests (solutions), and a dynamic discovery rate to manage the fraction probability of finding the best host nests in a 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto optimum trade-offs are made to generate a set of three non-dominated solutions, which are the locations, excitation amplitudes, and excitation phases of the array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.
An Evolutionary Efficiency Alternative to the Notion of Pareto Efficiency
Staveren, Irene
2012-01-01
textabstractThe paper argues that the notion of Pareto efficiency builds on two normative assumptions: the more general consequentialist norm of any efficiency criterion, and the strong no-harm principle of the prohibition of any redistribution during the economic process that hurts at least one person. These normative concerns lead to a constrained and static notion of efficiency in mainstream economics, ignoring dynamic efficiency gains from more equal allocations of resources. The paper ar...
Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio
2010-05-01
This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It tries to boost two conflicting main objectives of multiclassifiers: a high correct classification rate level and a high classification rate for each class. This last objective is not usually optimized in classification, but is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (extremes in the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.
Deb, Kalyanmoy; Mohan, Manikanth; Mishra, Shikhar
2005-01-01
Since the suggestion of a computing procedure for multiple Pareto-optimal solutions in multi-objective optimization problems in the early Nineties, researchers have been on the lookout for a procedure which is computationally fast and simultaneously capable of finding a well-converged and well-distributed set of solutions. Most multi-objective evolutionary algorithms (MOEAs) developed in the past decade are either good at achieving a well-distributed set of solutions at the expense of a large computational effort, or computationally fast at the expense of achieving a not-so-good distribution of solutions. For example, although the Strength Pareto Evolutionary Algorithm or SPEA (Zitzler and Thiele, 1999) produces a much better distribution compared to the elitist non-dominated sorting GA or NSGA-II (Deb et al., 2002a), the computational time needed to run SPEA is much greater. In this paper, we evaluate a recently proposed steady-state MOEA (Deb et al., 2003) which was developed based on the epsilon-dominance concept introduced earlier (Laumanns et al., 2002) and efficient parent and archive update strategies for achieving a well-distributed and well-converged set of solutions quickly. Based on an extensive comparative study with four other state-of-the-art MOEAs on a number of two-, three-, and four-objective test problems, it is observed that the steady-state MOEA is a good compromise in terms of convergence near to the Pareto-optimal front, diversity of solutions, and computational time. Moreover, the epsilon-MOEA is a step closer towards making MOEAs pragmatic, particularly allowing a decision-maker to control the achievable accuracy in the obtained Pareto-optimal solutions.
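The epsilon-dominance concept mentioned above has a compact formulation. The sketch below is an illustrative additive-epsilon variant, a simplification of the archiving idea rather than the epsilon-MOEA itself; it shows how relaxing dominance by eps bounds the number of retained front points, trading off accuracy against archive size:

```python
def eps_dominates(a, b, eps):
    """Additive epsilon-dominance for minimization: every objective of a
    is within eps of being at least as good as the corresponding one of b."""
    return all(x - eps <= y for x, y in zip(a, b))

def thin_front(front, eps):
    """Greedy epsilon-thinning of a nondominated front: keep a point only
    if it is not already epsilon-dominated by a kept representative.
    Larger eps yields a coarser, smaller approximation of the front."""
    kept = []
    for p in sorted(front):
        if not any(eps_dominates(q, p, eps) for q in kept):
            kept.append(p)
    return kept

front = [(0.0, 1.0), (0.05, 0.95), (0.5, 0.5), (0.55, 0.48), (1.0, 0.0)]
print(thin_front(front, 0.1))  # -> [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
```

The epsilon-MOEA proper maintains one representative per hypergrid box during the run; this post-hoc filter only illustrates the bounding effect.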
Wismans, Luc; Berkum, van Eric; Bliemer, Michiel; Allkim, T.P.; Arem, van B.
2748-01-01
Multi objective optimization of externalities of traffic is performed solving a network design problem in which Dynamic Traffic Management measures are used. The resulting Pareto optimal set is determined by employing the SPEA2+ evolutionary algorithm.
Evolutionary Pareto-optimization of stably folding peptides
Hoffmann Daniel
2008-02-01
Full Text Available Abstract Background As a rule, peptides are more flexible and unstructured than proteins with their substantial stabilizing hydrophobic cores. Nevertheless, a few stably folding peptides have been discovered. This raises the question whether there may be more such peptides that are unknown as yet. These molecules could be helpful in basic research and medicine. Results As a method to explore the space of conformationally stable peptides, we have developed an evolutionary algorithm that allows optimization of sequences with respect to several criteria simultaneously, for instance stability, accessibility of arbitrary parts of the peptide, etc. In a proof-of-concept experiment we have perturbed the sequence of the peptide Villin Headpiece, known to be stable in vitro. Starting from the perturbed sequence we applied our algorithm to optimize peptide stability and accessibility of a loop. Unexpectedly, two clusters of sequences were generated in this way that, according to our criteria, should form structures with higher stability than the wild-type. The structures in one of the clusters possess a fold that markedly differs from the native fold of Villin Headpiece. One of the mutants predicted to be stable was selected for synthesis, its molecular 3D-structure was characterized by nuclear magnetic resonance spectroscopy, and its stability was measured by circular dichroism. Predicted structure and stability were in good agreement with experiment. Eight other sequences and structures, including five with a non-native fold, are provided as bona fide predictions. Conclusion The results suggest that many more conformationally stable peptides may exist than are known so far, and that small fold classes could comprise well-separated sub-folds.
Oliver Chikumbo
2012-01-01
Full Text Available A stand-level, multiobjective evolutionary algorithm (MOEA) for determining a set of efficient thinning regimes satisfying two objectives, that is, value production for sawlog harvesting and volume production for a pulpwood market, was successfully demonstrated for a Eucalyptus fastigata trial in Kaingaroa Forest, New Zealand. The MOEA approximated the set of efficient thinning regimes (with a discontinuous Pareto front) by employing a ranking scheme developed by Fonseca and Fleming (1993), which was a Pareto-based ranking (a.k.a. Multiobjective Genetic Algorithm, MOGA). In this paper we solve the same problem using an improved version of a fitness sharing Pareto ranking algorithm (a.k.a. Nondominated Sorting Genetic Algorithm, NSGA II) originally developed by Srinivas and Deb (1994) and examine the results. Our findings indicate that NSGA II approximates the entire Pareto front whereas MOGA only determines a subdomain of the Pareto points.
Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas
2015-01-01
This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79, and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
Jiang, Shouyong; Yang, Shengxiang
2016-02-01
The multiobjective evolutionary algorithm based on decomposition (MOEA/D) has been shown to be very efficient in solving multiobjective optimization problems (MOPs). In practice, the Pareto-optimal front (POF) of many MOPs has complex characteristics. For example, the POF may have a long tail, a sharp peak, and disconnected regions, which significantly degrades the performance of MOEA/D. This paper proposes an improved MOEA/D for handling this kind of complex problem. In the proposed algorithm, a two-phase strategy (TP) is employed to divide the whole optimization procedure into two phases. Based on the crowdedness of solutions found in the first phase, the algorithm decides whether or not to dedicate computational resources to handling unsolved subproblems in the second phase. Besides, a new niche scheme is introduced into the improved MOEA/D to guide the selection of mating parents to avoid producing duplicate solutions, which is very helpful for maintaining the population diversity when the POF of the MOP being optimized is discontinuous. The performance of the proposed algorithm is investigated on some existing benchmark and newly designed MOPs with complex POF shapes in comparison with several MOEA/D variants and other approaches. The experimental results show that the proposed algorithm produces promising performance on these complex problems.
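MOEA/D's decomposition step can be illustrated with the weighted Tchebycheff scalarization it commonly uses: each weight vector defines one scalar subproblem, and together the subproblems cover different parts of the front. This is a minimal sketch of that scalarization only, not the improved algorithm from the abstract; the ideal point, weights, and candidate objective vectors are toy values.

```python
def tchebycheff(f, weight, z_star):
    """Weighted Tchebycheff scalarization used by MOEA/D:
    g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|  (minimization)."""
    return max(w * abs(fi - zi) for w, fi, zi in zip(weight, f, z_star))

# Each weight vector turns the MOP into a scalar subproblem; different
# weights pull their best candidate toward different front regions.
z_star = (0.0, 0.0)  # ideal point (assumed known for this toy example)
weights = [(0.25, 0.75), (0.5, 0.5), (0.75, 0.25)]
candidates = [(0.2, 0.8), (0.5, 0.5), (0.9, 0.1)]
for w in weights:
    best = min(candidates, key=lambda f: tchebycheff(f, w, z_star))
    print(w, "->", best)
```

Running this, each weight vector selects a different candidate, which is exactly the diversity mechanism that complex POF shapes can defeat when many subproblems map to the same front region.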
The Evolutionary Algorithm to Find Robust Pareto-Optimal Solutions over Time
Meirong Chen
2015-01-01
Full Text Available In dynamic multiobjective optimization problems, the environmental parameters change over time, which makes the true Pareto fronts shift. So far, most research on dynamic multiobjective optimization methods has concentrated on detecting the changed environment and triggering population-based optimization methods so as to track the moving Pareto fronts over time. Yet, in many real-world applications, it is not necessary to find the optimal nondominated solutions in each dynamic environment. To address this weakness, a novel method called robust Pareto-optimal solutions over time is proposed. The idea is to replace the optimal Pareto front at each time-varying moment with a series of robust Pareto-optimal solutions, so that each robust solution can fit more than one time-varying moment. Two metrics, the average survival time and the average robust generational distance, are presented to measure the robustness of the robust Pareto solution set. Another contribution is the construction of an algorithm framework that searches for robust Pareto-optimal solutions over time based on the survival time. Experimental results indicate that this definition is a more practical and time-saving way of addressing dynamic multiobjective optimization problems changing over time.
A Mixed Strategies Pareto Evolutionary Programming
董红斌; 黄厚宽; 何军; 侯薇; 穆成坡
2006-01-01
A multi-objective evolutionary algorithm, Mixed Strategies Pareto Evolutionary Programming (MSPEP), is proposed. Borrowing the individual comparison technique of the Strength Pareto Evolutionary Algorithm II, individuals are ranked by computing the Pareto strength values of their positions, and a mixed-strategies mutation mechanism is used to guide an effective search process. Experimental results on standard test functions verify the generality and effectiveness of the algorithm. The solution set found by the algorithm can quickly approach the Pareto-optimal front.
In optimization problems with at least two conflicting objectives, a set of solutions rather than a unique one exists because of the trade-offs between these objectives. The Pareto optimal set is achieved when no solution can be improved without degrading another one. This study investigated the ap...
2011-01-01
On how the Italian economist Vilfredo Pareto arrived at his famous principle, and on that principle's influence on modern management. According to the Pareto principle, the greater part of our activity does not help us reach results but is a waste of time. Diagram
Masako, I.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.
2013-01-01
In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility
Prediction of Concrete Compressive Strength by Evolutionary Artificial Neural Networks
Mehdi Nikoo
2015-01-01
Full Text Available Compressive strength of concrete has been predicted using evolutionary artificial neural networks (EANNs), a combination of artificial neural networks (ANNs) and evolutionary search procedures such as genetic algorithms (GAs). In this paper, samples of cylindrical concrete parts with different characteristics, comprising 173 experimental data patterns, were used to construct the models. The water-cement ratio, maximum sand size, amount of gravel, cement, 3/4 sand, 3/8 sand, and a soft-sand coefficient were considered as inputs, and the ANN models were used to calculate the compressive strength of the concrete. Moreover, using a GA, the number of layers and nodes and the weights of the ANN models are optimized. In order to evaluate the accuracy of the model, the optimized ANN model is compared with a multiple linear regression (MLR) model. The simulation results verify that the recommended ANN model offers more flexibility, capability, and accuracy in predicting the compressive strength of concrete.
Pareto optimal pairwise sequence alignment.
DeRonne, Kevin W; Karypis, George
2013-01-01
Sequence alignment using evolutionary profiles is a commonly employed tool when investigating a protein. Many profile-profile scoring functions have been developed for use in such alignments, but there has not yet been a comprehensive study of Pareto optimal pairwise alignments for combining multiple such functions. We show that the problem of generating Pareto optimal pairwise alignments has an optimal substructure property, and develop an efficient algorithm for generating Pareto optimal frontiers of pairwise alignments. All possible sets of two, three, and four profile scoring functions are used from a pool of 11 functions and applied to 588 pairs of proteins in the ce_ref data set. The performance of the best objective combinations on ce_ref is also evaluated on an independent set of 913 protein pairs extracted from the BAliBASE RV11 data set. Our dynamic-programming-based heuristic approach produces approximated Pareto optimal frontiers of pairwise alignments that contain comparable alignments to those on the exact frontier, but on average in less than 1/58th the time in the case of four objectives. Our results show that the Pareto frontiers contain alignments whose quality is better than the alignments obtained by single objectives. However, the task of identifying a single high-quality alignment among those in the Pareto frontier remains challenging.
A scalar optimization approach for averaged Hausdorff approximations of the Pareto front
Schütze, Oliver; Domínguez-Medina, Christian; Cruz-Cortés, Nareli; Gerardo de la Fraga, Luis; Sun, Jian-Qiao; Toscano, Gregorio; Landa, Ricardo
2016-09-01
This article presents a novel method to compute averaged Hausdorff (Δp) approximations of the Pareto fronts of multi-objective optimization problems. The underlying idea is to utilize directly the scalar optimization problem that is induced by the Δp performance indicator. This method can be viewed as a certain set-based scalarization approach and can be addressed both by mathematical programming techniques and evolutionary algorithms (EAs). In this work, the focus is on the latter, where a first single-objective EA for such Δp approximations is proposed. Finally, the strength of the novel approach is demonstrated on some bi-objective benchmark problems with different shapes of the Pareto front.
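The averaged Hausdorff indicator combines the generational distance and inverted generational distance of an approximation set X against a reference set Y: Delta_p(X, Y) = max(GD_p(X, Y), IGD_p(X, Y)). A small sketch of that computation follows, assuming Euclidean distance, minimization, and toy point sets; it is an illustration of the indicator, not the article's scalarization method.

```python
def dist(a, b):
    """Euclidean distance between two objective vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def gd_p(X, Y, p):
    """Averaged generational distance: the p-mean of each point's
    distance to its nearest neighbor in the reference set."""
    return (sum(min(dist(x, y) for y in Y) ** p for x in X) / len(X)) ** (1 / p)

def delta_p(X, Y, p=2):
    """Averaged Hausdorff distance: Delta_p = max(GD_p(X,Y), IGD_p(X,Y)),
    where IGD_p is GD_p with the roles of the two sets swapped."""
    return max(gd_p(X, Y, p), gd_p(Y, X, p))

approx = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
ref = [(0.0, 1.0), (0.25, 0.75), (0.5, 0.5), (0.75, 0.25), (1.0, 0.0)]
print(delta_p(approx, ref))  # small but nonzero: the gaps in approx are penalized
```

Because Delta_p averages rather than taking a worst case, a single outlier hurts less than under the classical Hausdorff distance, which is what makes it attractive as an EA target.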
Universal scaling for the dilemma strength in evolutionary games
Wang, Zhen; Kokubo, Satoshi; Jusup, Marko; Tanimoto, Jun
2015-09-01
Why would natural selection favor the prevalence of cooperation within groups of selfish individuals? A fruitful framework to address this question is evolutionary game theory, the essence of which is captured in the so-called social dilemmas. Such dilemmas have sparked the development of a variety of mathematical approaches to assess the conditions under which cooperation evolves. Furthermore, borrowing from statistical physics and network science, research on evolutionary game dynamics has been enriched with phenomena such as pattern formation, equilibrium selection, and self-organization. Numerous advances in understanding the evolution of cooperative behavior over the last few decades have recently been distilled into five reciprocity mechanisms: direct reciprocity, indirect reciprocity, kin selection, group selection, and network reciprocity. However, when social viscosity is introduced into a population via any of the reciprocity mechanisms, the existing scaling parameters for the dilemma strength do not yield a unique answer as to how the evolutionary dynamics should unfold. Motivated by this problem, we review the developments that led to the present state of affairs, highlight the accompanying pitfalls, and propose new universal scaling parameters for the dilemma strength. We prove universality by showing that the conditions for an ESS and the expressions for the internal equilibria in an infinite, well-mixed population subjected to any of the five reciprocity mechanisms depend only on the new scaling parameters. A similar result is shown to hold for the fixation probability of the different strategies in a finite, well-mixed population. Furthermore, by means of numerical simulations, the same scaling parameters are shown to be effective even if the evolution of cooperation is considered on spatial networks (with the exception of highly heterogeneous setups). We close the discussion by suggesting promising directions for future research.
彭星光; 徐德民; 高晓光
2011-01-01
In order to solve dynamic multi-objective optimization problems (DMOPs), a dynamic multi-objective evolutionary algorithm based on Pareto set linkage and prediction (LP-DMOEA) is proposed, and a Pareto set linking method based on hyperboxes is designed. In this scheme, several time sequences that capture the trend of the Pareto solutions can be dynamically maintained. Based on the prediction of these time sequences, the initial population is generated. The LP-DMOEA is applied to the NSGA2 algorithm to solve three benchmark problems. Computational results show the effectiveness of the LP-DMOEA in solving DMOPs.
唐卫东; 关志华; 吴中元
2002-01-01
Most existing multiobjective evolutionary algorithms (MOEAs) are Pareto-based, e.g., NPGA (Niched Pareto Genetic Algorithm) and NSGA (Non-dominated Sorting Genetic Algorithm). In every cycle, these algorithms must rank or compare some or all of the individuals in the population, which is computationally expensive. This paper introduces WSTPEA (Weighted Sum Approach and Tracing Pareto Method), a Pareto-tracing method based on linear weighting with varying weights. Rather than obtaining all possible nondominated solutions at once, the algorithm obtains one nondominated solution per cycle and controls the number of cycles through the number of weight changes, so that the whole population traverses the Pareto curve (surface). A detailed description and flowchart of the algorithm are given, two experimental test problems are computed, and the results are analyzed.
Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar
2013-01-01
Optimizing and controlling complex engineering systems is a problem that has attracted the increasing interest of numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques, such as neural networks, fuzzy logic, game theory, support vector machines, and stochastic algorithms, have been proposed to facilitate the control of engineering systems. In this study, an extended version of the mutable smart bee algorithm (MSBA), called Pareto based mutable smart bee (PBMSB), is proposed to cope with multi-objective problems. Besides, a set of benchmark problems and four well-known Pareto based optimizing algorithms, i.e., the multi-objective bee algorithm (MOBA), multi-objective particle swarm optimization (MOPSO) algorithm, non-dominated sorting genetic algorithm (NSGA-II), and strength Pareto evolutionary algorithm (SPEA 2), are utilized to confirm the acceptable...
Pareto-adaptive epsilon-dominance.
Hernández-Díaz, Alfredo G; Santana-Quintero, Luis V; Coello Coello, Carlos A; Molina, Julián
2007-01-01
Efficiency has become one of the main concerns in evolutionary multiobjective optimization during recent years. One of the possible alternatives to achieve a faster convergence is to use a relaxed form of Pareto dominance that allows us to regulate the granularity of the approximation of the Pareto front that we wish to achieve. One such relaxed form of Pareto dominance that has become popular in the last few years is epsilon-dominance, which has mainly been used as an archiving strategy in some multiobjective evolutionary algorithms. Despite its advantages, epsilon-dominance has some limitations. In this paper, we propose a mechanism that can be seen as a variant of epsilon-dominance, which we call Pareto-adaptive epsilon-dominance (paε-dominance). Our proposed approach tries to overcome the main limitation of epsilon-dominance: the loss of several nondominated solutions from the hypergrid adopted in the archive because of the way in which solutions are selected within each box.
Improving Polyp Detection Algorithms for CT Colonography: Pareto Front Approach.
Huang, Adam; Li, Jiang; Summers, Ronald M; Petrick, Nicholas; Hara, Amy K
2010-03-21
We investigated a Pareto front approach to improving polyp detection algorithms for CT colonography (CTC). A dataset of 56 CTC colon surfaces with 87 proven positive detections of 53 polyps sized 4 to 60 mm was used to evaluate the performance of a one-step and a two-step curvature-based region growing algorithm. The algorithmic performance was statistically evaluated and compared based on the Pareto optimal solutions from 20 experiments by evolutionary algorithms. The false positive rate was lower (p < 0.05), and the Pareto optimization process can effectively help in fine-tuning and redesigning polyp detection algorithms.
Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E
2014-01-01
Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.
A monotonic archive for pareto-coevolution.
de Jong, Edwin D
2007-01-01
Coevolution has already produced promising results, but its dynamic evaluation can lead to a variety of problems that prevent most algorithms from progressing monotonically. An important open question therefore is how progress towards a chosen solution concept can be achieved. A general solution concept for coevolution is obtained by viewing opponents or tests as objectives. In this setup known as Pareto-coevolution, the desired solution is the Pareto-optimal set. We present an archive that guarantees monotonicity for this solution concept. The algorithm is called the Incremental Pareto-Coevolution Archive (IPCA), and is based on Evolutionary Multi-Objective Optimization (EMOO). By virtue of its monotonicity, IPCA avoids regress even when combined with a highly explorative generator. This capacity is demonstrated on a challenging test problem requiring both exploration and reliability. IPCA maintains a highly specific selection of tests, but the size of the test archive nonetheless grows unboundedly. We therefore furthermore investigate how archive sizes may be limited while still providing approximate reliability. The LAyered Pareto-Coevolution Archive (LAPCA) maintains a limited number of layers of candidate solutions and tests, and thereby permits a trade-off between archive size and reliability. The algorithm is compared in experiments, and found to be more efficient than IPCA. The work demonstrates how the approximation of a monotonic algorithm can lead to algorithms that are sufficiently reliable in practice while offering better efficiency.
Active learning of Pareto fronts.
Campigotto, Paolo; Passerini, Andrea; Battiti, Roberto
2014-03-01
This paper introduces the active learning of Pareto fronts (ALP) algorithm, a novel approach to recover the Pareto front of a multiobjective optimization problem. ALP casts the identification of the Pareto front into a supervised machine learning task. This approach enables an analytical model of the Pareto front to be built. The computational effort in generating the supervised information is reduced by an active learning strategy. In particular, the model is learned from a set of informative training objective vectors. The training objective vectors are approximated Pareto-optimal vectors obtained by solving different scalarized problem instances. The experimental results show that ALP achieves an accurate Pareto front approximation with a lower computational effort than state-of-the-art estimation of distribution algorithms and widely known genetic techniques.
GENERALIZED DOUBLE PARETO SHRINKAGE.
Armagan, Artin; Dunson, David B; Lee, Jaeyong
2013-01-01
We propose a generalized double Pareto prior for Bayesian shrinkage estimation and inferences in linear models. The prior can be obtained via a scale mixture of Laplace or normal distributions, forming a bridge between the Laplace and Normal-Jeffreys' priors. While it has a spike at zero like the Laplace density, it also has a Student's t-like tail behavior. Bayesian computation is straightforward via a simple Gibbs sampling algorithm. We investigate the properties of the maximum a posteriori estimator, as sparse estimation plays an important role in many problems, reveal connections with some well-established regularization procedures, and show some asymptotic results. The performance of the prior is tested through simulations and an application.
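A minimal NumPy sketch of drawing from such a prior via the normal scale-mixture representation commonly cited for the generalized double Pareto (the exact hierarchy below, and the function name and parameterization, are assumptions, not taken from this abstract):

```python
import numpy as np

def sample_gdp(n, alpha=1.0, eta=1.0, rng=None):
    """Draw n samples from a generalized double Pareto prior via its
    (commonly cited) normal scale-mixture representation:
        lambda ~ Gamma(shape=alpha, rate=eta)
        tau | lambda ~ Exponential(rate=lambda**2 / 2)
        beta | tau ~ Normal(0, tau)
    """
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.gamma(shape=alpha, scale=1.0 / eta, size=n)  # Gamma with rate eta
    tau = rng.exponential(scale=2.0 / lam**2)              # Exp with rate lam^2/2
    return rng.normal(0.0, np.sqrt(tau))                   # normal scale mixture
```

The resulting draws are symmetric about zero with a Laplace-like spike at the origin and Student-t-like tails, matching the behavior the abstract describes.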
江思珉; 朱国荣; 孙振波; 徐强
2011-01-01
Groundwater simulation-optimization problems are often constrained optimization problems. In this paper, a modified real-coded genetic algorithm is introduced to handle these problems by combining a minimal generation gap model with the Pareto strength index procedure. The penalty-function approach, which combines the objective function and the degree of constraint violation into a single weighted scalar, is replaced by a vector combination, transforming the original constrained optimization problem into a two-objective optimization problem. The method is verified as feasible through application to a benchmark groundwater optimization problem and comparison with other optimization methods.
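The transformation described above, replacing a penalty-weighted scalar with a vector of (objective, constraint violation), can be sketched as follows; the function names are illustrative only:

```python
def to_biobjective(f, constraints):
    """Wrap a constrained minimization problem as two objectives:
    (objective value, total constraint violation), where each constraint
    g(x) <= 0 contributes max(0, g(x)) to the violation."""
    def evaluate(x):
        violation = sum(max(0.0, g(x)) for g in constraints)
        return (f(x), violation)
    return evaluate
```

A Pareto-strength ranking over these pairs then drives selection, favouring feasible points (violation 0) with low objective values without any penalty weights to tune.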
Yue, Lei; Guan, Zailin; Saif, Ullah; Zhang, Fei; Wang, Hao
2016-01-01
Group scheduling is significant for efficient and cost-effective production systems. However, setup times exist between groups, which need to be reduced by sequencing the groups efficiently. The current research focuses on a sequence-dependent group scheduling problem with the aim of minimizing the makespan and the total weighted tardiness simultaneously. In most production scheduling problems, the processing time of jobs is assumed to be fixed. However, the actual processing time of jobs may be reduced due to the "learning effect". The integration of sequence-dependent group scheduling with learning effects has rarely been considered in the literature. Therefore, this research considers a single-machine group scheduling problem with sequence-dependent setup times and learning effects simultaneously. A novel hybrid Pareto artificial bee colony algorithm (HPABC), incorporating some steps of the genetic algorithm, is proposed for this problem to obtain Pareto solutions. Furthermore, five different sizes of test problems (small, small-medium, medium, large-medium, large) are tested using the proposed HPABC. The Taguchi method is used to tune the effective parameters of the proposed HPABC for each problem category. The performance of HPABC is compared with three well-known multi-objective optimization algorithms: the improved strength Pareto evolutionary algorithm (SPEA2), the non-dominated sorting genetic algorithm II (NSGAII) and particle swarm optimization (PSO). Results indicate that HPABC outperforms SPEA2, NSGAII and PSO and gives better Pareto optimal solutions in terms of diversity and quality for almost all instances of the different problem sizes.
Evolutionary games in a generalized Moran process with arbitrary selection strength and mutation
Quan Jia; Wang Xian-Jia
2011-01-01
By using a generalized fitness-dependent Moran process, an evolutionary model for symmetric 2×2 games in a well-mixed population of finite size is investigated. In the model, the payoff individuals accumulate from games is mapped into fitness using an exponential function. Both the selection strength β and the mutation rate ε are considered. The process is an ergodic birth-death process. Based on the limit distribution of the process, we give analytical results for which strategy will be favoured when ε is small enough. The results depend not only on the payoff matrix of the game, but also on the population size. In particular, we prove that natural selection favours the risk-dominant strategy when the population size is large enough. For arbitrary β and ε values, the 'Hawk-Dove' game and the 'Coordinate' game are used to illustrate our model. We give the evolutionary stable strategy (ESS) of the games and compare the results with those of the replicator dynamics in an infinite population. The results are verified by simulation experiments.
Signal Detection for Pareto Renewal Processes.
1982-10-01
The Pareto distribution itself was, of course, introduced by Vilfredo Pareto (1848-1923) in his Cours d'Economie Politique (Lausanne and Paris: Rouge and Cie, 1897).
C. B. Bell, Series in Statistics and Biostatistics: Signal Detection for Pareto Renewal Processes.
Record Values of a Pareto Distribution.
Ahsanullah, M.
The record values of the Pareto distribution, labelled Pareto (II) (alpha, beta, nu), are reviewed. The best linear unbiased estimates of the parameters in terms of the record values are provided. The prediction of the sth record value based on the first m (s>m) record values is obtained. A classical Pareto distribution provides reasonably good…
Are your data really Pareto distributed?
Cirillo, Pasquale
2013-01-01
Pareto distributions, and power laws in general, have proven to be very useful models for describing very different phenomena, from physics to finance. In recent years, the econophysical literature has proposed a large number of papers and models justifying the presence of power laws in economic data. Most of the time, this Paretianity is inferred from the observation of some plots, such as the Zipf plot and the mean excess plot. If the Zipf plot looks almost linear, then everything is taken to be fine and the parameters of the Pareto distribution are estimated, often with OLS. Unfortunately, as we show in this paper, these heuristic graphical tools are not reliable. To be more exact, we show that only a combination of plots can give some degree of confidence about the real presence of Paretianity in the data. We start by reviewing some of the most important plots, discussing their strengths and weaknesses, and then we propose some additional tools that can be used to refine the analysis.
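One of the diagnostics this entry mentions, the mean excess plot, is easy to compute empirically; for an exact Pareto tail with index alpha, the mean excess e(u) = E[X - u | X > u] is linear in the threshold, e(u) = u/(alpha - 1). A minimal NumPy sketch:

```python
import numpy as np

def mean_excess(data, thresholds):
    """Empirical mean excess e(u) = mean(x - u over all x > u) per threshold."""
    data = np.asarray(data, dtype=float)
    return np.array([(data[data > u] - u).mean() for u in thresholds])
```

For a sample simulated from a Pareto with tail index 3 (inverse transform: x = u**(-1/3) for uniform u), the empirical mean excess at thresholds 1.5, 2 and 3 tracks the theoretical u/2, i.e. grows roughly linearly; for a lognormal sample the curve bends instead, which is exactly the distinction the plot is used to draw.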
Pareto law and Pareto index in the income distribution of Japanese companies
Atushi Ishikawa
2004-01-01
In order to study the phenomenon in detail that income distribution follows Pareto law, we analyze the database of high income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index. The larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain that the Pareto index of company income distribution hardly changes, while the Pareto index of personal income distribution chang...
Pareto-Geometric Distribution.
姚惠; 戴勇; 谢林
2012-01-01
In this paper, we introduce a new lifetime distribution with decreasing failure rate: the two-parameter Pareto-Geometric distribution, obtained by compounding a Pareto and a geometric distribution. Various properties are studied, and the existence and uniqueness of the MLEs of the parameters are discussed; the MLEs are obtained by the EM algorithm, and their asymptotic variances and covariances are derived.
Jiahuan Wu; Jianlin Wang; Tao Yu; Liqiang Zhao
2014-01-01
Approaches to the discrete approximation of the Pareto front using multi-objective evolutionary algorithms suffer from a heavy computational burden, long running times and missing Pareto optimal points. In order to overcome these problems, an approach to the continuous approximation of the Pareto front using geometric support vector regression is presented. The regression model of a small-size approximate discrete Pareto front is constructed by geometric support vector regression modeling and is described as the approximate continuous Pareto front. In the process of geometric support vector regression modeling, considering the distribution characteristics of Pareto optimal points, separable augmented training sample sets are constructed by shifting the original training sample points along multiple coordinate axes. Besides, an interactive decision-making (DM) procedure, in which the continuous approximation of the Pareto front and decision-making are performed interactively, is designed to improve the accuracy of the preferred Pareto optimal point. The correctness of the continuous approximation of the Pareto front is demonstrated on a typical multi-objective optimization problem. In addition, combined with the interactive decision-making procedure, the continuous approximation of the Pareto front is applied in the multi-objective optimization of an industrial fed-batch yeast fermentation process. The experimental results show that the generated approximate continuous Pareto front has good accuracy and completeness. Compared with a multi-objective evolutionary algorithm with a large population, a more accurate preferred Pareto optimal point can be obtained from the approximate continuous Pareto front with less computation and shorter running time. The operation strategy corresponding to the final preferred Pareto optimal point generated by the interactive DM procedure can effectively improve the production indexes of the fermentation process.
A genetic algorithm for the pareto optimal solution set of multi-objective shortest path problem
HU Shi-cheng; XU Xiao-fei; ZHAN De-chen
2005-01-01
Unlike the shortest path problem, which has only one optimal solution and can be solved in polynomial time, the multi-objective shortest path problem (MSPP) has a set of Pareto optimal solutions and cannot be solved in polynomial time. Existing algorithms focus mainly on how to obtain a precisely Pareto optimal solution for the MSPP, and consequently take a long time to obtain multiple Pareto optimal solutions. In order to obtain a set of satisfactory solutions for the MSPP in reasonable time to meet the demands of a decision maker, a genetic algorithm, MSPP-GA, is presented in this paper to solve the MSPP with typically competing objectives, cost and time. The encoding of the solution and operators such as crossover, mutation and selection are developed. The algorithm introduces a Pareto domination tournament and a sharing-based selection operator, which can not only directly search the Pareto optimal frontier but also maintain the diversity of populations in the process of evolutionary computation. Experimental results show that MSPP-GA can obtain most efficient solutions distributed all along the Pareto frontier in less time than an exact algorithm. The algorithm proposed in this paper provides a new and effective method for obtaining the set of Pareto optimal solutions for other multi-objective optimization problems in a short time.
BASES COMPONENTS OF PARETO EFFICIENCY
Daniela POPESCU
2011-01-01
This study discusses the problem of grounding decisions, which are particularly complex and topical, based on a large volume of information and requiring a substantial amount of work. From our investigations, we conclude that some of these inconveniences can be avoided by also using other concepts that apply to this kind of information. To this end, the study examines the manner in which decisions are grounded, devoting particular attention to the concept of Pareto efficiency, which ultimately has two fundamental elements, final benefit and opportunity cost, both used in the decision-making process. We therefore present an extensive analysis of the concept of Pareto efficiency, with the main emphasis on the quantitative evaluation of the elements that characterize it, and we deepen the analysis of the decision-making process. The close link between different economic concepts and their great practical usefulness is underlined.
Pareto optimization in algebraic dynamic programming.
Saule, Cédric; Giegerich, Robert
2015-01-01
Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization has so far been used in only a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, their Pareto product correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
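At the core of any such Pareto product lies the extraction of the Pareto front from a set of score pairs. For two objectives (both minimized), a sort-and-sweep suffices; this is a generic sketch, not the paper's operator implementation:

```python
def pareto_front(points):
    """Pareto front for two minimized objectives: sort by the first objective,
    then sweep, keeping each point that strictly improves the best second
    objective seen so far."""
    best = float("inf")
    front = []
    for p in sorted(points):
        if p[1] < best:
            front.append(p)
            best = p[1]
    return front
```

Sorting makes the sweep O(n log n), which is one reason merging Pareto-front lists inside dynamic programming recurrences remains affordable in practice.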
A family of bivariate Pareto distributions
P. G. Sankaran
2014-06-01
Pareto distributions have been extensively used in the literature for modelling and analysis of income and lifetime data. In the present paper, we introduce a family of bivariate Pareto distributions using a generalized version of the dullness property. Some important bivariate Pareto distributions are derived as special cases. Distributional properties of the family are studied, and its dependency structure is investigated. Finally, the family of distributions is applied to two real-life data situations.
Wang, Zhen; Kokubo, Satoshi; Jusup, Marko; Tanimoto, Jun
2015-09-01
While comprehensive reviews of the literature, by gathering in one place most of the relevant information, undoubtedly steer the development of every scientific field, we found that the comments in response to a review article can be as informative as the review itself, if not more. Namely, reading through the comments on the ideas expressed in Ref. [1], we could identify a number of pressing problems for evolutionary game theory, indicating just how much space there still is for major advances and breakthroughs. In an attempt to bring a sense of order to a multitude of opinions, we roughly classified the comments into three categories, i.e. those concerned with: (i) the universality of scaling in heterogeneous topologies, including empirical dynamic networks [2-8], (ii) the universality of scaling for more general game setups, such as the inclusion of multiple strategies and external features [4,9-11], and (iii) experimental confirmations of the theoretical developments [2,12,13].
Integrating Pareto Optimization into Dynamic Programming
Thomas Gatter
2016-01-01
Pareto optimization combines independent objectives by computing the Pareto front of the search space, yielding a set of optima where none scores better on all objectives than any other. Recently, it was shown that Pareto optimization seamlessly integrates with algebraic dynamic programming: when scoring schemes A and B can correctly evaluate the search space via dynamic programming, then so can Pareto optimization with respect to A and B. However, the integration of Pareto optimization into dynamic programming opens a wide range of algorithmic alternatives, which we study in substantial detail in this article, using real-world applications in biosequence analysis, a field where dynamic programming is ubiquitous. Our results are two-fold: (1) We introduce the operation of a “Pareto algebra product” in the dynamic programming framework of Bellman’s GAP. Users of this framework can now ask for Pareto optimization with a single keystroke. Careful evaluation of the implementation alternatives by means of an extended Bellman’s GAP compiler demonstrates the dependence of the best implementation choice on the application at hand. (2) We extract from our experiments several pieces of advice to programmers who do not use a system such as Bellman’s GAP, but who choose to hand-craft their dynamic programming recurrences, incorporating Pareto optimization from scratch.
Pareto law and Pareto index in the income distribution of Japanese companies
Ishikawa, Atushi
2005-04-01
In order to study the phenomenon in detail that income distribution follows Pareto law, we analyze the database of high income companies in Japan. We find a quantitative relation between the average capital of the companies and the Pareto index. The larger the average capital becomes, the smaller the Pareto index becomes. From this relation, we can possibly explain that the Pareto index of company income distribution hardly changes, while the Pareto index of personal income distribution changes sharply, from a viewpoint of capital (or means). We also find a quantitative relation between the lower bound of capital and the typical scale at which Pareto law breaks. The larger the lower bound of capital becomes, the larger the typical scale becomes. From this result, the reason there is a (no) typical scale at which Pareto law breaks in the income distribution can be understood through (no) constraint, such as the lower bound of capital or means of companies, in the financial system.
Praveen Kumar Shukla
2012-07-01
Interpretability and accuracy are two important features of fuzzy systems which are conflicting in nature; one can be improved only at the cost of the other, a situation known as the “interpretability-accuracy trade-off”. To deal with this trade-off, Multi-Objective Evolutionary Algorithms (MOEAs) are frequently applied in the design of fuzzy systems. Several novel MOEAs have been proposed for this purpose, most notably the Non-Dominated Sorting Genetic Algorithm (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA2), Fuzzy Genetics-Based Machine Learning (FGBML), the (2 + 2) Pareto Archived Evolutionary Strategy ((2 + 2) PAES), the (2 + 2) Memetic Pareto Archived Evolutionary Strategy ((2 + 2) M-PAES), etc. This paper introduces and reviews approaches to developing fuzzy systems using Evolutionary Multi-Objective Optimization (EMO) algorithms in view of the interpretability-accuracy trade-off, focusing mainly on work from the last decade. Different research issues and challenges are also discussed.
Pareto versus lognormal: a maximum entropy test.
Bee, Marco; Riccaboni, Massimo; Schiavo, Stefano
2011-08-01
It is commonly found that distributions that seem to be lognormal over a broad range change to a power-law (Pareto) distribution for the last few percentiles. The distributions of many physical, natural, and social events (earthquake size, species abundance, income and wealth, as well as file, city, and firm sizes) display this structure. We present a test for the occurrence of power-law tails in statistical distributions based on maximum entropy. This methodology allows one to identify the true data-generating processes even in the case when it is neither lognormal nor Pareto. The maximum entropy approach is then compared with other widely used methods and applied to different levels of aggregation of complex systems. Our results provide support for the theory that distributions with lognormal body and Pareto tail can be generated as mixtures of lognormally distributed units.
2016-12-21
The JMP Add-In TopN-PFS provides an automated tool for finding layered Pareto fronts to identify the top N solutions from an enumerated list of candidates subject to optimizing multiple criteria. The approach constructs the N layers of Pareto fronts and then provides a suite of graphical tools to explore the alternatives based on different prioritizations of the criteria. The tool is designed to provide a set of alternatives from which the decision-maker can select the best option for their study goals.
This study explored the application of a multi-objective evolutionary algorithm (MOEA) and Pareto ordering in the multiple-objective automatic calibration of the Soil and Water Assessment Tool (SWAT). SWAT was calibrated in the Calapooia watershed, Oregon, USA, with two different pairs of objective ...
Pareto tails and lognormal body of US cities size distribution
Luckstead, Jeff; Devadoss, Stephen
2017-01-01
We consider a distribution consisting of a lower tail Pareto, a lognormal body, and an upper tail Pareto to estimate the size distribution of all US cities. This distribution fits the data more accurately than a distribution that comprises only a lognormal body and an upper tail Pareto.
How Well Do We Know Pareto Optimality?
Mathur, Vijay K.
1991-01-01
Identifies sources of ambiguity in economics textbooks' discussion of the condition for efficient output mix. Points out that diverse statements without accompanying explanations create confusion among students. Argues that conflicting views concerning the concept of Pareto optimality as one source of ambiguity. Suggests clarifying additions to…
Pareto optimality in organelle energy metabolism analysis.
Angione, Claudio; Carapezza, Giovanni; Costanza, Jole; Lió, Pietro; Nicosia, Giuseppe
2013-01-01
In low and high eukaryotes, energy is collected or transformed in compartments, the organelles. The rich variety of size, characteristics, and density of the organelles makes it difficult to build a general picture. In this paper, we make use of the Pareto-front analysis to investigate the optimization of energy metabolism in mitochondria and chloroplasts. Using the Pareto optimality principle, we compare models of organelle metabolism on the basis of single- and multiobjective optimization, approximation techniques (the Bayesian Automatic Relevance Determination), robustness, and pathway sensitivity analysis. Finally, we report the first analysis of the metabolic model for the hydrogenosome of Trichomonas vaginalis, which is found in several protozoan parasites. Our analysis has shown the importance of the Pareto optimality for such comparison and for insights into the evolution of the metabolism from cytoplasmic to organelle bound, involving a model order reduction. We report that Pareto fronts represent an asymptotic analysis useful to describe the metabolism of an organism aimed at maximizing concurrently two or more metabolite concentrations.
Multiclass gene selection using Pareto-fronts.
Rajapakse, Jagath C; Mundra, Piyushkumar A
2013-01-01
Filter methods are often used for selection of genes in multiclass sample classification by using microarray data. Such techniques usually tend to bias toward a few classes that are easily distinguishable from other classes due to imbalances of strong features and sample sizes of different classes. It could therefore lead to selection of redundant genes while missing the relevant genes, leading to poor classification of tissue samples. In this manuscript, we propose to decompose multiclass ranking statistics into class-specific statistics and then use Pareto-front analysis for selection of genes. This alleviates the bias induced by class intrinsic characteristics of dominating classes. The use of Pareto-front analysis is demonstrated on two filter criteria commonly used for gene selection: F-score and KW-score. A significant improvement in classification performance and reduction in redundancy among top-ranked genes were achieved in experiments with both synthetic and real-benchmark data sets.
Pareto front estimation for decision making.
Giagkiozis, Ioannis; Fleming, Peter J
2014-01-01
The set of available multi-objective optimisation algorithms continues to grow. This fact can be partially attributed to their widespread use and applicability. However, this increase also suggests several issues remain to be addressed satisfactorily. One such issue is the diversity and the number of solutions available to the decision maker (DM). Even for algorithms very well suited to a particular problem, it is difficult, mainly due to the computational cost, to use a population large enough to ensure the likelihood of obtaining a solution close to the DM's preferences. In this paper we present a novel methodology that produces additional Pareto optimal solutions from a Pareto optimal set obtained at the end of a run of any multi-objective optimisation algorithm, for two-objective and three-objective problem instances.
The Generalized Pareto process; with application
Ferreira, Ana
2012-01-01
In extreme value statistics the peaks-over-threshold method is widely used. The method is based on the Generalized Pareto distribution (Balkema and de Haan (1974) and Pickands (1975) in univariate theory; e.g. Falk, Hüsler and Reiss (2010) and Rootzén and Tajvidi (2006) in multivariate theory) characterizing probabilities of exceedances over high thresholds. We present a generalization of this concept in the space of continuous functions, which we call the Generalized Pareto process. Unlike earlier papers, our definition is not based on a distribution function but on functional properties. As an application, we use the theory to produce wind fields connected to disastrous storms on the basis of observed extreme but not disastrous storms.
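As a small numerical companion to the peaks-over-threshold idea (and unrelated to this paper's functional generalization), the tail index governing exceedance behaviour can be estimated from the largest order statistics with the classical Hill estimator; a NumPy-only sketch:

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the extreme-value (tail) index gamma: the mean of
    log-ratios of the k largest observations to the (k+1)-th largest."""
    x = np.sort(np.asarray(data, dtype=float))[::-1]  # descending order statistics
    return float(np.mean(np.log(x[:k] / x[k])))
```

For an exact Pareto sample with tail exponent alpha, the estimator concentrates around gamma = 1/alpha; the choice of k trades bias against variance.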
The Gompertz-Pareto income distribution
Chami Figueira, F.; Moura, N. J.; Ribeiro, M. B.
2011-02-01
This work analyzes the Gompertz-Pareto distribution (GPD) of personal income, formed by the combination of the Gompertz curve, representing the overwhelming majority of the economically less favorable part of the population of a country, and the Pareto power law, which describes its tiny richest part. Equations for the Lorenz curve, Gini coefficient and the percentage share of the Gompertzian part relative to the total income are all written in this distribution. We show that only three parameters, determined by linear data fitting, are required for its complete characterization. Consistency checks are carried out using income data of Brazil from 1981 to 2007 and they lead to the conclusion that the GPD is consistent and provides a coherent and simple analytical tool to describe personal income distribution data.
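The Gini coefficient mentioned above can be computed directly from an income sample with the standard order-statistics formula G = 2*sum(i*x_i)/(n*sum(x_i)) - (n+1)/n, with incomes sorted ascending; a minimal sketch:

```python
import numpy as np

def gini(incomes):
    """Gini coefficient of a sample via the sorted (order-statistics) formula."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)  # ranks 1..n of the ascending incomes
    return float(2.0 * np.sum(i * x) / (n * np.sum(x)) - (n + 1.0) / n)
```

Perfect equality gives 0; concentrating all income on one of n individuals gives (n-1)/n, approaching 1 for large n.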
Automatic colonic polyp detection using multi-objective evolutionary techniques
Li, Jiang; Huang, Adam; Yao, Jianhua; Bitter, Ingmar; Petrick, Nicholas; Summers, Ronald M.; Pickhardt, Perry J.; Choi, J. Richard
2006-03-01
Colonic polyps appear like elliptical protrusions on the inner wall of the colon. Curvature-based features for colonic polyp detection have proved to be successful in several computer-aided diagnostic CT colonography (CTC) systems. Simple thresholds are set on those features to create initial polyp candidates; sophisticated classification schemes are then applied to these candidates to reduce false positives. There are two objective functions, the number of missed polyps and the false positive rate, that need to be minimized when setting those thresholds. These two objectives conflict, and it is usually difficult to optimize both by a gradient search. In this paper, we utilized a multiobjective evolutionary method, the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize those thresholds. SPEA2 incorporates the concept of Pareto dominance and applies genetic techniques to evolve individual solutions toward the Pareto front. The SPEA2 algorithm was applied to colon CT images from 27 patients, each having a prone and a supine scan. There are 40 colonoscopically confirmed polyps resulting in 72 positive detections in CTC reading. The results obtained by SPEA2 were compared with those obtained by our old system, where an appropriate value was set for each of those thresholds by a histogram examination method. Keeping the sensitivity the same as that of our old system, the SPEA2 algorithm reduced the false positive rate by 76.4%, from an average of 55.6 to 13.3 false positives per data set. Keeping the false positive rate the same for both systems, SPEA2 increased the sensitivity by 13.1%, from 53 to 61 among 72 ground truth detections.
Pareto vs Simmel: residui ed emozioni
Silvia Fornari
2017-08-01
Full Text Available One hundred years after the publication of the Trattato di sociologia generale (Pareto 1988), we keep the study of Pareto alive and current through a contemporary re-reading of his thought. Remembered by economists for his great intellectual versatility, he remains a rigorous and analytical scientist whose contributions are still discussed internationally. We analyze the aspects that led him toward the sociological approach, with the introduction of his well-known distinction of social action: logical and non-logical. This dichotomy was used to account for social changes concerning the modes of action of men and women. As is well known, logical actions are behaviors driven by logic and reason, in which there is a direct cause-effect relation; such actions are the object of study of economists, and sociologists do not deal with them. Non-logical actions cover all the types of human action that fall within the scope of the social sciences, and they represent the largest part of social action. They are actions guided by sentiments, emotionality, superstition, etc., illustrated by Pareto in the Trattato di sociologia generale and in later essays, where he also takes up the concept of the heterogenesis of ends, first formulated by Giambattista Vico. According to this concept, human history, while retaining in potential the realization of certain ends, is not linear, and along its evolutionary path it may happen that man, in trying to reach one goal, arrives at opposite conclusions. Pareto connects the Neapolitan philosopher's definition to the typologies of social action and to their distinction (logical, non-logical). For Pareto, the heterogenesis of ends is thus the outcome of a particular type of non-logical action of human beings and of the collectivity.
Power laws, Pareto distributions and Zipf's law
Newman, M E J
2004-01-01
When the probability of measuring a particular value of some quantity varies inversely as a power of that value, the quantity is said to follow a power law, also known variously as Zipf's law or the Pareto distribution. Power laws appear widely in physics, biology, earth and planetary sciences, economics and finance, computer science, demography and the social sciences. For instance, the distributions of the sizes of cities, earthquakes, solar flares, moon craters, wars and people's personal ...
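The power-law behavior surveyed above can be made concrete with two standard identities: inverse-transform sampling of a Pareto variate and the maximum-likelihood (Hill-type) estimate of the exponent. This is a generic sketch, not specific to the review:

```python
import math
import random

def pareto_sample(alpha, xm, n, seed=1):
    # Inverse-transform sampling: if U ~ Uniform(0,1], then xm * U**(-1/alpha)
    # follows a Pareto distribution with exponent alpha and lower cutoff xm
    rng = random.Random(seed)
    return [xm * (1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def hill_estimator(values, xm):
    # Maximum-likelihood estimate of alpha for data with known lower cutoff xm
    return len(values) / sum(math.log(x / xm) for x in values)

data = pareto_sample(alpha=2.5, xm=1.0, n=100_000)
est = hill_estimator(data, xm=1.0)
print(round(est, 2))  # close to the true exponent 2.5
```

The estimator's standard error scales as alpha/sqrt(n), so with 100,000 samples the recovered exponent sits very close to the generating value.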
The Pareto/NBD Model Extensions
马少辉
2008-01-01
The Pareto/NBD model, proposed by Schmittlein et al. (1987), describes customers' repeat-purchase behavior in non-contractual relationship settings. It is regarded as the benchmark model in customer-base analysis, but its underlying probability-distribution assumptions are overly strict and in some respects inconsistent with common sense. To address these problems, a Markov chain Monte Carlo method is used to obtain posterior samples of customers' purchase rates and dropout rates directly, which makes it possible to relax the Pareto/NBD model's assumptions. By relaxing these assumptions, three extended models are designed and the corresponding formulas are derived. Comparing the predictive performance of the Pareto/NBD model and its extensions on a real data set, an extended model with markedly better predictive ability than the Pareto/NBD model is identified.
RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.
Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A
2013-12-01
Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.
Characterizations of a family of bivariate Pareto distributions
P. G. Sankaran
2015-09-01
Full Text Available In the present paper, we study properties of a family of bivariate Pareto distributions. The well known dullness property of the univariate Pareto model is extended to the bivariate setup. Two measures of income inequality viz. income gap ratio and mean left proportional residual income are defined in the bivariate case. We also introduce bivariate generalized failure rate useful in reliability analysis. Characterizations, using the above concepts, for various members of the family of bivariate Pareto distributions are derived.
Pareto Optimal Design for Synthetic Biology.
Patanè, Andrea; Santoro, Andrea; Costanza, Jole; Carapezza, Giovanni; Nicosia, Giuseppe
2015-08-01
Recent advances in synthetic biology call for robust, flexible and efficient in silico optimization methodologies. We present a Pareto design approach for the bi-level optimization problem associated with the overproduction of specific metabolites in Escherichia coli. Our method efficiently explores the high-dimensional genetic manipulation space, finding a number of trade-offs between synthetic and biological objectives, hence furnishing deeper biological insight into the addressed problem and important results for industrial purposes. We demonstrate the computational capabilities of our Pareto-oriented approach by comparing it with state-of-the-art heuristics in the overproduction problems of i) 1,4-butanediol, ii) myristoyl-CoA, iii) malonyl-CoA, iv) acetate and v) succinate. We show that our algorithms are able to gracefully adapt and scale to more complex models and more biologically relevant simulations of the genetic manipulations allowed. The results obtained for 1,4-butanediol overproduction significantly outperform results previously obtained, in terms of 1,4-butanediol to biomass formation ratio and knock-out costs. In particular, the overproduction percentage is +662.7%, from 1.425 mmolh⁻¹gDW⁻¹ (wild type) to 10.869 mmolh⁻¹gDW⁻¹, with a knockout cost of 6. The Pareto-optimal designs we have found in the fatty acid optimizations strictly dominate the ones obtained by the other methodologies, e.g., biomass and myristoyl-CoA exportation improvements of +21.43% (0.17 h⁻¹) and +5.19% (1.62 mmolh⁻¹gDW⁻¹), respectively. Furthermore, the CPU time required by our heuristic approach is more than halved. Finally, we implement pathway-oriented sensitivity analysis, epsilon-dominance analysis and robustness analysis to enhance our biological understanding of the problem and to improve the optimization algorithm capabilities.
Casting defects analysis by the Pareto method
B. Borowiecki
2011-07-01
Full Text Available On the basis of the obtained results, a Pareto-Lorenz diagram was formed. The diagram shows that 3 defect types account for about 70% of the total number of casting defects. These dominant defects stem from three types of causes: sand holes, porosity and slag inclusions. The defects indicate that the design of the gating system needs to be reconsidered. The remaining 8 causes account for only about 25% of the total number of casting defects. Analysis of the obtained results makes it possible to determine the direction of corrective actions in order to eliminate or limit the most frequent defects.
Chen, Jing; Liu, Tundong; Jiang, Hao
2016-01-01
A Pareto-based multi-objective optimization approach is proposed to design multichannel FBG filters. Instead of defining a single optimal objective, the proposed method establishes the multi-objective model by taking two design objectives into account, which are minimizing the maximum index modulation and minimizing the mean dispersion error. To address this optimization problem, we develop a two-stage evolutionary computation approach integrating an elitist non-dominated sorting genetic algorithm (NSGA-II) and technique for order preference by similarity to ideal solution (TOPSIS). NSGA-II is utilized to search for the candidate solutions in terms of both objectives. The obtained results are provided as Pareto front. Subsequently, the best compromise solution is determined by the TOPSIS method from the Pareto front according to the decision maker's preference. The design results show that the proposed approach yields a remarkable reduction of the maximum index modulation and the performance of dispersion spectra of the designed filter can be optimized simultaneously.
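The second stage described above, picking a best compromise from the Pareto front, can be sketched with a toy TOPSIS implementation. The objective values and weights below are made up for illustration and are not taken from the paper:

```python
import math

def topsis(front, weights):
    """Rank Pareto-front points (all objectives to be minimized) by TOPSIS
    and return the index of the best compromise solution."""
    m, k = len(front), len(front[0])
    # Vector-normalize each objective column, then apply the weights
    norms = [math.sqrt(sum(front[i][j] ** 2 for i in range(m))) for j in range(k)]
    v = [[weights[j] * front[i][j] / norms[j] for j in range(k)] for i in range(m)]
    # For minimization, the ideal point takes column minima, the anti-ideal the maxima
    ideal = [min(row[j] for row in v) for j in range(k)]
    worst = [max(row[j] for row in v) for j in range(k)]
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Closeness coefficient: 1 = at the ideal point, 0 = at the anti-ideal point
    scores = [dist(row, worst) / (dist(row, ideal) + dist(row, worst)) for row in v]
    return max(range(m), key=lambda i: scores[i])

# Three hypothetical non-dominated designs:
# (maximum index modulation, mean dispersion error)
front = [(0.9, 0.10), (0.6, 0.30), (0.2, 0.90)]
best = topsis(front, weights=(0.5, 0.5))
print(best)  # with equal weights, the balanced middle design wins
```

With equal weights TOPSIS selects the middle trade-off; shifting the weights toward one objective moves the choice toward the corresponding extreme of the front, which is how the decision maker's preference enters.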
Corner Sort for Pareto-Based Many-Objective Optimization.
Wang, Handing; Yao, Xin
2014-01-01
Nondominated sorting plays an important role in Pareto-based multiobjective evolutionary algorithms (MOEAs). When faced with many-objective optimization problems, i.e., multiobjective optimization problems (MOPs) with more than three objectives, the number of comparisons needed in nondominated sorting becomes very large. In view of this, a new corner sort is proposed in this paper. Corner sort first adopts a fast and simple method to obtain a nondominated solution from the corner solutions, and then uses that nondominated solution to ignore the solutions it dominates, saving comparisons. Obtaining the nondominated solutions requires far fewer objective comparisons in corner sort. In order to evaluate its performance, several state-of-the-art nondominated sorts are compared with our corner sort on three kinds of artificial solution sets of MOPs and on the solution sets generated by MOEAs on benchmark problems. On one hand, the experiments on artificial solution sets show the performance on solution sets with different distributions. On the other hand, the experiments on the solution sets generated by MOEAs show the influence that different sorts have on MOEAs. The results show that corner sort performs well, especially on many-objective optimization problems. Corner sort uses fewer comparisons than the others.
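The comparisons that corner sort economizes are ordinary Pareto-dominance checks. A naive O(n²) baseline (the reference point corner sort improves on, not corner sort itself) looks like this:

```python
def dominates(a, b):
    # a Pareto-dominates b (minimization): no worse in every objective,
    # strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    # Naive filter comparing every pair; corner sort's contribution is
    # cutting the number of these objective comparisons down
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 2), (4, 1), (3, 3), (5, 5)]
print(nondominated(pts))  # (3, 3) and (5, 5) are dominated by (2, 2)
```

Each `dominates` call costs up to 2m objective comparisons for m objectives, which is why the comparison count explodes in many-objective settings and motivates the corner-solution shortcut.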
Pareto 80/20 Law: Derivation via Random Partitioning
Lipovetsky, Stan
2009-01-01
The Pareto 80/20 Rule, also known as the Pareto principle or law, states that a small number of causes (20%) is responsible for a large percentage (80%) of the effect. Although widely recognized as a heuristic rule, this proportion has not been theoretically based. The article considers derivation of this 80/20 rule and some other standard…
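A standard way to tie the 80/20 proportion to a Pareto distribution (a textbook identity, not necessarily the article's random-partitioning derivation) uses the share of the total held by the richest fraction p, which for a pure Pareto law with exponent alpha is p^(1 - 1/alpha):

```python
import math

def top_share(p, alpha):
    # Share of the total held by the richest fraction p under a pure Pareto law
    return p ** (1.0 - 1.0 / alpha)

# Solve top_share(0.2, alpha) = 0.8 for alpha, in closed form via logarithms
alpha = 1.0 / (1.0 - math.log(0.8) / math.log(0.2))
print(round(alpha, 3))                  # the familiar exponent near 1.16
print(round(top_share(0.2, alpha), 3))  # recovers the 80% share
```

Exactly 80/20 corresponds to alpha ≈ 1.161; larger exponents give milder concentration (e.g., alpha = 2 yields roughly a 55/20 split).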
Pareto Improving Price Regulation when the Asset Market is Incomplete
Herings, P.J.J.; Polemarchakis, H.M.
1999-01-01
When the asset market is incomplete, competitive equilibria are constrained suboptimal, which provides a scope for Pareto improving interventions. Price regulation can be such a Pareto improving policy, even when the welfare effects of rationing are taken into account. An appealing aspect of price regulation...
Joseph Femia (ed.), Vilfredo Pareto (London: Ashgate, 2009)
Giorgio Baruchello
2012-03-01
Full Text Available All contemporary textbooks in the social sciences hail Vilfredo Pareto (1848—1923 as one of the founding fathers of modern sociology, alongside celebrated classics such as Auguste Comte, Max Weber and Emile Durkheim. Moreover, Pareto's contribution extends to the field of economics as well, which is an accomplishment that none of the other great sociological minds can boast for himself.
The exponential age distribution and the Pareto firm size distribution
Coad, Alex
2008-01-01
Recent work drawing on data for large and small firms has shown a Pareto distribution of firm size. We mix a Gibrat-type growth process among incumbents with an exponential distribution of firm’s age, to obtain the empirical Pareto distribution.
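The mechanism in this abstract can be sketched with a toy simulation (parameter values below are illustrative, not the paper's): exponential ages mix the variances of Gibrat-type log-size walks, which fattens the tails of the size distribution well beyond lognormal:

```python
import math
import random

def gibrat_with_exponential_age(n=100_000, hazard=0.2, sigma=0.2, seed=7):
    # Each firm draws an exponential age (constant exit hazard), then its
    # log-size follows a Gibrat-type random walk for that many periods
    rng = random.Random(seed)
    logs = []
    for _ in range(n):
        age = int(rng.expovariate(hazard))  # age in whole periods
        logs.append(sum(rng.gauss(0.0, sigma) for _ in range(age)))
    return logs

def excess_kurtosis(xs):
    # Sample excess kurtosis: 0 for a Gaussian, positive for fat tails
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3.0

logs = gibrat_with_exponential_age()
k = excess_kurtosis(logs)
print(round(k, 2))  # strongly positive: log-size is Laplace-like, size is Pareto-tailed
```

The exponential mixing of walk lengths produces a roughly Laplace distribution of log-size, i.e., power-law tails in the size itself, which is the Pareto outcome the paper derives analytically.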
Pareto-front shape in multiobservable quantum control
Sun, Qiuyang; Wu, Re-Bing; Rabitz, Herschel
2017-03-01
Many scenarios in the sciences and engineering require simultaneous optimization of multiple objective functions, which are usually conflicting or competing. In such problems the Pareto front, where none of the individual objectives can be further improved without degrading some others, shows the tradeoff relations between the competing objectives. This paper analyzes the Pareto-front shape for the problem of quantum multiobservable control, i.e., optimizing the expectation values of multiple observables in the same quantum system. Analytic and numerical results demonstrate that with two commuting observables the Pareto front is a convex polygon consisting of flat segments only, while with noncommuting observables the Pareto front includes convexly curved segments. We also assess the capability of a weighted-sum method to continuously capture the points along the Pareto front. Illustrative examples with realistic physical conditions are presented, including NMR control experiments on a 1H-13C two-spin system with two commuting or noncommuting observables.
Evolutionary and principled search strategies for sensornet protocol optimization.
Tate, Jonathan; Woolford-Lim, Benjamin; Bate, Iain; Yao, Xin
2012-02-01
Interactions between multiple tunable protocol parameters and multiple performance metrics are generally complex and unknown; finding optimal solutions is generally difficult. However, protocol tuning can yield significant gains in energy efficiency and resource requirements, which is of particular importance for sensornet systems in which resource availability is severely restricted. We address this multi-objective optimization problem for two dissimilar routing protocols and by two distinct approaches. First, we apply factorial design and statistical model fitting methods to reject insignificant factors and locate regions of the problem space containing near-optimal solutions by principled search. Second, we apply the Strength Pareto Evolutionary Algorithm 2 and Two-Archive evolutionary algorithms to explore the problem space, with each iteration potentially yielding solutions of higher quality and diversity than the preceding iteration. Whereas a principled search methodology yields a generally applicable survey of the problem space and enables performance prediction, the evolutionary approach yields viable solutions of higher quality and at lower experimental cost. This is the first study in which sensornet protocol optimization has been explicitly formulated as a multi-objective problem and solved with state-of-the-art multi-objective evolutionary algorithms.
Tractable Pareto Optimization of Temporal Preferences
Morris, Robert; Morris, Paul; Khatib, Lina; Venable, Brent
2003-01-01
This paper focuses on temporal constraint problems where the objective is to optimize a set of local preferences for when events occur. In previous work, a subclass of these problems has been formalized as a generalization of Temporal CSPs, and a tractable strategy for optimization has been proposed, where global optimality is defined as maximizing the minimum of the component preference values. This criterion for optimality, which we call 'Weakest Link Optimization' (WLO), is known to have limited practical usefulness because solutions are compared only on the basis of their worst value; thus, there is no requirement to improve the other values. To address this limitation, we introduce a new algorithm that re-applies WLO iteratively in a way that leads to improvement of all the values. We show the value of this strategy by proving that, with suitable preference functions, the resulting solutions are Pareto Optimal.
Phase transitions in Pareto optimal complex networks
Seoane, Luís F
2015-01-01
The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need of drastic topological changes does not determine the presence, nor the nature, of the phase transitions encountered.
Sato, Hiroyuki; Aguirre, Hernán E.; Kiyoshi, Tanaka
In this work, we propose a novel multi-objective evolutionary algorithm (MOEA) which improves the search performance of MOEAs, especially for many-objective combinatorial optimization problems. Pareto dominance based MOEAs such as NSGA-II and SPEA2 have difficulty ranking solutions in the population, which noticeably deteriorates search performance as the number of objectives increases. In the proposed method, we rank solutions by calculating Pareto partial dominance between solutions using r objective functions selected from the m objective functions, to induce appropriate selection pressure in many-objective optimization by a Pareto-based MOEA. Also, we periodically switch the r objective functions among the mCr combinations every interval of Ig generations, to optimize all of the objective functions throughout the entire evolution process. In this work, we use many-objective 0/1 knapsack problems to show the search performance of the proposed method and analyze its evolution behavior. Simulation results show that there are optimum values for the number of objective functions r considered in the calculation of Pareto partial dominance and for the interval Ig (in generations) that maximize the overall search performance. Also, the search performance of the proposed method is superior to recent state-of-the-art MOEAs, i.e., IBEA, CDAS and MSOPS. Furthermore, the computational time of the proposed method is much less than that of IBEA, CDAS and MSOPS, and comparable to or sometimes less than that of NSGA-II.
Projections onto the Pareto surface in multicriteria radiation therapy optimization
Bokrantz, Rasmus, E-mail: bokrantz@kth.se, E-mail: rasmus.bokrantz@raysearchlabs.com [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, Stockholm SE-100 44, Sweden and RaySearch Laboratories, Sveavägen 44, Stockholm SE-103 65 (Sweden); Miettinen, Kaisa [Optimization and Systems Theory, Department of Mathematics, KTH Royal Institute of Technology, SE-100 44 Stockholm, Sweden and University of Jyvaskyla, Department of Mathematical Information Technology, FI-400 14 University of Jyvaskyla (Finland)
2015-10-15
Purpose: To eliminate or reduce the error to Pareto optimality that arises in Pareto surface navigation when the Pareto surface is approximated by a small number of plans. Methods: The authors propose to project the navigated plan onto the Pareto surface as a postprocessing step to the navigation. The projection attempts to find a Pareto optimal plan that is at least as good as or better than the initial navigated plan with respect to all objective functions. An augmented form of projection is also suggested where dose–volume histogram constraints are used to prevent that the projection causes a violation of some clinical goal. The projections were evaluated with respect to planning for intensity modulated radiation therapy delivered by step-and-shoot and sliding window and spot-scanned intensity modulated proton therapy. Retrospective plans were generated for a prostate and a head and neck case. Results: The projections led to improved dose conformity and better sparing of organs at risk (OARs) for all three delivery techniques and both patient cases. The mean dose to OARs decreased by 3.1 Gy on average for the unconstrained form of the projection and by 2.0 Gy on average when dose–volume histogram constraints were used. No consistent improvements in target homogeneity were observed. Conclusions: There are situations when Pareto navigation leaves room for improvement in OAR sparing and dose conformity, for example, if the approximation of the Pareto surface is coarse or the problem formulation has too permissive constraints. A projection onto the Pareto surface can identify an inaccurate Pareto surface representation and, if necessary, improve the quality of the navigated plan.
Application of the Pareto Principle in Rapid Application Development Model
Vishal Pandey; AvinashBairwa; Sweta Bhattacharya
2013-01-01
The Pareto principle, most popularly termed the 80/20 rule, is one of the well-known theories in the field of economics. This rule of thumb was named after the great economist Vilfredo Pareto. The Pareto principle was proposed by a renowned management consultant, Joseph M. Juran. The rule states that 80% of the required work can be completed in 20% of the time allotted. The idea is to apply this rule of thumb in the Rapid Application Development (RAD) process model of software engineering. ...
Ichinose, Genki
2015-09-01
Cooperation is a behavior that benefits others while incurring costs to the actor. Thus, natural selection favors defection (non-cooperation), which unilaterally takes the benefits without paying any costs, rather than cooperation. Despite this logical consequence, reality is the opposite: Cooperation is ubiquitous at any level from genomes to human societies. This contradiction is known as the puzzle of the evolution of cooperation. For a long time, evolutionary game theorists have used the prisoner's dilemma game (PD) and the chicken game (CH) as the standard models to solve this puzzle. For these researchers, it is recognized that a specific mechanism is needed for the evolution of cooperation [1]. Five mechanisms are proposed: kin selection, direct reciprocity, indirect reciprocity, network reciprocity, and group selection. By using the donor and recipient game (D&R), which is one of the particular forms of PD, Nowak theoretically showed that once benefit (b), cost (c), and the other one or two parameters for each mechanism are given, we (evolutionary game theorists) can immediately know whether cooperation evolves [1]. The point here is that he included those unique parameters for each mechanism into PD and then reformulated the payoff matrix. Therefore, we can use this extended PD as the first scaling parameters.
A Pareto Optimal Auction Mechanism for Carbon Emission Rights
Mingxi Wang
2014-01-01
Full Text Available The carbon emission rights do not fit well into the framework of existing multi-item auction mechanisms because of their own unique features. This paper proposes a new auction mechanism which converges to a unique Pareto optimal equilibrium in a finite number of periods. In the proposed auction mechanism, the assignment outcome is Pareto efficient and the carbon emission rights’ resources are efficiently used. For commercial application and theoretical completeness, both discrete and continuous markets—represented by discrete and continuous bid prices, respectively—are examined, and the results show the existence of a Pareto optimal equilibrium under the constraint of individual rationality. With no ties, the Pareto optimal equilibrium can be further proven to be unique.
Joseph Femia (ed.), Vilfredo Pareto (London: Ashgate, 2009)
Giorgio Baruchello
2012-01-01
All contemporary textbooks in the social sciences hail Vilfredo Pareto (1848—1923) as one of the founding fathers of modern sociology, alongside celebrated classics such as Auguste Comte, Max Weber and Emile Durkheim...
Automated Design Framework for Synthetic Biology Exploiting Pareto Optimality.
Otero-Muras, Irene; Banga, Julio R
2017-07-21
In this work we consider Pareto optimality for automated design in synthetic biology. We present a generalized framework based on a mixed-integer dynamic optimization formulation that, given design specifications, allows the computation of Pareto optimal sets of designs, that is, the set of best trade-offs for the metrics of interest. We show how this framework can be used for (i) forward design, that is, finding the Pareto optimal set of synthetic designs for implementation, and (ii) reverse design, that is, analyzing and inferring motifs and/or design principles of gene regulatory networks from the Pareto set of optimal circuits. Finally, we illustrate the capabilities and performance of this framework considering four case studies. In the first problem we consider the forward design of an oscillator. In the remaining problems, we illustrate how to apply the reverse design approach to find motifs for stripe formation, rapid adaption, and fold-change detection, respectively.
Kullback-Leibler divergence and the Pareto-Exponential approximation.
Weinberg, G V
2016-01-01
Recent radar research interests in the Pareto distribution as a model for X-band maritime surveillance radar clutter returns have resulted in analysis of the asymptotic behaviour of this clutter model. In particular, it is of interest to understand when the Pareto distribution is well approximated by an Exponential distribution. The justification for this is that under the latter clutter model assumption, simpler radar detection schemes can be applied. An information theory approach is introduced to investigate the Pareto-Exponential approximation. By analysing the Kullback-Leibler divergence between the two distributions it is possible to not only assess when the approximation is valid, but to determine, for a given Pareto model, the optimal Exponential approximation.
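The flavor of the Pareto-Exponential comparison can be sketched in closed form for a Pareto Type I model with the Exponential rate matched to the Pareto mean excess. This is an illustrative setup; the paper's specific clutter model and its optimal Exponential approximation may differ:

```python
import math

def kl_pareto_vs_exp(alpha, xm=1.0):
    # D(Pareto || Exponential) in closed form, with the Exponential rate
    # lambda = (alpha - 1) / xm matched to the Pareto mean excess (alpha > 1).
    # Derived from E_f[ln f(X) - ln g(X)] under the Pareto density f.
    lam = (alpha - 1.0) / xm
    return (math.log(alpha) - math.log(xm) - (alpha + 1.0) / alpha
            - math.log(lam) + lam * xm / (alpha - 1.0))

# The divergence shrinks as the shape parameter grows: heavy tails fade
# and the Exponential approximation to the Pareto model improves.
for a in (2.0, 5.0, 20.0):
    print(a, round(kl_pareto_vs_exp(a), 4))
```

The expression simplifies to ln(α/(α-1)) - 1/α, which tends to 0 as α grows, matching the intuition that simpler Exponential-based detection schemes become justified for large shape parameters.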
On the Irreconcilability of Pareto and Gibrat Laws
Bottazzi, Giulio
2007-01-01
If business firms face a multiplicative growth process in which their growth rates are independent of their sizes, then these sizes cannot be distributed according to a stationary Pareto distribution. At the same time, the Laplace distribution of growth rates cannot be easily reconciled with a Pareto distribution of firm sizes. Recent contributions, using formal arguments, seem to contradict these statements. We prove that the proposed formal results are wrong.
Evaluating The Effectiveness Of Production Process Using Pareto Analysis
Polák Pavel
2015-03-01
Full Text Available The aim of this paper is to present the possibilities of using the Pareto method in evaluating the effectiveness of machine production processes. The paper deals with the production process of material cutting using progressive technology and subsequent evaluation of its effectiveness and quality. In the production process, we have used the method of material cutting by abrasive water jet. The Pareto analysis was used for eliminating the shortcomings in the quality of the final part.
Kinetics of wealth and the Pareto law.
Boghosian, Bruce M
2014-04-01
An important class of economic models involve agents whose wealth changes due to transactions with other agents. Several authors have pointed out an analogy with kinetic theory, which describes molecules whose momentum and energy change due to interactions with other molecules. We pursue this analogy and derive a Boltzmann equation for the time evolution of the wealth distribution of a population of agents for the so-called Yard-Sale Model of wealth exchange. We examine the solutions to this equation by a combination of analytical and numerical methods and investigate its long-time limit. We study an important limit of this equation for small transaction sizes and derive a partial integrodifferential equation governing the evolution of the wealth distribution in a closed economy. We then describe how this model can be extended to include features such as inflation, production, and taxation. In particular, we show that the model with taxation exhibits the basic features of the Pareto law, namely, a lower cutoff to the wealth density at small values of wealth, and approximate power-law behavior at large values of wealth.
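A minimal direct simulation of the Yard-Sale Model described above (illustrative parameters; the paper works with the Boltzmann-equation and small-transaction limits rather than agent-level simulation, and the run below has no taxation, so wealth condenses):

```python
import random

def yard_sale(n_agents=100, n_trades=200_000, beta=0.1, seed=42):
    # Each trade: a fair coin moves a fraction beta of the POORER party's
    # wealth to the winner; total wealth is conserved by construction
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(n_trades):
        i, j = rng.sample(range(n_agents), 2)
        stake = beta * min(w[i], w[j])
        if rng.random() < 0.5:
            w[i] += stake; w[j] -= stake
        else:
            w[i] -= stake; w[j] += stake
    return w

def gini(values):
    # Gini coefficient from the sorted-values formula
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

final = yard_sale()
g = gini(final)
print(round(sum(final), 3), round(g, 2))  # total conserved; inequality grows from 0
```

Although each agent's expected wealth is unchanged (the coin is fair), the multiplicative fluctuations concentrate wealth over time; the redistribution/taxation mechanism in the paper is what stabilizes the distribution into a Pareto-like form instead.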
Multi objective SNP selection using pareto optimality.
Gumus, Ergun; Gormez, Zeliha; Kursun, Olcay
2013-04-01
Biomarker discovery is a challenging task of bioinformatics especially when targeting high dimensional problems such as SNP (single nucleotide polymorphism) datasets. Various types of feature selection methods can be applied to accomplish this task. Typically, using features versus class labels of samples in the training dataset, these methods aim at selecting feature subsets with maximal classification accuracies. Although finding such class-discriminative features is crucial, selection of relevant SNPs for maximizing other properties that exist in the nature of population genetics such as the correlation between genetic diversity and geographical distance of ethnic groups can also be equally important. In this work, a methodology using a multi objective optimization technique called Pareto Optimal is utilized for selecting SNP subsets offering both high classification accuracy and correlation between genomic and geographical distances. In this method, discriminatory power of an SNP is determined using mutual information and its contribution to the genomic-geographical correlation is estimated using its loadings on principal components. Combining these objectives, the proposed method identifies SNP subsets that can better discriminate ethnic groups than those obtained with sole mutual information and yield higher correlation than those obtained with sole principal components on the Human Genome Diversity Project (HGDP) SNP dataset.
Pareto-path multitask multiple kernel learning.
Li, Cong; Georgiopoulos, Michael; Anagnostopoulos, Georgios C
2015-01-01
A traditional and intuitively appealing Multitask Multiple Kernel Learning (MT-MKL) method is to optimize the sum (thus, the average) of objective functions with (partially) shared kernel function, which allows information sharing among the tasks. We point out that the obtained solution corresponds to a single point on the Pareto Front (PF) of a multiobjective optimization problem, which considers the concurrent optimization of all task objectives involved in the Multitask Learning (MTL) problem. Motivated by this last observation and arguing that the former approach is heuristic, we propose a novel support vector machine MT-MKL framework that considers an implicitly defined set of conic combinations of task objectives. We show that solving our framework produces solutions along a path on the aforementioned PF and that it subsumes the optimization of the average of objective functions as a special case. Using the algorithms we derived, we demonstrate through a series of experimental results that the framework is capable of achieving a better classification performance, when compared with other similar MTL approaches.
Phase transitions in Pareto optimal complex networks.
Seoane, Luís F; Solé, Ricard
2015-09-01
The organization of interactions in complex systems can be described by networks connecting different units. These graphs are useful representations of the local and global complexity of the underlying systems. The origin of their topological structure can be diverse, resulting from different mechanisms including multiplicative processes and optimization. In spatial networks or in graphs where cost constraints are at work, as it occurs in a plethora of situations from power grids to the wiring of neurons in the brain, optimization plays an important part in shaping their organization. In this paper we study network designs resulting from a Pareto optimization process, where different simultaneous constraints are the targets of selection. We analyze three variations on a problem, finding phase transitions of different kinds. Distinct phases are associated with different arrangements of the connections, but the need of drastic topological changes does not determine the presence or the nature of the phase transitions encountered. Instead, the functions under optimization do play a determinant role. This reinforces the view that phase transitions do not arise from intrinsic properties of a system alone, but from the interplay of that system with its external constraints.
Application of the Pareto Principle in Rapid Application Development Model
Vishal Pandey
2013-06-01
Full Text Available The Pareto principle, most popularly termed the 80/20 rule, is one of the well-known theories in the field of economics. This rule of thumb was named after the great economist Vilfredo Pareto; the Pareto principle itself was proposed by the renowned management consultant Joseph M. Juran. The rule states that 80% of the required work can be completed in 20% of the time allotted. The idea is to apply this rule of thumb in the Rapid Application Development (RAD) process model of software engineering. The Rapid Application Development model integrates the end-user in development using iterative prototyping, emphasizing delivery of a series of fully functional prototypes to designated user experts. During the application of the Pareto principle, other concepts such as the Pareto indifference curve and Pareto efficiency also come into the picture. This enables the development team to invest the major amount of time focusing on the major functionalities of the project, as per the requirement prioritization of the customer. The paper involves an extensive study of different unsatisfactory projects in terms of time and financial resources, and the reasons for their failures are analyzed. Based on the possible reasons for failure, a customized RAD model is proposed, integrating the 80/20 rule and advanced software development strategies to develop and deploy excellent-quality software products in minimum time duration. The proposed methodology is such that its application will directly affect the quality of the end product for the better.
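As a rough numerical illustration of the 80/20 heuristic invoked above (the per-requirement value estimates are invented, not from the paper), one can rank requirements by estimated value and find the smallest prefix that covers 80% of the total:

```python
def pareto_cut(values, threshold=0.80):
    """Return how many of the highest-value items are needed
    to cover `threshold` of the total value."""
    ordered = sorted(values, reverse=True)
    total = sum(ordered)
    running = 0.0
    for count, v in enumerate(ordered, start=1):
        running += v
        if running >= threshold * total:
            return count
    return len(ordered)

# Invented per-feature value estimates for ten requirements.
values = [50, 25, 10, 5, 4, 3, 1, 1, 0.5, 0.5]
print(pareto_cut(values))  # → 3 (three of ten features cover 80% of the value)
```

In this invented example, 3 of the 10 requirements (30%) carry 85% of the estimated value, which is the kind of skew the 80/20 rule exploits for prioritization.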
Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan
2017-07-01
Feature reduction is an essential stage in computer-aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives, MRE and mean classification error (MCE), for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy compared with the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective, and optimizing for Pareto-optimal solutions using evolutionary multi-objective optimization, results in more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.
Y. Tang
2006-01-01
Full Text Available This study provides a comprehensive assessment of state-of-the-art evolutionary multiobjective optimization (EMO) tools' relative effectiveness in calibrating hydrologic models. The relative computational efficiency, accuracy, and ease-of-use of the following EMO algorithms are tested: Epsilon Dominance Nondominated Sorted Genetic Algorithm-II (ε-NSGAII), the Multiobjective Shuffled Complex Evolution Metropolis algorithm (MOSCEM-UA), and the Strength Pareto Evolutionary Algorithm 2 (SPEA2). This study uses three test cases to compare the algorithms' performances: (1) a standardized test function suite from the computer science literature, (2) a benchmark hydrologic calibration test case for the Leaf River near Collins, Mississippi, and (3) a computationally intensive integrated surface-subsurface model application in the Shale Hills watershed in Pennsylvania. One challenge and contribution of this work is the development of a methodology for comprehensively comparing EMO algorithms that have different search operators and randomization techniques. Overall, SPEA2 attained competitive to superior results for most of the problems tested in this study. The primary strengths of the SPEA2 algorithm lie in its search reliability and its diversity preservation operator. The biggest challenge in maximizing the performance of SPEA2 lies in specifying an effective archive size without a priori knowledge of the Pareto set. In practice, this would require significant trial-and-error analysis, which is problematic for more complex, computationally intensive calibration applications. ε-NSGAII appears to be superior to MOSCEM-UA and competitive with SPEA2 for hydrologic model calibration. ε-NSGAII's primary strength lies in its ease-of-use due to its dynamic population sizing and archiving which lead to rapid convergence to very high quality solutions with minimal user input. MOSCEM-UA is best suited for hydrologic model calibration applications that have small
Pareto Optimal Insurance Policies in the Presence of Administrative Costs
Aase, Knut K.
2010-01-01
In his classical article in The American Economic Review, Arthur Raviv (1979) examines Pareto optimal insurance contracts when there are ex-post insurance costs c induced by the indemnity I for loss x. Raviv's main result is that a necessary and sufficient condition for the Pareto optimal deductible to be equal to zero is c′(I) = 0 for all I > 0 (or I = 0). We claim that another type of cost function is called for in household insurance, caused by frequent but relatively small ...
Vilfredo Pareto e la fine del Sociale
Francesco Antonelli
2017-08-01
...in place of that of the State or of the public, aims to reconstruct a social sphere that governs itself, breaking with global capitalism: Toni Negri (2003) and Paolo Virno (2014) and, more generally, the post-workerists, but also a part of the Foucauldians, recognize themselves in this radical vision. One of the cultural roots of this theoretical and practical victory of the individual over Society can be traced back to the studies of Vilfredo Pareto. Taking his positions into consideration is important also for understanding their ambiguities. In the first section we concentrate on his early positions, and then move on to analyze what he argued in the Trattato di sociologia generale (1916). Finally, in the conclusions we try to develop some considerations concerning contemporary discourses centered on the subject.
A Comparison of Estimation Techniques for the Three Parameter Pareto Distribution
1985-12-01
In 1897 Vilfredo Pareto (1848-1923), an Italian-born Swiss professor of economics, formulated an empirical law which bears his name (16:233). Pareto's Law... A Comparison of Estimation Techniques for the Three Parameter Pareto Distribution. Thesis, Dennis J. Charek, Major, USAF, AFIT/GSO/MA/85D-3. Approved for public release; distribution unlimited.
An Investigation of the Pareto Distribution as a Model for High Grazing Angle Clutter
2011-03-01
The Pareto Distribution [1-3] is named after the Italian economist Vilfredo Pareto (15 July 1848 – 19 August 1923) [4, 5], and is a power law... 1980). 3. Evans, M., Hastings, N. and Peacock, B., Statistical Distributions, 3rd Edition (Wiley, New York, 2000). 4. Bruno, G., "Pareto, Vilfredo", The New Palgrave: A Dictionary of Economics, 5, 799-804, 1987. 5. Cirillo, R., The Economics of Vilfredo Pareto (Frank Cass Publishers, 1978). 6. Aban, I. B
Pareto-MEC and Its Convergence Analysis
周秀玲; 孙承意
2007-01-01
This paper introduces a new multi-objective evolutionary algorithm, Pareto-MEC, which combines basic Mind Evolutionary Computation (MEC) with Pareto concepts to handle multi-objective problems. The notions of the local Pareto optimal solution set and the local Pareto optimal state set are proposed, and basic probability theory is used to prove that the sequence produced by the similartaxis process converges strongly to the local Pareto optimal state set. Numerical experiments verify the effectiveness of the Pareto-MEC algorithm.
Zipf-Mandelbrot-Pareto model for co-authorship popularity
Ausloos, Marcel
2014-01-01
Each co-author (CA) of any scientist can be given a rank ($r$) of importance according to the number ($J$) of joint publications which the authors have together. In this paper, the Zipf-Mandelbrot-Pareto law, i.e. $J \propto 1/(\nu+r)^{\zeta}$
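The law named in this record has the standard Zipf–Mandelbrot form J ∝ 1/(ν + r)^ζ. A minimal sketch of fitting its exponent by least squares in log space, assuming the shift ν is known and using synthetic counts (not the paper's co-authorship data):

```python
import math

def fit_zipf_mandelbrot(counts, nu=1.0):
    """Least-squares fit of log J = log c - zeta * log(nu + r)
    for ranked counts J_1 >= J_2 >= ...; returns (zeta, c)."""
    xs = [math.log(nu + r) for r in range(1, len(counts) + 1)]
    ys = [math.log(j) for j in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    zeta = -slope
    c = math.exp(my + zeta * mx)
    return zeta, c

# Synthetic counts generated from J = 100 / (1 + r)^1.5, so the fit
# should recover zeta = 1.5 and c = 100 (up to rounding).
counts = [100 / (1 + r) ** 1.5 for r in range(1, 21)]
zeta, c = fit_zipf_mandelbrot(counts, nu=1.0)
print(round(zeta, 3))  # → 1.5
```

In practice ν would itself be estimated (e.g. by scanning values and keeping the best fit); the fixed ν here keeps the sketch short.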
Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing
Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.
2006-01-01
The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval arithm
Considerazioni sul metodo logico-sperimentale di Vilfredo Pareto
Uliano Conti
2017-08-01
Full Text Available This contribution considers the theme of the logico-experimental method in Vilfredo Pareto, with particular reference to the Trattato di Sociologia Generale (1916). Studies that have addressed this topic have highlighted the complexity of Pareto's methodological position (1896-1897; 1906; 1916), emphasizing the characteristics of its evolution (Belohradsky 1974; Busino 1968; Garzia 2006; Palumbo 1984; Ammassari 1995). Readings and interpretations of the logico-experimental method problematize the general theme of method in science and consider it in relation to Pareto's scientific trajectory and to the European socio-cultural climate of his epoch. In the first decades of the twentieth century the impact of the Methodenstreit, which had arisen in economics and then extended to philosophy and the historical-social sciences (Dilthey 1883; Windelband 1894, 1912; Rickert 1899), was in fact strongly felt in Europe. The Methodenstreit revolved around the themes of the cognitive scope of scientific laws and the disciplinary status of the Geisteswissenschaften (Dilthey 1883). Analytical readings of Pareto have grasped the complexity of his thought on method by looking both at the aspects that can be associated with an objectivist intellectual posture (Palumbo 1984; Marletti 2003), also called rationalist (Vaccarini 2013), and at the aspects of the logico-experimental method that can be related to an intellectual tension more attentive to the role of social representations in science (Belohradsky 1974; Ammassari 1995; Federici 1999). Pareto's intellectual position evolved over time, thus outlining a horizon of sociological studies that take up his lesson, bringing out complex aspects that significantly concern Paretian thought on method and suggesting reflections on the character of contemporary social research.
Some properties of Pareto-eigenvalues of higher-order tensors
徐凤; 凌晨
2015-01-01
We consider the eigenvalue complementarity problem for higher-order tensors. Since computing the largest Pareto-eigenvalue of a tensor is an NP-hard problem, we focus on the estimation of Pareto-eigenvalues and give several properties of the Pareto-eigenvalues of Z-tensors and M-tensors.
Sakai, Shoko; Kawakita, Atsushi; Ooi, Kazuyuki; Inoue, Tamiji
2013-03-01
Diversification of floral traits in angiosperms is often thought to have been driven by adaptation to pollinators. Nevertheless, phylogenetic studies on the relationships among evolutionary changes in floral traits and pollination systems are still limited. We examined the relationships between floral trait changes and pollinator shifts in Bornean gingers (Zingiberaceae). These plants have strongly zygomorphic flowers pollinated by spiderhunter birds, bees of the genus Amegilla, and halictid bees. • We identified pollination systems through field observations and recorded petal color, quantity of floral rewards, and seven measures of flower morphology in 28 ginger species. Phylogenetic trees were constructed from nucleotide sequences of the matK and ITS regions. We examined the correlations between the evolution of pollination systems and floral traits using phylogenetically independent contrasts. • A significant association was found between pink color and spiderhunter pollination, orange and Amegilla pollination, and yellow and white and halictid pollination. Sugar production was higher in spiderhunter-pollinated species and lower in halictid-pollinated ones. Meanwhile, there was a significant association only for a subset of the floral morphological characters measured. Floral tube length, which is often thought to evolve to match the lengths of pollinator probing apparatuses, did not show any correlation. • There is considerable variation in the strength of association among pollination systems and floral traits. The lack of significant correlation in some traits could partly be explained by floral functions other than pollination, such as adaptations to prevent herbivore damage to the ovules. Further studies on these factors may improve understanding of plant-pollinator interactions.
Income inequality in Romania: The exponential-Pareto distribution
Oancea, Bogdan; Andrei, Tudorel; Pirjol, Dan
2017-03-01
We present a study of the distribution of the gross personal income and income inequality in Romania, using individual tax income data, and both non-parametric and parametric methods. Comparing with official results based on household budget surveys (the Family Budgets Survey and the EU-SILC data), we find that the latter underestimate the income share of the high income region, and the overall income inequality. A parametric study shows that the income distribution is well described by an exponential distribution in the low and middle incomes region, and by a Pareto distribution in the high income region with Pareto coefficient α = 2.53. We note an anomaly in the distribution in the low incomes region (∼9,250 RON), and present a model which explains it in terms of partial income reporting.
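A Pareto tail coefficient such as the α = 2.53 reported above is commonly estimated by maximum likelihood over observations above a threshold x_min, via α̂ = n / Σ ln(x_i / x_min). A sketch on synthetic data (not the Romanian tax records; threshold and sample are invented for illustration):

```python
import math
import random

def pareto_alpha_mle(xs, x_min):
    """MLE of the Pareto index: alpha = n / sum(ln(x_i / x_min))
    over the observations at or above the threshold x_min."""
    tail = [x for x in xs if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic incomes drawn from a Pareto(alpha = 2.5) tail via the
# inverse CDF: X = x_min * U^(-1/alpha).
random.seed(0)
x_min = 10_000.0
sample = [x_min * random.random() ** (-1 / 2.5) for _ in range(50_000)]
print(round(pareto_alpha_mle(sample, x_min), 2))  # close to 2.5
```

With 50,000 draws the standard error of the estimate is about α/√n ≈ 0.011, so the recovered value sits tightly around the true 2.5.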
Decomposition and Simplification of Multivariate Data using Pareto Sets.
Huettenberger, Lars; Heine, Christian; Garth, Christoph
2014-12-01
Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.
Pareto-depth for multiple-query image retrieval.
Hsiao, Ko-Jen; Calder, Jeff; Hero, Alfred O
2015-02-01
Most content-based image retrieval systems consider either one single query, or multiple queries that include the same object or represent the same semantic information. In this paper, we consider the content-based image retrieval problem for multiple query images corresponding to different image semantics. We propose a novel multiple-query information retrieval algorithm that combines the Pareto front method with efficient manifold ranking. We show that our proposed algorithm outperforms state of the art multiple-query retrieval algorithms on real-world image databases. We attribute this performance improvement to concavity properties of the Pareto fronts, and prove a theoretical result that characterizes the asymptotic concavity of the fronts.
PARETO-IMPROVING WATER MANAGEMENT OVER SPACE AND TIME
Pitafi, Basharat A.K.; Roumasset, James A.
2004-01-01
Proposals for marginal cost water pricing have often been found to be politically infeasible because current users will have to pay a higher price even though future users will be better off. We show how efficiency pricing can be rendered Pareto-improving, and thus politically feasible, by compensating the users suffering a loss due to higher prices. We also provide a method for determining efficient spatial and inter-temporal water management for a system with consumption at significantly di...
[Analysis of efficiency of human sleep by Pareto principle].
Verbitskiĭ, E V; Grachev, G A
2013-01-01
We present the results of a systematic assessment of the efficiency of human night sleep, recorded in a large number of people without health problems, together with data on the modeling of sleep structure according to the Pareto principle. In contrast to assessing sleep efficiency as the ratio of the duration of electrophysiological sleep to the total sleep time, the introduced method increases the adequacy of the assessment by taking the gender factor into account in somnology and personalized medicine.
An introduction to synchronous self-learning Pareto strategy
Mozaffari, Ahmad; Fathi, Alireza
2013-01-01
In recent decades, the optimization and control of complex systems that simultaneously possess various conflicting objectives has attracted increasing interest from scientists. This is because of the vast applications of these systems in various fields of real-life engineering, which are generally multi-modal, non-convex and multi-criterion. Hence, many researchers have utilized versatile intelligent models such as Pareto-based techniques, game theory (cooperative and non-cooperative games), neuro...
Pareto optimal design of sectored toroidal superconducting magnet for SMES
Bhunia, Uttam; Saha, Subimal; Chakrabarti, Alok
2014-10-01
A novel multi-objective optimization design approach for a sectored toroidal superconducting magnetic energy storage (SMES) coil has been developed considering practical engineering constraints. The objectives include the minimization of the necessary superconductor length and of the torus overall size or volume, which determine a significant part of the cost of realizing SMES. The best trade-off between the necessary conductor length for winding and the magnet overall size is achieved in the Pareto-optimal solutions: a compact magnet size leads to an increase in the required superconducting cable length, or vice versa. The final choice among Pareto-optimal configurations can be made in relation to other issues, such as AC loss during transient operation, stray magnetic field outside the coil assembly, and available discharge period, which are not considered in the optimization process. The proposed design approach is adapted for a 4.5 MJ/1 MW SMES system using low-temperature niobium-titanium based Rutherford-type cable. Furthermore, the validity of the representative Pareto solutions is confirmed by finite-element analysis (FEA) with reasonably acceptable accuracy.
Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.
Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O
2016-06-01
We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
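The Pareto depth underlying PDA can be illustrated by repeatedly "peeling" nondominated fronts: points on the first front get depth 1, the first front of the remainder gets depth 2, and so on (minimization of both criteria; the toy dissimilarity pairs are invented, and this quadratic-time sketch is not the paper's algorithm):

```python
def pareto_depths(points):
    """Assign each point its Pareto depth under minimization:
    depth 1 = first nondominated front, depth 2 = the front of
    what remains after peeling it off, etc."""
    depths = {}
    remaining = list(points)
    depth = 1
    while remaining:
        front = [p for p in remaining
                 if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                            for q in remaining)]
        for p in front:
            depths[p] = depth
        remaining = [p for p in remaining if p not in front]
        depth += 1
    return depths

# Invented (dissimilarity_1, dissimilarity_2) pairs; under PDA,
# shallow points are the multi-criteria "extremes" of the sample.
pts = [(1, 4), (2, 2), (4, 1), (3, 3), (5, 5)]
print(pareto_depths(pts))
# → {(1, 4): 1, (2, 2): 1, (4, 1): 1, (3, 3): 2, (5, 5): 3}
```

The depth gives a scalar ranking under multiple criteria without choosing weights, which is the property the PDA detector exploits.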
Wear/comfort Pareto optimisation of bogie suspension
Milad Mousavi Bideleh, Seyed; Berbyuk, Viktor; Persson, Rickard
2016-08-01
Pareto optimisation of bogie suspension components is considered for a 50 degrees of freedom railway vehicle model to reduce wheel/rail contact wear and improve passenger ride comfort. Several operational scenarios, including tracks with different curve radii ranging from very small radii up to straight tracks, are considered for the analysis. In each case, the maximum admissible speed is applied to the vehicle. Design parameters are categorised into two levels, and the wear/comfort Pareto optimisation is accordingly accomplished in a multistep manner to improve the computational efficiency. The genetic algorithm (GA) is employed to perform the multi-objective optimisation. Two suspension system configurations are considered: a symmetric one, and an asymmetric one in which the primary or secondary suspension elements on the right- and left-hand sides of the vehicle are not the same. It is shown that the vehicle performance on curves can be significantly improved using the asymmetric suspension configuration. The Pareto-optimised values of the design parameters achieved here guarantee wear reduction and comfort improvement for railway vehicles and can also be utilised in developing reference vehicle models for the design of bogie active suspension systems.
Computing gap free Pareto front approximations with stochastic search algorithms.
Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali
2010-01-01
Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation (which are entirely determined by the archiving strategy and the value of epsilon) have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we aim in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy (multi-objective continuation methods) by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
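Epsilon-dominance archiving of the kind discussed above can be sketched with a box-based archive: objective vectors are mapped to epsilon-boxes, and a candidate enters only if no occupied box dominates its box (a simplified minimization sketch, not the authors' exact strategies; the sample points are invented):

```python
import math

def box(point, eps):
    """Map an objective vector to its epsilon-box index (minimization)."""
    return tuple(math.floor(v / eps) for v in point)

def eps_archive_update(archive, point, eps):
    """Insert `point` into `archive` (a dict box -> point) unless some
    archived box dominates or equals its box; evict boxes its box
    dominates. One representative per box; ties keep the incumbent."""
    b = box(point, eps)
    for ob in archive:
        if all(o <= n for o, n in zip(ob, b)):  # archived box dominates/equals
            return archive
    # remove boxes dominated by the new box
    archive = {ob: p for ob, p in archive.items()
               if not all(n <= o for n, o in zip(b, ob))}
    archive[b] = point
    return archive

archive = {}
for p in [(0.85, 0.12), (0.52, 0.55), (0.47, 0.51), (0.12, 0.91), (0.18, 0.93)]:
    archive = eps_archive_update(archive, p, eps=0.1)
print(sorted(archive.values()))  # → [(0.12, 0.91), (0.47, 0.51), (0.85, 0.12)]
```

The archive size stays bounded by the box grid regardless of how many candidates the search generates, which is what gives the finite-size approximation guarantees; avoiding gaps in the front requires the extra machinery the abstract describes.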
Min-Yin Liu
2017-05-01
Full Text Available Sleep spindles are brief bursts of brain activity in the sigma frequency range (11–16 Hz) measured by electroencephalography (EEG), mostly during non-rapid eye movement (NREM) stage 2 sleep. These oscillations are of great biological and clinical interest because they potentially play an important role in identifying and characterizing the processes of various neurological disorders. Conventionally, sleep spindles are identified by expert sleep clinicians via visual inspection of EEG signals. The process is laborious and the results are inconsistent among different experts. To resolve the problem, numerous computerized methods have been developed to automate the process of sleep spindle identification. Still, the performance of these automated sleep spindle detection methods varies inconsistently from study to study. There are two reasons: (1) the lack of common benchmark databases, and (2) the lack of commonly accepted evaluation metrics. In this study, we focus on tackling the second problem by proposing to evaluate the performance of a spindle detector in a multi-objective optimization context, and hypothesize that using the resultant Pareto fronts for deriving evaluation metrics will improve automatic sleep spindle detection. We use a popular multi-objective evolutionary algorithm (MOEA), the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize six existing frequency-based sleep spindle detection algorithms. They include three Fourier, one continuous wavelet transform (CWT), and two Hilbert-Huang transform (HHT) based algorithms. We also explore three hybrid approaches. Trained and tested on the open-access DREAMS and MASS databases, two new hybrid methods combining Fourier with HHT algorithms show significant performance improvement with F1-scores of 0.726–0.737.
Qinghai He
2013-01-01
Full Text Available In general Banach spaces, we consider a vector optimization problem (SVOP) in which the objective is a set-valued mapping whose graph is the union of finitely many polyhedra or the union of finitely many generalized polyhedra. Dropping the compactness assumption, we establish some results on the structure of the weak Pareto solution set, Pareto solution set, weak Pareto optimal value set, and Pareto optimal value set of (SVOP), and on the connectedness of the Pareto solution set and Pareto optimal value set of (SVOP). In particular, we improve and generalize Arrow, Barankin, and Blackwell's classical results in Euclidean spaces and Zheng and Yang's results in general Banach spaces.
Multicriteria Evolutionary Weather Routing Algorithm in Practice
Joanna Szlapczynska
2013-03-01
Full Text Available The Multicriteria Evolutionary Weather Routing Algorithm (MEWRA) has already been introduced by the author at the earlier TransNav 2009 and 2011 conferences, with a focus on theoretical application to a hybrid-propulsion or motor-driven ship. This paper addresses the topic of possible practical weather routing applications of MEWRA. In the paper some practical advantages of utilizing a Pareto front as the result of multicriteria optimization in route finding are described. The paper describes the notion of Pareto-optimality of routes along with a simplified, easy to follow example. It also discusses the choice of the most suitable ranking method for MEWRA (a comparison between Fuzzy TOPSIS and the Zero Unitarization Method is presented). In addition to that, the paper briefly outlines a commercial application of MEWRA.
Marco Bee
2012-01-01
This paper deals with the estimation of the lognormal-Pareto and the lognormal-Generalized Pareto mixture distributions. The log-likelihood function is discontinuous, so that Maximum Likelihood Estimation is not asymptotically optimal. For this reason, we develop an alternative method based on Probability Weighted Moments. We show that the standard version of the method can be applied to the first distribution, but not to the latter. Thus, in the lognormal-Generalized Pareto case, we work ou...
Ottosson, Rickard O; Engstrom, Per E; Sjöström, David; Behrens, Claus F; Karlsson, Anna; Knöös, Tommy; Ceberg, Crister
2009-01-01
Pareto optimality is a concept that formalises the trade-off between a given set of mutually contradicting objectives. A solution is said to be Pareto optimal when it is not possible to improve one objective without deteriorating at least one of the other. A set of Pareto optimal solutions constitute the Pareto front. The Pareto concept applies well to the inverse planning process, which involves inherently contradictory objectives, high and uniform target dose on one hand, and sparing of surrounding tissue and nearby organs at risk (OAR) on the other. Due to the specific characteristics of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all studied cases. The results did indeed follow the definition of the Pareto concept, i.e. dose sparing of the OAR could not be improved without target coverage being impaired or vice versa. Furthermore, various treatment techniques resulted in distinguished and well separated Pareto fronts. Pareto fronts may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison of parameters with such large inherent discrepancies, e.g. different TPSs, is problematic and should be carefully considered.
Pareto Efficient Policy for Supervisory Power Management Control
Malikopoulos, Andreas [ORNL
2015-01-01
In this paper we address the problem of online optimization of the supervisory power management control in parallel hybrid electric vehicles (HEVs). We model HEV operation as a controlled Markov chain using the long-run expected average cost per unit time criterion, and we show that the control policy yielding the Pareto optimal solution minimizes the average cost criterion online. The effectiveness of the proposed solution is validated through simulation and compared to the solution derived with dynamic programming using the average cost criterion.
Pareto Optimal Solution for Supply Contracts with Multiple Suppliers
CHEN Kebing; GAO Chengxiu
2006-01-01
This paper analyzes an electronic procurement (e-procurement) process between a manufacturer and N suppliers in the e-market. We prove that using the general contract based on auction theory, i.e. the wholesale price contract, would not achieve coordination of the channel composed of the manufacturer and the winning supplier. The paper designs a contract mechanism, i.e. the side-payment price-restricted contract based on auction theory, which not only ensures Pareto optimal solutions for both parties, but also coordinates the supply chain. A numerical experiment is provided to compare the performance of different auction mechanisms and to reinforce key managerial insights generated through the analysis.
MULTI OBJECTIVE ECONOMIC DISPATCH USING PARETO FRONTIER DIFFERENTIAL EVOLUTION
JAGADEESH GUNDA
2011-10-01
Full Text Available The Multi-Objective Economic Dispatch (MOED) problem has gained recent attention due to the deregulation of the power industry and environmental regulations, so generating utilities should optimize their emissions in addition to the operating cost. In this paper a Pareto frontier Differential Evolution (PDE) technique is developed to solve the MOED problem, which provides a set of feasible solutions to the problem. To evaluate the performance and applicability of the proposed method, it is implemented on the standard IEEE 30-bus system with six generating units, including valve-point effects. The results obtained demonstrate the effectiveness of the proposed method for solving the multi-objective economic dispatch problem considering security constraints.
Pareto distance for multi-layer network analysis
Magnani, Matteo; Rossi, Luca
2013-01-01
services, e.g., Facebook, Twitter, LinkedIn and Foursquare. As a result, the analysis of on-line social networks requires a wider scope and, more technically speaking, models for the representation of this fragmented scenario. The recent introduction of more realistic layered models has however determined...... on the nature of the connections required by the Pareto distance may in theory result in a large number of potential shortest paths between pairs of nodes. However, an experimental computation of distances on multi-layer networks of increasing size shows an interesting and non-trivial stable behavior....
Pareto joint inversion of 2D magnetotelluric and gravity data
Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek
2015-04-01
In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project are described. At this stage of development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. A Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on a modified Particle Swarm Optimization (PSO). This algorithm was adapted to handle two or more target functions at once. An additional algorithm was used to eliminate non-realistic solution proposals. Because PSO is a stochastic global optimization method, it requires many proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. The proposed treatment of the joint inversion problem has many advantages. First of all, the Pareto scheme eliminates the cumbersome rescaling of the target functions, which can strongly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data, which are usually very hard to incorporate into a regular inversion scheme. The SBI parameterisation not only limits the problem of dimensionality, but also makes constraining the solution easier. At this stage of work, the decision was made to test the approach using MT and gravity data, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. Presented results were obtained for synthetic models, imitating real geological conditions, where
唐智礼
2006-01-01
Deterministic optimization methods are combined with the Pareto front concept to solve multi-criterion design problems. The algorithm and the numerical implementation are applied to aerodynamic designs. Evolutionary algorithms (EAs) combined with the Pareto front concept are used to solve practical design problems in industry because of their robustness in capturing convex, concave, discrete or discontinuous Pareto fronts of multi-objective optimization problems. However, the process is time-consuming. Therefore, deterministic optimization methods are introduced to capture the Pareto front, and the types of Pareto front that can be captured are explained. Numerical experiments show that the deterministic optimization method can accurately and efficiently capture any convex and some concave Pareto fronts, and is thus a good alternative to EAs for this class of multi-criterion aerodynamic optimization problems.
Using Pareto optimality to explore the topology and dynamics of the human connectome.
Avena-Koenigsberger, Andrea; Goñi, Joaquín; Betzel, Richard F; van den Heuvel, Martijn P; Griffa, Alessandra; Hagmann, Patric; Thiran, Jean-Philippe; Sporns, Olaf
2014-10-05
Graph theory has provided a key mathematical framework to analyse the architecture of human brain networks. This architecture embodies an inherently complex relationship between connection topology, the spatial arrangement of network elements, and the resulting network cost and functional performance. An exploration of these interacting factors and driving forces may reveal salient network features that are critically important for shaping and constraining the brain's topological organization and its evolvability. Several studies have pointed to an economic balance between network cost and network efficiency with networks organized in an 'economical' small-world favouring high communication efficiency at a low wiring cost. In this study, we define and explore a network morphospace in order to characterize different aspects of communication efficiency in human brain networks. Using a multi-objective evolutionary approach that approximates a Pareto-optimal set within the morphospace, we investigate the capacity of anatomical brain networks to evolve towards topologies that exhibit optimal information processing features while preserving network cost. This approach allows us to investigate network topologies that emerge under specific selection pressures, thus providing some insight into the selectional forces that may have shaped the network architecture of existing human brains.
PARETO OPTIMAL SOLUTIONS FOR MULTI-OBJECTIVE GENERALIZED ASSIGNMENT PROBLEM
S. Prakash
2012-01-01
Full Text Available
The Multi-Objective Generalized Assignment Problem (MGAP) with two objectives, one linear and the other non-linear, is considered, with the constraint that a job is assigned to only one worker, though a worker may be assigned more than one job, depending upon the time available to him. An algorithm is proposed to find the set of Pareto optimal solutions of the problem, determining assignments of jobs to workers under the two objectives without setting priorities between them. The two objectives are to minimise the total cost of the assignment and to reduce the time taken to complete all the jobs.
Determination of Pareto frontier in multi-objective maintenance optimization
Certa, Antonella [Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo 90128 Palermo (Italy); Galante, Giacomo, E-mail: galante@dtpm.unipa.i [Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo 90128 Palermo (Italy); Lupo, Toni; Passannanti, Gianfranco [Dipartimento di Tecnologia Meccanica, Produzione e Ingegneria Gestionale, Universita di Palermo 90128 Palermo (Italy)
2011-07-15
The objective of a maintenance policy generally is the minimization of the global maintenance cost, which involves not only the direct costs of the maintenance actions and the spare parts, but also those due to system stops for preventive maintenance and downtime after failure. For some operating systems, the failure event can be dangerous, so they are required to operate with a very high reliability level between two consecutive fixed stops. The present paper attempts to identify the set of elements on which to perform maintenance actions so that the system can assure the required reliability level until the next fixed stop for maintenance, minimizing both the global maintenance cost and the total maintenance time. In order to solve this constrained multi-objective optimization problem, an effective approach is proposed to obtain the best solutions (that is, the Pareto optimal frontier) among which the decision maker will choose the most suitable one. As is well known, describing the whole Pareto optimal frontier is generally a troublesome task. The paper proposes an algorithm able to rapidly overcome this problem, and its effectiveness is shown by an application to a case study regarding a complex series-parallel system.
Estimations of Pareto-eigenvalues for Higher-order Tensors
徐凤; 凌晨
2015-01-01
Eigenvalue complementarity problems of tensors have many practical applications and are closely connected with a class of high-order homogeneous polynomial optimization problems, which are NP-hard. In this paper, some estimation properties of Pareto-eigenvalues of high-order tensors are studied. We also prove that all the Pareto-eigenvalues of a symmetric strong M-tensor and of a monotone tensor are positive.
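For context, the Pareto-eigenvalue studied here is usually defined through the tensor eigenvalue complementarity problem. A standard formulation, stated as background rather than taken from this abstract, is:

```latex
\lambda \in \mathbb{R} \text{ is a Pareto-eigenvalue of an } m\text{-th order, } n\text{-dimensional tensor } \mathcal{A}
\text{ if there exists } x \in \mathbb{R}^n,\ x \neq 0, \text{ such that}
\[
x \ge 0, \qquad \mathcal{A}x^{m-1} - \lambda x^{[m-1]} \ge 0, \qquad
x^{\top}\!\left(\mathcal{A}x^{m-1} - \lambda x^{[m-1]}\right) = 0,
\]
\[
\text{where } \left(\mathcal{A}x^{m-1}\right)_i = \sum_{i_2,\dots,i_m=1}^{n} a_{i\,i_2\cdots i_m}\, x_{i_2} \cdots x_{i_m}
\quad \text{and} \quad x^{[m-1]} = \left(x_1^{m-1},\dots,x_n^{m-1}\right).
\]
```

The complementarity condition forces each component of $x$ to be either zero or to satisfy the eigenvalue equation exactly, which is what links these eigenvalues to constrained polynomial optimization.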
An asymptotically unbiased minimum density power divergence estimator for the Pareto-tail index
Dierckx, Goedele; Goegebeur, Yuri; Guillou, Armelle
2013-01-01
We introduce a robust and asymptotically unbiased estimator for the tail index of Pareto-type distributions. The estimator is obtained by fitting the extended Pareto distribution to the relative excesses over a high threshold with the minimum density power divergence criterion. Consistency...
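The authors' minimum density power divergence estimator is not reproduced here. As a simpler reference point, the classical Hill estimator of the same Pareto-type tail index can be sketched; this is a hedged illustration on simulated strict-Pareto data, not the paper's robust method:

```python
import math
import random

def hill_estimator(sample, k):
    """Classical Hill estimator of the Pareto tail index gamma,
    based on the k largest order statistics of the sample."""
    xs = sorted(sample, reverse=True)        # X_(1) >= X_(2) >= ...
    top, threshold = xs[:k], xs[k]
    return sum(math.log(x / threshold) for x in top) / k

# Strict Pareto(gamma = 0.5) sample via inverse transform: X = U ** (-gamma).
random.seed(0)
sample = [random.random() ** -0.5 for _ in range(20000)]
gamma_hat = hill_estimator(sample, k=2000)   # close to 0.5 for this strict Pareto data
```

The Hill estimator is consistent but neither robust nor bias-corrected, which is precisely the gap the extended-Pareto fit with the density power divergence criterion is designed to close.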
Anna Michalak
2014-01-01
Full Text Available In this paper we examine the concept of Pareto optimality in a simplified Gale economic model without assuming continuity of the utility functions. We apply some existing results on higher-order optimality conditions to get necessary and sufficient conditions for a locally Pareto optimal allocation.
Strong Convergence Bound of the Pareto Index Estimator under Right Censoring
Peng Zuoxiang
2010-01-01
Full Text Available Let X_1, X_2, ... be a sequence of positive independent and identically distributed random variables with common Pareto-type distribution function satisfying 1 - F(x) = x^(-1/γ) ℓ(x) as x → ∞, where ℓ represents a slowly varying function at infinity. In this note we study the strong convergence bound of a kind of right-censored Pareto index estimator under second-order regularly varying conditions.
A Hierarchical Evolutionary Algorithm for Multiobjective Optimization in IMRT
Holdsworth, Clay; Liao, Jay; Phillips, Mark H
2012-01-01
Purpose: Current inverse planning methods for IMRT are limited because they are not designed to explore the trade-offs between the competing objectives of the tumor and normal tissues. Our goal was to develop an efficient multiobjective optimization algorithm that was flexible enough to handle any form of objective function and that resulted in a set of Pareto optimal plans. Methods: We developed a hierarchical evolutionary multiobjective algorithm designed to quickly generate a diverse Pareto optimal set of IMRT plans that meet all clinical constraints and reflect the trade-offs in the plans. The top level of the hierarchical algorithm is a multiobjective evolutionary algorithm (MOEA). The genes of the individuals generated in the MOEA are the parameters that define the penalty function minimized during an accelerated deterministic IMRT optimization, which represents the bottom level of the hierarchy. The MOEA incorporates clinical criteria to restrict the search space through protocol objectives and then...
Mariano Frutos-Alazard
2012-01-01
Full Text Available Planning, in the production setting, is in charge of designing, coordinating, managing and controlling all the operations present in the exploitation of productive systems. Within this framework, numerous Multi-Objective Optimization Problems (MOPs) arise. These consist of several functions which tend to be complex and expensive to evaluate. Multi-objective optimization is the discipline that tries to find the solutions, called Pareto optimal, to this type of problem. The complexity of solving MOPs is due to the dimensions of the problem, the combinatorial character of the algorithms, and the nature of the objectives, which are linked to the efficiency of the system. In the last decades, many production-related MOPs have been treated successfully with resolution techniques based on Genetic Algorithms. In this work, NSGAII (Non-dominated Sorting Genetic Algorithm II), SPEAII (Strength Pareto Evolutionary Algorithm II) and their predecessors, NSGA and SPEA, are evaluated in the planning of non-standardized production. After the experiments carried out, the NSGAII algorithm showed the greatest efficiency.
Energy-Efficient Scheduling Problem Using an Effective Hybrid Multi-Objective Evolutionary Algorithm
Lvjiang Yin
2016-12-01
Full Text Available Nowadays, manufacturing enterprises face the challenge of just-in-time (JIT) production and energy saving. Therefore, the study of JIT production and energy consumption is necessary and important in manufacturing sectors. Moreover, energy saving can be attained through operational methods and through turning idle machines off and on, which also increases the complexity of problem solving. Thus, most researchers still focus on small-scale problems with one objective in a single-machine environment. However, the scheduling problem is a multi-objective optimization problem in real applications. In this paper, a single-machine scheduling model with controllable processing and sequence-dependent setup times is developed for minimizing the total earliness/tardiness (E/T), cost, and energy consumption simultaneously. An effective multi-objective evolutionary algorithm called the local multi-objective evolutionary algorithm (LMOEA) is presented to tackle this multi-objective scheduling problem. To accommodate the characteristics of the problem, a new solution representation is proposed, which can convert discrete combinatorial problems into continuous problems. Additionally, a multiple local search strategy with a self-adaptive mechanism is introduced into the proposed algorithm to enhance the exploitation ability. The performance of the proposed algorithm is evaluated on instances with comparison to other multi-objective meta-heuristics such as the Nondominated Sorting Genetic Algorithm II (NSGA-II), Strength Pareto Evolutionary Algorithm 2 (SPEA2), Multiobjective Particle Swarm Optimization (OMOPSO), and Multiobjective Evolutionary Algorithm Based on Decomposition (MOEA/D). Experimental results demonstrate that the proposed LMOEA algorithm outperforms its counterparts for this kind of scheduling problem.
Pareto analysis of critical factors affecting technical institution evaluation
Victor Gambhir
2012-08-01
Full Text Available With the change of education policy in 1991, more and more technical institutions are being set up in India. Some of these institutions provide quality education, but others merely concentrate on quantity. Stakeholders are thus in a state of confusion about the decision to select the best institute for their higher educational studies. Although various agencies, including print media, provide rankings of these institutions every year, their results are controversial and biased. In this paper, the authors have made an endeavor to find the critical factors for technical institution evaluation from a literature survey. A Pareto analysis has also been performed to find the intensity of these critical factors in evaluation. This will not only help stakeholders in taking the right decisions, but will also help the management of institutions in benchmarking, by identifying the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
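Pareto analysis of the kind used in this study, and in the accident-investigation record later in this section, is the classic 80/20 screening: rank factors by frequency and keep the "vital few" that account for most of the total. A small Python sketch follows; the factor names and tallies are hypothetical, invented only to illustrate the computation:

```python
def pareto_analysis(factor_counts, threshold=0.80):
    """Rank factors by count and return the 'vital few' that together
    account for at least `threshold` of the total (80/20 screening)."""
    ranked = sorted(factor_counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(factor_counts.values())
    vital, cum = [], 0.0
    for name, count in ranked:
        vital.append(name)
        cum += count / total
        if cum >= threshold:
            break
    return vital

# Hypothetical evaluation-factor tallies, for illustration only.
counts = {"faculty": 45, "placements": 30, "infrastructure": 10,
          "research": 8, "fees": 4, "location": 3}
vital_few = pareto_analysis(counts)
# -> ["faculty", "placements", "infrastructure"] (first factors to reach 80%)
```

A Pareto chart is simply this ranking drawn as descending bars with the cumulative percentage overlaid; the cut at the threshold identifies where improvement effort should concentrate.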
Pareto optimization of an industrial ecosystem: sustainability maximization
J. G. M.-S. Monteiro
2010-09-01
Full Text Available This work investigates a procedure to design an Industrial Ecosystem for sequestrating CO2 and consuming glycerol in a Chemical Complex with 15 integrated processes. The Complex is responsible for the production of methanol, ethylene oxide, ammonia, urea, dimethyl carbonate, ethylene glycol, glycerol carbonate, β-carotene, 1,2-propanediol and olefins, and is simulated using UNISIM Design (Honeywell). The process environmental impact (EI) is calculated using the Waste Reduction Algorithm, while profit (P) is estimated using classic cost correlations. MATLAB (The MathWorks Inc.) is connected to UNISIM to enable optimization. The objective is granting maximum process sustainability, which involves finding a compromise between high profitability and low environmental impact. Sustainability maximization is therefore understood as a multi-criteria optimization problem, addressed by means of the Pareto optimization methodology for trading off P vs. EI.
Foundations of the Pareto Iterated Local Search Metaheuristic
Geiger, Martin Josef
2008-01-01
The paper describes the proposition and application of a local search metaheuristic for multi-objective optimization problems. It is based on two main principles of heuristic search: intensification through variable neighborhoods, and diversification through perturbations and successive iterations in favorable regions of the search space. The concept is successfully tested on permutation flow shop scheduling problems under multiple objectives. While the obtained results are encouraging in terms of their quality, another positive attribute of the approach is its simplicity, as it requires the setting of only very few parameters. The implementation of the Pareto Iterated Local Search metaheuristic is based on the MOOPPS computer system of local search heuristics for multi-objective scheduling, which was awarded the European Academic Software Award 2002 in Ronneby, Sweden (http://www.easa-award.net/, http://www.bth.se/llab/easa_2002.nsf).
Dictatorship, liberalism and the Pareto rule: Possible and impossible
Boričić Branislav
2009-01-01
Full Text Available The current economic crisis has shaken belief in the capacity of neoliberal 'free market' policies. Numerous supporters of state intervention have arisen, and interest in social choice theory has revived. In this paper we consider three standard properties for aggregating individual preferences into social preferences: dictatorship, liberalism and the Pareto rule, together with their formal negations. The context of pure first-order classical logic makes it possible to show how some combinations of the above-mentioned conditions, under the hypothesis of unrestricted domain, form simple and reasonable examples of possible or impossible social choice systems. Due to their simplicity, these examples, including the famous 'liberal paradox', could have a particular didactic value.
Derivative-free generation and interpolation of convex Pareto optimal IMRT plans.
Hoffmann, Aswin L; Siem, Alex Y D; den Hertog, Dick; Kaanders, Johannes H A M; Huizenga, Henk
2006-12-21
In inverse treatment planning for intensity-modulated radiation therapy (IMRT), beamlet intensity levels in fluence maps of high-energy photon beams are optimized. Treatment plan evaluation criteria are used as objective functions to steer the optimization process. Fluence map optimization can be considered a multi-objective optimization problem, for which a set of Pareto optimal solutions exists: the Pareto efficient frontier (PEF). In this paper, a constrained optimization method is pursued to iteratively estimate the PEF up to some predefined error. We use the property that the PEF is convex for a convex optimization problem to construct piecewise-linear upper and lower bounds to approximate the PEF from a small initial set of Pareto optimal plans. A derivative-free Sandwich algorithm is presented in which these bounds are used with three strategies to determine the location of the next Pareto optimal solution such that the uncertainty in the estimated PEF is maximally reduced. We show that an intelligent initial solution for a new Pareto optimal plan can be obtained by interpolation of fluence maps from neighbouring Pareto optimal plans. The method has been applied to a simplified clinical test case using two convex objective functions to map the trade-off between tumour dose heterogeneity and critical organ sparing. All three strategies produce representative estimates of the PEF. The new algorithm is particularly suitable for dynamic generation of Pareto optimal plans in interactive treatment planning.
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
Giller, C A
2011-12-01
The use of conformity indices to optimize Gamma Knife planning is common, but does not address important tradeoffs between dose to tumor and normal tissue. Pareto analysis has been used for this purpose in other applications, but not for Gamma Knife (GK) planning. The goal of this work is to use computer models to show that Pareto analysis may be feasible for GK planning to identify dosimetric tradeoffs. We define a GK plan A to be Pareto dominant to B if the prescription isodose volume of A covers more tumor but not more normal tissue than B, or if A covers less normal tissue but not less tumor than B. A plan is Pareto optimal if it is not dominated by any other plan. Two different Pareto optimal plans represent different tradeoffs between dose to tumor and normal tissue, because neither plan dominates the other. 'GK simulator' software calculated dose distributions for GK plans, and was called repetitively by a genetic algorithm to calculate Pareto dominant plans. Three irregular tumor shapes were tested in 17 trials using various combinations of shots. The mean number of Pareto dominant plans per trial was 59 ± 17 (sd). Different planning strategies were identified by large differences in shot positions, and 70 of the 153 coordinate plots (46%) showed differences of 5 mm or more. The Pareto dominant plans dominated other nearby plans. Pareto dominant plans represent different dosimetric tradeoffs and can be systematically calculated using genetic algorithms. Automatic identification of non-intuitive planning strategies may be feasible with these methods.
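The dominance definition in this abstract mixes a maximized objective (tumor coverage) with a minimized one (normal-tissue coverage), so it differs from the all-minimized form used elsewhere in this section. A sketch of that exact definition follows; the (tumor, normal tissue) volume pairs are hypothetical and chosen only to exercise both clauses:

```python
def gk_dominates(a, b):
    """Dominance as defined in the abstract: plan A dominates plan B if it
    covers more tumor but not more normal tissue, or less normal tissue
    but not less tumor. Plans are (tumor_covered, normal_covered) pairs."""
    tum_a, nrm_a = a
    tum_b, nrm_b = b
    return (tum_a > tum_b and nrm_a <= nrm_b) or (nrm_a < nrm_b and tum_a >= tum_b)

def pareto_optimal(plans):
    """Plans not dominated by any other plan; each survivor is a
    different tumor-vs-normal-tissue tradeoff."""
    return [p for p in plans if not any(gk_dominates(q, p) for q in plans)]

# Hypothetical (tumor cc, normal tissue cc) inside the prescription isodose.
plans = [(9.0, 2.0), (9.5, 3.0), (8.0, 3.5), (9.5, 2.5)]
optimal = pareto_optimal(plans)
# (9.5, 3.0) is dominated by (9.5, 2.5); (8.0, 3.5) is dominated by (9.0, 2.0).
```

In the study, a genetic algorithm searches over shot positions and weights and applies exactly this filter to the resulting dose distributions.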
Sánchez, M S; Sarabia, L A; Ortiz, M C
2012-11-19
Experimental designs for a given task should be selected on the basis of the problem being solved and of criteria that measure their quality. There are several such criteria because there are several aspects to be taken into account when making a choice. The most used criteria are probably the so-called alphabetical optimality criteria (for example, the A-, E-, and D-criteria related to the joint estimation of the coefficients, or the I- and G-criteria related to the prediction variance). Selecting a proper design to solve a problem implies finding a balance among these several criteria that measure the performance of the design in different aspects. Technically this is a problem of multi-criteria optimization, which can be tackled from different views. The approach presented here addresses the problem in its real vector nature, so that ad hoc experimental designs are generated with an algorithm based on evolutionary algorithms to find the Pareto-optimal front. There is no theoretical limit to the number of criteria that can be studied and, contrary to other approaches, not just one experimental design is computed but a set of experimental designs, all of them with the property of being Pareto-optimal in the criteria needed by the user. Besides, the use of an evolutionary algorithm makes it possible to search in both continuous and discrete domains and avoids the need for a set of candidate points, usual in exchange algorithms.
Mahmoodabadi, M J; Taherkhorsandi, M; Bagheri, A
2014-01-01
An optimal robust state feedback tracking controller is introduced to control a biped robot. In the literature, the parameters of such a controller are usually determined by a tedious trial-and-error process. To eliminate this process and design the parameters of the proposed controller, multiobjective evolutionary algorithms (the proposed method, modified NSGAII, the Sigma method, and MATLAB's Toolbox MOGA) are employed in this study. Among the evolutionary optimization algorithms used to design the controller for biped robots, the proposed method performs better at designing the controller, since it provides ample opportunities for designers to choose the most appropriate point based upon the design criteria. Three points are chosen from the nondominated solutions of the obtained Pareto front based on two conflicting objective functions, that is, the normalized summation of angle errors and the normalized summation of control effort. The obtained results elucidate the efficiency of the proposed controller in controlling a biped robot.
Nash, Ulrik William
2014-01-01
The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover...... cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained. The concept of evolutionary...
Ottosson, Rickard O; Engstrom, Per E; Sjöström, David
2008-01-01
of a treatment planning system (TPS), treatment strategy or delivery technique, Pareto fronts for a given case are likely to differ. The aim of this study was to investigate the feasibility of using Pareto fronts as a comparative tool for TPSs, treatment strategies and delivery techniques. In order to sample...... Pareto fronts, multiple treatment plans with varying target conformity and dose sparing of OAR were created for a number of prostate and head & neck IMRT cases. The DVHs of each plan were evaluated with respect to target coverage and dose to relevant OAR. Pareto fronts were successfully created for all...... may be used to evaluate a number of parameters within radiotherapy. Examples are TPS optimization algorithms, the variation between accelerators or delivery techniques and the degradation of a plan during the treatment planning process. The issue of designing a model for unbiased comparison...
Accident investigation of construction sites in Qom city using Pareto chart (2009-2012)
M. H. Beheshti
2015-07-01
Conclusions: Employing Pareto charts as a method for analyzing and identifying accident causes can play an effective role in the management of work-related accidents and in the proper allocation of funds and time.
Selection of influential spreaders in complex networks using Pareto Shell decomposition
Yeruva, Sujatha; Devi, T.; Reddy, Y. Samtha
2016-06-01
The selection of prominent nodes in order to maximize spreading ability is crucial in complex networks. The well-known K-Shell method, which identifies nodes located at the core of a network, is better than degree centrality and betweenness centrality at capturing the spreading ability of a single origin spreader. For multiple origin spreaders, however, the K-Shell method fails to yield similar results when compared to degree centrality. The current research proposes a Pareto-Shell decomposition employing the Pareto front function: its Pareto optimal set comprises non-dominated spreaders with a high ratio of out-degree to in-degree and a high in-degree. Pareto-Shell decomposition outperforms the K-Shell method and degree centrality for multiple origin spreaders, in simulations of an epidemic spreading process.
2008-01-01
In general normed spaces, we consider a multiobjective piecewise linear optimization problem with the ordering cone being convex and having a nonempty interior. We establish that the weak Pareto optimal solution set of such a problem is the union of finitely many polyhedra and that this set is also arcwise connected under the cone convexity assumption on the objective function. Moreover, we provide necessary and sufficient conditions for the existence of weak (sharp) Pareto solutions.
Levitis, Daniel
2015-01-01
… of biological and cultural evolution. Demographic variation within and among human populations is influenced by our biology, and therefore by natural selection and our evolutionary background. Demographic methods are necessary for studying populations of other species, and for quantifying evolutionary fitness…
The geometry of the Pareto front in biological phenotype space.
Sheftel, Hila; Shoval, Oren; Mayo, Avi; Alon, Uri
2013-06-01
When organisms perform a single task, selection leads to phenotypes that maximize performance at that task. When organisms need to perform multiple tasks, a trade-off arises because no phenotype can optimize all tasks. Recent work addressed this question, and assumed that the performance at each task decays with distance in trait space from the best phenotype at that task. Under this assumption, the best-fitness solutions (termed the Pareto front) lie on simple low-dimensional shapes in trait space: line segments, triangles and other polygons. The vertices of these polygons are specialists at a single task. Here, we generalize this finding, by considering performance functions of general form, not necessarily functions that decay monotonically with distance from their peak. We find that, except for performance functions with highly eccentric contours, simple shapes in phenotype space are still found, but with mildly curving edges instead of straight ones. In a wide range of systems, complex data on multiple quantitative traits, which might be expected to fill a high-dimensional phenotype space, is predicted instead to collapse onto low-dimensional shapes; phenotypes near the vertices of these shapes are predicted to be specialists, and can thus suggest which tasks may be at play.
Using Pareto points for model identification in predictive toxicology.
Palczewska, Anna; Neagu, Daniel; Ridley, Mick
2013-03-22
Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration but management and automated identification of relevant models from available collections of models is still an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology.
A Pareto-optimal refinement method for protein design scaffolds.
Nivón, Lucas Gregorio; Moretti, Rocco; Baker, David
2013-01-01
Computational design of protein function involves a search for amino acids with the lowest energy subject to a set of constraints specifying function. In many cases a set of natural protein backbone structures, or "scaffolds", are searched to find regions where functional sites (an enzyme active site, ligand binding pocket, protein-protein interaction region, etc.) can be placed, and the identities of the surrounding amino acids are optimized to satisfy functional constraints. Input native protein structures almost invariably have regions that score very poorly with the design force field, and any design based on these unmodified structures may result in mutations away from the native sequence solely as a result of the energetic strain. Because the input structure is already a stable protein, it is desirable to keep the total number of mutations to a minimum and to avoid mutations resulting from poorly-scoring input structures. Here we describe a protocol using cycles of minimization with combined backbone/sidechain restraints that is Pareto-optimal with respect to RMSD to the native structure and energetic strain reduction. The protocol should be broadly useful in the preparation of scaffold libraries for functional site design.
Influence of Pareto optimality on the maximum entropy methods
Peddavarapu, Sreehari; Sunil, Gujjalapudi Venkata Sai; Raghuraman, S.
2017-07-01
Galerkin meshfree schemes are emerging as a viable substitute for the finite element method in solving partial differential equations for large-deformation and crack-propagation problems. The introduction of the Shannon-Jaynes entropy principle into scattered data approximation changed how the approximation functions are defined, resulting in maximum entropy approximants; adding an objective functional that controls the degree of locality then yields local maximum entropy approximants. These are based on an information-theoretic Pareto optimality between entropy and degree of locality that defines the basis functions on the scattered nodes. The degree of locality in turn relies on the choice of the locality parameter and the prior (weight) function, and the proper choice of both plays a vital role in attaining the desired accuracy. The present work focuses on the effect of the locality parameter, which defines the degree of locality, and of the priors (Gaussian, cubic spline and quartic spline functions) on the behavior of local maximum entropy approximants.
Diversity comparison of Pareto front approximations in many-objective optimization.
Li, Miqing; Yang, Shengxiang; Liu, Xiaohui
2014-12-01
Diversity assessment of Pareto front approximations is an important issue in the stochastic multiobjective optimization community. Most of the diversity indicators in the literature were designed to work for any number of objectives of Pareto front approximations in principle, but in practice many of these indicators are infeasible or not workable when the number of objectives is large. In this paper, we propose a diversity comparison indicator (DCI) to assess the diversity of Pareto front approximations in many-objective optimization. DCI evaluates the relative quality of different Pareto front approximations rather than providing an absolute measure of distribution for a single approximation. In DCI, all the concerned approximations are put into a grid environment so that there are some hyperboxes containing one or more solutions. The proposed indicator only considers the contribution of different approximations to nonempty hyperboxes. Therefore, the computational cost does not increase exponentially with the number of objectives. In fact, the implementation of DCI is of quadratic time complexity, which is fully independent of the number of divisions used in the grid. Systematic experiments are conducted using three groups of artificial Pareto front approximations and seven groups of real Pareto front approximations with different numbers of objectives to verify the effectiveness of DCI. Moreover, a comparison with two diversity indicators used widely in many-objective optimization is made analytically and empirically. Finally, a parametric investigation reveals interesting insights into the division number in the grid and also offers some suggested settings to users with different preferences.
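A heavily simplified sketch of the grid idea: solutions from every front approximation are binned into hyperboxes, and each approximation is scored only against the jointly occupied boxes. The real DCI uses a distance-based contribution rather than the plain membership count used here, and all data values below are invented.

```python
import numpy as np

def grid_cells(points, lo, hi, divisions):
    """Map objective vectors to integer grid-cell coordinates."""
    pts = np.asarray(points, dtype=float)
    cells = np.floor((pts - lo) / (hi - lo) * divisions).astype(int)
    cells = np.clip(cells, 0, divisions - 1)   # boundary points into last box
    return {tuple(c) for c in cells}

def coverage_scores(approximations, divisions=5):
    """Simplified grid diversity: each front is scored by the share of
    jointly occupied hyperboxes that it itself covers (the real DCI uses
    a distance-based contribution, not plain membership)."""
    allpts = np.vstack([np.asarray(a, float) for a in approximations])
    lo, hi = allpts.min(axis=0), allpts.max(axis=0)
    occupied = [grid_cells(a, lo, hi, divisions) for a in approximations]
    universe = set().union(*occupied)          # only nonempty boxes matter
    return [len(c) / len(universe) for c in occupied]

# Two 2-objective front approximations: one spread out, one clustered.
spread = [(0.0, 1.0), (0.25, 0.75), (0.5, 0.5), (0.75, 0.25), (1.0, 0.0)]
clustered = [(0.0, 1.0), (0.05, 0.95), (0.1, 0.9)]
scores = coverage_scores([spread, clustered], divisions=5)
# The spread front covers more of the jointly occupied boxes.
```

Because only nonempty boxes enter the score, the work grows with the number of solutions rather than with the number of grid cells, which mirrors why the indicator stays tractable as objectives increase.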
Multiobjective Optimization of Linear Cooperative Spectrum Sensing: Pareto Solutions and Refinement.
Yuan, Wei; You, Xinge; Xu, Jing; Leung, Henry; Zhang, Tianhang; Chen, Chun Lung Philip
2016-01-01
In linear cooperative spectrum sensing, the weights of secondary users and detection threshold should be optimally chosen to minimize missed detection probability and to maximize secondary network throughput. Since these two objectives are not completely compatible, we study this problem from the viewpoint of multiple-objective optimization. We aim to obtain a set of evenly distributed Pareto solutions. To this end, here, we introduce the normal constraint (NC) method to transform the problem into a set of single-objective optimization (SOO) problems. Each SOO problem usually results in a Pareto solution. However, NC does not provide any solution method to these SOO problems, nor any indication on the optimal number of Pareto solutions. Furthermore, NC has no preference over all Pareto solutions, while a designer may be only interested in some of them. In this paper, we employ a stochastic global optimization algorithm to solve the SOO problems, and then propose a simple method to determine the optimal number of Pareto solutions under a computational complexity constraint. In addition, we extend NC to refine the Pareto solutions and select the ones of interest. Finally, we verify the effectiveness and efficiency of the proposed methods through computer simulations.
On the Pareto Boundary for the Two-User Single-Beam MIMO Interference Channel
Cao, Pan; Shi, Shuying
2012-01-01
We consider a two-user multiple-input multiple-output (MIMO) interference channel (IC), where a single data stream is transmitted and each receiver applies the minimum mean square error (MMSE) filter. In this paper, we study an open topic on the Pareto boundary of the rate region. The Pareto boundary is divided by two turning points into the weak Pareto boundary (including the horizontal part and vertical part) and the strict Pareto boundary (including the upper-right part and turning points). The weak Pareto boundary and turning points can be computed exactly. For the strict Pareto boundary, we propose a computationally efficient method called iterative alternating algorithm (IAA) for maximizing the rate of one user while the rate of the other user is fixed. To deal with the difficult coupling of the two transmit beamformers in this optimization problem, we convert it into two single-beamformer optimization problems. Then, by certain equivalent transformations, each problem becomes a quadratically constraine...
A New Evolutionary Algorithm for Solving Multi-Objective Optimization Problems
Chen Wen-ping; Kang Li-shan
2003-01-01
Multi-objective optimization is a new focus of evolutionary computation research. This paper puts forward a new algorithm that not only converges quickly but also maintains diversity within the population efficiently, in order to find the Pareto-optimal set. The algorithm replaces the worst individual with a newly created one produced by multi-parent crossover, so that the population converges near the true Pareto-optimal solutions, and it adopts niching and fitness-sharing techniques to keep the population well distributed. Numerical experiments show that the algorithm is effective on several benchmark problems: whether the Pareto front is convex or non-convex, continuous or discontinuous, and whether or not the problems are constrained, the program performs well.
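A minimal steady-state sketch of the replace-the-worst idea, with multi-parent crossover implemented as a random convex combination of several parents. The niching and fitness-sharing components described in the abstract are omitted, and the biobjective test function and all parameters are assumed for illustration.

```python
import random

def dominates(p, q):
    """p dominates q if no worse in every objective (minimized) and better in one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def domination_count(pop, objs):
    """Number of individuals dominating each member (0 => non-dominated)."""
    f = [objs(x) for x in pop]
    return [sum(dominates(f[j], f[i]) for j in range(len(pop)) if j != i)
            for i in range(len(pop))]

def multi_parent_crossover(parents):
    """Offspring as a random convex combination of several parents."""
    w = [random.random() for _ in parents]
    s = sum(w)
    return [sum(wi * p[k] for wi, p in zip(w, parents)) / s
            for k in range(len(parents[0]))]

# Toy biobjective problem (Schaffer-like): minimize (x^2, (x - 2)^2).
objs = lambda x: (x[0] ** 2, (x[0] - 2.0) ** 2)
random.seed(0)
pop = [[random.uniform(-4, 4)] for _ in range(20)]
for _ in range(200):
    counts = domination_count(pop, objs)
    worst = counts.index(max(counts))             # most-dominated individual
    parents = random.sample(pop, 3)
    pop[worst] = multi_parent_crossover(parents)  # replace worst with child
# The population should drift toward the Pareto set x in [0, 2].
```

Without the fitness-sharing step the population eventually clusters; that is exactly the failure mode the paper's niching mechanism is there to prevent.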
Interleaving Guidance in Evolutionary Multi-Objective Optimization
Lam Thu Bui; Kalyanmoy Deb; Hussein A. Abbass; Daryl Essam
2008-01-01
In this paper, we propose a framework that uses localization for multi-objective optimization to simultaneously guide an evolutionary algorithm in both the decision and objective spaces. The localization is built using a limited number of adaptive spheres (local models) in the decision space. These spheres are usually guided, using some direction information, in the decision space towards the areas with non-dominated solutions. We use a second mechanism to adjust the spheres to specialize on different parts of the Pareto front by using a guided dominance technique in the objective space. Through this interleaved guidance in both spaces, the spheres will be guided towards different parts of the Pareto front while also exploring the decision space efficiently. The experimental results showed good performance for the local models using this dual guidance, in comparison with their original version.
MULTIOBJECTIVE OPTIMIZATION OF A CENTRIFUGAL IMPELLER USING EVOLUTIONARY ALGORITHMS
Li Jun; Liu Lijun; Feng Zhenping
2004-01-01
Application of multiobjective evolutionary algorithms to the aerodynamic design optimization of a centrifugal impeller is presented. The aerodynamic performance of the centrifugal impeller is evaluated using three-dimensional Navier-Stokes solutions. The typical centrifugal impeller is redesigned to maximize the pressure rise and blade load and to minimize the rotational total pressure loss at the given flow conditions. Bézier curves are used to parameterize the three-dimensional impeller blade shape. The present method obtains many reasonable Pareto optimal designs that outperform the original centrifugal impeller. Detailed examination of selected Pareto optimal designs demonstrates the feasibility of the present multiobjective optimization method as a tool for turbomachinery design.
Birds shed RNA-viruses according to the pareto principle.
Mark D Jankowski
Full Text Available A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.
Selecting series size where the generalized Pareto distribution best fits
Ben-Zvi, Arie
2016-10-01
Rates of arrival and magnitudes of hydrologic variables are frequently described by the Poisson and the generalized Pareto (GP) distributions. Variations of their goodness-of-fit to nested series are studied here. The variable employed is the depth of rainfall events at five stations of the Israel Meteorological Service. Series sizes range from about 50 (number of years on record) to about 1000 (total number of recorded events). The goodness-of-fit is assessed by the Anderson-Darling test. Three versions of this test are applied here: the regular two-sided test (whose statistic is designated here by A2), the upper one-sided test (UA2) and the adaptation to the Poisson distribution (PA2). Very good fits, with rejection significance levels higher than 0.5 for A2 and higher than 0.25 for PA2, are found for many series of different sizes. Values of the shape parameter of the GP distribution and of the predicted rainfall depths vary widely with series size. Small coefficients of variation are found, at each station, for the 100-year rainfall depths predicted through the series with very good fit of the GP distribution. Therefore, predictions through series of very good fit appear more consistent than through other selections of series size. Variations of UA2 with series size are found to be narrower than those of A2. Therefore, it is advisable to predict through the series of low UA2. Very good fits of the Poisson distribution to arrival rates are found for series with low UA2, but the reverse relation is not found here. Thus, the model of Poissonian arrival rates and GP-distributed magnitudes suits series with low UA2. It is recommended to predict through the series for which the lowest UA2 is obtained.
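The workflow of fitting a GP distribution to a series and scoring it with the two-sided Anderson-Darling statistic A2 can be sketched as follows. The data are synthetic, and the A2 formula is the standard computational form, not necessarily the exact variant used in the study.

```python
import numpy as np
from scipy import stats

def anderson_darling_a2(sample, cdf):
    """Two-sided Anderson-Darling statistic A2 against a fitted CDF:
    A2 = -n - (1/n) * sum (2i-1) [ln F(x_(i)) + ln(1 - F(x_(n+1-i)))]."""
    z = np.sort(np.clip(cdf(np.sort(sample)), 1e-12, 1 - 1e-12))
    n = len(z)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(z) + np.log(1 - z[::-1])))

# Hypothetical event-depth series (e.g., rainfall exceedances over a threshold).
rng = np.random.default_rng(42)
sample = stats.genpareto.rvs(c=0.1, scale=10.0, size=200, random_state=rng)

# Fit a generalized Pareto distribution with the location fixed at 0.
c, loc, scale = stats.genpareto.fit(sample, floc=0.0)
a2 = anderson_darling_a2(sample, lambda x: stats.genpareto.cdf(x, c, 0.0, scale))
# Smaller A2 => better fit; the study compares A2 across nested series sizes.
```

Repeating this over nested sub-series of growing size and tracking where A2 (or its one-sided counterpart) is lowest is the series-size selection idea described in the abstract.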
Birds shed RNA-viruses according to the pareto principle.
Jankowski, Mark D; Williams, Christopher J; Fair, Jeanne M; Owen, Jennifer C
2013-01-01
A major challenge in disease ecology is to understand the role of individual variation of infection load on disease transmission dynamics and how this influences the evolution of resistance or tolerance mechanisms. Such information will improve our capacity to understand, predict, and mitigate pathogen-associated disease in all organisms. In many host-pathogen systems, particularly macroparasites and sexually transmitted diseases, it has been found that approximately 20% of the population is responsible for approximately 80% of the transmission events. Although host contact rates can account for some of this pattern, pathogen transmission dynamics also depend upon host infectiousness, an area that has received relatively little attention. Therefore, we conducted a meta-analysis of pathogen shedding rates of 24 host (avian) - pathogen (RNA-virus) studies, including 17 bird species and five important zoonotic viruses. We determined that viral count data followed the Weibull distribution, the mean Gini coefficient (an index of inequality) was 0.687 (0.036 SEM), and that 22.0% (0.90 SEM) of the birds shed 80% of the virus across all studies, suggesting an adherence of viral shedding counts to the Pareto Principle. The relative position of a bird in a distribution of viral counts was affected by factors extrinsic to the host, such as exposure to corticosterone and to a lesser extent reduced food availability, but not to intrinsic host factors including age, sex, and migratory status. These data provide a quantitative view of heterogeneous virus shedding in birds that may be used to better parameterize epidemiological models and understand transmission dynamics.
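The two summary statistics reported here, the Gini coefficient and the fraction of hosts accounting for 80% of shedding, are straightforward to compute. The sketch below uses invented Weibull-like counts, not the study's data.

```python
import numpy as np

def gini(x):
    """Gini coefficient of non-negative values (0 = equal, -> 1 = concentrated),
    via the sorted-rank formula: sum((2i - n - 1) x_(i)) / (n * sum(x))."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return np.sum((2 * i - n - 1) * x) / (n * np.sum(x))

def share_shedding(x, load=0.8):
    """Fraction of hosts that together account for `load` of total shedding."""
    x = np.sort(np.asarray(x, dtype=float))[::-1]      # heaviest shedders first
    k = np.searchsorted(np.cumsum(x), load * x.sum()) + 1
    return k / len(x)

# Hypothetical viral-count data with a heavy right tail (Weibull-like).
rng = np.random.default_rng(1)
counts = rng.weibull(0.5, size=1000) * 1e4
g = gini(counts)
top = share_shedding(counts)   # ~0.2 would match the 80/20 Pareto principle
```

A value of `top` near 0.2 alongside a high Gini coefficient is exactly the 20%-shed-80% pattern the meta-analysis reports.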
Song, Q Chelsea; Wee, Serena; Newman, Daniel A
2017-07-27
To reduce adverse impact potential and improve diversity outcomes from personnel selection, one promising technique is De Corte, Lievens, and Sackett's (2007) Pareto-optimal weighting strategy. De Corte et al.'s strategy has been demonstrated on (a) a composite of cognitive and noncognitive (e.g., personality) tests (De Corte, Lievens, & Sackett, 2008) and (b) a composite of specific cognitive ability subtests (Wee, Newman, & Joseph, 2014). Both studies illustrated how Pareto-weighting (in contrast to unit weighting) could lead to substantial improvement in diversity outcomes (i.e., diversity improvement), sometimes more than doubling the number of job offers for minority applicants. The current work addresses a key limitation of the technique-the possibility of shrinkage, especially diversity shrinkage, in the Pareto-optimal solutions. Using Monte Carlo simulations, sample size and predictor combinations were varied and cross-validated Pareto-optimal solutions were obtained. Although diversity shrinkage was sizable for a composite of cognitive and noncognitive predictors when sample size was at or below 500, diversity shrinkage was typically negligible for a composite of specific cognitive subtest predictors when sample size was at least 100. Diversity shrinkage was larger when the Pareto-optimal solution suggested substantial diversity improvement. When sample size was at least 100, cross-validated Pareto-optimal weights typically outperformed unit weights-suggesting that diversity improvement is often possible, despite diversity shrinkage. Implications for Pareto-optimal weighting, adverse impact, sample size of validation studies, and optimizing the diversity-job performance tradeoff are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Craft, David
2010-10-01
A discrete set of points and their convex combinations can serve as a sparse representation of the Pareto surface in multiple objective convex optimization. We develop a method to evaluate the quality of such a representation, and show by example that in multiple objective radiotherapy planning, the number of Pareto optimal solutions needed to represent Pareto surfaces of up to five dimensions grows at most linearly with the number of objectives. The method described is also applicable to the representation of convex sets.
Distributed approximation of Pareto surfaces in multicriteria radiation therapy treatment planning.
Bokrantz, Rasmus
2013-06-07
We consider multicriteria radiation therapy treatment planning by navigation over the Pareto surface, implemented by interpolation between discrete treatment plans. The current state of the art for calculating a discrete representation of the Pareto surface is to sandwich this set between inner and outer approximations that are updated one point at a time. In this paper, we generalize this sequential method to an algorithm that permits parallelization. The principle of the generalization is to apply the sequential method to an approximation of an inexpensive model of the Pareto surface. The information gathered from the model is subsequently used for the calculation of points from the exact Pareto surface, which are processed in parallel. The model is constructed according to the current inner and outer approximations, and given a shape that is difficult to approximate, in order to avoid that parts of the Pareto surface are incorrectly disregarded. Approximations of comparable quality to those generated by the sequential method are demonstrated when the degree of parallelization is up to twice the number of dimensions of the objective space. For practical applications, the number of dimensions is typically at least five, so that a speed-up of one order of magnitude is obtained.
Ziaul Huque
2012-01-01
Full Text Available A Computational Fluid Dynamics (CFD) and response surface-based multiobjective design optimization were performed for six different 2D airfoil profiles, and the Pareto optimal front of each airfoil is presented. FLUENT, a commercial CFD simulation code, was used to determine the relevant aerodynamic loads. The Lift Coefficient (CL) and Drag Coefficient (CD) data at a range of 0° to 12° angles of attack (α) and at three different Reynolds numbers (Re = 68,459; 479,210; and 958,422) for all six airfoils were obtained. The realizable k-ε turbulence model with a second-order upwind solution method was used in the simulations. The standard least squares method was used to generate the response surfaces with the statistical code JMP. The elitist Non-dominated Sorting Genetic Algorithm (NSGA-II) was used to determine the Pareto optimal set based on the response surfaces. Each Pareto optimal solution represents a different compromise between the design objectives, which gives the designer a choice to select the compromise that best suits the requirements from a set of optimal solutions. The Pareto solution set is presented in the form of a Pareto optimal front.
Gregory Gorelik
2014-10-01
Full Text Available In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.
José Alexandre F. Diniz-Filho
2013-10-01
Full Text Available Macroecology focuses on ecological questions at broad spatial and temporal scales, providing a statistical description of patterns in species abundance, distribution and diversity. More recently, historical components of these patterns have begun to be investigated more deeply. We tentatively refer to the practice of explicitly taking species history into account, both analytically and conceptually, as ‘evolutionary macroecology’. We discuss how the evolutionary dimension can be incorporated into macroecology through two orthogonal and complementary data types: fossils and phylogenies. Research traditions dealing with these data have developed more‐or‐less independently over the last 20–30 years, but merging them will help elucidate the historical components of diversity gradients and the evolutionary dynamics of species’ traits. Here we highlight conceptual and methodological advances in merging these two research traditions and review the viewpoints and toolboxes that can, in combination, help address patterns and unveil processes at temporal and spatial macro‐scales.
Gorelik, Gregory; Shackelford, Todd K
2014-08-27
In this article, we advance the concept of "evolutionary awareness," a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities-which we refer to as "intergenerational extended phenotypes"-by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.
Jarosław Rudy
2015-01-01
Full Text Available In this paper the job shop scheduling problem (JSP) with two criteria minimized simultaneously is considered. The JSP is a frequently used model in real-world applications of combinatorial optimization, but multi-objective job shop problems (MOJSP) have rarely been studied. We implement and compare two multi-agent nature-inspired methods, namely ant colony optimization (ACO) and a genetic algorithm (GA), for the MOJSP. Both methods employ a technique taken from multi-criteria decision analysis to establish a ranking of solutions. ACO and GA differ in how they keep information about previously found solutions and their quality, which affects the course of the search. As a result, new features of the Pareto approximations provided by these algorithms are observed: aside from the slight superiority of the ACO method, the Pareto frontier approximations provided by the two methods are disjoint sets. Thus, both methods can be used to search mutually exclusive areas of the Pareto frontier.
Cleaner production for continuous digester processes based on hybrid Pareto genetic algorithm
Anonymous
2003-01-01
The pulping production process produces a large amount of wastewater and emitted pollutants, which has become one of the main pollution sources in the pulp and paper industry. To solve this problem, it is necessary to implement cleaner production by using modeling and optimization technology. This paper studies the model and multi-objective genetic algorithms for the continuous digester process. A model is established in which environmental pollution and energy-saving factors are considered. A hybrid genetic algorithm based on Pareto stratum-niche count is designed for finding near-Pareto or Pareto optimal solutions to the problem, and a new genetic evaluation and selection mechanism is proposed. Computer simulation results using real data from a pulp mill are presented. Comparison with the practical digester curve shows that this method can reduce pollutants effectively and increase profit while keeping the pulp quality constant.
Cleaner production for continuous digester processes based on hybrid Pareto genetic algorithm.
Jin, Fu-Jiang; Wang, Hui; Li, Ping
2003-01-01
The pulping production process produces a large amount of wastewater and emitted pollutants, which has become one of the main pollution sources in the pulp and paper industry. To solve this problem, it is necessary to implement cleaner production by using modeling and optimization technology. This paper studies the modeling and multi-objective genetic algorithms for the continuous digester process. First, a model is established in which environmental pollution and energy-saving factors are considered. Then a hybrid genetic algorithm based on Pareto stratum-niche count is designed for finding near-Pareto or Pareto optimal solutions to the problem, and a new genetic evaluation and selection mechanism is proposed. Finally, computer simulation results using real data from a pulp mill are presented. Comparison with the practical digester curve shows that this method can reduce pollutants effectively and increase profit while keeping the pulp quality unchanged.
Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances
Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder
1992-01-01
As a generalization of the common assumption of exponential distribution of the exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using estimation by both the method of moments and probability-weighted moments. Maintaining the generalized Pareto distribution as the parent exceedance distribution, the T-year event is estimated assuming the exceedances to be exponentially distributed. For moderately long-tailed exceedance distributions and small to moderate sample sizes it is found, by comparing mean square errors of the T-year event estimators, that the exponential distribution is preferable to the correct generalized Pareto distribution despite the introduced model error and despite a possible rejection of the exponential hypothesis by a test of significance. For moderately short-tailed exceedance…
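A sketch of the two estimation routes compared in the abstract: method-of-moments GPD estimators in the Hosking-Wallis parametrization versus the exponential special case, each plugged into the T-year event formula. The threshold, Poisson rate and synthetic excesses below are assumed values, not data from the study.

```python
import numpy as np

def gpd_mom(excesses):
    """Method-of-moments estimates (kappa, alpha) for the GPD in the
    Hosking-Wallis form F(x) = 1 - (1 - kappa*x/alpha)**(1/kappa).
    Uses mean = alpha/(1+kappa), var = alpha^2/((1+kappa)^2 (1+2kappa))."""
    m, v = np.mean(excesses), np.var(excesses, ddof=1)
    c = m * m / v                    # equals 1 + 2*kappa under the model
    kappa = (c - 1.0) / 2.0
    alpha = m * (c + 1.0) / 2.0
    return kappa, alpha

def t_year_event(u, lam, T, kappa, alpha):
    """T-year event for threshold u and Poisson rate lam (events/year)."""
    if abs(kappa) < 1e-9:            # exponential special case (kappa = 0)
        return u + alpha * np.log(lam * T)
    return u + (alpha / kappa) * (1.0 - (lam * T) ** (-kappa))

# Hypothetical partial-duration series: exponential excesses over u = 20 mm,
# with lam = 3 exceedances per year on average.
rng = np.random.default_rng(7)
excesses = rng.exponential(scale=12.0, size=300)
kappa, alpha = gpd_mom(excesses)
x100_gpd = t_year_event(20.0, 3.0, 100.0, kappa, alpha)
x100_exp = t_year_event(20.0, 3.0, 100.0, 0.0, np.mean(excesses))
```

Comparing `x100_gpd` and `x100_exp` over repeated samples is, in miniature, the mean-square-error comparison the paper carries out between the GPD and exponential T-year event estimators.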
A note on the estimation of the Pareto efficient set for multiobjective matrix permutation problems.
Brusco, Michael J; Steinley, Douglas
2012-02-01
There are a number of important problems in quantitative psychology that require the identification of a permutation of the n rows and columns of an n × n proximity matrix. These problems encompass applications such as unidimensional scaling, paired-comparison ranking, and anti-Robinson forms. The importance of simultaneously incorporating multiple objective criteria in matrix permutation applications is well recognized in the literature; however, to date, there has been a reliance on weighted-sum approaches that transform the multiobjective problem into a single-objective optimization problem. Although exact solutions to these single-objective problems produce supported Pareto efficient solutions to the multiobjective problem, many interesting unsupported Pareto efficient solutions may be missed. We illustrate the limitation of the weighted-sum approach with an example from the psychological literature and devise an effective heuristic algorithm for estimating both the supported and unsupported solutions of the Pareto efficient set.
Nash, Ulrik William
2014-01-01
The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments, because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical ... expectations emphasizes not only that causal structure changes are common in social systems but also that causal structures in social systems, and expectations about them, develop together ...
The Stability of the Set of Weakly Pareto-Nash Equilibrium Points of Multiobjective Games
余孝军
2008-01-01
By defining the set of weighted Nash equilibrium points of multiobjective games, the relationship between this set and the set of weakly Pareto-Nash equilibrium points of the corresponding games is obtained. The stability of the set of weakly Pareto-Nash equilibrium points of multiobjective games is proved under certain conditions.
Yan Sun
2015-09-01
Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network. The optimization considers two viewpoints: cost and time. Design/methodology/approach: In this study, a bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. Minimizing the total transportation cost and the total transportation time are set as the optimization objectives of the model. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by gaining its Pareto frontier. The Pareto frontier of the model can provide the multi-modal transportation operator (MTO) and customers with better decision support, and it is gained by the normalized normal constraint method. Then, an experimental case study is designed to verify the feasibility of the model and Pareto optimality by using the mathematical programming software Lingo. Finally, the sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality have good performance in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of the variation of the demand and supply on the multi-modal transportation organization. Therefore, this method can be further promoted to practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. The Pareto frontier based sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case.
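As a minimal sketch of the Pareto-frontier idea (with made-up cost and time numbers, not the paper's Lingo case study), a dominated route can be filtered out of a small candidate set:

```python
# Hypothetical (cost, time) pairs for candidate multi-modal routes;
# both objectives are minimized.
routes = {
    "rail-road":  (520.0, 41.0),
    "road-only":  (610.0, 33.0),
    "water-rail": (450.0, 70.0),
    "air-road":   (980.0, 12.0),
    "road-rail":  (640.0, 45.0),   # dominated by "rail-road"
}

def pareto_routes(options):
    """Keep a route unless some other route is cheaper-or-equal AND
    faster-or-equal, and strictly better in at least one objective."""
    frontier = {}
    for name, (cost, time) in options.items():
        dominated = any(
            (c2 <= cost and t2 <= time) and (c2 < cost or t2 < time)
            for n2, (c2, t2) in options.items() if n2 != name
        )
        if not dominated:
            frontier[name] = (cost, time)
    return frontier

frontier = pareto_routes(routes)
```

The frontier presents the MTO and customers with the full cost/time trade-off instead of a single weighted compromise; methods such as the normalized normal constraint generate such frontiers for continuous models.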
Comparative analysis of Pareto surfaces in multi-criteria IMRT planning
Teichert, K; Suess, P; Serna, J I; Monz, M; Kuefer, K H [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Fraunhofer Platz 1, 67663 Kaiserslautern (Germany); Thieke, C, E-mail: katrin.teichert@itwm.fhg.de [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)
2011-06-21
In the multi-criteria optimization approach to IMRT planning, a given dose distribution is evaluated by a number of convex objective functions that measure tumor coverage and sparing of the different organs at risk. Within this context optimizing the intensity profiles for any fixed set of beams yields a convex Pareto set in the objective space. However, if the number of beam directions and irradiation angles are included as free parameters in the formulation of the optimization problem, the resulting Pareto set becomes more intricate. In this work, a method is presented that allows for the comparison of two convex Pareto sets emerging from two distinct beam configuration choices. For the two competing beam settings, the non-dominated and the dominated points of the corresponding Pareto sets are identified and the distance between the two sets in the objective space is calculated and subsequently plotted. The obtained information enables the planner to decide if, for a given compromise, the current beam setup is optimal. He may then re-adjust his choice accordingly during navigation. The method is applied to an artificial case and two clinical head neck cases. In all cases no configuration is dominating its competitor over the whole Pareto set. For example, in one of the head neck cases a seven-beam configuration turns out to be superior to a nine-beam configuration if the highest priority is the sparing of the spinal cord. The presented method of comparing Pareto sets is not restricted to comparing different beam angle configurations, but will allow for more comprehensive comparisons of competing treatment techniques (e.g. photons versus protons) than with the classical method of comparing single treatment plans.
Anima: un rompicapo per le scienze sociali? Il contributo di Pareto
Roberta Iannone
2017-08-01
It is well known that for Pareto science is not a pure and simple reproduction of the phenomena we observe externally (Pareto is not a mere "positivist"), and that it must concern itself not so much with logico-experimental actions as above all with non-logical ones, since these are the majority, the least known, the most dissimulated and disfigured (because, as we know, the tendency of human beings to deceive themselves is strong, victims as they are of the human instinct for rationalization). It is true, then, that science (with its method, its object and its ideal tension) must be logico-experimental, but this does not mean closing one's eyes to what is not logical and experimental. On the contrary: the whole effort of the Trattato di sociologia generale (V. Pareto, 1916, 1964), as is well known, consists precisely in understanding and explaining, in a logical and scientific manner, the entire infrastructure of sentiments, instincts, impulses and creativity (D. Padua, 2009) that underlies non-logical actions. This is the fundamental problem addressed in the work, the main crux of his studies, but it is also, in my view, what can best help to explain scientifically the concept of the soul (and this regardless of whether or not Pareto expressly used this conceptual category in his works). In what sense, then, can Pareto's sociology contribute to explaining the concept of the soul scientifically? And what is the advantage of this operation? What is the point of saying that Pareto's sociology helps to explain the concept of the soul?
Identifying best-fitting inputs in health-economic model calibration: a Pareto frontier approach.
Enns, Eva A; Cipriano, Lauren E; Simons, Cyrena T; Kong, Chung Yin
2015-02-01
To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500-87,600] v. $139,700 [95% CI 79,900-182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900-156,200] per QALY gained). The TAVR model yielded similar results. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets.
Kangji Li
2017-02-01
This paper is concerned with the development of a high-resolution and control-friendly optimization framework in enclosed environments that helps improve thermal comfort, indoor air quality (IAQ), and the energy costs of the heating, ventilation and air conditioning (HVAC) system simultaneously. A computational fluid dynamics (CFD)-based optimization method which couples algorithms implemented in Matlab with CFD simulation is proposed. The key part of this method is a data interactive mechanism which efficiently passes parameters between CFD simulations and optimization functions. A two-person office room is modeled for the numerical optimization. The multi-objective evolutionary algorithm, the non-dominated-and-crowding Sorting Genetic Algorithm II (NSGA-II), is realized to explore the environment/energy Pareto front of the enclosed space. Performance analysis demonstrates the effectiveness of the presented optimization method.
Agterberg, Frits, E-mail: agterber@nrcan.gc.ca [Geological Survey of Canada (Canada)
2017-07-01
Pareto-lognormal modeling of worldwide metal deposit size–frequency distributions was proposed in an earlier paper (Agterberg in Nat Resour 26:3–20, 2017). In the current paper, the approach is applied to four metals (Cu, Zn, Au and Ag) and a number of model improvements are described and illustrated in detail for copper and gold. The new approach has become possible because of the very large inventory of worldwide metal deposit data recently published by Patiño Douce (Nat Resour 25:97–124, 2016c). Worldwide metal deposits for Cu, Zn and Ag follow basic lognormal size–frequency distributions that form straight lines on lognormal Q–Q plots. Au deposits show a departure from the straight-line model in the vicinity of their median size. Both largest and smallest deposits for the four metals taken as examples exhibit hyperbolic size–frequency relations and their Pareto coefficients are determined by fitting straight lines on log rank–log size plots. As originally pointed out by Patiño Douce (Nat Resour Res 25:365–387, 2016d), the upper Pareto tail cannot be distinguished clearly from the tail of what would be a secondary lognormal distribution. The method previously used in Agterberg (2017) for fitting the bridge function separating the largest deposit size–frequency Pareto tail from the basic lognormal is significantly improved in this paper. A new method is presented for estimating the approximate deposit size value at which the upper tail Pareto comes into effect. Although a theoretical explanation of the proposed Pareto-lognormal distribution model is not a required condition for its applicability, it is shown that existing double Pareto-lognormal models based on Brownian motion generalizations of the multiplicative central limit theorem are not applicable to worldwide metal deposits. Neither are various upper tail frequency amplification models in their present form. Although a physicochemical explanation remains possible, it is argued that ...
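The log rank–log size fitting step can be sketched as follows (synthetic Pareto-distributed sizes, not the worldwide deposit inventory): for a Pareto tail, the rank of a deposit falls off as a power of its size, so the slope of an ordinary least-squares line through the log rank–log size points estimates the negative of the Pareto coefficient.

```python
import math
import random

# Synthetic Pareto sample via inverse transform: X = x_min * U**(-1/alpha).
random.seed(7)
alpha_true, x_min = 1.5, 1.0
sizes = [x_min * (1.0 - random.random()) ** (-1.0 / alpha_true)
         for _ in range(5000)]

sizes.sort(reverse=True)                            # rank 1 = largest deposit
xs = [math.log(s) for s in sizes]                   # log size
ys = [math.log(r + 1) for r in range(len(sizes))]   # log rank

# Ordinary least squares by hand; the fitted slope estimates -alpha.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
alpha_hat = -slope
```

This is only the straight-line Pareto-tail step; the paper's bridge function between the lognormal body and the Pareto tail is a separate fitting problem.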
Andersen, Kurt Munk; Sandqvist, Allan
1997-01-01
We investigate the domain of definition and the domain of values for the successor function of a cooperative differential system x'=f(t,x), where the coordinate functions are concave in x for any fixed value of t. Moreover, we give a characterization of a weakly Pareto optimal solution.
A new Pareto-type distribution with applications in reliability and income data
Bourguignon, Marcelo; Saulo, Helton; Fernandez, Rodrigo Nobre
2016-09-01
A new Pareto-type distribution is introduced and studied. This new model is a generalization of the well-known Pareto distribution. We derive some of its probabilistic and inferential properties. We deduce the mathematical form of the Lorenz curve and the Gini index associated with the new model. The maximum likelihood estimators are derived and their performance are evaluated through a Monte Carlo simulation study. Finally, we illustrate the flexibility of the new distribution by means of three applications to real data sets.
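For context, here is a sketch of the classical Pareto baseline that the new distribution generalizes (synthetic data; the maximum-likelihood formula and the Gini expression 1/(2*alpha - 1) are standard results for the Pareto Type I model, not taken from the paper):

```python
import math
import random

# Synthetic Pareto(x_m, alpha) sample via inverse transform.
random.seed(1)
alpha_true, x_m = 3.0, 2.0
sample = [x_m * (1.0 - random.random()) ** (-1.0 / alpha_true)
          for _ in range(4000)]

# Classical MLE for the shape: alpha_hat = n / sum(log(x_i / x_m_hat)),
# with the scale estimated by the sample minimum.
x_m_hat = min(sample)
alpha_hat = len(sample) / sum(math.log(x / x_m_hat) for x in sample)

# For a Pareto Type I distribution with alpha > 1, the Gini index
# has the closed form 1 / (2*alpha - 1).
gini_hat = 1.0 / (2.0 * alpha_hat - 1.0)
```

The paper derives the analogous Lorenz-curve and Gini-index expressions for its generalized model; the point of the sketch is only how shape estimates feed directly into inequality measures.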
A New Mechanism for Maintaining Diversity of Pareto Archive in Multiobjective Optimization
Hájek, Jaroslav; Šístek, Jakub (doi:10.1016/j.advengsoft.2010.03.003)
2010-01-01
The article introduces a new mechanism for selecting individuals for a Pareto archive. It was combined with a micro-genetic algorithm and tested on several problems. The presented results demonstrate the ability of this approach to produce individuals uniformly distributed along the Pareto set without a negative impact on convergence. The new concept was compared with the NSGA-II, SPEA2, and IBEA algorithms from the PISA package. Another studied effect is population size versus number of generations for small populations.
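A simplified sketch of such an archive mechanism (our own illustration, not the article's exact selection rule): dominated entrants are rejected, members dominated by the entrant are evicted, and near-duplicates in objective space are kept out so the archive stays spread along the front.

```python
# Both objectives are minimized; all numbers are illustrative.

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def too_similar(a, b, eps=0.05):
    return max(abs(x - y) for x, y in zip(a, b)) < eps

def archive_insert(archive, candidate, eps=0.05):
    if any(dominates(member, candidate) for member in archive):
        return archive                            # candidate is dominated
    archive = [m for m in archive if not dominates(candidate, m)]
    if any(too_similar(candidate, m, eps) for m in archive):
        return archive                            # crowded region: skip
    return archive + [candidate]

archive = []
for point in [(1.0, 5.0), (5.0, 1.0), (3.0, 3.0), (3.02, 3.01), (2.0, 2.0)]:
    archive = archive_insert(archive, point)
```

After the loop the archive holds (1.0, 5.0), (5.0, 1.0) and (2.0, 2.0): the near-duplicate entrant was rejected as dominated, and (3.0, 3.0) was evicted once (2.0, 2.0) arrived.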
Swynghedauw, B
2004-04-01
Nothing in biology makes sense except in the light of evolution. Evolutionary, or darwinian, medicine takes the view that contemporary diseases result from incompatibility between the conditions under which evolutionary pressure modified our genetic endowment and the lifestyle and dietary habits in which we are currently living, including the enhanced lifespan, the changes in dietary habits and the lack of physical activity. An evolutionary trait expresses a genetic polymorphism that ultimately improves fitness; it needs millions of years to become functional. A limited genetic diversity is a necessary prerequisite for evolutionary medicine. Nevertheless, the search for a genetic endowment would become nearly impossible if the human races were genetically different. From a genetic point of view, Homo sapiens is homogeneous, and the so-called human races have only a socio-economic definition. Historically, heart failure (HF) had an infectious origin and resulted from mechanical overload, which triggered mechanoconversion through phylogenetically ancient pleiotropic pathways. Adaptation was mainly caused by negative inotropism. More recently, HF has been caused by a complex remodelling driven by the trophic effects of mechanics, ischemia, senescence, diabetes and neurohormones. The generally admitted hypothesis is that cancers are largely caused by a combination of modern reproductive and dietary lifestyles mismatched with genotypic traits, plus the longer time available for a confrontation. This concept is illustrated for skin and breast cancers, and also for the link between cancer risk and dietary habits.
K. Gawdzińska
2011-04-01
The author discusses the use of selected quality management tools, i.e. the Pareto chart and the Ishikawa fishbone diagram, for the description of composite casting defects. The Pareto chart allows one to determine defect priority related to metallic composite castings, while the Ishikawa diagram indicates the causes of defect formation and enables calculating defect weights.
Giesy, D. P.
1978-01-01
A technique is presented for the calculation of Pareto-optimal solutions to a multiple-objective constrained optimization problem by solving a series of single-objective problems. Threshold-of-acceptability constraints are placed on the objective functions at each stage to both limit the area of search and to mathematically guarantee convergence to a Pareto optimum.
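On a discrete candidate set the threshold-of-acceptability idea can be sketched like this (illustrative design points; the actual technique works on continuous constrained problems): each stage is a single-objective problem, minimizing one objective subject to a threshold on the other, and tightening the threshold after each optimum walks along the Pareto front.

```python
# Hypothetical (f1, f2) design evaluations; both objectives are minimized.
designs = [(1.0, 9.0), (2.0, 6.0), (4.0, 4.0),
           (7.0, 2.0), (9.0, 1.5), (5.0, 5.0)]

def stage(threshold):
    """One single-objective stage: minimize f1 subject to f2 <= threshold."""
    feasible = [d for d in designs if d[1] <= threshold]
    return min(feasible) if feasible else None

pareto_trace = []
threshold = max(f2 for _, f2 in designs)       # start fully relaxed
while True:
    best = stage(threshold)
    if best is None:
        break
    pareto_trace.append(best)
    threshold = best[1] - 1e-9                 # tighten past the last optimum
```

Each stage optimum is Pareto-optimal by construction, and the dominated design (5.0, 5.0) is never visited; the thresholds both limit the search region and guarantee progress toward new Pareto points.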
Bouter, Anton; Alderliesten, Tanja; Bosman, Peter A. N.
2017-02-01
Taking a multi-objective optimization approach to deformable image registration has recently gained attention, because such an approach removes the requirement of manually tuning the weights of all the involved objectives. Especially for problems that require large complex deformations, this is a non-trivial task. From the resulting Pareto set of solutions one can then much more insightfully select a registration outcome that is most suitable for the problem at hand. To serve as an internal optimization engine, currently used multi-objective algorithms are competent, but rather inefficient. In this paper we largely improve upon this by introducing a multi-objective real-valued adaptation of the recently introduced Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) for discrete optimization. In this work, GOMEA is tailored specifically to the problem of deformable image registration to obtain substantially improved efficiency. This improvement is achieved by exploiting a key strength of GOMEA: iteratively improving small parts of solutions, allowing the impact of such updates on the objectives at hand to be exploited faster through partial evaluations. We performed experiments on three registration problems. In particular, an artificial problem containing a disappearing structure, a pair of pre- and post-operative breast CT scans, and a pair of breast MRI scans acquired in prone and supine positions were considered. Results show that compared to the previously used evolutionary algorithm, GOMEA obtains a speed-up of up to a factor of 1600 on the tested registration problems while achieving registration outcomes of similar quality.
δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms
Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi
In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four eliminating methods analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms
Pedersen, Gerulf
... of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which is one of the most potent algorithms currently in use, as the foundation for achieving the desired goal. While working with the algorithm, some issues arose which limited the use of the algorithm for unknown problems. These issues included the relative scale of the used fitness functions and the distribution of solutions on the optimal Pareto front. Some work has ...
A Pareto Improving Strategy for the Time-Dependent Morning Commute Problem
Garcia, Reinaldo Crispiniano
1999-01-01
This dissertation describes a strategy which makes all commuters better off (i.e., a Pareto improving strategy) for the time-dependent morning commute problem, even if the collected revenues are not returned to the population of commuters. The proposed strategy applies road pricing as a tool for congestion management, a practice usually called congestion pricing.
On Usage of Pareto curves to Select Wind Turbine Controller Tunings to the Wind Turbulence Level
Odgaard, Peter Fogh
2015-01-01
... to update a model predictive wind turbine controller tuning as the wind turbulence increases, since increased turbulence levels result in higher loads for the same controller tuning. In this paper the Pareto curves are computed using an industrial high-fidelity aero-elastic model. Simulations show ...
Model-based problem solving through symbolic regression via pareto genetic programming
Vladislavleva, E.
2008-01-01
Pareto genetic programming methodology is extended by additional generic model selection and generation strategies that (1) drive the modeling engine to the creation of models of reduced non-linearity and increased generalization capabilities, and (2) improve the effectiveness of the search for robust models.
Approximating the Pareto Set of Multiobjective Linear Programs via Robust Optimization
Gorissen, B.L.; den Hertog, D.
2012-01-01
The Pareto set of a multiobjective optimization problem consists of the solutions for which one or more objectives cannot be improved without deteriorating one or more other objectives. We consider problems with linear objectives and linear constraints and use Adjustable Robust Optimization ...
Gökhan Gökdere
2014-05-01
In this paper, closed-form expressions for the moments of the truncated Pareto order statistics are obtained by using the conditional distribution. We also derive some results for the moments which will be useful for moment computations based on ordered data.
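As a worked example of the kind of closed-form moment involved (the mean of the truncated Pareto distribution itself, a standard result rather than the paper's order-statistics expressions), checked against direct numerical integration of the truncated density:

```python
# Pareto(alpha, L) truncated to [L, H]; illustrative parameter values.
alpha, L, H = 2.5, 1.0, 10.0

# Closed form for alpha != 1:
#   E[X] = alpha * L**alpha / (1 - (L/H)**alpha)
#          * (H**(1-alpha) - L**(1-alpha)) / (1 - alpha)
mean_closed = (alpha * L**alpha / (1.0 - (L / H)**alpha)
               * (H**(1 - alpha) - L**(1 - alpha)) / (1 - alpha))

# Numerical check: midpoint-rule integration of x * f(x), with the
# truncated pdf f(x) = alpha * L**alpha * x**(-alpha-1) / (1 - (L/H)**alpha).
norm = 1.0 - (L / H)**alpha
n = 100_000
h = (H - L) / n
mean_numeric = sum(
    (L + (i + 0.5) * h)
    * alpha * L**alpha * (L + (i + 0.5) * h)**(-alpha - 1) / norm
    for i in range(n)
) * h
```

The closed form and the quadrature agree to several decimal places; the paper's contribution is analogous closed forms for the moments of the order statistics, obtained through the conditional distribution.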
Kyroudi, Archonteia; Petersson, Kristoffer; Ghandour, Sarah; Pachoud, Marc; Matzinger, Oscar; Ozsahin, Mahmut; Bourhis, Jean; Bochud, François; Moeckli, Raphaël
2016-08-01
Multi-criteria optimization provides decision makers with a range of clinical choices through Pareto plans that can be explored during real time navigation and then converted into deliverable plans. Our study shows that dosimetric differences can arise between the two steps, which could compromise the clinical choices made during navigation.
Barmby, Tim; Smith, Nina
1996-01-01
This paper analyses the labour supply behaviour of households in Denmark and Britain. It employs models in which the preferences of individuals within the household are explicitly represented. The households are then assumed to decide on their labour supply in a Pareto-Optimal fashion. Describing...
A hybrid pareto mixture for conditional asymmetric fat-tailed distributions.
Carreau, Julie; Bengio, Yoshua
2009-07-01
In many cases, we observe some variables X that contain predictive information over a scalar variable of interest Y , with (X,Y) pairs observed in a training set. We can take advantage of this information to estimate the conditional density p(Y|X = x). In this paper, we propose a conditional mixture model with hybrid Pareto components to estimate p(Y|X = x). The hybrid Pareto is a Gaussian whose upper tail has been replaced by a generalized Pareto tail. A third parameter, in addition to the location and spread parameters of the Gaussian, controls the heaviness of the upper tail. Using the hybrid Pareto in a mixture model results in a nonparametric estimator that can adapt to multimodality, asymmetry, and heavy tails. A conditional density estimator is built by modeling the parameters of the mixture estimator as functions of X. We use a neural network to implement these functions. Such conditional density estimators have important applications in many domains such as finance and insurance. We show experimentally that this novel approach better models the conditional density in terms of likelihood, compared to competing algorithms: conditional mixture models with other types of components and a classical kernel-based nonparametric model.
Pospelov, A. I.
2016-08-01
Adaptive methods for the polyhedral approximation of the convex Edgeworth-Pareto hull in multiobjective monotone integer optimization problems are proposed and studied. For these methods, theoretical convergence rate estimates with respect to the number of vertices are obtained. The estimates coincide in order with those for filling and augmentation H-methods intended for the approximation of nonsmooth convex compact bodies.
The spatial meaning of Pareto's scaling exponent of city-size distribution
Chen, Yanguang
2013-01-01
The scaling exponent of a hierarchy of cities used to be regarded as a fractal parameter. The Pareto exponent was treated as the fractal dimension of size distribution of cities, while the Zipf exponent was treated as the reciprocal of the fractal dimension. However, this viewpoint is not exact. In this paper, I will present a new interpretation of the scaling exponent of rank-size distributions. The ideas from fractal measure relation and the principle of dimension consistency are employed to explore the essence of Pareto's and Zipf's scaling exponents. The Pareto exponent proved to be a ratio of the fractal dimension of a network of cities to the average dimension of city population. Accordingly, the Zipf exponent is the reciprocal of this dimension ratio. On a digital map, the Pareto exponent can be defined by the scaling relation between a map scale and the corresponding number of cities based on this scale. The cities of the United States of America in 1900, 1940, 1960, and 1980 and Indian cities in 1981...
Design of high performance multilayer microwave absorbers using fast Pareto genetic algorithm
JIANG LiYong; LI XiangYin; ZHANG Jie
2009-01-01
The application of the Non-dominated Sorting Genetic Algorithm II (NSGA-II) in designing microwave absorbers is described in this paper. To obtain high performance coatings, we put forward three cost functions, which represent the three objectives of strong absorption, broad bandwidth and thin structure, and study the tradeoffs between them. Numerical calculations on available materials in 2-18 GHz are implemented to construct the Pareto front and Pareto-optimal surface for two and three objectives, respectively. Results indicate that the NSGA-II can work more efficiently and effectively than traditional Pareto genetic algorithms. Additionally, we present several particular designs from the above Pareto front (surface) for potential applications in different frequency bands. For example, a four-layer absorber with a thickness of 2.8071 mm is obtained to provide an average reflection coefficient of -11.95 dB and an average reflection bandwidth of 0.5780 in 2-18 GHz, considering arbitrary incident angles (0°-89°) and both TE and TM polarizations.
Searching for the Pareto frontier in multi-objective protein design.
Nanda, Vikas; Belure, Sandeep V; Shir, Ofer M
2017-08-10
The goal of protein engineering and design is to identify sequences that adopt three-dimensional structures of desired function. Often, this is treated as a single-objective optimization problem, identifying the sequence-structure solution with the lowest computed free energy of folding. However, many design problems are multi-state, multi-specificity, or otherwise require concurrent optimization of multiple objectives. There may be tradeoffs among objectives, where improving one feature requires compromising another. The challenge lies in determining solutions that are part of the Pareto optimal set: designs where no further improvement can be achieved in any of the objectives without degrading one of the others. Pareto optimality problems are found in all areas of study, from economics to engineering to biology, and computational methods have been developed specifically to identify the Pareto frontier. We review progress in multi-objective protein design, the development of Pareto optimization methods, and present a specific case study using multi-objective optimization methods to model the tradeoff between three parameters, stability, specificity, and complexity, of a set of interacting synthetic collagen peptides.
Angela Maria Zocchi
2017-08-01
To speak of Pareto today means dealing with a classic of sociology that has been the object of a discontinuous reception, not only in Europe but also, perhaps above all, in America. If in the 1920s James Harvey Robinson wrote that Pareto's theory of residues and derivations could be counted among the great scientific discoveries, it is also true that the figure of Pareto aroused reservations and perplexity, and that his merits "were recognized only ten years after his death, and moreover in America" (Coser [1977] 1983, p. 582), in particular at Harvard (Femia, Marshall 2012). Consider, for example, Parsons, who in the 1930s, in The Structure of Social Action, drew extensively on Pareto (Parsons [1937] 1968), or Merton, who, although never much attracted by Paretian analyses (cf. Coser 1975, p. 96; Coser [1977] 1983, p. 585), recalled in an interview with Anna Di Lellio the great interest with which he followed Henderson's seminars on Pareto (cf. Di Lellio 1985, p. 17). Indeed, in the 1930s the United States was extremely receptive to Pareto's work. Subsequently, however, this interest diminished considerably, only to revive in the 1950s and 1960s, not only in America but also in Europe, as attested, among other things, by the attention devoted to Pareto by Raymond Aron ([1967] 1989). A discontinuous reception, then, which also characterized the following decades and the new millennium (cf. Federici 1991, 1999, 2016), with studies that have also tried to answer a disquieting question (cf. Cirillo 1983; Femia and Marshall 2012): had Pareto lived longer, would he have opposed fascism? That said, starting from the distinction between the two senses of reception ("ricezione" and "recezione"), the paper intends to structure a reflection on the reception of Pareto in the United States. I will not dwell ...
蒲勇健; 杨哲
2012-01-01
This paper considers noncooperative multi-objective games with arbitrarily many players (multi-objective large games). Based on the concept of Berge equilibrium in general noncooperative games, we define the weakly Pareto-Berge equilibrium in multi-objective large games. The section theorem is further generalized to obtain a new section theorem, which is then used to prove the existence of weakly Pareto-Berge equilibria in multi-objective large games. The existence of weakly Pareto-Nash equilibria in multi-objective large games follows as a special case.
Trade-off bounds for the Pareto surface approximation in multi-criteria IMRT planning
Serna, J I; Monz, M; Kuefer, K H [Department of Optimization, Fraunhofer Institute for Industrial Mathematics (ITWM), Fraunhofer Platz 1, 67663 Kaiserslautern (Germany); Thieke, C [Clinical Cooperation Unit Radiation Oncology, German Cancer Research Center, and Department of Radiation Oncology, University Clinic Heidelberg, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany)], E-mail: serna@itwm.fhg.de
2009-10-21
One approach to multi-criteria IMRT planning is to automatically calculate a data set of Pareto-optimal plans for a given planning problem in a first phase, and then interactively explore the solution space and decide on the clinically best treatment plan in a second phase. The challenge of computing the plan data set is to ensure that all clinically meaningful plans are covered and that as many clinically irrelevant plans as possible are excluded to keep computation times within reasonable limits. In this work, we focus on the approximation of the clinically relevant part of the Pareto surface, the process that constitutes the first phase. It is possible that two plans on the Pareto surface have a small, clinically insignificant difference in one criterion and a significant difference in another criterion. For such cases, only the plan that is clinically clearly superior should be included into the data set. To achieve this during the Pareto surface approximation, we propose to introduce bounds that restrict the relative quality between plans, the so-called trade-off bounds. We show how to integrate these trade-off bounds into the approximation scheme and study their effects. The proposed scheme is applied to two artificial cases and one clinical case of a paraspinal tumor. For all cases, the quality of the Pareto surface approximation is measured with respect to the number of computed plans, and the range of values occurring in the approximation for different criteria is compared. Through enforcing trade-off bounds, the scheme disregards clinically irrelevant plans during the approximation. Thereby, the number of plans necessary to achieve a good approximation quality can be significantly reduced. Thus, trade-off bounds are an effective tool to focus the planning and to reduce computation time.
Ranking of microRNA target prediction scores by Pareto front analysis.
Sahoo, Sudhakar; Albrecht, Andreas A
2010-12-01
Over the past ten years, a variety of microRNA target prediction methods have been developed, and many of these methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs related to a specific disease or cell type. For experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and, unlike standard thresholding methods, utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score, with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts.
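The core of the scheme, a two-objective Pareto front with recurrent re-ranking, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the `pareto_front` and `recurrent_ranking` helpers, the toy scores, and the generic tiebreak score standing in for STarMir are all assumptions.

```python
def pareto_front(points):
    """Indices of points not dominated by any other point.
    Both prediction scores are treated as to-be-maximized (an assumption)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            q[0] >= p[0] and q[1] >= p[1] and q != p
            for j, q in enumerate(points) if j != i
        )
        if not dominated:
            front.append(i)
    return front

def recurrent_ranking(scores, tiebreak):
    """Rank items by repeatedly extracting the Pareto front, taking the
    front member with the best tiebreak score, removing it, and
    recomputing the front over the remaining items."""
    remaining = list(range(len(scores)))
    ranking = []
    while remaining:
        pts = [scores[i] for i in remaining]
        front = pareto_front(pts)
        best = max(front, key=lambda k: tiebreak[remaining[k]])
        ranking.append(remaining.pop(best))
    return ranking
```

With two score vectors per mRNA (e.g. PITA and RNAhybrid) and a tiebreak vector, `recurrent_ranking` returns indices in processing order, front by front.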
Pareto navigation: algorithmic foundation of interactive multi-criteria IMRT planning.
Monz, M; Küfer, K H; Bortfeld, T R; Thieke, C
2008-02-21
Inherently, IMRT treatment planning involves compromising between different planning goals. Multi-criteria IMRT planning directly addresses this compromising and thus makes it more systematic. Usually, several plans are computed from which the planner selects the most promising following a certain procedure. Applying Pareto navigation for this selection step simultaneously increases the variety of planning options and eases the identification of the most promising plan. Pareto navigation is an interactive multi-criteria optimization method that consists of the two navigation mechanisms 'selection' and 'restriction'. The former allows the formulation of wishes whereas the latter allows the exclusion of unwanted plans. They are realized as optimization problems on the so-called plan bundle -- a set constructed from pre-computed plans. They can be approximately reformulated so that their solution time is a small fraction of a second. Thus, the user can be provided with immediate feedback regarding his or her decisions. Pareto navigation was implemented in the MIRA navigator software and allows real-time manipulation of the current plan and the set of considered plans. The changes are triggered by simple mouse operations on the so-called navigation star and lead to real-time updates of the navigation star and the dose visualizations. Since any Pareto-optimal plan in the plan bundle can be found with just a few navigation operations the MIRA navigator allows a fast and directed plan determination. Besides, the concept allows for a refinement of the plan bundle, thus offering a middle course between single plan computation and multi-criteria optimization. Pareto navigation offers so far unmatched real-time interactions, ease of use and plan variety, setting it apart from the multi-criteria IMRT planning methods proposed so far.
Comparison of Two Methods Used to Model Shape Parameters of Pareto Distributions
Liu, C.; Charpentier, R.R.; Su, J.
2011-01-01
Two methods are compared for estimating the shape parameters of Pareto field-size (or pool-size) distributions for petroleum resource assessment. Both methods assume mature exploration in which most of the larger fields have been discovered. Both methods use the sizes of larger discovered fields to estimate the numbers and sizes of smaller fields: (1) the tail-truncated method uses a plot of field size versus size rank, and (2) the log-geometric method uses data binned in field-size classes and the ratios of adjacent bin counts. Simulation experiments were conducted using discovered oil and gas pool-size distributions from four petroleum systems in Alberta, Canada, and using Pareto distributions generated by Monte Carlo simulation. The estimates of the shape parameters of the Pareto distributions, calculated by both the tail-truncated and log-geometric methods, generally stabilize where discovered pool numbers are greater than 100. However, with fewer than 100 discoveries, these estimates can vary greatly with each new discovery. The estimated shape parameters of the tail-truncated method are more stable and larger than those of the log-geometric method where the number of discovered pools is more than 100. Both methods, however, tend to underestimate the shape parameter. Monte Carlo simulation was also used to create sequences of discovered pool sizes by sampling from a Pareto distribution with a discovery process model using a defined exploration efficiency (in order to show how biased the sampling was in favor of larger fields being discovered first). A higher (more biased) exploration efficiency gives better estimates of the Pareto shape parameters. © 2011 International Association for Mathematical Geosciences.
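The rank-plot idea behind the tail-truncated method can be illustrated with a minimal sketch: for Pareto-distributed sizes, the slope of log(size) versus log(rank) is approximately -1/shape, so a least-squares fit on the rank plot recovers the shape parameter. The function below is a deliberate simplification (no truncation correction), not the published estimator, and the synthetic pool sizes are an assumption.

```python
import math
import random

def pareto_shape_from_ranksize(sizes):
    """Estimate the Pareto shape from the slope of a log(size) vs log(rank)
    least-squares fit (simplified sketch; the published tail-truncated
    method adds a truncation correction that is omitted here)."""
    xs = sorted(sizes, reverse=True)
    lr = [math.log(rank) for rank in range(1, len(xs) + 1)]
    ls = [math.log(s) for s in xs]
    n = len(xs)
    mr, ms = sum(lr) / n, sum(ls) / n
    slope = (sum((a - mr) * (b - ms) for a, b in zip(lr, ls))
             / sum((a - mr) ** 2 for a in lr))
    return -1.0 / slope  # slope is roughly -1/shape for a Pareto tail

# Synthetic "pool sizes" drawn from a Pareto distribution with shape 1.2
random.seed(1)
sizes = [random.paretovariate(1.2) for _ in range(2000)]
estimate = pareto_shape_from_ranksize(sizes)
```

A plain least-squares fit on the full rank plot is biased for small samples, which mirrors the abstract's observation that estimates only stabilize beyond roughly 100 discoveries.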
Scheufen, Martin; Natemeyer, Hendrik; Surmann, Yvonne; Schnettler, Armin [RWTH Aachen Univ. (Germany). Inst. fuer Hochspannungstechnik
2012-07-01
This paper presents a methodology that generates transmission grid expansion options and assesses them under the competing interests of technical, economic, and ecological criteria. The results presented are based on a multi-objective evolutionary optimization approach (genetic algorithm, NSGA-II) in which each individual of the population represents a grid expansion pattern. In addition to line additions, power-flow-controlling elements such as FACTS devices and PSTs are considered as network expansion options. The results are demonstrated on a modified standard grid model. The comparative assessment of the specific expansion options uses a modified optimal power flow method that takes into account characteristically spatio-temporally correlated supply and load patterns. (orig.)
Hunt, Tam
2015-01-01
Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution—both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place—has been sustained and heated. A growing share of this de...
Alvaro Bianchi
2012-04-01
Full Text Available Although they worked in different fields of social interpretation, Antonio Gramsci and Vilfredo Pareto are both part of a Machiavellian tradition of political studies that carries with it considerable thematic continuities and affinities in the overall formulation of political concepts. This is especially visible with regard to the two main topics examined in this article: the methodology of political science and the distinction between the governed and those who govern. Pareto proposes a science free from fictional ideals, founded on empirical, historical observation. Gramsci, on the other hand, thought that a political science could not but be founded on the understanding that any social theory must necessarily be part of the field of relations constituted by the social forces implicit in the dialectic between structure and superstructure.
凌莉芸; 凌晨
2016-01-01
For a class of eigenvalue complementarity problems with strictly semi-positive tensors, we study the sign properties of the Pareto eigenvalues. On this basis, using the constant definition and the operator definition of strictly semi-positive tensors, we obtain upper and lower bounds on the Pareto eigenvalues of the eigenvalue complementarity problem with strictly semi-positive tensors.
杨艳秋; 宋立新
2011-01-01
This study discusses the formula for the expected value of sampling information (EVSI) in the two-action linear decision problem under the model in which the uniform distribution U(0, θ) is conjugate to the Pareto distribution (the Pareto-U model).
Fuzzy Preference Incorporated Evolutionary Algorithm for Multiobjective Optimization
Surafel Luleseged Tilahun
2011-01-01
Full Text Available Multiobjective evolutionary methods overcome a limitation of the classical methods by finding multiple solutions within a single run of the solution procedure. The aim of a solution method for a multiobjective optimization problem is to help the decision maker obtain the best solution. Usually the decision maker is not interested in a diverse set of Pareto optimal points, so it is necessary to incorporate the decision maker's preference so that the algorithm yields alternative solutions around that preference. The difficulty in incorporating the decision maker's preference is that the decision maker may not have a solid guideline for comparing tradeoffs between objectives; however, it is easy for the decision maker to compare them in a fuzzy way. This paper discusses incorporating fuzzy tradeoffs into an evolutionary algorithm to home in on the region where the decision maker's preference lies. Using test functions, it is shown that the algorithm can produce points in the region of the Pareto front where the decision maker's interest lies.
Interactive Evolutionary Multi-Objective Optimization Algorithm Using Cone Dominance
Dalaijargal Purevsuren; Saif ur Rehman; Gang Cui; Jianmin Bao; Nwe Nwe Htay Win
2015-01-01
As the number of objectives increases, the performance of Pareto dominance-based Evolutionary Multi-objective Optimization (EMO) algorithms such as NSGA-II and SPEA2 severely deteriorates due to the drastic increase in Pareto-incomparable solutions. We propose a sorting method which classifies these incomparable solutions into several ordered classes by using the decision maker's (DM) preference information. This is accomplished by designing an interactive evolutionary algorithm and constructing convex cones. This method allows the DM to drive the search process toward a preferred region of the Pareto optimal front. The performance of the proposed algorithm is assessed for two-, three-, and four-objective knapsack problems. The results demonstrate the algorithm's ability to converge to the most preferred point. The evaluation and comparison of the results indicate that the proposed approach gives better solutions than NSGA-II. In addition, the approach is more efficient than NSGA-II in terms of the number of generations required to reach the preferred point.
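The idea of widening the dominance relation with a convex cone can be sketched briefly. This is a generic polyhedral-cone formulation, not the paper's exact construction: the matrix `A` encoding the cone is a hypothetical example.

```python
def cone_dominates(a, b, A):
    """True if b - a lies in the polyhedral cone {d : A @ d >= 0}, i.e.,
    a cone-dominates b under minimization (A is a list of row vectors).
    With A = identity this reduces to ordinary Pareto dominance; rows
    with positive off-diagonal entries widen the cone, making more
    solution pairs comparable."""
    d = [bi - ai for ai, bi in zip(a, b)]
    return a != b and all(
        sum(r * di for r, di in zip(row, d)) >= 0 for row in A
    )
```

For example, (1, 2) and (3, 1) are Pareto-incomparable under the identity cone, but a widened cone such as [[1, 0.5], [0.5, 1]] trades off half a unit of one objective per unit of the other and ranks (1, 2) ahead.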
杨哲; 蒲勇健; 郭心毅
2013-01-01
In this paper, under the assumption that the domain of the uncertain parameters is known, we study the existence of weakly Pareto-NS equilibrium points in multi-objective games. Combining the concept of NS-equilibrium in noncooperative games, we first introduce the concept of weakly Pareto-NS equilibrium points, and then prove the existence of weakly Pareto-NS equilibrium points in multi-objective games under uncertainty by means of the Fan-Glicksberg fixed point theorem. Finally, a numerical example is given to verify feasibility.
A multiobjective optimization algorithm is applied to a groundwater quality management problem involving remediation by pump-and-treat (PAT). The multiobjective optimization framework uses the niched Pareto genetic algorithm (NPGA) and is applied to simultaneously minimize the...
Reed, P. M.; Kollat, J. B.
2005-12-01
This study demonstrates the effectiveness of a modified version of Deb's Non-Dominated Sorted Genetic Algorithm II (NSGAII), which the authors have named the Epsilon-Dominance Non-Dominated Sorted Genetic Algorithm II (Epsilon-NSGAII), at solving a four-objective long-term groundwater monitoring (LTM) design test case. The Epsilon-NSGAII incorporates prior theoretical competent evolutionary algorithm (EA) design concepts and epsilon-dominance archiving to improve the original NSGAII's efficiency, reliability, and ease of use. This algorithm eliminates much of the traditional trial-and-error parameterization associated with evolutionary multi-objective optimization (EMO) through epsilon-dominance archiving, dynamic population sizing, and automatic termination. The effectiveness and reliability of the new algorithm are compared to the original NSGAII as well as to two other benchmark multi-objective evolutionary algorithms (MOEAs): the Epsilon-Dominance Multi-Objective Evolutionary Algorithm (Epsilon-MOEA) and the Strength Pareto Evolutionary Algorithm 2 (SPEA2). These MOEAs were selected because they have been demonstrated to be highly effective at solving numerous multi-objective problems. The results presented in this study indicate superior performance of the Epsilon-NSGAII in terms of the hypervolume indicator, unary epsilon-indicator, and first-order empirical attainment function metrics. In addition, the runtime metric results indicate that the diversity and convergence dynamics of the Epsilon-NSGAII are competitive with or superior to those of the SPEA2, with both algorithms greatly outperforming the NSGAII and Epsilon-MOEA in terms of these metrics. The improvements in performance of the Epsilon-NSGAII over its parent algorithm, the NSGAII, demonstrate that epsilon-dominance archiving, dynamic population sizing with archive injection, and automatic termination greatly improve algorithm efficiency and reliability. In addition, the usability of
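A minimal sketch of the epsilon-dominance archiving that algorithms like the Epsilon-NSGAII rely on, assuming minimization of all objectives: the objective space is divided into boxes of side epsilon, at most one solution is kept per box, and boxes dominated by another occupied box are discarded. The in-box tiebreak by objective sum is a simplification of the usual distance-to-box-corner rule.

```python
def eps_box(point, eps):
    """Identify the epsilon-box a point falls into (minimization)."""
    return tuple(int(p // eps) for p in point)

def eps_archive(points, eps):
    """Epsilon-dominance archive: keep one representative per box and
    drop boxes dominated by another occupied box (minimization)."""
    boxes = {}
    for p in points:
        b = eps_box(p, eps)
        # keep the better in-box point (simplified tiebreak: objective sum)
        if b not in boxes or sum(p) < sum(boxes[b]):
            boxes[b] = p
    keep = []
    for b, p in boxes.items():
        dominated = any(
            all(qb <= pb for qb, pb in zip(c, b)) and c != b
            for c in boxes
        )
        if not dominated:
            keep.append(p)
    return keep
```

The grid resolution `eps` bounds the archive size, which is what removes the trial-and-error archive-size parameterization mentioned above.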
Evaluation of the Effectiveness of Engineering Production Processes using Pareto Analysis
Darina Matisková
2015-02-01
Full Text Available The aim of this paper is to illustrate the possibilities of using the Pareto method to evaluate the effectiveness of engineering production processes. The core of the study is the cutting (dividing) of material with progressive technologies to produce a specific component, and the evaluation of the effectiveness and quality of that process. The component was produced by plasma, laser, and water-jet cutting. The Pareto method was used to eliminate irregularities in the quality of the resulting component. The aim was to determine, from the available technical knowledge, the most efficient method using an established efficiency-evaluation model. The result is the finding that the most effective device for cutting the chosen component is the plasma device.
Akbar A. Tabriz
2011-07-01
Full Text Available Concurrent engineering (CE) is one of the most widely known techniques for simultaneous planning of product and process design. In concurrent engineering, design processes are often complicated by multiple conflicting criteria and discrete sets of feasible alternatives, so multi-criteria decision making (MCDM) techniques are integrated into CE to perform concurrent design. This paper proposes a design framework governed by MCDM techniques, in which the criteria conflict in the sense of competing for common resources to achieve various performance objectives (financial, functional, environmental, etc.). The Pareto MCDM model is applied to a polyethylene pipe concurrent design problem governed by four criteria to determine the best alternative, the Pareto-compromise design.
Prediction in Partial Duration Series With Generalized Pareto-Distributed Exceedances
Rosbjerg, Dan; Madsen, Henrik; Rasmussen, Peter Funder
1992-01-01
As a generalization of the common assumption of exponentially distributed exceedances in partial duration series, the generalized Pareto distribution has been adopted. Estimators for the parameters are presented using estimation by both the method of moments and probability-weighted moments. The corresponding estimators for the T-year event are given, and approximate expressions for the bias and variance of the estimators are derived in both cases. Using the mean square error of the T-year event estimator as a performance index, it is shown that the method of moments is preferable to the probability-weighted moments. … It is shown, in terms of the mean square errors of the T-year event estimators, that the exponential distribution is preferable to the correct generalized Pareto distribution despite the introduced model error and despite a possible rejection of the exponential hypothesis by a test of significance. For moderately short-tailed exceedance…
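The method-of-moments estimators mentioned above have a simple closed form. The sketch below uses the textbook parameterization F(x) = 1 - (1 + k*x/s)^(-1/k) with shape k < 1/2, which may differ in sign convention from the variant used in the paper.

```python
import random

def gpd_method_of_moments(xs):
    """Method-of-moments estimators for the generalized Pareto distribution
    F(x) = 1 - (1 + k*x/s)**(-1/k): the sample mean m and variance v give
    k = (1 - m**2/v)/2 and s = m*(1 + m**2/v)/2 (requires k < 1/2)."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / (n - 1)
    k = 0.5 * (1.0 - m * m / v)
    s = 0.5 * m * (1.0 + m * m / v)
    return k, s

# Exponential exceedances (scale 2) are the GPD limit k -> 0, s = 2,
# which is the special case the abstract compares against
random.seed(7)
exceedances = [random.expovariate(0.5) for _ in range(5000)]
k_hat, s_hat = gpd_method_of_moments(exceedances)
```

When m**2/v is close to 1 the fitted shape is near zero, recovering the exponential special case discussed in the abstract.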
Pareto evolution of gene networks: an algorithm to optimize multiple fitness objectives.
Warmflash, Aryeh; Francois, Paul; Siggia, Eric D
2012-10-01
The computational evolution of gene networks functions like a forward genetic screen to generate, without preconceptions, all networks that can be assembled from a defined list of parts to implement a given function. Frequently networks are subject to multiple design criteria that cannot all be optimized simultaneously. To explore how these tradeoffs interact with evolution, we implement Pareto optimization in the context of gene network evolution. In response to a temporal pulse of a signal, we evolve networks whose output turns on slowly after the pulse begins, and shuts down rapidly when the pulse terminates. The best performing networks under our conditions do not fall into categories such as feed forward and negative feedback that also encode the input-output relation we used for selection. Pareto evolution can more efficiently search the space of networks than optimization based on a single ad hoc combination of the design criteria.
Houghton, J.C.
1988-01-01
The truncated shifted Pareto (TSP) distribution, a variant of the two-parameter Pareto distribution, in which one parameter is added to shift the distribution right or left and the right-hand side is truncated, is used to model size distributions of oil and gas fields for resource assessment. Assumptions about limits to the left-hand and right-hand side reduce the number of parameters to two. The TSP distribution has advantages over the more customary lognormal distribution because it has a simple analytic expression, allowing exact computation of several statistics of interest, has a "J-shape," and has more flexibility in the thickness of the right-hand tail. Oil field sizes from the Minnelusa play in the Powder River Basin, Wyoming and Montana, are used as a case study. Probability plotting procedures allow easy visualization of the fit and help the assessment. © 1988 International Association for Mathematical Geology.
Efficiency of Pareto joint inversion of 2D geophysical data using global optimization methods
Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek
2016-04-01
Pareto joint inversion of two or more sets of data is a promising new tool of modern geophysical exploration. In the first stage of our investigation we created software enabling the execution of forward solvers for two geophysical methods (2D magnetotellurics and gravity) as well as inversion, with the possibility of constraining the solution with seismic data. The MT forward solver applies Helmholtz's equations, the finite element method, and Dirichlet boundary conditions; the gravity forward solver is based on Talwani's algorithm. To limit the dimensionality of the solution space, we describe the model as sets of polygons, using the Sharp Boundary Interface (SBI) approach. The main inversion engine was created using a Particle Swarm Optimization (PSO) algorithm adapted to handle two or more target functions and to reject solutions that are unrealistic or incompatible with the Pareto scheme. Each inversion run generates a single Pareto solution, which can be added to the Pareto front. The PSO inversion engine was parallelized using the OpenMP standard, which enables executing the code with a practically unlimited number of threads at once; this significantly decreased the computing time of the inversion process, and computing efficiency increases with the number of PSO iterations. In this contribution we analyze the efficiency of the created software, taking into consideration the details of the chosen global optimization engine used as the main joint minimization engine. Additionally, we study how much computational time can be saved by the different parallelization methods applied to both the forward solvers and the inversion algorithm. All tests were done for 2D magnetotelluric and gravity data based on real geological media. The obtained results show that, even on relatively simple mid-range computational infrastructure, the proposed solution of the inversion problem can be applied in practice and used for real-life problems of geophysical inversion and interpretation.
Time consistent Pareto solutions in common access resource gameswith asymmetric players
De-Paz, Albert; Marín Solano, Jesús; Navas, Jorge
2011-01-01
In the analysis of equilibrium policies in a differential game, if agents have different time preference rates, the cooperative (Pareto optimum) solution obtained by applying Pontryagin's Maximum Principle becomes time inconsistent. In this work we derive a set of dynamic programming equations (in discrete and continuous time) whose solutions are time-consistent equilibrium rules for N-player cooperative differential games in which agents differ in their instantaneous utility functions and a...
Application of isa and pareto diagram as management of the plots Lagoa Carapebus Serra / ES
Neumann, Bruna; Calmon,Ana Paula Santos; Aguiar, Marluce Martins
2013-01-01
Application of the Environmental Health Indicator (ISA), followed by the elaboration of a Pareto diagram, made it possible to verify the sanitary and environmental conditions of the Lagoa Carapebus plots, using primary data (field information) and forms applied to the local community. These management tools cover qualitative and quantitative aspects of public services. The final ISA indicated a situation of average health, because high scores of some components of the ISA provided th...
Pareto-Efficient Target by Obtaining the Facets of the Efficient Frontier in DEA
Washio, Satoshi; Yamada, Syuuji; Tanaka, Tamaki; TANINO, Tetsuzo
2011-01-01
In this paper, we propose an algorithm that calculates an improvement target for each inefficient DMU in the CCR model by computing all the equations forming the facets of the efficient frontier. By introducing a parameter into the algorithm, we calculate either a minimal-distance point or a Pareto-efficient point on the efficient frontier as the improvement target. The improvement targets are obtained by solving quadratic mathematical problems.
The Forbes 400, the Pareto power-law and efficient markets
Klass, O. S.; Biham, O.; Levy, M.; Malcai, O.; Solomon, S.
2007-01-01
Statistical regularities at the top end of the wealth distribution in the United States are examined using the Forbes 400 lists of richest Americans, published between 1988 and 2003. It is found that the wealths are distributed according to a power-law (Pareto) distribution. This result is explained using a simple stochastic model of multiple investors that incorporates the efficient market hypothesis as well as the multiplicative nature of financial market fluctuations.
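The power-law fit reported for the Forbes data can be illustrated with the standard maximum-likelihood (Hill) estimator for the Pareto exponent above a threshold. The synthetic wealth data below is an assumption standing in for the Forbes list, and the estimator is the generic textbook formula rather than the paper's exact procedure.

```python
import math
import random

def pareto_alpha_mle(values, xmin):
    """Hill/MLE estimate of the Pareto exponent alpha for the tail
    x >= xmin: alpha_hat = n / sum(log(x_i / xmin))."""
    tail = [x for x in values if x >= xmin]
    return len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic "wealths" drawn from a Pareto law with exponent 1.5
random.seed(3)
wealths = [random.paretovariate(1.5) for _ in range(10000)]
alpha_hat = pareto_alpha_mle(wealths, xmin=1.0)
```

The choice of `xmin` matters in practice: for a top-wealth list such as the Forbes 400, the listing cutoff itself acts as the threshold.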
Optimal Reinsurance Design for Pareto Optimum: From the Perspective of Multiple Reinsurers
Xing Rong
2016-01-01
Full Text Available This paper investigates optimal reinsurance strategies for an insurer which cedes the insured risk to multiple reinsurers. Assuming that the insurer and every reinsurer apply coherent risk measures, we derive necessary and sufficient conditions for the reinsurance market to achieve a Pareto optimum; that is, every ceded-loss function and the retention function take the form of "multiple-layer reinsurance."
Improving predicted protein loop structure ranking using a Pareto-optimality consensus method
Jakobsson Eric
2010-07-01
Full Text Available Background: Accurate protein loop structure models are important to understand the functions of many proteins. Identifying native or near-native models by distinguishing them from misfolded ones is a critical step in protein loop structure prediction. Results: We have developed a Pareto Optimal Consensus (POC) method, a consensus model-ranking approach that integrates multiple knowledge- or physics-based scoring functions. The procedure for identifying the best-quality models in a model set includes: (1) identifying the models at the Pareto optimal front with respect to a set of scoring functions, and (2) ranking them based on the fuzzy dominance relationship to the rest of the models. We apply the POC method to a large number of decoy sets for loops of 4 to 12 residues in length, using a functional space composed of several carefully selected scoring functions: Rosetta, DOPE, DDFIRE, OPLS-AA, and a triplet backbone dihedral potential developed in our lab. Our computational results show that the sets of Pareto-optimal decoys, which are typically composed of ~20% or less of the overall decoys in a set, have a good coverage of the best or near-best decoys in more than 99% of the loop targets. Compared to the individual scoring function yielding the best selection accuracy in the decoy sets, the POC method yields 23%, 37%, and 64% fewer false positives in distinguishing the native conformation and identifying a near-native model (RMSD … Conclusions: By integrating multiple knowledge- and physics-based scoring functions based on Pareto optimality and fuzzy dominance, the POC method is effective in distinguishing the best loop models from the other ones within a loop model set.
He, Lu; Friedman, Alan M; Bailey-Kellogg, Chris
2012-03-01
In developing improved protein variants by site-directed mutagenesis or recombination, there are often competing objectives that must be considered in designing an experiment (selecting mutations or breakpoints): stability versus novelty, affinity versus specificity, activity versus immunogenicity, and so forth. Pareto optimal experimental designs make the best trade-offs between competing objectives. Such designs are not "dominated"; that is, no other design is better than a Pareto optimal design for one objective without being worse for another objective. Our goal is to produce all the Pareto optimal designs (the Pareto frontier), to characterize the trade-offs and suggest designs most worth considering, but to avoid explicitly considering the large number of dominated designs. To do so, we develop a divide-and-conquer algorithm, Protein Engineering Pareto FRontier (PEPFR), that hierarchically subdivides the objective space, using appropriate dynamic programming or integer programming methods to optimize designs in different regions. This divide-and-conquer approach is efficient in that the number of divisions (and thus calls to the optimizer) is directly proportional to the number of Pareto optimal designs. We demonstrate PEPFR with three protein engineering case studies: site-directed recombination for stability and diversity via dynamic programming, site-directed mutagenesis of interacting proteins for affinity and specificity via integer programming, and site-directed mutagenesis of a therapeutic protein for activity and immunogenicity via integer programming. We show that PEPFR is able to effectively produce all the Pareto optimal designs, discovering many more designs than previous methods. The characterization of the Pareto frontier provides additional insights into the local stability of design choices as well as global trends leading to trade-offs between competing criteria. Copyright © 2011 Wiley Periodicals, Inc.
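The dominance relation used above ("no other design is better for one objective without being worse for another") is mechanical enough to sketch. The following is a minimal, hypothetical illustration, not the PEPFR algorithm itself (which avoids enumerating dominated designs via divide-and-conquer): a brute-force filter that keeps only non-dominated points, assuming every objective is to be maximized.

```python
def pareto_front(designs):
    """Return the non-dominated designs; each design is a tuple of
    objective values, and every objective is to be maximized."""
    front = []
    for d in designs:
        # d is dominated if some other design is >= in every objective
        # and differs from d (i.e., strictly better in at least one).
        dominated = any(
            all(o >= v for o, v in zip(other, d)) and other != d
            for other in designs
        )
        if not dominated:
            front.append(d)
    return front

designs = [(3, 1), (2, 2), (1, 3), (1, 1), (2, 1)]
print(pareto_front(designs))  # (1, 1) and (2, 1) are dominated
```

Note the quadratic cost of this brute-force check, which is exactly what motivates approaches like PEPFR whose work scales with the number of Pareto-optimal designs rather than the number of dominated ones.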
Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun
2016-03-01
Optimal design of long term groundwater monitoring (LTGM) network often involves conflicting objectives and substantial uncertainty arising from insufficient hydraulic conductivity (K) data. This study develops a new multi-objective simulation-optimization model involving four objectives: minimizations of (i) the total sampling costs for monitoring contaminant plume, (ii) mass estimation error, (iii) the first moment estimation error, and (iv) the second moment estimation error of the contaminant plume, for LTGM network design problems. Then a new probabilistic Pareto genetic algorithm (PPGA) coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, is developed to search for the Pareto-optimal solutions to the multi-objective LTGM problems under uncertainty of the K-fields. The PPGA integrates the niched Pareto genetic algorithm with probabilistic Pareto sorting scheme to deal with the uncertainty of objectives caused by the uncertain K-field. Also, the elitist selection strategy, the operation library and the Pareto solution set filter are conducted to improve the diversity and reliability of Pareto-optimal solutions by the PPGA. Furthermore, the sampling strategy of noisy genetic algorithm is adopted to cope with the uncertainty of the K-fields and improve the computational efficiency of the PPGA. In particular, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology in finding Pareto-optimal sampling network designs of LTGM systems through a two-dimensional hypothetical example and a three-dimensional field application in Indiana (USA). Comprehensive analysis demonstrates that the proposed PPGA can find Pareto optimal solutions with low variability and high reliability and is a promising tool for optimizing multi-objective LTGM network designs under uncertainty.
周杰; 卓芳; 黄磊; 罗艳
2015-01-01
To obtain optimal process parameters for stamping forming, finite element analysis and optimization techniques were integrated by transforming the multi-objective problem into a single-objective one, and a Pareto-based genetic algorithm was applied to optimize the head stamping forming process. In the proposed optimization model, fracture, wrinkling and thickness variation are functions of several factors, such as fillet radius, draw-bead position, blank size and blank-holding force. It is therefore necessary to investigate the relationship between the objective functions and the variables so that all objective functions can be minimized simultaneously. First, a central composite design (CCD) experiment with four factors and five levels was applied and the corresponding experimental data were acquired. Then, a response surface model (RSM) was set up, and the analysis of variance (ANOVA) showed that the response surface model reliably predicts the fracture, wrinkling and thickness-variation functions. Finally, the Pareto-based genetic algorithm was used to find a Pareto front that minimizes fracture, wrinkling and thickness variation together. A head stamping case indicates that the present method has higher precision and practicability compared with the “trial and error” procedure.
Ajibade Oluwaseyi Ayodele
2016-01-01
In this second part of our discussion of tapped density optimisation for four agricultural wastes (particles of coconut, periwinkle, palm kernel and egg shells), a performance analysis is made for comparison. The paper pioneers a study direction in which process variables are optimised using the Taguchi method integrated with the Pareto 80-20 rule. Negative percentage improvements resulted when the optimal tapped density was compared with the average tapped density; however, comparing the optimal tapped density with the peak tapped density yielded positive percentage improvements for all four filler particles. These results validate the effectiveness of the Taguchi method in improving the tapped density properties of the filler particles. Applying the Pareto 80-20 rule to the table of parameters and levels produced revised tables that identify the factor-level position of each parameter that is economical to optimality. The Pareto 80-20 rule also produced revised S/N response tables, used to identify the S/N ratios relevant to optimality.
Pareto-optimal multi-objective design of airplane control systems
Schy, A. A.; Johnson, K. G.; Giesy, D. P.
1980-01-01
A constrained minimization algorithm for the computer aided design of airplane control systems to meet many requirements over a set of flight conditions is generalized using the concept of Pareto-optimization. The new algorithm yields solutions on the boundary of the achievable domain in objective space in a single run, whereas the older method required a sequence of runs to approximate such a limiting solution. However, Pareto-optimality does not guarantee a satisfactory design, since such solutions may emphasize some objectives at the expense of others. The designer must still interact with the program to obtain a well-balanced set of objectives. Using the example of a fighter lateral stability augmentation system (SAS) design over five flight conditions, several effective techniques are developed for obtaining well-balanced Pareto-optimal solutions. For comparison, one of these techniques is also used in a recently developed algorithm of Kreisselmeier and Steinhauser, which replaces the hard constraints with soft constraints, using a special penalty function. It is shown that comparable results can be obtained.
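The Kreisselmeier-Steinhauser penalty mentioned above replaces a set of hard constraints with a single smooth "soft" envelope. A minimal sketch of the scalar KS function (the draw-down parameter rho = 50 is an illustrative choice, not a value from the paper):

```python
import math

def ks_aggregate(constraints, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, conservative upper
    bound on max(g_i), used to fold many constraints into one penalty.
    Larger rho tracks the maximum more tightly."""
    m = max(constraints)  # shift by the max for numerical stability
    return m + math.log(sum(math.exp(rho * (g - m)) for g in constraints)) / rho

print(ks_aggregate([0.2, -0.1, 0.15]))  # slightly above max = 0.2
```

Because the envelope is differentiable and always at or above the true maximum, a single minimization of it conservatively enforces all the original constraints at once.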
Machwe, A. T.; Parmee, I. C.
2007-07-01
This article describes research relating to a user-centered evolutionary design system that evaluates both engineering and aesthetic aspects of design solutions during early-stage conceptual design. The experimental system comprises several components relating to user interaction, problem representation, evolutionary search and exploration and online learning. The main focus of the article is the evolutionary aspect of the system when using a single quantitative objective function plus subjective judgment of the user. Additionally, the manner in which the user-interaction aspect affects system output is assessed by comparing Pareto frontiers generated with and without user interaction via a multi-objective evolutionary algorithm (MOEA). A solution clustering component is also introduced and it is shown how this can improve the level of support to the designer when dealing with a complex design problem involving multiple objectives. Supporting results are from the application of the system to the design of urban furniture which, in this case, largely relates to seating design.
Evolutionary developmental psychology
King, Ashley C; Bjorklund, David F
2010-01-01
The field of evolutionary developmental psychology can potentially broaden the horizons of mainstream evolutionary psychology by combining the principles of Darwinian evolution by natural selection...
Pareto, Leopardi e il principio della supremazia: note sulla società contro-bilanciata
Andrea Lombardinilo
2017-08-01
The aim of the essay is to explore the sociocultural meaning of the metaphor of the “counter-balanced” society, developed by Leopardi in the satirical poem Paralipomeni della Batracomiomachia and taken up by Pareto in the Trattato di sociologia generale (chapter IX, paragraph 1508) with reference to the Balkan wars and the European crisis that preceded the First World War. The long footnote closing paragraph 1508 attests, on the one hand, to Pareto's literary knowledge (the quotations from Leopardi are interwoven with those from Dante) and, on the other, to his intention to explain political and social facts through the lessons of writers ancient and modern. In the foreground is Pareto's aim of exploring the evolution of social facts, in keeping with the permanent dialectic between residues and derivations, which are extremely relevant in a century marked by world conflicts and irremediable contrasts among the great international powers. The study of social and cultural facts through the literary evidence of the past is one of the defining traits of Pareto's work, inspired by the heuristic aim of capitalizing on the philosophical lessons of the great writers and thinkers. Leopardi is no exception, as the sociologist points out in chapter IX of his Trattato, which is in fact devoted to the study of social derivations and, specifically, to the role of force in the construction and safeguarding of social relations. For his part, Leopardi addresses these issues in his satirical poem, which sets against its background the eternal conflict between the mice and the crabs, symbolizing respectively the Italian liberals and the Austrian reactionary forces. The path toward social innovation is sustained by a deep desire for freedom, far removed from the prevailing idea of force, supremacy and abuse of power. It is a precious metaphorical lesson for the moderns, capitalized by…
A possibilistic approach to rotorcraft design through a multi-objective evolutionary algorithm
Chae, Han Gil
Most of the engineering design processes in use today in the field may be considered as a series of successive decision making steps. The decision maker uses information at hand, determines the direction of the procedure, and generates information for the next step and/or other decision makers. However, the information is often incomplete, especially in the early stages of the design process of a complex system. As the complexity of the system increases, uncertainties eventually become unmanageable using traditional tools. In such a case, the tools and analysis values need to be "softened" to account for the designer's intuition. One of the methods that deals with issues of intuition and incompleteness is possibility theory. Through the use of possibility theory coupled with fuzzy inference, the uncertainties estimated by the intuition of the designer are quantified for design problems. By involving quantified uncertainties in the tools, the solutions can represent a possible set, instead of a crisp spot, for predefined levels of certainty. From a different point of view, it is a well known fact that engineering design is a multi-objective problem or a set of such problems. The decision maker aims to find satisfactory solutions, sometimes compromising the objectives that conflict with each other. Once the candidates of possible solutions are generated, a satisfactory solution can be found by various decision-making techniques. A number of multi-objective evolutionary algorithms (MOEAs) have been developed, and can be found in the literature, which are capable of generating alternative solutions and evaluating multiple sets of solutions in one single execution of an algorithm. One of the MOEA techniques that has been proven to be very successful for this class of problems is the strength Pareto evolutionary algorithm (SPEA) which falls under the dominance-based category of methods. The Pareto dominance that is used in SPEA, however, is not enough to account for the
A multiobjective evolutionary algorithm to find community structures based on affinity propagation
Shang, Ronghua; Luo, Shuang; Zhang, Weitong; Stolkin, Rustam; Jiao, Licheng
2016-07-01
Community detection plays an important role in reflecting and understanding the topological structure of complex networks, and can be used to help mine the potential information in networks. This paper presents a Multiobjective Evolutionary Algorithm based on Affinity Propagation (APMOEA) which improves the accuracy of community detection. Firstly, APMOEA takes the method of affinity propagation (AP) to initially divide the network. To accelerate its convergence, the multiobjective evolutionary algorithm selects nondominated solutions from the preliminary partitioning results as its initial population. Secondly, the multiobjective evolutionary algorithm finds solutions approximating the true Pareto optimal front through constantly selecting nondominated solutions from the population after crossover and mutation in iterations, which overcomes the tendency of data clustering methods to fall into local optima. Finally, APMOEA uses an elitist strategy, called "external archive", to prevent degeneration during the process of searching using the multiobjective evolutionary algorithm. According to this strategy, the preliminary partitioning results obtained by AP will be archived and participate in the final selection of Pareto-optimal solutions. Experiments on benchmark test data, including both computer-generated networks and eight real-world networks, show that the proposed algorithm achieves more accurate results and has faster convergence speed compared with seven other state-of-the-art algorithms.
The Pareto Distribution with Unknown Location and Scale Parameters
Porter, James E., III (Captain, USAF)
1985-12-01
Thesis (AFIT/GSO/MA/85D-6), presented to the Faculty of the School of Engineering of the Air Force Institute of Technology; approved for public release. The surviving abstract fragment notes that the distribution is named after Vilfredo Pareto (1848-1923), a Swiss professor of economics who conducted the first extensive…
Immune Evolutionary Algorithm Reusing Excellent Genes of Antibody
杨观赐; 马鑫; 李少波; 钟勇; 于丽娅
2012-01-01
Based on the clonal selection principle and algorithm, the rationale for improving the clonal selection algorithm is expounded through the analysis of a specific phenomenon. A method is designed to extract excellent gene schemata from the antibody set to fill a memory pool, to package excellent gene segments, and, when updating the memory antibody population, to probabilistically replace low-affinity antibodies with high-affinity antibodies from the mutated antibody population, yielding an improved clonal selection algorithm that reuses excellent antibody gene segments. Drawing on the framework of the strength Pareto evolutionary algorithm, an immune evolutionary algorithm reusing excellent genes of antibody (RG-IEA) is proposed, which implements genetic operations such as selection, crossover and recombination by applying the improved clonal selection algorithm. Tests on a series of multi-objective 0/1 knapsack problems show that RG-IEA can effectively maintain population diversity and obtain high-quality solutions approximating the Pareto front.
La narrazione dell’azione sociale: spunti dal Trattato di Vilfredo Pareto
Ilaria Riccioni
2017-08-01
Rereading the classics always involves a twofold operation: on the one hand, a return to reflections, rhythms and historical moments that often seem already outdated; on the other, the rediscovery of the origins of contemporary phenomena from points of view that outline their deep interconnections, no longer visible at the stage of development in which we observe them today. This greater clarity is perhaps due to the fact that every phenomenon is more clearly identifiable in its dawning phase than in its later stages, where its primary characteristics tend to dissolve into the dominant patterns of contemporaneity, losing themselves in the everyday practices that conceal their origin. If sociology is a process of knowing the reality of phenomena, a central distinction must be drawn between those sciences that schematize the real into functional and functioning formal equations (the economic and normative systems) and the social sciences that deal with reality and its complexity, which as sciences must concern themselves not so much with what reality ought to be as with what reality is, how it presents itself, and how it manifests the deep, desiring movements of collective life beyond the system that manages its functioning. The point that Pareto seems to glimpse, with extreme lucidity, is the need to overturn the role of economic logic in social organization, from a science that dictates reality to a science that proposes a scheme for managing it: economics tries to dictate reality, but economics, from the modern Greek Oikòs, Oikòsgeneia (house and generation, the term used to define the family unit), is not in fact “reality”, Pareto seems to tell us in several digressions, but rather the art and science of managing family and productive units. Reality remains in shadow and can only be “approached” by a science that records it, and possibly…
Computing the Distribution of Pareto Sums Using Laplace Transformation and Stehfest Inversion
Harris, C. K.; Bourne, S. J.
2017-05-01
In statistical seismology, the properties of distributions of total seismic moment are important for constraining seismological models, such as the strain partitioning model (Bourne et al. J Geophys Res Solid Earth 119(12): 8991-9015, 2014). This work was motivated by the need to develop appropriate seismological models for the Groningen gas field in the northeastern Netherlands, in order to address the issue of production-induced seismicity. The total seismic moment is the sum of the moments of individual seismic events, which in common with many other natural processes, are governed by Pareto or "power law" distributions. The maximum possible moment for an induced seismic event can be constrained by geomechanical considerations, but rather poorly, and for Groningen it cannot be reliably inferred from the frequency distribution of moment magnitude pertaining to the catalogue of observed events. In such cases it is usual to work with the simplest form of the Pareto distribution without an upper bound, and we follow the same approach here. In the case of seismicity, the exponent β appearing in the power-law relation is small enough for the variance of the unbounded Pareto distribution to be infinite, which renders standard statistical methods concerning sums of statistical variables, based on the central limit theorem, inapplicable. Determinations of the properties of sums of moderate to large numbers of Pareto-distributed variables with infinite variance have traditionally been addressed using intensive Monte Carlo simulations. This paper presents a novel method for determining the properties of such sums that is accurate, fast and easily implemented, and is applicable to Pareto-distributed variables for which the power-law exponent β lies within the interval [0, 1]. It is based on shifting the original variables so that a non-zero density is obtained exclusively for non-negative values of the parameter and is identically zero elsewhere, a property…
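The traditional Monte Carlo route that the transform method above is designed to replace can be sketched in a few lines: draw unbounded Pareto variables by inverse-CDF sampling and tabulate quantiles of their sums. The parameter values and function names below are illustrative, not taken from the paper.

```python
import random

def pareto_sample(beta, x_min=1.0):
    """One draw from an unbounded Pareto distribution with tail exponent
    beta, via inverse-CDF sampling of P(X > x) = (x_min / x)**beta."""
    u = random.random()
    return x_min * (1.0 - u) ** (-1.0 / beta)

def pareto_sum_quantile(beta, n_terms, q, n_trials=20000, seed=0):
    """Monte Carlo estimate of the q-quantile of a sum of n_terms i.i.d.
    Pareto variables (variance is infinite for beta <= 2, so CLT-based
    approximations do not apply)."""
    random.seed(seed)
    sums = sorted(sum(pareto_sample(beta) for _ in range(n_terms))
                  for _ in range(n_trials))
    return sums[int(q * n_trials)]

# Median of a sum of 10 Pareto variables with beta = 0.7 (in [0, 1])
print(pareto_sum_quantile(0.7, 10, 0.5))
```

The heavy tail is what makes this brute-force approach expensive: extreme draws dominate the sum, so many trials are needed for stable quantile estimates, which is the practical motivation for the Laplace-transform method of the paper.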
Evolutionary Information Theory
Mark Burgin
2013-04-01
Evolutionary information theory is a constructive approach that studies information in the context of evolutionary processes, which are ubiquitous in nature and society. In this paper, we develop foundations of evolutionary information theory, building several measures of evolutionary information and obtaining their properties. These measures are based on mathematical models of evolutionary computations, machines and automata. To measure evolutionary information in an invariant form, we construct and study universal evolutionary machines and automata, which form the base for evolutionary information theory. The first class of measures introduced and studied in this paper is evolutionary information size of symbolic objects relative to classes of automata or machines. In particular, it is proved that there is an invariant and optimal evolutionary information size relative to different classes of evolutionary machines. As a rule, different classes of algorithms or automata determine different information size for the same object. More powerful classes of algorithms or automata decrease the information size of an object in comparison with its information size relative to weaker classes of algorithms or machines. The second class of measures for evolutionary information in symbolic objects is studied by introducing the quantity of evolutionary information about symbolic objects relative to a class of automata or machines. To give an example of applications, we briefly describe a possibility of modeling physical evolution with evolutionary machines to demonstrate the applicability of evolutionary information theory to all material processes. At the end of the paper, directions for future research are suggested.
A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction
Danandeh Mehr, Ali; Kahya, Ercan
2017-06-01
Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing ingredient uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene ingredient of the model tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and eventually the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution, which has a noteworthy importance of being applied in practice. In addition, the approach allows the user to enter human insight into the problem to examine evolved models and pick the best performing programs out for further analysis.
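The moving-average pre-processing ingredient is simple enough to sketch. A minimal illustration of such a filter on a daily flow series follows; the window length is illustrative, not the one calibrated in the paper.

```python
def moving_average(series, window):
    """Simple moving average: each output value is the mean of the
    `window` most recent observations, smoothing short-term noise
    that causes lagged predictions in stand-alone data-driven models."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

flows = [5.0, 7.0, 6.0, 9.0, 8.0]  # hypothetical daily streamflow values
print(moving_average(flows, 3))
```

Note that the filtered series is shorter by `window - 1` samples, so targets must be aligned accordingly before model fitting.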
Il problema della costruzione di senso nel Trattato di Sociologia Generale di Vilfredo Pareto
Andrea Millefiorini
2017-08-01
Pareto explains how residues lie at the center of the complex social order that arises from their combination with interests, the heterogeneity of society, and derivations. For the construction of meaning, there is one class of residues, the one defined as the “need for logical developments”, which comprises “the greater part of the residues that determine derivations”. It is then the latter that, coming, so to speak, to “live a life of their own”, delimit, define, determine and confer the individual and collective meanings upon which everyday interaction among men founds the main fabric of its routines, practices and conducts within spheres of coexistence, institutions and national communities. Some, like Norberto Bobbio, have drawn from this undeniable conceptual framework in Paretian sociological theory consequences and deductions that present Pareto's thought as a socio-psychological version of the Marxist theory of “false consciousness”. In essence, Bobbio writes, “to Marx's historicist conception of ideologies, Pareto opposes a naturalistic conception of man as an ideological animal”. Yet one must be careful. It is certainly true that the ideologies of the twentieth century can be explained by following the Paretian approach, but his sociology is not resolved and does not exhaust itself in a simple theory of ideologies. It is something far broader and more complex, embracing the entire historical arc of human civilizations, and thus stands as one of the most ambitious attempts so far conceived by the social sciences to explain that highly complicated social process that goes by the name of the “construction of meaning”.
Analysis of extreme drinking in patients with alcohol dependence using Pareto regression.
Das, Sourish; Harel, Ofer; Dey, Dipak K; Covault, Jonathan; Kranzler, Henry R
2010-05-20
We developed a novel Pareto regression model with an unknown shape parameter to analyze extreme drinking in patients with Alcohol Dependence (AD). We used the generalized linear model (GLM) framework and the log-link to include the covariate information through the scale parameter of the generalized Pareto distribution. We proposed a Bayesian method based on Ridge prior and Zellner's g-prior for the regression coefficients. A simulation study indicated that the proposed Bayesian method performs better than the existing likelihood-based inference for the Pareto regression. We examined two issues of importance in the study of AD. First, we tested whether a single nucleotide polymorphism within the GABRA2 gene, which encodes a subunit of the GABA(A) receptor and has been associated with AD, influences 'extreme' alcohol intake; second, we assessed the efficacy of three psychotherapies for alcoholism in treating extreme drinking behavior. We found an association between extreme drinking behavior and GABRA2. We also found that, at baseline, men with a high-risk GABRA2 allele had a significantly higher probability of extreme drinking than men with no high-risk allele. However, men with a high-risk allele responded to the therapy better than those with two copies of the low-risk allele. Women with high-risk alleles also responded to the therapy better than those with two copies of the low-risk allele, while women who received the cognitive behavioral therapy had better outcomes than those receiving either of the other two therapies. Among men, motivational enhancement therapy was the best for the treatment of the extreme drinking behavior.
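The model structure described above, a generalized Pareto likelihood whose scale parameter is tied to covariates through a log-link, can be sketched as a negative log-likelihood. This illustrates only the GLM-style setup, not the authors' Bayesian estimation with Ridge or Zellner's g-prior; the function name and parameterization are assumptions for the sketch.

```python
import math

def gpd_neg_loglik(params, X, y):
    """Negative log-likelihood of a generalized Pareto regression.
    The scale is linked to covariates by sigma_i = exp(x_i . beta),
    and xi is the unknown shape: params = [xi, beta_1, ..., beta_p]."""
    xi, coefs = params[0], params[1:]
    nll = 0.0
    for x_row, yi in zip(X, y):
        sigma = math.exp(sum(b * x for b, x in zip(coefs, x_row)))
        z = 1.0 + xi * yi / sigma
        if z <= 0:
            return float("inf")  # observation outside the support
        # -log f(y) for the GPD density f(y) = (1/sigma) * z**(-1/xi - 1)
        nll += math.log(sigma) + (1.0 / xi + 1.0) * math.log(z)
    return nll

# Two observations with an intercept-only design matrix (illustrative)
X = [[1.0], [1.0]]
y = [0.5, 1.2]
print(gpd_neg_loglik([0.5, 0.0], X, y))
```

Minimizing this function over `params` would give maximum-likelihood estimates; the paper instead places priors on the coefficients and works in a Bayesian framework.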
The Mass-Longevity Triangle: Pareto Optimality and the Geometry of Life-History Trait Space.
Szekely, Pablo; Korem, Yael; Moran, Uri; Mayo, Avi; Alon, Uri
2015-10-01
When organisms need to perform multiple tasks they face a fundamental tradeoff: no phenotype can be optimal at all tasks. This situation was recently analyzed using Pareto optimality, showing that tradeoffs between tasks lead to phenotypes distributed on low dimensional polygons in trait space. The vertices of these polygons are archetypes--phenotypes optimal at a single task. This theory was applied to examples from animal morphology and gene expression. Here we ask whether Pareto optimality theory can apply to life history traits, which include longevity, fecundity and mass. To comprehensively explore the geometry of life history trait space, we analyze a dataset of life history traits of 2105 endothermic species. We find that, to a first approximation, life history traits fall on a triangle in log-mass log-longevity space. The vertices of the triangle suggest three archetypal strategies, exemplified by bats, shrews and whales, with specialists near the vertices and generalists in the middle of the triangle. To a second approximation, the data lies in a tetrahedron, whose extra vertex above the mass-longevity triangle suggests a fourth strategy related to carnivory. Each animal species can thus be placed in a coordinate system according to its distance from the archetypes, which may be useful for genome-scale comparative studies of mammalian aging and other biological aspects. We further demonstrate that Pareto optimality can explain a range of previous studies which found animal and plant phenotypes which lie in triangles in trait space. This study demonstrates the applicability of multi-objective optimization principles to understand life history traits and to infer archetypal strategies that suggest why some mammalian species live much longer than others of similar mass.
Group Acceptance Sampling Plan for Lifetime Data Using Generalized Pareto Distribution
Muhammad Aslam
2010-02-01
Full Text Available In this paper, a group acceptance sampling plan (GASP) is introduced for situations in which the lifetime of the items follows the generalized Pareto distribution. The design parameters, such as the minimum group size and the acceptance number, are determined when the consumer's risk and the test termination time are specified. The proposed sampling plan is compared with the existing sampling plan. It is concluded that the proposed plan performs better than the existing one in terms of the minimum sample size required to reach the same decision.
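The acceptance logic described above can be sketched numerically. Assuming i.i.d. generalized Pareto lifetimes, the probability of accepting a lot is binomial in the per-item failure probability at the termination time. The shape, scale, group size, and acceptance number below are hypothetical placeholders, not the paper's design-table values:

```python
import math

def gpd_cdf(t, shape, scale):
    """CDF of the generalized Pareto distribution (nonzero shape parameter)."""
    if t <= 0:
        return 0.0
    return 1.0 - (1.0 + shape * t / scale) ** (-1.0 / shape)

def lot_acceptance_prob(groups, group_size, c, t0, shape, scale):
    """P(accept) = P(at most c of n = groups * group_size items fail before
    the termination time t0), assuming i.i.d. GPD lifetimes."""
    n = groups * group_size
    p = gpd_cdf(t0, shape, scale)  # probability a single item fails by t0
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Hypothetical plan: 2 groups of 5 items, accept if at most 1 failure by t0.
prob = lot_acceptance_prob(groups=2, group_size=5, c=1, t0=0.5, shape=0.5, scale=2.0)
```

A designer would search over group size and acceptance number for the smallest plan meeting the specified consumer's risk.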
A class of generalized beta distributions, Pareto power series and Weibull power series
Lemos de Morais, Alice
2009-01-01
In this dissertation we work with three classes of probability distributions: one already known in the literature, the class of generalized beta distributions (Beta-G), and two new classes introduced in this thesis, based on composing the Pareto and Weibull distributions with the class of discrete power series distributions. We give a general review of the Beta-G class and introduce a special case, the beta generalized logistic distribution of type IV (BGL(IV)). ...
Inferring biological tasks using Pareto analysis of high-dimensional data.
Hart, Yuval; Sheftel, Hila; Hausser, Jean; Szekely, Pablo; Ben-Moshe, Noa Bossel; Korem, Yael; Tendler, Avichai; Mayo, Avraham E; Alon, Uri
2015-03-01
We present the Pareto task inference method (ParTI; http://www.weizmann.ac.il/mcb/UriAlon/download/ParTI) for inferring biological tasks from high-dimensional biological data. Data are described as a polytope, and features maximally enriched closest to the vertices (or archetypes) allow identification of the tasks the vertices represent. We demonstrate that human breast tumors and mouse tissues are well described by tetrahedrons in gene expression space, with specific tumor types and biological functions enriched at each of the vertices, suggesting four key tasks.
Pareto-Ranking Based Quantum-Behaved Particle Swarm Optimization for Multiobjective Optimization
Na Tian
2015-01-01
Full Text Available A study on Pareto-ranking based quantum-behaved particle swarm optimization (QPSO) for multiobjective optimization problems is presented in this paper. During the iteration, an external repository is maintained to remember the nondominated solutions, from which the global best position is chosen. A comparison between different elitist selection strategies (preference order, sigma value, and random selection) is performed on four benchmark functions and two metrics. The results demonstrate that QPSO with preference order performs comparably to QPSO with sigma value, depending on the number of objectives. Finally, QPSO with sigma value is applied to solve multiobjective flexible job-shop scheduling problems.
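The external repository described above can be maintained with a simple dominance test: a candidate enters the archive only if no member dominates it, and any members it dominates are discarded. This is a minimal bookkeeping sketch (all objectives minimized), not the paper's QPSO implementation:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into the nondominated repository,
    discarding any members it dominates."""
    if any(dominates(m, candidate) for m in archive):
        return archive  # candidate is dominated; archive unchanged
    return [m for m in archive if not dominates(candidate, m)] + [candidate]

archive = []
for point in [(3, 4), (2, 5), (1, 6), (2, 3), (5, 1)]:
    archive = update_archive(archive, point)
# archive now holds only mutually nondominated points
```

The global best position would then be drawn from this archive by one of the elitist selection strategies the paper compares.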
Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.
Jiménez, Fernando; Sánchez, Gracia; Juárez, José M
2014-03-01
This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severe burn patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA), and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient's data set from an intensive care burn unit and a standard machine learning data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case
Development of antibiotic regimens using graph based evolutionary algorithms.
Corns, Steven M; Ashlock, Daniel A; Bryden, Kenneth M
2013-12-01
This paper examines the use of evolutionary algorithms in the development of antibiotic regimens given to production animals. A model is constructed that combines the lifespan of the animal and the bacteria living in the animal's gastro-intestinal tract from the early finishing stage until the animal reaches market weight. This model is used as the fitness evaluation for a set of graph based evolutionary algorithms to assess the impact of diversity control on the evolving antibiotic regimens. The graph based evolutionary algorithms have two objectives: to find an antibiotic treatment regimen that maintains the weight gain and health benefits of antibiotic use and to reduce the risk of spreading antibiotic resistant bacteria. This study examines different regimens of tylosin phosphate use on bacteria populations divided into Gram positive and Gram negative types, with a focus on Campylobacter spp. Treatment regimens were found that provided decreased antibiotic resistance relative to conventional methods while providing nearly the same benefits as conventional antibiotic regimes. By using a graph to control the information flow in the evolutionary algorithm, a variety of solutions along the Pareto front can be found automatically for this and other multi-objective problems.
Jorge Caldera-Serrano
2015-09-01
Full Text Available The reuse of the audiovisual collections of television networks is analyzed in order to determine whether the Pareto principle holds, providing mechanisms for the control and exploitation of the least-used part of the audiovisual collection. The Pareto correlation is found to hold not only for usage but also for the presence of thematic and onomastic elements in the archive and in the broadcasting of content, so forms of control are proposed for the integration of information into the collection and of resources in broadcasting. The Pareto principle, Media Asset Management, and the paradigm shift to digital are also described, as they are essential for understanding the problems, and their solutions, in retrieval and in building the collection. Keywords: Information processing. Television. Electronic media. Information systems evaluation.
Distributed Multicell Beamforming Design Approaching Pareto Boundary with Max-Min Fairness
Huang, Yongming; Bengtsson, Mats; Wong, Kai-Kit; Yang, Luxi; Ottersten, Bjorn
2012-01-01
This paper addresses coordinated downlink beamforming optimization in multicell time-division duplex (TDD) systems where a small number of parameters are exchanged between cells but with no data sharing. With the goal to reach the point on the Pareto boundary with max-min rate fairness, we first develop a two-step centralized optimization algorithm to design the joint beamforming vectors. This algorithm can achieve a further sum-rate improvement over the max-min optimal performance, and is shown to guarantee max-min Pareto optimality for scenarios with two base stations (BSs) each serving a single user. To realize a distributed solution with limited intercell communication, we then propose an iterative algorithm by exploiting an approximate uplink-downlink duality, in which only a small number of positive scalars are shared between cells in each iteration. Simulation results show that the proposed distributed solution achieves a fairness rate performance close to the centralized algorithm while it has a bette...
Ensembles of signal transduction models using Pareto Optimal Ensemble Techniques (POETs).
Song, Sang Ok; Chakrabarti, Anirikh; Varner, Jeffrey D
2010-07-01
Mathematical modeling of complex gene expression programs is an emerging tool for understanding disease mechanisms. However, identification of large models sometimes requires training using qualitative, conflicting or even contradictory data sets. One strategy to address this challenge is to estimate experimentally constrained model ensembles using multiobjective optimization. In this study, we used Pareto Optimal Ensemble Techniques (POETs) to identify a family of proof-of-concept signal transduction models. POETs integrate Simulated Annealing (SA) with Pareto optimality to identify models near the optimal tradeoff surface between competing training objectives. We modeled a prototypical signaling network using mass-action kinetics within an ordinary differential equation (ODE) framework (64 ODEs in total). The true model was used to generate synthetic immunoblots from which the POET algorithm identified the 117 unknown model parameters. POET generated an ensemble of signaling models, which collectively exhibited population-like behavior. For example, scaled gene expression levels were approximately normally distributed over the ensemble following the addition of extracellular ligand. Also, the ensemble recovered robust and fragile features of the true model, despite significant parameter uncertainty. Taken together, these results suggest that experimentally constrained model ensembles could capture qualitatively important network features without exact parameter information.
Comparing the Impact of Mobile Nodes Arrival Patterns in Manets using Poisson and Pareto Models
John Tengviel
2013-10-01
Full Text Available Mobile Ad hoc Networks (MANETs) are dynamic networks populated by mobile stations, or mobile nodes (MNs). Mobility modeling is a hot topic in many areas, for example, protocol evaluation and network performance analysis. How to simulate MN mobility is the problem we should consider if we want to build an accurate mobility model. New nodes can join and other nodes can leave the network, so the topology is dynamic. Specifically, MANETs consist of a collection of nodes randomly placed in a line (not necessarily straight). MANETs appear in many real-world network applications, such as vehicular MANETs built along a highway in a city environment, or people in a particular location. MNs in MANETs are usually laptops, PDAs, or mobile phones. This paper presents comparative results obtained via Matlab software simulation. The study investigates the impact of mobility predictive models on mobile nodes' parameters, such as the arrival rate and the number of mobile nodes in a given area, using Pareto and Poisson distributions. The results indicate that mobile nodes' arrival rates may influence the MN population (as a larger number in a location). The Pareto distribution is more reflective of mobility modeling for MANETs than the Poisson distribution.
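A minimal sketch of the two arrival models being compared: heavy-tailed Pareto inter-arrival times via inverse-CDF sampling against exponential inter-arrivals (i.e., a Poisson arrival process). The parameter values are illustrative, not those of the study:

```python
import random

random.seed(0)

def pareto_interarrival(alpha, xm):
    """Heavy-tailed Pareto inter-arrival time via inverse-CDF sampling."""
    u = random.random()  # u in [0, 1), so 1 - u is never zero
    return xm / (1.0 - u) ** (1.0 / alpha)

def exponential_interarrival(rate):
    """Exponential inter-arrival time, i.e. a Poisson arrival process."""
    return random.expovariate(rate)

n = 10000
pareto_times = [pareto_interarrival(alpha=1.5, xm=0.5) for _ in range(n)]
poisson_times = [exponential_interarrival(rate=1.0) for _ in range(n)]
```

The heavy tail shows up immediately: the largest Pareto inter-arrival time in a sample of this size dwarfs the largest exponential one, which is why bursts and crowding of nodes appear under the Pareto model but not under the Poisson model.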
Delgado, João; Longhurst, Phil; Hickman, Gordon A W; Gauntlett, Daniel M; Howson, Simon F; Irving, Phil; Hart, Alwyn; Pollard, Simon J T
2010-06-15
An enhanced methodology for the policy-level prioritization of intervention options during carcass disposal is presented. Pareto charts provide a semiquantitative analysis of opportunities for multiple exposures to human health, animal health, and the wider environment during carcass disposal; they identify critical control points for risk management and assist in waste technology assessment. Eighty percent of the total availability of more than 1300 potential exposures to human, animal, or environmental receptors is represented by 16 processes, these being dominated by on-farm collection and carcass processing, reinforcing the criticality of effective controls during early stages of animal culling and waste processing. Exposures during mass burials are dominated by ground- and surface-water exposures with noise and odor nuisance prevalent for mass pyres, consistent with U.K. experience. Pareto charts are discussed in the context of other visualization formats for policy officials and promoted as a communication tool for informing the site-specific risk assessments required during the operational phases of exotic disease outbreaks.
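The Pareto-chart screening described here amounts to ranking processes by exposure count and keeping those that cover a cumulative cutoff (the classic 80% line) as candidate critical control points. A sketch with hypothetical tallies, not the study's data:

```python
def pareto_chart(counts, cutoff=0.80):
    """Rank categories by frequency and return those covering `cutoff`
    of the cumulative total -- the candidate critical control points."""
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(counts.values())
    selected, running = [], 0
    for name, value in ranked:
        selected.append(name)
        running += value
        if running / total >= cutoff:
            break
    return selected

# Hypothetical exposure tallies per disposal process (illustrative only).
exposures = {"on-farm collection": 400, "carcass processing": 350,
             "mass burial": 120, "pyre": 80, "transport": 50}
critical = pareto_chart(exposures)
```

With these made-up numbers, the first three processes already cover over 80% of exposures, mirroring the study's finding that a small subset of processes dominates the exposure count.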
Cao, Pan; Jorswieck, Eduard A.; Shi, Shuying
2013-10-01
We consider a multiple-input multiple-output (MIMO) interference channel (IC), where a single data stream per user is transmitted and each receiver treats interference as noise. The paper focuses on the open problem of computing the outermost boundary (so-called Pareto boundary-PB) of the achievable rate region under linear transceiver design. The Pareto boundary consists of the strict PB and non-strict PB. For the two user case, we compute the non-strict PB and the two ending points of the strict PB exactly. For the strict PB, we formulate the problem to maximize one rate while the other rate is fixed such that a strict PB point is reached. To solve this non-convex optimization problem which results from the hard-coupled two transmit beamformers, we propose an alternating optimization algorithm. Furthermore, we extend the algorithm to the multi-user scenario and show convergence. Numerical simulations illustrate that the proposed algorithm computes a sequence of well-distributed operating points that serve as a reasonable and complete inner bound of the strict PB compared with existing methods.
Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph
2015-01-01
Multivariate biomarkers that can predict the effectiveness of targeted therapy in individual patients are highly desired. Previous biomarker discovery studies have largely focused on the identification of single biomarker signatures, aimed at maximizing prediction accuracy. Here, we present a different approach that identifies multiple biomarkers by simultaneously optimizing their predictive power, number of features, and proximity to the drug target in a protein-protein interaction network. To this end, we incorporated NSGA-II, a fast and elitist multi-objective optimization algorithm that is based on the principle of Pareto optimality, into the biomarker discovery workflow. The method was applied to quantitative phosphoproteome data of 19 non-small cell lung cancer (NSCLC) cell lines from a previous biomarker study. The algorithm successfully identified a total of 77 candidate biomarker signatures predicting response to treatment with dasatinib. Through filtering and similarity clustering, this set was trimmed to four final biomarker signatures, which then were validated on an independent set of breast cancer cell lines. All four candidates reached the same good prediction accuracy (83%) as the originally published biomarker. Although the newly discovered signatures were diverse in their composition and in their size, the central protein of the originally published signature - integrin β4 (ITGB4) - was also present in all four Pareto signatures, confirming its pivotal role in predicting dasatinib response in NSCLC cell lines. In summary, the method presented here allows for a robust and simultaneous identification of multiple multivariate biomarkers that are optimized for prediction performance, size, and relevance.
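NSGA-II, the algorithm incorporated into the workflow above, rests on a fast nondominated sort that partitions candidates into successive Pareto fronts. A compact sketch of that sorting step (objectives minimized; the straightforward O(n²) bookkeeping variant), not the biomarker pipeline itself:

```python
def nondominated_sort(points):
    """Partition objective vectors (all minimized) into successive Pareto
    fronts, in the spirit of NSGA-II's fast nondominated sort."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    n = len(points)
    dominated = [[] for _ in range(n)]  # indices that point i dominates
    counts = [0] * n                    # number of points dominating i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(points[i], points[j]):
                dominated[i].append(j)
            elif dominates(points[j], points[i]):
                counts[i] += 1

    fronts = []
    current = [i for i in range(n) if counts[i] == 0]  # first Pareto front
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated[i]:
                counts[j] -= 1
                if counts[j] == 0:  # all of j's dominators have been peeled off
                    nxt.append(j)
        current = nxt
    return fronts

points = [(1, 4), (2, 2), (4, 1), (3, 3), (4, 4)]
fronts = nondominated_sort(points)
```

In the biomarker setting, each point would be a candidate signature scored on prediction error, feature count, and network distance to the drug target; the first front contains the undominated signatures.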
Single Cell Dynamics Causes Pareto-Like Effect in Stimulated T Cell Populations.
Cosette, Jérémie; Moussy, Alice; Onodi, Fanny; Auffret-Cariou, Adrien; Neildez-Nguyen, Thi My Anh; Paldi, Andras; Stockholm, Daniel
2015-12-09
Cell fate choice during the process of differentiation may obey deterministic or stochastic rules. In order to discriminate between these two strategies, we used time-lapse microscopy of individual murine CD4+ T cells, which allows investigation of the dynamics of proliferation and fate commitment. We observed highly heterogeneous division and death rates between individual clones, resulting in a Pareto-like dominance of a few clones at the end of the experiment. Commitment to the Treg fate was monitored using the expression of a GFP reporter gene under the control of the endogenous Foxp3 promoter. All possible combinations of proliferation and differentiation were observed and resulted in exclusively GFP-, exclusively GFP+, or mixed-phenotype clones of very different population sizes. We simulated the process of proliferation and differentiation using a simple mathematical model of stochastic decision-making based on the experimentally observed parameters. The simulations show that a stochastic scenario is fully compatible with the observed Pareto-like imbalance in the final population.
Mapping the Pareto optimal design space for a functionally deimmunized biotherapeutic candidate.
Salvat, Regina S; Parker, Andrew S; Choi, Yoonjoo; Bailey-Kellogg, Chris; Griswold, Karl E
2015-01-01
The immunogenicity of biotherapeutics can bottleneck development pipelines and poses a barrier to widespread clinical application. As a result, there is a growing need for improved deimmunization technologies. We have recently described algorithms that simultaneously optimize proteins for both reduced T cell epitope content and high-level function. In silico analysis of this dual objective design space reveals that there is no single global optimum with respect to protein deimmunization. Instead, mutagenic epitope deletion yields a spectrum of designs that exhibit tradeoffs between immunogenic potential and molecular function. The leading edge of this design space is the Pareto frontier, i.e. the undominated variants for which no other single design exhibits better performance in both criteria. Here, the Pareto frontier of a therapeutic enzyme has been designed, constructed, and evaluated experimentally. Various measures of protein performance were found to map a functional sequence space that correlated well with computational predictions. These results represent the first systematic and rigorous assessment of the functional penalty that must be paid for pursuing progressively more deimmunized biotherapeutic candidates. Given this capacity to rapidly assess and design for tradeoffs between protein immunogenicity and functionality, these algorithms may prove useful in augmenting, accelerating, and de-risking experimental deimmunization efforts.
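For two objectives, the Pareto frontier described above (the undominated variants) can be extracted with a single sorted sweep. A sketch with hypothetical (epitope score, activity loss) pairs, both minimized; the values are invented for illustration, not the study's measured variants:

```python
def pareto_frontier_2d(designs):
    """Undominated designs for two minimized objectives,
    found by one sweep over the designs sorted on the first objective."""
    frontier, best_second = [], float("inf")
    for d in sorted(designs):      # ascending in objective 1
        if d[1] < best_second:     # strictly better in objective 2 than all before
            frontier.append(d)
            best_second = d[1]
    return frontier

# Hypothetical (epitope_score, activity_loss) pairs for candidate variants.
variants = [(0.9, 0.1), (0.5, 0.4), (0.7, 0.2), (0.5, 0.6), (0.2, 0.8)]
front = pareto_frontier_2d(variants)
```

The resulting front makes the paper's central tradeoff explicit: each step toward lower epitope content costs some molecular function, and no frontier point improves both at once.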
J. S. Sadaghiani
2014-04-01
Full Text Available The flexible job shop scheduling problem is a key factor in using production systems efficiently. This paper attempts to simultaneously optimize three objectives: minimization of the makespan, the total workload, and the maximum workload of jobs. Since the multi-objective flexible job shop scheduling problem is strongly NP-hard, an integrated heuristic approach is used to solve it. The proposed approach is based on a floating search procedure that uses several heuristic algorithms. The floating search procedure employs local heuristic algorithms and splits the considered problem into two subproblems: assigning and sequencing. Search is first performed over the assignment space until an acceptable solution is reached, and then continues over the sequencing space using a heuristic algorithm. A multi-objective approach is used to produce Pareto solutions; the proposed approach is thus adapted from the NSGA-II algorithm and evaluated with Pareto archives. The elements and parameters of the proposed algorithms were tuned in preliminary experiments. Finally, computational results were used to analyze the efficiency of the proposed algorithm, and these results showed that it is capable of producing efficient solutions.
The Reduction of Modal Sensor Channels through a Pareto Chart Methodology
Kaci J. Lemler
2015-01-01
Full Text Available Presented herein is a new experimental sensor placement procedure, developed to assist in placing sensors in key locations in an efficient manner and to reduce the number of channels needed for a full modal analysis. It is a fast, noncontact method that uses a laser vibrometer to gather a candidate set of sensor locations. These locations are then evaluated using a Pareto chart to obtain a reduced set of sensor locations that still captures the motion of the structure. The Pareto chart is employed to identify the points on a structure that have the largest reaction to an input excitation, and thus to reduce the number of channels while capturing the most significant data. This method enhances the correct and efficient placement of sensors, which is crucial in modal testing; previously, this required the development and/or use of a complicated model or set of equations. The new technique is applied in a case study on a small unmanned aerial system. The test procedure is presented and the results are discussed.
Wira-Alam, Andias
2009-01-01
Peptide optimization is a highly complex problem and requires a very long computation time. The optimization process uses many software applications, each performing special tasks, in a cluster running the GNU/Linux operating system. An application to organize the whole optimization process has already been developed, namely SEPP (System for Evolutionary Pareto Optimization of Peptides/Polymers). A single peptide optimization takes a great deal of computation time to produce a certain number of individuals; however, it can be accelerated by increasing the degree of parallelism as well as the number of nodes (processors) in the cluster. In this master thesis, I build a model simulating the interplay of the programs, so that the usage of each resource (processor) can be determined, along with the approximate time needed for the overall optimization process. There are two evolutionary algorithms that could be used in the optimization, namely the generation-based and the steady-state evolutionary algorithm. The results of each Evolution...
van de Schoot, A J A J; Visser, J; van Kesteren, Z; Janssen, T M; Rasch, C R N; Bel, A
2016-02-21
The Pareto front reflects the optimal trade-offs between conflicting objectives and can be used to quantify the effect of different beam configurations on plan robustness and dose-volume histogram parameters. Therefore, our aim was to develop and implement a method to automatically approach the Pareto front in robust intensity-modulated proton therapy (IMPT) planning. Additionally, clinically relevant Pareto fronts based on different beam configurations will be derived and compared to enable beam configuration selection in cervical cancer proton therapy. A method to iteratively approach the Pareto front by automatically generating robustly optimized IMPT plans was developed. To verify plan quality, IMPT plans were evaluated on robustness by simulating range and position errors and recalculating the dose. For five retrospectively selected cervical cancer patients, this method was applied for IMPT plans with three different beam configurations using two, three and four beams. 3D Pareto fronts were optimized on target coverage (CTV D(99%)) and OAR doses (rectum V30Gy; bladder V40Gy). Per patient, proportions of non-approved IMPT plans were determined and differences between patient-specific Pareto fronts were quantified in terms of CTV D(99%), rectum V(30Gy) and bladder V(40Gy) to perform beam configuration selection. Per patient and beam configuration, Pareto fronts were successfully sampled based on 200 IMPT plans of which on average 29% were non-approved plans. In all patients, IMPT plans based on the 2-beam set-up were completely dominated by plans with the 3-beam and 4-beam configuration. Compared to the 3-beam set-up, the 4-beam set-up increased the median CTV D(99%) on average by 0.2 Gy and decreased the median rectum V(30Gy) and median bladder V(40Gy) on average by 3.6% and 1.3%, respectively. This study demonstrates a method to automatically derive Pareto fronts in robust IMPT planning. For all patients, the defined four-beam configuration was found optimal
Özlem Türkşen
2013-01-01
Full Text Available The solution set of a multi-response experiment is characterized by the Pareto solution set. In this paper, the multi-response experiment is dealt with in a fuzzy framework. The responses and model parameters are considered as triangular fuzzy numbers, which reflect the uncertainty of the data set. A fuzzy least squares approach and a fuzzy modified NSGA-II (FNSGA-II) are used for modeling and optimization, respectively. The obtained fuzzy Pareto solution set is grouped using a fuzzy relational clustering approach, which makes it easier to choose among the alternative solutions and thus to make better decisions. A real data set with fuzzy-valued responses is used as an application.
Reddy, P.V.; Engwerda, J.C.
2010-01-01
In this article we derive necessary and sufficient conditions for the existence of Pareto optimal solutions for an N player cooperative infinite horizon differential game. Firstly, we write the problem of finding Pareto candidates as solving N constrained optimal control subproblems. We derive some
Evolutionary path control strategy for solving many-objective optimization problem.
Roy, Proteek Chandan; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2015-04-01
The number of objectives in many-objective optimization problems (MaOPs) is typically high, and evolutionary algorithms face severe difficulties in solving such problems. In this paper, we propose a new scalable evolutionary algorithm, called evolutionary path control strategy (EPCS), for solving MaOPs. The central component of our algorithm is the use of a reference vector that helps to simultaneously minimize all the objectives of an MaOP. In doing so, EPCS employs a new fitness assignment strategy for survival selection. This strategy consists of two procedures, which our algorithm applies sequentially. It encourages a population of solutions to follow a certain path reaching toward the Pareto optimal front. The essence of our strategy is that it reduces the number of nondominated solutions to increase selection pressure in evolution. Furthermore, unlike previous work, EPCS is able to apply the classical Pareto-dominance relation with the new fitness assignment strategy. Our algorithm has been tested extensively on several scalable test problems, namely five DTLZ problems with 5 to 40 objectives and six WFG problems with 2 to 13 objectives. Furthermore, the algorithm has been tested on six CEC09 problems having 2 or 3 objectives. The experimental results show that EPCS is capable of finding better solutions than other existing algorithms for problems with an increasing number of objectives.
Wei Yue
2015-01-01
Full Text Available The major issues for mean-variance-skewness models are the errors in estimation that cause corner solutions and low diversity in the portfolio. In this paper, a multiobjective fuzzy portfolio selection model with transaction cost and liquidity is proposed to maintain the diversity of the portfolio. In addition, we have designed a multiobjective evolutionary algorithm based on decomposition of the objective space to maintain the diversity of the obtained solutions. The algorithm is used to obtain a set of Pareto-optimal portfolios with good diversity and convergence. To demonstrate the effectiveness of the proposed model and algorithm, the performance of the proposed algorithm is compared with the classic MOEA/D and NSGA-II through some numerical examples based on data from the Shanghai Stock Exchange Market. Simulation results show that our proposed algorithm is able to obtain better diversity and a more evenly distributed Pareto front than the other two algorithms, and that the proposed model maintains the diversity of the portfolio quite well. The purpose of this paper is to deal with portfolio problems in the weighted possibilistic mean-variance-skewness (MVS) and possibilistic mean-variance-skewness-entropy (MVS-E) frameworks with transaction cost and liquidity, and to provide investors with a range of Pareto-optimal investment strategies that are as diversified as possible, rather than a single strategy at a time.
Tsukamoto, Noritaka; Nojima, Yusuke; Ishibuchi, Hisao
In this paper, we examine the behavior of evolutionary multiobjective optimization (EMO) algorithms to clarify the difficulties in their scalability to many-objective optimization problems. Whereas EMO algorithms usually work well on two-objective problems, it has also been reported that they do not work well on many-objective problems. First, we examine the behavior of the most well-known and frequently-used Pareto-based EMO algorithm (i.e., NSGA-II) on many-objective 0/1 knapsack problems. Experimental results show that the search ability of NSGA-II is severely deteriorated by the increase in the number of objectives. This is because the selection pressure toward the Pareto front is severely weakened by the increase in the number of non-dominated solutions. Next we briefly review some approaches to the scalability improvement of EMO algorithms to many-objective problems. Then we examine their effects on the search ability of NSGA-II. Experimental results show that the improvement in the convergence of solutions to the Pareto front often leads to the decrease in their diversity.
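The weakening of selection pressure reported here is easy to reproduce: among random objective vectors, the fraction that is nondominated grows rapidly with the number of objectives, leaving Pareto ranking little to discriminate on. A small illustrative experiment (uniform random points, not the paper's knapsack instances):

```python
import random

random.seed(1)

def nondominated_fraction(num_points, num_objectives):
    """Fraction of random objective vectors that no other vector dominates."""
    pts = [tuple(random.random() for _ in range(num_objectives))
           for _ in range(num_points)]

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    nondom = sum(1 for p in pts
                 if not any(dominates(q, p) for q in pts if q is not p))
    return nondom / num_points

low = nondominated_fraction(200, 2)    # two objectives: few nondominated points
high = nondominated_fraction(200, 10)  # ten objectives: almost all nondominated
```

With two objectives only a small fraction of the population is nondominated, so Pareto ranking imposes strong selection pressure; with ten objectives nearly every point is nondominated and the ranking becomes almost uninformative, matching the NSGA-II degradation described above.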
李锋
2011-01-01
The multi-objective traveling salesman problem (MOTSP) is a generalization of the well-known traveling salesman problem in which multiple conflicting objectives, such as distance, cost, profit, and risk of the tour, are optimized together. This paper proposes a preference-based Pareto evolutionary algorithm, named p-PEA, to model and solve this NP-hard problem. The algorithm is built on an agent-based simulation model in which the MOTSP is represented as an agent model; in this way, the various factors of a practical MOTSP are more easily incorporated into the model than with purely mathematical approaches. Based on the agent simulation, multi-objective evaluations of feasible solutions are collected from the simulation output. An evolutionary algorithm is then adopted to search for Pareto-optimal solutions, with the decision maker's preference information introduced into the search so that preferred Pareto-optimal solutions are obtained and the results are more reasonable. Finally, the proposed p-PEA algorithm is applied to a bi-objective TSP instance with 130 cities to demonstrate its validity and effectiveness.
I. K. Romanova
2015-01-01
The article concerns multi-criteria optimization (MCO), which assumes that the quality criteria of a system's operation are independent, and specifies a way to improve the values of these criteria. Mutual contradiction of some criteria is a major problem in MCO, and one of the most important areas of research is obtaining so-called Pareto-optimal options. The subject of this research is the Pareto front, also called the Pareto frontier. The article discusses classifications of the front by its geometric representation for the two-criterion case and presents a mathematical description of the front's characteristics using gradients and their projections. A review of current domestic and foreign literature reveals that existing work on constructing the Pareto frontier conducts research under uncertainty, in the stochastic statement, without restrictions; topologies in both the two- and three-dimensional cases are considered, and the targets of modern applications are multi-agent systems and groups of players in differential games. However, none of the works considered addresses active management of the front. The objective of this article is to discuss the Pareto frontier problem in a new formulation, namely with the system developers and/or the decision makers (DM) actively managing the Pareto frontier. Such a formulation differs from the traditionally accepted approach based on the analysis of already existing solutions. The article discusses three ways to describe the quality of an object's control system. The first is to use direct quality criteria for the model of a closed system as an oscillatory unit of general form. The second is to study a specific two-loop aircraft control system using the angular-velocity and normal-acceleration loops. The third is to use integrated quality criteria. In all three cases, the selected criteria are
王晓红; 宋立新
2013-01-01
The Bayesian estimation and admissibility of the Pareto distribution parameter θ under fixed-time (Type-I) censoring are studied. Taking the loss function to be the entropy loss function, the entropy loss under fixed-time censoring is calculated, which gives the general form of the Bayesian estimator of θ. Under a Gamma prior distribution, the posterior density of θ is computed and the exact form of the Bayesian estimator is obtained; the admissibility of this estimator is then proved.
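As a rough illustration of the entropy-loss Bayes estimator discussed above, the sketch below treats the simpler complete-sample case (not the fixed-time censored case of the paper) for a Pareto Type I likelihood with known unit scale and a Gamma(a, b) prior. Under entropy loss the Bayes estimate is 1/E[1/θ | data] = (a + n - 1)/(b + T) with T = Σ log x_i; all parameter values are illustrative:

```python
import math
import random

def pareto_sample(n, theta, sigma=1.0, seed=1):
    """Inverse-CDF sampling from a Pareto Type I distribution."""
    rng = random.Random(seed)
    return [sigma * (1.0 - rng.random()) ** (-1.0 / theta) for _ in range(n)]

def bayes_entropy_loss(xs, a, b, sigma=1.0):
    """Bayes estimate of theta under entropy loss with a Gamma(a, b) prior.

    The posterior is Gamma(a + n, b + T) with T = sum(log(x/sigma)), so the
    entropy-loss Bayes estimate 1/E[1/theta | data] equals (a + n - 1)/(b + T).
    """
    n = len(xs)
    T = sum(math.log(x / sigma) for x in xs)
    return (a + n - 1.0) / (b + T)

xs = pareto_sample(5000, theta=2.5)
print(bayes_entropy_loss(xs, a=2.0, b=1.0))  # should be close to 2.5
```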
A New Definition and Calculation Model for Evolutionary Multi-Objective Optimization
Zhou Ai-min; Kang Li-shan; Chen Yu-ping; Huang Yu-zhen
2003-01-01
We present a new definition (Evolving Solutions) for the multi-objective optimization problem (MOP) to answer the basic question of what a multi-objective optimal solution is, and we advance an asynchronous evolutionary model (the MINT model) to solve MOPs. The new theory is based on our understanding of natural evolution and on an analysis of the difference between natural evolution and MOPs; it therefore differs both from converting optimization and from Pareto optimization. Tests show that the new theory can overcome the disadvantages of these two methods to some extent.
Safety management in NPPs using an evolutionary algorithm technique
Mishra, Alok [Nuclear Power Corporation of India Limited, NUB Ent-2, Anushakti Nagar, Mumbai (India)]. E-mail: alok@kkhq.net; Patwardhan, Anand [Indian Institute of Technology Bombay (India)]; Verma, A.K. [Indian Institute of Technology Bombay (India)]
2007-07-15
The general goal of safety management in Nuclear Power Plants (NPPs) is to make requirements and activities more risk-effective and less costly. The technical specification and maintenance (TS and M) activities in a plant are associated with controlling risk or with satisfying requirements, and are candidates to be evaluated for their resource effectiveness in risk-informed applications. Accordingly, the risk-based analysis of technical specification (RBTS) is being considered in evaluating current TS. The multi-objective optimization of the TS and M requirements of a NPP based on risk and cost gives the Pareto-optimal solutions, from which the utility can pick the decision variables suiting its interest. In this paper, a multi-objective evolutionary algorithm technique has been used to make a trade-off between risk and cost, both at the system level and at the plant level, for loss of coolant accident (LOCA) and main steam line break (MSLB) as initiating events.
LI Hui; LIAN Jijian
2008-01-01
A multi-objective optimal operation model of water, sedimentation, and power in a reservoir is established, taking power generation, sedimentation, and water storage into account. Moreover, an inertia-weight self-adjusting mechanism and a Pareto-optimal archive are introduced into particle swarm optimization, and an improved multi-objective particle swarm optimization (IMOPSO) is proposed. IMOPSO is employed to solve the optimal model and obtain the Pareto-optimal front. The multi-objective optimal operation of Wanjiazhai Reservoir during the spring breakup was investigated with three typical flood hydrographs. The results show that the method is able to obtain a Pareto-optimal front with a uniform distribution property. Different regions (A, B, C) of the Pareto-optimal front correspond to schemes optimized for sediment deposition, for both sediment deposition and power generation, and for power generation, respectively. The level hydrographs and outflow hydrographs show the operation of the reservoir in detail. Compared with the non-dominated sorting genetic algorithm II (NSGA-II), IMOPSO has comparable global optimization capability and is suitable for multi-objective optimization problems.
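The Pareto-optimal archive mentioned above can be maintained with a plain dominance filter; the sketch below (minimization convention, not the authors' IMOPSO implementation) shows the archive update step:

```python
def dominates(a, b):
    """a dominates b under minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate, dropping any archive member it dominates;
    reject it if an existing member dominates it."""
    if any(dominates(m, candidate) for m in archive):
        return archive
    return [m for m in archive if not dominates(candidate, m)] + [candidate]

archive = []
for point in [(3, 5), (4, 4), (2, 6), (3, 3), (5, 1)]:
    archive = update_archive(archive, point)
print(archive)  # only mutually non-dominated points remain
```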
Wismans, L.J.J.; Brands, T.; Berkum, van E.C.; Bliemer, M.C.J.
2014-01-01
Solving the multi-objective network design problem (MONDP) yields a Pareto-optimal set. This set can provide additional information, such as trade-offs between objectives, for the decision-making process; this information is not available if the compensation principle is chosen in advance. However, the Pa
Abdalroof M.S.; Zhao Zhi-wen; Wang De-hui
2014-01-01
In this paper, the estimation of parameters based on a progressively Type-I interval-censored sample from a Pareto distribution is studied. Different methods of estimation are discussed, including the mid-point approximation estimator, the maximum likelihood estimator, and the moment estimator. The estimation procedures are discussed in detail and compared via Monte Carlo simulations in terms of their biases.
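For the simpler complete-sample case (the paper treats progressively Type-I interval-censored data), the maximum likelihood and moment estimators of the Pareto shape parameter with unit scale reduce to closed forms, sketched below with illustrative parameters:

```python
import math
import random

def pareto_sample(n, alpha, seed=7):
    """Unit-scale Pareto Type I samples via the inverse CDF."""
    rng = random.Random(seed)
    return [(1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def mle(xs):
    """Maximum likelihood estimator: n / sum(log x_i)."""
    return len(xs) / sum(math.log(x) for x in xs)

def moment(xs):
    """Moment estimator: the sample mean m estimates alpha/(alpha - 1)."""
    m = sum(xs) / len(xs)
    return m / (m - 1.0)

samples = pareto_sample(20000, alpha=3.0)
print(mle(samples), moment(samples))  # both near 3.0 for large n
```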
Evolutionary molecular medicine.
Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S
2012-05-01
Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.
Numerical and Evolutionary Optimization Workshop
Trujillo, Leonardo; Legrand, Pierrick; Maldonado, Yazmin
2017-01-01
This volume comprises a selection of works presented at the Numerical and Evolutionary Optimization (NEO) workshop held in September 2015 in Tijuana, Mexico. The development of powerful search and optimization techniques is of great importance in today's world, which requires researchers and practitioners to tackle a growing number of challenging real-world problems. In particular, there are two well-established and widely known fields that are commonly applied in this area: (i) traditional numerical optimization techniques and (ii) comparatively recent bio-inspired heuristics. Both paradigms have their unique strengths and weaknesses, allowing them to solve some challenging problems while still failing in others. The goal of the NEO workshop series is to bring together people from these and related fields to discuss, compare and merge their complementary perspectives in order to develop fast and reliable hybrid methods that maximize the strengths and minimize the weaknesses of the underlying paradigms. Throu...
Pareto, Mosca, and the Methodology of a New Political Science
Bianchi,Alvaro
2016-01-01
The article investigates the methodological debates in the process of institutionalization of Italian political science at the end of the nineteenth century and the beginning of the twentieth. Gaetano Mosca and Vilfredo Pareto sought a scientific knowledge of politics built according to parameters inspired by the natural sciences. Mosca argued strongly for a historical method that would find in institutions the psychological forces guaranteeing the regularity of political phenomena. ...
Pareto Joint Inversion of 2D Magnetotelluric and Gravity Data — Towards Practical Applications
Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek
2016-10-01
In this paper, a Pareto-inversion-based global optimization approach for obtaining joint inversion results from two types of geophysical data sets is formulated. 2D magnetotelluric and gravity data were used for tests, but the presented solution is flexible enough to be used for a combination of any two or more target functions, as long as misfits can be calculated and forward problems solved. To minimize the dimensionality of the solution space and introduce straightforward regularization, the Sharp Boundary Interface (SBI) method was applied. Particle Swarm Optimization (PSO) was used as the main optimization engine. Synthetic examples based on a real geological model were used to test the proposed approach and show its usefulness in practical applications.
AN ECONOMIC RELIABILITY EFFICIENT GROUP ACCEPTANCE SAMPLING PLAN FOR THE FAMILY OF PARETO DISTRIBUTIONS
Muhammad Ismail
2013-12-01
The present research article deals with an economic reliability efficient group acceptance sampling plan for time-truncated tests based on the total number of failures, assuming that the lifetime of a product follows the family of Pareto distributions. The plan is proposed for the case in which multiple products can be observed simultaneously as a group in a tester. The minimum termination time required for a given group size and acceptance number is determined such that the producer and consumer risks are satisfied for a specified quality level, while the number of groups and the number of testers are pre-assumed. Comparison studies between the proposed plan and the existing plan are made on the basis of minimum termination time. Two real examples are also discussed.
Statistical inferences with jointly type-II censored samples from two Pareto distributions
Abu-Zinadah, Hanaa H.
2017-08-01
In several industries the product comes from more than one production line, and comparative life tests are required. This calls for sampling from the different production lines, which leads to a joint censoring scheme. In this article we consider the Pareto lifetime distribution with a jointly Type-II censoring scheme. The maximum likelihood estimators (MLE) and the corresponding approximate confidence intervals, as well as the bootstrap confidence intervals, of the model parameters are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo simulation results are presented to assess the performance of the proposed method.
S.K. Barik
2015-06-01
In many real-life decision-making problems, probabilistic fuzzy goal programming is used when some of the input parameters are random variables and the aspiration levels are fuzzy. In the present paper, a linearly constrained probabilistic fuzzy goal programming problem is presented in which the right-hand-side parameters of some constraints follow a Pareto distribution with known mean and variance, and the aspiration levels are fuzzy. Further, simple, weighted, and preemptive additive approaches are discussed for the probabilistic fuzzy goal programming model. These additive approaches are employed to aggregate the membership values and form crisp equivalent deterministic models, which are then solved using standard linear mathematical programming techniques. The developed methodology and solution procedures are illustrated with a numerical example.
Linear discrete-time Pareto-Nash-Stackelberg control problem and principles for its solving
Valeriu Ungureanu
2013-04-01
A direct, straightforward method for solving the linear discrete-time optimal control problem is applied to the control of a linear discrete-time system as a mixture of multi-criteria Stackelberg and Nash games. For simplicity, the exposition starts with the simplest case of the linear discrete-time optimal control problem and, by sequentially considering more general cases, concludes with the highlighted Pareto-Nash-Stackelberg and set-valued control problems. Different solution principles are compared and their equivalence is proved. Mathematics Subject Classification 2010: 49K21, 49N05, 93C05, 93C55, 90C05, 90C29, 91A10, 91A20, 91A44, 91A50.
Pareto Optimization for Rescheduling Problems
慕运动; 许小艳
2009-01-01
To satisfy the requirements of the original customers while limiting the disruption to the jobs themselves, this paper studies the trade-off between the total objective value of the original jobs and the degree of job disruption, that is, the Pareto-optimal solutions of rescheduling problems. For Pareto optimization between maximum lateness and job disruption, polynomial-time algorithms or computational complexity results are provided.
PARETO ANALYSIS OF TOTAL QUALITY MANAGEMENT FACTORS CRITICAL TO SUCCESS FOR SERVICE INDUSTRIES
Faisal Talib
2010-06-01
Total quality management (TQM) is an integrated management approach that aims to continuously improve the performance of products, processes, and services in order to achieve and surpass customers' expectations. To accomplish this objective, the key factors that contribute to the success of TQM efforts, often termed critical success factors (CSFs), must be identified. The purpose of the present study is to identify and propose a list of "vital few" TQM CSFs for the benefit of researchers and service-industry practitioners. The quality tool Pareto analysis was used to sort and arrange the CSFs in order of criticality, and a few vital CSFs were identified and reported. The results of this study will help in the successful implementation of TQM programs in organizations. Managerial implications, research recommendations, and the scope for future research are presented at the end.
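Pareto analysis as used above ranks factors and keeps the "vital few" that account for most of the cumulative effect. A minimal sketch with hypothetical CSF scores (not the study's data):

```python
def pareto_analysis(factors, cutoff=0.8):
    """Sort factors by score and return the 'vital few' that together
    account for up to `cutoff` of the total."""
    ranked = sorted(factors.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(factors.values())
    vital, running = [], 0.0
    for name, score in ranked:
        if running >= cutoff * total:
            break
        vital.append(name)
        running += score
    return vital

# Hypothetical CSF frequency counts, for illustration only.
csfs = {"top-management commitment": 40, "customer focus": 25,
        "training": 15, "process management": 10,
        "supplier quality": 6, "benchmarking": 4}
print(pareto_analysis(csfs))
```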
Pareto's Law of Income Distribution: Evidence for Germany, the United Kingdom, and the United States
Clementi, F
2005-01-01
We analyze three sets of income data: the US Panel Study of Income Dynamics (PSID), the British Household Panel Survey (BHPS), and the German Socio-Economic Panel (GSOEP). It is shown that the empirical income distribution is consistent with a two-parameter lognormal function for the low-middle income group (97%-99% of the population), and with a Pareto or power-law function for the high income group (1%-3% of the population). This mixture of two qualitatively different analytical distributions seems stable over the years covered by our data sets, although their parameters significantly change in time. It is also found that the probability density of income growth rates has almost the form of an exponential function.
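A standard way to estimate the Pareto (power-law) exponent of the high-income tail described above is the Hill estimator; the sketch below applies it to synthetic data with a known tail index (the PSID/BHPS/GSOEP data are not reproduced here):

```python
import math
import random

def hill_estimator(xs, k):
    """Hill estimate of the Pareto tail index from the k largest observations."""
    ys = sorted(xs, reverse=True)[:k + 1]
    return k / sum(math.log(ys[i] / ys[k]) for i in range(k))

rng = random.Random(42)
# Synthetic 'incomes' with a pure Pareto tail of index 2.0, illustrative only.
data = [(1.0 - rng.random()) ** (-1.0 / 2.0) for _ in range(50000)]
print(hill_estimator(data, k=1000))  # near 2.0
```

In practice, the estimate is computed over a range of k values, and the stable plateau is read off as the tail index.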
Risk finance for catastrophe losses with Pareto-calibrated Lévy-stable severities.
Powers, Michael R; Powers, Thomas Y; Gao, Siwei
2012-11-01
For catastrophe losses, the conventional risk finance paradigm of enterprise risk management identifies transfer, as opposed to pooling or avoidance, as the preferred solution. However, this analysis does not necessarily account for differences between light- and heavy-tailed characteristics of loss portfolios. Of particular concern are the decreasing benefits of diversification (through pooling) as the tails of severity distributions become heavier. In the present article, we study a loss portfolio characterized by nonstochastic frequency and a class of Lévy-stable severity distributions calibrated to match the parameters of the Pareto II distribution. We then propose a conservative risk finance paradigm that can be used to prepare the firm for worst-case scenarios with regard to both (1) the firm's intrinsic sensitivity to risk and (2) the heaviness of the severity's tail.
Li Wei; Hai-liang Yang
2004-01-01
In this paper we first consider a risk process in which the claim inter-arrival times and the time until the first claim have an Erlang(2) distribution. An explicit solution is derived for the probability of ultimate ruin, given an initial reserve u, when the claim size follows a Pareto distribution. Following Ramsay [8], Laplace transforms and exponential integrals are used to derive the solution, which involves a single integral of real-valued functions along the positive real line whose integrand is not of an oscillating kind. We then show that the ultimate ruin probability can be expressed as the sum of expected values of functions of two different Gamma random variables. Finally, the results are extended to the Erlang(n) case. Numerical examples are given to illustrate the main results.
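The explicit Erlang(2)/Pareto result above can be cross-checked by crude Monte Carlo simulation; the sketch below estimates a finite-horizon ruin probability (a lower bound on ultimate ruin) with illustrative premium and claim parameters, not the paper's numerical examples. The premium income per claim cycle (c times the mean wait of 2) exceeds the mean claim of 5/3, so the loading is positive:

```python
import random

def simulate_ruin(u, c, n_paths=1000, n_claims=300, seed=3):
    """Finite-horizon Monte Carlo ruin probability for a risk process with
    Erlang(2) inter-arrival times (two Exp(1) stages) and Pareto(2.5) claims.
    Premium rate c per unit time; all parameters are illustrative."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        surplus = u
        for _ in range(n_claims):
            wait = rng.expovariate(1.0) + rng.expovariate(1.0)   # Erlang(2)
            claim = (1.0 - rng.random()) ** (-1.0 / 2.5)         # Pareto, mean 5/3
            surplus += c * wait - claim
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

print(simulate_ruin(u=10.0, c=1.0))
```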
Modeling Pareto efficient PM10 control policies in Northern Italy to reduce health effects
Pisoni, Enrico; Volta, Marialuisa
High PM10 concentrations can cause human health problems related to both short-term and long-term exposure to particles. In this work the impact of efficient PM10 control policies in Northern Italy is assessed by means of a two-stage methodology. In the first stage a multi-objective optimization approach is applied: the problem defines two control objectives to be minimized (the emission reduction costs and an air quality index) by varying the decision variables (precursor emission reductions). The solutions of the multi-objective problem are the Pareto-efficient PM10 control policies. In the second stage, the ExternE methodology is applied to estimate health impacts and external costs for the efficient emission reduction scenarios computed in the first stage. The methodology has been applied to the Lombardia region, one of the most polluted areas in Europe.
Pareto Optimal Solutions for Stochastic Dynamic Programming Problems via Monte Carlo Simulation
R. T. N. Cardoso
2013-01-01
A heuristic algorithm is proposed for a class of stochastic discrete-time continuous-variable dynamic programming problems subject to non-Gaussian disturbances. Instead of using the expected values of the objective function, the random nature of the decision variables is kept along the process, and Pareto fronts weighted by all quantiles of the objective function are determined, so decision makers are able to choose any quantile they wish. This idea is carried out by using Monte Carlo simulations embedded in an approximate algorithm proposed for deterministic dynamic programming problems. The new method is tested on instances of the classical inventory control problem. The results attest to the efficiency and efficacy of the algorithm in solving these important stochastic optimization problems.
(Anonymous)
2006-01-01
The generalized Pareto distribution model is a probabilistic method for assessing hydrocarbon pool sizes in resource assessment. By introducing a time variable, the resource conversion rate, and a geological variable, the resource density, the model can describe not only different types of basins but also exploration samples at any phase of exploration, up to the parent population. It is a dynamic distribution model with profound geological significance and wide applicability. Its basic principle and the process of resource assessment are described in this paper; the petroleum accumulation system is an appropriate assessment unit for this method. The hydrocarbon resource structure of the Huanghua Depression in the Bohai Bay Basin was predicted using this model. The prediction results accord with exploration knowledge of the Huanghua Depression and point out the remaining resource potential and the structure of different petroleum accumulation systems, which is of great significance for guiding future exploration in the Huanghua Depression.
Rania, M. Shalaby
2015-10-01
This paper deals with Bayesian and non-Bayesian methods for estimating the parameters of the bivariate Pareto (BP) distribution based on censored samples, with shape parameters λ and known scale parameter β. The maximum likelihood estimators (MLE) of the unknown parameters are derived. The Bayes estimators are obtained with respect to the squared error loss function, with prior distributions that allow for dependence among the components of the parameter vector. Posterior distributions for the parameters of interest are derived and their properties described. If the scale parameter is known, the Bayes estimators of the unknown parameters can be obtained in explicit form under the assumption of independent priors. An extensive computer simulation, using MathCAD 14, is used to compare the performance of the proposed estimators.
Patient feature based dosimetric Pareto front prediction in esophageal cancer radiotherapy
Wang, Jiazhou; Zhao, Kuaike; Peng, Jiayuan; Xie, Jiang; Chen, Junchao; Zhang, Zhen; Hu, Weigang, E-mail: jackhuwg@gmail.com [Department of Radiation Oncology, Fudan University Shanghai Cancer Center, Shanghai 200032, China and Department of Oncology, Shanghai Medical College, Fudan University, Shanghai 200032 (China); Jin, Xiance [The 1st Affiliated Hospital of Wenzhou Medical College, Wenzhou, Zhejiang 325000 (China); Studenski, Matthew [Department of Radiation Oncology, University of Miami-Miller School of Medicine, Miami, Florida 33136 (United States)
2015-02-15
Purpose: To investigate the feasibility of dosimetric Pareto front (PF) prediction based on patients' anatomic and dosimetric parameters for esophageal cancer patients. Methods: Eighty esophageal cancer patients in the authors' institution were enrolled in this study. A total of 2928 intensity-modulated radiotherapy plans were obtained and used to generate a PF for each patient; on average, each patient had 36.6 plans. The anatomic and dosimetric features were extracted from these plans. The mean lung dose (MLD), mean heart dose (MHD), spinal cord max dose, and PTV homogeneity index were recorded for each plan. Principal component analysis was used to extract overlap volume histogram (OVH) features between the PTV and other organs at risk. The full dataset was separated into two parts, a training dataset and a validation dataset; the prediction outcomes were the MHD and MLD. Spearman's rank correlation coefficient was used to evaluate the correlation between the anatomical features and dosimetric features. The stepwise multiple regression method was used to fit the PF, and the cross-validation method was used to evaluate the model. Results: With 1000 repetitions, the mean prediction error of the MHD was 469 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between heart and PTV along the Z-axis. The mean prediction error of the MLD was 284 cGy. The most correlated factors were the first principal component of the OVH between heart and PTV and the overlap between lung and PTV along the Z-axis. Conclusions: It is feasible to use patients' anatomic and dosimetric features to generate a predicted Pareto front. Additional samples and further studies are required to improve the prediction model.
Martin Klammer
Multivariate biomarkers that can predict the effectiveness of targeted therapy in individual patients are highly desired. Previous biomarker discovery studies have largely focused on the identification of single biomarker signatures aimed at maximizing prediction accuracy. Here, we present a different approach that identifies multiple biomarkers by simultaneously optimizing their predictive power, number of features, and proximity to the drug target in a protein-protein interaction network. To this end, we incorporated NSGA-II, a fast and elitist multi-objective optimization algorithm based on the principle of Pareto optimality, into the biomarker discovery workflow. The method was applied to quantitative phosphoproteome data of 19 non-small cell lung cancer (NSCLC) cell lines from a previous biomarker study. The algorithm successfully identified a total of 77 candidate biomarker signatures predicting response to treatment with dasatinib. Through filtering and similarity clustering, this set was trimmed to four final biomarker signatures, which were then validated on an independent set of breast cancer cell lines. All four candidates reached the same good prediction accuracy (83%) as the originally published biomarker. Although the newly discovered signatures were diverse in their composition and size, the central protein of the originally published signature, integrin β4 (ITGB4), was also present in all four Pareto signatures, confirming its pivotal role in predicting dasatinib response in NSCLC cell lines. In summary, the method presented here allows a robust and simultaneous identification of multiple multivariate biomarkers that are optimized for prediction performance, size, and relevance.
Remembering the evolutionary Freud.
Young, Allan
2006-03-01
Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded to be a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.
Raffaele Federici
2017-08-01
In this search for meaning between the end of an era and a new vision of the world, there is in the two authors what might be called a betweenness: Pareto, almost a Franco-Italian, and Michels, an Italian-German, indeed more than Italian. Along the fault line represented by the First World War, the two sociologists stand in a double interior relation, Franco-Italian for Pareto and Italo-German for Michels, and in an exterior relation between the world of yesterday and the world that followed the cataclysm of the First World War, when four colossal empires were dismembered (the Russian, German, Austro-Hungarian, and Ottoman empires), at the same time as Emile Durkheim looked with anxiety at the disintegration of the old traditional communities, when the sense of the crisis of the age affected not only people and behaviors but the logical world itself. Their correspondence took place in the same land: Pareto at Celigny, on Lake Geneva, and Michels at Basel, along the banks of the Rhine. Between the two sociologists there was a deep respect: Robert Michels dedicated to the "scientist and friend Vilfredo Pareto, with veneration" an important work, Problems of Applied Sociology, published only three years after the master's Treatise on General Sociology. In this anthology of essays, probably composed between 1914 and 1917, in the years of the great cataclysm, indeed conceived before "the installation of that terrible supreme court of cassation of all our ideologies, which is the war", and thus contemporary with the Treatise, the master is cited three times, as is Max Weber, but de facto Pareto's presence is continuous. In particular, the reference to the master follows two lines of research: on the one hand, the reality of sociological research and its very broad spectrum of analysis; on the other, the theory of the circulation of elites. It is precisely
李琼; 武东
2012-01-01
A Bayesian analysis of the Pareto distribution under progressive Type-II censoring is given. Bayesian estimates of the model parameters are obtained using the Markov chain Monte Carlo method. Monte Carlo simulation and a worked example show that the Bayesian estimation is effective.
Biotic interaction strength and the intensity of selection.
Benkman, Craig W
2013-08-01
Although the ecological and evolutionary impacts of species interactions have been the foci of much research, the relationship between the strength of species interactions and the intensity of selection has been investigated only rarely. I develop a simple model demonstrating how the opportunity for selection varies with interaction strength, and then use the relationship between the maximum value of the selection differential and the opportunity for selection (Arnold & Wade 1984) to evaluate how selection differentials vary in relation to species interaction strength. This model predicts an initial deceleration and then an accelerating increase in the intensity of selection with increasing strength of antagonistic interactions and with decreasing strength of mutualistic interactions. Empirical data from several studies provide support for this model. These results further support an evolutionary mechanism for some striking patterns of evolutionary diversification including the latitudinal species gradient, and should be relevant to studies of eco-evolutionary dynamics.
Evolutionary humanoid robotics
Eaton, Malachy
2015-01-01
This book examines how two distinct strands of research on autonomous robots, evolutionary robotics and humanoid robot research, are converging. The book will be valuable for researchers and postgraduate students working in the areas of evolutionary robotics and bio-inspired computing.
南岱; 钟汉明
2014-01-01
The deepwater pipe-laying crane vessel (DPV) project is a huge systems engineering effort: many factors affect construction quality, and many hidden safety risks exist on site. In the QHSE management process, Pareto charts are used to classify each problem point by phenomenon and cause, so that the important problems are discovered and corresponding measures can be taken, overcoming the difficulties faced in QHSE management.
陈黎明; 赵辉
2011-01-01
Among the many participants in a PPP project, this paper selects the three principal ones: the project promoter, the SPC and the lending bank. By analysing the internal relations between promoter and SPC, SPC and lending bank, and lending bank and promoter, and building on Pareto optimality theory, an Edgeworth box of the participants' pairwise complementary advantages is obtained. According to Walrasian equilibrium theory, a PPP project can achieve Pareto optimality and maximize project utility; finally, the utility possibility curve of the three main participants is drawn.
Application of the Pareto positive stable distribution in insurance claims
玄海燕; 包海明; 史永侠
2015-01-01
In this paper, we study the application of the Pareto positive stable distribution in insurance. Parameter estimates for the Pareto positive stable, normal and Pareto distributions are obtained by maximum likelihood. By the Akaike information criterion, the Pareto positive stable distribution is shown to fit the insurance data well.
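The model-choice step described above can be sketched with the plain Pareto, whose maximum-likelihood estimator is closed-form (the positive stable variant itself requires numerical optimisation and is omitted here); the AIC comparison then works exactly as in the abstract. The claim data below are simulated, not the paper's.

```python
import random
import math
import statistics

random.seed(1)
alpha_true, xm = 2.5, 1.0
# simulate Pareto claims via inverse CDF: X = xm * (1 - U)^(-1/alpha)
claims = [xm * (1.0 - random.random()) ** (-1.0 / alpha_true) for _ in range(2000)]

n = len(claims)
# Pareto MLE with known minimum xm: alpha_hat = n / sum(ln(x / xm))
s = sum(math.log(x / xm) for x in claims)
alpha_hat = n / s
ll_pareto = (n * math.log(alpha_hat) + n * alpha_hat * math.log(xm)
             - (alpha_hat + 1) * sum(math.log(x) for x in claims))
aic_pareto = 2 * 1 - 2 * ll_pareto          # one free parameter

# Gaussian benchmark fitted to the same data
mu, sd = statistics.fmean(claims), statistics.pstdev(claims)
ll_norm = sum(-0.5 * math.log(2 * math.pi * sd * sd)
              - (x - mu) ** 2 / (2 * sd * sd) for x in claims)
aic_norm = 2 * 2 - 2 * ll_norm              # two free parameters
# the lower AIC wins: heavy-tailed claims favour the Pareto model
```

For heavy-tailed claim data the Pareto AIC comes out far below the Gaussian one, which mirrors the information-criterion argument the abstract makes for the positive stable fit.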
Periaux, Jacques; Lee, Dong Seop Chris
2015-01-01
Many complex aeronautical design problems can be formulated with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolution tools in the solution, or the set of solutions, of single or multi-disciplinary optimization. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models, including precise, intermediate and approximate models, with each node belonging to a different hierarchical layer handled by a different evolutionary algorithm. The efficiency of evolutionary algorithms for both single- and multi-objective optimization problems is significantly improved by the coupling of EAs with games, and in particular by a new dynamic methodology named "Hybridized Nash-Pareto games". Multi-objective optimization techniques and robust design problems taking into account uncertainties are introduced and explained in detail. Several applications dealing with c...
Erik Brynjolfsson; Yu Hu; Duncan Simester
2007-01-01
Many markets have historically been dominated by a small number of best-selling products. The Pareto principle, also known as the 80/20 rule, describes this common pattern of sales concentration. However, information technology in general and Internet markets in particular have the potential to substantially increase the collective share of niche products, thereby creating a longer tail in the distribution of sales. This paper investigates the Internet's "long tail" phenomenon. By analyzing d...
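The 80/20 pattern the authors refer to is easy to reproduce: for Pareto-distributed sales with a tail index near 1.16, the top 20% of products capture roughly 80% of revenue. A small simulation with made-up sales figures (the tail index and sample are assumptions, not the paper's data):

```python
import random

random.seed(2)
alpha = 1.16                      # tail index for which the 80/20 rule roughly holds
# Pareto sales via inverse CDF: S = (1 - U)^(-1/alpha)
sales = [(1.0 - random.random()) ** (-1.0 / alpha) for _ in range(100000)]

sales.sort(reverse=True)
top20 = sales[: len(sales) // 5]
share = sum(top20) / sum(sales)   # revenue share of the best-selling 20% of products
# share comes out near 0.8; a flatter distribution (larger alpha) would shift
# revenue toward the niche tail, which is the "long tail" effect studied here
```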
... strengthens your heart and lungs. When you strength train with weights, you're using your muscles to ... see there are lots of different ways to train with weights. Try a few good basic routines ...
... Strength training is a vital part of a balanced exercise routine that includes aerobic activity and flexibility exercises. Regular aerobic exercise, such as running or ...
Jamali, A.; Khaleghi, E.; Gholaminezhad, I.; Nariman-zadeh, N.
2016-05-01
In this paper, a new multi-objective genetic programming (GP) with a diversity preserving mechanism and a real number alteration operator is presented and successfully used for Pareto optimal modelling of some complex non-linear systems using input-output data. In this study, two different input-output data-sets, of a non-linear mathematical model and of an explosive cutting process, are considered separately in three-objective optimisation processes. The conflicting objective functions considered for such Pareto optimisations are the training error (TE), the prediction error (PE), and the length of tree (complexity of the network) (TL) of the GP models. Such three-objective optimisation implementations lead to non-dominated choices of GP-type models for both cases, representing the trade-offs among those objective functions. Therefore, optimal Pareto fronts of such GP models exhibit the trade-off among the corresponding conflicting objectives and, thus, provide different non-dominated optimal choices of GP-type models. Moreover, the results show that no significant improvement in TE and PE occurs when the TL of the corresponding GP model exceeds certain values.
Amanifard, N.; Nariman-Zadeh, N.; Borji, M.; Khalkhali, A.; Habibdoust, A. [Department of Mechanical Engineering, The University of Guilan, P.O. Box 3756, Rasht (Iran)
2008-02-15
Three-dimensional heat transfer characteristics and pressure drop of water flow in a set of rectangular microchannels are numerically investigated using Fluent and compared with experimental results. Two metamodels based on the evolved group method of data handling (GMDH) type neural networks are then obtained for modelling both pressure drop (ΔP) and Nusselt number (Nu) with respect to design variables such as the geometrical parameters of the microchannels, the amount of heat flux and the Reynolds number. Using the obtained polynomial neural networks, a multi-objective genetic algorithm (the non-dominated sorting genetic algorithm, NSGA-II) with a new diversity preserving mechanism is then used for Pareto-based optimization of the microchannels considering the two conflicting objectives ΔP and Nu. It is shown that some interesting and important relationships, as useful optimal design principles involved in the performance of microchannels, can be discovered by Pareto-based multi-objective optimization of the obtained polynomial metamodels representing their heat transfer and flow characteristics. Such important optimal principles would not have been obtained without the use of both GMDH type neural network modelling and the Pareto optimization approach. (author)
Enrique Carlos Canessa-Terrazas
2016-01-01
Full Text Available The use of Data Envelopment Analysis (DEA) is presented for prioritizing and selecting solutions found by a Pareto Genetic Algorithm (PGA) to robust design problems in multi-response systems with many control and noise factors. The efficiency analysis of the solutions with DEA shows that the PGA finds a good approximation to the efficient frontier. DEA is also used to determine the combination of mean-adjustment level and variation of the system responses that minimizes the economic cost of reaching those targets. By combining that cost with other technical and/or economic considerations, the solution that best fits a predetermined quality level can be selected more appropriately.
Wei Wang
2015-01-01
Full Text Available This paper presents a parameter estimation method based on a coupled hydromechanical model of dynamic compaction and the Pareto multiobjective optimization technique. The hydromechanical model of dynamic compaction is established in the FEM program LS-DYNA. The multiobjective optimization algorithm, the Nondominated Sorting Genetic Algorithm (NSGA-II), is integrated with the numerical model to identify soil parameters using multiple sources of field data. A field case study is used to demonstrate the capability of the proposed method. The observed pore water pressure and crater depth at early blows of dynamic compaction are simultaneously used to estimate the soil parameters. Robustness of the back-estimated parameters is further illustrated by a forward prediction. Results show that the back-analyzed soil parameters can reasonably predict lateral displacements and give generally acceptable predictions of dynamic compaction for an adjacent location. In addition, for prediction of the ground response of dynamic compaction at continuous blows, the prediction based on the second blow is more accurate than that based on the first blow, due to the hardening and strengthening of soil during continuous compaction.
Pareto-Optimal Evaluation of Ultimate Limit States in Offshore Wind Turbine Structural Analysis
Michael Muskulus
2015-12-01
Full Text Available The ultimate capacity of support structures is checked with extreme loads. This is straightforward when the limit state equations depend on a single load component, and it has become common to report maxima for each load component. However, if more than one load component is influential, e.g., both axial force and bending moments, it is not straightforward how to define an extreme load. The combination of univariate maxima can be too conservative, and many different combinations of load components can result in the worst value of the limit state equations. The use of contemporaneous load vectors is typically non-conservative. Therefore, in practice, limit state checks are done for each possible load vector, from each time step of a simulation. This is not feasible when performing reliability assessments and structural optimization, where additional, time-consuming computations are involved for each load vector. We therefore propose to use Pareto-optimal loads, which are a small set of loads that together represent all possible worst case scenarios. Simulations with two reference wind turbines show that this approach can be very useful for jacket structures, whereas the design of monopiles is often governed by the bending moment only. Even in this case, the approach might be useful when approaching the structural limits during optimization.
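The "small set of loads that together represent all possible worst case scenarios" is the non-dominated subset of the simulated load vectors. A generic dominance filter makes the idea concrete; the load numbers are illustrative, not from the paper, and each component (say axial force and bending moment) is taken to be maximised:

```python
def pareto_front(points):
    """Return the points not dominated by any other (maximising every component)."""
    def dominates(p, q):
        # p dominates q: at least as large in every component, larger in one
        return all(a >= b for a, b in zip(p, q)) and any(a > b for a, b in zip(p, q))
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# hypothetical (axial force, bending moment) pairs from simulation time steps
loads = [(1.0, 2.0), (2.0, 1.0), (1.5, 1.5), (0.5, 0.5), (1.0, 1.9)]
worst_cases = pareto_front(loads)
# only these non-dominated vectors need the expensive limit-state check,
# instead of every time step of every simulation
```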
Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.
Sophie Bertrand
Full Text Available How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
Modelling road accident blackspots data with the discrete generalized Pareto distribution.
Prieto, Faustino; Gómez-Déniz, Emilio; Sarabia, José María
2014-10-01
This study shows how road traffic networks events, in particular road accidents on blackspots, can be modelled with simple probabilistic distributions. We considered the number of crashes and the number of fatalities on Spanish blackspots in the period 2003-2007, from Spanish General Directorate of Traffic (DGT). We modelled those datasets, respectively, with the discrete generalized Pareto distribution (a discrete parametric model with three parameters) and with the discrete Lomax distribution (a discrete parametric model with two parameters, and particular case of the previous model). For that, we analyzed the basic properties of both parametric models: cumulative distribution, survival, probability mass, quantile and hazard functions, genesis and rth-order moments; applied two estimation methods of their parameters: the μ and (μ+1) frequency method and the maximum likelihood method; used two goodness-of-fit tests: Chi-square test and discrete Kolmogorov-Smirnov test based on bootstrap resampling; and compared them with the classical negative binomial distribution in terms of absolute probabilities and in models including covariates. We found that those probabilistic models can be useful to describe the road accident blackspots datasets analyzed.
Pareto genealogies arising from a Poisson branching evolution model with selection.
Huillet, Thierry E
2014-02-01
We study a class of coalescents derived from a sampling procedure out of N i.i.d. Pareto(α) random variables, normalized by their sum, including β-size-biasing on total length effects (β < α). Depending on the range of α we derive the large-N limit coalescent structure, leading either to a discrete-time Poisson-Dirichlet (α, -β) Ξ-coalescent (α ∈ [0, 1)), or to a family of continuous-time Beta(2 - α, α - β) Λ-coalescents (α ∈ [1, 2)), or to the Kingman coalescent (α ≥ 2). We indicate that this class of coalescent processes (and their scaling limits) may be viewed as the genealogical processes of some forward-in-time evolving branching population models including selection effects. In such constant-size population models, the reproduction step, which is based on a fitness-dependent Poisson point process with scaling power-law(α) intensity, is coupled to a selection step consisting of sorting out the N fittest individuals issued from the reproduction step.
Carapezza, Giovanni; Umeton, Renato; Costanza, Jole; Angione, Claudio; Stracquadanio, Giovanni; Papini, Alessio; Lió, Pietro; Nicosia, Giuseppe
2013-05-17
In this work, we develop methodologies for analyzing and cross-comparing metabolic models. We investigate three important metabolic networks to discuss the complexity of biological organization of organisms, modeling, and system properties. In particular, we analyze these metabolic networks because of their biotechnological and basic science importance: the photosynthetic carbon metabolism in a general leaf, the Rhodobacter sphaeroides bacterium, and the Chlamydomonas reinhardtii alga. We adopt single- and multi-objective optimization algorithms to maximize the CO2 uptake rate and the production of metabolites of industrial interest or for ecological purposes. We focus both on the level of genes (e.g., finding genetic manipulations to increase the production of one or more metabolites) and on finding enzyme concentrations for improving CO2 consumption. We find that R. sphaeroides is able to absorb up to 57.452 mmol h⁻¹ gDW⁻¹ of CO2, while C. reinhardtii reaches a maximum of 6.7331. We report that the Pareto front analysis proves extremely useful to compare different organisms, as well as providing the possibility to investigate them within the same framework. By using sensitivity and robustness analysis, our framework identifies the most sensitive and fragile components of the biological systems we take into account, allowing us to compare their models. We adopt identifiability analysis to detect functional relations among enzymes; we observe that RuBisCO, GAPDH, and FBPase belong to the same functional group, as suggested also by the sensitivity analysis.
Xiao-Yun Jiang
2013-12-01
Full Text Available The theory of constraints (TOC) supply chain replenishment system (TOC-SCRS) is an effective solution for coping with conflicts during the management of supply chain inventories. When it is deployed in a plant or a central warehouse with capacity constraints, TOC-SCRS encounters a two-part problem: first, how to establish a sound setup frequency (SF) that can meet production needs and prevent losses from stock-outs, i.e., an SF that allows the plant to make full use of its existing capacity to achieve maximal effective output; and second, how to establish a sound SF that can help the plant minimize its inventory and cut costs to the greatest possible extent. To resolve this problem, an SF optimization model for TOC-SCRS with capacity constraints is constructed in this paper and then used in combination with Pareto particle swarm optimization (PSO) to obtain SF optimization schemes. An illustrative example is conducted to verify the feasibility and effectiveness of the proposed approach.
Extending the Generalised Pareto Distribution for Novelty Detection in High-Dimensional Spaces.
Clifton, David A; Clifton, Lei; Hugueny, Samuel; Tarassenko, Lionel
2014-01-01
Novelty detection involves the construction of a "model of normality", and then classifies test data as being either "normal" or "abnormal" with respect to that model. For this reason, it is often termed one-class classification. The approach is suitable for cases in which examples of "normal" behaviour are commonly available, but in which cases of "abnormal" data are comparatively rare. When performing novelty detection, we are typically most interested in the tails of the normal model, because it is in these tails that a decision boundary between "normal" and "abnormal" areas of data space usually lies. Extreme value statistics provides an appropriate theoretical framework for modelling the tails of univariate (or low-dimensional) distributions, using the generalised Pareto distribution (GPD), which can be demonstrated to be the limiting distribution for data occurring within the tails of most practically-encountered probability distributions. This paper provides an extension of the GPD, allowing the modelling of probability distributions of arbitrarily high dimension, such as occurs when using complex, multimodal, multivariate distributions for performing novelty detection in most real-life cases. We demonstrate our extension to the GPD using examples from patient physiological monitoring, in which we have acquired data from hospital patients in large clinical studies of high-acuity wards, and in which we wish to determine "abnormal" patient data, such that early warning of patient physiological deterioration may be provided.
Generalized Pareto for Pattern-Oriented Random Walk Modelling of Organisms' Movements.
Bertrand, Sophie; Joo, Rocío; Fablet, Ronan
2015-01-01
How organisms move and disperse is crucial to understand how population dynamics relates to the spatial heterogeneity of the environment. Random walk (RW) models are typical tools to describe movement patterns. Whether Lévy or alternative RW better describes forager movements is keenly debated. We get around this issue using the Generalized Pareto Distribution (GPD). GPD includes as specific cases Normal, exponential and power law distributions, which underlie Brownian, Poisson-like and Lévy walks respectively. Whereas previous studies typically confronted a limited set of candidate models, GPD lets the most likely RW model emerge from the data. We illustrate the wide applicability of the method using GPS-tracked seabird foraging movements and fishing vessel movements tracked by Vessel Monitoring System (VMS), both collected in the Peruvian pelagic ecosystem. The two parameters from the fitted GPD, a scale and a shape parameter, provide a synoptic characterization of the observed movement in terms of characteristic scale and diffusive property. They reveal and quantify the variability, among species and individuals, of the spatial strategies selected by predators foraging on a common prey field. The GPD parameters constitute relevant metrics for (1) providing a synthetic and pattern-oriented description of movement, (2) using top predators as ecosystem indicators and (3) studying the variability of spatial behaviour among species or among individuals with different personalities.
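As a sketch of the kind of fit described above, the GPD can be estimated from step lengths by probability-weighted moments, which have closed forms (used here instead of the authors' actual fitting machinery; the step data are simulated). A fitted shape near 0 then points to an exponential, Poisson-like walk, while a clearly positive shape points to a Lévy-like heavy tail:

```python
import random

def gpd_pwm(sample):
    """Probability-weighted-moment fit of the GPD; returns (shape xi, scale sigma)."""
    x = sorted(sample)
    n = len(x)
    b0 = sum(x) / n
    # t1 estimates E[X * (1 - F(X))]: weight each order statistic by its
    # empirical survival probability
    t1 = sum(v * (n - i - 1) / (n - 1) for i, v in enumerate(x)) / n
    xi = 2.0 - b0 / (b0 - 2.0 * t1)
    sigma = 2.0 * b0 * t1 / (b0 - 2.0 * t1)
    return xi, sigma

random.seed(3)
# exponential step lengths = GPD with shape 0 (a Poisson-like walk)
steps = [random.expovariate(1.0 / 5.0) for _ in range(5000)]
shape, scale = gpd_pwm(steps)   # expect shape near 0 and scale near 5
```

Sanity check on the estimator: for exponential data b0 is the mean and t1 is a quarter of it, so the formulas return shape 0 and scale equal to the mean, as required.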
Multicriteria Optimization of Gasification Operational Parameters Using a Pareto Genetic Algorithm
Miguel Caldas
2005-04-01
Full Text Available Gasification is a well-known technology that allows a combustible gas to be obtained from a carbonaceous fuel by a partial oxidation process (POX). The resulting gas (synthesis gas, or syngas) can be used either as a fuel or as a feedstock for chemical production. Recently, gasification has also received a great deal of attention concerning power production through the IGCC (Integrated Gasification Combined Cycle) process, which is currently the most environmentally friendly and efficient method for the production of electricity. Gasification allows low-grade, or dirty, fuels to be used in an environmentally acceptable way. Amongst these fuels are wastes from the petrochemical and other industries, which vary in composition from shipment to shipment and from lot to lot. If operating conditions are kept constant, this could result in a loss of efficiency. This paper presents an application of Genetic Algorithms to optimize the operating parameters of a gasifier processing a given fuel, so that the system achieves maximum efficiency for each particular fuel composition. A Pareto multiobjective optimization method, combined with a Genetic Algorithm, is applied to the simultaneous maximization of two different objective functions: the cold gas efficiency and the hydrogen content of the syngas. Results show that the optimization method developed is fast and simple enough to be used for on-line adjustment of the gasification operating parameters for each fuel composition and aim of gasification, thus improving the overall performance of the industrial process.
Entropies of negative incomes, Pareto-distributed loss, and financial crises.
Jianbo Gao
Full Text Available Health monitoring of world economy is an important issue, especially in a time of profound economic difficulty world-wide. The most important aspect of health monitoring is to accurately predict economic downturns. To gain insights into how economic crises develop, we present two metrics, positive and negative income entropy and distribution analysis, to analyze the collective "spatial" and temporal dynamics of companies in nine sectors of the world economy over a 19 year period from 1990-2008. These metrics provide accurate predictive skill with a very low false-positive rate in predicting downturns. The new metrics also provide evidence of phase transition-like behavior prior to the onset of recessions. Such a transition occurs when negative pretax incomes prior to or during economic recessions transition from a thin-tailed exponential distribution to the higher entropy Pareto distribution, and develop even heavier tails than those of the positive pretax incomes. These features propagate from the crisis initiating sector of the economy to other sectors.
Modeling air quality in main cities of Peninsular Malaysia by using a generalized Pareto model.
Masseran, Nurulkamal; Razali, Ahmad Mahir; Ibrahim, Kamarulzaman; Latif, Mohd Talib
2016-01-01
The air pollution index (API) is an important figure used for measuring the quality of air in the environment. The API is determined based on the highest average value of individual indices for all the variables which include sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), ozone (O3), and suspended particulate matter (PM10) at a particular hour. API values that exceed the limit of 100 units indicate an unhealthy status for the exposed environment. This study investigates the risk of occurrences of API values greater than 100 units for eight urban areas in Peninsular Malaysia for the period of January 2004 to December 2014. An extreme value model, known as the generalized Pareto distribution (GPD), has been fitted to the API values found. Based on the fitted model, return period for describing the occurrences of API exceeding 100 in the different cities has been computed as the indicator of risk. The results obtained indicated that most of the urban areas considered have a very small risk of occurrence of the unhealthy events, except for Kuala Lumpur, Malacca, and Klang. However, among these three cities, it is found that Klang has the highest risk. Based on all the results obtained, the air quality standard in urban areas of Peninsular Malaysia falls within healthy limits to human beings.
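Given a fitted GPD for exceedances over a threshold u, and the observed rate of such exceedances, the return period of API > 100 used in the study follows directly from the GPD survival function. A worked sketch with invented parameter values (not those fitted for the Malaysian cities):

```python
def return_period(level, u, sigma, xi, lam):
    """Return period (years) of exceeding `level`, given GPD(sigma, xi) for
    excesses over threshold u and lam = mean number of exceedances of u per year."""
    z = 1.0 + xi * (level - u) / sigma
    if z <= 0:
        return float("inf")           # level lies beyond the distribution's support
    surv = z ** (-1.0 / xi)           # P(excess over u exceeds level - u)
    return 1.0 / (lam * surv)

# hypothetical fit: threshold 80, scale 10, shape 0.1, 20 exceedances per year
rp100 = return_period(100, u=80, sigma=10, xi=0.1, lam=20)
rp200 = return_period(200, u=80, sigma=10, xi=0.1, lam=20)
# higher API levels are rarer, so rp200 comes out much larger than rp100;
# long return periods for API > 100 correspond to the "healthy" cities above
```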
Wu, J.; Yang, Y.; Luo, Q.; Wu, J.
2012-12-01
This study presents a new hybrid multi-objective evolutionary algorithm, the niched Pareto tabu search combined with a genetic algorithm (NPTSGA), whereby the global search ability of niched Pareto tabu search (NPTS) is improved by the diversification of candidate solutions arising from the evolving nondominated sorting genetic algorithm II (NSGA-II) population. Also, the NPTSGA, coupled with the commonly used groundwater flow and transport codes MODFLOW and MT3DMS, is developed for multi-objective optimal design of groundwater remediation systems. The proposed methodology is then applied to a large-scale field groundwater remediation system for cleanup of a large trichloroethylene (TCE) plume at the Massachusetts Military Reservation (MMR) in Cape Cod, Massachusetts. Furthermore, a master-slave (MS) parallelization scheme based on the Message Passing Interface (MPI) is incorporated into the NPTSGA to implement objective function evaluations in a distributed processor environment, which can greatly improve the efficiency of the NPTSGA in finding Pareto-optimal solutions to the real-world application. This study shows that the MS parallel NPTSGA, in comparison with the original NPTS and NSGA-II, can balance the tradeoff between diversity and optimality of solutions during the search process and is an efficient and effective tool for optimizing the multi-objective design of groundwater remediation systems under complicated hydrogeologic conditions.
Part E: Evolutionary Computation
2015-01-01
…of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, and parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general … topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most … evolutionary algorithms, such as memetic algorithms, which have emerged as a very promising tool for solving many real-world problems in a multitude of areas of science and technology. Moreover, parallel evolutionary combinatorial optimization has been presented. Search operators, which are crucial in all…
Evolutionary mechanisms for loneliness.
Cacioppo, John T; Cacioppo, Stephanie; Boomsma, Dorret I
2014-01-01
Robert Weiss (1973) conceptualised loneliness as perceived social isolation, which he described as a gnawing, chronic disease without redeeming features. On the scale of everyday life, it is understandable how something as personally aversive as loneliness could be regarded as a blight on human existence. However, evolutionary time and evolutionary forces operate at such a different scale of organisation than we experience in everyday life that personal experience is not sufficient to understand the role of loneliness in human existence. Research over the past decade suggests a very different view of loneliness than suggested by personal experience, one in which loneliness serves a variety of adaptive functions in specific habitats. We review evidence on the heritability of loneliness and outline an evolutionary theory of loneliness, with an emphasis on its potential adaptive value in an evolutionary timescale.
Rethinking evolutionary individuality.
Ereshefsky, Marc; Pedroso, Makmiller
2015-08-18
This paper considers whether multispecies biofilms are evolutionary individuals. Numerous multispecies biofilms have characteristics associated with individuality, such as internal integrity, division of labor, coordination among parts, and heritable adaptive traits. However, such multispecies biofilms often fail standard reproductive criteria for individuality: they lack reproductive bottlenecks, are comprised of multiple species, do not form unified reproductive lineages, and fail to have a significant division of reproductive labor among their parts. If such biofilms are good candidates for evolutionary individuals, then evolutionary individuality is achieved through other means than frequently cited reproductive processes. The case of multispecies biofilms suggests that standard reproductive requirements placed on individuality should be reconsidered. More generally, the case of multispecies biofilms indicates that accounts of individuality that focus on single-species eukaryotes are too restrictive and that a pluralistic and open-ended account of evolutionary individuality is needed.
Eco-evolutionary feedbacks, adaptive dynamics and evolutionary rescue theory
Ferriere, Regis; Legendre, Stéphane
2013-01-01
Adaptive dynamics theory has been devised to account for feedbacks between ecological and evolutionary processes. Doing so opens new dimensions to and raises new challenges about evolutionary rescue. Adaptive dynamics theory predicts that successive trait substitutions driven by eco-evolutionary feedbacks can gradually erode population size or growth rate, thus potentially raising the extinction risk. Even a single trait substitution can suffice to degrade population viability drastically at once and cause ‘evolutionary suicide’. In a changing environment, a population may track a viable evolutionary attractor that leads to evolutionary suicide, a phenomenon called ‘evolutionary trapping’. Evolutionary trapping and suicide are commonly observed in adaptive dynamics models in which the smooth variation of traits causes catastrophic changes in ecological state. In the face of trapping and suicide, evolutionary rescue requires that the population overcome evolutionary threats generated by the adaptive process itself. Evolutionary repellors play an important role in determining how variation in environmental conditions correlates with the occurrence of evolutionary trapping and suicide, and what evolutionary pathways rescue may follow. In contrast with standard predictions of evolutionary rescue theory, low genetic variation may attenuate the threat of evolutionary suicide and small population sizes may facilitate escape from evolutionary traps. PMID:23209163
A pareto-optimal characterization of miniaturized distributed occulter/telescope systems
Koenig, Adam W.; D'Amico, Simone; Macintosh, Bruce; Titus, Charles J.
2015-09-01
Distributed occulter/telescope systems hold great promise in the field of direct exoplanet imaging. However, proposed missions using this concept such as the New Worlds Observer or Exo-S (NASA) are exceptionally large, with occulter diameters of tens of meters and inter-spacecraft separations of tens of megameters, requiring deployment in deep space. The estimated costs associated with these missions are in the billions of dollars. In order to reduce the risk associated with these missions, it is desirable to first deploy a low-cost technology demonstrator mission to prove that the distributed occulter/telescope concept is valid. To that end, this work assesses the feasibility of miniaturizing the optics of the distributed occulter/telescope to enable deployment on micro- or nano-satellites in Earth orbit. A variant of the convex optimization formulation introduced by previous authors is used to generate a Pareto-optimal characterization between the achievable occulter contrast and a set of critical design variables (occulter radius, inner working angle, science spectrum, etc.). This characterization is performed for two different sets of engineering constraints, corresponding to different levels of design complexity. The results of this study are compared to the performance requirements for imaging targets of scientific interest, namely exozodiacal dust disks, in order to identify promising design envelopes. The result of this work is a comprehensive trade of the capabilities of miniaturized, binary, petal-shaped occulters. This research demonstrates that there exist miniaturized occulter geometries compatible with micro- or nano-satellites in Earth orbit suitable for imaging exozodiacal dust disks. In addition, this study provides a valuable methodology and performance guidelines for future distributed occulter/telescope designs.
A multiple threshold method for fitting the generalized Pareto distribution to rainfall time series
Deidda, R.
2010-12-01
Previous studies indicate the generalized Pareto distribution (GPD) as a suitable distribution function to reliably describe the exceedances of daily rainfall records above a proper optimum threshold, which should be selected as small as possible to retain the largest sample while assuring an acceptable fit. Such an optimum threshold may differ from site to site, consequently affecting not only the GPD scale parameter, but also the probability of threshold exceedance. Thus, a first objective of this paper is to derive some expressions to parameterize a simple threshold-invariant three-parameter distribution function which assures a perfect overlap with the GPD fitted on the exceedances over any threshold larger than the optimum one. Since the proposed distribution does not depend on the local thresholds adopted for fitting the GPD, it is expected to reflect the on-site climatic signature and thus appears particularly suitable for hydrological applications and regional analyses. A second objective is to develop and test the Multiple Threshold Method (MTM) to infer the parameters of interest by using exceedances over a wide range of thresholds, again applying the concept of threshold-invariance of the parameters. We show the ability of the MTM to fit historical daily rainfall time series recorded at different resolutions and with a significant percentage of heavily quantized data. Finally, we demonstrate the superiority of the MTM fit over the standard single-threshold fit, often adopted for partial duration series, by evaluating and comparing their performances on Monte Carlo samples drawn from GPDs with different shape and scale parameters and different discretizations.
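The threshold-invariance property that the MTM exploits can be illustrated with a small synthetic experiment: above the optimum threshold, GPD fits to exceedances over any higher threshold should recover roughly the same shape parameter. This is only an illustrative sketch on simulated data (the threshold of 5 mm and the parameter values are assumptions, not taken from the paper):

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)

# Synthetic daily-rainfall exceedances: GPD with shape 0.1 and scale 8 mm
# above a hypothetical optimum threshold of 5 mm
data = 5.0 + genpareto.rvs(c=0.1, scale=8.0, size=20000, random_state=rng)

# Above the optimum threshold the fitted shape should be roughly
# threshold-invariant, while the scale grows linearly with the threshold.
for u in (5.0, 10.0, 15.0, 20.0):
    exc = data[data > u] - u
    shape, _, scale = genpareto.fit(exc, floc=0.0)
    print(f"u={u:5.1f}  n={exc.size:6d}  shape={shape:+.3f}  scale={scale:.2f}")
```

The MTM turns this qualitative stability check into a formal inference procedure by pooling the information across many thresholds at once.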
Using the Pareto principle in genome-wide breeding value estimation
Yu Xijiang
2011-11-01
Genome-wide breeding value (GWEBV) estimation methods can be classified based on the prior distribution assumptions of marker effects. Genome-wide BLUP methods assume a normal prior distribution for all markers with a constant variance, and are computationally fast. In Bayesian methods, more flexible prior distributions of SNP effects are applied that allow for very large SNP effects although most are small or even zero; these methods are often computationally demanding, however, as they rely on Markov chain Monte Carlo sampling. In this study, we adopted the Pareto principle to weight the available marker loci, i.e., we consider that x% of the loci explain (100 - x)% of the total genetic variance. Assuming this principle, it is also possible to define the variances of the prior distribution of the 'big' and 'small' SNPs. The relatively few large SNPs explain a large proportion of the genetic variance, while the majority of the SNPs show small effects and explain a minor proportion of the genetic variance. We name this method MixP, where the prior distribution is a mixture of two normal distributions, i.e., one with a large variance and one with a small variance. Simulation results, using a real Norwegian Red cattle pedigree, show that MixP is at least as accurate as the other methods in all studied cases. This method also reduces the number of hyper-parameters of the prior distribution from 2 (proportion and variance of SNPs with big effects) to 1 (proportion of SNPs with big effects), assuming the overall genetic variance is known. The mixture-of-normals prior made it possible to solve the equations iteratively, which reduced the computational load by two orders of magnitude. In the era of marker densities reaching millions and whole-genome sequence data, MixP provides a computationally feasible Bayesian method of analysis.
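The Pareto-principle assumption pins down both prior variances from a single proportion. A minimal sketch of this bookkeeping, under the assumption that the M markers, total genetic variance V, and split proportion p are given (the function name and notation are illustrative, not from the paper):

```python
# If a fraction p of M markers are "big" SNPs jointly explaining a fraction
# (1 - p) of the total genetic variance V, the per-SNP prior variances of the
# two mixture components follow directly.
def mixp_prior_variances(M: int, V: float, p: float):
    v_big = (1 - p) * V / (p * M)        # prior variance of each "big" SNP
    v_small = p * V / ((1 - p) * M)      # prior variance of each "small" SNP
    return v_big, v_small

# Classic 20/80 split over 10,000 markers with unit genetic variance;
# the big-SNP variance is ((1-p)/p)^2 = 16 times the small-SNP variance.
v_big, v_small = mixp_prior_variances(M=10_000, V=1.0, p=0.2)
print(v_big, v_small)
```

This makes concrete why only one hyper-parameter (p) remains once V is assumed known.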
A framework for the determination of weak Pareto frontier solutions under probabilistic constraints
Ran, Hongjun
The purpose of this research is to provide such a framework. The proposed framework combines separately developed multidisciplinary optimization, multi-objective optimization, and joint probability assessment methods, in a decoupled way, to solve joint probabilistic constraint, multi-objective, multidisciplinary optimization problems that are representative of realistic conceptual design problems of design alternative generation and selection. The intent here is to find the Weak Pareto Frontier (WPF) solutions, which include additional compromised solutions besides the ones identified by a conventional Pareto frontier. The framework starts with constructing fast and accurate surrogate models of the different disciplinary analyses in order to reduce the computational time and expense to a manageable level, so that the design space can be explored quickly, trustworthy probabilities of the probabilistic constraints (PC) and the WPF can be obtained, and conceptual design decisions can be made in a shorter time period. A new hybrid method is formed that consists of the second-order Response Surface Methodology (RSM) and the Support Vector Regression (SVR) method, capturing the global tendency and the local nonlinear behavior, respectively. The purpose of forming this hybrid method is to provide a method that can achieve high accuracy for many kinds of problems with a small training sample. The three parameters of SVR that need to be pre-specified are selected using practical methods and a modified information criterion that makes use of model fitting error, prediction error, and model complexity information. The model prediction error is estimated inexpensively with a new method called Random Cross Validation. In order to select a surrogate model without unnecessary complexity from among RSM, SVR, and the hybrid method, this modified information criterion is also used as a surrogate model advisor to select the best surrogate model for a given problem. A new neighborhood search
Autonomous Evolutionary Information Systems
[Anonymous]
2001-01-01
Traditional information systems are passive, i.e., data or knowledge is created, retrieved, modified, updated, and deleted only in response to operations issued by users or application programs, and the systems can only execute queries or transactions explicitly submitted by users or application programs but have no ability to do anything actively by themselves. Unlike a traditional information system, which serves merely as a storehouse of data or knowledge and works passively according to queries or transactions explicitly issued by users and application programs, an autonomous evolutionary information system serves as an autonomous and evolutionary partner of its users: it discovers new knowledge from its database or knowledge base autonomously, cooperates with its users in solving problems actively by providing them with advice, and has a mechanism to improve its own state of "knowing" and ability of "working". This paper defines what an autonomous evolutionary information system is, explains why autonomous evolutionary information systems are needed, and presents some new issues, fundamental considerations, and research directions in the design and development of autonomous evolutionary information systems.
Applying evolutionary anthropology.
Gibson, Mhairi A; Lawson, David W
2015-01-01
Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.
Paleoanthropology and evolutionary theory.
Tattersall, Ian
2012-01-01
Paleoanthropologists of the first half of the twentieth century were little concerned either with evolutionary theory or with the technicalities and broader implications of zoological nomenclature. In consequence, the paleoanthropological literature of the period consisted largely of a series of descriptions accompanied by authoritative pronouncements, together with a huge excess of hominid genera and species. Given the intellectual flimsiness of the resulting paleoanthropological framework, it is hardly surprising that in 1950 the ornithologist Ernst Mayr met little resistance when he urged the new postwar generation of paleoanthropologists to accept not only the elegant reductionism of the Evolutionary Synthesis but a vast oversimplification of hominid phylogenetic history and nomenclature. Indeed, the impact of Mayr's onslaught was so great that even when developments in evolutionary biology during the last quarter of the century brought other paleontologists to the realization that much more has been involved in evolutionary histories than the simple action of natural selection within gradually transforming lineages, paleoanthropologists proved highly reluctant to follow. Even today, paleoanthropologists are struggling to reconcile an intuitive realization that the burgeoning hominid fossil record harbors a substantial diversity of species (bringing hominid evolutionary patterns into line with that of other successful mammalian families), with the desire to cram a huge variety of morphologies into an unrealistically minimalist systematic framework. As long as this theoretical ambivalence persists, our perception of events in hominid phylogeny will continue to be distorted.
Larsén, Xiaoli Guo; Mann, Jakob; Rathmann, Ole
2015-01-01
This study examines the various sources of uncertainty in the application of two widely used extreme value distribution functions, the generalized extreme value distribution (GEVD) and the generalized Pareto distribution (GPD). The study is done through the analysis of measurements from ... as a guideline for applying GEVD and GPD to wind time series of limited length. The data analysis shows that, with reasonable choice of relevant parameters, GEVD and GPD give consistent estimates of the return winds. For GEVD, the base period should be chosen in accordance with the occurrence of the extreme wind ...
Kamaljit Kaur
2015-01-01
Bayesian estimators of the Gini index and of a poverty measure are obtained for the Pareto distribution under censored and complete setups. The estimators are obtained using two noninformative priors, namely the uniform prior and Jeffreys' prior, and one conjugate prior, under the assumption of the Linear Exponential (LINEX) loss function. Using simulation techniques, the relative efficiency of the proposed estimators under the different priors and loss functions is obtained. The performances of the proposed estimators have been compared on the basis of their simulated risks obtained under the LINEX loss function.
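The Gini index of a Pareto distribution with shape a > 1 has the closed form G = 1/(2a - 1), which is what makes plug-in estimation attractive. A minimal frequentist sketch on simulated data (the MLE plug-in shown here is a simple stand-in, not the paper's Bayesian estimators; the scale x_m is assumed known):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pareto(a, x_m): CDF F(x) = 1 - (x_m / x)^a for x >= x_m, so inverse-CDF
# sampling gives x = x_m * U^(-1/a) with U uniform on (0, 1].
a_true, x_m, n = 3.0, 1.0, 50_000
x = x_m * (1.0 - rng.random(n)) ** (-1.0 / a_true)

# MLE of the shape (x_m known): a_hat = n / sum(log(x_i / x_m)),
# then plug into the closed-form Gini index G = 1 / (2a - 1).
a_hat = n / np.log(x / x_m).sum()
gini_hat = 1.0 / (2.0 * a_hat - 1.0)
print(f"a_hat={a_hat:.3f}  Gini={gini_hat:.3f}  (exact Gini = 1/(2*3-1) = 0.2)")
```

A Bayesian treatment, as in the paper, would replace the MLE with a posterior for a under the chosen prior and LINEX loss.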
Jhon Jairo Santa Chávez
2016-01-01
This paper presents a multiobjective ant colony algorithm for the Multi-Depot Vehicle Routing Problem with Backhauls (MDVRPB), in which three objectives (traveled distance, traveling time, and total energy consumption) are minimized. An ant colony algorithm is proposed to solve the MDVRPB. The solution scheme finds a set of solutions ordered into Pareto fronts by applying the concept of dominance. The effectiveness of the proposed approach is examined on a set of instances adapted from the literature. The computational results show high-quality solutions obtained within short computing times.
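The dominance concept used to order solutions into Pareto fronts can be sketched in a few lines. This is a generic minimization-form dominance filter, not code from the paper; the example objective triples (distance, time, energy) are made up:

```python
# Pareto dominance for minimization: a dominates b if a is no worse in every
# objective and strictly better in at least one.
def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Keep only the nondominated solutions (the first Pareto front).
def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# (distance, time, energy) triples for four hypothetical candidate routes
routes = [(10, 5, 3), (8, 6, 3), (9, 9, 9), (12, 7, 4)]
print(pareto_front(routes))   # -> [(10, 5, 3), (8, 6, 3)]
```

In the full algorithm, repeatedly removing the current front and re-filtering the remainder yields the ordered sequence of fronts.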
Rica Gonen
2013-11-01
We analyze the space of deterministic, dominant-strategy incentive compatible, individually rational and Pareto optimal combinatorial auctions. We examine a model with multidimensional types, nonidentical items, private values and quasilinear preferences for the players, with one relaxation: the players are subject to publicly-known budget constraints. We show that the space includes dictatorial mechanisms and that if dictatorial mechanisms are ruled out by a natural anonymity property, then an impossibility of design is revealed. The same impossibility naturally extends to other abstract mechanisms with an arbitrary outcome set if one maintains the original assumptions of players with quasilinear utilities, public budgets and nonnegative prices.
Castelletti, A.; Pianosi, F.; Restelli, M.
2013-06-01
The operation of large-scale water resources systems often involves several conflicting and noncommensurable objectives. The full characterization of tradeoffs among them is a necessary step to inform and support decisions in the absence of a unique optimal solution. In this context, the common approach is to consider many single-objective problems, resulting from different combinations of the original problem objectives, each one solved using standard optimization methods based on mathematical programming. This scalarization process is computationally very demanding, as it requires one optimization run for each trade-off, and often results in very sparse and poorly informative representations of the Pareto frontier. More recently, bio-inspired methods have been applied to compute an approximation of the Pareto frontier in a single run. These methods can cover the full extent of the Pareto frontier acceptably with a reasonable computational effort. Yet the quality of the policy obtained may depend strongly on the algorithm tuning and preconditioning. In this paper we propose a novel multiobjective Reinforcement Learning algorithm that combines the advantages of the above two approaches and alleviates some of their drawbacks. The proposed algorithm is an extension of fitted Q-iteration (FQI) that learns the operating policies for all the linear combinations of preferences (weights) assigned to the objectives in a single training process. The key idea of multiobjective FQI (MOFQI) is to extend the continuous approximation of the value function, which single-objective FQI performs over the state-decision space, to the weight space as well. The approach is demonstrated on a real-world case study concerning the optimal operation of the Hoa Binh reservoir on the Da river, Vietnam. MOFQI is compared with the reiterated use of FQI and a multiobjective parameterization-simulation-optimization (MOPSO) approach. Results show that MOFQI provides a
Evolutionary Computation: An Overview
He Zhenya; Wei Chengjian
1997-01-01
Evolutionary computation is a field of simulating evolution on a computer. Both aspects of it, the problem-solving aspect and the aspect of modeling natural evolution, are important. Simulating evolution on a computer results in stochastic optimization techniques that can outperform classical methods of optimization when applied to difficult real-world problems. There are currently four main avenues of research in simulated evolution: genetic algorithms, evolutionary programming, evolution strategies, and genetic programming. This paper presents a brief overview of the field of evolutionary computation, including some theoretical issues, adaptive mechanisms, improvements, constrained optimization, constraint satisfaction, evolutionary neural networks, evolutionary fuzzy systems, hardware evolution, evolutionary robotics, parallel evolutionary computation, and co-evolutionary models. The applications of evolutionary computation to optimizing systems and intelligent information processing in telecommunications are also introduced.
McCormack, Jon
Evolution is one of the most interesting and creative processes we currently understand, so it should come as no surprise that artists and designers are embracing the use of evolution in problems of artistic creativity. The material in this section illustrates the diversity of approaches being used by artists and designers in relation to evolution at the boundary of art and science. While conceptualising human creativity as an evolutionary process in itself may be controversial, what is clear is that evolutionary processes can be used to complement, even enhance human creativity, as the chapters in this section aptly demonstrate.
Studies in evolutionary agroecology
Wille, Wibke
Darwinian evolution by natural selection is driven primarily by differential survival and reproduction among individuals in a population. When the evolutionary interest of an individual is in conflict with the interests of the population, the genes increasing individual fitness at the cost...... of Evolutionary Agroecology that the highest yielding individuals do not necessarily perform best as a population. The investment of resources into strategies and structures increasing individual competitive ability carries a cost. If a whole population consists of individuals investing resources to compete...
Evolutionary Statistical Procedures
Baragona, Roberto; Poli, Irene
2011-01-01
This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a
Siade, A. J.; Prommer, H.; Welter, D.
2014-12-01
Groundwater management and remediation require the implementation of numerical models in order to evaluate the potential anthropogenic impacts on aquifer systems. In many situations, the numerical model must be able to simulate not only groundwater flow and transport, but also geochemical and biological processes. Each process being simulated carries with it a set of parameters that must be identified, along with differing potential sources of model-structure error. Various data types are often collected in the field and then used to calibrate the numerical model; however, these data types can represent very different processes and can subsequently be sensitive to the model parameters in extremely complex ways. Therefore, developing an appropriate weighting strategy to address the contributions of each data type to the overall least-squares objective function is not straightforward. This is further compounded by the presence of potential sources of model-structure error that manifest themselves differently for each observation data type. Finally, reactive transport models are highly nonlinear, which can lead to convergence failure for algorithms operating on the assumption of local linearity. In this study, we propose a variation of the popular particle swarm optimization algorithm to address trade-offs associated with the calibration of one data type over another. This method removes the need to specify weights between observation groups and instead produces a multi-dimensional Pareto front that illustrates the trade-offs between data types. We use the PEST++ run manager, along with the standard PEST input/output structure, to implement parallel programming across multiple desktop computers using TCP/IP communications. This allows for very large swarms of particles without the need for a supercomputing facility. The method was applied to a case study in which modeling was used to gain insight into the mobilization of arsenic at a deep-well injection site
Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto
2016-04-01
Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data, graphical methods where one studies the dependence of GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u at which a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database with more than 110 years of data. We find that non-parametric methods intended to locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while methods based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low, i.e., on the order of 0.1-0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the
卫超; 师义民
2014-01-01
In this paper, statistical analysis of the Pareto distribution is presented under progressive type-II hybrid censored data. The maximum likelihood estimators (MLEs) and Bayes estimators of the Pareto distribution parameters under different prior distributions are obtained using the maximum likelihood method and Bayes estimation theory. Finally, Monte Carlo simulation is performed for illustrative purposes.
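The likelihood structure under censoring can be illustrated with the simplest case: plain type-II censoring, where only the r smallest of n lifetimes are observed. This is a deliberately simplified sketch (the paper treats the more general progressive hybrid scheme, and the scale x_m is assumed known here):

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulate n Pareto(a, x_m) lifetimes via inverse-CDF sampling, then observe
# only the r smallest (type-II censoring at the r-th failure).
a_true, x_m, n, r = 2.0, 1.0, 10_000, 6_000
x = np.sort(x_m * (1.0 - rng.random(n)) ** (-1.0 / a_true))

# Censored MLE of the shape:
#   a_hat = r / ( sum_{i<=r} log(x_(i)/x_m) + (n - r) * log(x_(r)/x_m) )
# The second term accounts for the n - r units still "surviving" at x_(r).
logs = np.log(x[:r] / x_m)
a_hat = r / (logs.sum() + (n - r) * np.log(x[r - 1] / x_m))
print(f"a_hat = {a_hat:.3f} (true shape = 2.0)")
```

A Bayes estimator, as in the paper, would combine this censored likelihood with a prior on the shape parameter instead of maximizing it.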
A data mining approach to evolutionary optimisation of noisy multi-objective problems
Chia, J. Y.; Goh, C. K.; Shim, V. A.; Tan, K. C.
2012-07-01
Many real-world optimisation problems have opposing objective functions which are subject to the influence of noise. Noise in the objective functions can adversely affect the stability, performance and convergence of evolutionary optimisers. This article proposes a Bayesian frequent data mining (DM) approach to identify optimal regions to guide the population amidst the presence of noise. The aggregated information provided by all the solutions helps to average out the effects of noise. This article proposes a DM crossover operator to make use of the rules mined. After implementation of this operator, better convergence to the true Pareto front is achieved at the expense of the diversity of the solutions. Consequently, an Extremal Exploration operator is proposed in the later part of this article to help curb the loss in diversity caused by the DM operator. The result is a more directive search with a faster convergence rate. The search is effective in decision spaces where the Pareto set lies in a tight cluster. The performance of the proposed algorithm in noisy and noiseless environments is also studied with respect to non-convexity, discontinuity, multi-modality and uniformity. The proposed algorithm is evaluated on ZDT and other benchmark problems. The results of the simulations indicate that the proposed method is effective in handling noise and is competitive against other noise-tolerant algorithms.
Design Optimization of an Axial Fan Blade Through Multi-Objective Evolutionary Algorithm
Kim, Jin-Hyuk; Choi, Jae-Ho; Husain, Afzal; Kim, Kwang-Yong
2010-06-01
This paper presents the design optimization of an axial fan blade with a hybrid multi-objective evolutionary algorithm (hybrid MOEA). Reynolds-averaged Navier-Stokes equations with the shear stress transport turbulence model are discretized by finite volume approximations and solved on hexahedral grids for the flow analyses. The numerical results were validated against experimental data for the axial and tangential velocities. Six design variables related to the blade lean angle and blade profile are selected, and Latin hypercube sampling of design of experiments is used to generate design points within the selected design space. Two objective functions, namely total efficiency and torque, are employed, and the multi-objective optimization is carried out to enhance total efficiency and to reduce the torque. The flow analyses are performed numerically at the design points to obtain values of the objective functions. The Non-dominated Sorting Genetic Algorithm (NSGA-II) with an ε-constraint strategy for local search, coupled with a surrogate model, is used for the multi-objective optimization. The Pareto-optimal solutions are presented and a trade-off analysis is performed between the two competing objectives in view of the design and flow constraints. It is observed that total efficiency is enhanced and torque is decreased compared to the reference design by the process of multi-objective optimization. The Pareto-optimal solutions are analyzed to understand the mechanism of the improvement in total efficiency and the reduction in torque.
Origins of evolutionary transitions
Ellen Clarke
2014-04-01
An 'evolutionary transition in individuality' or 'major transition' is a transformation in the hierarchical level at which natural selection operates on a population. In this article I give an abstract (i.e. level-neutral and substrate-neutral) articulation of the transition process in order to precisely understand how such processes can happen, especially how they can get started.
Evolutionary mysteries in meiosis
Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E.; Wijnker, Erik; Haag, Christoph R.
2016-01-01
Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these
Evolutionary developmental psychology.
King, Ashley C; Bjorklund, David F
2010-02-01
The field of evolutionary developmental psychology can potentially broaden the horizons of mainstream evolutionary psychology by combining the principles of Darwinian evolution by natural selection with the study of human development, focusing on the epigenetic effects that occur between humans and their environment in a way that attempts to explain how evolved psychological mechanisms become expressed in the phenotypes of adults. An evolutionary developmental perspective includes an appreciation of comparative research and we, among others, argue that contrasting the cognition of humans with that of nonhuman primates can provide a framework with which to understand how human cognitive abilities and intelligence evolved. Furthermore, we argue that several aspects of childhood (e.g., play and immature cognition) serve both as deferred adaptations as well as imparting immediate benefits. Intense selection pressure was surely exerted on childhood over human evolutionary history and, as a result, neglecting to consider the early developmental period of children when studying their later adulthood produces an incomplete picture of the evolved adaptations expressed through human behavior and cognition.
Optimal Mixing Evolutionary Algorithms
Thierens, D.; Bosman, P.A.N.; Krasnogor, N.
2011-01-01
A key search mechanism in Evolutionary Algorithms is the mixing or juxtaposing of partial solutions present in the parent solutions. In this paper we look at the efficiency of mixing in genetic algorithms (GAs) and estimation-of-distribution algorithms (EDAs). We compute the mixing probabilities of
Evolutionary perspectives on ageing.
Reichard, Martin
2017-05-26
From an evolutionary perspective, ageing is a decrease in fitness with chronological age - expressed by an increase in mortality risk and/or decline in reproductive success and mediated by deterioration of functional performance. While this makes ageing intuitively paradoxical - detrimental to individual fitness - evolutionary theory offers answers as to why ageing has evolved. In this review, I first briefly examine the classic evolutionary theories of ageing and their empirical tests, and highlight recent findings that have advanced our understanding of the evolution of ageing (condition-dependent survival, positive pleiotropy). I then provide an overview of recent theoretical extensions and modifications that accommodate those new discoveries. I discuss the role of indeterminate (asymptotic) growth for lifetime increases in fecundity and ageing trajectories. I outline alternative views that challenge a universal existence of senescence - namely the lack of a germ-soma distinction and the ability of tissue replacement and retrogression to younger developmental stages in modular organisms. I argue that rejuvenation at the organismal level is plausible, but includes a return to a simple developmental stage. This may exempt a particular genotype from somatic defects but, correspondingly, removes any information acquired during development. A resolution of the question of whether a rejuvenated individual is the same entity is central to the recognition of whether current evolutionary theories of ageing, with their extensions and modifications, can explain the patterns of ageing across the Tree of Life. Copyright © 2017 Elsevier Ltd. All rights reserved.
Evolutionary Theories of Detection
Fitch, J P
2005-04-29
Current, mid-term and long range technologies for detection of pathogens and toxins are briefly described in the context of performance metrics and operational scenarios. Predictive (evolutionary) and speculative (revolutionary) assessments are given with trade-offs identified, where possible, among competing performance goals.
Evolutionary trends in Heteroptera
Cobben, R.H.
1968-01-01
1. This work, the first volume of a series dealing with evolutionary trends in Heteroptera, is concerned with the egg system of about 400 species. The data are presented systematically in chapters 1 and 2 with a critical review of the literature after each family.
2. Chapter 3 evaluates facts
Learning: An Evolutionary Analysis
Swann, Joanna
2009-01-01
This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…
Evolutionary mysteries in meiosis
Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E.; Wijnker, Erik; Haag, Christoph R.
2016-01-01
Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these o
[Schizophrenia and evolutionary psychopathology].
Kelemen, Oguz; Kéri, Szabolcs
2007-01-01
Evolution can shape any characteristic appearing as a phenotype that is genetically rooted and possesses a long history. The stress-diathesis model suggests that psychiatric disorders have some genetic roots, and therefore the theory of evolution may be relevant for psychiatry. Schizophrenia is present in every human culture with approximately the same incidence. The great evolutionary paradox is: how can such illness persist despite fundamental reproductive disadvantages? Since the 1960s, several evolutionary explanations have been raised to explain the origins of schizophrenia. This article reviews all the major evolutionary theories about the possible origins of this disease. On the one hand, some researchers have proposed that schizophrenia is an evolutionary disadvantageous byproduct of human brain evolution (e.g. the evolution of hemispheric specialization, social brain or language skills). On the other hand, others have suggested that a compensatory advantage must exist either in the biological system of patients with schizophrenia (e.g. resistance against infectious diseases), or within the social domain (e.g. greater creativity of the relatives). According to some theories, shamanism and religion demonstrate some similarities to psychosis and provide clues regarding the origins of schizophrenia. At the end of this article we discuss this last theory in detail listing arguments for and against.
Molluscan Evolutionary Development
Wanninger, Andreas Wilhelm Georg; Koop, Damien; Moshel-Lynch, Sharon
2008-01-01
Brought together by Winston F. Ponder and David R. Lindberg, thirty-six experts on the evolution of the Mollusca provide an up-to-date review of its evolutionary history. The Mollusca are the second largest animal phylum and boast a fossil record of over 540 million years. They exhibit remarkable...
When development matters: From evolutionary psychology to evolutionary developmental psychology
Hernández Blasi, Carlos; Gardiner, Amy K.; Bjorklund, David F.
2008-01-01
This article presents evolutionary developmental psychology (EDP) as an emerging field of evolutionary psychology (EP). In describing the core tenets of both approaches and the differences between them, we emphasize the important roles that evolution and development have in understanding human behaviour. We suggest that developmental psychologists should pay more attention to evolutionary issues and, conversely, evolutionary psychologists should take development seriously. Key words: evol...
Howe, Lauren C; Krosnick, Jon A
2017-01-03
Attitude strength has been the focus of a huge volume of research in psychology and related sciences for decades. The insights offered by this literature have tremendous value for understanding attitude functioning and structure and for the effective application of the attitude concept in applied settings. This is the first Annual Review of Psychology article on the topic, and it offers a review of theory and evidence regarding one of the most researched strength-related attitude features: attitude importance. Personal importance is attached to an attitude when the attitude is perceived to be relevant to self-interest, social identification with reference groups or reference individuals, and values. Attaching personal importance to an attitude causes crystallizing of attitudes (via enhanced resistance to change), effortful gathering and processing of relevant information, accumulation of a large store of well-organized relevant information in long-term memory, enhanced attitude extremity and accessibility, enhanced attitude impact on the regulation of interpersonal attraction, energizing of emotional reactions, and enhanced impact of attitudes on behavioral intentions and action. Thus, important attitudes are real and consequential psychological forces, and their study offers opportunities for addressing behavioral change.
Enrique Canessa
2014-01-01
Full Text Available A Pareto Genetic Algorithm (AGP, for its Spanish acronym) is presented that finds the Pareto frontier in robust design problems for multiobjective systems. The AGP was designed to be applied with Taguchi's Parameter Design method, the method most frequently used by practitioners to carry out robust design. The AGP was tested with data obtained from a real single-response system and from a multiobjective process simulator with many control and noise factors. In all cases, the AGP delivered optimal solutions that meet the goals of robust design. Moreover, the discussion of results shows that having such solutions helps in selecting the best ones to implement in the system under study, especially when the system has many control factors and outputs.
Mukhopadhyay, Somparna; Hazra, Lakshminarayan
2015-11-01
Resolution capability of an optical imaging system can be enhanced by reducing the width of the central lobe of the point spread function. Attempts to achieve the same by pupil plane filtering give rise to a concomitant increase in sidelobe intensity. The mutual exclusivity between these two objectives may be considered as a multiobjective optimization problem that does not have a unique solution; rather, a class of trade-off solutions called Pareto optimal solutions may be generated. Pareto fronts in the synthesis of lossless phase-only pupil plane filters to achieve superresolution with prespecified lower limits for the Strehl ratio are explored by using the particle swarm optimization technique.
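The Pareto-optimal trade-off solutions described above can be extracted from any finite set of candidate designs with a pairwise dominance check. A minimal sketch (the candidate points below are hypothetical illustrations, not values from the cited study):

```python
def pareto_front(points):
    """Return the nondominated subset of 2-D points (both objectives minimized)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

# Hypothetical trade-off candidates: (central-lobe width, sidelobe intensity)
candidates = [(1.0, 0.9), (0.8, 1.2), (0.9, 1.0), (1.1, 0.8), (0.95, 1.3)]
front_pts = sorted(pareto_front(candidates))
print(front_pts)  # the dominated point (0.95, 1.3) is filtered out
```

This O(n^2) filter is adequate for small candidate sets; sweep-based algorithms are preferable for large populations.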
Yang, Y.; Wu, J.
2011-12-01
The previous work in the field of multi-objective optimization under uncertainty has been concerned with the probabilistic multi-objective algorithm itself: how to effectively evaluate an estimate of uncertain objectives and identify a set of reliable Pareto optimal solutions. However, the design of a robust and reliable groundwater remediation system encounters major difficulties owing to the inherent uncertainty of hydrogeological parameters such as hydraulic conductivity (K). Thus, we need to reduce the uncertainty associated with the site characteristics of the contaminated aquifers. In this study, we first use Sequential Gaussian Simulation (SGSIM) to generate 1000 conditional realizations of lnK based on the sampled conditioning data acquired by field tests. It is worth noting that the cost of field testing often weighs heavily upon the remediation cost and must thus be taken into account in the tradeoff between solution reliability and remedial cost optimality. In this situation, we perform Monte Carlo simulation to analyze the uncertainty of lnK realizations associated with different numbers of conditioning data points. The results indicate that the uncertainty of the site characteristics and of the contaminant concentration output from the transport model decreases and then tends toward stabilization as the number of conditioning data increases. This study presents a probabilistic multi-objective evolutionary algorithm (PMOEA) that integrates a noisy genetic algorithm (NGA) and a probabilistic multi-objective genetic algorithm (MOGA). The evident difference between the deterministic and probabilistic MOGA is the use of probabilistic Pareto domination ranking and a niche technique to ensure that each solution found is most reliable and robust. The proposed algorithm is then evaluated through a synthetic pump-and-treat (PAT) groundwater remediation test case. The 1000 lnK realizations generated by SGSIM with appropriate number of conditioning data (30
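The probabilistic Pareto domination ranking mentioned above can be approximated by Monte Carlo: given paired objective samples for two candidate solutions, count how often one dominates the other. The sketch below uses synthetic Gaussian objective noise; the objective names and numeric values are illustrative assumptions, not quantities from the study:

```python
import random

def dominates(a, b):
    """True if a weakly improves on b in all objectives, strictly in one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def dominance_probability(samples_a, samples_b):
    """Fraction of paired Monte Carlo realizations in which solution A dominates B."""
    wins = sum(dominates(a, b) for a, b in zip(samples_a, samples_b))
    return wins / len(samples_a)

random.seed(0)
# Hypothetical noisy objectives per realization: (remediation cost, residual concentration)
samples_a = [(10 + random.gauss(0, 1), 5 + random.gauss(0, 1)) for _ in range(1000)]
samples_b = [(12 + random.gauss(0, 1), 6 + random.gauss(0, 1)) for _ in range(1000)]
p_dom = dominance_probability(samples_a, samples_b)
print(round(p_dom, 2))  # A dominates B in a clear majority of realizations
```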
Recent Advances in Evolutionary Computation
Xin Yao; Yong Xu
2006-01-01
Evolutionary computation has experienced a tremendous growth in the last decade in both theoretical analyses and industrial applications. Its scope has evolved beyond its original meaning of "biological evolution" toward a wide variety of nature inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economical computation, etc., in a unified framework. Many research topics in evolutionary computation nowadays are not necessarily "evolutionary". This paper provides an overview of some recent advances in evolutionary computation that have been made in CERCIA at the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, and theoretical results in the computational time complexity of evolutionary algorithms. Some issues related to future development of evolutionary computation are also discussed.
2007-03-01
turn to a visualization of the solutions, as conceived in 1896 by Italian economist Vilfredo Pareto. [Table-of-contents fragment: 2.6 Single and Multiobjective Optimization; 2.7 Pareto Optimality and Nondominance; 3.6.7 Calculating the Pareto Front]
M. Frutos
2013-01-01
Full Text Available Many of the problems that arise in production systems can be handled with multiobjective techniques. One of those problems is that of scheduling operations subject to constraints on the availability of machines and buffer capacity. In this paper we analyze different multiobjective evolutionary algorithms (MOEAs) for this kind of problem. We consider an experimental framework in which we schedule production operations for four real-world Job-Shop contexts using three algorithms: NSGA-II, SPEA2, and IBEA. Using two performance indexes, Hypervolume and R2, we found that SPEA2 and IBEA are the most efficient for the tasks at hand. On the other hand, IBEA seems to be the better choice since it yields more solutions in the approximate Pareto frontier.
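The Hypervolume index used in comparisons like the one above rewards fronts that dominate a large region of objective space relative to a reference point. For two minimized objectives it reduces to a sum of rectangle areas; the front and reference point below are hypothetical:

```python
def hypervolume_2d(front, ref):
    """Area dominated by a mutually nondominated 2-D front (minimization) up to ref."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):      # ascending f1, hence descending f2 on a front
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

front = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
hv = hypervolume_2d(front, ref=(4.0, 5.0))
print(hv)  # 8.0
```

The sweep assumes the input points are mutually nondominated; dominated points would be double-counted.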
Andrés-Toro, B.; Girón-Sierra, J.M.; Fernández-Blanco, P.; López-Orozco, J.A.; Besada-Portas, E.
2004-01-01
This paper describes empirical research on the model, optimization and supervisory control of beer fermentation. Conditions in the laboratory were made as similar as possible to brewery industry conditions. Since mathematical models that consider realistic industrial conditions were not available, a new mathematical model design involving industrial conditions was first developed. Batch fermentations are multiobjective dynamic processes that must be guided along optimal paths to obtain good results. The paper describes a direct way to apply a Pareto set approach with multiobjective evolutionary algorithms (MOEAs). Successful findings of optimal ways to drive these processes were reported. Once obtained, the mathematical fermentation model was used to optimize the fermentation process by using an intelligent control based on certain rules.
A Hybrid Multiobjective Evolutionary Approach for Flexible Job-Shop Scheduling Problems
Jian Xiong
2012-01-01
Full Text Available This paper addresses multiobjective flexible job-shop scheduling problem (FJSP with three simultaneously considered objectives: minimizing makespan, minimizing total workload, and minimizing maximal workload. A hybrid multiobjective evolutionary approach (H-MOEA is developed to solve the problem. According to the characteristic of FJSP, a modified crowding distance measure is introduced to maintain the diversity of individuals. In the proposed H-MOEA, well-designed chromosome representation and genetic operators are developed for FJSP. Moreover, a local search procedure based on critical path theory is incorporated in H-MOEA to improve the convergence ability of the algorithm. Experiment results on several well-known benchmark instances demonstrate the efficiency and stability of the proposed algorithm. The comparison with other recently published approaches validates that H-MOEA can obtain Pareto-optimal solutions with better quality and/or diversity.
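The modified crowding distance in H-MOEA builds on the standard NSGA-II measure, which sums each individual's normalized gap between neighbors along every objective and assigns boundary individuals infinite distance so they are always retained. A generic sketch of the standard measure (the objective vectors are illustrative, not the paper's modified variant):

```python
def crowding_distance(front):
    """Standard NSGA-II crowding distance for a list of objective vectors."""
    n, m = len(front), len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep boundary solutions
        if hi == lo:
            continue
        for a, b, c in zip(order, order[1:], order[2:]):
            dist[b] += (front[c][k] - front[a][k]) / (hi - lo)
    return dist

front = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
d = crowding_distance(front)
print(d)  # boundary points get inf; the middle point gets a finite crowding value
```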
Ebtehaj, Isa; Bonakdari, Hossein; Khoshbin, Fatemeh
2016-10-01
To determine the minimum velocity required to prevent sedimentation, six different models were proposed to estimate the densimetric Froude number (Fr). The dimensionless parameters of the models were applied along with a combination of the group method of data handling (GMDH) and the multi-target genetic algorithm. Therefore, an evolutionary design of the generalized GMDH was developed using a genetic algorithm with a specific coding scheme so as not to restrict connectivity configurations to abutting layers only. In addition, a new preserving mechanism by the multi-target genetic algorithm was utilized for the Pareto optimization of GMDH. The results indicated that the most accurate model was the one that used the volumetric concentration of sediment (CV), relative hydraulic radius (d/R), dimensionless particle number (Dgr) and overall sediment friction factor (λs) in estimating Fr. Furthermore, the comparison between the proposed method and traditional equations indicated that GMDH is more accurate than existing equations.
Optimization of constrained multiple-objective reliability problems using evolutionary algorithms
Salazar, Daniel [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain) and Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: danielsalazaraponte@gmail.com; Rocco, Claudio M. [Facultad de Ingenieria, Universidad Central Venezuela, Caracas (Venezuela)]. E-mail: crocco@reacciun.ve; Galvan, Blas J. [Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Division de Computacion Evolutiva y Aplicaciones (CEANI), Universidad de Las Palmas de Gran Canaria, Islas Canarias (Spain)]. E-mail: bgalvan@step.es
2006-09-15
This paper illustrates the use of multi-objective optimization to solve three types of reliability optimization problems: to find the optimal number of redundant components, find the reliability of components, and determine both their redundancy and reliability. In general, these problems have been formulated as single objective mixed-integer non-linear programming problems with one or several constraints and solved by using mathematical programming techniques or special heuristics. In this work, these problems are reformulated as multiple-objective problems (MOP) and then solved by using a second-generation Multiple-Objective Evolutionary Algorithm (MOEA) that allows handling constraints. The MOEA used in this paper (NSGA-II) demonstrates the ability to identify a set of optimal solutions (Pareto front), which provides the Decision Maker with a complete picture of the optimal solution space. Finally, the advantages of both MOP and MOEA approaches are illustrated by solving four redundancy problems taken from the literature.
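The core ranking step of NSGA-II, fast nondominated sorting, peels the population into successive Pareto fronts; the rank-1 front is the solution set reported to the decision maker. A compact (not asymptotically fast) sketch with made-up objective vectors:

```python
def nondominated_sort(pop):
    """Split objective vectors (minimization) into successive nondominated fronts."""
    def dom(a, b):
        return all(x <= y for x, y in zip(a, b)) and a != b
    remaining = list(range(len(pop)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dom(pop[j], pop[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Hypothetical (cost, unreliability) vectors for five candidate designs
pop = [(1, 4), (2, 2), (3, 3), (4, 1), (4, 4)]
ranks = nondominated_sort(pop)
print(ranks)  # [[0, 1, 3], [2], [4]]
```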
Wang, Chun; Ji, Zhicheng; Wang, Yan
2017-07-01
In this paper, the multi-objective flexible job shop scheduling problem (MOFJSP) was studied with the objectives of minimizing makespan, total workload and critical workload. A variable neighborhood evolutionary algorithm (VNEA) was proposed to obtain a set of Pareto optimal solutions. First, two novel crowding operators in terms of the decision space and objective space were proposed, and they were respectively used in mating selection and environmental selection. Then, two well-designed neighborhood structures were used in local search, which consider the problem characteristics and can maintain fast convergence. Finally, extensive comparisons were carried out with state-of-the-art methods specifically proposed for solving MOFJSP on well-known benchmark instances. The results show that the proposed VNEA is more effective than other algorithms in solving MOFJSP.
Evolutionary constrained optimization
Deb, Kalyanmoy
2015-01-01
This book makes available a self-contained collection of modern research addressing the general constrained optimization problems using evolutionary algorithms. Broadly the topics covered include constraint handling for single and multi-objective optimizations; penalty function based methodology; multi-objective based methodology; new constraint handling mechanism; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mix variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environment. There is also a separate chapter on hybrid optimization, which is gaining lots of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novice, and experts alike. The book will also be useful...
Evolutionary status of Entamoeba
DONG Jiuhong; WEN Jianfan; XIN Dedong; LU Siqi
2004-01-01
In addition to its medical importance as a parasitic pathogen, Entamoeba has long aroused interest in its evolutionary status. Lacking a mitochondrion and other intracellular organelles common to typical eukaryotes, Entamoeba and several other amitochondrial protozoans have been recognized as ancient pre-mitochondriate eukaryotes and named "archezoa", the most primitive extant eukaryotes. It was suggested that they might be living fossils that remained in a primitive stage of evolution before the acquisition of organelles, lying close to the transition between prokaryotes and eukaryotes. However, recent studies revealed that Entamoeba contains an organelle, the "crypton" or "mitosome", which is regarded as a specialized or reduced mitochondrion. Relevant molecular phylogenetic analyses also indicated the existence, or probable existence, of a mitochondrion in Entamoeba. Our phylogenetic analysis based on DNA topoisomerase II strongly suggested its divergence after some mitochondriate eukaryotes. Here, these recent studies are reviewed and the evolutionary status of Entamoeba is discussed.
Evolutionary internalized regularities.
Schwartz, R
2001-08-01
Roger Shepard's proposals and supporting experiments concerning evolutionary internalized regularities have been very influential in the study of vision and in other areas of psychology and cognitive science. This paper examines issues concerning the need, nature, explanatory role, and justification for postulating such internalized constraints. In particular, I seek further clarification from Shepard on how best to understand his claim that principles of kinematic geometry underlie phenomena of motion perception. My primary focus is on the ecological validity of Shepard's kinematic constraint in the context of ordinary motion perception. First, I explore the analogy Shepard draws between internalized circadian rhythms and the supposed internalization of kinematic geometry. Next, questions are raised about how to interpret and justify applying results from his own and others' experimental studies of apparent motion to more everyday cases of motion perception in richer environments. Finally, some difficulties with Shepard's account of the evolutionary development of his kinematic constraint are considered.
Evolutionary biology of cancer.
Crespi, Bernard; Summers, Kyle
2005-10-01
Cancer is driven by the somatic evolution of cell lineages that have escaped controls on replication and by the population-level evolution of genes that influence cancer risk. We describe here how recent evolutionary ecological studies have elucidated the roles of predation by the immune system and competition among normal and cancerous cells in the somatic evolution of cancer. Recent analyses of the evolution of cancer at the population level show how rapid changes in human environments have augmented cancer risk, how strong selection has frequently led to increased cancer risk as a byproduct, and how anticancer selection has led to tumor-suppression systems, tissue designs that slow somatic evolution, constraints on morphological evolution and even senescence itself. We discuss how applications of the tools of ecology and evolutionary biology are poised to revolutionize our understanding and treatment of this disease.
Browne, Cameron
2011-01-01
The book describes the world's first successful experiment in fully automated board game design. Evolutionary methods were used to derive new rule sets within a custom game description language, and self-play trials used to estimate each derived game's potential to interest human players. The end result is a number of new and interesting games, one of which has proved popular and gone on to be commercially published.
Evolutionary theory of cancer.
Attolini, Camille Stephan-Otto; Michor, Franziska
2009-06-01
As Theodosius Dobzhansky famously noted in 1973, "Nothing in biology makes sense except in the light of evolution," and cancer is no exception to this rule. Our understanding of cancer initiation, progression, treatment, and resistance has advanced considerably by regarding cancer as the product of evolutionary processes. Here we review the literature of mathematical models of cancer evolution and provide a synthesis and discussion of the field.
Multi-objective evolutionary algorithm for operating parallel reservoir system
Chang, Li-Chiu; Chang, Fi-John
2009-10-01
This paper applies a multi-objective evolutionary algorithm, the non-dominated sorting genetic algorithm (NSGA-II), to examine the operations of a multi-reservoir system in Taiwan. The Feitsui and Shihmen reservoirs are the most important water supply reservoirs in Northern Taiwan, meeting the domestic and industrial water supply needs of over 7 million residents. A daily operational simulation model is developed to guide the releases of the reservoir system and then to calculate the shortage indices (SI) of both reservoirs over a long-term simulation period. The NSGA-II is used to minimize the SI values through identification of optimal joint operating strategies. Based on a 49 year data set, we demonstrate that better operational strategies would reduce shortage indices for both reservoirs. The results indicate that the NSGA-II provides a promising approach. The Pareto-front optimal solutions identified operational compromises for the two reservoirs that would be expected to improve joint operations.
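A commonly used form of the shortage index aggregates squared relative deficits over the simulation horizon, SI = (100/N) * sum_i (deficit_i / demand_i)^2. The sketch below uses this conventional definition with hypothetical supply and demand figures, not values quoted from the paper:

```python
def shortage_index(supplies, demands):
    """Shortage index SI = (100/N) * sum((deficit_i / demand_i)**2) over N periods."""
    n = len(demands)
    return (100.0 / n) * sum(
        (max(d - s, 0.0) / d) ** 2 for s, d in zip(supplies, demands)
    )

# Hypothetical supply vs. demand (e.g. million m^3) over four periods
demands = [100.0, 100.0, 100.0, 100.0]
supplies = [100.0, 90.0, 80.0, 100.0]
si = shortage_index(supplies, demands)
print(round(si, 2))  # 1.25
```

Squaring the relative deficits penalizes a single severe shortage more than several mild ones, which is why the index is favored for reservoir operation studies.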
A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis
Zhiming Song
2015-01-01
Full Text Available As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m - 1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize this regularity to design multiobjective optimization algorithms has become a research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is an (m - 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on nondominated sorting is used to choose the individuals for the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.
Asymmetric Evolutionary Games.
Alex McAvoy
2015-08-01
Full Text Available Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner's Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games.
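Asymmetric (bimatrix) games of the kind discussed above are often analyzed with two-population replicator dynamics, where each population's strategy frequencies grow in proportion to their payoff advantage over the population mean. The sketch below uses a hypothetical asymmetric Prisoner's Dilemma payoff pair, not values from the paper:

```python
def bimatrix_replicator(A, B, x, y, steps=2000, dt=0.01):
    """Euler-discretized two-population replicator dynamics for a bimatrix game.
    A[i][j]: payoff to row strategy i vs column strategy j; B[i][j]: column payoff."""
    for _ in range(steps):
        fx = [sum(A[i][j] * y[j] for j in range(len(y))) for i in range(len(x))]
        fy = [sum(B[i][j] * x[i] for i in range(len(x))) for j in range(len(y))]
        mx = sum(xi * f for xi, f in zip(x, fx))
        my = sum(yj * f for yj, f in zip(y, fy))
        x = [xi + dt * xi * (f - mx) for xi, f in zip(x, fx)]
        y = [yj + dt * yj * (f - my) for yj, f in zip(y, fy)]
    return x, y

# Hypothetical asymmetric Prisoner's Dilemma (strategies: cooperate, defect)
A = [[3, 0], [5, 1]]   # row player's payoffs
B = [[2, 4], [0, 1]]   # column player's payoffs (deliberately different)
x, y = bimatrix_replicator(A, B, [0.5, 0.5], [0.5, 0.5])
print([round(v, 3) for v in x], [round(v, 3) for v in y])  # defection fixates in both
```

Since defection strictly dominates for both players under these payoffs, both populations converge to all-defect, the asymmetric analogue of the symmetric dilemma's outcome.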
van Kesteren, Z; Janssen, T M; Damen, E; van Vliet-Vroegindeweij, C
2012-05-21
To evaluate in an objective way the effect of leaf interdigitation and leaf width on volumetric modulated arc therapy plans in Pinnacle. Three multileaf collimators (MLCs) were modeled: two 10 mm leaf width MLCs, with and without interdigitating leaves, and a 5 mm leaf width MLC with interdigitating leaves. Three rectum patients and three prostate patients were used for the planning study. In order to compare treatment techniques in an objective way, a Pareto front comparison was carried out. 200 plans were generated in an automated way per patient per MLC model, resulting in a total of 3600 plans. From these plans, Pareto-optimal plans were selected and evaluated for various dosimetric variables. The capability of leaf interdigitation showed little dosimetric impact on the treatment plans, when comparing the 10 mm leaf width MLC with and without leaf interdigitation. When comparing the 10 mm leaf width MLC with the 5 mm leaf width MLC, both with interdigitating leaves, improvement in plan quality was observed. For both patient groups, the integral dose was reduced by 0.6 J for the thin MLC. For the prostate patients, the mean dose to the anal sphincter was reduced by 1.8 Gy and the conformity of the V(95%) was reduced by 0.02 using the thin MLC. The V(65%) of the rectum was reduced by 0.1% and the dose homogeneity by 1.5%. For rectum patients, the mean dose to the bowel was reduced by 1.4 Gy and the mean dose to the bladder by 0.8 Gy for the thin MLC. The conformity of the V(95%) was equivalent for the 10 and 5 mm leaf width MLCs for the rectum patients. We have objectively compared three types of MLCs in a planning study for prostate and rectum patients by analyzing Pareto-optimal plans which were generated in an automated way. Interdigitation of MLC leaves does not generate better plans using the SmartArc algorithm in Pinnacle. Changing the MLC leaf width from 10 to 5 mm generates better treatment plans although the clinical relevance remains
Generalized Pareto Distribution Fit to Medical Insurance Claims Data
Ouyang, Zisheng; Xie, Chi
2006-01-01
How to choose an optimal threshold is a key problem in the generalized Pareto distribution (GPD) model. This paper attains the exact threshold by testing for GPD, and shows that the GPD model allows the actuary to easily estimate high quantiles and the probable maximum loss from medical insurance claims data.
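Once a threshold u is chosen, exceedances over u are fitted by a GPD and high quantiles follow from the tail formula x_p = u + (sigma/xi) * (((1 - p)/zeta_u)**(-xi) - 1), where zeta_u is the fraction of observations above u. The sketch below uses simple method-of-moments estimates on synthetic claim data (a Pareto sample with known tail shape, an illustrative assumption rather than the paper's data or its threshold-testing procedure):

```python
import random
import statistics

def gpd_fit_moments(exceedances):
    """Method-of-moments GPD estimates (shape xi, scale sigma); valid for xi < 1/2."""
    m = statistics.mean(exceedances)
    r = m * m / statistics.variance(exceedances)
    return 0.5 * (1.0 - r), 0.5 * m * (r + 1.0)

def gpd_quantile(p, u, xi, sigma, zeta_u):
    """Quantile x_p with P(X <= x_p) = p, from the GPD tail approximation."""
    return u + (sigma / xi) * (((1.0 - p) / zeta_u) ** (-xi) - 1.0)

random.seed(1)
claims = [random.paretovariate(8.0) for _ in range(20000)]   # true tail shape 1/8
u = sorted(claims)[int(0.9 * len(claims))]                   # 90th-percentile threshold
exc = [c - u for c in claims if c > u]
xi, sigma = gpd_fit_moments(exc)
q99 = gpd_quantile(0.99, u, xi, sigma, len(exc) / len(claims))
print(round(xi, 3), round(q99, 3))  # shape estimate near the true value 1/8
```

Moment estimators are simple but inefficient; maximum-likelihood fitting is standard actuarial practice, especially when the shape may exceed 1/2.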
Arnaut Dierck
2015-01-01
Full Text Available Designing textile antennas for real-life applications requires a design strategy that is able to produce antennas that are optimized over a wide bandwidth for often conflicting characteristics, such as impedance matching, axial ratio, efficiency, and gain, and, moreover, that is able to account for the variations in the characteristics of the unconventional materials used in smart textile systems. In this paper, such a strategy, incorporating a multiobjective constrained Pareto optimization, is presented and applied to the design of a Galileo E6-band antenna with optimal return loss and wide-band axial ratio characteristics. Subsequently, different prototypes of the optimized antenna are fabricated and measured to validate the proposed design strategy.
A Deontic Study of Authorized Donees Based on the Theories of Sax and Pareto
Héctor Torres Solis
2013-01-01
Full Text Available Mexico's Income Tax Law taxes the income received by individuals and legal entities. The law establishes a tax regime for legal entities not subject to income tax: non-profit legal entities authorized to receive donations. The aim of this work is to study the distribution of wealth and the social inequalities that society faces, based on Sax's theory of relative utility and Pareto's sociological theory; although the two theories stand in opposition, both are useful for showing legal deontics as a key to interpreting the tax laws that govern authorized donees.
Leimbach, Marian [Potsdam-Institut fuer Klimafolgenforschung e.V., Potsdam (Germany); Eisenack, Klaus [Oldenburg Univ. (Germany). Dept. of Economics and Statistics
2008-11-15
In this paper we present an algorithm that deals with trade interactions within a multi-region model. In contrast to traditional approaches, this algorithm is able to handle spillover externalities. Technological spillovers are expected to foster the diffusion of new technologies, which helps to lower the cost of climate change mitigation. We focus on technological spillovers which are due to capital trade. The algorithm for finding a Pareto-optimal solution in an intertemporal framework is embedded in a decomposed optimization process. The paper analyzes convergence and equilibrium properties of this algorithm. In the final part of the paper, we apply the algorithm to investigate possible impacts of technological spillovers. While the benefits of technological spillovers are significant for the capital-importing region, benefits for the capital-exporting region depend on the type of regional disparities and the resulting specialization and terms-of-trade effects. (orig.)
José Raúl Castro
2016-02-01
Full Text Available This paper presents an efficient algorithm to solve the multi-objective (MO) voltage control problem in distribution networks. The proposed algorithm minimizes the following three objectives: voltage variation on pilot buses, reactive power production ratio deviation, and generator voltage deviation. This work leverages two optimization techniques: fuzzy logic to find the optimum value of the reactive power of the distributed generation (DG), and Pareto optimization to find the optimal value of the pilot bus voltage so that it produces lower losses under the constraint that the voltage remains within established limits. Variable loads and DGs are taken into account in this paper. The algorithm is tested on an IEEE 13-node test feeder and the results show the effectiveness of the proposed model.
Hurford, Anthony; Harou, Julien
2014-05-01
Water-related ecosystem services are important to the livelihoods of the poorest sectors of society in developing countries. Degradation or loss of these services can increase the vulnerability of people, decreasing their capacity to support themselves. New approaches are needed to help guide water resources management decisions that account for the non-market value of ecosystem goods and services. In case studies from Brazil and Kenya we demonstrate the capability of many-objective Pareto-optimal trade-off analysis to help decision makers balance economic and non-market benefits from the management of existing multi-reservoir systems. A multi-criteria search algorithm is coupled to a water resources management simulator of each basin to generate a set of Pareto-approximate trade-offs representing the best-case management decisions. In both cases, volume-dependent reservoir release rules are the management decisions being optimised. In the Kenyan case we further assess the impacts of proposed irrigation investments, and how the possibility of new investments affects the system's trade-offs. During the multi-criteria search (optimisation), the performance of different sets of management decisions (policies) is assessed against case-specific objective functions representing provision of water supply and irrigation, hydropower generation and maintenance of ecosystem services. Results are visualised as trade-off surfaces to help decision makers understand the impacts of different policies on a broad range of stakeholders and to assist in decision-making. These case studies show how the approach can reveal unexpected opportunities for win-win solutions, and quantify the trade-offs between investing to increase agricultural revenue and negative impacts on protected ecosystems which support rural livelihoods.
Pozo, Carlos; Guillén-Gosálbez, Gonzalo; Sorribas, Albert; Jiménez, Laureano
2012-01-01
Optimization models in metabolic engineering and systems biology focus typically on optimizing a unique criterion, usually the synthesis rate of a metabolite of interest or the rate of growth. Connectivity and non-linear regulatory effects, however, make it necessary to consider multiple objectives in order to identify useful strategies that balance out different metabolic issues. This is a fundamental aspect, as optimization of maximum yield in a given condition may involve unrealistic values in other key processes. Due to the difficulties associated with detailed non-linear models, analysis using stoichiometric descriptions and linear optimization methods have become rather popular in systems biology. However, despite being useful, these approaches fail in capturing the intrinsic nonlinear nature of the underlying metabolic systems and the regulatory signals involved. Targeting more complex biological systems requires the application of global optimization methods to non-linear representations. In this work we address the multi-objective global optimization of metabolic networks that are described by a special class of models based on the power-law formalism: the generalized mass action (GMA) representation. Our goal is to develop global optimization methods capable of efficiently dealing with several biological criteria simultaneously. In order to overcome the numerical difficulties of dealing with multiple criteria in the optimization, we propose a heuristic approach based on the epsilon constraint method that reduces the computational burden of generating a set of Pareto optimal alternatives, each achieving a unique combination of objective values. To facilitate the post-optimal analysis of these solutions and narrow down their number prior to being tested in the laboratory, we explore the use of Pareto filters that identify the preferred subset of enzymatic profiles. We demonstrate the usefulness of our approach by means of a case study that optimizes the …
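The epsilon-constraint heuristic described in the abstract reduces a multi-objective problem to a sweep of single-objective ones: minimize one objective while bounding the other by a moving threshold. A toy sketch for two minimisation objectives (all names and the grid-search "solver" are illustrative; the paper's GMA models require a global non-linear solver instead):

```python
def epsilon_constraint(f1, f2, candidates, eps_values):
    """Trace an approximate Pareto front for two minimisation objectives
    by minimising f1 subject to f2(x) <= eps, sweeping eps."""
    front = []
    for eps in eps_values:
        feasible = [x for x in candidates if f2(x) <= eps]
        if not feasible:
            continue
        best = min(feasible, key=f1)
        point = (f1(best), f2(best))
        if point not in front:           # keep each front point once
            front.append(point)
    return front

# toy conflicting objectives on a 1-D decision variable in [0, 1]
f1 = lambda x: x * x
f2 = lambda x: (x - 1.0) ** 2
xs = [i / 100.0 for i in range(101)]
front = epsilon_constraint(f1, f2, xs, [i / 10.0 for i in range(1, 11)])
```

Each sweep value of eps yields one Pareto-optimal alternative with a unique combination of objective values, mirroring the procedure the authors describe.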
Towards Adaptive Evolutionary Architecture
Bak, Sebastian HOlt; Rask, Nina; Risi, Sebastian
2016-01-01
This paper presents first results from an interdisciplinary project in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how … to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. Architecture-as-it-could-be is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within …
Constraints as evolutionary systems
Rácz, István
2016-01-01
The constraint equations for smooth $[n+1]$-dimensional (with $n\geq 3$) Riemannian or Lorentzian spaces satisfying the Einstein field equations are considered. It is shown, regardless of the signature of the primary space, that the constraints can be put into the form of an evolutionary system comprised either of a first order symmetric hyperbolic system and a parabolic equation or, alternatively, of a strongly hyperbolic system and a subsidiary algebraic relation. In both cases the (local) existence and uniqueness of solutions are also discussed.
Healy, Thomas J.
1993-04-01
The paper describes an evolutionary approach to the development of aerospace systems, represented by the introduction of integrated product teams (IPTs), which are now used at Rockwell's Space Systems Division on all new programs and are introduced into existing projects after demonstrated increases in quality and reductions in cost and schedule due to IPTs. Each IPT is unique, reflects its own program, and lasts for the life of the program. An IPT includes customers, suppliers, subcontractors, and associate contractors, and has a charter, mission, scope of authority, budget, and schedule. Functional management is responsible for staffing, training, method development, and generic technology development.
Distributed Evolutionary Graph Partitioning
Sanders, Peter
2011-01-01
We present a novel distributed evolutionary algorithm, KaFFPaE, to solve the Graph Partitioning Problem, which makes use of KaFFPa (Karlsruhe Fast Flow Partitioner). The use of our multilevel graph partitioner KaFFPa provides new effective crossover and mutation operators. By combining these with a scalable communication protocol we obtain a system that is able to improve the best known partitioning results for many inputs in a very short amount of time. For example, in Walshaw's well known benchmark tables we are able to improve or recompute 76% of entries for the tables with 1%, 3% and 5% imbalance.
Stress-strength reliability for general bivariate distributions
Alaa H. Abdel-Hamid
2016-10-01
Full Text Available An expression for the stress-strength reliability R = P(X1 < X2) …
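For a concrete bivariate family, R = P(X1 < X2) can be checked by simulation. A Monte Carlo sketch assuming a correlated bivariate normal stress-strength pair (the normal assumption is ours for illustration; the paper's truncated abstract does not fix the family):

```python
import math
import random

def stress_strength_R(mu1, mu2, s1, s2, rho, n=200_000, seed=1):
    """Monte Carlo estimate of R = P(X1 < X2) when (X1, X2) is
    bivariate normal with means mu1, mu2, std devs s1, s2, correlation rho."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        # build a second standard normal with correlation rho to z1
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        hits += (mu1 + s1 * z1) < (mu2 + s2 * z2)
    return hits / n
```

In this normal case the estimate can be cross-checked against the closed form R = Phi((mu2 - mu1) / sqrt(s1^2 + s2^2 - 2*rho*s1*s2)).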
The evolutionary language game.
Nowak, M A; Plotkin, J B; Krakauer, D C
1999-09-21
We explore how evolutionary game dynamics have to be modified to accommodate a mathematical framework for the evolution of language. In particular, we are interested in the evolution of vocabulary, that is, associations between signals and objects. We assume that successful communication contributes to biological fitness: individuals who communicate well leave more offspring. Children inherit from their parents a strategy for language learning (a language acquisition device). We consider three mechanisms whereby language is passed from one generation to the next: (i) parental learning: children learn the language of their parents; (ii) role model learning: children learn the language of individuals with a high payoff; and (iii) random learning: children learn the language of randomly chosen individuals. We show that parental and role model learning outperform random learning. Then we introduce mistakes in language learning and study how this process changes language over time. Mistakes increase the overall efficacy of parental and role model learning: in a world with errors evolutionary adaptation is more efficient. Our model also provides a simple explanation of why homonymy is common while synonymy is rare. Copyright 1999 Academic Press.
Nguyen, N.-T.; Chakraborti, N.; Barlat, F.
2013-12-01
In this study, an advanced high strength steel (AHSS)/mild steel TWB sheet is subjected to U-draw bending springback under non-constant blank holding force (BHF). Two different BHF-punch stroke profiles are applied on the two sides of the blank. A systematic approach to obtain optimal BHF-stroke profiles is proposed. The optimal condition must satisfy two conflicting objectives simultaneously: (1) minimize springback deformation and (2) minimize the forming severity, leading to a Pareto-optimal problem. The optimization procedure consists of the following steps: sampling design, finite element (FE) simulations, metamodeling, and finally the calculation of a Pareto frontier. PAM-STAMP® FE software is employed in this study. The outputs of FE simulations at statistically significant sampling points are then used to construct metamodels of optimum accuracy and complexity, which, in turn, are used to evaluate the output for any set of inputs, replacing the computationally intensive FE simulations. A novel genetic-algorithm-based multi-objective optimization technique is applied for the optimization. Though still far from being completely removed, springback in TWBs can be appreciably reduced using the proposed approach of variable BHF control.
Govaert, Lynn; Pantel, Jelena H; De Meester, Luc
2016-08-01
Interest in eco-evolutionary dynamics is rapidly increasing thanks to ground-breaking research indicating that evolution can occur rapidly and can alter the outcome of ecological processes. A key challenge in this sub-discipline is establishing how important the contribution of evolutionary and ecological processes and their interactions are to observed shifts in population and community characteristics. Although a variety of metrics to separate and quantify the effects of evolutionary and ecological contributions to observed trait changes have been used, they often allocate fractions of observed changes to ecology and evolution in different ways. We used a mathematical and numerical comparison of two commonly used frameworks - the Price equation and reaction norms - to reveal that the Price equation cannot partition genetic from non-genetic trait change within lineages, whereas the reaction norm approach cannot partition among- from within-lineage trait change. We developed a new metric that combines the strengths of both Price-based and reaction norm metrics, extended all metrics to analyse community change and also incorporated extinction and colonisation of species in these metrics. Depending on whether our new metric is applied to populations or communities, it can correctly separate intraspecific, interspecific, evolutionary, non-evolutionary and interacting eco-evolutionary contributions to trait change. © 2016 John Wiley & Sons Ltd/CNRS.
Enhanced Pareto multi-objective collaborative optimization strategy
Long Teng; Liu Li
2012-01-01
In order to improve the convergence performance of the standard collaborative optimization strategy and extend its multi-objective optimization capability, an enhanced collaborative optimization using a Pareto multi-objective genetic algorithm (ECO-PMGA) is proposed, adopting the Pareto multi-objective genetic algorithm in the system-level optimization. A sequential ranking method considering the crowding degree is developed to ensure the Pareto optimality and even distribution of the non-inferior solutions. Interdisciplinary consistency constraints in 2-norm form are employed to improve the efficiency of the discipline-level optimizations in ECO-PMGA. The numerical stability and the capability of searching the Pareto non-inferior solution set are validated on two typical optimization problems. The results indicate that the convergence of the system-level optimization and the numerical stability of ECO-PMGA are considerably enhanced; moreover, ECO-PMGA shows good performance in achieving the Pareto optimal set. Accordingly, the proposed ECO-PMGA is practical and valuable for the multi-objective optimization of complex coupled systems.
Evolutionary neurobiology and aesthetics.
Smith, Christopher Upham
2005-01-01
If aesthetics is a human universal, it should have a neurobiological basis. Although use of all the senses is, as Aristotle noted, pleasurable, the distance senses are primarily involved in aesthetics. The aesthetic response emerges from the central processing of sensory input. This occurs very rapidly, beneath the level of consciousness, and only the feeling of pleasure emerges into the conscious mind. This is exemplified by landscape appreciation, where it is suggested that a computation built into the nervous system during Paleolithic hunter-gathering is at work. Another inbuilt computation leading to an aesthetic response is the part-whole relationship. This, it is argued, may be traced to the predator-prey "arms races" of evolutionary history. Mate selection also may be responsible for part of our response to landscape and visual art. Aesthetics lies at the core of human mentality, and its study is consequently of importance not only to philosophers and art critics but also to neurobiologists.
Spore: Spawning Evolutionary Misconceptions?
Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.
2010-10-01
The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.
Total Integrative Evolutionary Communication
Nedergaard Thomsen, Ole; Brier, Søren
2014-01-01
In this paper we outline a cybersemiotic foundation for the trend of pragmatics-based functional linguistics, Functional Discourse Grammar. Cybersemiotics is a substantial inter- and transdisciplinary semiotic theory which integrates, on the one hand, second-order cybernetics and autopoiesis theory … (2) instinctual-motivational-emotional sign plays (a level which is shared with other animals and is the domain of ethology), and (3) premeditated, intentional symbol-based language games (specifically human unitary thinking-speaking-gesturing, the domain of pragmatics-based functional linguistics). In this inclusive hierarchy language games subsume the other stages, and thus human evolutionary communication is primarily a symbolic-conventional practice. It is intertwined with the practice of living, that is, with different life forms, including other forms of semiotic behavior. Together they form a coherent …
Zhou, Mingxing; Liu, Jing
2017-02-01
Designing robust networks has attracted increasing attention in recent years. Most existing work focuses on improving the robustness of networks against a specific type of attack. However, networks which are robust against one type of attack may not be robust against another type. In real-world situations, different types of attacks may happen simultaneously. Therefore, we use the Pearson's correlation coefficient to analyze the correlation between different types of attacks, model the robustness measures against different types of attacks which are negatively correlated as objectives, and model the problem of optimizing the robustness of networks against multiple malicious attacks as a multiobjective optimization problem. Furthermore, to effectively solve this problem, we propose a two-phase multiobjective evolutionary algorithm, labeled as MOEA-RSFMMA. In MOEA-RSFMMA, a single-objective sampling phase is first used to generate a good initial population for the later two-objective optimization phase. Such a two-phase optimizing pattern well balances the computational cost of the two objectives and improves the search efficiency. In the experiments, both synthetic scale-free networks and real-world networks are used to validate the performance of MOEA-RSFMMA. Moreover, both local and global characteristics of networks in different parts of the obtained Pareto fronts are studied. The results show that the networks in different parts of the Pareto fronts reflect different properties, and provide various choices for decision makers.
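The correlation test underlying the attack-pair selection above is plain Pearson correlation between robustness scores of the same networks under two attack types; a negative coefficient signals genuinely conflicting objectives. A minimal sketch (data and function name are illustrative):

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equally long samples,
    e.g. robustness scores of the same networks under two attack types."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A value near -1 (scores that rise under one attack fall under the other) is exactly the case the authors model as a two-objective problem.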
Yan Zhen-yu; Kang Li-shan; Lin Guang-ming; He Mei
2003-01-01
Multi-objective Evolutionary Algorithms (MOEAs) are becoming a hot research area and quite a few aspects of MOEAs have been studied and discussed. However, there is still little literature discussing the roles of search and selection operators in MOEAs. This paper studies their roles by solving a case of discrete multi-objective optimization problem (MOP), the multi-objective TSP, with a new MOEA. In the new MOEA, we adopt an efficient search operator, which has the properties of both crossover and mutation, to generate new individuals, and choose two selection operators, Family Competition and Population Competition, applied with probabilities to realize selection. The simulation experiments showed that this new MOEA obtains well-distributed solutions representing the Pareto front and outperformed SPEA in almost every simulation run on this problem. Furthermore, we analyzed its convergence property using a finite Markov chain and proved that it converges to the Pareto front with probability 1. We also find that the convergence property of MOEAs depends strongly on the search and selection operators.
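A core operation shared by the MOEAs in these abstracts (SPEA's external archive included) is filtering a set of solutions down to its Pareto-nondominated members. A minimal sketch for minimisation objectives (the function name is ours):

```python
def nondominated(points):
    """Return the Pareto-nondominated subset of a list of objective
    vectors (minimisation), as an MOEA maintains for its archive."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]
```

For example, in a population {(1, 3), (2, 2), (3, 1), (2, 3), (3, 3)} the last two points are dominated and only the first three survive into the archive.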
Evolutionary Design of Boolean Functions
WANG Zhang-yi; ZHANG Huan-guo; QIN Zhong-ping; MENG Qing-shu
2005-01-01
We use evolutionary computing to synthesize Boolean functions randomly. By using specific crossover and mutation operators in the evolving process and modifying the search space and fitness function, we obtain some high-nonlinearity functions which have other good cryptographic characteristics, such as autocorrelation. Compared to other heuristic search techniques, the evolutionary computing approach is more effective because of its global search strategy and implicit parallelism.
Topics of Evolutionary Computation 2001
Ursem, Rasmus Kjær
This booklet contains the student reports from the course Topics of Evolutionary Computation, Fall 2001, given by Thiemo Krink, Rene Thomsen and Rasmus K. Ursem.
Open Issues in Evolutionary Robotics.
Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne
2016-01-01
One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.
Evolutionary trends in directional hearing
Carr, Catherine E; Christensen-Dalsgaard, Jakob
2016-01-01
Tympanic hearing is a true evolutionary novelty that arose in parallel within early tetrapods. We propose that in these tetrapods, selection for sound localization in air acted upon pre-existing directionally sensitive brainstem circuits, similar to those in fishes. Auditory circuits in birds … interactions between coding strategies and evolutionary history.
Evolutionary Explanations of Eating Disorders
Igor Kardum
2008-12-01
Full Text Available This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. In the second part, the selective pressures and evolved adaptations causing the contemporary epidemic of obesity, as well as differences in dietary regimes and life-style between modern humans and their ancestors, are described. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants derived from kin selection theory and the model of parental manipulation are elaborated. The sexual competition hypothesis of eating disorders, the adapted-to-flee-famine hypothesis, as well as explanations based on the concept of social attention holding power and the need to belong, are also discussed. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.
Yeonjoo Kim
2015-03-01
Full Text Available This study developed a robust parameter set (ROPS) selection framework for a rainfall-runoff model that considers multiple events using the Pareto optimum and the minimax regret approach (MRA). The calibrated parameter sets based on the Nash-Sutcliffe coefficient (NSE) for two events were derived using a genetic algorithm. We generated 41 combinations of weighting values between two events for the multi-event objective function and derived 41 Pareto optimum points that were considered as the ROPS candidates. Then, two different approaches for parameter selection were proposed to determine the ROPS among the candidates: one uses NSE only and the other uses four performance measures (NSE, peak flow error, root mean square error and percentage of bias). In the NSE-only method, five events, including two events from the calibration set and three events from the evaluation set, were used, and the ROPS was selected based on the regrets of both the calibration and the evaluation sets. In the multiple-performance-measure (i.e., four-measure) method, only three events from the evaluation set were used, and the ROPS was determined based on the regrets of twelve different cases, i.e., three events with four measures. As a result, while single- and multi-event optimizations produced satisfying results for the calibration events, the optimized parameters from the single-event calibration do not perform well for another event, even one judged by the same criterion, such as NSE. The results of this study suggest that the optimized parameter set from the well-weighted objective function can successfully simulate not only hydrographs in general but also other characteristics, such as peak flow. In addition, the ROPS can be selected by considering the multiple performance measures of multiple validation events, as well as the NSE only of multiple calibration and validation events. Note that the study provides a framework that could be performed reasonably well with a limited number of events. While …
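The minimax regret step described above is mechanical once the candidate-by-event scores are tabulated: the regret of a candidate on an event is the gap to the best score achieved on that event, and the ROPS is the candidate whose worst-case regret is smallest. A sketch (the score table is illustrative, not the study's data):

```python
def minimax_regret(scores):
    """scores[c][e]: performance (higher is better, e.g. NSE) of
    candidate parameter set c on validation event e.
    Returns the index of the candidate with the smallest worst-case regret."""
    n_events = len(scores[0])
    # best achievable score on each event across all candidates
    best = [max(row[e] for row in scores) for e in range(n_events)]
    # worst-case (maximum) regret of each candidate over the events
    max_regret = [max(best[e] - row[e] for e in range(n_events)) for row in scores]
    return min(range(len(scores)), key=lambda c: max_regret[c])
```

For scores [[0.9, 0.5], [0.7, 0.8], [0.6, 0.9]] the per-candidate worst-case regrets are 0.4, 0.2 and 0.3, so the middle candidate is the robust choice even though it is best on neither event.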
Wang, Yibing; Breedveld, Sebastiaan; Heijmen, Ben; Petit, Steven F.
2016-06-01
IMRT planning with commercial Treatment Planning Systems (TPSs) is a trial-and-error process. Consequently, the quality of treatment plans may not be consistent among patients, planners and institutions. Recently, different plan quality assurance (QA) models have been proposed that could flag and guide improvement of suboptimal treatment plans. However, the performance of these models was validated using plans that were created using the conventional trial-and-error treatment planning process. Consequently, it is challenging to assess and compare quantitatively the accuracy of different treatment planning QA models. Therefore, we created a golden-standard dataset of consistently planned Pareto-optimal IMRT plans for 115 prostate patients. Next, the dataset was used to assess the performance of a treatment planning QA model that uses the overlap volume histogram (OVH). The 115 prostate IMRT plans were fully automatically planned using our in-house developed TPS Erasmus-iCycle. An existing OVH model was trained on the plans of 58 of the patients. Next it was applied to predict DVHs of the rectum, bladder and anus of the remaining 57 patients. The predictions were compared with the achieved values of the golden-standard plans for the rectum Dmean, V65 and V75, and for the Dmean of the anus and the bladder. For the rectum, the prediction errors (predicted minus achieved) were only -0.2 ± 0.9 Gy (mean ± 1 SD) for Dmean, -1.0 ± 1.6% for V65, and -0.4 ± 1.1% for V75. For the Dmean of the anus and the bladder, the prediction errors were 0.1 ± 1.6 Gy and 4.8 ± 4.1 Gy, respectively. Increasing the training cohort to 114 patients only led to minor improvements. A dataset of consistently planned Pareto-optimal prostate IMRT plans was generated. This dataset can be used to train new, and validate and compare existing, treatment planning QA models, and has been made publicly available. The OVH model was highly accurate.
Maria Luisa Maniscalco
2017-08-01
Full Text Available Vilfredo Pareto is a key figure in the social sciences whose contributions have touched several disciplines, from economics to sociology to political science (Femia, Marshall, 2012). An attentive and critical observer of his times, which he read in the light of information and data drawn from an eclectic plurality of sources, from the classics of antiquity to contemporary studies, from the Church Fathers to newspaper reports, he stood out for an originality of analysis bordering on extravagance. Though indebted to a multiplicity of authors and theories, Pareto nonetheless expressed a specificity of his own, composing an articulated and coherent mosaic whose mutual cross-references offer new perspectives each time and are structured into conceptual grids, micro-models and middle-range theories still waiting to be explored and developed. Conscious of his singularity and his talent, he took pleasure in solitude and cultivated marginality, considering them fundamental to freedom of thought and expression. As a sociologist he set out to elaborate an "other sociology" (Valade, 1990), concentrating his attention both on the constant part of social phenomena, that is, on the internal structure of conduct (the unreflective motivations of social life), and on the more changeable part, that is, on the corresponding justifications, on the constitutive principles of a logic of the "non-logical", giving prominence to the unveiling of the composite mechanisms that produce the symbolic universes of society. According to Pareto, to immediate observation social phenomena present themselves in changing forms, manifested through collective representations, customs and ideologies, which are the result of a web of relations and actions. The latter are divided into "logical actions", which are "at least in their principal part the result of reasoning", and "non-logical actions", which "originate mainly in a given state …
Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.
2017-04-01
Forbes Magazine published its list of the leading or strongest publicly-traded two thousand companies in the world (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Each of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of the firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.
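The Pareto power-law tail described above can be probed numerically. The sketch below estimates a tail exponent by least squares on the log-log complementary CDF; the 30% tail cut, the function name, and the synthetic data are illustrative assumptions, not taken from the paper.

```python
import math

def pareto_tail_exponent(values, tail_fraction=0.3):
    """Estimate the Pareto exponent of the upper tail by ordinary least
    squares on the log-log complementary CDF (illustrative sketch)."""
    xs = sorted(values)
    n = len(xs)
    start = int(n * (1 - tail_fraction))
    # points (log x, log CCDF) for the upper tail of the sample
    pts = [(math.log(xs[i]), math.log((n - i) / n)) for i in range(start, n)]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    slope = (sum((x - mx) * (y - my) for x, y in pts)
             / sum((x - mx) ** 2 for x, _ in pts))
    return -slope  # CCDF ~ x^(-alpha), so alpha = -slope

# deterministic Pareto quantiles with exponent 2 (synthetic check data)
n = 1000
synthetic = [(1 - (i + 0.5) / n) ** (-0.5) for i in range(n)]
alpha = pareto_tail_exponent(synthetic)
```

On the synthetic sample the estimate recovers an exponent close to 2, as expected for data whose CCDF is exactly x^-2.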
Inoue, J
2006-01-01
We present a simple microscopic model of economy to explain the Pareto law in the tails of wealth distributions. Our model is a kind of so-called monkey-class urn model in which $N$ urns (which might be regarded as people in society) share $M=\rho N$ balls (which might be regarded as money) under a global constraint (conservation of the total amount of money): $n_{1}+... +n_{N}=M$, where $n_{i}$ ($i=1,...,N$) is the number of balls in the $i$-th urn. Each urn possesses its own energy $E(n_{i})$. We then evaluate the probability $P(k)$ that an arbitrary urn holds $k$ balls by using statistical mechanics. If we choose the energy function as $E(n_{i}) = \epsilon_{i} n_{i}$, where $\epsilon_{i}$ is the energy level of the $i$-th urn obeying some distribution (density of states) $D(\epsilon) \sim \epsilon^{\alpha}$, we find that below the critical temperature at high density ($\rho \equiv M/N \gg 1$), Bose-Einstein condensation occurs and most of the urns fall into the lowest energy level $\epsilon=0$. As a result, the d...
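The urn dynamics described here can be sketched with a short Metropolis simulation; the parameter values, the sampling of the energy levels, and the ball-move update below are illustrative assumptions rather than the paper's exact procedure.

```python
import math
import random
from collections import Counter

def simulate_urns(N=200, rho=2.0, T=1.0, alpha=1.0, steps=200_000, seed=0):
    """Metropolis simulation of an urn model with linear energies
    E(n_i) = eps_i * n_i; all parameters are illustrative."""
    rng = random.Random(seed)
    M = int(rho * N)
    # density of states D(eps) ~ eps^alpha on [0,1] via inverse-CDF sampling
    eps = [rng.random() ** (1.0 / (alpha + 1.0)) for _ in range(N)]
    n = [0] * N
    for _ in range(M):                      # scatter the M balls at random
        n[rng.randrange(N)] += 1
    for _ in range(steps):                  # propose moving one ball i -> j
        i, j = rng.randrange(N), rng.randrange(N)
        if n[i] == 0 or i == j:
            continue
        dE = eps[j] - eps[i]                # energy change of the move
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            n[i] -= 1
            n[j] += 1
    return Counter(n)                       # empirical P(k), unnormalized

hist = simulate_urns()
```

The total number of balls is conserved by construction, mirroring the global money-conservation constraint in the abstract.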
Sun, Kaibiao; Kasperski, Andrzej; Tian, Yuan
2014-10-01
The aim of this study is the optimization of a product-driven self-cycling bioprocess and the presentation of a way to determine the best possible decision variables out of a set of alternatives based on the designed model. Initially, a product-driven generalized kinetic model, which allows a flexible choice of the most appropriate kinetics, is designed and analysed. The optimization problem is formulated as a bi-objective one, in which maximization of biomass productivity and minimization of unproductive loss of substrate are the objective functions. Then, Pareto fronts are calculated for exemplary kinetics. It is found that in the designed bioprocess, a decrease of the emptying/refilling fraction and an increase of the substrate feeding concentration cause an increase of biomass productivity, while an increase of the emptying/refilling fraction and a decrease of the substrate feeding concentration cause a decrease of unproductive loss of substrate. The preferred solutions are calculated using the minimum distance from an ideal solution method, with proposals for their modification derived from a decision maker's reactions to the generated solutions.
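The "minimum distance from an ideal solution" selection step can be sketched as follows, assuming min-max normalization of both objectives; the front values and function name are purely illustrative.

```python
import math

def preferred_solution(front):
    """Pick the Pareto-front point closest to the ideal point.
    Each point is (f1, f2): f1 = productivity (maximized),
    f2 = unproductive loss (minimized). Objectives are min-max
    normalized before measuring Euclidean distance."""
    f1s = [p[0] for p in front]
    f2s = [p[1] for p in front]
    lo1, hi1 = min(f1s), max(f1s)
    lo2, hi2 = min(f2s), max(f2s)

    def dist(p):
        # ideal point after normalization: productivity 1, loss 0
        g1 = (p[0] - lo1) / (hi1 - lo1) if hi1 > lo1 else 1.0
        g2 = (p[1] - lo2) / (hi2 - lo2) if hi2 > lo2 else 0.0
        return math.hypot(1.0 - g1, g2)

    return min(front, key=dist)

front = [(0.2, 0.05), (0.5, 0.20), (0.8, 0.60), (1.0, 1.00)]
best = preferred_solution(front)
```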
Evolutionary financial market models
Ponzi, A.; Aizawa, Y.
2000-12-01
We study computer simulations of two financial market models, the second a simplified version of the first. The first is a model of the self-organized formation and breakup of crowds of traders, motivated by the dynamics of competitive evolving systems, which shows interesting self-organized critical (SOC)-type behaviour without any fine-tuning of control parameters. This SOC-type avalanching and stasis appear as realistic volatility clustering in the price returns time series. The market becomes highly ordered at ‘crashes’ but gradually loses this order through randomization during the intervening stasis periods. The second model is a model of stocks interacting through a competitive evolutionary dynamic in a common stock exchange. This model shows a self-organized ‘market confidence’. When this is high the market is stable, but when it gets low the market may become highly volatile. Volatile bursts rapidly increase the market confidence again. This model shows a phase transition as a temperature parameter is varied. The price returns time series in the transition region follows a very realistic power-law truncated Lévy distribution with clustered volatility and volatility superdiffusion. This model also shows generally positive stock cross-correlations, as is observed in real markets, and may shed some light on why such phenomena are observed.
Evolutionary cytogenetics in salamanders.
Sessions, Stanley K
2008-01-01
Salamanders (Amphibia: Caudata/Urodela) have been the subject of numerous cytogenetic studies, and data on karyotypes and genome sizes are available for most groups. Salamanders show a more-or-less distinct dichotomy between families with large chromosome numbers and interspecific variation in chromosome number, relative size, and shape (i.e. position of the centromere), and those that exhibit very little variation in these karyological features. This dichotomy is the basis of a major model of karyotype evolution in salamanders involving a kind of 'karyotypic orthoselection'. Salamanders are also characterized by extremely large genomes (in terms of absolute mass of nuclear DNA) and extensive variation in genome size (and overall size of the chromosomes), which transcends variation in chromosome number and shape. The biological significance and evolution of chromosome number and shape within the karyotype is not yet understood, but genome size variation has been found to have strong phenotypic, biogeographic, and phylogenetic correlates that reveal information about the biological significance of this cytogenetic variable. Urodeles also present the advantage of only 10 families and less than 600 species, which facilitates the analysis of patterns within the entire order. The purpose of this review is to present a summary of what is currently known about overall patterns of variation in karyology and genome size in salamanders. These patterns are discussed within an evolutionary context.
朱富强
2011-01-01
Modern mainstream economics typically evaluates and designs institutional arrangements by the Pareto-efficiency principle, but in practice taking Pareto improvement as the principle of institutional reform runs into serious problems. On the one hand, the condition is too strong, raising problems of practical feasibility: different directions of Pareto improvement and the path dependence they generate produce conflicts among individuals, delaying or halting institutional reform. On the other hand, the condition is too weak, raising problems of practical conservatism: Pareto improvement only requires that no one suffer a loss and does not examine how the gains are distributed, so that under the guidance of an equilibrium theory based on the contest of strength it often becomes a tool for defending vested interests. The main reason modern mainstream economics upholds the Pareto-improvement principle is that it is consistent with the creed of free exchange and can be used to defend methodological individualism and pro-market-competition policy positions. Fundamentally, then, Pareto efficiency presupposes the ideology of atomistic individualism, yet modern mainstream economics tries to present it as a purely technical concept free of ideology, exposing an inconsistency between its words and its deeds.
Islamic medicine and evolutionary medicine: a comparative analysis.
Saniotis, Arthur
2012-01-01
The advent of evolutionary medicine in the last two decades has provided new insights into the causes of human disease and possible preventative strategies. One of the strengths of evolutionary medicine is that it follows a multi-disciplinary approach. Such an approach is vital to future biomedicine as it enables for the infiltration of new ideas. Although evolutionary medicine uses Darwinian evolution as a heuristic for understanding human beings' susceptibility to disease, this is not necessarily in conflict with Islamic medicine. It should be noted that current evolutionary theory was first expounded by various Muslim scientists such as al-Jāḥiẓ, al-Ṭūsī, Ibn Khaldūn and Ibn Maskawayh centuries before Darwin and Wallace. In this way, evolution should not be viewed as being totally antithetical to Islam. This article provides a comparative overview of Islamic medicine and Evolutionary medicine as well as drawing points of comparison between the two approaches which enables their possible future integration.
Industrial Applications of Evolutionary Algorithms
Sanchez, Ernesto; Tonda, Alberto
2012-01-01
This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the
刘洪
2004-01-01
A multiple-objective evolutionary algorithm (MOEA) with a new decision making (DM) scheme for the multi-objective design (MOD) of conceptual missile shapes is presented, contrived to determine suitable tradeoffs from the Pareto optimal set using interactive preference articulation. There are two objective functions: maximizing the lift-to-drag ratio and minimizing the radar cross-section (RCS) value. A 3D computational electromagnetics solver was used to evaluate the RCS (electromagnetic performance), and a 3D Navier-Stokes flow solver was adopted to evaluate aerodynamic performance. A flight mechanics solver was used to analyze the stability of the missile. Based on the MOEA, a synergetic optimization of missile shapes for aerodynamic and radar cross-section performance is completed. The results show that the proposed approach can be applied to more complex optimization cases of flight vehicles.
Yannibelli, Virginia; Amandi, Analía
2013-01-01
In this article, the project scheduling problem is addressed in order to assist project managers at the early stage of scheduling. Thus, as part of the problem, two priority optimization objectives for managers at that stage are considered. One of these objectives is to assign the most effective set of human resources to each project activity. The effectiveness of a human resource is considered to depend on its work context. The other objective is to minimize the project makespan. To solve the problem, a multi-objective evolutionary algorithm is proposed. This algorithm designs feasible schedules for a given project and evaluates the designed schedules in relation to each objective. The algorithm generates an approximation to the Pareto set as a solution to the problem. The computational experiments carried out on nine different instance sets are reported.
Cognition and Culture in Evolutionary Context.
Colmenares, Fernando; Hernández-Lloreda, María Victoria
2017-01-09
In humans and other animals, the individuals' ability to adapt efficiently and effectively to the niches they have actively contributed to construct relies heavily on an evolved psychology which has been shaped by biological, social, and cultural processes over evolutionary time. As expected, although many of the behavioral and cognitive components of this evolved psychology are widely shared across species, many others are species-unique. Although many animal species are known to acquire group-specific traditions (or cultures) via social learning, human culture is unique in terms of its contents and characteristics (observable and unobservable products, cumulative effects, norm conformity, and norm enforcement) and of its cognitive underpinnings (imitation, instructed teaching, and language). Here we provide a brief overview of some of the issues that are currently tackled in the field. We also highlight some of the strengths of a biological, comparative, non-anthropocentric and evolutionarily grounded approach to the study of culture. The main contributions of this approach to the science of culture are its emphasis (a) on the integration of information on mechanisms, function, and evolution, and on mechanistic factors located at different levels of the biological hierarchy, and (b) on the search for general principles that account for commonalities and differences between species, both in the cultural products and in the processes of innovation, dissemination, and accumulation involved that operate during developmental and evolutionary timespans.
Collective influence in evolutionary social dilemmas
Szolnoki, Attila; Perc, Matjaž
2016-03-01
When evolutionary games are contested in structured populations, the degree of each player in the network plays an important role. If they exist, hubs often determine the fate of the population in remarkable ways. Recent research based on optimal percolation in random networks has shown, however, that the degree is neither the sole nor the best predictor of influence in complex networks. Low-degree nodes may also be optimal influencers if they are hierarchically linked to hubs. Taking this into account leads to the formalism of collective influence in complex networks, which as we show here, has far-reaching implications for the favorable resolution of social dilemmas. In particular, there exists an optimal hierarchical depth for the determination of collective influence that we use to describe the potency of players for passing their strategies, which depends on the strength of the social dilemma. Interestingly, the degree, which corresponds to the baseline depth zero, is optimal only when the temptation to defect is small. Our research reveals that evolutionary success stories are related to spreading processes which are rooted in favorable hierarchical structures that extend beyond local neighborhoods.
Li, Miqing; Yang, Shengxiang; Zheng, Jinhua; Liu, Xiaohui
2014-01-01
The Euclidean minimum spanning tree (EMST), widely used in a variety of domains, is a minimum spanning tree of a set of points in space where the edge weight between each pair of points is their Euclidean distance. Since the generation of an EMST is entirely determined by the Euclidean distance between solutions (points), the properties of EMSTs have a close relation with the distribution and position information of solutions. This paper explores the properties of EMSTs and proposes an EMST-based evolutionary algorithm (ETEA) to solve multi-objective optimization problems (MOPs). Unlike most EMO algorithms that focus on the Pareto dominance relation, the proposed algorithm mainly considers distance-based measures to evaluate and compare individuals during the evolutionary search. Specifically, in ETEA, four strategies are introduced: (1) An EMST-based crowding distance (ETCD) is presented to estimate the density of individuals in the population; (2) A distance comparison approach incorporating ETCD is used to assign the fitness value for individuals; (3) A fitness adjustment technique is designed to avoid the partial overcrowding in environmental selection; (4) Three diversity indicators-the minimum edge, degree, and ETCD-with regard to EMSTs are applied to determine the survival of individuals in archive truncation. From a series of extensive experiments on 32 test instances with different characteristics, ETEA is found to be competitive against five state-of-the-art algorithms and its predecessor in providing a good balance among convergence, uniformity, and spread.
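The EMST construction underlying ETEA's distance-based measures can be sketched with Prim's algorithm; this is a generic implementation under illustrative data, not the authors' code.

```python
import math

def euclidean_mst(points):
    """Prim's algorithm for the Euclidean minimum spanning tree.
    Returns a list of edges (i, j, weight). O(n^2), which is fine
    for population sizes typical in evolutionary search."""
    n = len(points)
    in_tree = [False] * n
    best = [math.inf] * n       # cheapest distance to the growing tree
    link = [-1] * n             # tree vertex realizing that distance
    best[0] = 0.0
    edges = []
    for _ in range(n):
        # pick the cheapest vertex not yet in the tree
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        if link[u] >= 0:
            edges.append((link[u], u, best[u]))
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < best[v]:
                    best[v], link[v] = d, u
    return edges

pts = [(0, 0), (1, 0), (0, 1), (5, 5)]
mst = euclidean_mst(pts)
total = sum(w for _, _, w in mst)
```

Because every edge weight is a Euclidean distance between solutions, quantities such as edge lengths and vertex degrees of this tree directly encode the density and position information the abstract refers to.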
Zatarain Salazar, Jazmin; Reed, Patrick M.; Herman, Jonathan D.; Giuliani, Matteo; Castelletti, Andrea
2016-06-01
Globally, the pressures of expanding populations, climate change, and increased energy demands are motivating significant investments in re-operationalizing existing reservoirs or designing operating policies for new ones. These challenges require an understanding of the tradeoffs that emerge across the complex suite of multi-sector demands in river basin systems. This study benchmarks our current capabilities to use Evolutionary Multi-Objective Direct Policy Search (EMODPS), a decision analytic framework in which reservoirs' candidate operating policies are represented using parameterized global approximators (e.g., radial basis functions), which are then optimized using multi-objective evolutionary algorithms to discover Pareto-approximate operating policies. We contribute a comprehensive diagnostic assessment of modern MOEAs' abilities to support EMODPS using the Conowingo reservoir in the Lower Susquehanna River Basin, Pennsylvania, USA. Our diagnostic results highlight that EMODPS can be very challenging for some modern MOEAs and that epsilon dominance, time-continuation, and auto-adaptive search are helpful for attaining high levels of performance. The ɛ-MOEA, the auto-adaptive Borg MOEA, and ɛ-NSGA-II all yielded superior results for the six-objective Lower Susquehanna benchmarking test case. The top algorithms show low sensitivity to different MOEA parameterization choices and high algorithmic reliability in attaining consistent results across different random MOEA trials. Overall, EMODPS is a promising method for discovering key reservoir management tradeoffs; however, algorithmic choice remains a key concern for problems of increasing complexity.
Hui Lu
2014-01-01
Full Text Available Test task scheduling problem (TTSP) is a complex optimization problem with many local optima. In this paper, a hybrid chaotic multiobjective evolutionary algorithm based on decomposition (CMOEA/D) is presented to avoid becoming trapped in local optima and to obtain high quality solutions. First, we propose an improved integrated encoding scheme (IES) to increase efficiency. Then, ten chaotic maps are applied in the multiobjective evolutionary algorithm based on decomposition (MOEA/D) in three phases: the initial population, and the crossover and mutation operators. To identify a good approach for hybridizing MOEA/D with chaos and to indicate the effectiveness of the improved IES, several experiments are performed. The Pareto fronts and the statistical results demonstrate that different chaotic maps in different phases have different effects on solving the TTSP, especially the circle map and the ICMIC map. The degree of similarity between the distribution of a chaotic map and that of the problem is an essential factor for the application of chaotic maps. In addition, experiments comparing CMOEA/D with variable neighborhood MOEA/D (VNM) indicate that our algorithm has the best performance in solving the TTSP.
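Of the chaotic maps mentioned, the circle map is straightforward to iterate; the parameter values below are illustrative assumptions, not those used in the paper.

```python
import math

def circle_map_sequence(x0=0.3, omega=0.5, k=1.0, length=10):
    """Iterate the circle map x_{n+1} = x_n + omega - (k/2pi) sin(2pi x_n)
    mod 1, one of the chaotic maps that can seed a population or drive
    variation operators (illustrative parameters)."""
    xs = [x0]
    for _ in range(length - 1):
        x = xs[-1] + omega - (k / (2 * math.pi)) * math.sin(2 * math.pi * xs[-1])
        xs.append(x % 1.0)
    return xs

seq = circle_map_sequence()
```

Each value lies in [0, 1), so the sequence can be used directly wherever a uniform pseudo-random number would otherwise be drawn.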
Qianwang Deng
2017-01-01
Full Text Available Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the characteristics of the job-shop scheduling problem (JSP). This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives of minimizing the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, each of which changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing its experimental results with those of some existing well-known algorithms.
Deng, Qianwang; Gong, Guiliang; Gong, Xuran; Zhang, Like; Liu, Wei; Ren, Qinghua
2017-01-01
Flexible job-shop scheduling problem (FJSP) is an NP-hard puzzle which inherits the characteristics of the job-shop scheduling problem (JSP). This paper presents a bee evolutionary guiding nondominated sorting genetic algorithm II (BEG-NSGA-II) for multiobjective FJSP (MO-FJSP) with the objectives of minimizing the maximal completion time, the workload of the most loaded machine, and the total workload of all machines. It adopts a two-stage optimization mechanism during the optimizing process. In the first stage, the NSGA-II algorithm with T iteration times is used to obtain the initial population N, in which a bee evolutionary guiding scheme is presented to exploit the solution space extensively. In the second stage, the NSGA-II algorithm with GEN iteration times is used again to obtain the Pareto-optimal solutions. In order to enhance the searching ability and avoid premature convergence, an updating mechanism is employed in this stage. More specifically, its population consists of three parts, each of which changes with the iteration times. Furthermore, numerical simulations are carried out based on some published benchmark instances. Finally, the effectiveness of the proposed BEG-NSGA-II algorithm is shown by comparing its experimental results with those of some existing well-known algorithms.
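The nondominated sorting step at the heart of NSGA-II (and hence BEG-NSGA-II) can be sketched as follows; the FJSP-style objective triples are illustrative, not benchmark data.

```python
def nondominated_sort(objs):
    """Fast nondominated sorting as in NSGA-II (all objectives minimized).
    Returns a list of fronts, each a list of indices into `objs`."""
    n = len(objs)
    dominates = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
    S = [[] for _ in range(n)]   # solutions dominated by i
    cnt = [0] * n                # number of solutions dominating i
    for i in range(n):
        for j in range(n):
            if dominates(objs[i], objs[j]):
                S[i].append(j)
            elif dominates(objs[j], objs[i]):
                cnt[i] += 1
    fronts = [[i for i in range(n) if cnt[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in S[i]:
                cnt[j] -= 1
                if cnt[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]

# illustrative (makespan, max machine workload, total workload) vectors
objs = [(10, 5, 20), (12, 4, 22), (11, 6, 25), (10, 5, 20)]
fronts = nondominated_sort(objs)
```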
A multi-objective evolutionary algorithm for protein structure prediction with immune operators.
Judy, M V; Ravichandran, K S; Murugesan, K
2009-08-01
Genetic algorithms (GA) are often well suited for optimisation problems involving several conflicting objectives. It is more suitable to model the protein structure prediction problem as a multi-objective optimisation problem, since the potential energy functions used in the literature to evaluate the conformation of a protein are based on the calculations of two different interaction energies: local (bonded atoms) and non-local (non-bonded atoms), and experiments using the potential energy function Chemistry at Harvard Macromolecular Mechanics (CHARMM) have shown that these two types of interaction are in conflict. In this paper, we have modified the immune-inspired Pareto archived evolutionary strategy (I-PAES) algorithm, denoted MI-PAES. It can effectively exploit prior knowledge about hydrophobic interactions, one of the most important driving forces in protein folding, to make vaccines. The proposed MI-PAES is comparable with other evolutionary algorithms proposed in the literature, both in terms of the best solution found and the computational time, and often shows much better search ability than the canonical GA.
Evolutionary computation for reinforcement learning
S. Whiteson
2012-01-01
Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces, a
Evolutionary disarmament in interspecific competition.
Kisdi, E; Geritz, S A
2001-12-22
Competitive asymmetry, which is the advantage of having a larger body or stronger weaponry than a contestant, drives spectacular evolutionary arms races in intraspecific competition. Similar asymmetries are well documented in interspecific competition, yet they seldom lead to exaggerated traits. Here we demonstrate that two species with substantially different size may undergo parallel coevolution towards a smaller size under the same ecological conditions where a single species would exhibit an evolutionary arms race. We show that disarmament occurs for a wide range of parameters in an ecologically explicit model of competition for a single shared resource; disarmament also occurs in a simple Lotka-Volterra competition model. A key property of both models is the interplay between evolutionary dynamics and population density. The mechanism does not rely on very specific features of the model. Thus, evolutionary disarmament may be widespread and may help to explain the lack of interspecific arms races.
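A simple Lotka-Volterra competition model of the kind referred to can be integrated numerically; the parameter values below are illustrative, not the paper's, and with competition coefficients a12*a21 < 1 the system converges to the stable coexistence equilibrium.

```python
def lotka_volterra_competition(r=(1.0, 0.8), K=(100.0, 80.0),
                               a12=0.5, a21=0.6,
                               n0=(10.0, 10.0), dt=0.01, steps=5000):
    """Forward-Euler integration of two-species Lotka-Volterra
    competition (illustrative parameters)."""
    n1, n2 = n0
    for _ in range(steps):
        dn1 = r[0] * n1 * (1 - (n1 + a12 * n2) / K[0])
        dn2 = r[1] * n2 * (1 - (n2 + a21 * n1) / K[1])
        n1 += dt * dn1
        n2 += dt * dn2
    return n1, n2

n1, n2 = lotka_volterra_competition()
```

For these parameters the coexistence equilibrium is n1* = (K1 - a12*K2)/(1 - a12*a21) = 600/7 and n2* = (K2 - a21*K1)/(1 - a12*a21) = 200/7, which the trajectory approaches by t = 50.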
Molluscan Evolutionary Genomics
Simison, W. Brian; Boore, Jeffrey L.
2005-12-01
In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.
Evolutionary constraints or opportunities?
Sharov, Alexei A.
2014-01-01
Natural selection is traditionally viewed as a leading factor of evolution, whereas variation is assumed to be random and non-directional. Any order in variation is attributed to epigenetic or developmental constraints that can hinder the action of natural selection. In contrast I consider the positive role of epigenetic mechanisms in evolution because they provide organisms with opportunities for rapid adaptive change. Because the term “constraint” has negative connotations, I use the term “regulated variation” to emphasize the adaptive nature of phenotypic variation, which helps populations and species to survive and evolve in changing environments. The capacity to produce regulated variation is a phenotypic property, which is not described in the genome. Instead, the genome acts as a switchboard, where mostly random mutations switch “on” or “off” preexisting functional capacities of organism components. Thus, there are two channels of heredity: informational (genomic) and structure-functional (phenotypic). Functional capacities of organisms most likely emerged in a chain of modifications and combinations of more simple ancestral functions. The role of DNA has been to keep records of these changes (without describing the result) so that they can be reproduced in the following generations. Evolutionary opportunities include adjustments of individual functions, multitasking, connection between various components of an organism, and interaction between organisms. The adaptive nature of regulated variation can be explained by the differential success of lineages in macro-evolution. Lineages with more advantageous patterns of regulated variation are likely to produce more species and secure more resources (i.e., long-term lineage selection). PMID:24769155
王杜娟; 刘锋; 王建军; 王延章
2016-01-01
For the single machine scheduling problem of minimizing total weighted completion time, when a job's processing time can be compressed by allocating extra resources, the jobs' processing sequence and compression amounts are optimized simultaneously. Two conflicting objectives are considered: schedule performance, measured by the compressed jobs' total weighted completion time, and resource cost, measured by a linear function of the jobs' compression amounts. The problem has been proved to be NP-hard. In order to bridge the gap that this problem has rarely been solved from the perspective of Pareto optimization, we use algorithm hybridization to improve the classic NSGA-II, which tends to converge prematurely during evolution. In the hybridized algorithm, Archived Multi-Objective Simulated Annealing (AMOSA) is integrated to jump out of local optima, an external archive is built to enhance population diversity, and a master/slave parallel structure is designed to improve solving efficiency. Finally, for verification purposes, the hybridized algorithm is first used to solve the benchmark test functions ZDT1-6, and the results demonstrate that the proposed method is applicable and effective for test functions with various structures and shapes. Second, problem features are utilized to design an effective encoding scheme, and correspondingly randomly generated problem instances are solved. The analysis of the proximity and diversity of the obtained Pareto fronts further verifies the effectiveness of the hybridized algorithm for solving single machine scheduling with controllable processing times to minimize total weighted completion time.
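The two objectives described, schedule performance and resource cost, can be evaluated for a candidate solution as follows; the job data, sequence, and unit cost are illustrative assumptions, not instances from the paper.

```python
def schedule_objectives(jobs, order, compress, unit_cost=1.0):
    """Evaluate the two conflicting objectives: total weighted completion
    time after compression, and a linear resource cost of the compression
    amounts. `jobs` maps job -> (processing_time, weight); the names and
    cost model are illustrative."""
    t, twct = 0.0, 0.0
    for j in order:
        p, w = jobs[j]
        t += p - compress[j]          # compressed processing time
        twct += w * t                 # accumulate weighted completion time
    cost = unit_cost * sum(compress.values())
    return twct, cost

jobs = {"A": (4, 2), "B": (2, 1), "C": (5, 3)}
order = ["C", "A", "B"]               # WSPT order: descending weight/time ratio
no_comp = {j: 0 for j in jobs}
twct, cost = schedule_objectives(jobs, order, no_comp)
```

Varying `compress` trades the first objective off against the second, which is exactly the Pareto tradeoff the hybridized algorithm explores.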
Svensson, Mats Krüger
2015-01-01
This thesis investigates the use of evolutionary algorithms (EAs) to evolve and optimize lacing patterns of spokes for a bicycle wheel. There are multiple objectives and tradeoffs to be considered when evaluating a lacing pattern, for instance, strength versus balance. To handle this, an evolutionary multiobjective optimization (EMO) method has been used. Various EMO algorithms and approaches are tested. Among these, the new NSGA-III algorithm is used. Different representations of the lac...
Optimal Control of Evolutionary Dynamics
Chakrabarti, Raj; McLendon, George
2008-01-01
Elucidating the fitness measures optimized during the evolution of complex biological systems is a major challenge in evolutionary theory. We present experimental evidence and an analytical framework demonstrating how biochemical networks exploit optimal control strategies in their evolutionary dynamics. Optimal control theory explains a striking pattern of extremization in the redox potentials of electron transport proteins, assuming only that their fitness measure is a control objective functional with bounded controls.
Schizophrenia-an evolutionary enigma?
Brüne, Martin
2004-03-01
The term 'schizophrenia' refers to a group of disorders that have been described in every human culture. Two apparently well established findings have corroborated the need for an evolutionary explanation of these disorders: (1) cross-culturally stable incidence rates and (2) decreased fecundity of the affected individuals. The rationale behind this relates to the evolutionary paradox that susceptibility genes for schizophrenia are obviously preserved in the human genepool, despite fundamental reproductive disadvantages associated with the disorders. Some researchers have therefore proposed that a compensatory advantage must exist in people who are carriers of these genes or in their first-degree relatives. Such advantages were hypothesised to be outside the brain (e.g. greater resistance against toxins or infectious diseases), or within the social domain (e.g. schizotypal shamans, creativity). More specifically, T.J. Crow has suggested an evolutionary theory of schizophrenia that relates the disorders to an extreme of variation of hemispheric specialisation and the evolution of language due to a single gene mutation located on homologous regions of the sex chromosomes. None of the evolutionary scenarios does, however, fully account for the diversity of the symptomatology, nor does any one hypothesis acknowledge the objection that the mere prevalence of a disorder must not be confused with adaptation. In the present article, I therefore discuss the evolutionary hypotheses of schizophrenia, arguing that a symptom-based approach to psychotic disorders in evolutionary perspective may improve upon the existing models of schizophrenia.
FATIGUE STRENGTH OF HIGH-STRENGTH STEEL,
cold-hardened by deforming to 83%. It was found that it has low static notch sensitivity (lower than that of heat-treated steels), that static strength ...is raised appreciably by increased cold plastic deformation, and that its fatigue strength is raised substantially by mechanical polishing. (Author)
Simons, C. L.; Parmee, I. C.
2007-07-01
Although object-oriented conceptual software design is difficult to learn and perform, computational tool support for the conceptual software designer is limited. In conceptual engineering design, however, computational tools exploiting interactive evolutionary computation (EC) have shown significant utility. This article investigates the cross-disciplinary technology transfer of search-based EC from engineering design to software engineering design in an attempt to provide support for the conceptual software designer. Firstly, genetic operators inspired by genetic algorithms (GAs) and evolutionary programming are evaluated for their effectiveness against a conceptual software design representation using structural cohesion as an objective fitness function. Building on this evaluation, a multi-objective GA inspired by a non-dominated Pareto sorting approach is investigated for an industrial-scale conceptual design problem. Results obtained reveal a mass of interesting and useful conceptual software design solution variants of equivalent optimality—a typical characteristic of successful multi-objective evolutionary search techniques employed in conceptual engineering design. The mass of software design solution variants produced suggests that transferring search-based technology across disciplines has significant potential to provide computationally intelligent tool support for the conceptual software designer.
A Hybrid Evolutionary Algorithm for Discrete Optimization
J. Bhuvana
2015-03-01
Most real-world multi-objective problems demand that we choose one Pareto-optimal solution out of a finite set of choices. The flexible job shop scheduling problem is one such problem, whose solutions must be selected from a discrete solution space. In this study we have designed a hybrid genetic algorithm to solve this scheduling problem. Hybrid genetic algorithms combine both aspects of search: exploration and exploitation of the search space. The proposed algorithm, Hybrid GA with Discrete Local Search, performs global search through the GA and exploits locality through discrete local search. The proposed hybrid algorithm not only generates Pareto-optimal solutions but also identifies them with less computation. Five different benchmark test instances are used to evaluate the performance of the proposed algorithm. The results show that the proposed algorithm produced the known Pareto-optimal solutions through exploration and exploitation of the search space with fewer function evaluations.
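The Pareto-selection step such hybrid algorithms rely on reduces to a non-dominated filter over objective vectors. A minimal generic sketch (not the paper's implementation; all names and the sample data are illustrative):

```python
def non_dominated(points):
    """Return the Pareto-optimal subset of `points`, all objectives minimized."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in at least one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (cost, makespan)-style objective vectors for five candidate schedules
candidates = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = non_dominated(candidates)
```

A GA would apply this filter each generation to an evolving population; the discrete local search would then refine members of `front`.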
Bijani, Rodrigo; Lelièvre, Peter G.; Ponte-Neto, Cosme F.; Farquharson, Colin G.
2017-05-01
This paper is concerned with the applicability of Pareto Multi-Objective Global Optimization (PMOGO) algorithms for solving different types of geophysical inverse problems. The standard deterministic approach is to combine the multiple objective functions (i.e. data misfit, regularization and joint coupling terms) in a weighted-sum aggregate objective function and minimize using local (descent-based) smooth optimization methods. This approach has some disadvantages: (1) appropriate weights must be determined for the aggregate, (2) the objective functions must be differentiable and (3) local minima entrapment may occur. PMOGO algorithms can overcome these drawbacks but introduce increased computational effort. Previous work has demonstrated how PMOGO algorithms can overcome the first issue for single data set geophysical inversion, that is, the trade-off between data misfit and model regularization. However, joint inversion, which can involve many weights in the aggregate, has seen little study. The advantage of PMOGO algorithms for the other two issues has yet to be addressed in the context of geophysical inversion. In this paper, we implement a PMOGO genetic algorithm and apply it to physical-property-, lithology- and surface-geometry-based inverse problems to demonstrate the advantages of using a global optimization strategy. Lithological inversions work on a mesh but use integer model parameters representing rock unit identifiers instead of continuous physical properties. Surface geometry inversions change the geometry of wireframe surfaces that represent the contacts between discrete rock units. Despite the potentially high computational requirements of global optimization algorithms (compared to local), their application to realistically sized 2-D geophysical inverse problems is within reach of the current capacity of standard computers. Furthermore, they open the door to geophysical inverse problems that could not otherwise be considered through traditional
Langousis, Andreas; Mamalakis, Antonios; Puliga, Michelangelo; Deidda, Roberto
2016-04-01
In extreme excess modeling, one fits a generalized Pareto (GP) distribution to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as nonparametric methods that are intended to locate the changing point between extreme and nonextreme regions of the data, graphical methods where one studies the dependence of GP-related metrics on the threshold level u, and Goodness-of-Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u at which a GP distribution model is applicable. Here we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 overcentennial daily rainfall records from the NOAA-NCDC database. We find that nonparametric methods are generally not reliable, while methods that are based on GP asymptotic properties lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low, i.e., on the order of 0.1-0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on preasymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the empirical records, as well as variations in their size, constitute the two most important factors that may significantly affect the accuracy of the obtained results.
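The excess-modeling step itself is simple once a threshold u is chosen. A hedged sketch using method-of-moments GP estimates (real studies typically use maximum likelihood; the data here are synthetic and all names are illustrative):

```python
import random
import statistics

def gpd_fit_excesses(data, u):
    """Method-of-moments GP fit to the excesses of `data` above threshold u.
    Returns (shape xi, scale sigma, exceedance probability)."""
    excesses = [x - u for x in data if x > u]
    m, v = statistics.fmean(excesses), statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)           # shape: xi = 0 recovers the exponential tail
    sigma = 0.5 * m * (1.0 + m * m / v)    # scale
    return xi, sigma, len(excesses) / len(data)

random.seed(0)
rainfall = [random.expovariate(1.0) for _ in range(20000)]  # synthetic xi = 0 tail
xi, sigma, rate = gpd_fit_excesses(rainfall, u=1.0)
```

For exponential data the fitted shape should be near 0 and the scale near the parent mean excess of 1, which is a quick sanity check on the estimator.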
Deidda, R.
2010-07-01
Previous studies indicate the generalized Pareto distribution (GPD) as a suitable distribution function to reliably describe the exceedances of daily rainfall records above a proper optimum threshold, which should be selected as small as possible to retain the largest sample while assuring an acceptable fitting. Such an optimum threshold may differ from site to site, consequently affecting not only the GPD scale parameter, but also the probability of threshold exceedance. Thus a first objective of this paper is to derive some expressions to parameterize a simple threshold-invariant three-parameter distribution function which is able to describe zero and non-zero values of rainfall time series by assuring perfect overlap with the GPD fitted on the exceedances of any threshold larger than the optimum one. Since the proposed distribution does not depend on the local thresholds adopted for fitting the GPD, it will only reflect the on-site climatic signature and thus appears particularly suitable for hydrological applications and regional analyses. A second objective is to develop and test the Multiple Threshold Method (MTM) to infer the parameters of interest on the exceedances of a wide range of thresholds, again using the concept of parameter threshold-invariance. We show the ability of the MTM in fitting historical daily rainfall time series recorded with different resolutions. Finally, we demonstrate the superiority of the MTM fit over the standard single-threshold fit, often adopted for partial duration series, by evaluating and comparing the performances on Monte Carlo samples drawn from GPDs with different shape and scale parameters and different discretizations.
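The threshold-invariance idea behind the MTM can be illustrated with a toy check: for data whose tail is truly GPD with shape xi and scale sigma at threshold 0, the fit above any threshold u gives the same xi and a scale sigma + xi*u, so the reparameterized quantity sigma - xi*u (with xi fixed) stays constant across thresholds. A sketch with method-of-moments estimates on synthetic exponential data (parent invariant value 2.0; all names are assumptions):

```python
import random
import statistics

def gpd_mom(excesses):
    # method-of-moments GPD shape and scale estimates
    m, v = statistics.fmean(excesses), statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    return xi, 0.5 * m * (1.0 + m * m / v)

random.seed(1)
sample = [random.expovariate(0.5) for _ in range(50000)]  # exponential: GPD, xi = 0

# Fit above several thresholds; sigma - xi*u should hover near the
# parent value 2.0 if the parameters are threshold-invariant.
invariants = []
for u in (1.0, 2.0, 3.0):
    xi, sigma = gpd_mom([x - u for x in sample if x > u])
    invariants.append(sigma - xi * u)
```

The MTM exploits this stability by pooling estimates over a whole range of thresholds instead of committing to a single one.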
Anirban Mukhopadhyay
With the advancement of microarray technology, it is now possible to study the expression profiles of thousands of genes across different experimental conditions or tissue samples simultaneously. Microarray cancer datasets, organized in a samples-versus-genes fashion, are being used for classification of tissue samples into benign and malignant or their subtypes. They are also useful for identifying potential gene markers for each cancer subtype, which helps in successful diagnosis of particular cancer types. In this article, we have presented an unsupervised cancer classification technique based on multiobjective genetic clustering of the tissue samples. In this regard, a real-coded encoding of the cluster centers is used and cluster compactness and separation are simultaneously optimized. The resultant set of near-Pareto-optimal solutions contains a number of non-dominated solutions. A novel approach to combine the clustering information possessed by the non-dominated solutions through a Support Vector Machine (SVM) classifier has been proposed. Final clustering is obtained by consensus among the clusterings yielded by different kernel functions. The performance of the proposed multiobjective clustering method has been compared with that of several other microarray clustering algorithms for three publicly available benchmark cancer datasets. Moreover, statistical significance tests have been conducted to establish the statistical superiority of the proposed clustering method. Furthermore, relevant gene markers have been identified using the clustering result produced by the proposed clustering method and demonstrated visually. Biological relationships among the gene markers are also studied based on gene ontology. The results obtained are found to be promising and can possibly have important impact in the area of unsupervised cancer classification as well as gene marker identification for multiple cancer subtypes.
高鸿鹰; 武康平
2008-01-01
This paper estimates the Pareto exponents of the city-size (population and economic) distributions of each province, the three major regions, and the whole of China in 1997, 2000 and 2003 by OLS, compares the Pareto exponents across regions and over time, and empirically analyzes the factors that influence them. The analyses show that city-size distributions in China closely follow the Pareto distribution and exhibit clear structural features. Variation in the Pareto exponent of the city population-size distribution is significantly explained by industrialization, industrial structure and regional transportation infrastructure, while variation in the Pareto exponent of the city economic-size distribution is significantly explained by industrialization and regional transportation infrastructure.
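The OLS estimation of a Pareto exponent from city sizes is a rank-size regression. A naive sketch regressing log(rank) on log(size) (applied work often adds the Gabaix-Ibragimov rank-minus-1/2 correction; the synthetic data and names below are illustrative):

```python
import math

def pareto_exponent_ols(sizes):
    """Naive OLS estimate of the Pareto (Zipf) exponent:
    regress log(rank) on log(size) and return minus the slope."""
    ordered = sorted(sizes, reverse=True)
    xs = [math.log(v) for v in ordered]
    ys = [math.log(rank) for rank in range(1, len(ordered) + 1)]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return -cov / var

# Synthetic city sizes obeying an exact rank-size law with exponent 1
cities = [1000.0 / rank for rank in range(1, 501)]
exponent = pareto_exponent_ols(cities)
```

On exact Zipf data the estimate recovers the exponent of 1; on real city data the deviation of the estimate from 1 is what studies like this one compare across regions and years.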
Strength and Balance Exercises
Atashkari, K. [Department of Mechanical Engineering, Faculty of Engineering, The University of Guilan, P.O. Box 3756, Rasht (Iran, Islamic Republic of); Nariman-Zadeh, N. [Department of Mechanical Engineering, Faculty of Engineering, The University of Guilan, P.O. Box 3756, Rasht (Iran, Islamic Republic of)]. E-mail: nnzadeh@guilan.ac.ir; Goelcue, M. [Department of Mechanical Education, Technical Education faculty, Pamukkale University, 20017 Kinikli, Denizli (Turkey); Khalkhali, A. [Department of Mechanical Engineering, Faculty of Engineering, The University of Guilan, P.O. Box 3756, Rasht (Iran, Islamic Republic of); Jamali, A. [Department of Mechanical Engineering, Faculty of Engineering, The University of Guilan, P.O. Box 3756, Rasht (Iran, Islamic Republic of)
2007-03-15
The main reason for the efficiency decrease at part load conditions for four-stroke spark-ignition (SI) engines is the flow restriction at the cross-sectional area of the intake system. Traditionally, valve-timing has been designed to optimize operation at high engine-speed and wide open throttle conditions. Several investigations have demonstrated that improvements in engine performance at part load conditions can be accomplished if the valve-timing is variable. Controlling valve-timing can be used to improve the torque and power curve as well as to reduce fuel consumption and emissions. In this paper, a group method of data handling (GMDH) type neural network and evolutionary algorithms (EAs) are first used for modelling the effects of intake valve-timing (V_t) and engine speed (N) of a spark-ignition engine on both developed engine torque (T) and fuel consumption (Fc) using some experimentally obtained training and test data. Using the obtained polynomial neural network models, a multi-objective EA (non-dominated sorting genetic algorithm, NSGA-II) with a new diversity-preserving mechanism is then used for Pareto-based optimization of the variable valve-timing engine considering the two conflicting objectives of torque (T) and fuel consumption (Fc). The comparison results demonstrate the superiority of the GMDH-type models over feedforward neural network models in terms of the statistical measures on the training data, the testing data and the number of hidden neurons. Further, it is shown that some interesting and important relationships, serving as useful optimal design principles, involved in the performance of the variable valve-timing four-stroke spark-ignition engine can be discovered by the Pareto-based multi-objective optimization of the polynomial models. Such important optimal principles would not have been obtained without the use of both the GMDH-type neural network modelling and the multi-objective Pareto optimization approach.
Structural symmetry in evolutionary games.
McAvoy, Alex; Hauert, Christoph
2015-10-06
In evolutionary game theory, an important measure of a mutant trait (strategy) is its ability to invade and take over an otherwise-monomorphic population. Typically, one quantifies the success of a mutant strategy via the probability that a randomly occurring mutant will fixate in the population. However, in a structured population, this fixation probability may depend on where the mutant arises. Moreover, the fixation probability is just one quantity by which one can measure the success of a mutant; fixation time, for instance, is another. We define a notion of homogeneity for evolutionary games that captures what it means for two single-mutant states, i.e. two configurations of a single mutant in an otherwise-monomorphic population, to be 'evolutionarily equivalent' in the sense that all measures of evolutionary success are the same for both configurations. Using asymmetric games, we argue that the term 'homogeneous' should apply to the evolutionary process as a whole rather than to just the population structure. For evolutionary matrix games in graph-structured populations, we give precise conditions under which the resulting process is homogeneous. Finally, we show that asymmetric matrix games can be reduced to symmetric games if the population structure possesses a sufficient degree of symmetry.
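The fixation probability discussed here can be estimated by direct simulation. A generic sketch for a well-mixed Moran process (not the structured populations the paper studies; function and parameter names are assumptions):

```python
import random

def fixation_probability(n, r, trials, seed=0):
    """Monte Carlo estimate of the probability that a single mutant of
    relative fitness r fixes in a well-mixed Moran process of size n."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        mutants = 1
        while 0 < mutants < n:
            # reproducer chosen proportionally to fitness, victim uniformly
            w_mut = mutants * r
            birth_mut = rng.random() < w_mut / (w_mut + (n - mutants))
            death_mut = rng.random() < mutants / n
            mutants += birth_mut - death_mut
        fixed += mutants == n
    return fixed / trials

p_neutral = fixation_probability(n=10, r=1.0, trials=2000)  # theory: 1/n = 0.1
p_adv = fixation_probability(n=10, r=2.0, trials=2000)      # theory: about 0.50
```

In a structured population, as the abstract notes, the analogous estimate would additionally depend on which site the mutant is placed at, which is precisely what breaks the homogeneity studied in the paper.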
The major synthetic evolutionary transitions
Solé, Ricard
2016-01-01
Evolution is marked by well-defined events involving profound innovations that are known as ‘major evolutionary transitions'. They involve the integration of autonomous elements into a new, higher-level organization whereby the former isolated units interact in novel ways, losing their original autonomy. All major transitions, which include the origin of life, cells, multicellular systems, societies or language (among other examples), took place millions of years ago. Are these transitions unique, rare events? Have they instead universal traits that make them almost inevitable when the right pieces are in place? Are there general laws of evolutionary innovation? In order to approach this problem under a novel perspective, we argue that a parallel class of evolutionary transitions can be explored involving the use of artificial evolutionary experiments where alternative paths to innovation can be explored. These ‘synthetic’ transitions include, for example, the artificial evolution of multicellular systems or the emergence of language in evolved communicating robots. These alternative scenarios could help us to understand the underlying laws that predate the rise of major innovations and the possibility for general laws of evolved complexity. Several key examples and theoretical approaches are summarized and future challenges are outlined. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431528
Connaughton, Daniel; Connaughton, Angela; Poor, Linda
2001-01-01
Strength training can be fun, safe, and appropriate for young girls and women and is an important component of any fitness program when combined with appropriate cardiovascular and flexibility activities. Concerns and misconceptions regarding girls' strength training are discussed, presenting general principles of strength training for children…
Neuronal boost to evolutionary dynamics.
de Vladar, Harold P; Szathmáry, Eörs
2015-12-06
Standard evolutionary dynamics is limited by the constraints of the genetic system. A central message of evolutionary neurodynamics is that evolutionary dynamics in the brain can happen in a neuronal niche in real time, despite the fact that neurons do not reproduce. We show that Hebbian learning and structural synaptic plasticity broaden the capacity for informational replication and guided variability provided a neuronally plausible mechanism of replication is in place. The synergy between learning and selection is more efficient than the equivalent search by mutation selection. We also consider asymmetric landscapes and show that the learning weights become correlated with the fitness gradient. That is, the neuronal complexes learn the local properties of the fitness landscape, resulting in the generation of variability directed towards the direction of fitness increase, as if mutations in a genetic pool were drawn such that they would increase reproductive success. Evolution might thus be more efficient within evolved brains than among organisms out in the wild.
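The Hebbian ingredient of this argument, correlation-driven weight growth between co-active units, reduces to a one-line update rule. A textbook sketch (not the authors' model; names are illustrative):

```python
def hebbian_update(weights, pre, post, lr=0.1):
    """One Hebbian step: weight w[i][j] grows with the product of
    presynaptic activity pre[j] and postsynaptic activity post[i]."""
    return [[w + lr * pre_j * post_i for w, pre_j in zip(row, pre)]
            for row, post_i in zip(weights, post)]

w0 = [[0.0, 0.0], [0.0, 0.0]]
# only the (post 1, pre 0) pair is co-active, so only w[1][0] strengthens
w1 = hebbian_update(w0, pre=[1.0, 0.0], post=[0.0, 1.0])
```

The paper's point is that such activity-dependent updates, combined with selection among neuronal replicators, bias variation toward the fitness gradient rather than generating it uniformly at random.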
Multivariate Evolutionary Analyses in Astrophysics
Fraix-Burnet, Didier
2011-01-01
The large amount of data on galaxies, up to higher and higher redshifts, calls for sophisticated statistical approaches to build adequate classifications. Multivariate cluster analyses, which compare objects by their global similarities, are still rarely used in astrophysics, probably because their results are somewhat difficult to interpret. We believe that the missing key is an unavoidable characteristic of our Universe: evolution. Our approach, known as Astrocladistics, is based on the evolutionary nature of both galaxies and their properties. It gathers objects according to their "histories" and establishes an evolutionary scenario among groups of objects. In this presentation, I show two recent results on globular clusters and early-type galaxies to illustrate how the evolutionary concepts of Astrocladistics can also be useful for multivariate analyses such as K-means Cluster Analysis.
Evolutionary engineering for industrial microbiology.
Vanee, Niti; Fisher, Adam B; Fong, Stephen S
2012-01-01
Superficially, evolutionary engineering is a paradoxical field that balances competing interests. In natural settings, evolution iteratively selects and enriches subpopulations that are best adapted to a particular ecological niche using random processes such as genetic mutation. In engineering, the desired approach utilizes rational prospective design to address targeted problems. When considering the details of evolutionary and engineering processes, more commonality can be found. Engineering relies on detailed knowledge of the problem parameters and design properties in order to predict design outcomes that would be an optimized solution. When detailed knowledge of a system is lacking, engineers often employ algorithmic search strategies to identify empirical solutions. Evolution epitomizes this iterative optimization by continuously diversifying design options from a parental design, and then selecting the progeny designs that represent satisfactory solutions. In this chapter, the technique of applying the natural principles of evolution to engineer microbes for industrial applications is discussed to highlight the challenges and principles of evolutionary engineering.
Kirlik, G; Zhang, H [University of Maryland School of Medicine, Baltimore, MD (United States)
2015-06-15
Purpose: To present a novel multi-criteria optimization (MCO) solution approach that generates a well-dispersed representation of the Pareto front for radiation treatment planning. Methods: Different algorithms have been proposed and implemented in commercial planning software to generate MCO plans for external-beam radiation therapy. These algorithms consider convex optimization problems. We propose a grid-based algorithm to generate well-dispersed treatment plans over the Pareto front. Our method is able to handle nonconvexity in the problem to deal with dose-volume objectives/constraints and biological objectives, such as equivalent uniform dose (EUD), tumor control probability (TCP), normal tissue complication probability (NTCP), etc. In addition, our algorithm is able to provide a single MCO plan when clinicians are targeting narrow bounds of objectives for patients. In this situation, usually none of the generated plans fall within the bounds and a solution is difficult to identify via manual navigation. We use the subproblem formulation utilized in the grid-based algorithm to obtain a plan within the specified bounds. The subproblem aims to generate a solution that maps into the rectangle defined by the bounds. If such a solution does not exist, it generates the solution closest to the rectangle. We tested our method with 10 locally advanced head and neck cancer cases. Results: Eight objectives were used, including 3 different objectives for the primary target volume, high-risk and low-risk target volumes, and 5 objectives for the organs-at-risk (OARs): two parotids, spinal cord, brain stem and oral cavity. Given tight bounds, uniform dose was achieved for all targets while as much as 26% improvement was achieved in OAR sparing compared to clinical plans without MCO and a previously proposed MCO method. Conclusion: Our method is able to obtain well-dispersed treatment plans to attain a better approximation of convex and nonconvex Pareto fronts. Single treatment plan can
Evolutionary optimization of optical antennas
Feichtner, Thorsten; Kiunke, Markus; Hecht, Bert
2012-01-01
The design of nano-antennas is so far mainly inspired by radio-frequency technology. However, material properties and experimental settings need to be reconsidered at optical frequencies, which entails the need for alternative optimal antenna designs. Here a checkerboard-type, initially random array of gold cubes is subjected to evolutionary optimization. To illustrate the power of the approach we demonstrate that by optimizing the near-field intensity enhancement the evolutionary algorithm finds a new antenna geometry, essentially a split-ring/two-wire antenna hybrid which surpasses by far the performance of a conventional gap antenna by shifting the n=1 split-ring resonance into the optical regime.
Evolutionary Aesthetics and Print Advertising
Kamil Luczaj
2015-06-01
The article analyzes the extent to which predictions based on the theory of evolutionary aesthetics are utilized by the advertising industry. The purpose of a comprehensive content analysis of print advertising is to determine whether the items indicated by evolutionists, such as animals, flowers, certain types of landscapes, beautiful humans, and some colors, are part of real advertising strategies. This article has shown that many evolutionary hypotheses (although not all of them) are supported by empirical data. Along with these hypotheses, some inferences from Bourdieu's cultural capital theory were tested. It turned out that advertising uses both biological schemata and cultural patterns to make an image more likable.
Evolutionary Dynamics of Biological Games
Nowak, Martin A.; Sigmund, Karl
2004-02-01
Darwinian dynamics based on mutation and selection form the core of mathematical models for adaptation and coevolution of biological populations. The evolutionary outcome is often not a fitness-maximizing equilibrium but can include oscillations and chaos. For studying frequency-dependent selection, game-theoretic arguments are more appropriate than optimization algorithms. Replicator and adaptive dynamics describe short- and long-term evolution in phenotype space and have found applications ranging from animal behavior and ecology to speciation, macroevolution, and human language. Evolutionary game theory is an essential component of a mathematical and computational approach to biology.
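Replicator dynamics, the workhorse equation mentioned here, is easy to integrate numerically. A minimal Euler sketch with an assumed 2x2 payoff matrix that has a stable interior equilibrium (all names and payoffs are illustrative):

```python
def replicator_step(x, payoff, dt=0.01):
    """One Euler step of the replicator equation x_i' = x_i * (f_i - f_bar)."""
    f = [sum(a * xj for a, xj in zip(row, x)) for row in payoff]
    f_bar = sum(fi * xi for fi, xi in zip(f, x))
    return [xi + dt * xi * (fi - f_bar) for xi, fi in zip(x, f)]

# Hawk-Dove-like payoffs: each strategy does better when rare,
# so frequencies converge to the mixed equilibrium (0.5, 0.5).
A = [[0.0, 3.0],
     [1.0, 2.0]]
x = [0.2, 0.8]
for _ in range(5000):
    x = replicator_step(x, A)
```

The abstract's point that the outcome need not maximize fitness shows up even here: at the mixed equilibrium both strategies earn less than a population of pure doves would.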
Diversity-Guided Evolutionary Algorithms
Ursem, Rasmus Kjær
2002-01-01
Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results...
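The distance-to-average-point measure can be sketched directly; normalizing by population size and the search-space diagonal follows the common DGEA formulation, though the exact constants and the mode-switch threshold below are assumptions:

```python
import math

def diversity(population, diagonal):
    """Distance-to-average-point diversity, normalized by population
    size and the length of the search-space diagonal."""
    n = len(population)
    dim = len(population[0])
    avg = [sum(ind[d] for ind in population) / n for d in range(dim)]
    return sum(math.dist(ind, avg) for ind in population) / (n * diagonal)

pop = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
d = diversity(pop, diagonal=math.dist((0.0, 0.0), (10.0, 10.0)))

# DGEA-style mode switch: explore (mutate) when diversity is low,
# exploit (recombine and select) when it is high
mode = "explore" if d < 0.05 else "exploit"
```

Alternating between the two modes based on this single scalar is what distinguishes the DGEA from algorithms that merely report diversity.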
Zaninetti, L
2015-01-01
Recently it was shown that the impact crater size-frequency distribution (SFD) of Pluto (based on an analysis of the first images obtained by the recent New Horizons flyby) follows a power law with exponent alpha = 2.4926 in the interval of diameter (D) values ranging from 3.75 km to the largest determined value of 37.77 km. A reanalysis of this data set revealed that the whole crater SFD (i.e., with values in the interval of 1.2-37.7 km) can be described by a truncated Pareto distribution.
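A truncated Pareto distribution has a simple closed-form survival function, which makes such a reanalysis straightforward. A sketch (the diameter interval is taken from the abstract, but the index value below is purely illustrative and its relation to the quoted power-law exponent is an assumption):

```python
def truncated_pareto_sf(d, alpha, d_min, d_max):
    """Survival function P(D > d) of a Pareto law with index alpha
    truncated to the diameter interval [d_min, d_max]."""
    lo, hi = d_min ** -alpha, d_max ** -alpha
    return (d ** -alpha - hi) / (lo - hi)

# Crater-diameter interval from the abstract; alpha chosen for illustration
sf_mid = truncated_pareto_sf(10.0, alpha=1.49, d_min=1.2, d_max=37.7)
```

Unlike an untruncated Pareto law, the survival function here reaches exactly zero at d_max, which is what lets a single model cover the whole 1.2-37.7 km range.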
Badler, N. I.; Lee, P.; Wong, S.
1985-01-01
Strength modeling is a complex and multi-dimensional issue. There are numerous parameters to the problem of characterizing human strength, most notably: (1) position and orientation of body joints; (2) isometric versus dynamic strength; (3) effector force versus joint torque; (4) instantaneous versus steady force; (5) active force versus reactive force; (6) presence or absence of gravity; (7) body somatotype and composition; (8) body (segment) masses; (9) muscle group involvement; (10) muscle size; (11) fatigue; and (12) practice (training) or familiarity. In surveying the available literature on strength measurement and modeling, an attempt was made to examine as many of these parameters as possible. The conclusions reached point toward the feasibility of implementing computationally reasonable human strength models. The assessment of the accuracy of any model against a specific individual, however, will probably not be possible on any realistic scale. Taken statistically, strength modeling may be an effective tool for general questions of task feasibility and strength requirements.
Ledertoug, Mette Marie
Individual paper presentation: The 'Strength Compass'. The results of a PhD research project among schoolchildren (ages 6-16) identifying VIA strengths with respect to age, gender, mother-tongue language and possible child psychiatric diagnosis. Strengths-based interventions in schools have a theoretical foundation in research on VIA strengths by Seligman & Peterson (2004) and in research on strengths by Linley (2008). Based on this research the VIA test was created for adults and later for children and youths from the age of 10. For children younger than 10 years of age, Peterson & Park (2011) have conducted interviews with the parents. For younger children there has been no way to test for strengths. In a Danish PhD project a tool to map the strengths of children aged 6-16 was needed, and with permission from the VIA Institute 'The Strength Compass' was made in cooperation with The Danish...
Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas
2015-04-01
One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) on the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u above which a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General
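The peaks-over-threshold step this abstract describes can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: it draws synthetic exponential "rainfall", takes the empirical 95th percentile as a candidate threshold u (both the data and the threshold choice are assumptions for illustration), and estimates the GPD shape and scale from the excesses by the method of moments.

```python
import random
import statistics

random.seed(42)

# Synthetic daily "rainfall": exponential with mean 10. Excesses of an
# exponential over any threshold are again exponential, so the fitted GPD
# shape should come out near 0 and the scale near the mean excess.
rain = [random.expovariate(1 / 10.0) for _ in range(5000)]

# Candidate threshold u: the empirical 95th percentile.
u = sorted(rain)[int(0.95 * len(rain))]
excesses = [x - u for x in rain if x > u]

# Method-of-moments GPD estimates from the excess mean m and variance v:
#   shape xi = (1 - m^2/v) / 2,   scale sigma = m * (1 + m^2/v) / 2
m = statistics.mean(excesses)
v = statistics.variance(excesses)
xi = (1 - m * m / v) / 2
sigma = m * (1 + m * m / v) / 2

print(f"threshold u = {u:.2f}, shape = {xi:.3f}, scale = {sigma:.2f}")
```

In practice one would repeat the fit over a range of thresholds and inspect the stability of xi and sigma, which is essentially the graphical class of methods the abstract reviews.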
An Evolutionary Psychological Perspective on Cultures of Honor
Todd K. Shackelford
2005-01-01
Full Text Available A key element of cultures of honor is that men in these cultures are prepared to protect, with violence, their reputation for strength and toughness. Such cultures are likely to develop where (1) a man's resources can be thieved in full by other men and (2) the governing body is weak and thus cannot prevent or punish theft. Historically a herding culture operating outside of formal government, the southern United States has a rich culture of honor. In this article, I briefly review research conducted by Nisbett, Cohen, and colleagues on the southern culture of honor. I then present several important but unanswered questions about the development and maintenance of the southern culture of honor. I next argue that current models of the development and maintenance of cultures of honor and violence can be informed by an evolutionary psychological perspective. I conclude with a tentative evolutionary psychological analysis of the development and maintenance of the southern culture of honor.
Bligaard, Thomas; Johannesson, Gisli Holmar; Ruban, Andrei;
2003-01-01
and the cost. In this letter we present a database consisting of the lattice parameters, bulk moduli, and heats of formation for over 64 000 ordered metallic alloys, which has been established by direct first-principles density-functional-theory calculations. Furthermore, we use a concept from economic theory...
Importance of tie strengths in the prisoner's dilemma game on social networks
Xu, Bo, E-mail: xubosuper@163.com [Department of Information Systems, School of Economics and Management, Beihang University (China); Liu, Lu; You, Weijia [Department of Information Systems, School of Economics and Management, Beihang University (China)
2011-06-13
Though numerous studies have shown that tie strengths play a key role in the formation of collective behavior in social networks, little work has been done to explore their impact on the outcome of evolutionary games. In this Letter, we studied the effect of tie strength on the dynamics of evolutionary prisoner's dilemma games by using online social network datasets. The results show that the fraction of cooperators has a non-trivial dependence on tie strength. Weak ties, just as previous research on epidemics and information diffusion has shown, play a key role in the maintenance of cooperation in evolutionary prisoner's dilemma games. -- Highlights: → Tie strength is used to measure heterogeneous influences of different pairs of nodes. → Weak ties play a role in maintaining cooperation in prisoner's dilemma games. → Micro-dynamics of nodes are illustrated to explain the conclusion.
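The dynamics this Letter studies can be sketched with a toy simulation. This is an assumption-laden illustration, not the authors' model or data: a small hand-made weighted graph stands in for the online social network, payoffs use a weak prisoner's dilemma (T = b, R = 1, P = S = 0), and each node imitates a neighbor chosen with probability proportional to tie strength whenever that neighbor scored strictly better.

```python
import random

random.seed(7)

# Toy weighted social network: node -> list of (neighbor, tie_strength).
# Weights are symmetric and purely illustrative.
edges = {
    0: [(1, 0.9), (2, 0.1)],
    1: [(0, 0.9), (2, 0.5), (3, 0.1)],
    2: [(0, 0.1), (1, 0.5), (3, 0.8)],
    3: [(1, 0.1), (2, 0.8)],
}

b = 1.3  # temptation to defect: T = b, R = 1, P = S = 0
strategy = {n: random.choice(["C", "D"]) for n in edges}

def payoff(node):
    """Accumulated payoff of `node` from one game against each neighbor."""
    total = 0.0
    for nbr, _w in edges[node]:
        if strategy[node] == "C" and strategy[nbr] == "C":
            total += 1.0  # mutual cooperation: R
        elif strategy[node] == "D" and strategy[nbr] == "C":
            total += b    # defecting against a cooperator: T
    return total

for _round in range(50):
    scores = {n: payoff(n) for n in edges}
    new_strategy = dict(strategy)
    for n in edges:
        # Pick a role model with probability proportional to tie strength,
        # so strong ties dominate strategy imitation.
        nbrs, weights = zip(*edges[n])
        model = random.choices(nbrs, weights=weights, k=1)[0]
        if scores[model] > scores[n]:
            new_strategy[n] = strategy[model]
    strategy = new_strategy

frac_coop = sum(s == "C" for s in strategy.values()) / len(strategy)
print(f"cooperator fraction after 50 rounds: {frac_coop:.2f}")
```

Replacing the weights in the imitation step with uniform ones is the natural baseline for testing whether tie strength, rather than topology alone, drives the cooperator fraction.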
Evolutionary perspective in child growth.
Hochberg, Ze'ev
2011-07-01
Hereditary, environmental, and stochastic factors determine a child's growth in his unique environment, but their relative contribution to the phenotypic outcome and the extent of stochastic programming that is required to alter human phenotypes is not known because few data are available. This is an attempt to use evolutionary life-history theory in understanding child growth in a broad evolutionary perspective, using the data and theory of evolutionary predictive adaptive growth-related strategies. Transitions from one life-history phase to the next have inherent adaptive plasticity in their timing. Humans evolved to withstand energy crises by decreasing their body size, and evolutionary short-term adaptations to energy crises utilize a plasticity that modifies the timing of transition from infancy into childhood, culminating in short stature in times of energy crisis. Transition to juvenility is part of a strategy of conversion from a period of total dependence on the family and tribe for provision and security to self-supply, and a degree of adaptive plasticity is provided and determines body composition. Transition to adolescence entails plasticity in adapting to energy resources, other environmental cues, and the social needs of the maturing adolescent to determine life-span and the period of fecundity and fertility. Fundamental questions are raised by a life-history approach to the unique growth pattern of each child in his given genetic background and current environment.
Genetical Genomics for Evolutionary Studies
Prins, J.C.P.; Smant, G.; Jansen, R.C.
2012-01-01
Genetical genomics combines acquired high-throughput genomic data with genetic analysis. In this chapter, we discuss the application of genetical genomics for evolutionary studies, where new high-throughput molecular technologies are combined with mapping quantitative trait loci (QTL) on the genome
Current Issues in Evolutionary Paleontology.
Scully, Erik Paul
1987-01-01
Describes some of the contributions made by the field of paleontology to theories in geology and biology. Suggests that the two best examples of modern evolutionary paleontology relate to the theory of punctuated equilibria, and the possibility that mass extinctions may be cyclic. (TW)
Evolutionary Computation and its Application
Licheng Jiao; Lishan Kang; Zhenya He; Tao Xie
2006-01-01
On Mar. 23, 2006, a project in the Major Program of NSFC, "Evolutionary computation and its application", managed by Prof. Licheng Jiao, Prof. Lishan Kang, Prof. Zhenya He, and Prof. Tao Xie, passed its Final Qualification Process and was evaluated as Excellent.
Evolutionary models of human personality
Haysom, H.J.; Verweij, C.J.H.; Zietsch, B.P.
2015-01-01
Behavioral genetic studies have shown that around a third to a half of the between-individual variation in personality traits can be accounted for by genetic differences between individuals. There is rapidly growing interest in understanding the evolutionary basis of this genetic variation. In this
Is evolutionary biology strategic science?
Meagher, Thomas R
2007-01-01
There is a profound need for the scientific community to be better aware of the policy context in which it operates. To address this need, Evolution has established a new Outlook feature section to include papers that explore the interface between society and evolutionary biology. This first paper in the series considers the strategic relevance of evolutionary biology. Support for scientific research in general is based on governmental or institutional expenditure that is an investment, and such investment is based on strategies designed to achieve particular outcomes, such as advance in particular areas of basic science or application. The scientific community can engage in the development of scientific strategies on a variety of levels, including workshops to explicitly develop research priorities and targeted funding initiatives to help define emerging scientific areas. Better understanding and communication of the scientific achievements of evolutionary biology, emphasizing immediate and potential societal relevance, are effective counters to challenges presented by the creationist agenda. Future papers in the Outlook feature section should assist the evolutionary biology community in achieving a better collective understanding of the societal relevance of their field.
Scalable Computing for Evolutionary Genomics
Prins, J.C.P.; Belhachemi, D.; Möller, S.; Smant, G.
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving
Functional morphology and evolutionary biology.
Dullemeijer, P
1980-01-01
In this study the relationship between functional morphology and evolutionary biology is analysed by confronting the main concepts in both disciplines. Rather than only discussing this connection theoretically, the analysis is carried out by introducing important practical and experimental studies, which use aspects from both disciplines. The mentioned investigations are methodologically analysed and the consequences for extensions of the relationship are worked out. It can be shown that both disciplines have a large domain of their own and also share a large common ground. Many disagreements among evolutionary biologists can be reduced to differences in general philosophy (idealism vs. realism), selection of phenomena (structure vs. function), definition of concepts (natural selection) and the position of theory as an explanatory factor (neutralist vs. selectionist, random variation, determinate selection, etc.). The significance of functional morphology for evolutionary biology, and vice versa, depends on these differences. For a neo-Darwinian evolutionary theory, contributions from functional and ecological morphology are indispensable. Of ultimate importance are the notions of internal selection and constraints in the constructions determining further development. In this context the concepts of random variation and natural selection need more detailed definition. The study ends with a recommendation for future research founded in a system-theoretical or structuralistic conception.