WorldWideScience

Sample records for evolutionary computation techniques

  1. Evolutionary computation techniques: a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted to compare its performance with that of other popular EC methods.

  2. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.

  3. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  4. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable speed wind turbine using a bioinspired scheme. The methodology is based on exploiting two proficient swarm intelligence based evolutionary soft computational procedures. The particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller intended for the small damping plant of the DFIG. The wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller design for the DFIG based WECS using PSO and BFO is described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage improve slightly with the evolutionary soft computational procedures. Lastly, the obtained output is compared with a standard technique to assess the performance improvement of the DFIG based wind energy conversion system.
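
    As a rough illustration of one of the two procedures named above, the following is a minimal global-best PSO loop in Python. The two-gain cost function is an invented stand-in, not the paper's DFIG damping objective, and the swarm parameters are generic textbook choices.

```python
import random

def cost(gains):
    # Hypothetical stand-in objective: distance from a made-up
    # "well-damped" gain pair. The paper's actual objective is a
    # damping criterion of the DFIG small-signal model.
    kp, ki = gains
    return (kp - 1.2) ** 2 + (ki - 0.4) ** 2

def pso(dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                       # personal bests
    gbest = min(pbest, key=cost)                      # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pos[i]) < cost(gbest):
                    gbest = pos[i][:]
    return gbest

print(pso())  # converges near the assumed optimum (1.2, 0.4)
```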

  5. Practical Applications of Evolutionary Computation to Financial Engineering: Robust Techniques for Forecasting, Trading and Hedging

    CERN Document Server

    Iba, Hitoshi

    2012-01-01

    “Practical Applications of Evolutionary Computation to Financial Engineering” presents state-of-the-art techniques in Financial Engineering using recent results in Machine Learning and Evolutionary Computation. This book bridges the gap between academics in computer science and traders, and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge of either field. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field, describing each problem in detail and focusing on solutions based on evolutio...

  6. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  7. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Full Text Available Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem which is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may be easily trapped at a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolution strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, immune algorithms, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are also discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptations of fuzzy systems is detailed.
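
    As a concrete companion to this survey, here is a minimal (1+1) evolution strategy with 1/5th-success-rule step-size adaptation, one of the classic algorithm families discussed; the sphere objective is a generic stand-in for a parametric-optimization problem, not an example taken from the paper.

```python
import random

def sphere(x):
    # Toy stand-in for a parametric objective, e.g. the error of a
    # neural or fuzzy model as a function of its parameters.
    return sum(v * v for v in x)

def one_plus_one_es(dim=5, iters=2000, sigma=1.0):
    x = [random.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(iters):
        y = [v + sigma * random.gauss(0, 1) for v in x]  # mutate
        fy = sphere(y)
        if fy < fx:                 # keep the offspring only if better
            x, fx = y, fy
            sigma *= 1.22           # success: widen the search
        else:
            sigma *= 0.95           # failure: narrow it (1.22 * 0.95**4 ~ 1)
    return x, fx

print(one_plus_one_es())  # the returned fx approaches 0
```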

  8. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real-time.

  9. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired and preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and establish its fundamental aspect by taking this perspective. This paper is the first step towards achieving this objective by implementing a strategy equilibrium solution (such as a Nash equilibrium) in an evolutionary computation algorithm.

  10. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world, in both industry and academia, to share state-of-the-art results, to explore new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  11. Computational intelligence: synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting-edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: covers all the aspects...

  12. Evolutionary Computation Techniques for Predicting Atmospheric Corrosion

    Directory of Open Access Journals (Sweden)

    Amine Marref

    2013-01-01

    Full Text Available Corrosion occurs in many engineering structures such as bridges, pipelines, and refineries; it leads to the destruction of materials in a gradual manner and thus shortens their lifespan. It is therefore crucial to assess the structural integrity of engineering structures which are approaching or exceeding their designed lifespan in order to ensure their correct functioning, for example, carrying ability and safety. An understanding of corrosion and an ability to predict the corrosion rate of a material in a particular environment play a vital role in evaluating the residual life of the material. In this paper we investigate the use of genetic programming and genetic algorithms in the derivation of corrosion-rate expressions for steel and zinc. Genetic programming is used to automatically evolve corrosion-rate expressions, while a genetic algorithm is used to evolve the parameters of an already engineered corrosion-rate expression. We show that both evolutionary techniques yield corrosion-rate expressions that have good accuracy.
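
    To make the second approach concrete, the sketch below uses a small real-coded genetic algorithm to fit the parameters of an assumed power-law corrosion model C = A * t**n against invented synthetic data; the paper's engineered expressions and field data are richer than this stand-in.

```python
import random

# Hypothetical observations: metal loss (um) vs. exposure time (years),
# generated here from the assumed power law C = A * t**n.
DATA = [(1, 21.0), (2, 29.7), (4, 42.0), (8, 59.4), (16, 84.0)]

def sse(params):
    a, n = params
    return sum((a * t ** n - c) ** 2 for t, c in DATA)

def ga(pop_size=50, gens=200):
    pop = [[random.uniform(0, 50), random.uniform(0, 1)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = [min(pop, key=sse)]                        # elitism
        while len(nxt) < pop_size:
            p1 = min(random.sample(pop, 3), key=sse)     # tournament selection
            p2 = min(random.sample(pop, 3), key=sse)
            child = [(u + v) / 2 for u, v in zip(p1, p2)]  # blend crossover
            if random.random() < 0.3:                    # Gaussian mutation
                i = random.randrange(2)
                child[i] += random.gauss(0, 0.1 * (abs(child[i]) + 1e-9))
            nxt.append(child)
        pop = nxt
    return min(pop, key=sse)

print(ga())  # roughly recovers A ~ 21, n ~ 0.5 on this synthetic data
```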

  13. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seungwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Giovanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
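
    The retrieval loop can be illustrated in miniature with one of the two named ECM, simulated annealing, minimizing a fitness function between an "observed" and a synthetic spectrum. The one-line Gaussian absorption model below is an invented stand-in for the real radiative-transfer forward models.

```python
import math, random

WAVES = [400.0 + i for i in range(200)]   # wavelength grid (arbitrary units)
SCALES = [0.05, 5.0, 2.0]                 # per-parameter proposal step sizes

def synth(params):
    # Toy forward model: flat continuum with one Gaussian absorption
    # line described by (depth, center, width).
    depth, center, width = params
    return [1.0 - depth * math.exp(-0.5 * ((w - center) / width) ** 2)
            for w in WAVES]

OBSERVED = synth([0.3, 500.0, 12.0])      # pretend measurement

def dissimilarity(params):
    # Fitness: degree of mismatch between observed and synthetic spectra.
    return sum((o - s) ** 2 for o, s in zip(OBSERVED, synth(params)))

def anneal(start, iters=20000, t0=1.0):
    cur, fcur = list(start), dissimilarity(start)
    for k in range(iters):
        temp = t0 * (1.0 - k / iters) + 1e-12          # linear cooling
        cand = [p + random.gauss(0, s) for p, s in zip(cur, SCALES)]
        fcand = dissimilarity(cand)
        # Accept improvements always, worse moves with Boltzmann probability.
        if fcand < fcur or random.random() < math.exp((fcur - fcand) / temp):
            cur, fcur = cand, fcand
    return cur, fcur

print(anneal([0.5, 480.0, 20.0]))  # drifts toward (0.3, 500.0, 12.0)
```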

  14. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. These methods are growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolution strategies. In evolutionary computation, a population is represented in a way that allows an objective function relevant to the problem of interest to be assessed. The poorest performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
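
    The cycle described above (assess, cull, reproduce, mutate, repeat) maps almost line for line onto code. A minimal sketch follows; the egg-shape objective and its target ratio are invented placeholders, not values from the cited case studies.

```python
import random

def fitness(egg):
    # Toy objective echoing the egg-shape case study: prefer an
    # arbitrary, made-up target aspect ratio of 1.4.
    return -abs(egg - 1.4)

def evolve(pop_size=30, generations=100, cull_fraction=0.5):
    pop = [random.uniform(0.5, 3.0) for _ in range(pop_size)]
    for _ in range(generations):                          # stopping condition
        pop.sort(key=fitness, reverse=True)               # assess fitness
        survivors = pop[: int(pop_size * (1 - cull_fraction))]  # remove poorest
        offspring = [s + random.gauss(0, 0.05)            # reproduce + mutate
                     for s in random.choices(survivors,
                                             k=pop_size - len(survivors))]
        pop = survivors + offspring
    return max(pop, key=fitness)

print(evolve())  # approaches the assumed target ratio 1.4
```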

  15. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES 2015), held at Velammal Engineering College (VEC), Chennai, India during 22 – 23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Communication, Computing and Power Technologies.

  16. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    …of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, and parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general topics, like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. Further kinds of evolutionary algorithms have been prudently analyzed; this analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design is included. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most…

  17. Regulatory RNA design through evolutionary computation and strand displacement.

    Science.gov (United States)

    Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso

    2015-01-01

    The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.

  18. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    …failure probability. Multiobjective Evolutionary Computation algorithms (MOEAs) are well-suited for multiobjective task scheduling in a heterogeneous environment. Two multiobjective evolutionary algorithms, the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with…

  19. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández De Vega, F; Di Caro, G A; Drechsler, R.; Ekárt, A; Esparcia-Alcázar, Anna I.; Farooq, M; Langdon, W B; Merelo-Guervós, J.J.; Preuss, M; Richter, O.-M.H.; Silva, Sara; Simões, A; Tarantino, Ernesto; Tettamanzi, Andrea G B; Togelius, J; Urquhart, Neil; Uyar, A S; Yannakakis, G N; Smith, Stephen L; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M; De Falco, Ivanoe; Cioppa, Antonio Della; Diwold, Konrad; Sim, Kevin; Haasdijk, Evert; Zhang, Mengjie; Eiben, A E; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical

  20. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  1. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,

  2. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

    The volume is a collection of high-quality peer-reviewed research papers presented in the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016) held at SRM University, Chennai, Tamilnadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements and explore the future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the field of communication, computing and power technologies.

  3. Soft computing: integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as

  4. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, and differential evolution, with a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of genetic algorithms in random generation from multivariate probability distributions, rather than as function optimizers. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
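
    Of the methods listed, differential evolution is perhaps the quickest to sketch. Below is a standard DE/rand/1/bin loop minimizing a stand-in statistical objective; the objective and all parameter choices are illustrative assumptions, not taken from the paper.

```python
import random

def model_error(params):
    # Stand-in statistical objective, e.g. the sum of squared residuals
    # of a candidate model; here simply a shifted quadratic.
    return sum((p - i) ** 2 for i, p in enumerate(params))

def differential_evolution(dim=4, np_=20, gens=300, f=0.8, cr=0.9):
    pop = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(np_)]
    for _ in range(gens):
        for i in range(np_):
            # Pick three distinct vectors other than the target.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)      # force at least one swap
            trial = [a[d] + f * (b[d] - c[d])
                     if (random.random() < cr or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            # Greedy one-to-one replacement: the standard DE selection.
            if model_error(trial) <= model_error(pop[i]):
                pop[i] = trial
    return min(pop, key=model_error)

print(differential_evolution())  # approaches (0, 1, 2, 3)
```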

  5. Evolutionary computing in Nuclear Engineering Institute/CNEN-Brazil

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.; Lapa, Nelbia da Silva; Mol, Antonio C.

    2000-01-01

    This paper discusses the importance of evolutionary computation (EC) for nuclear engineering and the development of this area at the Instituto de Engenharia Nuclear (IEN) in recent years. The applications carried out at this institute by its EC technical group are briefly described, for example: nuclear reactor core design optimization, preventive maintenance scheduling optimization, and nuclear reactor transient identification. A novel computational tool for the implementation of genetic algorithms, developed at this institute and applied in those works, is also shown. Some results are presented and the gains obtained with evolutionary computation are discussed. (author)

  6. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mills) is the heart of coal-fired power plants. The complex nature of a milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, vertical spindle coal mills (bowl mills), which are widely used in coal-fired power plants, are considered for the model development, and the pulverised fuel flow rate is computed using the model. For the steady state coal mill model development, plant measurements such as air-flow rate, differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from an analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady state coal mill dynamics. This coal mill model has been implemented on-line in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust and will work better in power plants for system monitoring. Therefore, the model can be used for online monitoring, fault detection, and control to improve the efficiency of combustion.

  7. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up...

  8. Fundamentals of computational intelligence: neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  9. Solution of Fractional Order System of Bagley-Torvik Equation Using Evolutionary Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Zahoor Raja

    2011-01-01

    Full Text Available A stochastic technique has been developed for the solution of the fractional order system represented by the Bagley-Torvik equation. The mathematical model of the equation was developed with the help of feed-forward artificial neural networks. The networks were trained with evolutionary computational intelligence based on a genetic algorithm hybridized with a pattern search technique. The designed scheme was successfully applied to different forms of the equation. Results are compared with standard approximate analytic and stochastic numerical solvers, and with exact solutions.
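
    For reference, the Bagley-Torvik equation is conventionally written as below, where D^{3/2} denotes a fractional derivative of order 3/2; the constants and initial conditions vary with the problem form being solved.

```latex
% Bagley--Torvik equation: motion of a rigid plate immersed in a
% Newtonian fluid; D^{3/2} is a fractional derivative of order 3/2.
A\,y''(t) + B\,D^{3/2}y(t) + C\,y(t) = f(t),
\qquad y(0) = y_0, \quad y'(0) = y_1 .
```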

  10. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  11. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  12. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as

  13. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine learning based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks post cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.
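
    What makes an algorithm such as the MOEA above multiobjective is chiefly its selection step. The fragment below shows that core idea, Pareto-dominance filtering, in isolation; it is a generic sketch, not the algorithm coupled to STAC-M.

```python
def dominates(a, b):
    # a dominates b: no worse on every objective and strictly better
    # on at least one (minimization assumed on all objectives).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(population, objectives):
    # The selection core of a multiobjective EA: keep the members that
    # no other member dominates (the current Pareto front).
    scored = [(ind, objectives(ind)) for ind in population]
    return [ind for ind, f in scored
            if not any(dominates(g, f) for _, g in scored)]

# Toy usage with two competing objectives, both minimized.
pts = [(1, 5), (2, 2), (4, 1), (3, 3)]
print(nondominated(pts, lambda p: p))  # (3, 3) is dominated by (2, 2)
```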

  14. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence

    OpenAIRE

    Kiwia, D; Dehghantanha, A; Choo, K-KR; Slaughter, J

    2017-01-01

    Malware such as banking Trojans are popular with financially-motivated cybercriminals. Detection of banking Trojans remains a challenging task, due to the constant evolution of techniques used to obfuscate and circumvent existing detection and security solutions. Having a malware taxonomy can facilitate the design of mitigation strategies such as those based on evolutionary computational intelligence. Specifically, in this paper, we propose a cyber kill chain based taxonomy of banking Trojans...

  15. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  16. Constrained Optimization Based on Hybrid Evolutionary Algorithm and Adaptive Constraint-Handling Technique

    DEFF Research Database (Denmark)

    Wang, Yong; Cai, Zixing; Zhou, Yuren

    2009-01-01

    A novel approach to deal with numerical and engineering constrained optimization problems, which incorporates a hybrid evolutionary algorithm and an adaptive constraint-handling technique, is presented in this paper. The hybrid evolutionary algorithm simultaneously uses simplex crossover and two mutation operators to generate the offspring population. Additionally, the adaptive constraint-handling technique distinguishes three main situations; at each situation, one constraint-handling mechanism is designed based on the current population state. Experiments on 13 benchmark test functions and four well-known constrained design problems verify the effectiveness and efficiency of the proposed method. The experimental results show that integrating the hybrid evolutionary algorithm with the adaptive constraint-handling technique is beneficial, and the proposed method achieves competitive...
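
    The paper's constraint handling is adaptive; as a simpler fixed-rule baseline for comparison, the sketch below shows the classic static-penalty formulation, with an invented toy problem and an arbitrary penalty factor.

```python
def penalized_fitness(x, objective, constraints, rho=1e3):
    # Static-penalty stand-in for adaptive constraint handling: add a
    # cost proportional to the total violation. Convention: g(x) <= 0
    # means the constraint is satisfied.
    violation = sum(max(0.0, g(x)) for g in constraints)
    return objective(x) + rho * violation

# Toy usage: minimize x**2 subject to x >= 1, i.e. 1 - x <= 0.
obj = lambda x: x * x
cons = [lambda x: 1 - x]
print(penalized_fitness(0.5, obj, cons))  # infeasible: heavily penalized
print(penalized_fitness(1.0, obj, cons))  # feasible: plain objective value
```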

  17. An evolutionary computing framework toward object extraction from satellite images

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Image interpretation domains have witnessed the application of many intelligent methodologies over the past decade; however, the effective use of evolutionary computing techniques for feature detection has been less explored. In this paper, we critically analyze the possibility of using cellular neural networks for accurate feature detection. Contextual knowledge has been effectively represented by incorporating spectral and spatial aspects using an adaptive kernel strategy. The developed methodology has been compared with traditional approaches in an object based context, and investigations revealed that considerable success has been achieved with the procedure. Intelligent interpretation, automatic interpolation, and effective contextual representations are the features of the system.

  18. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    Genetic and Evolutionary Computing This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by The Waseda University in Japan, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to the Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the res...

  19. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang is the capital of Jiangxi Province in southeastern China, located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains and on the east by Poyang Lake, it is famous for its scenery, rich history and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in Southern China. The conference is intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  20. Coevolution of Artificial Agents Using Evolutionary Computation in Bargaining Game

    Directory of Open Access Journals (Sweden)

    Sangwook Lee

    2015-01-01

    Full Text Available Analysis of the bargaining game using evolutionary computation is an essential issue in the field of game theory. This paper investigates the interaction and coevolutionary process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. In particular, the game performance with regard to payoff through the interaction and coevolution of agents is studied. We present three kinds of EC based agents (EC-agents) participating in the bargaining game: genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE). The agents’ performance with regard to changing conditions is compared. From the simulation results it is found that the PSO-agent is superior to the other agents.

  1. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  2. Optimizing a reconfigurable material via evolutionary computation

    Science.gov (United States)

    Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.

    2015-08-01

    Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer controlled laboratory experiment in which a 6 × 6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
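
    Structurally, the experiment is an ordinary bit-string genetic algorithm whose fitness value comes from a physical measurement rather than a formula. The sketch below mirrors that structure for a 36-bit field pattern; transmitted_force is an invented placeholder for the impact-sensor reading, and the population size and generation count are chosen only so that the evaluation budget lands near the reported 1500 trials.

```python
import random

N_MAGNETS = 36  # the 6 x 6 electromagnet grid, one bit per magnet

def transmitted_force(pattern):
    # Placeholder for the laboratory measurement: in the experiment this
    # value came from an impact sensor, not a formula. We make up a
    # simple function here so that the sketch runs.
    return sum((-1) ** i * b for i, b in enumerate(pattern)) + 0.1 * sum(pattern)

def genetic_search(pop_size=20, gens=75, p_mut=1.0 / N_MAGNETS):
    pop = [[random.randint(0, 1) for _ in range(N_MAGNETS)]
           for _ in range(pop_size)]
    for _ in range(gens):            # 20 x 75 = 1500 fitness measurements
        pop.sort(key=transmitted_force)
        elite = pop[: pop_size // 2]                 # keep the best half
        children = []
        while len(children) < pop_size - len(elite):
            p1, p2 = random.sample(elite, 2)
            cut = random.randrange(1, N_MAGNETS)     # one-point crossover
            child = p1[:cut] + p2[cut:]
            children.append([1 - b if random.random() < p_mut else b
                             for b in child])        # bit-flip mutation
        pop = elite + children
    return min(pop, key=transmitted_force)

print(genetic_search())
```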

  3. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  4. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the

  5. Techniques and environments for big data analysis: parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data Analysis, as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data Analysis and recent techniques and environments for Big Data Analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting Parallel, Grid, and Cloud computing environments.

  6. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and Meacham. The quartet distance between two unrooted evolutionary trees is the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. In this paper we present an algorithm for computing the quartet distance between two unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n²).
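
    As a definition check (not the paper's O(n log n) algorithm), the quartet distance can be computed by brute force, classifying each of the C(n, 4) quartets via the four-point condition on path lengths:

```python
from collections import deque
from itertools import combinations

def dists_from(tree, src):
    # BFS edge-count distances in an unrooted tree (adjacency dict).
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in tree[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def topology(d, a, b, c, e):
    # Four-point condition: the leaf pairing with the smallest distance
    # sum identifies the quartet split, e.g. ab|ce.
    sums = {
        frozenset({frozenset({a, b}), frozenset({c, e})}): d[a][b] + d[c][e],
        frozenset({frozenset({a, c}), frozenset({b, e})}): d[a][c] + d[b][e],
        frozenset({frozenset({a, e}), frozenset({b, c})}): d[a][e] + d[b][c],
    }
    return min(sums, key=sums.get)

def quartet_distance(t1, t2, leaves):
    d1 = {u: dists_from(t1, u) for u in leaves}
    d2 = {u: dists_from(t2, u) for u in leaves}
    return sum(topology(d1, *q) != topology(d2, *q)
               for q in combinations(leaves, 4))

# Two 5-leaf binary trees that differ by swapping leaves b and c.
t1 = {'a': ['x'], 'b': ['x'], 'c': ['y'], 'd': ['z'], 'e': ['z'],
      'x': ['a', 'b', 'y'], 'y': ['x', 'c', 'z'], 'z': ['y', 'd', 'e']}
t2 = {'a': ['x'], 'c': ['x'], 'b': ['y'], 'd': ['z'], 'e': ['z'],
      'x': ['a', 'c', 'y'], 'y': ['x', 'b', 'z'], 'z': ['y', 'd', 'e']}
print(quartet_distance(t1, t2, 'abcde'))  # -> 2
```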

  7. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above mentioned issues require some intelligent scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to the energy usage and the optimal low-cost system design in high performance “green computing” systems. The recent evolutionary and general metaheuristic-based solutions ...

  8. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. Also, to introduce the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware designs. Reconfigurable em...

  9. A hybrid finite element analysis and evolutionary computation method for the design of lightweight lattice components with optimized strut diameter

    DEFF Research Database (Denmark)

    Salonitis, Konstantinos; Chantzis, Dimitrios; Kappatos, Vasileios

    2017-01-01

    …approaches or with the use of topology optimization methodologies. An optimization approach utilizing multipurpose optimization algorithms has not been proposed yet. This paper presents a novel user-friendly method for the design optimization of lattice components towards weight minimization, which combines finite element analysis and evolutionary computation. The proposed method utilizes the cell homogenization technique in order to reduce the computational cost of the finite element analysis, and a genetic algorithm in order to search for the most lightweight lattice configuration. A bracket consisting…

  10. Open Issues in Evolutionary Robotics.

    Science.gov (United States)

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  11. Machine learning and evolutionary techniques in interplanetary trajectory design

    OpenAIRE

    Izzo, Dario; Sprague, Christopher; Tailor, Dharmesh

    2018-01-01

    After providing a brief historical overview on the synergies between artificial intelligence research, in the areas of evolutionary computations and machine learning, and the optimal design of interplanetary trajectories, we propose and study the use of deep artificial neural networks to represent, on-board, the optimal guidance profile of an interplanetary mission. The results, limited to the chosen test case of an Earth-Mars orbital transfer, extend the findings made previously for landing ...

  12. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks through experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by an evolutionary algorithm, it is necessary to address two important issues: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution: most popular is the MapReduce programming model, a fault-tolerant framework to implement parallel algorithms for inferring large gene networks. This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results have been analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors, and the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high…
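
    The map/reduce split the authors exploit is easy to mimic on one machine: fitness evaluations form the embarrassingly parallel "map" phase and selection is the "reduce". The sketch below uses Python's multiprocessing pool as a local stand-in for Hadoop and a toy objective in place of the gene-network simulation; it is not the authors' hybrid GA-PSO method.

```python
from multiprocessing import Pool
import random

def fitness(individual):
    # Map step: score one candidate parameter vector. In the paper this
    # is the expensive part: simulating a gene network and comparing it
    # with the target expression profiles.
    return sum((x - 0.5) ** 2 for x in individual)

def evolve(pop_size=32, dim=10, gens=20, workers=4):
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    with Pool(workers) as pool:
        for _ in range(gens):
            scores = pool.map(fitness, pop)        # parallel "map" phase
            ranked = sorted(zip(scores, pop))      # "reduce": gather + select
            parents = [ind for _, ind in ranked[: pop_size // 2]]
            pop = parents + [[g + random.gauss(0, 0.05) for g in p]
                             for p in random.choices(parents, k=pop_size // 2)]
    return min(pop, key=fitness)

if __name__ == "__main__":
    print(evolve())  # approaches the all-0.5 vector
```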

  13. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design

    International Nuclear Information System (INIS)

    Menges, Achim

    2012-01-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies. (paper)

  14. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs but also covers an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network based EDAs are reviewed in the book. Hot current researc...

  15. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  16. Assessment of traffic noise levels in urban areas using different soft computing techniques.

    Science.gov (United States)

    Tomić, J; Bogojević, N; Pljakić, M; Šumarac-Pavlović, D

    2016-10-01

    Available traffic noise prediction models are usually based on regression analysis of experimental data, and this paper presents the application of soft computing techniques to traffic noise prediction. Two mathematical models are proposed and their predictions are compared to data collected by traffic noise monitoring in urban areas, as well as to the predictions of commonly used traffic noise models. The results show that the application of evolutionary algorithms and neural networks may improve the development process, as well as the accuracy, of traffic noise prediction.

  17. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  18. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  19. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo in different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 make a contribution to this goal.

  20. Multi-objective optimization of HVAC system with an evolutionary computation algorithm

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Tang, Fan; Xu, Guanglin

    2011-01-01

    A data-mining approach for the optimization of an HVAC (heating, ventilation, and air conditioning) system is presented. A predictive model of the HVAC system is derived by data-mining algorithms, using a dataset collected from an experiment conducted at a research facility. To minimize energy use while maintaining the corresponding IAQ (indoor air quality) within a user-defined range, a multi-objective optimization model is developed. The solutions of this model are set points of the control system, derived with an evolutionary computation algorithm. The controllable input variables - supply air temperature and supply air duct static pressure set points - are generated to reduce the energy use. The results produced by the evolutionary computation algorithm show that the control strategy saves energy by optimizing the operation of an HVAC system. -- Highlights: → A data-mining approach for the optimization of a heating, ventilation, and air conditioning (HVAC) system is presented. → The data used in the project have been collected from an experiment conducted at an energy research facility. → The approach presented in the paper leads to accomplishing significant energy savings without compromising the indoor air quality. → The energy savings are accomplished by computing set points for the supply air temperature and the supply air duct static pressure.
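
    The two-objective screening step described above can be sketched as follows; the energy and IAQ surrogate models below are invented stand-ins for the data-mining models learned in the paper, and only the Pareto-filtering logic is generic.

```python
# Sketch: Pareto screening of candidate HVAC set points (toy surrogate models).
import random

def energy(temp_sp, pressure_sp):            # hypothetical energy model
    return 0.8 * pressure_sp + 0.05 * (26.0 - temp_sp) ** 2

def iaq_penalty(temp_sp, pressure_sp):       # distance outside a comfort band
    iaq = 70.0 + 3.0 * pressure_sp - 0.5 * abs(temp_sp - 22.0)
    return max(0.0, 80.0 - iaq)              # zero when IAQ is acceptable

candidates = [(random.uniform(16, 26), random.uniform(1, 10)) for _ in range(200)]
scored = [(energy(t, p), iaq_penalty(t, p), (t, p)) for t, p in candidates]

# Keep candidates not dominated on (energy, penalty); both are minimized.
pareto = [a for a in scored
          if not any(b[0] <= a[0] and b[1] <= a[1] and (b[0] < a[0] or b[1] < a[1])
                     for b in scored)]
for e, q, sp in sorted(pareto)[:5]:
    print(f"energy={e:.2f}  iaq_penalty={q:.2f}  set points={sp}")
```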

  1. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT), an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often requires the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...

  2. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    Science.gov (United States)

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrate, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro; Genetic and Evolutionary Computing

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Science in Taiwan, Fujian University of Technology in China and VSB-Technical University of Ostrava. ICGEC 2015 was held on 26-28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city engulfed by its rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning Shwedagon Pagoda is the centerpiece of Yangon city, which itself is famous for the best British colonial era architecture. Of particular interest in many shops of Bogyoke Aung San Market,...

  4. Strength Pareto Evolutionary Algorithm using Self-Organizing Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Ionut Balan

    2015-03-01

    Full Text Available Multiobjective optimization is widely used in problem solving across a variety of areas. A set of algorithms has been developed to solve such problems, most of them based on evolutionary techniques. One algorithm in this class, which gives quite good results, is SPEA2, the method on which the algorithm proposed in this paper is based. The results in this paper are obtained by running these two algorithms on a flow-shop problem.
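
    A minimal sketch of the problem setting: the flow-shop makespan recurrence plus a bare-bones permutation EA. The full SPEA2 archive and fitness-assignment machinery used in the paper is omitted, and the instance sizes and processing times below are arbitrary.

```python
# Sketch: flow-shop makespan evaluation and a tiny permutation EA.
import random

TIMES = [[random.randint(1, 9) for _ in range(5)] for _ in range(8)]  # jobs x machines

def makespan(order):
    finish = [0] * len(TIMES[0])                 # completion time per machine
    for j in order:
        for k in range(len(finish)):
            finish[k] = max(finish[k], finish[k - 1] if k else 0) + TIMES[j][k]
    return finish[-1]

def evolve(pop_size=30, gens=200):
    n = len(TIMES)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=makespan)                   # truncation selection
        survivors = pop[: pop_size // 2]
        pop = survivors[:]
        while len(pop) < pop_size:
            child = random.choice(survivors)[:]
            i, j = random.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]   # swap mutation
            pop.append(child)
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```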

  5. Higher-order techniques in computational electromagnetics

    CERN Document Server

    Graglia, Roberto D

    2016-01-01

    Higher-Order Techniques in Computational Electromagnetics explains 'higher-order' techniques that can significantly improve the accuracy and reliability of computational methods for high-frequency electromagnetics, such as antennas, microwave devices and radar scattering applications, while reducing their computational cost.

  6. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    The Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from the connected supply chain and the Big Data produced by huge numbers of IoT devices to industrial control systems. Evolutionary computation, combined with other computational intelligence techniques, will play an important role in cybersecurity, such as ...

  7. Protein 3D structure computed from evolutionary sequence variation.

    Directory of Open Access Journals (Sweden)

    Debora S Marks

    Full Text Available The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7-4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of ...
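
    For intuition, the sketch below scores co-variation between alignment columns using mutual information, a much simpler stand-in for the maximum entropy pair couplings inferred by EVfold; the toy alignment is invented.

```python
# Sketch: residue-pair co-variation scores from a multiple sequence alignment.
from collections import Counter
from itertools import combinations
from math import log

msa = ["MKVLA", "MRVLA", "MKILG", "MRILG", "MKVLA"]   # toy alignment

def column(i):
    return [seq[i] for seq in msa]

def mutual_information(i, j):
    n = len(msa)
    pi, pj = Counter(column(i)), Counter(column(j))
    pij = Counter(zip(column(i), column(j)))
    return sum((c / n) * log((c / n) / ((pi[a] / n) * (pj[b] / n)))
               for (a, b), c in pij.items())

scores = sorted(((mutual_information(i, j), i, j)
                 for i, j in combinations(range(len(msa[0])), 2)), reverse=True)
for s, i, j in scores[:3]:
    print(f"columns {i},{j}: MI={s:.3f}")   # top-scoring pairs suggest contacts
```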

  8. Soft computing techniques in engineering applications

    CERN Document Server

    Zhong, Baojiang

    2014-01-01

    Soft computing techniques, which are based on the information processing of biological systems, are now massively used in the areas of pattern recognition, prediction and planning, as well as acting on the environment. Strictly speaking, soft computing is not a body of homogeneous concepts and techniques; rather, it is an amalgamation of distinct methods that conforms to its guiding principle. At present, the main aim of soft computing is to exploit the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. The principal constituents of soft computing techniques are probabilistic reasoning, fuzzy logic, neuro-computing, genetic algorithms, belief networks, chaotic systems, as well as learning theory. This book covers contributions from various authors demonstrating the use of soft computing techniques in various applications of engineering.

  9. Parallel evolutionary computation in bioinformatics applications.

    Science.gov (United States)

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of Bioinformatics require methods able to handle their inherent complexity (e.g. NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java-based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of the parallelism-related modules allows the user to easily configure the environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
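
    The design idea, EA code written once and evaluators swapped in per architecture, can be sketched in Python as follows. ParJECoLi itself is a Java library using Aspect-Oriented Programming; the hook-based separation below is only an analogy, with invented names throughout.

```python
# Sketch: an EA written against an abstract "evaluate population" hook, so a
# sequential or parallel evaluator can be plugged in without touching EA code.
import random
from multiprocessing import Pool

def fitness(x):
    return sum(v * v for v in x)

def sequential_eval(pop):
    return [fitness(ind) for ind in pop]

def parallel_eval(pop):                     # multicore drop-in replacement
    with Pool() as pool:                    # (a pool per call is wasteful but simple)
        return pool.map(fitness, pop)

def evolve(evaluate, pop_size=20, gens=30, dim=6):
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        fits = evaluate(pop)                # the only parallelism-aware call
        ranked = [p for _, p in sorted(zip(fits, pop), key=lambda t: t[0])]
        pop = [[g + random.gauss(0, 0.1) for g in random.choice(ranked[:5])]
               for _ in range(pop_size)]
    return min(evaluate(pop))

if __name__ == "__main__":
    print(evolve(sequential_eval), evolve(parallel_eval))
```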

  10. Unscented Sampling Techniques For Evolutionary Computation With Applications To Astrodynamic Optimization

    Science.gov (United States)

    2016-09-01

    factors that can cause the variations in trajectory computation time. First of all, these cases are initially computed using the guess-free mode of DIDO ... Goldberg [91]. This concept essentially states that fundamental building blocks, or lower-order schemata, are pieced together by the genetic algorithms ... in Section 3.13.2. While this idea is very straightforward and logical, Goldberg also later points out that there are deceptive problems where these ...

  11. Evolutionary Cell Computing: From Protocells to Self-Organized Computing

    Science.gov (United States)

    Colombano, Silvano; New, Michael H.; Pohorille, Andrew; Scargle, Jeffrey; Stassinopoulos, Dimitris; Pearson, Mark; Warren, James

    2000-01-01

    On the path from inanimate to animate matter, a key step was the self-organization of molecules into protocells - the earliest ancestors of contemporary cells. Studies of the properties of protocells and the mechanisms by which they maintained themselves and reproduced are an important part of astrobiology. These studies also have the potential to greatly impact research in nanotechnology and computer science. Previous studies of protocells have focussed on self-replication. In these systems, Darwinian evolution occurs through a series of small alterations to functional molecules whose identities are stored. Protocells, however, may have been incapable of such storage. We hypothesize that under such conditions, the replication of functions and their interrelationships, rather than the precise identities of the functional molecules, is sufficient for survival and evolution. This process is called non-genomic evolution. Recent breakthroughs in experimental protein chemistry have opened the gates for experimental tests of non-genomic evolution. On the basis of these achievements, we have developed a stochastic model for examining the evolutionary potential of non-genomic systems. In this model, the formation and destruction (hydrolysis) of bonds joining amino acids in proteins occur through catalyzed, albeit possibly inefficient, pathways. Each protein can act as a substrate for polymerization or hydrolysis, or as a catalyst of these chemical reactions. When a protein is hydrolyzed to form two new proteins, or two proteins are joined into a single protein, the catalytic abilities of the product proteins are related to the catalytic abilities of the reactants. We will demonstrate that the catalytic capabilities of such a system can increase. Its evolutionary potential is dependent upon the competition between the formation of bond-forming and bond-cutting catalysts. The degree to which hydrolysis preferentially affects bonds in less efficient, and therefore less well

  12. An encoding technique for multiobjective evolutionary algorithms applied to power distribution system reconfiguration.

    Science.gov (United States)

    Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.
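
    One building block of any edge-based reconfiguration encoding is a radiality check on the candidate edge set; a minimal union-find sketch is shown below. The actual edge window decoder (EWD) decoding rules are more elaborate than this.

```python
# Sketch: checking that a chosen set of closed edges yields a radial (tree)
# network, a feasibility test used when decoding edge-based chromosomes.
def is_radial(n_nodes, closed_edges):
    parent = list(range(n_nodes))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for u, v in closed_edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False                    # closing this edge forms a loop
        parent[ru] = rv
    return len(closed_edges) == n_nodes - 1  # tree spanning all nodes

print(is_radial(4, [(0, 1), (1, 2), (2, 3)]))   # True: radial
print(is_radial(4, [(0, 1), (1, 2), (0, 2)]))   # False: loop, node 3 isolated
```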

  13. Statistical and Computational Techniques in Manufacturing

    CERN Document Server

    2012-01-01

    In recent years, interest in developing statistical and computational techniques for applied manufacturing engineering has increased. Today, due to the great complexity of manufacturing engineering and the high number of parameters used, conventional approaches are no longer sufficient. Therefore, in manufacturing, statistical and computational techniques have found several applications, namely, modelling and simulation of manufacturing processes, optimization of manufacturing parameters, monitoring and control, computer-aided process planning, etc. The present book aims to provide recent information on statistical and computational techniques applied in manufacturing engineering. The content is suitable for final undergraduate engineering courses or as a subject on manufacturing at the postgraduate level. This book serves as a useful reference for academics, statistical and computational science researchers, mechanical, manufacturing and industrial engineers, and professionals in industries related to manu...

  14. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas such as artificial intelligence, computability, heuristics and pattern formation.  Nowadays at the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in areas of artificial intelligence, evolutionary computation and metaheuristics, and all these areas can be traced back to Turing's pioneer work. Topics include Turing test, Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, nature-inspired algorithms such as bat algorithm and cuckoo sear...

  15. Discovering Unique, Low-Energy Transition States Using Evolutionary Molecular Memetic Computing

    DEFF Research Database (Denmark)

    Ellabaan, Mostafa M Hashim; Ong, Y.S.; Handoko, S.D.

    2013-01-01

    In the last few decades, the identification of transition states has experienced significant growth in research interest from various scientific communities. As per transition state theory, reaction paths and landscape analysis, as well as many thermodynamic properties of biochemical systems, can be accurately identified through the transition states. Transition states describe the paths of molecular systems in transiting across stable states. In this article, we present the discovery of unique, low-energy transition states and showcase the efficacy of their identification using the memetic computing paradigm under a Molecular Memetic Computing (MMC) framework. In essence, the MMC is equipped with a tree-based representation of non-cyclic molecules and covalent-bond-driven evolutionary operators, in addition to the typical backbone of memetic algorithms. Herein, we employ genetic algorithm ...

  16. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Full Text Available Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  17. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.
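
    For readers unfamiliar with Cartesian Genetic Programming, the sketch below shows the genotype-to-program evaluation at the core of CGP; the function set, geometry, and genome are toy choices, not the paper's editorial-strategy encoding.

```python
# Sketch: evaluating a minimal Cartesian Genetic Programming genotype.
import operator

FUNCS = [operator.add, operator.sub, operator.mul, min]

def evaluate(genome, inputs, n_nodes):
    # genome: per node (func_index, in1, in2), then one output gene
    values = list(inputs)
    for f, a, b in genome[:n_nodes]:
        values.append(FUNCS[f](values[a], values[b]))
    return values[genome[n_nodes]]

# Two inputs, three computational nodes, output taken from node 4:
# node2 = x+y, node3 = x*y, node4 = node2 - node3
genome = [(0, 0, 1), (2, 0, 1), (1, 2, 3), 4]
print(evaluate(genome, [3.0, 5.0], 3))     # (3+5) - (3*5) = -7.0
```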

  18. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Directory of Open Access Journals (Sweden)

    Maciej J Mrowinski

    Full Text Available With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  19. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    Science.gov (United States)

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  20. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.; Martinez B, M. R.; Gallego, E.

    2009-10-01

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique that is highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometric system, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as: the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction that requires an expert user in system management and the use of a reconstruction code such as BUNKIE, SAND, etc. These codes are based on iterative reconstruction algorithms, whose greatest inconvenience is that an initial spectrum as close as possible to the desired spectrum must be provided to the system. Consequently, researchers have mentioned the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence techniques have been reported, such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted in order to solve these disadvantages. Because they are emerging technologies, there are no tools for the analysis of results, so in this paper we first present the design of a computation tool that allows the analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive, and easy-to-operate graphical user environment. The speed of operation is high, executing the analysis in a few seconds, and it may store and/or print the obtained information for ...

  1. Optimization and Assessment of Wavelet Packet Decompositions with Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Schell Thomas

    2003-01-01

    Full Text Available In image compression, the wavelet transformation is a state-of-the-art component. Recently, wavelet packet decomposition has received quite an interest. A popular approach for wavelet packet decomposition is the near-best-basis algorithm using nonadditive cost functions. In contrast to additive cost functions, the wavelet packet decomposition of the near-best-basis algorithm is only suboptimal. We apply methods from the field of evolutionary computation (EC) to test the quality of the near-best-basis results. We observe a phenomenon: the results of the near-best-basis algorithm are inferior in terms of cost-function optimization but are superior in terms of rate/distortion performance compared to EC methods.
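
    The search space in question, one split/keep decision per node of the decomposition tree, can be sketched with a Haar filter pair and a tiny GA; the additive entropy cost used here is a simplification of the nonadditive cost functions studied in the paper, and all sizes are toy choices.

```python
# Sketch: GA over wavelet packet split decisions using a Haar filter pair.
import random
from math import log, sqrt

def haar_split(x):                    # one analysis step: approx / detail
    lo = [(x[i] + x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    hi = [(x[i] - x[i + 1]) / sqrt(2) for i in range(0, len(x), 2)]
    return lo, hi

def cost(coeffs, eps=1e-12):          # Shannon-like additive entropy cost
    return -sum(c * c * log(c * c + eps) for c in coeffs)

def basis_cost(x, genome, node=0):
    # genome[node] decides whether this node of the decomposition tree splits
    if node >= len(genome) or not genome[node] or len(x) < 2:
        return cost(x)
    lo, hi = haar_split(x)
    return basis_cost(lo, genome, 2 * node + 1) + basis_cost(hi, genome, 2 * node + 2)

signal = [random.gauss(0, 1) for _ in range(16)]
pop = [[random.random() < 0.5 for _ in range(3)] for _ in range(12)]
for _ in range(25):                   # tiny GA over three split decisions
    pop.sort(key=lambda g: basis_cost(signal, g))
    pop = pop[:6] + [[b ^ (random.random() < 0.2) for b in random.choice(pop[:6])]
                     for _ in range(6)]
best = min(pop, key=lambda g: basis_cost(signal, g))
print(best, basis_cost(signal, best))
```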

  2. Epigenetic Tracking, a Method to Generate Arbitrary Shapes By Using Evolutionary-Developmental Techniques

    OpenAIRE

    Fontana, Alessandro

    2008-01-01

    This paper describes an Artificial Embryology method (called "Epigenetic Tracking") to generate predefined, arbitrarily shaped 2-dimensional arrays of cells by means of evolutionary techniques. It is based on a model of development whose key features are: i) the distinction between "normal" and "driver" cells, the latter being able to receive guidance from the genome, ii) the implementation of the proliferation/apoptosis events in such a way that many cells are created/deleted at once, ...

  3. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  4. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    Science.gov (United States)

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient's data set from an intensive care burn unit and a standard data set from a machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case ...
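
    Step (1) trades accuracy against rule count; the sketch below shows how a candidate fuzzy classifier could be scored on those two objectives. The membership functions, rules, and data are toy stand-ins, not the paper's clinical model.

```python
# Sketch: scoring a rule-based fuzzy classifier on two objectives,
# accuracy (maximize) and rule count (minimize).
def tri(x, a, b, c):                   # triangular membership function
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

LOW  = lambda x: tri(x, -0.5, 0.0, 0.5)
HIGH = lambda x: tri(x, 0.5, 1.0, 1.5)

# Each rule: (one membership function per feature, predicted class)
rules = [((LOW, LOW), 0), ((HIGH, HIGH), 1)]

def classify(x):
    fire = [(min(m(v) for m, v in zip(ms, x)), cls) for ms, cls in rules]
    return max(fire)[1]                # class of the strongest-firing rule

data = [((0.1, 0.2), 0), ((0.9, 0.8), 1), ((0.2, 0.1), 0), ((0.7, 0.9), 1)]
accuracy = sum(classify(x) == y for x, y in data) / len(data)
print(f"objectives: accuracy={accuracy:.2f}, rules={len(rules)}")
```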

  5. Fuzzy Controller Design Using Evolutionary Techniques for Twin Rotor MIMO System: A Comparative Study

    Directory of Open Access Journals (Sweden)

    H. A. Hashim

    2015-01-01

    Full Text Available This paper presents a comparative study of fuzzy controller design for the twin rotor multi-input multi-output (MIMO) system (TRMS), considering the most promising evolutionary techniques: the gravitational search algorithm (GSA), particle swarm optimization (PSO), artificial bee colony (ABC), and differential evolution (DE). In this study, the gains of four fuzzy proportional derivative (PD) controllers for the TRMS have been optimized using the considered techniques. The optimization techniques are developed to identify the optimal control parameters for system stability enhancement, to cancel high nonlinearities in the model, to reduce the coupling effect, and to drive the TRMS pitch and yaw angles into the desired tracking trajectory efficiently and accurately. The most effective technique in terms of system response to different disturbances has been investigated. In this work, it is observed that GSA is the most effective technique in terms of solution quality and convergence speed.
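
    A minimal sketch of the tuning setup: PSO searching PD gains against a simulated step response. The single-axis linear plant below is an invented stand-in for the coupled TRMS model, and only one of the four controllers is tuned.

```python
# Sketch: PSO tuning (kp, kd) of a PD controller on a toy second-order plant.
import random

def step_cost(kp, kd, dt=0.01, steps=500):
    # plant: x'' = -0.5*x' + u, tracking a unit step; cost = integral of |error|
    x = v = cost = 0.0
    for _ in range(steps):
        e = 1.0 - x
        u = kp * e - kd * v               # PD law on error and velocity
        v += (-0.5 * v + u) * dt          # explicit Euler integration
        x += v * dt
        cost += abs(e) * dt
    return cost

def pso(n=20, iters=60):
    pts = [[random.uniform(0, 20), random.uniform(0, 5)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pts]
    pcost = [step_cost(*p) for p in pts]
    gbest = min(zip(pcost, pbest))[1][:]
    gcost = min(pcost)
    for _ in range(iters):
        for i, p in enumerate(pts):
            for d in range(2):
                vel[i][d] = (0.72 * vel[i][d]
                             + 1.49 * random.random() * (pbest[i][d] - p[d])
                             + 1.49 * random.random() * (gbest[d] - p[d]))
                p[d] = min(max(p[d] + vel[i][d], 0.0), 50.0)  # keep gains sane
            c = step_cost(*p)
            if c < pcost[i]:
                pbest[i], pcost[i] = p[:], c
                if c < gcost:
                    gbest, gcost = p[:], c
    return gbest, gcost

print(pso())
```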

  6. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    The exchange of information and innovative ideas is necessary to accelerate the development of technology. With the advent of technology, intelligent and soft computing techniques have come into existence, with a wide scope of implementation in the engineering sciences. Keeping this ideology in mind, this book includes insights that reflect the ‘Advances in Computer and Computational Sciences’ from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers of the ‘International Conference on Computer, Communication and Computational Sciences’ (ICCCCS 2016), held during 12-13 August 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover a variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in the internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  7. Safety management in NPPs using an evolutionary algorithm technique

    International Nuclear Information System (INIS)

    Mishra, Alok; Patwardhan, Anand; Verma, A.K.

    2007-01-01

    The general goal of safety management in Nuclear Power Plants (NPPs) is to make requirements and activities more risk effective and less costly. The technical specification and maintenance (TS and M) activities in a plant are associated with controlling risk or with satisfying requirements, and are candidates to be evaluated for their resource effectiveness in risk-informed applications. Accordingly, the risk-based analysis of technical specifications (RBTS) is being considered in evaluating current TS. The multi-objective optimization of the TS and M requirements of an NPP based on risk and cost yields the Pareto-optimal solutions, from which the utility can pick the decision variables that suit its interests. In this paper, a multi-objective evolutionary algorithm technique has been used to make a trade-off between risk and cost, both at the system level and at the plant level, for loss of coolant accident (LOCA) and main steam line break (MSLB) as initiating events.

  8. Effectively Tackling Reinsurance Problems by Using Evolutionary and Swarm Intelligence Algorithms

    Directory of Open Access Journals (Sweden)

    Sancho Salcedo-Sanz

    2014-04-01

    Full Text Available This paper is focused on solving different hard optimization problems that arise in the field of insurance and, more specifically, in reinsurance problems. In this area, the complexity of the models and assumptions considered in the definition of the reinsurance rules and conditions produces hard black-box optimization problems (problems in which the objective function does not have an algebraic expression, but is instead the output of a system, usually a computer program), which must be solved in order to obtain the optimal output of the reinsurance. The application of traditional optimization approaches is not possible in this kind of mathematical problem, so new computational paradigms must be applied. In this paper, we show the performance of two evolutionary and swarm intelligence techniques (evolutionary programming and particle swarm optimization). We provide an analysis of three black-box optimization problems in reinsurance, where the proposed approaches exhibit excellent behavior, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
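
    Evolutionary programming treats the objective purely as a black box, which is why it fits this setting; the sketch below applies it, with self-adaptive mutation steps, to an opaque test function standing in for the reinsurance simulator. All names and constants are illustrative.

```python
# Sketch: evolutionary programming on a black-box objective; no crossover,
# Gaussian mutation with self-adaptive step sizes.
import random
from math import exp

def black_box(x):                      # stands in for the reinsurance program
    return sum((v - 1.0) ** 2 for v in x) + 0.5 * abs(x[0] * x[1])

DIM, POP, GENS = 4, 30, 200
pop = [([random.uniform(-3, 3) for _ in range(DIM)],
        [0.5] * DIM) for _ in range(POP)]
for _ in range(GENS):
    offspring = []
    for x, s in pop:
        s2 = [si * exp(0.2 * random.gauss(0, 1)) for si in s]   # adapt sigmas
        x2 = [xi + si * random.gauss(0, 1) for xi, si in zip(x, s2)]
        offspring.append((x2, s2))
    # (mu + mu) selection on the black-box value alone
    pop = sorted(pop + offspring, key=lambda p: black_box(p[0]))[:POP]
print(black_box(pop[0][0]), pop[0][0])
```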

  9. EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.

    Science.gov (United States)

    Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D

    2012-01-01

    Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g. sequence conservation, orthology, synteny, ...) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.

  10. Evolutionary Sound Synthesis Controlled by Gestural Data

    Directory of Open Access Journals (Sweden)

    Jose Fornari

    2011-05-01

    Full Text Available This article focuses on interdisciplinary research involving computer music and generative visual art. We describe the implementation of two interactive artistic systems based on principles of gestural data retrieval (WILSON, 2002) and self-organization (MORONI, 2003), used to control an Evolutionary Sound Synthesis method (ESSynth). The first implementation uses, as gestural data, the image mapping of handmade drawings. The second one uses gestural data from the dynamic body movements of dance. The resulting computer output is generated by an interactive system implemented in Pure Data (PD). This system uses principles of Evolutionary Computation (EC), which yield the generation of a synthetic adaptive population of sound objects. Considering that music can be seen as “organized sound”, the contribution of our study is to develop a system that aims to generate “self-organized sound”: a method that uses evolutionary computation to bridge between gesture, sound and music.

  11. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic

  12. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has arisen for tools that can aid in their design. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGA-II), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been ...

  13. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on Principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions, the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.

  14. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2013-01-01

    The volume contains the papers presented at FICTA 2012: International Conference on Frontiers in Intelligent Computing: Theory and Applications, held on December 22-23, 2012 at Bhubaneswar Engineering College, Bhubaneswar, Odisha, India. It contains 86 papers contributed by authors from around the globe. These research papers mainly focus on the application of intelligent techniques, including evolutionary computation techniques such as genetic algorithms, particle swarm optimization, and teaching-learning based optimization, to various engineering applications such as data mining, image processing, cloud computing, and networking.

  15. Harmonic elimination in diode-clamped multilevel inverter using evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Barkati, Said [Laboratoire d'analyse des Signaux et Systemes (LASS), Universite de M'sila, BP. 166, rue Ichbilia 28000 M'sila (Algeria); Baghli, Lotfi [Groupe de Recherche en Electrotechnique et Electronique de Nancy (GREEN), CNRS UMR 7030, Universite Henri Poincare Nancy 1, BP. 239, 54506 Vandoeuvre-les-Nancy (France); Berkouk, El Madjid; Boucherit, Mohamed-Seghir [Laboratoire de Commande des Processus (LCP), Ecole Nationale Polytechnique, BP. 182, 10 Avenue Hassen Badi, 16200 El Harrach, Alger (Algeria)

    2008-10-15

    This paper describes two evolutionary algorithms for the optimized harmonic stepped-waveform technique. Genetic algorithms and particle swarm optimization are applied to compute the switching angles in a three-phase seven-level inverter that produce the required fundamental voltage while, at the same time, eliminating specified harmonics. Furthermore, these algorithms are also used to solve the starting-point problem of the conventional Newton-Raphson method. This combination provides a very effective method for the harmonic elimination technique. The strategy is useful for different structures of seven-level inverters; the diode-clamped topology is considered in this study. (author)
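
    In the stepped-waveform technique, the switching angles are roots of the selective-harmonic-elimination equations; the sketch below writes those residuals for three angles (one quarter-wave of a seven-level waveform) and minimizes them with a small evolution strategy rather than the paper's GA/PSO. The target fundamental is an arbitrary illustrative value.

```python
# Sketch: selective harmonic elimination for a seven-level stepped waveform.
# With quarter-wave symmetry and three angles t1 < t2 < t3 in (0, pi/2), the
# per-unit n-th harmonic is h_n = (4 / (n*pi)) * sum(cos(n*t_i)).
import random
from math import cos, pi

M_TARGET = 2.4                        # desired fundamental (illustrative)

def h(n, a):
    return 4.0 / (n * pi) * sum(cos(n * t) for t in a)

def residual(a):
    if not (0 < a[0] < a[1] < a[2] < pi / 2):
        return 1e9                    # penalize infeasible angle ordering
    return (h(1, a) - M_TARGET) ** 2 + h(5, a) ** 2 + h(7, a) ** 2

pop = [sorted(random.uniform(0.01, pi / 2 - 0.01) for _ in range(3))
       for _ in range(40)]
for _ in range(300):                  # (mu + lambda) evolution strategy
    pop.sort(key=residual)
    pop = pop[:20] + [sorted(t + random.gauss(0, 0.02) for t in random.choice(pop[:20]))
                      for _ in range(20)]
best = min(pop, key=residual)
print(best, residual(best), h(5, best), h(7, best))
```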

  16. Intention recognition, commitment and their roles in the evolution of cooperation from artificial intelligence techniques to evolutionary game theory models

    CERN Document Server

    Han, The Anh

    2013-01-01

    This original and timely monograph describes a unique self-contained excursion that reveals to the readers the roles of two basic cognitive abilities, i.e. intention recognition and arranging commitments, in the evolution of cooperative behavior. This book analyses intention recognition, an important ability that helps agents predict others’ behavior, in its artificial intelligence and evolutionary computational modeling aspects, and proposes a novel intention recognition method. Furthermore, the book presents a new framework for intention-based decision making and illustrates several ways in which an ability to recognize intentions of others can enhance a decision making process. By employing the new intention recognition method and the tools of evolutionary game theory, this book introduces computational models demonstrating that intention recognition promotes the emergence of cooperation within populations of self-regarding agents. Finally, the book describes how commitment provides a pathway to the evol...

  17. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    International Nuclear Information System (INIS)

    Vianna Neto, Julio Xavier; Andrade Bernert, Diego Luis de; Santos Coelho, Leandro dos

    2011-01-01

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.
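
    The quantum-inspired ingredient is the Q-bit genome and its rotation-gate update; the sketch below shows that machinery on a OneMax toy objective in place of the economic dispatch problem, with illustrative parameter values throughout.

```python
# Sketch: Q-bit genes observed into bitstrings, then rotated toward the best
# observed solution, the core loop of a quantum-inspired EA.
import random
from math import cos, sin, pi, sqrt

N_BITS, POP, GENS = 16, 10, 60
DELTA = 0.05 * pi                      # rotation-gate step

def observe(qbits):
    return [1 if random.random() < b * b else 0 for _, b in qbits]

def fitness(bits):                     # OneMax: count the ones
    return sum(bits)

qpop = [[(1 / sqrt(2), 1 / sqrt(2)) for _ in range(N_BITS)] for _ in range(POP)]
best_bits, best_fit = None, -1
for _ in range(GENS):
    for q in qpop:
        bits = observe(q)
        f = fitness(bits)
        if f > best_fit:
            best_bits, best_fit = bits, f
        for i, (a, b) in enumerate(q):   # rotate amplitudes toward best_bits
            direction = DELTA if best_bits[i] == 1 else -DELTA
            q[i] = (a * cos(direction) - b * sin(direction),
                    a * sin(direction) + b * cos(direction))
print(best_fit, best_bits)
```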

  18. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier, E-mail: julio.neto@onda.com.b [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Andrade Bernert, Diego Luis de, E-mail: dbernert@gmail.co [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Santos Coelho, Leandro dos, E-mail: leandro.coelho@pucpr.b [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.

  19. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Julio Xavier Vianna [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find an almost global optimal solution in EDPs. Research on merging evolutionary computation and quantum computation has been under way since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature. (author)

  20. Computational techniques of the simplex method

    CERN Document Server

    Maros, István

    2003-01-01

    Computational Techniques of the Simplex Method is a systematic treatment focused on the computational issues of the simplex method. It provides comprehensive coverage of the most important and successful algorithmic and implementation techniques of the simplex method. It is a unique source of essential, rarely discussed details of algorithmic elements and their implementation. On the basis of the book, the reader will be able to create a highly advanced implementation of the simplex method which, in turn, can be used directly or as a building block in other solution algorithms.

  1. A representation-theoretic approach to the calculation of evolutionary distance in bacteria

    Science.gov (United States)

    Sumner, Jeremy G.; Jarvis, Peter D.; Francis, Andrew R.

    2017-08-01

    In the context of bacteria and models of their evolution under genome rearrangement, we explore a novel application of group representation theory to the inference of evolutionary history. Our contribution is to show, in a very general maximum likelihood setting, how to use elementary matrix algebra to sidestep intractable combinatorial computations and convert the problem into one of eigenvalue estimation amenable to standard numerical approximation techniques.

  2. International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Udgata, Siba; Biswal, Bhabendra

    2014-01-01

    This volume contains the papers presented at the Second International Conference on Frontiers in Intelligent Computing: Theory and Applications (FICTA-2013), held during 14-16 November 2013 and organized by Bhubaneswar Engineering College (BEC), Bhubaneswar, Odisha, India. It contains 63 papers focusing on the application of intelligent techniques, including evolutionary computation methods such as genetic algorithms, particle swarm optimization and teaching-learning-based optimization, to various engineering applications such as data mining, fuzzy systems, machine intelligence and ANN, web technologies, multimedia applications, and intelligent computing and networking.

  3. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven, making it easier to use for both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  4. Computed Radiography: An Innovative Inspection Technique

    International Nuclear Information System (INIS)

    Klein, William A.; Councill, Donald L.

    2002-01-01

    Florida Power and Light Company's (FPL) Nuclear Division combined two diverse technologies to create an innovative inspection technique, Computed Radiography, that improves personnel safety and unit reliability while reducing inspection costs. This technique was pioneered in the medical field and applied in the Nuclear Division initially to detect piping degradation due to flow-accelerated corrosion. Component degradation can be detected by this additional technique. This approach permits FPL to reduce inspection costs, perform on-line examinations (no generation curtailment), and maintain or improve both personnel safety and unit reliability. Computed Radiography is a very versatile tool capable of other uses: - improving the external corrosion program by permitting inspections underneath insulation, and - diagnosing system and component problems, such as valve positions, without the need to shut down or disassemble the component. (authors)

  5. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D [ORNL]; Disney, Adam [University of Tennessee (UT)]; Singh, Susheela [North Carolina State University (NCSU), Raleigh]; Bruer, Grant [University of Tennessee (UT)]; Mitchell, John Parker [University of Tennessee (UT)]; Klibisz, Aleksander [University of Tennessee (UT)]; Plank, James [University of Tennessee (UT)]

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pis. We also focus on how to improve the EO by evaluating commonality in higher-performing neural networks, and present the results of a study that evaluates the EO performed by Titan.
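
    The parallel structuring of EO described in this record follows a generic master-worker pattern; the Python sketch below shows that pattern with a stand-in fitness function, since the DANNA simulator itself is not reproduced here. Population size, mutation scale and the truncation-selection scheme are all illustrative assumptions.

      import random
      from multiprocessing import Pool

      def fitness(genome):
          # Stand-in for simulating a spiking network encoded by `genome`.
          return -sum((g - 0.5) ** 2 for g in genome)

      def evolve(pop_size=64, genome_len=32, generations=20, workers=4):
          pop = [[random.random() for _ in range(genome_len)] for _ in range(pop_size)]
          with Pool(workers) as pool:
              for _ in range(generations):
                  scores = pool.map(fitness, pop)       # evaluate in parallel
                  ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
                  parents = ranked[: pop_size // 2]     # truncation selection
                  pop = parents + [
                      [x + random.gauss(0, 0.05) for x in random.choice(parents)]
                      for _ in range(pop_size - len(parents))
                  ]
          return max(pop, key=fitness)

      if __name__ == "__main__":
          best = evolve()

    On a cluster, the process pool would be replaced by distributed workers, but the evaluate-select-mutate loop keeps the same shape.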

  6. AN EVOLUTIONARY ALGORITHM FOR FAST INTENSITY BASED IMAGE MATCHING BETWEEN OPTICAL AND SAR SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    P. Fischer

    2018-04-01

    Full Text Available This paper presents a hybrid evolutionary algorithm for fast intensity based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and images of different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery of optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Based upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical imagery intensity based matching. Extensions are drawn using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.
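
    To make the formulation above concrete, here is a minimal sketch of intensity-based matching cast as an evolutionary search over a 2-D shift, with normalized cross-correlation as the similarity measure. The paper's operators, hybrid local search and SAR/optical similarity handling are more elaborate; everything below (margins, population size, annealing factor) is an assumption.

      import numpy as np

      def ncc(a, b):
          """Normalized cross-correlation between two equally sized patches."""
          a = (a - a.mean()) / (a.std() + 1e-9)
          b = (b - b.mean()) / (b.std() + 1e-9)
          return float((a * b).mean())

      def match(reference, target, pop_size=30, generations=40, sigma=4.0):
          """Evolve a (dy, dx) shift of `target` inside `reference`.
          Assumes `reference` extends at least 90 px beyond `target`."""
          h, w = target.shape
          rng = np.random.default_rng(0)
          pop = rng.integers(-40, 40, size=(pop_size, 2)).astype(float)
          def score(s):
              dy, dx = int(s[0]) + 50, int(s[1]) + 50
              patch = reference[dy : dy + h, dx : dx + w]
              return ncc(patch, target) if patch.shape == target.shape else -1.0
          for _ in range(generations):
              fit = np.array([score(s) for s in pop])
              parents = pop[np.argsort(fit)[-(pop_size // 2):]]
              children = parents + rng.normal(0.0, sigma, parents.shape)
              pop = np.clip(np.vstack([parents, children]), -40, 40)
              sigma *= 0.95                 # shrink the mutation step over time
          return pop[np.argmax([score(s) for s in pop])]

      ref = np.random.default_rng(1).random((220, 220))
      tgt = ref[60:160, 70:170]             # true shift: dy = 10, dx = 20
      print(match(ref, tgt))                # should approach (10, 20)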

  7. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain premature convergence of the quantum evolutionary algorithm. The proposed algorithm adopts the chaotic initialization method to generate the initial population which will form a pe...... tests. The presented algorithm is applied to urban traffic signal timing optimization and the effect is satisfactory....
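
    The abstract mentions chaotic initialization of the population; a common realization (assumed here, since the record is truncated) uses the logistic map to generate well-spread initial individuals in (0, 1):

      # Illustrative chaotic initialization with the logistic map; the paper's
      # exact scheme and parameters are not given in the truncated record.
      def chaotic_population(pop_size, dim, x0=0.37, r=4.0):
          pop, x = [], x0
          for _ in range(pop_size):
              ind = []
              for _ in range(dim):
                  x = r * x * (1.0 - x)    # logistic map, chaotic for r = 4
                  ind.append(x)            # values fall in (0, 1)
              pop.append(ind)
          return pop

      print(chaotic_population(3, 4)[0])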

  8. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Introduction: A Small Sample of Ideas; The Philosophy of Natural Computing; The Three Branches: A Brief Overview; When to Use Natural Computing Approaches; Conceptualization; General Concepts. PART I - COMPUTING INSPIRED BY NATURE: Evolutionary Computing; Problem Solving as a Search Task; Hill Climbing and Simulated Annealing; Evolutionary Biology; Evolutionary Computing; The Other Main Evolutionary Algorithms; From Evolutionary Biology to Computing; Scope of Evolutionary Computing; Neurocomputing; The Nervous System; Artif

  9. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents a package called GRAVTool, developed in MATLAB, to compute local geoid models by the RCR technique, and its application in a study area. The studied area comprises the Federal District of Brazil, covering ~6000 km² of wavy relief with heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the studied area show the local geoid model computed by the GRAVTool package, using 1377 terrestrial gravity data points, SRTM data with 3 arc second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those achieved by the Brazilian official regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
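
    Schematically, the Remove-Compute-Restore technique removes the modelled long- and short-wavelength signals from the observations, computes a residual geoid from the residual anomalies, and then restores the removed parts. The two bookend steps reduce to simple sums, sketched below; GRAVTool's actual processing chain (Stokes integration, terrain reductions, datum adjustment) is far more involved.

      # Schematic bookend steps of Remove-Compute-Restore; all sequences are
      # assumed co-located on a common grid (anomalies in mGal, heights in m).

      def remove(dg_obs, dg_ggm, dg_topo):
          """Remove step: residual anomalies = observed - global model - terrain."""
          return [o - g - t for o, g, t in zip(dg_obs, dg_ggm, dg_topo)]

      def restore(n_ggm, n_res, n_ind):
          """Restore step: geoid height N = N_GGM + N_res + N_ind."""
          return [g + r + i for g, r, i in zip(n_ggm, n_res, n_ind)]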

  10. Artificial intelligence techniques in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Laughton, M.A.

    1997-12-31

    Since the early to mid 1980s much of the effort in power systems analysis has turned away from the methodology of formal mathematical modelling which came from the fields of operations research, control theory and numerical analysis to the less rigorous techniques of artificial intelligence (AI). Today the main AI techniques found in power systems applications are those utilising the logic and knowledge representations of expert systems, fuzzy systems, artificial neural networks (ANN) and, more recently, evolutionary computing. These techniques will be outlined in this chapter and the power system applications indicated. (Author)

  11. Mono and multi-objective optimization techniques applied to a large range of industrial test cases using Metamodel assisted Evolutionary Algorithms

    Science.gov (United States)

    Fourment, Lionel; Ducloux, Richard; Marie, Stéphane; Ejday, Mohsen; Monnereau, Dominique; Massé, Thomas; Montmitonnet, Pierre

    2010-06-01

    The use of material processing numerical simulation allows a strategy of trial and error to improve virtual processes without incurring material costs or interrupting production, and can therefore save a lot of money, but it requires user time to analyze the results, adjust the operating conditions and restart the simulation. Automatic optimization is the perfect complement to simulation. An Evolutionary Algorithm coupled with metamodelling makes it possible to obtain industrially relevant results on a very large range of applications within a few tens of simulations and without any specific knowledge of automatic optimization techniques. Ten industrial partners have been selected to cover the different areas of the mechanical forging industry and provide different examples of forming simulation tools. The large computational time is handled by a metamodel approach, which interpolates the objective function over the entire parameter space from the exact function values at a reduced number of "master points". Two algorithms are used: an evolution strategy combined with a Kriging metamodel, and a genetic algorithm combined with a Meshless Finite Difference Method. The latter approach is extended to multi-objective optimization, where the set of solutions corresponding to the best possible compromises between the different objectives is computed in the same way. The population-based approach exploits the parallel capabilities of the computer with high efficiency. An optimization module, fully embedded within the Forge2009 IHM, makes it possible to cover all the defined examples, and the use of new multi-core hardware to compute several simulations at the same time reduces the needed time dramatically. The presented examples
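
    The core loop of metamodel-assisted optimization described above can be sketched as: evaluate the expensive objective at a few master points, fit an interpolating surrogate, and search on the surrogate. A Gaussian radial-basis-function interpolant stands in below for the paper's Kriging metamodel, and the stand-in objective replaces a forming simulation; all parameters are illustrative.

      import numpy as np

      def fit_rbf(X, y, eps=1.0):
          """Fit Gaussian radial-basis-function weights on master points X."""
          d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
          w = np.linalg.solve(np.exp(-(eps * d) ** 2), y)
          return lambda q: np.exp(-(eps * np.linalg.norm(X - q, axis=1)) ** 2) @ w

      def expensive_objective(x):         # stand-in for a forming simulation
          return float(np.sum((x - 0.3) ** 2))

      rng = np.random.default_rng(1)
      X = rng.uniform(0, 1, size=(20, 2))            # a few tens of "simulations"
      y = np.array([expensive_objective(x) for x in X])
      surrogate = fit_rbf(X, y)
      candidates = rng.uniform(0, 1, size=(5000, 2))  # cheap search on the surrogate
      best = candidates[np.argmin([surrogate(c) for c in candidates])]
      print(best)                                     # should approach (0.3, 0.3)

    In the full method the best surrogate candidates are re-evaluated with the true simulation and added to the master points before refitting.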

  12. Conversion Rate Optimization through Evolutionary Computation

    OpenAIRE

    Miikkulainen, Risto; Iscoe, Neil; Shagrin, Aaron; Cordell, Ron; Nazari, Sam; Schoolland, Cory; Brundage, Myles; Epstein, Jonathan; Dean, Randy; Lamba, Gurmeet

    2017-01-01

    Conversion optimization means designing a web interface so that as many users as possible take a desired action on it, such as register or purchase. Such design is usually done by hand, testing one change at a time through A/B testing, or a limited number of combinations through multivariate testing, making it possible to evaluate only a small fraction of designs in a vast design space. This paper describes Sentient Ascend, an automatic conversion optimization system that uses evolutionary op...

  13. Estimating true evolutionary distances under the DCJ model.

    Science.gov (United States)

    Lin, Yu; Moret, Bernard M E

    2008-07-01

    Modern techniques can yield the ordering and strandedness of genes on each chromosome of a genome; such data already exists for hundreds of organisms. The evolutionary mechanisms through which the set of the genes of an organism is altered and reordered are of great interest to systematists, evolutionary biologists, comparative genomicists and biomedical researchers. Perhaps the most basic concept in this area is that of evolutionary distance between two genomes: under a given model of genomic evolution, how many events most likely took place to account for the difference between the two genomes? We present a method to estimate the true evolutionary distance between two genomes under the 'double-cut-and-join' (DCJ) model of genome rearrangement, a model under which a single multichromosomal operation accounts for all genomic rearrangement events: inversion, transposition, translocation, block interchange and chromosomal fusion and fission. Our method relies on a simple structural characterization of a genome pair and is both analytically and computationally tractable. We provide analytical results to describe the asymptotic behavior of genomes under the DCJ model, as well as experimental results on a wide variety of genome structures to exemplify the very high accuracy (and low variance) of our estimator. Our results provide a tool for accurate phylogenetic reconstruction from multichromosomal gene rearrangement data as well as a theoretical basis for refinements of the DCJ model to account for biological constraints. All of our software is available in source form under GPL at http://lcbb.epfl.ch.

  14. Future trends in power plant process computer techniques

    International Nuclear Information System (INIS)

    Dettloff, K.

    1975-01-01

    The development of new process computer concepts has advanced in great steps. These steps fall into three areas: hardware, software, and application concepts. In hardware, new computers with new peripherals, such as colour layer equipment, have been developed. In software, a decisive step has been made in the area of 'automation software'. Through these components, progress has also been made in incorporating the process computer into the structure of the overall power plant control system. (orig./LH) [de

  15. Training Software in Artificial-Intelligence Computing Techniques

    Science.gov (United States)

    Howard, Ayanna; Rogstad, Eric; Chalfant, Eugene

    2005-01-01

    The Artificial Intelligence (AI) Toolkit is a computer program for training scientists, engineers, and university students in three soft-computing techniques (fuzzy logic, neural networks, and genetic algorithms) used in artificial-intelligence applications. The program provides an easily understandable tutorial interface, including an interactive graphical component through which the user can gain hands-on experience in soft-computing techniques applied to realistic example problems. The tutorial provides step-by-step instructions on the workings of soft-computing technology, whereas the hands-on examples allow interaction and reinforcement of the techniques explained throughout the tutorial. In the fuzzy-logic example, a user can interact with a robot and an obstacle course to verify how fuzzy logic is used to command a rover traverse from an arbitrary start to the goal location. For the genetic algorithm example, the problem is to determine the minimum-length path for visiting a user-chosen set of planets in the solar system. For the neural-network example, the problem is to decide, on the basis of input data on physical characteristics, whether a person is a man, woman, or child. The AI Toolkit is compatible with the Windows 95, 98, ME, NT 4.0, 2000, and XP operating systems. A computer having a processor speed of at least 300 MHz and random-access memory of at least 56 MB is recommended for optimal performance. The program can be run on a slower computer having less memory, but some functions may not be executed properly.
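
    The genetic-algorithm example described above is a small tour problem; a minimal permutation GA of that flavor is sketched here. The planet set, distances and operators are hypothetical, since the Toolkit's actual encoding is not documented in this record.

      import random

      PLANETS = ["Mercury", "Venus", "Mars", "Jupiter", "Saturn"]
      DIST = {frozenset((a, b)): random.uniform(0.5, 10.0)  # stand-in distances (AU)
              for i, a in enumerate(PLANETS) for b in PLANETS[i + 1:]}

      def tour_length(order):
          return sum(DIST[frozenset(pair)] for pair in zip(order, order[1:]))

      def ga(generations=200, pop_size=30):
          pop = [random.sample(PLANETS, len(PLANETS)) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=tour_length)
              survivors = pop[: pop_size // 2]          # keep the shortest tours
              children = []
              for parent in survivors:
                  child = parent[:]
                  i, j = random.sample(range(len(child)), 2)  # swap mutation
                  child[i], child[j] = child[j], child[i]
                  children.append(child)
              pop = survivors + children
          return min(pop, key=tour_length)

      print(ga())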

  16. Computer animation algorithms and techniques

    CERN Document Server

    Parent, Rick

    2012-01-01

    Driven by the demands of research and the entertainment industry, the techniques of animation are pushed to render increasingly complex objects with ever-greater life-like appearance and motion. This rapid progression of knowledge and technique impacts professional developers, as well as students. Developers must maintain their understanding of conceptual foundations, while their animation tools become ever more complex and specialized. The second edition of Rick Parent's Computer Animation is an excellent resource for the designers who must meet this challenge. The first edition establ

  17. Computational intelligence techniques in health care

    CERN Document Server

    Zhou, Wengang; Satheesh, P

    2016-01-01

    This book presents research on emerging computational intelligence techniques and tools, with a particular focus on new trends and applications in health care. Healthcare is a multi-faceted domain, which incorporates advanced decision-making, remote monitoring, healthcare logistics, operational excellence and modern information systems. In recent years, the use of computational intelligence methods to address the scale and the complexity of the problems in healthcare has been investigated. This book discusses various computational intelligence methods that are implemented in applications in different areas of healthcare. It includes contributions by practitioners, technology developers and solution providers.

  18. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Apr 2, 2016 ... Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. ... discovered and optimised using a large-scale parallel computational device and relevant soft-

  19. Multi-Detector Computed Tomography Imaging Techniques in Arterial Injuries

    Directory of Open Access Journals (Sweden)

    Cameron Adler

    2018-04-01

    Full Text Available Cross-sectional imaging has become a critical aspect of the evaluation of arterial injuries. In particular, angiography using computed tomography (CT) is the imaging modality of choice. A variety of techniques and options are available when evaluating for arterial injuries. Techniques involve the contrast bolus, various phases of contrast enhancement, multiplanar reconstruction, volume rendering, and maximum intensity projection. After the images are rendered, a variety of features may be seen that diagnose the injury. This article provides a general overview of the techniques, important findings, and pitfalls in cross-sectional imaging of arterial injuries, particularly in relation to computed tomography. In addition, future directions of computed tomography, including a few techniques in the process of development, are also discussed.

  20. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  1. Operator support system using computational intelligence techniques

    International Nuclear Information System (INIS)

    Bueno, Elaine Inacio; Pereira, Iraci Martinez

    2015-01-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules. It is typical behavior of such systems to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system will show the information obtained by different CI techniques in order to help operators make decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)

  2. Operator support system using computational intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bueno, Elaine Inacio, E-mail: ebueno@ifsp.edu.br [Instituto Federal de Educacao, Ciencia e Tecnologia de Sao Paulo (IFSP), Sao Paulo, SP (Brazil); Pereira, Iraci Martinez, E-mail: martinez@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Computational Intelligence Systems have been widely applied in Monitoring and Fault Detection Systems in several processes and in different kinds of applications. These systems use interdependent components ordered in modules. It is typical behavior of such systems to ensure early detection and diagnosis of faults. Monitoring and Fault Detection Techniques can be divided into two categories: estimative and pattern recognition methods. The estimative methods use a mathematical model which describes the process behavior. The pattern recognition methods use a database to describe the process. In this work, an operator support system using Computational Intelligence Techniques was developed. This system will show the information obtained by different CI techniques in order to help operators make decisions in real time and to guide them in fault diagnosis before the normal alarm limits are reached. (author)

  3. Mean-Potential Law in Evolutionary Games

    Science.gov (United States)

    Nałecz-Jawecki, Paweł; Miekisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
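
    For context, the quantity the mean-potential method is designed to compute has a classical closed form in simple birth-death processes: the fixation probability of a single mutant in a population of size N with two absorbing states. The textbook formula below (not the Letter's potential construction) uses the ratios gamma_j = T-(j)/T+(j) of death to birth transition rates at j mutants.

      def fixation_probability(gamma):
          """Fixation probability of one mutant; gamma[j] = T-(j)/T+(j), j = 1..N-1."""
          total, prod = 1.0, 1.0
          for g in gamma:
              prod *= g
              total += prod
          return 1.0 / total

      # Neutral drift in a population of N = 10: all ratios are 1, so rho = 1/N.
      print(fixation_probability([1.0] * 9))   # 0.1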

  4. Approximate Computing Techniques for Iterative Graph Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh; Kalyanaraman, Anantharaman; Chavarria Miranda, Daniel G.; Krishnamoorthy, Sriram

    2017-12-18

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also due to the availability of large-scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale the performance with minimal loss of accuracy. We present several heuristics, including loop perforation, data caching, incomplete graph coloring and synchronization, and evaluate their efficiency. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques will enable scalable graph analytics on data of importance to several applications in science and their subsequent adoption to scale similar graph algorithms.
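
    As an illustration of the first heuristic named above, the sketch below applies loop perforation to PageRank: a random fraction of per-vertex updates is skipped each iteration, trading a little accuracy for speed. The perforation rate, damping factor and toy graph are illustrative, and dangling-node mass is deliberately ignored to keep the sketch short.

      import random

      def pagerank_perforated(adj, iters=30, d=0.85, skip=0.2):
          """adj: dict node -> list of out-neighbors; skip: fraction of updates dropped."""
          n = len(adj)
          rank = {v: 1.0 / n for v in adj}
          for _ in range(iters):
              nxt = {v: (1.0 - d) / n for v in adj}
              for v, outs in adj.items():
                  if random.random() < skip or not outs:   # perforate this update
                      continue                             # (accuracy trade-off)
                  share = d * rank[v] / len(outs)
                  for u in outs:
                      nxt[u] += share
              rank = nxt
          return rank

      adj = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
      print(pagerank_perforated(adj))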

  5. Numerical Computational Technique for Scattering from Underwater Objects

    OpenAIRE

    T. Ratna Mani; Raj Kumar; Odamapally Vijay Kumar

    2013-01-01

    This paper presents a computational technique for mono-static and bi-static scattering from underwater objects of different shapes, such as submarines. The scattering has been computed using the finite element time domain (FETD) method, based on the superposition of reflections from the different elements reaching the receiver at a particular instant in time. The results calculated by this method have been verified against published results based on the ramp response technique. An in-depth parametric s...

  6. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...... of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...

  7. Development of computer-aided auto-ranging technique for a computed radiography system

    International Nuclear Information System (INIS)

    Ishida, M.; Shimura, K.; Nakajima, N.; Kato, H.

    1988-01-01

    For a computed radiography system, the authors developed a computer-aided auto-ranging technique in which the clinically useful image data are automatically mapped to the available display range. The preread image data are inspected to determine the location of collimation. A histogram of the pixels inside the collimation is evaluated for characteristic values such as maxima and minima, and the optimal density and contrast are then derived for the display image. The effect of the auto-ranging technique was investigated at several hospitals in Japan. The average rate of films lost due to undesirable density or contrast was about 0.5%
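
    A minimal sketch of the histogram-based mapping described above: characteristic values are taken from the pixels inside the collimation, and the clinically useful range is mapped onto the display range. The percentile-based choice of minimum and maximum is an assumption; the actual CR system's mapping is not specified in this record.

      import numpy as np

      def auto_range(preread, collimation_mask, display_max=1023):
          """Map clinically useful data inside the collimation to the display range."""
          vals = preread[collimation_mask]
          lo, hi = np.percentile(vals, [1, 99])    # robust minimum / maximum
          scaled = (preread - lo) / max(hi - lo, 1e-9)
          return np.clip(scaled, 0.0, 1.0) * display_max

      img = np.random.rand(256, 256) * 4000.0      # stand-in preread exposure data
      mask = np.zeros_like(img, dtype=bool)
      mask[32:224, 32:224] = True                  # region inside the collimation
      display = auto_range(img, mask)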

  8. Practical techniques for pediatric computed tomography

    International Nuclear Information System (INIS)

    Fitz, C.R.; Harwood-Nash, D.C.; Kirks, D.R.; Kaufman, R.A.; Berger, P.E.; Kuhn, J.P.; Siegel, M.J.

    1983-01-01

    Dr. Donald Kirks has assembled this section on Practical Techniques for Pediatric Computed Tomography. The material is based on a presentation in the Special Interest session at the 25th Annual Meeting of the Society for Pediatric Radiology in New Orleans, Louisiana, USA in 1982. Meticulous attention to detail and technique is required to ensure an optimal CT examination. CT techniques specifically applicable to infants and children have not been disseminated in the radiology literature and in this respect it may rightly be observed that ''the child is not a small adult''. What follows is a ''cookbook'' prepared by seven participants and it is printed in Pediatric Radiology, in outline form, as a statement of individual preferences for pediatric CT techniques. This outline gives concise explanation of techniques and permits prompt dissemination of information. (orig.)

  9. Computer Assisted Audit Techniques

    Directory of Open Access Journals (Sweden)

    Eugenia Iancu

    2007-01-01

    Full Text Available From the modern point of view, audit takes into account especially the information systems, representing mainly the examination performed by a professional as regards the manner of developing an activity by means of comparing it to the quality criteria specific to this activity. Having as a reference point this very general definition of auditing, it must be emphasized that the best known segment of auditing is the financial audit, which had an evolution parallel to that of accountancy. The present-day phase of development of the financial audit has as its main trait the internationalization of the accounting profession. Worldwide there are multinational companies that offer services in the financial auditing, taxing and consultancy domain. The auditors, natural persons and audit companies, take part in the work of the national and international authorities for setting out norms in the accountancy and auditing domain. The computer assisted audit techniques can be classified in several manners according to the approaches used by the auditor. The most well-known techniques are comprised in the following categories: test data techniques, integrated test, parallel simulation, revising the program logic, programs developed upon request, generalized audit software, utility programs and expert systems.

  10. An Artificial Immune System-Inspired Multiobjective Evolutionary Algorithm with Application to the Detection of Distributed Computer Network Intrusions

    Science.gov (United States)

    2007-03-01

    Coello, Van Veldhuizen, and Lamont define global optimization as "the process of finding the global minimum within some search space S" [CVL02... Technology, Shapes Markets, and Manages People, Simon & Schuster, New York, 1995. [CVL02] Coello, C., Van Veldhuizen, D., Lamont, G.B., Evolutionary... Anomaly Detection, Technical Report CS-2003-02, Computer Science Department, Florida Institute of Technology, 2003. [Marmelstein99] Marmelstein, R., Van

  11. Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.

    Science.gov (United States)

    Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo

    2015-11-01

    The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. The Visual Analogue Scale assessed the pain variable after anesthetic infiltration. Patient satisfaction was evaluated using the Likert Scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but no statistically significant differences were found (P > 0.05). Patient satisfaction showed no statistically significant differences. The average runtimes of the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but did not show statistically significant differences when contrasted to the conventional technique.

  12. Application of computer technique in SMCAMS

    International Nuclear Information System (INIS)

    Lu Deming

    2001-01-01

    A series of applications of computer techniques in SMCAMS physics design and magnetic field measurement is described, including digital calculation of electromagnetic fields, beam dynamics, calculation of beam injection and extraction, and mapping and shaping of the magnetic field

  13. Avoiding Local Optima with Interactive Evolutionary Robotics

    Science.gov (United States)

    2012-07-09

    ...the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will... The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the

  14. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  15. Application of computer technique in the reconstruction of Chinese ancient buildings

    Science.gov (United States)

    Li, Deren; Yang, Jie; Zhu, Yixuan

    2003-01-01

    This paper offers an introduction to the computer-based assembly and simulation of ancient buildings. Pioneering research was carried out by investigators in surveying and mapping who described ancient Chinese timber buildings by 3D frame graphs with computers. But users can only understand the structural layers and the assembly process of these buildings if the frame graphs are processed further with a computer. This can be implemented by computer simulation techniques, which display the raw data on the screen of a computer and interactively manage them by combining technologies from computer graphics and image processing, multi-media technology, artificial intelligence, highly parallel real-time computation and human behavior science. This paper presents the implementation procedure of simulation for large-sized wooden buildings, as well as 3D dynamic assembly of these buildings under the 3DS MAX environment. The results of the computer simulation are also shown in the paper.

  16. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    Science.gov (United States)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for simpler and more efficient models with better prediction capability. In this paper, an evolutionary framework is proposed that uses an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets, USD/CAD, USD/CHF, and USD/JPY, accumulated over the same period of time. The model performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and the particle swarm optimization algorithm. Practical analysis of the results suggests that the proposed model developed using the ISFL algorithm with the CEFLANN network is a promising predictor for currency exchange rate prediction compared to the other models included in the study.
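
    A functional link network of the CEFLANN type avoids hidden layers by expanding each input with nonlinear basis functions and training only a linear output layer. The trigonometric expansion below is a common choice and is assumed here; the paper's exact expansion and the ISFL training loop are not reproduced.

      import math

      def functional_expansion(x, order=2):
          """Expand one input into [x, sin(k*pi*x), cos(k*pi*x)] for k = 1..order."""
          feats = [x]
          for k in range(1, order + 1):
              feats += [math.sin(k * math.pi * x), math.cos(k * math.pi * x)]
          return feats

      def predict(rates, weights, bias):
          """Linear model over the expanded features of a window of past rates."""
          feats = [f for r in rates for f in functional_expansion(r)]
          return bias + sum(w * f for w, f in zip(weights, feats))

      # Window of three (normalized) past rates -> 15 expanded features.
      print(predict([0.31, 0.29, 0.30], [0.01] * 15, bias=0.5))

    In the paper's framework the weights and bias would be the decision variables optimized by the ISFL algorithm against a prediction-error objective.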

  17. THE COMPUTATIONAL INTELLIGENCE TECHNIQUES FOR PREDICTIONS - ARTIFICIAL NEURAL NETWORKS

    OpenAIRE

    Mary Violeta Bar

    2014-01-01

    Computational intelligence techniques are used in problems which cannot be solved by traditional techniques, when there is insufficient data to develop a model of the problem, or when the data contain errors. Computational intelligence, as Bezdek called it (Bezdek, 1992), aims at the modeling of biological intelligence. Artificial Neural Networks (ANNs) have been applied to an increasing number of real-world problems of considerable complexity. Their most important advantage is solving problems that are too c...

  18. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  19. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interaction amongst their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of predicting SIPs is how to exploit computational approaches for SIPs detection based on the evolutionary information contained in the protein sequence. In this work, we present a novel computational approach named WELM-LAG, which combines the Weighed-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs based on the protein sequence. The major improvement of our method lies in an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position specific scoring matrix (PSSM), and in employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on human and yeast datasets, respectively. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .

  20. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Becks, Karl-Heinz; Perret-Gallix, Denis

    1994-01-01

    New techniques were highlighted by the "Third International Workshop on Software Engineering, Artificial Intelligence and Expert Systems for High Energy and Nuclear Physics" in Oberammergau, Bavaria, Germany, from October 4 to 8. It was the third workshop in the series; the first was held in Lyon in 1990 and the second at a France-Telecom site near La Londe les Maures in 1992. This series of workshops covers a broad spectrum of problems. New, highly sophisticated experiments demand new techniques in computing, in hardware as well as in software. Software Engineering Techniques could in principle satisfy the needs of forthcoming accelerator experiments. The growing complexity of detector systems demands new techniques in experimental error diagnosis and repair suggestions; Expert Systems seem to offer a way of assisting the experimental crew during data-taking

  1. Gender approaches to evolutionary multi-objective optimization using pre-selection of criteria

    Science.gov (United States)

    Kowalczuk, Zdzisław; Białaszewski, Tomasz

    2018-01-01

    A novel idea for performing evolutionary computations (ECs) to solve highly dimensional multi-objective optimization (MOO) problems is proposed. Following the general idea of evolution, it is proposed that information about gender is used to distinguish between various groups of objectives and identify the (aggregate) nature of optimality of individuals (solutions). This identification is drawn from the fitness of individuals and applied during parental crossover in the processes of evolutionary multi-objective optimization (EMOO). The article introduces the principles of the genetic-gender approach (GGA) and the virtual gender approach (VGA), which are not just evolutionary techniques but constitute a completely new rule (philosophy) for solving MOO tasks. The proposed approaches are validated against principal representatives of state-of-the-art EMOO algorithms on benchmark problems in the light of recognized EC performance criteria. The research shows the superiority of the gender approach in terms of effectiveness, reliability, transparency, intelligibility and MOO problem simplification, resulting in the great usefulness and practicability of GGA and VGA. Moreover, an important feature of GGA and VGA is that they alleviate the 'curse' of dimensionality typical of many engineering designs.

  2. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    Science.gov (United States)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition employing a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The input to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of input letters. The evolutionary algorithm is used for the global search of the search space and the back-propagation algorithm is used for the local search. The results have been computed by implementing this approach to recognize 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying results with relatively scarce input data, and a promising improvement in the convergence of the hybrid evolutionary back-propagation algorithm is exhibited.

  3. Evolutionary robotics – A review

    Indian Academy of Sciences (India)

    a need for a technique by which the robot is able to acquire new behaviours automatically... Evolutionary robotics is a comparatively new field of robotics research, which seems to... Technical Report: PCIA-94-04, Institute of Psychology, ...

  4. Genomes, Phylogeny, and Evolutionary Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Monica

    2005-03-25

    With the completion of the human genome and the growing number of diverse genomes being sequenced, a new age of evolutionary research is currently taking shape. The myriad technological breakthroughs in biology are leading to the unification of broad scientific fields such as molecular biology, biochemistry, physics, mathematics and computer science into what is now known as systems biology. Here I present an overview, with an emphasis on eukaryotes, of how the postgenomics era is adopting comparative approaches that go beyond comparisons among model organisms to shape the nascent field of evolutionary systems biology.

  5. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and this is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described, followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. New coding technique for computer generated holograms.

    Science.gov (United States)

    Haskell, R. E.; Culver, B. C.

    1972-01-01

    A coding technique is developed for recording computer generated holograms on a computer controlled CRT in which each resolution cell contains two beam spots of equal size and equal intensity. This provides a binary hologram in which only the position of the two dots is varied from cell to cell. The amplitude associated with each resolution cell is controlled by selectively diffracting unwanted light into a higher diffraction order. The recording of the holograms is fast and simple.

  7. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  8. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    Science.gov (United States)

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

    To review recent advances in the research and application of computer-aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer-aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer-aided forming techniques for bone scaffold construction using various scaffold materials, based on computer aided design (CAD) and bone scaffold rapid prototyping (RP). CAD includes medical CAD, STL, and reverse design. Reverse design can fully simulate normal bone tissue and could be very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer-aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  9. XTALOPT: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Lonie, David C.; Zurek, Eva

    2011-02-01

    The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator, which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation-based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy to use, intuitive graphical interface. Program summary: Program title: XTALOPT. Catalogue identifier: AEGX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPL v2.1 or later [1]. No. of lines in distributed program, including test data, etc.: 36 849. No. of bytes in distributed program, including test data, etc.: 1 149 399. Distribution format: tar.gz. Programming language: C++. Computer: PCs, workstations, or clusters. Operating system: Linux. Classification: 7.7. External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on their potential energy surface. Our evolutionary algorithm, XTALOPT, is freely

  10. [Cardiac computed tomography: new applications of an evolving technique].

    Science.gov (United States)

    Martín, María; Corros, Cecilia; Calvo, Juan; Mesa, Alicia; García-Campos, Ana; Rodríguez, María Luisa; Barreiro, Manuel; Rozado, José; Colunga, Santiago; de la Hera, Jesús M; Morís, César; Luyando, Luis H

    2015-01-01

    In recent years we have witnessed an increasing development of imaging techniques applied in cardiology. Among them, cardiac computed tomography is an emerging and evolving technique. With the current possibility of very low radiation studies, its applications have expanded beyond coronary angiography. In the present article we review the technical developments of cardiac computed tomography and its new applications. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  11. Evolutionary programming-based univector field navigation method for fast mobile robots.

    Science.gov (United States)

    Kim, Y J; Kim, J H; Kwon, D S

    2001-01-01

    Most navigation techniques with obstacle avoidance do not consider the robot orientation at the target position; they deal with the robot position only, independently of its orientation and velocity. To solve these problems, this paper proposes a novel univector field method for fast mobile robot navigation, which introduces a normalized two-dimensional vector field. The method provides fast moving robots with the desired posture at the target position and obstacle avoidance. To obtain the sub-optimal vector field, a function approximator is used and trained by evolutionary programming. Two kinds of vector fields are trained, one for final posture acquisition and the other for obstacle avoidance. Computer simulations and real experiments are carried out for a fast moving mobile robot to demonstrate the effectiveness of the proposed scheme.
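
    For illustration only, a posture-acquisition univector field can be written by hand as a normalized direction field that steers toward a virtual point placed behind the target along the desired final heading; this is an assumed minimal form, not the paper's evolved field, and the offset d is a hypothetical parameter:

        import numpy as np

        def univector(pos, target, goal_heading, d=0.3):
            # Virtual point a distance d behind the target along the desired heading.
            pos, target = np.asarray(pos, float), np.asarray(target, float)
            behind = target - d * np.array([np.cos(goal_heading), np.sin(goal_heading)])
            # Far away, aim at the virtual point; near the target, aim at the target,
            # so trajectories arrive aligned with goal_heading.
            aim = behind if np.linalg.norm(target - pos) > d else target
            v = aim - pos
            n = np.linalg.norm(v)
            return v / n if n > 1e-9 else np.zeros(2)  # unit vector (the "univector")

    In the paper, the field is instead obtained by training a function approximator with evolutionary programming against a fitness that scores the final posture and collision avoidance.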

  12. Optimum topology for radial networks using evolutionary programming; Topologia optima de redes radiais utilizando programacao evolucionaria

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Joao Luis [Instituto de Engenhariade Sistemas e Computadores (INESC), Porto (Portugal). E-mail: jpinto@duque.inescn.pt; Proenca, Luis Miguel [Instituto Superior de Linguas e Administracao (ISLA), Gaia (Portugal). E-mail: lproenca@inescn.pt

    1999-07-01

    This paper describes the use of Evolutionary Programming techniques to determine the topology of radial electric networks, considering investment costs and losses. The work aims to demonstrate the ease of coding and implementation, as well as the parallelism implicit in the method, which yields outstanding performance levels. As a test example, a network with 43 buses and 75 alternative lines is used, and an implementation of the algorithm on an object-oriented platform is described.
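
    The ease of coding claimed above can be conveyed with a sketch of a radiality-preserving mutation under assumed data structures (edges as node-pair tuples with comparable labels); this is illustrative, not the authors' implementation. Removing one line splits the spanning tree in two; re-closing it with any candidate line bridging the two parts yields another radial topology:

        import random

        def mutate_radial(tree, candidates, nodes):
            # tree: set of edges forming a spanning tree; candidates: all permissible lines.
            removed = random.choice(sorted(tree))
            remaining = tree - {removed}
            adj = {n: set() for n in nodes}
            for a, b in remaining:
                adj[a].add(b)
                adj[b].add(a)
            # Flood-fill the component containing one endpoint of the removed line.
            comp, frontier = {removed[0]}, [removed[0]]
            while frontier:
                n = frontier.pop()
                for m in adj[n]:
                    if m not in comp:
                        comp.add(m)
                        frontier.append(m)
            # Any candidate line with exactly one endpoint inside comp restores radiality.
            bridges = [e for e in candidates - remaining
                       if (e[0] in comp) != (e[1] in comp)]
            return remaining | {random.choice(bridges)}

    An EP loop would then score each mutated topology by investment cost plus losses and keep the better networks.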

  13. Evolutionary autonomous agents and the nature of apraxia

    Directory of Open Access Journals (Sweden)

    Jin Frank

    2005-01-01

    Full Text Available Abstract Background Evolutionary autonomous agents are robots or robot simulations whose controller is a dynamical neural network and whose evolution occurs autonomously under the guidance of a fitness function without the detailed or explicit direction of an external programmer. They are embodied agents with a simple neural network controller and as such they provide the optimal forum by which sensorimotor interactions in a specified environment can be studied without the computational assumptions inherent in standard neuroscience. Methods Evolutionary autonomous agents were evolved that were able to perform identical movements under two different contexts, one which represented an automatic movement and one which had a symbolic context. In an attempt to model the automatic-voluntary dissociation frequently seen in ideomotor apraxia, lesions were introduced into the neural network controllers resulting in a behavioral dissociation with loss of the ability to perform the movement which had a symbolic context and preservation of the simpler, automatic movement. Results Analysis of the changes in the hierarchical organization of the networks in the apractic EAAs demonstrated consistent changes in the network dynamics across all agents with loss of longer duration time scales in the network dynamics. Conclusion The concepts of determinate motor programs and perceptual representations that are implicit in the present day understanding of ideomotor apraxia are assumptions inherent in the computational understanding of brain function. The strength of the present study using EAAs to model one aspect of ideomotor apraxia is the absence of these assumptions and a grounding of all sensorimotor interactions in an embodied, autonomous agent. The consistency of the hierarchical changes in the network dynamics across all apractic agents demonstrates that this technique is tenable and will be a valuable adjunct to a computational formalism in the understanding

  14. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms has grown in recent years for solving real engineering problems that demand intense computational calculations, especially when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts with evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  15. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability, but they are restricted to particular methodologies and a limited number of parameters. A variety of techniques and methodologies may be used for reliability prediction, and parameter selection deserves particular attention, since the estimated reliability of a system may increase or decrease depending on the parameters chosen. There is thus a need to identify the factors that most heavily affect system reliability. Reusability, now widely studied across research areas, is the basis of Component-Based Systems (CBS); cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities exist for applying soft computing techniques to medical problems: clinical medicine uses fuzzy logic and neural network methodologies significantly, while basic medical science most frequently uses neural-network and genetic-algorithm approaches, and medical scientists have shown strong interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with savings of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques: Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents the working of soft computing

  16. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
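
    A minimal sketch of the FFD building block may help: points inside a bounding box move by a Bernstein-weighted sum of control-lattice displacements, so the shape and its surrounding grid can be morphed together without any direct shape parameterization. The 2-D form and lattice here are illustrative assumptions:

        import numpy as np
        from math import comb

        def ffd_2d(points, dP, box_min, box_max):
            # points: (N, 2) coordinates inside the box; dP: (m+1, n+1, 2) control-point
            # displacements. Bernstein weights make the deformation smooth everywhere.
            m, n = dP.shape[0] - 1, dP.shape[1] - 1
            st = (points - box_min) / (box_max - box_min)  # local (s, t) in [0, 1]^2
            out = points.astype(float).copy()
            for i in range(m + 1):
                bi = comb(m, i) * st[:, 0] ** i * (1 - st[:, 0]) ** (m - i)
                for j in range(n + 1):
                    bj = comb(n, j) * st[:, 1] ** j * (1 - st[:, 1]) ** (n - j)
                    out += (bi * bj)[:, None] * dP[i, j]
            return out

    In an MbSO-style loop, the evolutionary algorithm would evolve the lattice displacements dP while the airfoil points and the CFD grid are morphed accordingly.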

  17. Evolutionary dynamics of complex communications networks

    CERN Document Server

    Karyotis, Vasileios; Papavassiliou, Symeon

    2013-01-01

    Until recently, most network design techniques employed a bottom-up approach, with lower protocol layer mechanisms affecting the development of higher ones. This approach, however, has not yielded satisfactory results in the case of wireless distributed networks. Addressing the emerging aspects of modern network analysis and design, Evolutionary Dynamics of Complex Communications Networks introduces and develops a top-down approach where elements of the higher layer can be exploited in modifying the lowest physical topology-closing the network design loop in an evolutionary fashion similar to

  18. Information granularity, big data, and computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2015-01-01

    Recent pursuits in the realm of big data processing, interpretation, collection and organization have emerged in numerous sectors, including business, industry, and government organizations. Data sets such as customer transactions for a mega-retailer, weather monitoring, and intelligence gathering quickly outpace the capacities of traditional techniques and tools of data analysis. The 3V (volume, variety and velocity) challenges led to the emergence of new techniques and tools in data visualization, acquisition, and serialization. Soft Computing, regarded as a plethora of technologies of fuzzy sets (or Granular Computing), neurocomputing and evolutionary optimization, brings forward a number of unique features that might be instrumental to the development of concepts and algorithms to deal with big data. This carefully edited volume provides the reader with an updated, in-depth material on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligenc...

  19. Computational techniques in tribology and material science at the atomic level

    Science.gov (United States)

    Ferrante, J.; Bozzolo, G. H.

    1992-01-01

    Computations in tribology and material science at the atomic level present considerable difficulties. Computational techniques ranging from first-principles to semi-empirical and their limitations are discussed. Example calculations of metallic surface energies using semi-empirical techniques are presented. Finally, application of the methods to calculation of adhesion and friction are presented.

  20. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
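
    A toy sketch of the multifactorial idea (operators and parameters are illustrative assumptions, not the authors' algorithm): a single population serves several tasks at once, each individual carries a "skill factor" naming the task it performs best on, and occasional cross-task mating is the channel through which information transfers between problems:

        import random

        def mfea(tasks, dim, pop=40, gens=2000, rmp=0.3):
            # tasks: list of minimization functions over [0, 1]^dim sharing one population.
            P = [[random.random() for _ in range(dim)] for _ in range(pop)]
            sf = [i % len(tasks) for i in range(pop)]          # skill factors
            fit = [tasks[sf[i]](P[i]) for i in range(pop)]
            for _ in range(gens):
                a, b = random.sample(range(pop), 2)
                if sf[a] == sf[b] or random.random() < rmp:    # assortative mating
                    child = [(x + y) / 2 for x, y in zip(P[a], P[b])]
                else:                                          # otherwise mutate one parent
                    child = [min(1.0, max(0.0, x + random.gauss(0, 0.1))) for x in P[a]]
                t = sf[random.choice([a, b])]                  # child imitates a parent's task
                f = tasks[t](child)
                # Replace the worst current individual of task t if the child improves on it.
                w = max((i for i in range(pop) if sf[i] == t), key=lambda i: fit[i])
                if f < fit[w]:
                    P[w], sf[w], fit[w] = child, t, f
            return P, sf, fit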

  1. Comparison of radiographic technique by computer simulation

    International Nuclear Information System (INIS)

    Brochi, M.A.C.; Ghilardi Neto, T.

    1989-01-01

    A computational algorithm for comparing radiographic techniques (kVp, mAs and filters) is developed, based on fixing the parameters that define the image, such as optical density and contrast. Prior to the experiment, the results were applied to a chest radiograph. (author) [pt

  2. Resistance and relatedness on an evolutionary graph

    Science.gov (United States)

    Maciejewski, Wes

    2012-01-01

    When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph—individuals as nodes, and whom they may interact with as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area, as highly technical mathematics is often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more, however, to the study of population genetics than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
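
    The electrical side of the result is simple to compute. A minimal sketch using the standard Laplacian-pseudoinverse identity for effective resistance (the mapping from resistance to relatedness follows the paper and is not reproduced here):

        import numpy as np

        def effective_resistance(adjacency):
            # Treat each edge as a unit resistor; R_ij = L+_ii + L+_jj - 2 L+_ij,
            # where L+ is the Moore-Penrose pseudoinverse of the graph Laplacian.
            A = np.asarray(adjacency, dtype=float)
            L = np.diag(A.sum(axis=1)) - A
            Lp = np.linalg.pinv(L)
            d = np.diag(Lp)
            return d[:, None] + d[None, :] - 2 * Lp

        # Example: on a 3-node path, the end-to-end resistance is 1 + 1 = 2.
        R = effective_resistance([[0, 1, 0], [1, 0, 1], [0, 1, 0]])
        print(round(R[0, 2], 6))  # -> 2.0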

  3. Evolutionary Robotics: What, Why, and Where to

    Directory of Open Access Journals (Sweden)

    Stephane eDoncieux

    2015-03-01

    Full Text Available Evolutionary robotics applies the selection, variation, and heredity principles of natural evolution to the design of robots with embodied intelligence. It can be considered as a subfield of robotics that aims to create more robust and adaptive robots. A pivotal feature of the evolutionary approach is that it considers the whole robot at once, and enables the exploitation of robot features in a holistic manner. Evolutionary robotics can also be seen as an innovative approach to the study of evolution based on a new kind of experimentalism. The use of robots as a substrate can help address questions that are difficult, if not impossible, to investigate through computer simulations or biological studies. In this paper we consider the main achievements of evolutionary robotics, focusing particularly on its contributions to both engineering and biology. We briefly elaborate on methodological issues, review some of the most interesting findings, and discuss important open issues and promising avenues for future work.

  4. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm opti

  5. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  6. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  7. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing the general constrained optimization problems using evolutionary algorithms. Broadly the topics covered include constraint handling for single and multi-objective optimizations; penalty function based methodology; multi-objective based methodology; new constraint handling mechanism; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mix variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environment. There is also a separate chapter on hybrid optimization, which is gaining lots of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novice, and experts alike. The book will also be useful...

  8. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think for instance of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter distinguished by the slow, continuous process that we call evolution. In this paper we try to identify the organizational level of life, which spans several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structure, and try to put into evidence the role and the emergence of network structure in such systems.

  9. Economic modeling using evolutionary algorithms : the effect of binary encoding of strategies

    NARCIS (Netherlands)

    Waltman, L.R.; Eck, van N.J.; Dekker, Rommert; Kaymak, U.

    2011-01-01

    We are concerned with evolutionary algorithms that are employed for economic modeling purposes. We focus in particular on evolutionary algorithms that use a binary encoding of strategies. These algorithms, commonly referred to as genetic algorithms, are popular in agent-based computational economics

  10. Computer-Assisted Technique for Surgical Tooth Extraction

    Directory of Open Access Journals (Sweden)

    Hosamuddin Hamza

    2016-01-01

    Full Text Available Introduction. Surgical tooth extraction is a common procedure in dentistry. However, numerous extraction cases show a high level of difficulty in practice. This difficulty is usually related to inadequate visualization, improper instrumentation, or other factors related to the targeted tooth (e.g., ankylosis or the presence of a bony undercut). Methods. In this work, the author presents a new technique for surgical tooth extraction based on 3D imaging, computer planning, and a new concept of computer-assisted manufacturing. Results. The outcome of this work is a surgical guide made by 3D printing of plastics and CNC of metals (hybrid outcome). In addition, the conventional surgical cutting tools (surgical burs) are modified with a number of stoppers adjusted to avoid any excessive drilling that could harm bone or other vital structures. Conclusion. The present outcome could provide a minimally invasive technique to overcome the routine complications facing dental surgeons in surgical extraction procedures.

  11. Computational Intelligence Techniques for New Product Design

    CERN Document Server

    Chan, Kit Yan; Dillon, Tharam S

    2012-01-01

    Applying computational intelligence for product design is a fast-growing and promising research area in computer sciences and industrial engineering. However, there is currently a lack of books, which discuss this research area. This book discusses a wide range of computational intelligence techniques for implementation on product design. It covers common issues on product design from identification of customer requirements in product design, determination of importance of customer requirements, determination of optimal design attributes, relating design attributes and customer satisfaction, integration of marketing aspects into product design, affective product design, to quality control of new products. Approaches for refinement of computational intelligence are discussed, in order to address different issues on product design. Cases studies of product design in terms of development of real-world new products are included, in order to illustrate the design procedures, as well as the effectiveness of the com...

  12. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    Science.gov (United States)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on experience or ad-hoc trial-and-error approaches to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility—whether or not emulating natural population dynamics improves the EC performance; (ii) comparatively study the underlying mechanisms—why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve the general goal [27][30]-[32]. In the following, I start with an extremely brief introduction on the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three

  13. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist act prevention measures and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability and efficiency related issues, such as reference database search speed, errors in recognizing biometric information, or automating biometric feature extraction. Current scientific investigations show that application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done on evolutionary algorithm application for biometric system parameter improvement.

  14. Deterministic network interdiction optimization via an evolutionary approach

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve deterministic network interdiction problems. The network interdiction problem solved considers the minimization of the maximum flow that can be transmitted between a source node and a sink node for a fixed network design, when there is a limited amount of resources available to interdict network links. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with its interdiction can change from link to link. For this problem, the solution approach developed is based on three steps that use: (1) Monte Carlo simulation, to generate potential network interdiction strategies; (2) the Ford-Fulkerson algorithm for maximum s-t flow, to analyze each strategy's maximum source-sink flow; and (3) an evolutionary optimization technique to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different sizes of networks and network behavior are used throughout the paper to illustrate the approach. In terms of computational effort, the results illustrate that solutions are obtained from a significantly restricted solution search space. Finally, the authors discuss the need for a reliability perspective to network interdiction, so that solutions developed address more realistic scenarios of such problems.
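
    A compact sketch of the three-step loop under stated assumptions: networkx supplies the maximum-flow computation (Ford-Fulkerson family), graph edges carry a 'capacity' attribute, and a cross-entropy-style probability update stands in for the paper's evolutionary step of learning how likely each link is to appear in the final strategy:

        import random
        import networkx as nx

        def interdict(G, s, t, budget, iters=50, samples=40, alpha=0.3):
            edges = list(G.edges())
            p = {e: 0.5 for e in edges}           # per-link interdiction probabilities
            best = (float("inf"), [])
            for _ in range(iters):
                pool = []
                for _ in range(samples):
                    # Step 1: sample a strategy biased by p (weighted sampling
                    # without replacement via Efraimidis-Spirakis keys).
                    keys = {e: random.random() ** (1.0 / p[e]) for e in edges}
                    cut = sorted(edges, key=lambda e: keys[e], reverse=True)[:budget]
                    # Step 2: score it by the residual s-t maximum flow.
                    H = G.copy()
                    H.remove_edges_from(cut)
                    pool.append((nx.maximum_flow_value(H, s, t), cut))
                pool.sort(key=lambda x: x[0])
                best = min(best, pool[0], key=lambda x: x[0])
                elite = [c for _, c in pool[: max(1, samples // 5)]]
                # Step 3: move link probabilities toward the best strategies found.
                for e in edges:
                    freq = sum(e in c for c in elite) / len(elite)
                    p[e] = min(0.99, max(0.01, (1 - alpha) * p[e] + alpha * freq))
            return best                           # (max flow after interdiction, links cut)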

  15. Computer technique for evaluating collimator performance

    International Nuclear Information System (INIS)

    Rollo, F.D.

    1975-01-01

    A computer program has been developed to theoretically evaluate the overall performance of collimators used with radioisotope scanners and γ cameras. The first step of the program involves the determination of the line spread function (LSF) and geometrical efficiency from the fundamental parameters of the collimator being evaluated. The working equations can be applied to any plane of interest. The resulting LSF is fed to subroutines which compute the corresponding modulation transfer function and contrast efficiency function. The latter function is then combined with appropriate geometrical efficiency data to determine the performance index function. The overall computer program allows one to predict, from the physical parameters of the collimator alone, how well the collimator will reproduce various sized spherical voids of activity in the image plane. The collimator performance program can be used to compare the performance of various collimator types, to study the effects of source depth on collimator performance, and to assist in the design of collimators. The theory of the collimator performance equation is discussed, a comparison between the experimental and theoretical LSF values is made, and examples of the application of the technique are presented.
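
    The LSF-to-MTF step the program chain performs is, at its core, a normalized Fourier transform. A minimal sketch of that step alone (the Gaussian LSF in the example is hypothetical, and this is not the original program):

        import numpy as np

        def mtf_from_lsf(lsf, dx):
            # MTF = |FT(LSF)|, normalized to 1 at zero spatial frequency.
            spectrum = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(len(lsf), d=dx)  # cycles per unit length
            return freqs, spectrum / spectrum[0]

        # Example: a Gaussian LSF (sigma = 0.4 cm, sampled every 0.1 cm).
        x = np.arange(-128, 128) * 0.1
        lsf = np.exp(-x**2 / (2 * 0.4**2))
        f, mtf = mtf_from_lsf(lsf, dx=0.1)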

  16. A review of metaheuristic scheduling techniques in cloud computing

    Directory of Open Access Journals (Sweden)

    Mala Kalra

    2015-11-01

    Full Text Available Cloud computing has become a buzzword in the area of high performance distributed computing, as it provides on-demand access to a shared pool of resources over the Internet in a self-service, dynamically scalable and metered manner. Cloud computing is still in its infancy, so to reap its full benefits, much research is required across a broad array of topics. One of the important research issues which needs attention for efficient performance is scheduling. The goal of scheduling is to map tasks to appropriate resources so as to optimize one or more objectives. Scheduling in cloud computing belongs to a category of problems known as NP-hard, due to the large solution space, so it takes a long time to find an optimal solution, and no algorithm can produce the optimal solution in polynomial time. In a cloud environment, it is therefore preferable to find a suboptimal solution in a short period of time. Metaheuristic-based techniques have been proved to achieve near-optimal solutions within reasonable time for such problems. In this paper, we provide an extensive survey and comparative analysis of various scheduling algorithms for cloud and grid environments based on three popular metaheuristic techniques: Ant Colony Optimization (ACO), Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), and two novel techniques: League Championship Algorithm (LCA) and the BAT algorithm.
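
    To give a flavour of the GA branch of the survey, here is a toy scheduler that maps independent tasks to VMs to minimize makespan; the representation (one gene per task naming its VM), operators, and parameters are illustrative assumptions rather than any one surveyed algorithm:

        import random

        def ga_schedule(runtimes, vms, pop=30, gens=200, pm=0.1):
            # runtimes[t][v]: execution time of task t on VM v.
            tasks = len(runtimes)

            def makespan(assign):
                load = [0.0] * vms
                for t, v in enumerate(assign):
                    load[v] += runtimes[t][v]
                return max(load)  # finish time of the busiest VM

            P = [[random.randrange(vms) for _ in range(tasks)] for _ in range(pop)]
            for _ in range(gens):
                P.sort(key=makespan)              # elitist truncation selection
                elite = P[: pop // 2]
                children = []
                while len(elite) + len(children) < pop:
                    a, b = random.sample(elite, 2)
                    cut = random.randrange(tasks)
                    child = a[:cut] + b[cut:]     # one-point crossover
                    child = [random.randrange(vms) if random.random() < pm else g
                             for g in child]      # per-gene mutation
                    children.append(child)
                P = elite + children
            best = min(P, key=makespan)
            return best, makespan(best)

    ACO and PSO variants differ mainly in how candidate mappings are generated; the NP-hard objective is the same.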

  17. Phylogenetic inference with weighted codon evolutionary distances.

    Science.gov (United States)

    Criscuolo, Alexis; Michel, Christian J

    2009-04-01

    We develop a new approach to estimate a matrix of pairwise evolutionary distances from a codon-based alignment based on a codon evolutionary model. The method first computes a standard distance matrix for each of the three codon positions. Then these three distance matrices are weighted according to an estimate of the global evolutionary rate of each codon position and averaged into a unique distance matrix. Using a large set of both real and simulated codon-based alignments of nucleotide sequences, we show that this approach leads to distance matrices that have a significantly better treelikeness compared to those obtained by standard nucleotide evolutionary distances. We also propose an alternative weighting to eliminate the part of the noise often associated with some codon positions, particularly the third position, which is known to evolve at a fast rate. Simulation results show that fast distance-based tree reconstruction algorithms applied to distance matrices based on this codon position weighting can lead to phylogenetic trees that are at least as accurate as, if not better than, those inferred by maximum likelihood. Finally, a well-known multigene dataset composed of eight yeast species and 106 codon-based alignments is reanalyzed and shows that our codon evolutionary distances allow building a phylogenetic tree which is similar to those obtained by non-distance-based methods (e.g., maximum parsimony and maximum likelihood) and also significantly improved compared to standard nucleotide evolutionary distance estimates.
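
    The averaging step reduces to a weighted sum of the three per-position distance matrices. A minimal sketch under an assumed interface (the estimation of the per-position rates themselves is not shown):

        import numpy as np

        def weighted_codon_distance(D1, D2, D3, rates):
            # D1..D3: pairwise distance matrices for codon positions 1..3;
            # rates: estimated global evolutionary rate of each position.
            D1, D2, D3 = (np.asarray(D, dtype=float) for D in (D1, D2, D3))
            w = np.asarray(rates, dtype=float)
            w = w / w.sum()                      # normalize the weights
            return w[0] * D1 + w[1] * D2 + w[2] * D3

    Setting the third weight to zero reproduces the alternative weighting mentioned above, which discards the noisy, fast-evolving third position.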

  18. Study of natural circulation for the design of a research reactor using computational fluid dynamics and evolutionary computation techniques

    International Nuclear Information System (INIS)

    Oliveira, Andre Felipe da Silva de

    2012-01-01

    Safety is one of the most important and desirable characteristics in a nuclear plant. Natural circulation cooling systems are noted for providing passive safety. These systems can be used as a mechanism for removing residual heat from the reactor, or even as the main cooling system for heated sections, such as the core. In this work, a computational fluid dynamics (CFD) code called CFX is used to simulate the process of natural circulation in a research reactor pool after its shutdown. The physical model studied is similar to the Open Pool Australian Light water reactor (OPAL), and contains the core, cooling pool, reflecting tank, circulation pipes and chimney. For best computing performance, the core region was modeled as a porous medium, whose parameters were obtained from a separate, detailed CFD analysis. This work also aims to study the viability of using the Differential Evolution algorithm to optimize the physical and operational parameters that, obeying the laws of similarity, lead to a reduced-scale test section of the reactor pool.

  19. Application of computational intelligence techniques for load shedding in power systems: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Bakar, A.H.A.; Mohamad, Hasmaini

    2013-01-01

    Highlights: • The power system blackout history of the last two decades is presented. • Conventional load shedding techniques, their types and limitations are presented. • Applications of intelligent techniques in load shedding are presented. • Intelligent techniques include ANN, fuzzy logic, ANFIS, genetic algorithm and PSO. • A discussion and comparison of these techniques is provided. - Abstract: Recent blackouts around the world question the reliability of conventional and adaptive load shedding techniques in avoiding such power outages. To address this issue, reliable techniques are required to provide fast and accurate load shedding to prevent collapse in the power system. Computational intelligence techniques, due to their robustness and flexibility in dealing with complex non-linear systems, could be an option in addressing this problem. Computational intelligence includes techniques like artificial neural networks, genetic algorithms, fuzzy logic control, adaptive neuro-fuzzy inference systems, and particle swarm optimization. Research in these techniques is being undertaken in order to discover means for more efficient and reliable load shedding. This paper provides an overview of these techniques as applied to load shedding in a power system. This paper also compares the advantages of computational intelligence techniques over conventional load shedding techniques. Finally, this paper discusses the limitations of computational intelligence techniques, which restrict their usage in load shedding in real time.

  20. Ifuzzer : An evolutionary interpreter fuzzer using genetic programming

    NARCIS (Netherlands)

    Veggalam, Spandan; Rawat, Sanjay; Haller, Istvan; Bos, Herbert

    We present an automated evolutionary fuzzing technique to find bugs in JavaScript interpreters. Fuzzing is an automated black box testing technique used for finding security vulnerabilities in the software by providing random data as input. However, in the case of an interpreter, fuzzing is

  1. Langley's CSI evolutionary model: Phase O

    Science.gov (United States)

    Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) is documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real time computers, as well as finite element and state space models of major components.

  2. Evolutionary design assistants for architecture

    Directory of Open Access Journals (Sweden)

    N. Onur Sönmez

    2015-04-01

    Full Text Available In its parallel pursuit of increased competitiveness for design offices and more pleasurable and easier workflows for designers, artificial design intelligence is a technical, intellectual, and political challenge. While human-machine cooperation has become commonplace through Computer Aided Design (CAD) tools, improved collaboration and better support appear possible only through an endeavor into a kind of artificial design intelligence which is more sensitive to the human perception of affairs. Considered as part of the broader Computational Design studies, the research program of this quest can be called Artificial / Autonomous / Automated Design (AD). The currently available level of Artificial Intelligence (AI) for design is limited, and a viable aim for current AD would be to develop design assistants that are capable of producing drafts for various design tasks. Thus, the overall aim of this thesis is the development of approaches, techniques, and tools towards artificial design assistants that offer a capability for generating drafts for sub-tasks within design processes. The main technology explored for this aim is Evolutionary Computation (EC), and the target design domain is architecture. The two connected research questions of the study concern, first, the investigation of the ways to develop an architectural design assistant, and secondly, the utilization of EC for the development of such assistants. While developing approaches, techniques, and computational tools for such an assistant, the study also carries out a broad theoretical investigation into the main problems, challenges, and requirements towards such assistants on a rather overall level. Therefore, the research is shaped as a parallel investigation of three main threads interwoven along several levels, moving from a more general level to specific applications. The three research threads comprise, first, theoretical discussions and speculations with regard to both

  3. An Efficient Evolutionary Based Method For Image Segmentation

    OpenAIRE

    Aslanzadeh, Roohollah; Qazanfari, Kazem; Rahmati, Mohammad

    2017-01-01

    The goal of this paper is to present a new efficient image segmentation method based on evolutionary computation, a model inspired by human behavior. Based on this model, a four-layer process for image segmentation is proposed using the split/merge approach. In the first layer, an image is split into numerous regions using the watershed algorithm. In the second layer, a co-evolutionary process is applied to form centers of final segments by merging similar primary regions. In the t...

  4. Evolutionary engineering for industrial microbiology.

    Science.gov (United States)

    Vanee, Niti; Fisher, Adam B; Fong, Stephen S

    2012-01-01

    Superficially, evolutionary engineering is a paradoxical field that balances competing interests. In natural settings, evolution iteratively selects and enriches subpopulations that are best adapted to a particular ecological niche using random processes such as genetic mutation. In engineering, by contrast, approaches utilize rational prospective design to address targeted problems. When considering the details of evolutionary and engineering processes, more commonality can be found. Engineering relies on detailed knowledge of the problem parameters and design properties in order to predict design outcomes that would be an optimized solution. When detailed knowledge of a system is lacking, engineers often employ algorithmic search strategies to identify empirical solutions. Evolution epitomizes this iterative optimization by continuously diversifying design options from a parental design, and then selecting the progeny designs that represent satisfactory solutions. In this chapter, the technique of applying the natural principles of evolution to engineer microbes for industrial applications is discussed to highlight the challenges and principles of evolutionary engineering.

  5. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  6. Swarm, genetic and evolutionary programming algorithms applied to multiuser detection

    Directory of Open Access Journals (Sweden)

    Paul Jean Etienne Jeszensky

    2005-02-01

    Full Text Available In this paper, the particle swarm optimization technique, recently published in the literature, is analyzed, evaluated and compared as applied to Direct Sequence/Code Division Multiple Access (DS/CDMA) systems with multiuser detection (MuD). The efficiency of the Swarm algorithm applied to DS-CDMA multiuser detection (Swarm-MuD) is compared through the tradeoff between performance and computational complexity, the complexity being expressed in terms of the number of operations necessary to reach the performance obtained by the optimum detector, i.e., the Maximum Likelihood (ML) detector. The comparison is accomplished among the genetic algorithm, evolutionary programming with cloning and the Swarm algorithm under the same simulation basis. Additionally, a heuristic MuD complexity analysis is proposed through the number of computational operations. Finally, an analysis is carried out of the input parameters of the Swarm algorithm in an attempt to find the optimum (or near-optimum) parameters for the algorithm applied to the MuD problem.
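
    A bare-bones sketch of such a swarm detector, under stated assumptions: continuous PSO positions are mapped to ±1 bit vectors, and a user-supplied metric (for example the ML log-likelihood) scores each candidate. This is illustrative, not the paper's tuned algorithm:

        import numpy as np

        def swarm_mud(metric, k, particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, rng=None):
            # metric: scores a +/-1 bit vector of the k users' symbols (higher is better).
            rng = np.random.default_rng() if rng is None else rng
            bits = lambda x: np.where(x >= 0, 1, -1)    # position -> bit decisions
            X = rng.uniform(-1, 1, (particles, k))      # particle positions
            V = np.zeros_like(X)                        # particle velocities
            pbest = X.copy()
            pbest_f = np.array([metric(bits(x)) for x in X])
            g = pbest[np.argmax(pbest_f)].copy()        # global best position
            for _ in range(iters):
                r1, r2 = rng.random(X.shape), rng.random(X.shape)
                V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)
                X = np.clip(X + V, -1, 1)
                f = np.array([metric(bits(x)) for x in X])
                improved = f > pbest_f
                pbest[improved], pbest_f[improved] = X[improved], f[improved]
                g = pbest[np.argmax(pbest_f)].copy()
            return bits(g)

    A complexity comparison of the kind described above would then count how many metric evaluations (particles times iterations) are needed to match the ML detector's performance.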

  7. An Improved Evolutionary Programming with Voting and Elitist Dispersal Scheme

    Science.gov (United States)

    Maity, Sayan; Gunjan, Kumar; Das, Swagatam

    Although initially conceived for evolving finite state machines, Evolutionary Programming (EP), in its present form, is largely used as a powerful real-parameter optimizer. For function optimization, EP mainly relies on its mutation operators. Over the past few years several mutation operators have been proposed to improve the performance of EP on a wide variety of numerical benchmarks. However, unlike real-coded GAs, there has been no fitness-induced bias in parent selection for mutation in EP: the i-th population member is selected deterministically for mutation and creation of the i-th offspring in each generation. In this article we present an improved EP variant called Evolutionary Programming with Voting and Elitist Dispersal (EPVE). The scheme encompasses a voting process which not only gives importance to the best solutions but also considers those solutions which are converging fast. The Elitist Dispersal Scheme maintains elitism by keeping the potential solutions intact while the other solutions are perturbed accordingly, so that they can come out of local minima. By applying these two techniques we are able to explore regions which have not been explored so far and may contain optima. Comparison with the recent and best-known versions of EP over 25 benchmark functions from the CEC (Congress on Evolutionary Computation) 2005 test suite for real-parameter optimization reflects the superiority of the new scheme in terms of final accuracy, speed, and robustness.

  8. Cloud computing and digital media fundamentals, techniques, and applications

    CERN Document Server

    Li, Kuan-Ching; Shih, Timothy K

    2014-01-01

    Cloud Computing and Digital Media: Fundamentals, Techniques, and Applications presents the fundamentals of cloud and media infrastructure, novel technologies that integrate digital media with cloud computing, and real-world applications that exemplify the potential of cloud computing for next-generation digital media. It brings together technologies for media/data communication, elastic media/data storage, security, authentication, cross-network media/data fusion, interdevice media interaction/reaction, data centers, PaaS, SaaS, and more.The book covers resource optimization for multimedia clo

  9. Fusion of neural computing and PLS techniques for load estimation

    Energy Technology Data Exchange (ETDEWEB)

    Lu, M.; Xue, H.; Cheng, X. [Northwestern Polytechnical Univ., Xi' an (China); Zhang, W. [Xi' an Inst. of Post and Telecommunication, Xi' an (China)

    2007-07-01

    A method to predict the electric load of a power system in real time was presented. The method is based on neurocomputing and partial least squares (PLS). Short-term load forecasts for power systems are generally determined by conventional statistical methods and Computational Intelligence (CI) techniques such as neural computing. However, statistical modeling methods often require the input of questionable distributional assumptions, and neural computing is weak, particularly in determining topology. In order to overcome the problems associated with conventional techniques, the authors developed a CI hybrid model based on neural computation and PLS techniques. The theoretical foundation for the designed CI hybrid model was presented along with its application in a power system. The hybrid model is suitable for nonlinear modeling and latent structure extraction. It can automatically determine the optimal topology to maximize generalization. The CI hybrid model provides faster convergence and better prediction results compared to the abductive networks model because it incorporates a load conversion technique as well as new transfer functions. In order to demonstrate the effectiveness of the hybrid model, load forecasting was performed on a data set obtained from the Puget Sound Power and Light Company. Compared with the abductive networks model, the CI hybrid model reduced the forecast error by 32.37 per cent on workdays, and by an average of 27.18 per cent on the weekend. It was concluded that the CI hybrid model has a more powerful predictive ability. 7 refs., 1 tab., 3 figs.

  10. A heuristic ranking approach on capacity benefit margin determination using Pareto-based evolutionary programming technique.

    Science.gov (United States)

    Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas

    2015-01-01

    This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79, and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
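
    The Pareto optimal front from which the heuristic ranking picks a CBM solution is simply the set of mutually nondominated solutions. A minimal dominance filter (objectives assumed to be minimized; illustrative only):

        def pareto_front(points):
            # Keep each point not dominated by any other: q dominates p when q is
            # no worse in every objective and strictly better in at least one.
            front = []
            for p in points:
                dominated = any(all(q[i] <= p[i] for i in range(len(p))) and
                                any(q[i] < p[i] for i in range(len(p)))
                                for q in points if q is not p)
                if not dominated:
                    front.append(p)
            return front

        # Example: (1, 5), (2, 2) and (5, 1) trade off; (4, 4) is dominated by (2, 2).
        print(pareto_front([(1, 5), (2, 2), (5, 1), (4, 4)]))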

  11. A Heuristic Ranking Approach on Capacity Benefit Margin Determination Using Pareto-Based Evolutionary Programming Technique

    Directory of Open Access Journals (Sweden)

    Muhammad Murtadha Othman

    2015-01-01

    Full Text Available This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79, and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.

  12. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...

  13. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. Like most other research on heuristic scheduling, the primary aim of the research has been on deterministic formulations of the problems. This is in contrast to real world scheduling problems, which are usually not deterministic. Usually at the time the schedule is made some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly

  14. An evolutionary outlook of air traffic flow management techniques

    Science.gov (United States)

    Kistan, Trevor; Gardi, Alessandro; Sabatini, Roberto; Ramasamy, Subramanian; Batuwangala, Eranga

    2017-01-01

    In recent years Air Traffic Flow Management (ATFM) has become pertinent even in regions without sustained overload conditions caused by dense traffic operations. Increasing traffic volumes in the face of constrained resources have created peak congestion at specific locations and times in many areas of the world. Increased environmental awareness and economic drivers have combined to create a resurgent interest in ATFM, as evidenced by a spate of recent ATFM conferences and workshops mediated by official bodies such as ICAO, IATA, CANSO, the FAA and Eurocontrol. Significant ATFM acquisitions in the last 5 years include South Africa, Australia and India. Singapore, Thailand and Korea are all expected to procure ATFM systems within a year, while China is expected to develop a bespoke system. Asia-Pacific nations are particularly pro-active given the traffic growth projections for the region (by 2050 half of all air traffic will be to, from or within the Asia-Pacific region). National authorities now have access to recently published international standards to guide the development of national and regional operational concepts for ATFM, geared to Communications, Navigation, Surveillance/Air Traffic Management and Avionics (CNS+A) evolutions. This paper critically reviews the field to determine which ATFM research and development efforts hold the best promise for practical technological implementations, offering clear benefits both in terms of enhanced safety and efficiency in times of growing air traffic. An evolutionary approach is adopted, starting from an ontology of current ATFM techniques and proceeding to identify the technological and regulatory evolutions required in the future CNS+A context, as the aviation industry moves forward with a clearer understanding of emerging operational needs, the geo-political realities of regional collaboration and the impending needs of global harmonisation.

  15. Evolutionary image simplification for lung nodule classification with convolutional neural networks.

    Science.gov (United States)

    Lückehe, Daniel; von Voigt, Gabriele

    2018-05-29

    Understanding decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the learned structures by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images based on the learned structures by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides the examples of simplified images, we analyze the run time development. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm employing a learned convolutional neural network is well suited for the simplification task. From a research perspective, it is interesting which areas of the images are simplified and which parts are taken as relevant.

  16. Optimality and stability of symmetric evolutionary games with applications in genetic selection.

    Science.gov (United States)

    Huang, Yuanyuan; Hao, Yiping; Wang, Min; Zhou, Wen; Wu, Zhijun

    2015-06-01

    Symmetric evolutionary games, i.e., evolutionary games with symmetric fitness matrices, have important applications in population genetics, where they can be used to model for example the selection and evolution of the genotypes of a given population. In this paper, we review the theory for obtaining optimal and stable strategies for symmetric evolutionary games, and provide some new proofs and computational methods. In particular, we review the relationship between the symmetric evolutionary game and the generalized knapsack problem, and discuss the first and second order necessary and sufficient conditions that can be derived from this relationship for testing the optimality and stability of the strategies. Some of the conditions are given in different forms from those in previous work and can be verified more efficiently. We also derive more efficient computational methods for the evaluation of the conditions than conventional approaches. We demonstrate how these conditions can be applied to justifying the strategies and their stabilities for a special class of genetic selection games including some in the study of genetic disorders.
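
    For reference, the link to the generalized knapsack problem mentioned in the abstract can be written out explicitly; the following is a standard formulation consistent with the abstract, not a reproduction of the paper's notation.

```latex
% Optimal strategies of a symmetric evolutionary game with fitness
% matrix A maximize mean fitness over the probability simplex:
\max_{x \in \mathbb{R}^n} \; x^{\top} A x
\qquad \text{s.t.} \qquad \sum_{i=1}^{n} x_i = 1, \quad x_i \ge 0 .

% First-order (KKT) optimality with multiplier \lambda = x^{*\top} A x^*:
(A x^*)_i = \lambda \quad \text{if } x_i^* > 0, \qquad
(A x^*)_i \le \lambda \quad \text{if } x_i^* = 0 .
```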

  17. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    Science.gov (United States)

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today evolutionary and population biology reasoning is essential for the interpretation of large complex datasets that are characteristic of all domains of today’s life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. It also means that we, the developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  18. Computer aided composition by means of interactive GP

    DEFF Research Database (Denmark)

    Ando, Daichi; Dahlstedt, Palle; Nordahl, Mats G.

    2006-01-01

    Research on the application of Interactive Evolutionary Computation (IEC) to the field of musical computation has improved in recent years, marking an interesting parallel to the current trend of applying human characteristics or sensitivities to computer systems. However, past techniques...... developed for IEC-based composition have not necessarily proven very effective for professional use. This is due to the large difference between the data representation used by IEC and authored classical music composition. To solve these difficulties, we propose a new IEC approach to music composition based...... on classical music theory. In this paper, we describe a system built according to the above idea, and detail the successful composition of a piece....

  19. Network-level architecture and the evolutionary potential of underground metabolism.

    Science.gov (United States)

    Notebaart, Richard A; Szappanos, Balázs; Kintses, Bálint; Pál, Ferenc; Györkei, Ádám; Bogos, Balázs; Lázár, Viktória; Spohn, Réka; Csörgő, Bálint; Wagner, Allon; Ruppin, Eytan; Pál, Csaba; Papp, Balázs

    2014-08-12

    A central unresolved issue in evolutionary biology is how metabolic innovations emerge. Low-level enzymatic side activities are frequent and can potentially be recruited for new biochemical functions. However, the role of such underground reactions in adaptation toward novel environments has remained largely unknown and out of reach of computational predictions, not least because these issues demand analyses at the level of the entire metabolic network. Here, we provide a comprehensive computational model of the underground metabolism in Escherichia coli. Most underground reactions are not isolated and 45% of them can be fully wired into the existing network and form novel pathways that produce key precursors for cell growth. This observation allowed us to conduct an integrated genome-wide in silico and experimental survey to characterize the evolutionary potential of E. coli to adapt to hundreds of nutrient conditions. We revealed that underground reactions allow growth in new environments when their activity is increased. We estimate that at least ∼20% of the underground reactions that can be connected to the existing network confer a fitness advantage under specific environments. Moreover, our results demonstrate that the genetic basis of evolutionary adaptations via underground metabolism is computationally predictable. The approach used here has potential for various application areas from bioengineering to medical genetics.

  20. Visualization of Minkowski operations by computer graphics techniques

    NARCIS (Netherlands)

    Roerdink, J.B.T.M.; Blaauwgeers, G.S.M.; Serra, J; Soille, P

    1994-01-01

    We consider the problem of visualizing 3D objects defined as a Minkowski addition or subtraction of elementary objects. It is shown that such visualizations can be obtained by using techniques from computer graphics such as ray tracing and Constructive Solid Geometry. Applications of the method are
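
    Although the record is truncated, the underlying operation is easy to state. Below is a minimal sketch of Minkowski addition for small 3D point clouds; it only computes the set itself (the paper's contribution is rendering such objects with ray tracing and CSG), and the cube/segment example is ours.

```python
import numpy as np

def minkowski_sum(a, b):
    # Minkowski addition of point sets A and B: {p + q : p in A, q in B}.
    return np.unique((a[:, None, :] + b[None, :, :]).reshape(-1, a.shape[1]),
                     axis=0)

cube = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                dtype=float)
segment = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])
print(minkowski_sum(cube, segment).shape)  # vertices of the swept cube
```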

  1. At the crossroads of evolutionary computation and music: self-programming synthesizers, swarm orchestras and the origins of melody.

    Science.gov (United States)

    Miranda, Eduardo Reck

    2004-01-01

    This paper introduces three approaches to using Evolutionary Computation (EC) in Music (namely, engineering, creative and musicological approaches) and discusses examples of representative systems that have been developed within the last decade, with emphasis on more recent and innovative works. We begin by reviewing engineering applications of EC in Music Technology such as Genetic Algorithms and Cellular Automata sound synthesis, followed by an introduction to applications where EC has been used to generate musical compositions. Next, we introduce ongoing research into EC models to study the origins of music and detail our own research work on modelling the evolution of melody. Copyright 2004 Massachusetts Institute of Technology

  2. Experimental data processing techniques by a personal computer

    International Nuclear Information System (INIS)

    Matsuura, Kiyokata; Tsuda, Kenzo; Abe, Yoshihiko; Kojima, Tsuyoshi; Nishikawa, Akira; Shimura, Hitoshi; Hyodo, Hiromi; Yamagishi, Shigeru.

    1989-01-01

    A personal computer (16-bit, about 1 MB of memory) can be used at low cost for experimental data processing. This report surveys the important techniques for A/D and D/A conversion and for the display, storage and transfer of experimental data. Items to be considered in the software are also discussed. Practical software programs written in BASIC and Assembler are given as examples. We present some techniques for faster processing in BASIC and show that a system combining BASIC and Assembler is useful in practical experiments. System performance, such as processing speed and flexibility in setting operating conditions, depends strongly on the programming language. We have tested the processing speed of some typical programming languages: BASIC (interpreter), C, FORTRAN and Assembler. For calculation, FORTRAN has the best performance, comparable to or better than Assembler even on a personal computer. (author)

  3. Prediction of scour caused by 2D horizontal jets using soft computing techniques

    Directory of Open Access Journals (Sweden)

    Masoud Karbasi

    2017-12-01

    Full Text Available This paper presents the application of five soft-computing techniques, artificial neural networks, support vector regression, gene expression programming, the group method of data handling (GMDH) neural network and the adaptive-network-based fuzzy inference system, to predict the maximum scour hole depth downstream of a sluice gate. The input parameters affecting the scour depth are the sediment size and its gradation, the apron length, the sluice gate opening, the jet Froude number and the tailwater depth. Six non-dimensional parameters were derived to define a functional relationship between the input and output variables. Published data from experimental studies were used. The results of the soft-computing techniques were compared with empirical and regression-based equations. The results obtained from the soft-computing techniques are superior to those of the empirical and regression-based equations. Comparison of the soft-computing techniques showed that the accuracy of the ANN model is higher than that of the other models (RMSE = 0.869). A new GEP-based equation was proposed.
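
    As a flavour of the modelling task, the sketch below fits one of the five model families (an ANN regressor) to synthetic stand-in data; the six dimensionless inputs and the target function are invented for illustration, and scikit-learn is assumed to be available.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Six dimensionless inputs (e.g. jet Froude number, relative sediment
# size, apron length ratio, ...); synthetic data stands in for the
# published measurements used in the paper.
X = rng.random((120, 6))
y = 0.8 * X[:, 0] + 0.3 * X[:, 1] ** 2 + rng.normal(0, 0.02, 120)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:100], y[:100])

pred = model.predict(X[100:])
rmse = np.sqrt(np.mean((pred - y[100:]) ** 2))
print(f"RMSE = {rmse:.3f}")  # the paper reports RMSE = 0.869 for its ANN
```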

  4. Artificial Life as an Aid to Astrobiology: Testing Life Seeking Techniques

    OpenAIRE

    Centler, F.; Dittrich, P.; Ku, L.; Matsumaru, N.; Pfaffmann, J.; Zauner, K.-P.

    2003-01-01

    Searching for signatures of fossil or present life in our solar system requires autonomous devices capable of investigating remote locations with limited assistance from earth. Here, we use an artificial chemistry model to create spatially complex chemical environments. An autonomous experimentation technique based on evolutionary computation is then employed to explore these environments with the aim of discovering the chemical signature of small patches of biota present in the simulation sp...

  5. Spore: Spawning Evolutionary Misconceptions?

    Science.gov (United States)

    Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.

    2010-10-01

    The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.

  6. Proceedings of the 2011 International Conference on Informatics, Cybernetics, and Computer Engineering

    CERN Document Server

    2012-01-01

    The volume includes a set of selected papers extended and revised from the International Conference on Informatics, Cybernetics, and Computer Engineering. Intelligent control is a class of control techniques that use various AI computing approaches such as neural networks, Bayesian probability, fuzzy logic, machine learning, evolutionary computation and genetic algorithms. Intelligent control can be divided into the following major sub-domains: neural network control, Bayesian control, fuzzy (logic) control, neuro-fuzzy control, expert systems, genetic control, and intelligent agents (cognitive/conscious control). New control techniques are created continuously as new models of intelligent behavior are created and computational methods are developed to support them. Networks may be classified according to a wide variety of characteristics such as the medium used to transport the data, the communications protocol used, scale, topology, organizational scope, etc. ICCE 2011 Volume 1 is to provide a forum for researchers, educators, engi...

  7. Computer-aided classification of lung nodules on computed tomography images via deep learning technique

    Directory of Open Access Journals (Sweden)

    Hua KL

    2015-08-01

    Full Text Available Kai-Lung Hua,1 Che-Hao Hsu,1 Shintami Chusnul Hidayati,1 Wen-Huang Cheng,2 Yu-Jen Chen3 1Department of Computer Science and Information Engineering, National Taiwan University of Science and Technology, 2Research Center for Information Technology Innovation, Academia Sinica, 3Department of Radiation Oncology, MacKay Memorial Hospital, Taipei, Taiwan Abstract: Lung cancer has a poor prognosis when not diagnosed early and unresectable lesions are present. The management of small lung nodules noted on computed tomography scan is controversial due to uncertain tumor characteristics. A conventional computer-aided diagnosis (CAD) scheme requires several image processing and pattern recognition steps to accomplish a quantitative tumor differentiation result. In such an ad hoc image analysis pipeline, every step depends heavily on the performance of the previous step. Accordingly, tuning of classification performance in a conventional CAD scheme is very complicated and arduous. Deep learning techniques, on the other hand, have the intrinsic advantage of automatic feature exploitation and performance tuning in a seamless fashion. In this study, we attempted to simplify the image analysis pipeline of conventional CAD with deep learning techniques. Specifically, we introduced models of a deep belief network and a convolutional neural network in the context of nodule classification in computed tomography images. Two baseline methods with feature computing steps were implemented for comparison. The experimental results suggest that deep learning methods could achieve better discriminative results and hold promise in the CAD application domain. Keywords: nodule classification, deep learning, deep belief network, convolutional neural network
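
    A minimal convolutional classifier of the kind discussed can be written in a few lines; the architecture below is illustrative (PyTorch assumed), not the authors' exact model.

```python
import torch
import torch.nn as nn

class NoduleNet(nn.Module):
    # Minimal CNN in the spirit of the paper's nodule classifier
    # (layer sizes are our assumption, not the published architecture).
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, 2)  # benign vs. malignant

    def forward(self, x):
        # x: (batch, 1, 32, 32) CT patches; two downsamplings give 8x8 maps.
        h = self.features(x)
        return self.classifier(h.flatten(1))

logits = NoduleNet()(torch.randn(4, 1, 32, 32))
print(logits.shape)  # torch.Size([4, 2])
```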

  8. Evolutionary Nephrology.

    Science.gov (United States)

    Chevalier, Robert L

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as "maladaptive." In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ~40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (that provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  9. Evolutionary Nephrology

    Directory of Open Access Journals (Sweden)

    Robert L. Chevalier

    2017-05-01

    Full Text Available Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as “maladaptive.” In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or from evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ∼40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (a life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (which provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout the life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming, and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  10. European Workshop on High Order Nonlinear Numerical Schemes for Evolutionary PDEs

    CERN Document Server

    Beaugendre, Héloïse; Congedo, Pietro; Dobrzynski, Cécile; Perrier, Vincent; Ricchiuto, Mario

    2014-01-01

    This book collects papers presented during the European Workshop on High Order Nonlinear Numerical Methods for Evolutionary PDEs (HONOM 2013) that was held at INRIA Bordeaux Sud-Ouest, Talence, France in March, 2013. The central topic is high order methods for compressible fluid dynamics. In the workshop, and in this proceedings, greater emphasis is placed on the numerical than the theoretical aspects of this scientific field. The range of topics is broad, extending through algorithm design, accuracy, large scale computing, complex geometries, discontinuous Galerkin, finite element methods, Lagrangian hydrodynamics, finite difference methods and applications and uncertainty quantification. These techniques find practical applications in such fields as fluid mechanics, magnetohydrodynamics, nonlinear solid mechanics, and others for which genuinely nonlinear methods are needed.

  11. Bone tissue engineering scaffolding: computer-aided scaffolding techniques.

    Science.gov (United States)

    Thavornyutikarn, Boonlom; Chantarapanich, Nattapon; Sitthiseripratip, Kriskrai; Thouas, George A; Chen, Qizhi

    Tissue engineering is essentially a technique for imitating nature. Natural tissues consist of three components: cells, signalling systems (e.g. growth factors) and extracellular matrix (ECM). The ECM forms a scaffold for its cells. Hence, the engineered tissue construct is an artificial scaffold populated with living cells and signalling molecules. A huge effort has been invested in bone tissue engineering, in which a highly porous scaffold plays a critical role in guiding bone and vascular tissue growth and regeneration in three dimensions. In the last two decades, numerous scaffolding techniques have been developed to fabricate highly interconnective, porous scaffolds for bone tissue engineering applications. This review provides an update on the progress of foaming technology of biomaterials, with special attention focused on computer-aided manufacturing (CAM) techniques. The article starts with a brief introduction to tissue engineering (Bone tissue engineering and scaffolds) and scaffolding materials (Biomaterials used in bone tissue engineering). After a brief review of conventional scaffolding techniques (Conventional scaffolding techniques), a number of CAM techniques are reviewed in detail. For each technique, the structure and mechanical integrity of the fabricated scaffolds are discussed. Finally, the advantages and disadvantages of these techniques are compared (Comparison of scaffolding techniques) and summarised (Summary).

  12. Measuring techniques in emission computed tomography

    International Nuclear Information System (INIS)

    Jordan, K.; Knoop, B.

    1988-01-01

    The chapter reviews the historical development of emission computed tomography and its basic principles, proceeds to SPECT and PET and special techniques of emission tomography, and concludes with a comprehensive discussion of the mathematical fundamentals of reconstruction and quantitative activity determination in vivo, dealing with the Radon transform and the projection slice theorem, methods of image reconstruction such as analytical and algebraic methods, and limiting conditions in real systems such as the limited number of measured data, noise enhancement, absorption, stray radiation, and random coincidences. (orig./HP) With 111 figs., 6 tabs [de

  13. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    Science.gov (United States)

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been studied extensively during the last fifty years. Understanding the dynamics of the global shape of a protein and its influence on biological function can help us discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed to foresee the three-dimensional arrangement of the atoms of proteins from their sequences. However, the computational complexity of this problem makes the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame mandatory. In this review we present past and current trends in protein folding simulation from both perspectives: hardware and software. Of particular interest to us are both the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used for running this kind of Soft Computing technique.

  14. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, due to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied to islanding detection of distributed generation. Moreover, the paper compares the accuracy of computational intelligence based techniques with that of existing techniques, providing useful information for industry and utility researchers to determine the best method for their respective systems.

  15. Real-life applications with membrane computing

    CERN Document Server

    Zhang, Gexiang; Gheorghe, Marian

    2017-01-01

    This book thoroughly investigates the underlying theoretical basis of membrane computing models, and reveals their latest applications. In addition, to date there have been no illustrative case studies or complex real-life applications that capitalize on the full potential of the sophisticated membrane systems computational apparatus; gaps that this book remedies. By studying various complex applications – including engineering optimization, power systems fault diagnosis, mobile robot controller design, and complex biological systems involving data modeling and process interactions – the book also extends the capabilities of membrane systems models with features such as formal verification techniques, evolutionary approaches, and fuzzy reasoning methods. As such, the book offers a comprehensive and up-to-date guide for all researchers, PhDs and undergraduate students in the fields of computer science, engineering and the bio-sciences who are interested in the applications of natural computing models.

  16. Evolutionary molecular medicine.

    Science.gov (United States)

    Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S

    2012-05-01

    Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.

  17. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    Science.gov (United States)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
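
    The non-dominated sorting at the heart of NSGA-style algorithms, including the speciated variant proposed here, can be sketched directly; the toy objective values below stand in for model errors on different datasets.

```python
def dominates(a, b):
    # a dominates b if it is no worse on every objective and strictly
    # better on at least one (minimisation assumed).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    # Simple O(n^2) non-dominated sorting into successive Pareto fronts.
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Two objectives per candidate parameter set, e.g. error on two datasets.
scores = [(0.2, 0.9), (0.4, 0.4), (0.9, 0.1), (0.5, 0.5), (0.3, 0.8)]
print(non_dominated_fronts(scores))  # [[0, 1, 2, 4], [3]]
```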

  18. Teachers of Advertising Media Courses Describe Techniques, Show Computer Applications.

    Science.gov (United States)

    Lancaster, Kent M.; Martin, Thomas C.

    1989-01-01

    Reports on a survey of university advertising media teachers regarding textbooks and instructional aids used, teaching techniques, computer applications, student placement, instructor background, and faculty publishing. (SR)

  19. An Agent-Based Co-Evolutionary Multi-Objective Algorithm for Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    Rafał Dreżewski

    2017-08-01

    Full Text Available Algorithms based on the process of natural evolution are widely used to solve multi-objective optimization problems. In this paper we propose an agent-based co-evolutionary algorithm for multi-objective portfolio optimization. The proposed technique is compared experimentally to a genetic algorithm, a co-evolutionary algorithm and a more classical approach, the trend-following algorithm. During the experiments, historical data from the Warsaw Stock Exchange are used in order to assess the performance of the compared algorithms. Finally, we draw some conclusions from these experiments, showing the strong and weak points of all the techniques.
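
    The two standard objectives in this setting, return and risk, can be evaluated as follows; the random return matrix is only a stand-in for the historical Warsaw Stock Exchange data used in the paper.

```python
import numpy as np

def portfolio_objectives(weights, returns):
    # Two objectives for a candidate portfolio, both to be minimised:
    # negative mean return and variance of the portfolio return.
    # `returns` is a (days, assets) array of historical daily returns.
    r = returns @ weights
    return -r.mean(), r.var()

rng = np.random.default_rng(1)
hist = rng.normal(0.0005, 0.01, size=(250, 5))  # one year, five assets
w = rng.dirichlet(np.ones(5))                   # weights sum to 1
print(portfolio_objectives(w, hist))
```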

  20. An Angiotensin II type 1 receptor activation switch patch revealed through Evolutionary Trace analysis

    DEFF Research Database (Denmark)

    Bonde, Marie Mi; Yao, Rong; Ma, Jian-Nong

    2010-01-01

    to be completely resolved. Evolutionary Trace (ET) analysis is a computational method which identifies clusters of functionally important residues by integrating information on evolutionarily important residue variations with receptor structure. Combined with known mutational data, ET predicted a patch of residues......) displayed phenotypes associated with a changed activation state, such as increased agonist affinity or basal activity, promiscuous activation, or constitutive internalization, highlighting the importance of testing different signaling pathways. We conclude that this evolutionarily important patch mediates...

  1. Unified commutation-pruning technique for efficient computation of composite DFTs

    Science.gov (United States)

    Castro-Palazuelos, David E.; Medina-Melendrez, Modesto Gpe.; Torres-Roman, Deni L.; Shkvarko, Yuriy V.

    2015-12-01

    An efficient computation of a composite length discrete Fourier transform (DFT), as well as a fast Fourier transform (FFT) of both time and space data sequences in uncertain (non-sparse or sparse) computational scenarios, requires specific processing algorithms. Traditional algorithms typically employ some pruning methods without any commutations, which prevents them from attaining the potential computational efficiency. In this paper, we propose an alternative unified approach with automatic commutations between three computational modalities aimed at efficient computations of the pruned DFTs adapted for variable composite lengths of the non-sparse input-output data. The first modality is an implementation of the direct computation of a composite length DFT, the second employs the second-order recursive filtering method, and the third performs the new pruned decomposed transform. The pruned decomposed transform algorithm performs decimation in time or space (DIT) in the data acquisition domain and, then, decimation in frequency (DIF). The unified combination of these three algorithms is referred to as the DFTCOMM technique. By treating the combinational-type hypothesis-testing optimization problem of preferable allocations between all feasible commuting-pruning modalities, we have found the global optimal solution to the pruning problem, one that always requires fewer or, at most, the same number of arithmetic operations than any other feasible modality. The DFTCOMM method thereby outperforms all competing pruning techniques reported in the literature in terms of attainable savings in the number of required arithmetic operations. Finally, we provide a comparison of DFTCOMM with the recently developed sparse fast Fourier transform (SFFT) algorithmic family. We feature that, in the sensing scenarios with
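
    For orientation, output pruning in its simplest form (the first of the three modalities: direct computation of only the requested bins) looks as follows. This sketch is not the DFTCOMM algorithm itself, whose commutation logic is considerably more involved.

```python
import numpy as np

def pruned_dft(x, wanted_bins):
    # Direct evaluation of only the requested DFT bins (output pruning);
    # cost scales with len(wanted_bins) * len(x) instead of a full FFT.
    n = len(x)
    k = np.asarray(wanted_bins)[:, None]
    t = np.arange(n)[None, :]
    return (x * np.exp(-2j * np.pi * k * t / n)).sum(axis=1)

x = np.random.default_rng(2).standard_normal(240)  # composite length 240
bins = [0, 3, 5, 16]
print(np.allclose(pruned_dft(x, bins), np.fft.fft(x)[bins]))  # True
```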

  2. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line-lists for these molecules. The line lists available today contain for many species up to several billions of lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time into the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10^5 lines per second per core on a standard current day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
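
    The exact reference profile that such a sampling scheme approximates can be computed with the Faddeeva function; a standard sketch (SciPy assumed), independent of the paper's sampling method:

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    # Voigt profile as the real part of the Faddeeva function wofz:
    # convolution of a Gaussian (sigma) with a Lorentzian (gamma).
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))

x = np.linspace(-5, 5, 11)
print(voigt(x, sigma=1.0, gamma=0.5))  # profile integrates to ~1
```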

  3. A survey of energy saving techniques for mobile computers

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Havinga, Paul J.M.

    1997-01-01

    Portable products such as pagers, cordless and digital cellular telephones, personal audio equipment, and laptop computers are increasingly being used. Because these applications are battery powered, reducing power consumption is vital. In this report we first give a survey of techniques for

  4. Computer Aided Measurement Laser (CAML): technique to quantify post-mastectomy lymphoedema

    International Nuclear Information System (INIS)

    Trombetta, Chiara; Abundo, Paolo; Felici, Antonella; Ljoka, Concetta; Foti, Calogero; Cori, Sandro Di; Rosato, Nicola

    2012-01-01

    Lymphoedema can be a side effect of cancer treatment. Even though several methods for assessing lymphoedema are used in clinical practice, objective quantification of lymphoedema has been problematic. The aim of the study was to determine the objectivity, reliability and repeatability of the computer aided measurement laser (CAML) technique. The CAML technique is based on computer aided design (CAD) methods and requires an infrared laser scanner. Measurements are scanned and the information describing the size and shape of the limb allows the model to be designed using the CAD software. Objectivity and repeatability were first established using a phantom. Subsequently, a group of subjects presenting post-breast cancer lymphoedema was evaluated, using the contralateral limb as a control. Results confirmed that in clinical settings the CAML technique is easy to perform, rapid and provides meaningful data for assessing lymphoedema. Future research will include a comparison of the upper limb CAML technique between healthy subjects and patients with known lymphoedema.

  5. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  6. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    Kato, Toshisada; Tanaka, Kazuo; Akitomo, Norio; Obata, Tokayasu.

    1991-01-01

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  7. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
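
    For the stationary white-noise case, the variance computation reduces to an algebraic Lyapunov equation that standard solvers handle directly; the sketch below (SciPy assumed, toy matrices ours) shows the equation being solved, not the paper's precise integration method.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# For dx/dt = A x + w(t) with white noise of intensity Q, the stationary
# covariance X satisfies the algebraic Lyapunov equation A X + X A^T + Q = 0.
A = np.array([[0.0, 1.0],
              [-4.0, -0.4]])      # lightly damped oscillator
Q = np.diag([0.0, 1.0])           # noise enters the velocity state

X = solve_continuous_lyapunov(A, -Q)  # SciPy solves A X + X A^T = -Q
print(X)  # response variances on the diagonal
```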

  8. High-efficiency photorealistic computer-generated holograms based on the backward ray-tracing technique

    Science.gov (United States)

    Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin

    2018-03-01

    Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including depth perception. However, it often takes a long computation time to produce traditional computer-generated holograms (CGHs), even without more complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic high-quality images and noticeably reduces the computation time owing to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method based on the ray-tracing technique is presented. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.

  9. Computed tomography of the llama head: technique and normal anatomy

    International Nuclear Information System (INIS)

    Hathcock, J.T.; Pugh, D.G.; Cartee, R.E.; Hammond, L.

    1996-01-01

    Computed tomography was performed on the head of 6 normal adult llamas. The animals were under general anesthesia and positioned in dorsal recumbency on the scanning table. The area scanned was from the external occipital protuberance to the rostral portion of the nasal passage, and the images are presented in both a bone window and a soft tissue window to allow evaluation and identification of the anatomy of the head. Computed tomography of the llama head can be accomplished by most computed tomography scanners utilizing a technique similar to that used in small animals with minor modification of the scanning table

  10. VAR control in distribution systems by using artificial intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Golkar, M.A. [Curtin Univ. of Technology, Sarawak (Malaysia). School of Engineering and Science

    2005-07-01

    This paper reviewed artificial intelligence techniques used in VAR control systems. Reactive power controls in distribution systems were also reviewed. While artificial intelligence methods are widely used in power control systems, the techniques require extensive human knowledge bases and experience in order to operate correctly. Expert systems use knowledge and interface procedures to solve problems that often require human expertise. Expert systems often cause knowledge bottlenecks as they are unable to learn or adapt to new situations. While neural networks possess learning ability, they are computationally expensive. However, test results in recent neural network studies have demonstrated that they work well in a variety of loading conditions. Fuzzy logic techniques are used to accurately represent the operational constraints of power systems. Fuzzy logic has an advantage over other artificial intelligence techniques in that it is able to remedy uncertainties in data. Evolutionary computing algorithms use probabilistic transition rules which can search complicated data to determine optimal constraints and parameters. Over 95 per cent of all papers published on power systems use genetic algorithms. It was concluded that hybrid systems using various artificial intelligence techniques are now being used by researchers. 69 refs.

  11. Applications of NLP Techniques to Computer-Assisted Authoring of Test Items for Elementary Chinese

    Science.gov (United States)

    Liu, Chao-Lin; Lin, Jen-Hsiang; Wang, Yu-Chun

    2010-01-01

    The authors report an implemented environment for computer-assisted authoring of test items and provide a brief discussion about the applications of NLP techniques for computer assisted language learning. Test items can serve as a tool for language learners to examine their competence in the target language. The authors apply techniques for…

  12. Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games

    Science.gov (United States)

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
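
    In the bipartite formulation, groups are hyperedges over the players who attend them, and a public-goods (N-person prisoner's dilemma) payoff accumulates over a player's groups; a toy sketch with invented numbers:

```python
# Players are linked to groups through a bipartite structure: each group
# is a hyperedge over the players attending it (illustrative toy data).
groups = {"g1": ["a", "b", "c"], "g2": ["b", "c", "d"]}
strategy = {"a": 1, "b": 0, "c": 1, "d": 0}  # 1 = cooperate, 0 = defect
r, c = 3.0, 1.0                              # enhancement factor and cost

def payoffs(groups, strategy):
    # Public-goods payoff per group: cooperators pay cost c into a pot
    # that is multiplied by r and shared equally among group members.
    pay = {p: 0.0 for p in strategy}
    for members in groups.values():
        pot = r * c * sum(strategy[p] for p in members) / len(members)
        for p in members:
            pay[p] += pot - c * strategy[p]
    return pay

print(payoffs(groups, strategy))  # e.g. defector "b" free-rides twice
```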

  13. Evolutionary thinking

    Science.gov (United States)

    Hunt, Tam

    2014-01-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution—both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place—has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps’ book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging “integral” or “evolutionary” cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps. PMID:26478766

  14. Mathematics in computed tomography and related techniques

    International Nuclear Information System (INIS)

    Sawicka, B.

    1992-01-01

    The mathematical basis of computed tomography (CT) was formulated in 1917 by Radon. His theorem states that the 2-D function f(x,y) can be determined at all points from a complete set of its line integrals. Modern methods of image reconstruction include three approaches: algebraic reconstruction techniques with simultaneous iterative reconstruction or simultaneous algebraic reconstruction; convolution back projection; and the Fourier transform method. There is no one best approach. Because the experimental data do not strictly satisfy theoretical models, a number of effects have to be taken into account; in particular, the problems of beam geometry, finite beam dimensions and distribution, beam scattering, and the radiation source spectrum. Tomography with truncated data is of interest, employing mathematical approximations to compensate for the unmeasured projection data. Mathematical techniques in image processing and data analysis are also extensively used. 13 refs
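
    Radon's result and the projection slice theorem mentioned here are standard and can be stated compactly:

```latex
% Radon transform of f(x,y): line integral at angle \theta and offset s.
R f(\theta, s) \;=\; \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}
  f(x,y)\,\delta(x\cos\theta + y\sin\theta - s)\,\mathrm{d}x\,\mathrm{d}y .

% Projection (Fourier) slice theorem: the 1-D Fourier transform of a
% projection is a radial slice of the 2-D Fourier transform of f.
\widehat{Rf}(\theta, \omega) \;=\;
  \hat{f}(\omega\cos\theta,\; \omega\sin\theta) .
```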

  15. [Clinical analysis of 12 cases of orthognathic surgery with digital computer-assisted technique].

    Science.gov (United States)

    Tan, Xin-ying; Hu, Min; Liu, Chang-kui; Liu, Hua-wei; Liu, San-xia; Tao, Ye

    2014-06-01

    The aim of this study was to investigate the effect of digital computer-assisted techniques in orthognathic surgery. Twelve patients with jaw deformities were treated in our department between January 2008 and December 2011. With the help of CT and three-dimensional reconstruction techniques, the 12 patients underwent surgical treatment and the results were evaluated after surgery. The digital computer-assisted technique clearly showed the state of the jaw deformity and assisted virtual surgery. After surgery, all patients were satisfied with the results. Digital orthognathic surgery can improve the predictability of the surgical procedure, facilitate communication with patients, shorten operative time, and reduce patients' pain.

  16. Evolutionary Optimization of Centrifugal Nozzles for Organic Vapours

    Science.gov (United States)

    Persico, Giacomo

    2017-03-01

    This paper discusses the shape-optimization of non-conventional centrifugal turbine nozzles for Organic Rankine Cycle applications. The optimal aerodynamic design is supported by the use of a non-intrusive, gradient-free technique specifically developed for shape optimization of turbomachinery profiles. The method is constructed as a combination of a geometrical parametrization technique based on B-Splines, a high-fidelity and experimentally validated Computational Fluid Dynamic solver, and a surrogate-based evolutionary algorithm. The non-ideal gas behaviour featuring the flow of organic fluids in the cascades of interest is introduced via a look-up-table approach, which is rigorously applied throughout the whole optimization process. Two transonic centrifugal nozzles are considered, featuring very different loading and radial extension. The use of a systematic and automatic design method to such a non-conventional configuration highlights the character of centrifugal cascades; the blades require a specific and non-trivial definition of the shape, especially in the rear part, to avoid the onset of shock waves. It is shown that the optimization acts in similar way for the two cascades, identifying an optimal curvature of the blade that both provides a relevant increase of cascade performance and a reduction of downstream gradients.

  17. A computer code to simulate X-ray imaging techniques

    International Nuclear Information System (INIS)

    Duvauchelle, Philippe; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-01-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests

  18. A computer code to simulate X-ray imaging techniques

    Energy Technology Data Exchange (ETDEWEB)

    Duvauchelle, Philippe E-mail: philippe.duvauchelle@insa-lyon.fr; Freud, Nicolas; Kaftandjian, Valerie; Babot, Daniel

    2000-09-01

    A computer code was developed to simulate the operation of radiographic, radioscopic or tomographic devices. The simulation is based on ray-tracing techniques and on the X-ray attenuation law. The use of computer-aided drawing (CAD) models enables simulations to be carried out with complex three-dimensional (3D) objects and the geometry of every component of the imaging chain, from the source to the detector, can be defined. Geometric unsharpness, for example, can be easily taken into account, even in complex configurations. Automatic translations or rotations of the object can be performed to simulate radioscopic or tomographic image acquisition. Simulations can be carried out with monochromatic or polychromatic beam spectra. This feature enables, for example, the beam hardening phenomenon to be dealt with or dual energy imaging techniques to be studied. The simulation principle is completely deterministic and consequently the computed images present no photon noise. Nevertheless, the variance of the signal associated with each pixel of the detector can be determined, which enables contrast-to-noise ratio (CNR) maps to be computed, in order to predict quantitatively the detectability of defects in the inspected object. The CNR is a relevant indicator for optimizing the experimental parameters. This paper provides several examples of simulated images that illustrate some of the rich possibilities offered by our software. Depending on the simulation type, the computation time order of magnitude can vary from 0.1 s (simple radiographic projection) up to several hours (3D tomography) on a PC, with a 400 MHz microprocessor. Our simulation tool proves to be useful in developing new specific applications, in choosing the most suitable components when designing a new testing chain, and in saving time by reducing the number of experimental tests.
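
    Both records describe the same simulation principle: rays are traced through the CAD model and attenuated according to the X-ray attenuation law. A minimal monochromatic sketch of that law follows (material values invented for illustration):

```python
import numpy as np

def transmitted_intensity(i0, mu, lengths):
    # Monochromatic X-ray attenuation (Beer-Lambert law) along one ray:
    # I = I0 * exp(-sum(mu_i * t_i)) over the materials the ray crosses.
    # mu in 1/cm, chord lengths in cm.
    return i0 * np.exp(-np.dot(mu, lengths))

mu = np.array([0.2, 1.1])   # e.g. an aluminium part with a steel insert
t = np.array([2.0, 0.5])    # chord lengths of the ray in each material
print(transmitted_intensity(1.0, mu, t))  # detector pixel value, no noise
```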

  19. Determining flexor-tendon repair techniques via soft computing

    Science.gov (United States)

    Johnson, M.; Firoozbakhsh, K.; Moniem, M.; Jamshidi, M.

    2001-01-01

    An SC-based multi-objective decision-making method for determining the optimal flexor-tendon repair technique from experimental and clinical survey data, and with variable circumstances, was presented. Results were compared with those from the Taguchi method. Using the Taguchi method results in the need to perform ad-hoc decisions when the outcomes for individual objectives are contradictory to a particular preference or circumstance, whereas the SC-based multi-objective technique provides a rigorous straightforward computational process in which changing preferences and importance of differing objectives are easily accommodated. Also, adding more objectives is straightforward and easily accomplished. The use of fuzzy-set representations of information categories provides insight into their performance throughout the range of their universe of discourse. The ability of the technique to provide a "best" medical decision given a particular physician, hospital, patient, situation, and other criteria was also demonstrated.

  20. Evolutionary Stable Strategy

    Indian Academy of Sciences (India)

    Evolutionary Stable Strategy: Application of Nash Equilibrium in Biology. General Article, Resonance – Journal of Science Education, Volume 21, Issue 9, September 2016, pp 803- ... Keywords: evolutionary game theory, evolutionary stable state, conflict, cooperation, biological games.

  1. Jet-images: computer vision inspired techniques for jet tagging

    Energy Technology Data Exchange (ETDEWEB)

    Cogan, Josh; Kagan, Michael; Strauss, Emanuel; Schwarztman, Ariel [SLAC National Accelerator Laboratory,Menlo Park, CA 94028 (United States)

    2015-02-18

    We introduce a novel approach to jet tagging and classification through the use of techniques inspired by computer vision. Drawing parallels to the problem of facial recognition in images, we define a jet-image using calorimeter towers as the elements of the image and establish jet-image preprocessing methods. For the jet-image processing step, we develop a discriminant for classifying the jet-images derived using Fisher discriminant analysis. The effectiveness of the technique is shown within the context of identifying boosted hadronic W boson decays with respect to a background of quark- and gluon-initiated jets. Using Monte Carlo simulation, we demonstrate that the performance of this technique introduces additional discriminating power over other substructure approaches, and gives significant insight into the internal structure of jets.

  2. Jet-images: computer vision inspired techniques for jet tagging

    International Nuclear Information System (INIS)

    Cogan, Josh; Kagan, Michael; Strauss, Emanuel; Schwarztman, Ariel

    2015-01-01

    We introduce a novel approach to jet tagging and classification through the use of techniques inspired by computer vision. Drawing parallels to the problem of facial recognition in images, we define a jet-image using calorimeter towers as the elements of the image and establish jet-image preprocessing methods. For the jet-image processing step, we develop a discriminant for classifying the jet-images derived using Fisher discriminant analysis. The effectiveness of the technique is shown within the context of identifying boosted hadronic W boson decays with respect to a background of quark- and gluon-initiated jets. Using Monte Carlo simulation, we demonstrate that the performance of this technique introduces additional discriminating power over other substructure approaches, and gives significant insight into the internal structure of jets.

  3. Self-organized modularization in evolutionary algorithms.

    Science.gov (United States)

    Dauscher, Peter; Uthmann, Thomas

    2005-01-01

    The principle of modularization has proven to be extremely successful in the field of technical applications and particularly for Software Engineering purposes. The question to be answered within the present article is whether mechanisms can also be identified within the framework of Evolutionary Computation that cause a modularization of solutions. We will concentrate on processes, where modularization results only from the typical evolutionary operators, i.e. selection and variation by recombination and mutation (and not, e.g., from special modularization operators). This is what we call Self-Organized Modularization. Based on a combination of two formalizations by Radcliffe and Altenberg, some quantitative measures of modularity are introduced. Particularly, we distinguish Built-in Modularity as an inherent property of a genotype and Effective Modularity, which depends on the rest of the population. These measures can easily be applied to a wide range of present Evolutionary Computation models. It will be shown, both theoretically and by simulation, that under certain conditions, Effective Modularity (as defined within this paper) can be a selection factor. This causes Self-Organized Modularization to take place. The experimental observations emphasize the importance of Effective Modularity in comparison with Built-in Modularity. Although the experimental results have been obtained using a minimalist toy model, they can lead to a number of consequences for existing models as well as for future approaches. Furthermore, the results suggest a complex self-amplification of highly modular equivalence classes in the case of respected relations. Since the well-known Holland schemata are just the equivalence classes of respected relations in most Simple Genetic Algorithms, this observation emphasizes the role of schemata as Building Blocks (in comparison with arbitrary subsets of the search space).

  4. Research on integrated simulation of fluid-structure system by computation science techniques

    International Nuclear Information System (INIS)

    Yamaguchi, Akira

    1996-01-01

    At the Power Reactor and Nuclear Fuel Development Corporation, research on the integrated simulation of fluid-structure systems by computational science techniques has been carried out. Its aim is to substitute computational science techniques for the verification of plant systems, which has depended on large-scale experiments, thereby reducing development costs and attaining the optimization of FBR systems. For this purpose, it is necessary to establish the technology for integrally and accurately analyzing complicated phenomena (simulation technology), the technology for applying it to large-scale problems (speed-increasing technology), and the technology for assuring the reliability of analysis results when simulation technology is utilized for the licensing and approval of FBRs (verification technology). The simulation of fluid-structure interaction, heat-flow simulation in spaces with complicated forms, and the related technologies are explained. As utilizations of computational science techniques, the elucidation of phenomena by numerical experiment and numerical simulation as a substitute for tests are discussed. (K.I.)

  5. Bioinformatics education dissemination with an evolutionary problem solving perspective.

    Science.gov (United States)

    Jungck, John R; Donovan, Samuel S; Weisstein, Anton E; Khiripet, Noppadon; Everse, Stephen J

    2010-11-01

    Bioinformatics is central to biology education in the 21st century. With the generation of terabytes of data per day, the application of computer-based tools to stored and distributed data is fundamentally changing research and its application to problems in medicine, agriculture, conservation and forensics. In light of this 'information revolution,' undergraduate biology curricula must be redesigned to prepare the next generation of informed citizens as well as those who will pursue careers in the life sciences. The BEDROCK initiative (Bioinformatics Education Dissemination: Reaching Out, Connecting and Knitting together) has fostered an international community of bioinformatics educators. The initiative's goals are to: (i) Identify and support faculty who can take leadership roles in bioinformatics education; (ii) Highlight and distribute innovative approaches to incorporating evolutionary bioinformatics data and techniques throughout undergraduate education; (iii) Establish mechanisms for the broad dissemination of bioinformatics resource materials and teaching models; (iv) Emphasize phylogenetic thinking and problem solving; and (v) Develop and publish new software tools to help students develop and test evolutionary hypotheses. Since 2002, BEDROCK has offered more than 50 faculty workshops around the world, published many resources and supported an environment for developing and sharing bioinformatics education approaches. The BEDROCK initiative builds on the established pedagogical philosophy and academic community of the BioQUEST Curriculum Consortium to assemble the diverse intellectual and human resources required to sustain an international reform effort in undergraduate bioinformatics education.

  6. An evolutionary programming based simulated annealing method for solving the unit commitment problem

    Energy Technology Data Exchange (ETDEWEB)

    Christober Asir Rajan, C. [Department of EEE, Pondicherry Engineering College, Pondicherry 605014 (India); Mohan, M.R. [Department of EEE, Anna University, Chennai 600 025 (India)

    2007-09-15

    This paper presents a new approach to solving the short-term unit commitment problem using an evolutionary programming based simulated annealing method. The objective is to find a generation schedule that minimizes the total operating cost subject to a variety of constraints, i.e. to find the optimal commitment of generating units in the power system for the next H hours. Evolutionary programming, a global optimization technique for solving the unit commitment problem, operates on a system designed to encode each unit's operating schedule with regard to its minimum up/down time. The unit commitment schedule is coded as a string of symbols, and an initial population of parent solutions is generated at random. Each schedule is formed by committing all the units according to their initial status (''flat start''). The parents are obtained from a pre-defined set of solutions, i.e. each solution is adjusted to meet the requirements. A random recommitment is then carried out with respect to the units' minimum down times, simulated annealing improves the status, and the best population is selected by an evolutionary strategy. The Neyveli Thermal Power Station (NTPS) Unit-II in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for power systems consisting of 10, 26 and 34 generating units. Numerical results compare the cost solutions and computation times obtained with the evolutionary programming method against conventional methods such as dynamic programming, Lagrangian relaxation, simulated annealing and tabu search in reaching a proper unit commitment. (author)
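
    A toy sketch of the hybrid idea under stated assumptions: a binary hours-by-units schedule, an EP-style block mutation standing in for minimum up/down-time-aware recommitment, and simulated-annealing acceptance; the demand profile, capacities and costs below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    H, U = 24, 4                                   # hours, generating units
    demand = 200 + 80 * np.sin(np.linspace(0, 2 * np.pi, H))  # toy MW profile
    cap = np.array([120.0, 90.0, 70.0, 50.0])      # unit capacities (MW)
    price = np.array([18.0, 22.0, 28.0, 35.0])     # $/MWh when running flat out

    def total_cost(s):
        """Fuel cost of committed units plus a heavy penalty for unmet demand."""
        supplied = s @ cap                         # committed capacity per hour
        fuel = (s @ (cap * price)).sum()
        shortfall = np.maximum(demand - supplied, 0.0).sum()
        return fuel + 1e4 * shortfall

    def mutate(s):
        """EP-style mutation: flip one unit's status over a short block of
        hours (a crude stand-in for min down-time aware recommitment)."""
        s = s.copy()
        u, a = rng.integers(U), rng.integers(H)
        b = min(H, a + rng.integers(2, 6))
        s[a:b, u] ^= 1
        return s

    # Simulated-annealing acceptance over EP mutations ("flat start": all on)
    s = np.ones((H, U), dtype=int)
    f, T = total_cost(s), 1e4
    for _ in range(5000):
        cand = mutate(s)
        fc = total_cost(cand)
        if fc < f or rng.random() < np.exp((f - fc) / T):
            s, f = cand, fc
        T *= 0.999                                 # cooling schedule
    print("best cost found:", round(f, 1))
    ```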

  7. Cash Management Policies By Evolutionary Models: A Comparison Using The MILLER-ORR Model

    Directory of Open Access Journals (Sweden)

    Marcelo Botelho da Costa Moraes

    2013-10-01

    Full Text Available This work applies genetic algorithms (GA) and particle swarm optimization (PSO) to cash balance management, comparing the performance of these computational models with the Miller-Orr model. The paper proposes the application of computational evolutionary models to minimize the total cost of maintaining the cash balance, obtaining the parameters for a cash management policy using assumptions presented in the literature and considering the maintenance cost and the opportunity cost of cash. To this end, we developed computational experiments on simulated cash flows to implement the algorithms. For control purposes, an algorithm was developed that uses the Miller-Orr model and additionally defines the lower bound parameter, which is not obtained by the original model. The results indicate that the evolutionary algorithms outperform the Miller-Orr model, with the PSO algorithm producing the best results.
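
    For reference, the Miller-Orr control limits that the evolutionary models are benchmarked against can be computed in closed form; the lower bound L is the management-chosen parameter the original model leaves free (the figures below are illustrative only):

    ```python
    def miller_orr(fixed_cost, variance, daily_rate, lower_bound):
        """Classic Miller-Orr control limits for a cash balance.

        fixed_cost : cost per securities transaction (F)
        variance   : variance of daily net cash flows (sigma^2)
        daily_rate : daily opportunity interest rate (k)
        lower_bound: minimum balance (L); not derived by the original model,
                     which is the gap the paper's evolutionary models address
        """
        spread = 3.0 * (0.75 * fixed_cost * variance / daily_rate) ** (1.0 / 3.0)
        return_point = lower_bound + spread / 3.0
        upper_bound = lower_bound + spread
        return return_point, upper_bound

    # Toy figures: $25 per trade, sigma = $4,000/day, 7.6% p.a., L = $10,000
    r, h = miller_orr(25.0, 4000.0 ** 2, 0.076 / 365, 10000.0)
    print(f"return point ${r:,.0f}, upper limit ${h:,.0f}")
    ```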

  8. The development of a computer technique for the investigation of reactor lattice parameters

    International Nuclear Information System (INIS)

    Joubert, W.R.

    1982-01-01

    An integrated computer technique was developed whereby all the computer programmes needed to calculate reactor lattice parameters from basic neutron data could be combined in one system. The theory behind the computer programmes is explained in detail. Results are given and compared with experimental values as well as with those calculated by a standard system.

  9. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is "are there any algorithms that can design evolutionary algorithms automatically?" A more complete statement of the question is "can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?" In this paper, a novel evolutionary algorithm based on the automatic design of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which shows that an algorithm designed automatically by computers can compete with the algorithms designed by human beings.

  10. Applying Evolutionary Genetics to Developmental Toxicology and Risk Assessment

    Science.gov (United States)

    Leung, Maxwell C. K.; Procter, Andrew C.; Goldstone, Jared V.; Foox, Jonathan; DeSalle, Robert; Mattingly, Carolyn J.; Siddall, Mark E.; Timme-Laragy, Alicia R.

    2018-01-01

    Evolutionary thinking continues to challenge our views on health and disease. Yet, there is a communication gap between evolutionary biologists and toxicologists in recognizing the connections among developmental pathways, high-throughput screening, and birth defects in humans. To increase our capability in identifying potential developmental toxicants in humans, we propose to apply evolutionary genetics to improve the experimental design and data interpretation with various in vitro and whole-organism models. We review five molecular systems of stress response and update 18 consensual cell-cell signaling pathways that are the hallmark for early development, organogenesis, and differentiation; and revisit the principles of teratology in light of recent advances in high-throughput screening, big data techniques, and systems toxicology. Multiscale systems modeling plays an integral role in the evolutionary approach to cross-species extrapolation. Phylogenetic analysis and comparative bioinformatics are both valuable tools in identifying and validating the molecular initiating events that account for adverse developmental outcomes in humans. The discordance of susceptibility between test species and humans (ontogeny) reflects their differences in evolutionary history (phylogeny). This synthesis not only can lead to novel applications in developmental toxicity and risk assessment, but also can pave the way for applying an evo-devo perspective to the study of developmental origins of health and disease. PMID:28267574

  11. Evolutionary game theory using agent-based methods.

    Science.gov (United States)

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions, require agent-based methods where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations. Copyright © 2016 Elsevier B.V. All rights reserved.
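
    A minimal agent-based sketch in the spirit described: each agent carries a gene (its probability of cooperating in a one-shot donation game), decisions are stochastic, and reproduction is fitness-proportional with mutation; the payoff values and population size are arbitrary choices for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, GENS, MU = 100, 300, 0.02           # population, generations, mutation
    B, C = 3.0, 1.0                        # benefit and cost of cooperating

    genes = rng.random(N)                  # gene: probability of cooperating

    for _ in range(GENS):
        acts = rng.random(N) < genes       # stochastic decisions
        partner = rng.permutation(N)       # random pairwise interactions
        payoff = B * acts[partner] - C * acts   # donation-game payoff
        fitness = payoff - payoff.min() + 1e-9  # shift positive for selection
        parents = rng.choice(N, size=N, p=fitness / fitness.sum())
        genes = genes[parents] + MU * rng.normal(size=N)  # inherit + mutate
        genes = np.clip(genes, 0.0, 1.0)

    # Without reciprocity or spatial structure, defection is expected to win
    print("mean cooperation probability:", genes.mean().round(3))
    ```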

  12. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279

  13. Evolutionary optimization of neural networks with heterogeneous computation: study and implementation

    OpenAIRE

    FE, JORGE DEOLINDO; Aliaga Varea, Ramón José; Gadea Gironés, Rafael

    2015-01-01

    In the optimization of artificial neural networks (ANNs) via evolutionary algorithms and the implementation of the necessary training for the objective function, there is often a trade-off between efficiency and flexibility. Pure software solutions on general-purpose processors tend to be slow because they do not take advantage of the inherent parallelism, whereas hardware realizations usually rely on optimizations that reduce the range of applicable network topologies, or they...

  14. Computer Tomography: A Novel Diagnostic Technique used in Horses

    African Journals Online (AJOL)

    In Veterinary Medicine, Computer Tomography (CT scan) is used more often in dogs and cats than in large animals due to their small size and ease of manipulation. This paper, however, illustrates the use of the technique in horses. CT scan was used in the diagnosis of two conditions of the head and limbs, namely alveolar ...

  15. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime, where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  16. Ancient Biomolecules and Evolutionary Inference.

    Science.gov (United States)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske

    2018-04-25

    Over the last decade, studies of ancient biomolecules (particularly ancient DNA, proteins, and lipids) have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field.

  17. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids, such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem, and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes found to be located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing the computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  18. Genetic algorithms - A new technique for solving the neutron spectrum unfolding problem

    International Nuclear Information System (INIS)

    Freeman, David W.; Edwards, D. Ray; Bolon, Albert E.

    1999-01-01

    A new technique utilizing genetic algorithms has been applied to the Bonner sphere neutron spectrum unfolding problem. Genetic algorithms are part of a relatively new field of 'evolutionary' solution techniques that mimic living systems with computer-simulated 'chromosome' solutions. Solutions mate and mutate to create better solutions. Several benchmark problems, considered representative of radiation protection environments, have been evaluated using the newly developed UMRGA code which implements the genetic algorithm unfolding technique. The results are compared with results from other well-established unfolding codes. The genetic algorithm technique works remarkably well and produces solutions with relatively high spectral qualities. UMRGA appears to be a superior technique in the absence of a priori data - it does not rely on 'lucky' guesses of input spectra. Calculated personnel doses associated with the unfolded spectra match benchmark values within a few percent
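
    A hedged sketch of the underlying idea (not the UMRGA code itself): candidate spectra are the "chromosomes", fitness is the misfit between predicted and measured sphere readings, and mating plus mutation refine the population; the response matrix below is random for illustration rather than real Bonner-sphere data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    SPHERES, BINS, POP = 6, 10, 60

    R = rng.random((SPHERES, BINS))      # toy response matrix (sphere x bin)
    true_phi = rng.random(BINS)          # "unknown" spectrum to fake readings
    M = R @ true_phi                     # simulated sphere measurements

    def fitness(phi):
        return -np.sum((R @ phi - M) ** 2)   # negative misfit: larger = better

    pop = rng.random((POP, BINS))        # random initial chromosome spectra
    for _ in range(500):
        order = np.argsort([fitness(p) for p in pop])[::-1]
        elite = pop[order][: POP // 2]                   # survivors
        n_kids = POP - len(elite)
        moms = elite[rng.integers(len(elite), size=n_kids)]
        dads = elite[rng.integers(len(elite), size=n_kids)]
        mask = rng.random((n_kids, BINS)) < 0.5
        kids = np.where(mask, moms, dads)                # mating: crossover
        kids = kids + 0.02 * rng.normal(size=kids.shape) # mutation
        pop = np.vstack([elite, np.clip(kids, 0.0, None)])  # keep non-negative

    best = max(pop, key=fitness)
    print("max residual |R.phi - M|:", float(np.abs(R @ best - M).max()))
    ```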

  19. Time warping of evolutionary distant temporal gene expression data based on noise suppression

    Directory of Open Access Journals (Sweden)

    Papatsenko Dmitri

    2009-10-01

    Full Text Available Abstract Background Comparative analysis of genome-wide temporal gene expression data has a broad potential area of application, including evolutionary biology, developmental biology, and medicine. However, at large evolutionary distances, the construction of global alignments and the consequent comparison of the time-series data are difficult. The main reason is the accumulation of variability in the expression profiles of orthologous genes in the course of evolution. Results We applied Pearson distance matrices, in combination with other noise-suppression techniques and data filtering, to improve alignments. This novel framework enhanced the capacity to capture the similarities between temporal gene expression datasets separated by large evolutionary distances. We aligned and compared the temporal gene expression data in budding (Saccharomyces cerevisiae) and fission (Schizosaccharomyces pombe) yeast, which are separated by more than ~400 Myr of evolution. We found that the global alignment (time warping) properly matched the duration of cell cycle phases in these distant organisms, as measured in prior studies. At the same time, when applied to individual ortholog pairs, this alignment procedure revealed groups of genes with distinct alignments, different from the global alignment. Conclusion Our alignment-based predictions of differences in the cell cycle phases between the two yeast species were in good agreement with the existing data, thus supporting the computational strategy adopted in this study. We propose that the existence of alternative alignments, specific to distinct groups of genes, suggests the presence of different synchronization modes between the two organisms and possible functional decoupling of particular physiological gene networks in the course of evolution.
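
    A small sketch of the core computation under stated assumptions: dynamic time warping in which the local cost between two time points is one minus the Pearson correlation of the expression vectors, so profile shape rather than absolute level drives the alignment; the sinusoidal "cell cycle" data below are synthetic:

    ```python
    import numpy as np

    def pearson_dist(x, y):
        """1 - Pearson correlation, so similarly *shaped* profiles are close."""
        return 1.0 - np.corrcoef(x, y)[0, 1]

    def dtw(a, b, dist=pearson_dist):
        """Plain dynamic time warping between two series of expression
        vectors a[t], b[t]; returns the accumulated alignment cost."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = dist(a[i - 1], b[j - 1])
                D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    # Toy profiles of three genes, sampled at different rates in two species
    t1, t2 = np.linspace(0, 2 * np.pi, 20), np.linspace(0, 2 * np.pi, 31)
    genes1 = np.stack([np.sin(t1 + p) for p in (0.0, 1.0, 2.0)], axis=1)
    genes2 = np.stack([np.sin(t2 + p) for p in (0.0, 1.0, 2.0)], axis=1)
    print("alignment cost:", round(dtw(genes1, genes2), 3))
    ```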

  20. Bayer Digester Optimization Studies using Computer Techniques

    Science.gov (United States)

    Kotte, Jan J.; Schleider, Victor H.

    Theoretically required heat transfer performance by the multistaged flash heat reclaim system of a high pressure Bayer digester unit is determined for various conditions of discharge temperature, excess flash vapor and indirect steam addition. Solution of simultaneous heat balances around the digester vessels and the heat reclaim system yields the magnitude of available heat for representation of each case on a temperature-enthalpy diagram, where graphical fit of the number of flash stages fixes the heater requirements. Both the heat balances and the trial-and-error graphical solution are adapted to solution by digital computer techniques.

  1. On the theories, techniques, and computer codes used in numerical reactor criticality and burnup calculations

    International Nuclear Information System (INIS)

    El-Osery, I.A.

    1981-01-01

    The purpose of this paper is to discuss the theories, techniques and computer codes that are frequently used in numerical reactor criticality and burnup calculations. It is part of an integrated nuclear reactor calculation scheme conducted by the Reactors Department, Inshas Nuclear Research Centre. The core of numerical reactor criticality and burnup calculations is the determination of the neutron flux distribution, which can be obtained in principle as a solution of the Boltzmann transport equation. Numerical methods used for solving transport equations are discussed, with emphasis on numerical techniques based on multigroup diffusion theory. These numerical techniques include nodal, modal, and finite difference ones. The most commonly known computer codes utilizing these techniques are reviewed. Some of the main computer codes related to numerical reactor criticality and burnup calculations that have already been developed at the Reactors Department are also presented.
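
    As a concrete instance of the finite-difference diffusion approach the review discusses, here is a hedged one-group, one-dimensional sketch that finds the fundamental mode and k-effective by power iteration; the cross-section values are toy numbers, not data from any of the codes named:

    ```python
    import numpy as np

    # One-group, 1-D finite-difference diffusion with zero-flux boundaries:
    #   -D phi'' + Sigma_a phi = (1/k) nu*Sigma_f phi
    N, h = 50, 1.0                        # mesh cells, spacing (cm)
    D, sig_a, nu_sig_f = 1.0, 0.07, 0.08  # toy one-group cross-section data

    A = np.zeros((N, N))                  # leakage + absorption operator
    for i in range(N):
        A[i, i] = 2.0 * D / h**2 + sig_a
        if i > 0:
            A[i, i - 1] = -D / h**2
        if i < N - 1:
            A[i, i + 1] = -D / h**2

    phi, k = np.ones(N), 1.0
    for _ in range(200):                  # power (source) iteration
        src = nu_sig_f * phi
        phi_new = np.linalg.solve(A, src / k)
        k *= (nu_sig_f * phi_new).sum() / src.sum()
        phi = phi_new / np.linalg.norm(phi_new)

    print("k-effective of the toy slab:", round(k, 4))
    ```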

  2. On the numerical treatment of selected oscillatory evolutionary problems

    Science.gov (United States)

    Cardone, Angelamaria; Conte, Dajana; D'Ambrosio, Raffaele; Paternoster, Beatrice

    2017-07-01

    We focus on evolutionary problems whose qualitative behaviour is known a priori and exploited in order to provide efficient and accurate numerical schemes. For classical numerical methods, which depend on constant coefficients, the required computational effort can be quite heavy, due to the very small stepsizes needed to accurately reproduce the qualitative behaviour of the solution. In these situations, it may be convenient to use special purpose formulae, i.e. non-polynomially fitted formulae on basis functions adapted to the problem (see [16, 17] and references therein). We show examples of special purpose strategies to solve two families of evolutionary problems exhibiting periodic solutions, i.e. partial differential equations and Volterra integral equations.

  3. Simplified Drift Analysis for Proving Lower Bounds in Evolutionary Computation

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2011-01-01

    Drift analysis is a powerful tool used to bound the optimization time of evolutionary algorithms (EAs). Various previous works apply a drift theorem going back to Hajek in order to show exponential lower bounds on the optimization time of EAs. However, this drift theorem is tedious to read and to apply, since it requires two bounds on the moment-generating (exponential) function of the drift. A recent work identifies a specialization of this drift theorem that is much easier to apply. Nevertheless, it is not as simple and not as general as possible. The present paper picks up Hajek’s line

  4. Precision of lumbar intervertebral measurements: does a computer-assisted technique improve reliability?

    Science.gov (United States)

    Pearson, Adam M; Spratt, Kevin F; Genuario, James; McGough, William; Kosman, Katherine; Lurie, Jon; Sengupta, Dilip K

    2011-04-01

    Comparison of intra- and interobserver reliability of digitized manual and computer-assisted intervertebral motion measurements and classification of "instability." To determine if computer-assisted measurement of lumbar intervertebral motion on flexion-extension radiographs improves reliability compared with digitized manual measurements. Many studies have questioned the reliability of manual intervertebral measurements, although few have compared the reliability of computer-assisted and manual measurements on lumbar flexion-extension radiographs. Intervertebral rotation, anterior-posterior (AP) translation, and change in anterior and posterior disc height were measured with a digitized manual technique by three physicians and by three other observers using computer-assisted quantitative motion analysis (QMA) software. Each observer measured 30 sets of digital flexion-extension radiographs (L1-S1) twice. Shrout-Fleiss intraclass correlation coefficients for intra- and interobserver reliabilities were computed. The stability of each level was also classified (instability defined as >4 mm AP translation or 10° rotation), and the intra- and interobserver reliabilities of the two methods were compared using adjusted percent agreement (APA). Intraobserver reliability intraclass correlation coefficients were substantially higher for the QMA technique than for the digitized manual technique across all measurements: rotation 0.997 versus 0.870, AP translation 0.959 versus 0.557, change in anterior disc height 0.962 versus 0.770, and change in posterior disc height 0.951 versus 0.283. The same pattern was observed for interobserver reliability (rotation 0.962 vs. 0.693, AP translation 0.862 vs. 0.151, change in anterior disc height 0.862 vs. 0.373, and change in posterior disc height 0.730 vs. 0.300). The QMA technique was also more reliable for the classification of "instability." Intraobserver APAs ranged from 87 to 97% for QMA versus 60% to 73% for digitized manual

  5. Computational reduction techniques for numerical vibro-acoustic analysis of hearing aids

    DEFF Research Database (Denmark)

    Creixell Mediante, Ester

    In this thesis, several challenges encountered in the process of modelling and optimizing hearing aids are addressed. Firstly, a strategy for modelling the contacts between plastic parts for harmonic analysis is developed. Irregularities in the contact surfaces, inherent to the manufacturing process of the parts... Secondly, the applicability of Model Order Reduction (MOR) techniques to lower the computational complexity of hearing aid vibro-acoustic models is studied. For fine frequency response calculation and optimization, which require solving the numerical model repeatedly, a computational challenge is encountered due to the large number of Degrees of Freedom (DOFs) needed to represent the complexity of the hearing aid system accurately. In this context, several MOR techniques are discussed, and an adaptive reduction method for vibro-acoustic optimization problems is developed as a main contribution. Lastly...

  6. Development of a computational technique to measure cartilage contact area.

    Science.gov (United States)

    Willing, Ryan; Lapner, Michael; Lalone, Emily A; King, Graham J W; Johnson, James A

    2014-03-21

    Computational measurement of joint contact distributions offers the benefit of non-invasive measurements of joint contact without the use of interpositional sensors or casting materials. This paper describes a technique for indirectly measuring joint contact based on overlapping of articular cartilage computer models derived from CT images and positioned using in vitro motion capture data. The accuracy of this technique when using the physiological nonuniform cartilage thickness distribution, or simplified uniform cartilage thickness distributions, is quantified through comparison with direct measurements of contact area made using a casting technique. The efficacy of using indirect contact measurement techniques for measuring the changes in contact area resulting from hemiarthroplasty at the elbow is also quantified. Using the physiological nonuniform cartilage thickness distribution reliably measured contact area (ICC=0.727), but not better than the assumed bone specific uniform cartilage thicknesses (ICC=0.673). When a contact pattern agreement score (s(agree)) was used to assess the accuracy of cartilage contact measurements made using physiological nonuniform or simplified uniform cartilage thickness distributions in terms of size, shape and location, their accuracies were not significantly different (p>0.05). The results of this study demonstrate that cartilage contact can be measured indirectly based on the overlapping of cartilage contact models. However, the results also suggest that in some situations, inter-bone distance measurement and an assumed cartilage thickness may suffice for predicting joint contact patterns. Copyright © 2014 Elsevier Ltd. All rights reserved.
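
    A simplified sketch of the overlap criterion, assuming both articular surfaces have been resampled onto a common grid: contact is declared wherever the inter-bone gap is smaller than the combined cartilage thickness, and area follows from the cell size; the gap field and both thickness maps below are invented for illustration:

    ```python
    import numpy as np

    nx, ny = 50, 50
    x = np.linspace(-5.0, 5.0, nx)           # grid over the articular region (mm)
    y = np.linspace(-5.0, 5.0, ny)
    cell = x[1] - x[0]                        # grid spacing (mm)
    X, Y = np.meshgrid(x, y)

    bone_gap = 0.5 + 0.05 * (X**2 + Y**2)     # toy inter-bone distance field (mm)
    t_uniform = 1.0                           # assumed uniform thickness (mm)
    t_physio = 1.2 * np.exp(-(X**2 + Y**2) / 20.0)  # toy nonuniform thickness

    for name, t1, t2 in [("uniform", t_uniform, t_uniform),
                         ("physiological", t_physio, t_physio)]:
        contact = bone_gap < (t1 + t2)        # cartilage models overlap here
        area = contact.sum() * cell**2
        print(f"{name:>13} thickness: contact area = {area:.1f} mm^2")
    ```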

  7. Phase behavior of multicomponent membranes: Experimental and computational techniques

    DEFF Research Database (Denmark)

    Bagatolli, Luis; Kumar, P.B. Sunil

    2009-01-01

    Recent developments in biology seem to indicate that the Fluid Mosaic model of the membrane proposed by Singer and Nicolson, with the lipid bilayer functioning only as a medium to support the protein machinery, may be too simple to be realistic. Many protein functions are now known to depend on the compositio... The current increase in interest in domain formation in multicomponent membranes also stems from experiments demonstrating liquid ordered-liquid disordered coexistence in mixtures of lipids and cholesterol and from the success of several computational models in predicting their behavior. This review includes basic foundations on membrane model systems and experimental approaches applied in the membrane research area, stressing recent advances in experimental and computational techniques.

  8. Experimental and Computational Techniques in Soft Condensed Matter Physics

    Science.gov (United States)

    Olafsen, Jeffrey

    2010-09-01

    1. Microscopy of soft materials Eric R. Weeks; 2. Computational methods to study jammed Systems Carl F. Schrek and Corey S. O'Hern; 3. Soft random solids: particulate gels, compressed emulsions and hybrid materials Anthony D. Dinsmore; 4. Langmuir monolayers Michael Dennin; 5. Computer modeling of granular rheology Leonardo E. Silbert; 6. Rheological and microrheological measurements of soft condensed matter John R. de Bruyn and Felix K. Oppong; 7. Particle-based measurement techniques for soft matter Nicholas T. Ouellette; 8. Cellular automata models of granular flow G. William Baxter; 9. Photoelastic materials Brian Utter; 10. Image acquisition and analysis in soft condensed matter Jeffrey S. Olafsen; 11. Structure and patterns in bacterial colonies Nicholas C. Darnton.

  9. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear, so the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then

  10. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many different studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear, so the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  11. Selective evolutionary generation systems: Theory and applications

    Science.gov (United States)

    Menezes, Amor A.

    This dissertation is devoted to the problem of behavior design, which is a generalization of the standard global optimization problem: instead of generating the optimizer, the generalization produces, on the space of candidate optimizers, a probability density function referred to as the behavior. The generalization depends on a parameter, the level of selectivity, such that as this parameter tends to infinity, the behavior becomes a delta function at the location of the global optimizer. The motivation for this generalization is that traditional off-line global optimization is non-resilient and non-opportunistic. That is, traditional global optimization is unresponsive to perturbations of the objective function. On-line optimization methods that are more resilient and opportunistic than their off-line counterparts typically consist of the computationally expensive sequential repetition of off-line techniques. A novel approach to inexpensive resilience and opportunism is to utilize the theory of Selective Evolutionary Generation Systems (SECS), which sequentially and probabilistically selects a candidate optimizer based on the ratio of the fitness values of two candidates and the level of selectivity. Using time-homogeneous, irreducible, ergodic Markov chains to model a sequence of local, and hence inexpensive, dynamic transitions, this dissertation proves that such transitions result in behavior that is called rational; such behavior is desirable because it can lead to both efficient search for an optimizer as well as resilient and opportunistic behavior. The dissertation also identifies system-theoretic properties of the proposed scheme, including equilibria, their stability and their optimality. Moreover, this dissertation demonstrates that the canonical genetic algorithm with fitness proportional selection and the (1+1) evolutionary strategy are particular cases of the scheme. Applications in three areas illustrate the versatility of the SECS theory: flight
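
    A hedged sketch of the "behavior" idea on a discrete toy landscape: a SECS-like chain proposes a neighboring candidate and accepts it with probability f(y)^k / (f(x)^k + f(y)^k), where k is the level of selectivity. With a symmetric proposal this chain is reversible, so its stationary distribution is proportional to f^k, and raising k concentrates the behavior on the global optimizer (the delta-function limit described above). The landscape below is invented:

    ```python
    import numpy as np

    # Discrete search space: states 0..40 on a line, two-peak fitness with
    # the global peak at 30 and a lower local peak at 10.
    x = np.arange(41)
    f = (np.exp(-((x - 30) / 4.0) ** 2)
         + 0.6 * np.exp(-((x - 10) / 4.0) ** 2) + 1e-9)

    def behavior(k):
        """Stationary distribution of the selectivity-k chain: propose a
        neighbor uniformly, accept with prob f(y)^k / (f(x)^k + f(y)^k)."""
        n, fk = len(x), f ** k
        P = np.zeros((n, n))
        for i in range(n):
            for j in (i - 1, i + 1):
                if 0 <= j < n:
                    P[i, j] = 0.5 * fk[j] / (fk[i] + fk[j])
            P[i, i] = 1.0 - P[i].sum()     # rejected moves stay put
        vals, vecs = np.linalg.eig(P.T)    # stationary = left eigenvector
        pi = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
        return pi / pi.sum()

    for k in (0, 1, 4, 16):
        pi = behavior(k)
        print(f"k={k:2d}: P(near global peak at 30) = {pi[26:35].sum():.2f}")
    ```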

  12. Embodied artificial evolution: Artificial evolutionary systems in the 21st Century.

    Science.gov (United States)

    Eiben, A E; Kernbach, S; Haasdijk, Evert

    2012-12-01

    Evolution is one of the major omnipresent powers in the universe that has been studied for about two centuries. Recent scientific and technical developments make it possible to make the transition from passively understanding to actively using evolutionary processes. Today this is possible in Evolutionary Computing, where human experimenters can design and manipulate all components of evolutionary processes in digital spaces. We argue that in the near future it will be possible to implement artificial evolutionary processes outside such imaginary spaces and make them physically embodied. In other words, we envision the "Evolution of Things", rather than just the evolution of digital objects, leading to a new field of Embodied Artificial Evolution (EAE). The main objective of this paper is to present a unifying vision in order to aid the development of this high potential research area. To this end, we introduce the notion of EAE, discuss a few examples and applications, and elaborate on the expected benefits as well as the grand challenges this developing field will have to address.

  13. International Conference on Soft Computing Techniques and Engineering Application

    CERN Document Server

    Li, Xiaolong

    2014-01-01

    The main objective of ICSCTEA 2013 is to provide a platform for researchers, engineers and academicians from all over the world to present their research results and development activities in soft computing techniques and engineering application. This conference provides opportunities for them to exchange new ideas and application experiences face to face, to establish business or research relations and to find global partners for future collaboration.

  14. Polymorphic Evolutionary Games.

    Science.gov (United States)

    Fishman, Michael A

    2016-06-07

    In this paper, I present an analytical framework for polymorphic evolutionary games suitable for explicitly modeling evolutionary processes in diploid populations with sexual reproduction. The principal aspect of the proposed approach is adding diploid genetics cum sexual recombination to a traditional evolutionary game, and switching from phenotypes to haplotypes as the new game's pure strategies. Here, the relevant pure strategy's payoffs are derived by summing the payoffs of all the phenotypes capable of producing gametes containing that particular haplotype, weighted by the pertinent probabilities. The resulting game is structurally identical to the familiar evolutionary games with non-linear pure strategy payoffs (Hofbauer and Sigmund, 1998. Cambridge University Press), and can be analyzed in terms of an established analytical framework for such games. These results can be translated into the terms of genotypic, and hence phenotypic, evolutionary stability pertinent to the original game. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Remembering the evolutionary Freud.

    Science.gov (United States)

    Young, Allan

    2006-03-01

    Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded as a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.

  16. Advance Trends in Soft Computing

    CERN Document Server

    Kreinovich, Vladik; Kacprzyk, Janusz; WCSC 2013

    2014-01-01

    This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360 degree view on soft computing. Topics range from fuzzy sets, to fuzzy logic, fuzzy mathematics, neuro-fuzzy systems, fuzzy control, decision making in fuzzy environments, image processing and many more. The book is dedicated to Lotfi A. Zadeh, a renowned specialist in signal analysis and control systems research who proposed the idea of fuzzy sets, in which an element may have a partial membership, in the early 1960s, followed by the idea of fuzzy logic, in which a statement can be true only to a certain degree, with degrees described by numbers in the interval [0,1]. The performance of fuzzy systems can often be improved with the help of optimization techniques, e.g. evolutionary co...

  17. Contributions of computational chemistry and biophysical techniques to fragment-based drug discovery.

    Science.gov (United States)

    Gozalbes, Rafael; Carbajo, Rodrigo J; Pineda-Lucena, Antonio

    2010-01-01

    In the last decade, fragment-based drug discovery (FBDD) has evolved from a novel approach in the search of new hits to a valuable alternative to the high-throughput screening (HTS) campaigns of many pharmaceutical companies. The increasing relevance of FBDD in the drug discovery universe has been concomitant with an implementation of the biophysical techniques used for the detection of weak inhibitors, e.g. NMR, X-ray crystallography or surface plasmon resonance (SPR). At the same time, computational approaches have also been progressively incorporated into the FBDD process and nowadays several computational tools are available. These stretch from the filtering of huge chemical databases in order to build fragment-focused libraries comprising compounds with adequate physicochemical properties, to more evolved models based on different in silico methods such as docking, pharmacophore modelling, QSAR and virtual screening. In this paper we will review the parallel evolution and complementarities of biophysical techniques and computational methods, providing some representative examples of drug discovery success stories by using FBDD.

  18. Recent advances in evolutionary multi-objective optimization

    CERN Document Server

    Datta, Rituparna; Gupta, Abhishek

    2017-01-01

    This book covers the most recent advances in the field of evolutionary multiobjective optimization. With the aim of drawing the attention of up-and-coming scientists towards exciting prospects at the forefront of computational intelligence, the authors have made an effort to ensure that the ideas conveyed herein are accessible to the widest audience. The book begins with a summary of the basic concepts in multi-objective optimization. This is followed by brief discussions of various algorithms that have been proposed over the years for solving such problems, ranging from classical (mathematical) approaches to sophisticated evolutionary ones that are capable of seamlessly tackling practical challenges such as non-convexity, multi-modality, the presence of multiple constraints, etc. Thereafter, some of the key emerging aspects that are likely to shape future research directions in the field are presented. These include: optimization in dynamic environments, multi-objective bilevel programming, handling high ...

  19. Understanding the mind from an evolutionary perspective: an overview of evolutionary psychology.

    Science.gov (United States)

    Shackelford, Todd K; Liddle, James R

    2014-05-01

    The theory of evolution by natural selection provides the only scientific explanation for the existence of complex adaptations. The design features of the brain, like any organ, are the result of selection pressures operating over deep time. Evolutionary psychology posits that the human brain comprises a multitude of evolved psychological mechanisms, adaptations to specific and recurrent problems of survival and reproduction faced over human evolutionary history. Although some mistakenly view evolutionary psychology as promoting genetic determinism, evolutionary psychologists appreciate and emphasize the interactions between genes and environments. This approach to psychology has led to a richer understanding of a variety of psychological phenomena, and has provided a powerful foundation for generating novel hypotheses. Critics argue that evolutionary psychologists resort to storytelling, but as with any branch of science, empirical testing is a vital component of the field, with hypotheses standing or falling with the weight of the evidence. Evolutionary psychology is uniquely suited to provide a unifying theoretical framework for the disparate subdisciplines of psychology. An evolutionary perspective has provided insights into several subdisciplines of psychology, while simultaneously demonstrating the arbitrary nature of dividing psychological science into such subdisciplines. Evolutionary psychologists have amassed a substantial empirical and theoretical literature, but as a relatively new approach to psychology, many questions remain, with several promising directions for future research. © 2014 John Wiley & Sons, Ltd.

  20. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended commerce or ubiquitous commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use the NFC capabilities of their smartphones to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans that minimize the overall shopping cost by looking for the best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions for the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer depends on his previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimental results showing the overall performance of the genetic algorithm.
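
    A toy sketch of the genetic algorithm's job under stated assumptions: a chromosome assigns each product on the shopping list to a shop, and fitness is the basket cost after flat e-coupon discounts that trigger above a per-shop spending threshold; the catalogue, prices and coupons below are all hypothetical:

    ```python
    import random

    random.seed(5)

    PRODUCTS = ["milk", "bread", "cheese", "coffee"]
    PRICE = {("milk", s): p for s, p in [("A", 1.2), ("B", 1.0), ("C", 1.3)]}
    PRICE.update({("bread", s): p for s, p in [("A", 0.9), ("B", 1.1), ("C", 0.8)]})
    PRICE.update({("cheese", s): p for s, p in [("A", 3.5), ("B", 3.2), ("C", 3.9)]})
    PRICE.update({("coffee", s): p for s, p in [("A", 4.0), ("B", 4.4), ("C", 3.8)]})
    COUPON = {"A": (5.0, 1.0), "B": (6.0, 1.5), "C": (4.0, 0.5)}  # threshold, discount

    def cost(plan):
        """Total basket cost for a plan (one shop letter per product)."""
        per_shop = {}
        for prod, shop in zip(PRODUCTS, plan):
            per_shop[shop] = per_shop.get(shop, 0.0) + PRICE[(prod, shop)]
        total = 0.0
        for shop, spent in per_shop.items():
            thr, disc = COUPON[shop]
            total += spent - (disc if spent >= thr else 0.0)  # apply e-coupon
        return total

    def ga(pop_size=30, gens=60, shops="ABC"):
        pop = [[random.choice(shops) for _ in PRODUCTS] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=cost)
            elite, kids = pop[: pop_size // 2], []
            while len(elite) + len(kids) < pop_size:
                a, b = random.sample(elite, 2)
                child = [random.choice(pair) for pair in zip(a, b)]  # crossover
                if random.random() < 0.3:                            # mutation
                    child[random.randrange(len(child))] = random.choice(shops)
                kids.append(child)
            pop = elite + kids
        return min(pop, key=cost)

    best = ga()
    print(dict(zip(PRODUCTS, best)), "-> cost", round(cost(best), 2))
    ```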

  1. Integration of computational modeling and experimental techniques to design fuel surrogates

    DEFF Research Database (Denmark)

    Choudhury, H.A.; Intikhab, S.; Kalakul, Sawitree

    2017-01-01

    performance. A simplified alternative is to develop surrogate fuels that have fewer compounds and emulate certain important desired physical properties of the target fuels. Six gasoline blends were formulated through a computer-aided, model-based technique, "Mixed Integer Non-Linear Programming" (MINLP... Virtual Process-Product Design Laboratory (VPPD-Lab) are applied onto the defined compositions of the surrogate gasoline. The aim is primarily to verify the defined composition of gasoline by means of VPPD-Lab. ρ, η and RVP are calculated with greater accuracy, and constraints such as the distillation curve and flash point on the blend design are also considered. A post-design, experiment-based verification step is proposed to further improve and fine-tune the "best" selected gasoline blends following the computational work. Here, advanced experimental techniques are used to measure the RVP, ρ, η, RON

  2. System optimization for HVAC energy management using the robust evolutionary algorithm

    International Nuclear Information System (INIS)

    Fong, K.F.; Hanby, V.I.; Chow, T.T.

    2009-01-01

    For an installed centralized heating, ventilating and air conditioning (HVAC) system, appropriate energy management measures can achieve energy conservation targets through optimal control and operation. The performance optimization of conventional HVAC systems may be handled from operating experience, but that may not cover the different optimization scenarios and parameters arising in response to a variety of load and weather conditions. In this regard, it is common to apply a suitable simulation-optimization technique to model the system and then determine the required operating parameters. The plant simulation models can be built up either using available simulation programs or as a system of mathematical expressions. To handle the simulation models, iterations are involved in the numerical solution methods. Since gradient information is not easily available, owing to the complex nature of the equations, traditional gradient-based optimization methods are not applicable to this kind of system model. For heuristic optimization methods, a continual search is commonly necessary, and a system function call is required for each search step. The frequency of simulation function calls then becomes the time-determining factor, and an efficient optimization method is crucial in order to find the solution within a reasonable number of function calls and computational period. In this paper, the robust evolutionary algorithm (REA) is presented to tackle this nature of HVAC simulation models. REA is based on one of the paradigms of evolutionary algorithms, the evolution strategy, which is a stochastic population-based search technique with an emphasis on mutation. The REA, which incorporates Cauchy deterministic mutation, tournament selection and arithmetic recombination, provides a synergetic effect for the optimal search. The REA is effective in coping with complex simulation models, as well as those represented by explicit mathematical expressions of
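
    A minimal sketch of the REA ingredients named above (tournament selection, arithmetic recombination, Cauchy mutation), assuming the plant simulation is an expensive black-box function of a few set-points evaluated once per individual per generation; the quadratic stand-in model, variable names and bounds are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def plant_energy(x):
        """Stand-in for one call to the HVAC plant simulation: energy use as
        a function of three operating set-points (hypothetical model)."""
        return float(np.sum((x - np.array([7.0, 0.6, 45.0])) ** 2) + 3.0)

    LO = np.array([4.0, 0.2, 20.0])   # e.g. chilled-water temp, valve, fan speed
    HI = np.array([12.0, 1.0, 80.0])

    def rea(pop_size=20, gens=80):
        pop = rng.uniform(LO, HI, size=(pop_size, 3))
        for _ in range(gens):
            scores = np.array([plant_energy(p) for p in pop])  # one call each

            def tourney():
                i, j = rng.integers(pop_size, size=2)
                return pop[i] if scores[i] < scores[j] else pop[j]

            children = [pop[np.argmin(scores)]]           # simple elitism
            while len(children) < pop_size:
                a, b = tourney(), tourney()
                w = rng.random()
                child = w * a + (1.0 - w) * b             # arithmetic recombination
                child += 0.2 * rng.standard_cauchy(3)     # heavy-tailed mutation
                children.append(np.clip(child, LO, HI))
            pop = np.array(children)
        best = pop[np.argmin([plant_energy(p) for p in pop])]
        return best, plant_energy(best)

    x_best, e_best = rea()
    print("best set-points:", x_best.round(2), "-> energy:", round(e_best, 3))
    ```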

  3. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific applications scheduling technique using the Global League Championship Algorithm (GBLCA) is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement rate on the makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is more suitable for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.

  4. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm

    Science.gov (United States)

    Abdulhamid, Shafi’i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific application scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of application scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific applications scheduling technique using the Global League Championship Algorithm (GBLCA) is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produces a remarkable performance improvement rate on the makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is more suitable for scientific application task execution in the cloud computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques. PMID:27384239
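
    The league-championship mechanics of GBLCA are not detailed in the record, but the fitness being minimised is the familiar cloud-scheduling makespan; a minimal evaluator that any of the compared metaheuristics would optimise might look like this (task sizes and VM speeds are invented):

```python
def makespan(schedule, task_lengths, vm_speeds):
    """Makespan of a schedule mapping task index -> VM index.

    task_lengths : task sizes, e.g. in million instructions
    vm_speeds    : VM processing speeds, e.g. in MIPS
    """
    finish = [0.0] * len(vm_speeds)
    for task, vm in enumerate(schedule):
        finish[vm] += task_lengths[task] / vm_speeds[vm]
    return max(finish)  # the latest VM finishing time

# Toy instance: 5 tasks on 2 VMs; a scheduler searches over 'schedule'.
tasks = [400, 300, 500, 200, 100]
vms = [100, 50]
print(makespan([0, 1, 0, 0, 1], tasks, vms))
```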

  5. Fixed Parameter Evolutionary Algorithms and Maximum Leaf Spanning Trees: A Matter of Mutations

    DEFF Research Database (Denmark)

    Kratsch, Stefan; Lehre, Per Kristian; Neumann, Frank

    2011-01-01

    Evolutionary algorithms have been shown to be very successful for a wide range of NP-hard combinatorial optimization problems. We investigate the NP-hard problem of computing a spanning tree that has a maximal number of leaves by evolutionary algorithms in the context of fixed parameter tractabil...... two common mutation operators, we show that an operator related to spanning tree problems leads to an FPT running time in contrast to a general mutation operator that does not have this property....

  6. Evolutionary optimization technique for site layout planning

    KAUST Repository

    El Ansary, Ayman M.

    2014-02-01

    Solving the site layout planning problem is a challenging task. It requires an iterative approach to satisfy design requirements (e.g. energy efficiency, sky view, daylight, road networks, visual privacy, and clear access to favorite views). These design requirements vary from one project to another based on location and client preferences. In the Gulf region, the most important socio-cultural factor is visual privacy in indoor spaces. Hence, most residential houses in this region are surrounded by high fences to provide privacy, which has a direct impact on other requirements (e.g. daylight and orientation toward a favorite view). This paper introduces a novel technique to optimally locate and orient residential buildings so as to satisfy a set of design requirements. The developed technique is based on a genetic algorithm, which explores the search space for possible solutions. This study considers two-dimensional site planning problems; however, the approach can be extended to solve three-dimensional cases. A case study is presented to demonstrate the efficiency of this technique in solving the site layout planning of simple residential dwellings. © 2013 Elsevier B.V. All rights reserved.
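
    A possible chromosome encoding and penalty-based fitness for such a layout problem, which would plug into a genetic-algorithm loop like the one sketched earlier in this list (site dimensions, the overlap penalty and the orientation reward are all invented stand-ins for the paper's design requirements):

```python
import random

SITE_W, SITE_H, N_BUILDINGS, SIZE = 100.0, 100.0, 4, 15.0

def random_layout():
    """One candidate solution: an (x, y, orientation) gene per building."""
    return [(random.uniform(0, SITE_W - SIZE),
             random.uniform(0, SITE_H - SIZE),
             random.choice([0, 90, 180, 270])) for _ in range(N_BUILDINGS)]

def fitness(layout):
    score = 0.0
    for i, (x1, y1, o1) in enumerate(layout):
        if o1 == 180:
            score += 1.0            # reward a favoured orientation (e.g. view)
        for x2, y2, _ in layout[i + 1:]:
            if abs(x1 - x2) < SIZE and abs(y1 - y2) < SIZE:
                score -= 10.0       # heavy penalty for overlapping footprints
    return score

print(fitness(random_layout()))
```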

  7. Costs incurred by applying computer-aided design/computer-aided manufacturing techniques for the reconstruction of maxillofacial defects.

    Science.gov (United States)

    Rustemeyer, Jan; Melenberg, Alex; Sari-Rieger, Aynur

    2014-12-01

    This study aims to evaluate the additional costs incurred by using a computer-aided design/computer-aided manufacturing (CAD/CAM) technique for reconstructing maxillofacial defects by analyzing typical cases. The medical charts of 11 consecutive patients who were subjected to the CAD/CAM technique were considered, and invoices from the companies providing the CAD/CAM devices were reviewed for every case. The number of devices used was significantly correlated with cost (r = 0.880). Significant differences in cost were found between cases in which prebent reconstruction plates were used (€3346.00 ± €29.00) and cases in which they were not (€2534.22 ± €264.48), and between the costs of two, three and four devices, even when ignoring the cost of reconstruction plates. Additional fees provided by statutory health insurance covered a mean of 171.5% ± 25.6% of the cost of the CAD/CAM devices. Since the additional fees provide financial compensation, we believe that the CAD/CAM technique is suited for wide application and not restricted to complex cases. Where additional fees/funds are not available, the CAD/CAM technique might be unprofitable, so the decision whether or not to use it remains a case-by-case decision with respect to cost versus benefit. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  8. Dual-scan technique for the customization of zirconia computer-aided design/computer-aided manufacturing frameworks.

    Science.gov (United States)

    Andreiuolo, Rafael Ferrone; Sabrosa, Carlos Eduardo; Dias, Katia Regina H Cervantes

    2013-09-01

    The use of bi-layered all-ceramic crowns has grown continuously since the introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) zirconia cores. Unfortunately, despite the outstanding mechanical properties of zirconia, problems related to porcelain cracking or chipping remain. One of the reasons for this is that ceramic copings are usually milled to uniform thicknesses of 0.3-0.6 mm around the whole tooth preparation, which may not provide uniform thickness or appropriate support for the veneering porcelain. To prevent these problems, the dual-scan technique offers an alternative that allows the restorative team to customize zirconia CAD/CAM frameworks with adequate porcelain thickness and support in a simple manner.

  9. [Evolutionary medicine].

    Science.gov (United States)

    Wjst, M

    2013-12-01

    Evolutionary medicine allows new insights into long-standing medical problems. Are we really "stone-agers in the fast lane"? This insight might have enormous consequences and allows new answers that could never be provided by traditional anthropology. Only now is this made possible, using data from molecular medicine and systems biology. Thereby evolutionary medicine takes a leap from a merely theoretical discipline to practical fields: reproductive, nutritional and preventive medicine, as well as microbiology, immunology and psychiatry. Evolutionary medicine is not another "just so story" but a serious candidate for the medical curriculum, providing a universal understanding of health and disease based on our biological origin. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful in advancing recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results For fourteen core principles, at least 80% of the panelists agreed or strongly agreed that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  11. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Directory of Open Access Journals (Sweden)

    Afnizanfaizal Abdullah

    Full Text Available The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.

  12. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Science.gov (United States)

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
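
    A compact sketch of the hybrid idea described in both records above, firefly attraction steps refined by a differential-evolution (rand/1-style) trial move, fitting a toy two-parameter decay model to synthetic data (the model, data and coefficient values are invented; the paper's pathway models are far richer):

```python
import math, random

DATA = [(t, 2.0 * math.exp(-0.5 * t)) for t in range(10)]  # synthetic observations

def error(p):
    # Sum-of-squares error for the toy model y = a * exp(-b * t); lower = brighter.
    return sum((y - p[0] * math.exp(-p[1] * t)) ** 2 for t, y in DATA)

def hybrid(n=20, iters=200, beta=0.5, alpha=0.05, F=0.5):
    pop = [[random.uniform(0, 3), random.uniform(0, 2)] for _ in range(n)]
    for _ in range(iters):
        pop.sort(key=error)
        new_pop = []
        for i, xi in enumerate(pop):
            x = xi[:]
            for xj in pop[:i]:          # firefly step toward each brighter one
                d2 = sum((a - b) ** 2 for a, b in zip(x, xj))
                w = beta * math.exp(-d2)
                x = [a + w * (b - a) + alpha * random.gauss(0, 1)
                     for a, b in zip(x, xj)]
            r1, r2, r3 = random.sample(pop, 3)          # DE rand/1 trial vector
            trial = [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]
            new_pop.append(min((x, trial, xi), key=error))  # greedy acceptance
        pop = new_pop
    return min(pop, key=error)

print(hybrid())  # should approach the true parameters (2.0, 0.5)
```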

  13. Aggregate meta-models for evolutionary multiobjective and many-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Pilát, Martin; Neruda, Roman

    Roč. 116, 20 September (2013), s. 392-402 ISSN 0925-2312 R&D Projects: GA ČR GAP202/11/1368 Institutional support: RVO:67985807 Keywords : evolutionary algorithms * multiobjective optimization * many-objective optimization * surrogate models * meta-models * memetic algorithm Subject RIV: IN - Informatics, Computer Science Impact factor: 2.005, year: 2013

  14. 4th International Joint Conference on Computational Intelligence

    CERN Document Server

    Correia, António; Rosa, Agostinho; Filipe, Joaquim

    2015-01-01

    The present book includes extended and revised versions of a set of selected papers from the Fourth International Joint Conference on Computational Intelligence (IJCCI 2012), held in Barcelona, Spain, from 5 to 7 October 2012. The conference was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was organized in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The conference brought together researchers, engineers and practitioners in computational technologies, especially those related to the areas of fuzzy computation, evolutionary computation and neural computation. It is composed of three co-located conferences, each one specialized in one of the aforementioned knowledge areas. Namely: - International Conference on Evolutionary Computation Theory and Applications (ECTA) - International Conference on Fuzzy Computation Theory and Applications (FCTA) - International Conference on Neural Computation Theory a...

  15. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  16. Applying Evolutionary Anthropology

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. PMID:25684561

  17. International Conference on Artificial Intelligence and Evolutionary Algorithms in Engineering Systems

    CERN Document Server

    Dash, Subhransu; Panigrahi, Bijaya

    2015-01-01

      The book is a collection of high-quality peer-reviewed research papers presented in the Proceedings of the International Conference on Artificial Intelligence and Evolutionary Algorithms in Engineering Systems (ICAEES 2014), held at Noorul Islam Centre for Higher Education, Kumaracoil, India. These research papers provide the latest developments in the broad area of the use of artificial intelligence and evolutionary algorithms in engineering systems. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. It presents invited papers from the inventors/originators of new applications and advanced technologies.

  18. Shaded computer graphic techniques for visualizing and interpreting analytic fluid flow models

    Science.gov (United States)

    Parke, F. I.

    1981-01-01

    Mathematical models which predict the behavior of fluid flow in different experiments are simulated using digital computers. The simulations predict values of the fluid flow parameters (pressure, temperature and velocity vector) at many points in the fluid. Visualization of the spatial variation in the values of these parameters is important for comprehending and checking the generated data, for identifying the regions of interest in the flow, and for effectively communicating information about the flow to others. State-of-the-art imaging techniques developed in the field of three-dimensional shaded computer graphics are applied to the visualization of fluid flow. The use of an imaging technique known as 'SCAN' for visualizing fluid flow is studied and the results are presented.

  19. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    Science.gov (United States)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
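
    The core scalarisation is easy to illustrate: minimise one objective subject to an upper bound ɛ on the other, then sweep ɛ to trace the Pareto front. The sketch below uses SciPy's general-purpose constrained solver and a toy bi-objective problem, and its simple halving of the ɛ interval is only a crude stand-in for the paper's adaptive bisection rule:

```python
from scipy.optimize import minimize

# Toy stand-ins for two conflicting trajectory objectives (e.g. time vs. fuel).
f1 = lambda x: (x[0] - 1.0) ** 2
f2 = lambda x: (x[0] + 1.0) ** 2

def eps_constraint(eps, x0=(0.0,)):
    """Minimise f1 subject to f2(x) <= eps (epsilon-constraint scalarisation)."""
    con = {"type": "ineq", "fun": lambda x: eps - f2(x)}
    return minimize(f1, x0, constraints=[con]).x

lo, hi = 0.0, 4.0        # assumed attainable range of f2 between its optimum
points = []              # and its value at the f1-optimal solution
for _ in range(5):
    eps = 0.5 * (lo + hi)
    x = eps_constraint(eps)
    points.append((float(f1(x)), float(f2(x))))
    hi = eps             # refine toward the f2-optimal end of the front
print(points)
```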

  20. Divergent evolutionary processes associated with colonization of offshore islands.

    Science.gov (United States)

    Martínková, Natália; Barnett, Ross; Cucchi, Thomas; Struchen, Rahel; Pascal, Marine; Pascal, Michel; Fischer, Martin C; Higham, Thomas; Brace, Selina; Ho, Simon Y W; Quéré, Jean-Pierre; O'Higgins, Paul; Excoffier, Laurent; Heckel, Gerald; Hoelzel, A Rus; Dobney, Keith M; Searle, Jeremy B

    2013-10-01

    Oceanic islands have been a test ground for evolutionary theory, but here, we focus on the possibilities for evolutionary study created by offshore islands. These can be colonized through various means and by a wide range of species, including those with low dispersal capabilities. We use morphology, modern and ancient sequences of cytochrome b (cytb) and microsatellite genotypes to examine colonization history and evolutionary change associated with occupation of the Orkney archipelago by the common vole (Microtus arvalis), a species found in continental Europe but not in Britain. Among possible colonization scenarios, our results are most consistent with human introduction at least 5100 bp (confirmed by radiocarbon dating). We used approximate Bayesian computation of population history to infer the coast of Belgium as the possible source and estimated the evolutionary timescale using a Bayesian coalescent approach. We showed substantial morphological divergence of the island populations, including a size increase presumably driven by selection and reduced microsatellite variation likely reflecting founder events and genetic drift. More surprisingly, our results suggest that a recent and widespread cytb replacement event in the continental source area purged cytb variation there, whereas the ancestral diversity is largely retained in the colonized islands as a genetic 'ark'. The replacement event in the continental M. arvalis was probably triggered by anthropogenic causes (land-use change). Our studies illustrate that small offshore islands can act as field laboratories for studying various evolutionary processes over relatively short timescales, informing about the mainland source area as well as the island. © 2013 John Wiley & Sons Ltd.

  1. A review of computer-aided design/computer-aided manufacture techniques for removable denture fabrication

    Science.gov (United States)

    Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan

    2016-01-01

    The aim of this review was to investigate the use of computer-aided design/computer-aided manufacture (CAD/CAM) technologies, such as milling and rapid prototyping (RP), for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords, including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. Using this variety of keywords, 78 publications were initially found. The abstracts of these 78 articles were scanned against the main topic, and 52 publications were selected for detailed reading. The full text of these articles was obtained and searched in detail. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of the papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments of CAD/CAM and RP allow the digital planning and manufacturing of removable dentures from start to finish. According to the literature review, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are developing fast. In the near future, fabricating removable dentures will become a matter of medical informatics rather than of technical staff and manual procedures. However, the methods still have several limitations for now. PMID:27095912

  2. Computational intelligence techniques for biological data mining: An overview

    Science.gov (United States)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective in overcoming the limitations of traditional in-vitro experiments on constantly increasing sequence data. The most critical problems that have caught the attention of researchers include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, and analysis of microarray gene expression data. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy and decreasing the running-time overheads of the learning algorithms. This research would be potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.

  3. Enhancing Evolutionary Optimization in Uncertain Environments by Allocating Evaluations via Multi-armed Bandit Algorithms

    OpenAIRE

    Qiu, Xin; Miikkulainen, Risto

    2018-01-01

    Optimization problems with uncertain fitness functions are common in the real world, and present unique challenges for evolutionary optimization approaches. Existing issues include excessively expensive evaluation, lack of solution reliability, and incapability in maintaining high overall fitness during optimization. Using conversion rate optimization as an example, this paper proposes a series of new techniques for addressing these issues. The main innovation is to augment evolutionary algor...
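
    The record is truncated, but the allocation idea it describes can be sketched simply: treat each candidate solution as a bandit arm and spend noisy fitness evaluations where the upper confidence bound is highest. Everything below (the Bernoulli "conversion rate" fitness and the rates themselves) is an invented stand-in:

```python
import math, random

RATES = [0.10, 0.12, 0.15, 0.11]   # hidden true conversion rates per candidate

def ucb_allocate(budget=2000):
    n = [0] * len(RATES)           # evaluations spent per candidate
    s = [0.0] * len(RATES)         # observed successes per candidate
    for t in range(1, budget + 1):
        # UCB1 score: empirical mean plus exploration bonus; untried arms first.
        scores = [s[i] / n[i] + math.sqrt(2 * math.log(t) / n[i])
                  if n[i] else float("inf") for i in range(len(RATES))]
        i = scores.index(max(scores))
        s[i] += random.random() < RATES[i]   # one noisy fitness evaluation
        n[i] += 1
    return n

print(ucb_allocate())   # most of the budget should go to the 0.15 candidate
```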

  4. A computational technique for turbulent flow of wastewater sludge.

    Science.gov (United States)

    Bechtel, Tom B

    2005-01-01

    A computational fluid dynamics (CFD) technique applied to the turbulent flow of wastewater sludge in horizontal, smooth-wall, circular pipes is presented. The technique uses the Crank-Nicolson finite difference method in conjunction with the variable secant method, an algorithm for determining the pressure gradient of the flow. A simple algebraic turbulence model is used. A Bingham-plastic rheological model is used to describe the shear stress/shear rate relationship for the wastewater sludge. The method computes the velocity gradient and head loss, given a fixed volumetric flow, pipe size, and solids concentration. Solids concentrations ranging from 3 to 10% (by weight) and nominal pipe sizes from 0.15 m (6 in.) to 0.36 m (14 in.) are studied. Comparison of the CFD results for water to established values serves to validate the numerical method. The head loss results are presented in terms of a head loss ratio, R(hl), which is the ratio of sludge head loss to water head loss. An empirical equation relating R(hl) to pipe velocity and solids concentration, derived from the results of the CFD calculations, is presented. The results are compared with published values of R(hl) for solids concentrations of 3 and 6%. A new expression for the Fanning friction factor for wastewater sludge flow is also presented.
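
    The two building blocks named in the abstract are easy to state in code; the numerical values below are illustrative only and are not the paper's fitted coefficients:

```python
def bingham_stress(shear_rate, tau_y, mu_p):
    """Bingham-plastic constitutive law above yield: tau = tau_y + mu_p * (du/dy),
    with yield stress tau_y (Pa) and plastic viscosity mu_p (Pa.s)."""
    return tau_y + mu_p * shear_rate

def head_loss_ratio(hl_sludge, hl_water):
    """R(hl) as defined above: sludge head loss over water head loss."""
    return hl_sludge / hl_water

# Illustrative numbers for a 6% sludge: tau_y ~ 1.5 Pa, mu_p ~ 0.015 Pa.s.
print(bingham_stress(200.0, 1.5, 0.015))   # shear stress in Pa at 200 1/s
print(head_loss_ratio(2.6, 2.0))           # a dimensionless R(hl) of 1.3
```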

  5. Acceleration of FDTD mode solver by high-performance computing techniques.

    Science.gov (United States)

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on a wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against a benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that this high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than a 30-fold improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver and yet requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides, even when conventional eigenvalue mode solvers are no longer applicable due to memory limitations.

  6. Evolutionary dynamics on graphs: Efficient method for weak selection

    Science.gov (United States)

    Fu, Feng; Wang, Long; Nowak, Martin A.; Hauert, Christoph

    2009-04-01

    Investigating the evolutionary dynamics of game theoretical interactions in populations where individuals are arranged on a graph can be challenging in terms of computation time. Here, we propose an efficient method to study any type of game on arbitrary graph structures for weak selection. In this limit, evolutionary game dynamics represents a first-order correction to neutral evolution. Spatial correlations can be empirically determined under neutral evolution and provide the basis for formulating the game dynamics as a discrete Markov process by incorporating a detailed description of the microscopic dynamics based on the neutral correlations. This framework is then applied to one of the most intriguing questions in evolutionary biology: the evolution of cooperation. We demonstrate that the degree heterogeneity of a graph impedes cooperation and that the success of tit for tat depends not only on the number of rounds but also on the degree of the graph. Moreover, considering the mutation-selection equilibrium shows that the symmetry of the stationary distribution of states under weak selection is skewed in favor of defectors for larger selection strengths. In particular, degree heterogeneity—a prominent feature of scale-free networks—generally results in a more pronounced increase in the critical benefit-to-cost ratio required for evolution to favor cooperation as compared to regular graphs. This conclusion is corroborated by an analysis of the effects of population structures on the fixation probabilities of strategies in general 2×2 games for different types of graphs. Computer simulations confirm the predictive power of our method and illustrate the improved accuracy as compared to previous studies.
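
    A minimal simulation in the spirit of the dynamics analysed here, a death-birth update under weak selection for the donation game on a cycle (all parameter values are invented; the paper's method replaces such brute-force runs with calculations based on neutral correlations):

```python
import random

b, c, delta = 3.0, 1.0, 0.01     # benefit, cost, weak selection strength
N = 50                           # cycle graph: N nodes of degree k = 2
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
strategy = [random.randint(0, 1) for _ in range(N)]   # 1 = cooperate

def payoff(i):
    # Donation game summed over neighbours: receive b from each cooperating
    # neighbour, pay c per interaction when cooperating yourself.
    return sum(b * strategy[j] - c * strategy[i] for j in neighbors[i])

def death_birth_step():
    i = random.randrange(N)                       # a random individual dies
    fit = [1.0 + delta * payoff(j) for j in neighbors[i]]   # weak selection
    r = random.uniform(0.0, sum(fit))
    acc = 0.0
    for j, f in zip(neighbors[i], fit):           # a neighbour wins the vacancy
        acc += f                                  # proportionally to fitness
        if r <= acc:
            strategy[i] = strategy[j]
            break

for _ in range(10000):
    death_birth_step()
print(sum(strategy) / N)   # cooperator fraction after one run
```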

  7. Molluscan Evolutionary Genomics

    Energy Technology Data Exchange (ETDEWEB)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  8. Introducing E-Learning in a Norwegian Service Company with Participatory Design and Evolutionary Prototyping Techniques

    OpenAIRE

    Mørch , Anders I.; Engen , Bård Ketil; Hansen Åsand , Hege-René; Brynhildsen , Camilla; Tødenes , Ida

    2004-01-01

    Over a 2-year period, we have participated in the introduction of e-learning in a Norwegian service company, a gas station division of an oil company. This company has an advanced computer network infrastructure for communication and information sharing, but the primary task of the employees is serving customers. We identify some challenges to introducing e-learning in this kind of environment. A primary emphasis has been on using participatory design techniques during the planning stages and...

  9. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in systems neuroscience is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, and therefore functional relationships between ROI pairs that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  10. PERFORMANCE EVALUATION OF OR1200 PROCESSOR WITH EVOLUTIONARY PARALLEL HPRC USING GEP

    Directory of Open Access Journals (Sweden)

    R. Maheswari

    2012-04-01

    Full Text Available In this fast computing era, most embedded systems require more computing power to complete complex functions/tasks in less time. One way to achieve this is by boosting the processor performance, which allows the processor core to run faster. This paper presents a novel technique for increasing performance through parallel HPRC (High Performance Reconfigurable Computing) in the CPU/DSP (Digital Signal Processor) unit of the OR1200 (Open Reduced Instruction Set Computer (RISC) 1200) processor using Gene Expression Programming (GEP), an evolutionary programming model. OR1200 is a soft-core RISC processor of the Intellectual Property cores that can efficiently run any modern operating system. In the manufacturing process of the OR1200, a parallel HPRC is placed internally in the Integer Execution Pipeline unit of the CPU/DSP core to increase performance. The GEP parallel HPRC is activated/deactivated by triggering the signals (i) HPRC_Gene_Start and (ii) HPRC_Gene_End. A Verilog HDL (Hardware Description Language) functional code for the Gene Expression Programming parallel HPRC is developed and synthesised using XILINX ISE in the first part of the work, and the CoreMark processor benchmark is used to test the performance of the OR1200 soft core in the latter part. The results of the implementation show that the overall speed-up increased by 20.59% with the GEP-based parallel HPRC in the execution unit of the OR1200.

  11. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating these techniques are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed, developed and factory tested

  12. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    Science.gov (United States)

    Marotta, G. S.

    2017-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astrogeodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and Global Geopotential Models (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to one local vertical datum. This research presents the advances on the package called GRAVTool for computing geoid models by the RCR technique, following Helmert's condensation method, and its application in a study area. The studied area comprises the Federal District of Brazil, with 6000 km², wavy relief, heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example show that the geoid model computed by the GRAVTool package, after analysis of the density, DTM and GGM values, is more adequate to the reference values used in the study area. The accuracy of the computed model (σ = ±0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 g/cm³ ± 0.024 g/cm³, the DTM SRTM Void Filled 3 arc-second and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ±0.073 m) of 26 randomly spaced points where the geoid was computed by the geometric leveling technique supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
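
    Schematically, the RCR decomposition used here can be written as follows (the notation is generic, not copied from the paper): the observed gravity anomaly is reduced by the GGM long wavelengths and the terrain signal, the residual geoid is computed (e.g. by Stokes's integral), and the removed parts are restored together with the indirect effect of Helmert's condensation:

```latex
\begin{align}
  \Delta g_{\mathrm{res}} &= \Delta g_{\mathrm{obs}}
      - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{terrain}}\\
  N &= N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{ind}}
\end{align}
```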

  13. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    , they are correlated among people who share environments because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arise from differences in cognitive ability, because two individuals with identical...... cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained. The concept of evolutionary......The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover...

  14. Evolutionary Awareness

    Directory of Open Access Journals (Sweden)

    Gregory Gorelik

    2014-10-01

    Full Text Available In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.

  15. Applying ecological and evolutionary theory to cancer: a long and winding road.

    Science.gov (United States)

    Thomas, Frédéric; Fisher, Daniel; Fort, Philippe; Marie, Jean-Pierre; Daoust, Simon; Roche, Benjamin; Grunau, Christoph; Cosseau, Céline; Mitta, Guillaume; Baghdiguian, Stephen; Rousset, François; Lassus, Patrice; Assenat, Eric; Grégoire, Damien; Missé, Dorothée; Lorz, Alexander; Billy, Frédérique; Vainchenker, William; Delhommeau, François; Koscielny, Serge; Itzykson, Raphael; Tang, Ruoping; Fava, Fanny; Ballesta, Annabelle; Lepoutre, Thomas; Krasinska, Liliana; Dulic, Vjekoslav; Raynaud, Peggy; Blache, Philippe; Quittau-Prevostel, Corinne; Vignal, Emmanuel; Trauchessec, Hélène; Perthame, Benoit; Clairambault, Jean; Volpert, Vitali; Solary, Eric; Hibner, Urszula; Hochberg, Michael E

    2013-01-01

    Since the mid 1970s, cancer has been described as a process of Darwinian evolution, with somatic cellular selection and evolution being the fundamental processes leading to malignancy and its many manifestations (neoangiogenesis, evasion of the immune system, metastasis, and resistance to therapies). Historically, little attention has been placed on applications of evolutionary biology to understanding and controlling neoplastic progression and to prevent therapeutic failures. This is now beginning to change, and there is a growing international interest in the interface between cancer and evolutionary biology. The objective of this introduction is first to describe the basic ideas and concepts linking evolutionary biology to cancer. We then present four major fronts where the evolutionary perspective is most developed, namely laboratory and clinical models, mathematical models, databases, and techniques and assays. Finally, we discuss several of the most promising challenges and future prospects in this interdisciplinary research direction in the war against cancer.

  16. An evolutionary algorithm technique for intelligence, surveillance, and reconnaissance plan optimization

    Science.gov (United States)

    Langton, John T.; Caroli, Joseph A.; Rosenberg, Brad

    2008-04-01

    To support an Effects Based Approach to Operations (EBAO), Intelligence, Surveillance, and Reconnaissance (ISR) planners must optimize collection plans within an evolving battlespace. A need exists for a decision support tool that allows ISR planners to rapidly generate and rehearse high-performing ISR plans that balance multiple objectives and constraints to address dynamic collection requirements for assessment. To meet this need we have designed an evolutionary algorithm (EA)-based "Integrated ISR Plan Analysis and Rehearsal System" (I2PARS) to support Effects-based Assessment (EBA). I2PARS supports ISR mission planning and dynamic replanning to coordinate assets and optimize their routes, allocation and tasking. It uses an evolutionary algorithm to address the large parametric space of route-finding problems which is sometimes discontinuous in the ISR domain because of conflicting objectives such as minimizing asset utilization yet maximizing ISR coverage. EAs are uniquely suited for generating solutions in dynamic environments and also allow user feedback. They are therefore ideal for "streaming optimization" and dynamic replanning of ISR mission plans. I2PARS uses the Non-dominated Sorting Genetic Algorithm (NSGA-II) to automatically generate a diverse set of high performing collection plans given multiple objectives, constraints, and assets. Intended end users of I2PARS include ISR planners in the Combined Air Operations Centers and Joint Intelligence Centers. Here we show the feasibility of applying the NSGA-II algorithm and EAs in general to the ISR planning domain. Unique genetic representations and operators for optimization within the ISR domain are presented along with multi-objective optimization criteria for ISR planning. Promising results of the I2PARS architecture design, early software prototype, and limited domain testing of the new algorithm are discussed. We also present plans for future research and development, as well as technology
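
    The heart of NSGA-II is ranking candidate plans into Pareto fronts; below is a plain (O(N²)-per-front) version of that ranking step, with invented two-objective vectors standing in for (asset utilization, negative ISR coverage), both to be minimized:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (no worse in all, better in one)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(objs):
    """Partition solution indices into successive Pareto fronts."""
    fronts, remaining = [], list(range(len(objs)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

plans = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
print(nondominated_sort(plans))   # [[0, 1, 3], [2], [4]]
```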

  17. EVOLUTIONARY FOUNDATIONS FOR MOLECULAR MEDICINE

    Science.gov (United States)

    Nesse, Randolph M.; Ganten, Detlev; Gregory, T. Ryan; Omenn, Gilbert S.

    2015-01-01

    Evolution has long provided a foundation for population genetics, but many major advances in evolutionary biology from the 20th century are only now being applied in molecular medicine. They include the distinction between proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are further transforming evolutionary biology and creating yet more opportunities for progress at the interface of evolution with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and others to speed the development of evolutionary molecular medicine. PMID:22544168

  18. Evolutionary Algorithm for Optimal Vaccination Scheme

    International Nuclear Information System (INIS)

    Parousis-Orthodoxou, K J; Vlachos, D S

    2014-01-01

    The following work uses the dynamic capabilities of an evolutionary algorithm to obtain an optimal immunization strategy in a user-specified network. The produced algorithm uses a basic genetic algorithm with crossover and mutation techniques in order to locate certain nodes in the input network. These nodes will be immunized in an SIR epidemic spreading process, and the performance of each immunization scheme will be evaluated by the level of containment it provides for the spreading of the disease
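
    A sketch of the fitness evaluation such a genetic algorithm needs: simulate SIR spreading on the network with a candidate set of immunized nodes and score the scheme by the mean outbreak size (the network, infection probability and chromosome encoding are all invented for illustration):

```python
import random

def sir_outbreak(network, immune, p=0.2, seed_node=0):
    """Final epidemic size of one discrete SIR run with per-contact probability p."""
    if seed_node in immune:
        return 0
    infected, recovered = {seed_node}, set()
    while infected:
        new = set()
        for i in infected:
            for j in network[i]:
                if j not in immune and j not in recovered and j not in infected:
                    if random.random() < p:
                        new.add(j)
        recovered |= infected     # infectious for one step, then recovered
        infected = new
    return len(recovered)

def fitness(scheme, network, runs=20):
    # Chromosome = set of immunized nodes; lower mean outbreak size is better.
    total = sum(sir_outbreak(network, set(scheme),
                             seed_node=random.randrange(len(network)))
                for _ in range(runs))
    return -total / runs

net = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}   # toy ring network
print(fitness([0, 5], net))
```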

  19. Teaching Computer Ergonomic Techniques: Practices and Perceptions of Secondary and Postsecondary Business Educators.

    Science.gov (United States)

    Alexander, Melody W.; Arp, Larry W.

    1997-01-01

    A survey of 260 secondary and 251 postsecondary business educators found the former more likely to think computer ergonomic techniques should be taught in elementary school and to address the hazards of improper use. Both groups stated that over half of the students they observe do not use good techniques and agreed that students need continual…

  20. Instrumentation, computer software and experimental techniques used in low-frequency internal friction studies at WNRE

    International Nuclear Information System (INIS)

    Sprugmann, K.W.; Ritchie, I.G.

    1980-04-01

    A detailed and comprehensive account of the equipment, computer programs and experimental methods developed at the Whiteshell Nuclear Research Establishment for the study of low-frequency internal friction is presented. Part I describes the mechanical apparatus, electronic instrumentation and computer software, while Part II describes in detail the laboratory techniques and the various types of experiments performed, together with data reduction and analysis. Experimental procedures for the study of internal friction as a function of temperature, strain amplitude or time are described. Computer control of these experiments using the free-decay technique is outlined. In addition, a pendulum constant-amplitude drive system is described. (auth)

  1. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANN) and fuzzy logic approaches have proven efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining the two approaches, and as a result, neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of this neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by an application to model the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation, etc. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model-building process.
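
    For readers unfamiliar with the model class, here is a minimal first-order Sugeno inference step of the kind ANFIS trains: two rules on one input, Gaussian memberships and linear consequents, with all parameter values invented:

```python
import math

RULES = [
    {"mf": (2.0, 1.0), "consequent": (0.8, 0.5)},   # IF flow LOW  THEN y = 0.8x + 0.5
    {"mf": (6.0, 1.5), "consequent": (1.1, -0.2)},  # IF flow HIGH THEN y = 1.1x - 0.2
]

def gaussian(x, center, sigma):
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def sugeno_predict(x):
    w = [gaussian(x, *r["mf"]) for r in RULES]            # rule firing strengths
    y = [a * x + b for a, b in (r["consequent"] for r in RULES)]  # rule outputs
    return sum(wi * yi for wi, yi in zip(w, y)) / sum(w)  # normalised weighted sum

print(sugeno_predict(4.0))   # next-step flow from the current one, in the toy setup
```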

  2. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.

  3. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists in solving computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases that the evolutionary process implies. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computational programs represented by trees. Bloat control and distributed evaluation will be introduced.

  4. Parallel computing techniques for rotorcraft aerodynamics

    Science.gov (United States)

    Ekici, Kivanc

    The modification of unsteady three-dimensional Navier-Stokes codes for application on massively parallel and distributed computing environments is investigated. The Euler/Navier-Stokes code TURNS (Transonic Unsteady Rotor Navier-Stokes) was chosen as a test bed because of its wide use by universities and industry. For the efficient implementation of TURNS on parallel computing systems, two algorithmic changes are developed. First, modifications to the implicit operator, Lower-Upper Symmetric Gauss-Seidel (LU-SGS), originally used in TURNS, are performed. Second, an inexact Newton method, coupled with a Krylov subspace iterative method (a Newton-Krylov method), is applied. Both techniques have been tried previously for the Euler equations mode of the code. In this work, we have extended the methods to the Navier-Stokes mode. Several new implicit operators were tried because of the convergence problems of traditional operators with the high cell aspect ratio (CAR) grids needed for viscous calculations on structured grids. Promising results for both Euler and Navier-Stokes cases are presented for these operators. For the efficient implementation of Newton-Krylov methods in the Navier-Stokes mode of TURNS, efficient preconditioners must be used. The parallel implicit operators used in the previous step are employed as preconditioners and the results are compared. The Message Passing Interface (MPI) protocol has been used because of its portability to various parallel architectures. It should be noted that the proposed methodology is general and can be applied to several other CFD codes (e.g. OVERFLOW).

  5. Under pressure: evolutionary engineering of yeast strains for improved performance in fuels and chemicals production.

    Science.gov (United States)

    Mans, Robert; Daran, Jean-Marc G; Pronk, Jack T

    2018-04-01

    Evolutionary engineering, which uses laboratory evolution to select for industrially relevant traits, is a popular strategy in the development of high-performing yeast strains for industrial production of fuels and chemicals. By integrating whole-genome sequencing, bioinformatics, classical genetics and genome-editing techniques, evolutionary engineering has also become a powerful approach for identification and reverse engineering of molecular mechanisms that underlie industrially relevant traits. New techniques enable acceleration of in vivo mutation rates, both across yeast genomes and at specific loci. Recent studies indicate that phenotypic trade-offs, which are often observed after evolution under constant conditions, can be mitigated by using dynamic cultivation regimes. Advances in research on synthetic regulatory circuits offer exciting possibilities to extend the applicability of evolutionary engineering to products of yeasts whose synthesis requires a net input of cellular energy. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  6. Artifact Elimination Technique in Tomogram of X-ray Computed Tomography

    International Nuclear Information System (INIS)

    Rasif Mohd Zain

    2015-01-01

    Artifacts are among the most common problems in X-ray computed tomography images. They appear in the tomogram due to noise, beam hardening, and scattered radiation. The study was carried out using a CdTe Timepix detector. A new technique combining hardware and software measures has been developed to eliminate these artifacts. The hardware setup involved the careful alignment of all components of the system and the introduction of a beam collimator. The software development deals with flat-field correction, noise filtering and the data projection algorithm. The results show that the developed technique produces good-quality images and eliminates the artifacts. (author)
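
    Of the software measures listed, flat-field correction is the most self-contained: each raw projection is normalized by a dark-field image (detector offset) and a flat-field (open-beam) image. A minimal sketch, with array names assumed for illustration:

```python
import numpy as np

def flat_field_correct(raw, flat, dark):
    # Standard flat-field correction: (raw - dark) / (flat - dark),
    # clipping the denominator to avoid division by zero in dead pixels
    denom = np.clip(flat.astype(float) - dark, 1e-6, None)
    return (raw.astype(float) - dark) / denom
```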

  7. Attractive evolutionary equilibria

    NARCIS (Netherlands)

    Joosten, Reinoud A.M.G.; Roorda, Berend

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary

  8. The development of computer industry and applications of its relevant techniques in nuclear research laboratories

    International Nuclear Information System (INIS)

    Dai Guiliang

    1988-01-01

    The increasing need for computers in the area of nuclear science and technology is described. The current status of commercially available computer products of different scales on the world market is briefly reviewed. A survey of some noticeable techniques is given from the viewpoint of computer applications in nuclear science research laboratories.

  9. Solving Multi-Pollutant Emission Dispatch Problem Using Computational Intelligence Technique

    Directory of Open Access Journals (Sweden)

    Nur Azzammudin Rahmat

    2016-06-01

    Full Text Available Economic dispatch is a crucial process conducted by utilities to determine the correct amount of power to be generated and distributed to consumers. During the process, the utilities also consider pollutant emission as a consequence of fossil-fuel consumption. Fossil fuels include petroleum, coal, and natural gas; each has its unique chemical composition of pollutants, i.e. sulphur oxides (SOx), nitrogen oxides (NOx) and carbon oxides (COx). This paper presents a multi-pollutant emission dispatch problem solved using a computational intelligence technique. In this study, a novel emission dispatch technique is formulated to determine the pollutant level. It utilizes a pre-developed optimization technique termed differential evolution immunized ant colony optimization (DEIANT) for the emission dispatch problem. The optimization results indicated a high COx level, regardless of the type of fossil fuel consumed.
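
    DEIANT itself is the authors' hybrid and is not reproduced here; as a stand-in, the sketch below applies plain differential evolution (SciPy) to a three-generator combined cost and emission dispatch, with hypothetical quadratic coefficients and a penalty term enforcing the power balance.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical generator coefficients (illustrative values only)
a = np.array([0.008, 0.009, 0.007])   # quadratic cost, $/MW^2
b = np.array([7.0, 6.3, 6.8])         # linear cost, $/MW
e = np.array([0.015, 0.012, 0.018])   # quadratic emission, kg/MW^2
demand = 300.0                         # total demand, MW

def objective(p):
    cost = np.sum(a * p**2 + b * p)
    emission = np.sum(e * p**2)
    balance_penalty = 1e4 * abs(np.sum(p) - demand)  # enforce power balance
    return cost + emission + balance_penalty

bounds = [(50.0, 200.0)] * 3           # generator limits, MW
result = differential_evolution(objective, bounds, seed=1)
print(result.x, result.fun)
```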

  10. Evolutionary optimization technique for site layout planning

    KAUST Repository

    El Ansary, Ayman M.; Shalaby, Mohamed

    2014-01-01

    of design requirements. The developed technique is based on genetic algorithm which explores the search space for possible solutions. This study considers two dimensional site planning problems. However, it can be extended to solve three dimensional cases. A

  11. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it realizes a combination of statistics and dynamics to a certain extent.
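
    The twin-experiment setup described begins by integrating the Lorenz (1963) equations; a minimal sketch of generating a trajectory to serve as "observational data" is given below (the periodic evolutionary forcing that defines the paper's "true" model is omitted).

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Classic Lorenz (1963) system
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

sol = solve_ivp(lorenz63, (0.0, 10.0), [1.0, 1.0, 1.0],
                rtol=1e-8, dense_output=True)
t_obs = np.linspace(0.0, 10.0, 101)
observations = sol.sol(t_obs).T   # sampled "observational data"
print(observations[-1])
```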

  12. Rayleigh’s quotient–based damage detection algorithm: Theoretical concepts, computational techniques, and field implementation strategies

    DEFF Research Database (Denmark)

    NJOMO WANDJI, Wilfried

    2017-01-01

    levels are targeted: existence, location, and severity. The proposed algorithm is analytically developed from the dynamics theory and the virtual energy principle. Some computational techniques are proposed for carrying out computations, including discretization, integration, derivation, and suitable...
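
    For context, the Rayleigh quotient of a structure with stiffness matrix K and mass matrix M, evaluated at a mode shape φ, is R(φ) = φᵀKφ / (φᵀMφ); a shift in R for a measured mode relative to a healthy baseline model can indicate damage. A minimal sketch of the quantity itself (matrices and mode shape assumed given; this is background, not the paper's algorithm):

```python
import numpy as np

def rayleigh_quotient(K, M, phi):
    # R(phi) = phi^T K phi / (phi^T M phi); for a true mode shape this
    # equals the squared natural frequency of the structure
    return (phi @ K @ phi) / (phi @ M @ phi)

# Illustrative 2-DOF example
K = np.array([[2.0, -1.0], [-1.0, 2.0]])
M = np.eye(2)
phi = np.array([1.0, 1.0])
print(rayleigh_quotient(K, M, phi))   # 1.0 for this mode shape
```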

  13. Inclusion of the fitness sharing technique in an evolutionary algorithm to analyze the fitness landscape of the genetic code adaptability.

    Science.gov (United States)

    Santos, José; Monteagudo, Ángel

    2017-03-27

    The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but how it evolved towards its current form is not clearly determined. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds to the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal that code is. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code lies in an area corresponding to a deep local minimum to be easily determined, even in the high-dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal fitness landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the absence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not a localized deep minimum of the huge fitness landscape.
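
    Fitness sharing derates individuals in crowded regions of the search space by dividing raw fitness by a niche count built from pairwise distances. A generic sketch of the standard sharing scheme is below (parameter values are illustrative; since the paper minimizes error, the derating there would multiply rather than divide the objective):

```python
import numpy as np

def shared_fitness(raw_fitness, distances, sigma_share=1.0, alpha=1.0):
    # Triangular sharing function: sh(d) = 1 - (d/sigma)^alpha for d < sigma, else 0
    sh = np.where(distances < sigma_share,
                  1.0 - (distances / sigma_share) ** alpha, 0.0)
    niche_count = sh.sum(axis=1)   # self-distance 0 contributes 1, so count > 0
    return raw_fitness / niche_count
```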

  14. A new technique for on-line and off-line high speed computation

    International Nuclear Information System (INIS)

    Hartouni, E.P.; Jensen, D.A.; Klima, B.; Kreisler, M.N.; Rabin, M.S.Z.; Uribe, J.; Gottschalk, E.; Gara, A.; Knapp, B.C.

    1989-01-01

    A new technique for both on-line and off-line computation has been developed. With this technique, a reconstruction analysis in Elementary Particle Physics, otherwise prohibitively long, has been accomplished. It will be used on-line in an upcoming Fermilab experiment to reconstruct more than 100,000 events per second and to trigger on the basis of that information. The technique delivers 40 Giga operations per second, has a bandwidth on the order of Gigabytes per second and has a modest cost. An overview of the program, details of the system, and performance measurements are presented in this paper

  15. Evolutionary engineering to enhance starter culture performance in food fermentations.

    Science.gov (United States)

    Bachmann, Herwig; Pronk, Jack T; Kleerebezem, Michiel; Teusink, Bas

    2015-04-01

    Microbial starter cultures are essential for consistent product quality and functional properties such as flavor, texture, pH or the alcohol content of various fermented foods. Strain improvement programs to achieve desired properties in starter cultures are diverse, but developments in next-generation sequencing have led to an increased interest in evolutionary engineering of desired phenotypes. Here we discuss recent developments in strain selection protocols and how computational approaches can assist such experimental design. Furthermore, the analysis of evolved phenotypes and the possibilities with complex consortia are highlighted. Studies carried out with mainly yeast and lactic acid bacteria demonstrate the power of evolutionary engineering to deliver strains with novel phenotypes as well as insight into underlying mechanisms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Asteroseismology of pulsating DA white dwarfs with fully evolutionary models

    Directory of Open Access Journals (Sweden)

    Althaus L.G.

    2013-03-01

    Full Text Available We present a new approach for the asteroseismology of DA white dwarfs that consists in employing a large set of non-static, physically sound, fully evolutionary models representative of these stars. We have already applied this approach with success to pulsating PG1159 stars (GW Vir variables). Our white dwarf models, which cover a wide range of stellar masses, effective temperatures, and envelope thicknesses, are the result of fully evolutionary computations that take into account the complete history of the progenitor stars from the ZAMS. In particular, the models are characterized by self-consistent chemical structures from the centre to the surface, a crucial aspect of white dwarf asteroseismology. We apply this approach to an ensemble of 44 bright DAV (ZZ Ceti) stars.

  17. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.

  18. Evaluating alternative gait strategies using evolutionary robotics.

    Science.gov (United States)

    Sellers, William I; Dennis, Louise A; Wang, W-J; Crompton, Robin H

    2004-05-01

    Evolutionary robotics is a branch of artificial intelligence concerned with the automatic generation of autonomous robots. Usually the form of the robot is predefined and various computational techniques are used to control the machine's behaviour. One aspect is the spontaneous generation of walking in legged robots and this can be used to investigate the mechanical requirements for efficient walking in bipeds. This paper demonstrates a bipedal simulator that spontaneously generates walking and running gaits. The model can be customized to represent a range of hominoid morphologies and used to predict performance parameters such as preferred speed and metabolic energy cost. Because it does not require any motion capture data it is particularly suitable for investigating locomotion in fossil animals. The predictions for modern humans are highly accurate in terms of energy cost for a given speed and thus the values predicted for other bipeds are likely to be good estimates. To illustrate this the cost of transport is calculated for Australopithecus afarensis. The model allows the degree of maximum extension at the knee to be varied causing the model to adopt walking gaits varying from chimpanzee-like to human-like. The energy costs associated with these gait choices can thus be calculated and this information used to evaluate possible locomotor strategies in early hominids.

  19. Advanced technique for computing fuel combustion properties in pulverized-fuel fired boilers

    Energy Technology Data Exchange (ETDEWEB)

    Kotler, V.R. (Vsesoyuznyi Teplotekhnicheskii Institut (Russian Federation))

    1992-03-01

    Reviews foreign technical reports on advanced techniques for computing fuel combustion properties in pulverized-fuel fired boilers and analyzes a technique developed by Combustion Engineering, Inc. (USA). Characteristics of 25 fuel types, including 19 grades of coal, are listed along with a diagram of an installation with a drop tube furnace. Characteristics include burn-out intensity curves obtained using thermogravimetric analysis for high-volatile bituminous, semi-bituminous and coking coal. The patented LFP-SKM mathematical model is used to model combustion of a particular fuel under given conditions. The model allows for fuel particle size, air surplus, load, flame height, and portion of air supplied as tertiary blast. Good agreement between computational and experimental data was observed. The method is employed in designing new boilers as well as converting operating boilers to alternative types of fuel. 3 refs.

  20. Numeric treatment of nonlinear second order multi-point boundary value problems using ANN, GAs and sequential quadratic programming technique

    Directory of Open Access Journals (Sweden)

    Zulqurnain Sabir

    2014-06-01

    Full Text Available In this paper, computational intelligence techniques are presented for solving multi-point nonlinear boundary value problems, based on artificial neural networks, an evolutionary computing approach, and an active-set technique. The neural networks provide a convenient method for obtaining a useful model based on the unsupervised error of the differential equations. The motivation for presenting this work comes from the aim of introducing a reliable framework that combines the powerful features of ANN optimized with soft computing frameworks to cope with such challenging systems. The applicability and reliability of such methods have been monitored thoroughly for various boundary value problems arising in science, engineering and biotechnology. Comprehensive numerical experiments have been performed to validate the accuracy, convergence, and robustness of the designed scheme. Comparative studies have also been made with available standard solutions to analyze the correctness of the proposed scheme.

  1. A Wireless Sensor Network with Soft Computing Localization Techniques for Track Cycling Applications.

    Science.gov (United States)

    Gharghan, Sadik Kamel; Nordin, Rosdiadee; Ismail, Mahamod

    2016-08-06

    In this paper, we propose two soft computing localization techniques for wireless sensor networks (WSNs). The two techniques, the Adaptive Neuro-Fuzzy Inference System (ANFIS) and the Artificial Neural Network (ANN), focus on a range-based localization method which relies on the measurement of the received signal strength indicator (RSSI) from the three ZigBee anchor nodes distributed throughout the track cycling field. The soft computing techniques aim to estimate the distance between bicycles moving on the cycle track for outdoor and indoor velodromes. In the first approach the ANFIS was considered, whereas in the second approach the ANN was hybridized individually with three optimization algorithms, namely Particle Swarm Optimization (PSO), the Gravitational Search Algorithm (GSA), and the Backtracking Search Algorithm (BSA). The results revealed that the hybrid GSA-ANN outperforms the other methods adopted in this paper in terms of localization and distance estimation accuracy. The hybrid GSA-ANN achieves a mean absolute distance estimation error of 0.02 m and 0.2 m for outdoor and indoor velodromes, respectively.
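
    Range-based RSSI localization of this kind typically inverts a log-distance path-loss model before (or alongside) the learned estimator; the helper below is a generic sketch, with the reference power, reference distance and path-loss exponent chosen purely for illustration rather than taken from the paper's calibration.

```python
def rssi_to_distance(rssi_dbm, rssi_at_d0=-45.0, d0=1.0, path_loss_exp=2.2):
    # Log-distance model: RSSI(d) = RSSI(d0) - 10 n log10(d / d0),
    # solved for d given a measured RSSI value
    return d0 * 10 ** ((rssi_at_d0 - rssi_dbm) / (10.0 * path_loss_exp))

print(rssi_to_distance(-60.0))   # distance estimate in metres
```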

  2. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    Science.gov (United States)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation in which the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If at all, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an
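
    For orientation, the Morris indexes mentioned are built from one-at-a-time elementary effects; a single-point sketch is shown below (the full method averages such effects over many random trajectories, and the MVA convergence test itself is not reproduced here).

```python
import numpy as np

def elementary_effects(f, x, delta=0.1):
    # One elementary effect per parameter: EE_i = (f(x + delta e_i) - f(x)) / delta
    base = f(x)
    effects = np.empty(len(x))
    for i in range(len(x)):
        x_pert = x.copy()
        x_pert[i] += delta
        effects[i] = (f(x_pert) - base) / delta
    return effects

# Toy model: the first parameter matters most
model = lambda p: 5.0 * p[0] + 0.5 * p[1] ** 2
print(elementary_effects(model, np.array([0.2, 0.4])))
```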

  3. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  4. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Perret-Gallix, D.; Wojcik, W.

    1990-01-01

    These proceedings relate in a pragmatic way the use of methods and techniques of software engineering and artificial intelligence in high energy and nuclear physics. Such fundamental research can only be done through the design, building and running of equipment and systems that are among the most complex ever undertaken by mankind. The use of these new methods is mandatory in such an environment. However, their proper integration in these real applications raises some unsolved problems, whose solution, beyond the research field, will lead to a better understanding of some fundamental aspects of software engineering and artificial intelligence. Here is a sample of the subjects covered in the proceedings: software engineering in a multi-user, multi-version, multi-system environment; project management; software validation and quality control; data structure and management; object-oriented languages; multi-language applications; interactive data analysis; expert systems for diagnosis; expert systems for real-time applications; neural networks for pattern recognition; and symbolic manipulation for automatic computation of complex processes.

  5. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  6. Computer modeling for optimal placement of gloveboxes

    International Nuclear Information System (INIS)

    Hench, K.W.; Olivas, J.D.; Finch, P.R.

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units

  7. Computer modeling for optimal placement of gloveboxes

    Energy Technology Data Exchange (ETDEWEB)

    Hench, K.W.; Olivas, J.D. [Los Alamos National Lab., NM (United States); Finch, P.R. [New Mexico State Univ., Las Cruces, NM (United States)

    1997-08-01

    Reduction of the nuclear weapons stockpile and the general downsizing of the nuclear weapons complex has presented challenges for Los Alamos. One is to design an optimized fabrication facility to manufacture nuclear weapon primary components (pits) in an environment of intense regulation and shrinking budgets. Historically, the location of gloveboxes in a processing area has been determined without benefit of industrial engineering studies to ascertain the optimal arrangement. The opportunity exists for substantial cost savings and increased process efficiency through careful study and optimization of the proposed layout by constructing a computer model of the fabrication process. This paper presents an integrative two-stage approach to modeling the casting operation for pit fabrication. The first stage uses a mathematical technique for the formulation of the facility layout problem; the solution procedure uses an evolutionary heuristic technique. The best solutions to the layout problem are used as input to the second stage - a computer simulation model that assesses the impact of competing layouts on operational performance. The focus of the simulation model is to determine the layout that minimizes personnel radiation exposures and nuclear material movement, and maximizes the utilization of capacity for finished units.

  8. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
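
    The soft-threshold filtering at the core of the TDM-STF step is the proximal operator of the l1 norm; a minimal sketch:

```python
import numpy as np

def soft_threshold(x, t):
    # Shrink each coefficient towards zero by t: sign(x) * max(|x| - t, 0)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

print(soft_threshold(np.array([-2.0, -0.3, 0.1, 1.5]), 0.5))
```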

  9. An Evolutionary Approach to Driving Tendency Recognition for Advanced Driver Assistance Systems

    Directory of Open Access Journals (Sweden)

    Lee Jong-Hyun

    2016-01-01

    Full Text Available Driving tendency recognition is important for constructing Advanced Driver Assistance Systems (ADAS). However, there has not been much research using vehicle sensing data, owing to the difficulty of defining driving tendency. In this paper, we attempt to improve the learning capability of a machine learning method using evolutionary computation. We propose a driving tendency recognition method that takes the characteristics of the data into consideration. Comparison of our classification system with conventional methods demonstrated its effectiveness, with an accuracy of over 92%. The experiments confirm that the proposed evolutionary approach improves the classification accuracy of the learning method through evolution.

  10. Soft computing in green and renewable energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Gopalakrishnan, Kasthurirangan [Iowa State Univ., Ames, IA (United States). Iowa Bioeconomy Inst.; US Department of Energy, Ames, IA (United States). Ames Lab; Kalogirou, Soteris [Cyprus Univ. of Technology, Limassol (Cyprus). Dept. of Mechanical Engineering and Materials Sciences and Engineering; Khaitan, Siddhartha Kumar (eds.) [Iowa State Univ. of Science and Technology, Ames, IA (United States). Dept. of Electrical Engineering and Computer Engineering

    2011-07-01

    Soft Computing in Green and Renewable Energy Systems provides a practical introduction to the application of soft computing techniques and hybrid intelligent systems for designing, modeling, characterizing, optimizing, forecasting, and performance prediction of green and renewable energy systems. Research is proceeding at jet speed on renewable energy (energy derived from natural resources such as sunlight, wind, tides, rain, geothermal heat, biomass, hydrogen, etc.) as policy makers, researchers, economists, and world agencies have joined forces in finding alternative sustainable energy solutions to current critical environmental, economic, and social issues. The innovative models, environmentally benign processes, data analytics, etc. employed in renewable energy systems are computationally-intensive, non-linear and complex as well as involve a high degree of uncertainty. Soft computing technologies, such as fuzzy sets and systems, neural science and systems, evolutionary algorithms and genetic programming, and machine learning, are ideal in handling the noise, imprecision, and uncertainty in the data, and yet achieve robust, low-cost solutions. As a result, intelligent and soft computing paradigms are finding increasing applications in the study of renewable energy systems. Researchers, practitioners, undergraduate and graduate students engaged in the study of renewable energy systems will find this book very useful. (orig.)

  11. Research on evolutionary music composer system

    Institute of Scientific and Technical Information of China (English)

    汪镭; 郑晓妹; 申林

    2014-01-01

    Algorithmic composition is the most attractive research area in computer music, and genetic algorithm-based evolutionary music composer systems have become a hot spot within it. This paper gives the structure of an evolutionary music composer system, analyzes the different composition goals of such systems, and then discusses two types of evolutionary music composer system from the viewpoint of fitness function design. Finally, several instances of evolutionary music composer systems are analyzed.

  12. Evolutionary algorithms applied to Landau-gauge fixing

    International Nuclear Information System (INIS)

    Markham, J.F.

    1998-01-01

    Current algorithms used to put a lattice gauge configuration into Landau gauge either suffer from the problem of critical slowing-down or involve an additional computational expense to overcome it. Evolutionary Algorithms (EAs), which have been widely applied to other global optimisation problems, may be of use in gauge fixing. Also, being global, they should not suffer from critical slowing-down as local gradient-based algorithms do. We apply EAs and also a Steepest Descent (SD) based method to the problem of Landau gauge fixing and compare their performance. (authors)

  13. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned to better reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction, a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies on a sound statistical basis; and • response surface methods and expectation m...

  14. Numerical and Evolutionary Optimization Workshop

    CERN Document Server

    Trujillo, Leonardo; Legrand, Pierrick; Maldonado, Yazmin

    2017-01-01

    This volume comprises a selection of works presented at the Numerical and Evolutionary Optimization (NEO) workshop held in September 2015 in Tijuana, Mexico. The development of powerful search and optimization techniques is of great importance in today’s world, which requires researchers and practitioners to tackle a growing number of challenging real-world problems. In particular, there are two well-established and widely known fields that are commonly applied in this area: (i) traditional numerical optimization techniques and (ii) comparatively recent bio-inspired heuristics. Both paradigms have their unique strengths and weaknesses, allowing them to solve some challenging problems while still failing in others. The goal of the NEO workshop series is to bring together people from these and related fields to discuss, compare and merge their complementary perspectives in order to develop fast and reliable hybrid methods that maximize the strengths and minimize the weaknesses of the underlying paradigms. Throu...

  15. Evolutionary experience design – the case of Otopia

    DEFF Research Database (Denmark)

    Hansen, Kenneth

    The design of experiences is a complicated challenge. It might not even be possible to design such a “thing”, but only to design for it. If this is the case, an evolutionary approach could seem appropriate. This paper introduces such an approach to the design of new public oriented experiences with the case of “Otopia”. “Otopia” is a large scale, new media experiment, which combines the areas of computer games, sports and performance into a spectator oriented concept; it premiered in a dome tent at the Roskilde Festival in Denmark in the summer of 2005. This paper presents and discusses ... used as a means of specifying the basic immaterial design form. This discussion leads to the suggestion of a rule-based evolutionary model for the design of situations as a practical option for designers of new spectator oriented experiences in the future. The project of Otopia was supported...

  16. A Multiagent Evolutionary Algorithm for the Resource-Constrained Project Portfolio Selection and Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Yongyi Shou

    2014-01-01

    Full Text Available A multiagent evolutionary algorithm is proposed to solve the resource-constrained project portfolio selection and scheduling problem. The proposed algorithm has a dual level structure. In the upper level a set of agents make decisions to select appropriate project portfolios. Each agent selects its project portfolio independently. The neighborhood competition operator and self-learning operator are designed to improve the agent’s energy, that is, the portfolio profit. In the lower level the selected projects are scheduled simultaneously and completion times are computed to estimate the expected portfolio profit. A priority rule-based heuristic is used by each agent to solve the multiproject scheduling problem. A set of instances were generated systematically from the widely used Patterson set. Computational experiments confirmed that the proposed evolutionary algorithm is effective for the resource-constrained project portfolio selection and scheduling problem.

  17. Evolutionary Demography

    DEFF Research Database (Denmark)

    Levitis, Daniel

    2015-01-01

    Demography is the quantitative study of population processes, while evolution is a population process that influences all aspects of biological organisms, including their demography. Demographic traits common to all human populations are the products of biological evolution or the interaction of biological and cultural evolution. Demographic variation within and among human populations is influenced by our biology, and therefore by natural selection and our evolutionary background. Demographic methods are necessary for studying populations of other species, and for quantifying evolutionary fitness...

  18. Proteomics in evolutionary ecology.

    Science.gov (United States)

    Baer, B; Millar, A H

    2016-03-01

    Evolutionary ecologists are traditionally gene-focused, as genes propagate phenotypic traits across generations and mutations and recombination in the DNA generate genetic diversity required for evolutionary processes. As a consequence, the inheritance of changed DNA provides a molecular explanation for the functional changes associated with natural selection. A direct focus on proteins on the other hand, the actual molecular agents responsible for the expression of a phenotypic trait, receives far less interest from ecologists and evolutionary biologists. This is partially due to the central dogma of molecular biology that appears to define proteins as the 'dead-end of molecular information flow' as well as technical limitations in identifying and studying proteins and their diversity in the field and in many of the more exotic genera often favored in ecological studies. Here we provide an overview of a newly forming field of research that we refer to as 'Evolutionary Proteomics'. We point out that the origins of cellular function are related to the properties of polypeptide and RNA and their interactions with the environment, rather than DNA descent, and that the critical role of horizontal gene transfer in evolution is more about coopting new proteins to impact cellular processes than it is about modifying gene function. Furthermore, post-transcriptional and post-translational processes generate a remarkable diversity of mature proteins from a single gene, and the properties of these mature proteins can also influence inheritance through genetic and perhaps epigenetic mechanisms. The influence of post-transcriptional diversification on evolutionary processes could provide a novel mechanistic underpinning for elements of rapid, directed evolutionary changes and adaptations as observed for a variety of evolutionary processes. Modern state-of the art technologies based on mass spectrometry are now available to identify and quantify peptides, proteins, protein

  19. An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints

    Directory of Open Access Journals (Sweden)

    Jinmo Sung

    2014-01-01

    Full Text Available The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions across the generations of the population, which significantly improves the computational efficiency, even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments.
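
    The paper's own operators are not reproduced here, but the idea of feasibility-preserving variation can be illustrated with a generic insertion mutation that only returns tours respecting the precedence pairs (all names are hypothetical):

```python
import random

def is_feasible(tour, precedences):
    # Every (a, b) pair requires city a to appear before city b
    position = {city: i for i, city in enumerate(tour)}
    return all(position[a] < position[b] for a, b in precedences)

def feasible_insertion_mutation(tour, precedences, max_tries=50):
    # Move one city to a random new position, retrying until feasible
    for _ in range(max_tries):
        candidate = tour[:]
        city = candidate.pop(random.randrange(len(candidate)))
        candidate.insert(random.randrange(len(candidate) + 1), city)
        if is_feasible(candidate, precedences):
            return candidate
    return tour   # fall back to the (still feasible) parent

tour = [0, 1, 2, 3, 4]
print(feasible_insertion_mutation(tour, [(0, 2), (1, 4)]))
```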

  20. Template matching techniques in computer vision theory and practice

    CERN Document Server

    Brunelli, Roberto

    2009-01-01

    The detection and recognition of objects in images is a key research topic in the computer vision community. Within this area, face recognition and interpretation has attracted increasing attention owing to the possibility of unveiling human perception mechanisms, and for the development of practical biometric systems. This book and the accompanying website focus on template matching, a subset of object recognition techniques of wide applicability, which has proved to be particularly effective for face recognition applications. Using examples from face processing tasks throughout the book to illustrate more general object recognition approaches, Roberto Brunelli: examines the basics of digital image formation, highlighting points critical to the task of template matching; presents basic and advanced template matching techniques, targeting grey-level images, shapes and point sets; discusses recent pattern classification paradigms from a template matching perspective; illustrates the development of a real fac...
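
    The simplest template matching score of the kind the book builds on is normalized cross-correlation; a brute-force sketch for grey-level images is given below (library routines such as OpenCV's matchTemplate perform the same computation far faster).

```python
import numpy as np

def ncc_map(image, template):
    # Normalized cross-correlation of a template against every window
    # of the image (brute force, written for clarity rather than speed)
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    rows, cols = image.shape[0] - th + 1, image.shape[1] - tw + 1
    scores = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            w = image[i:i + th, j:j + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            scores[i, j] = (wz * t).sum() / denom if denom > 0 else 0.0
    return scores   # best match at the arg-max of the score map
```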

  1. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial ... problems. Classical single-objective optimization is examined first. The authors then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.

  2. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013) which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in the computer, computing in earth sciences, multivariate data analysis, automated computation in Quantum Field Theory as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated us to think over the issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), National Natural Science Foundation of China (NFSC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), Theoretical Physics Center for Science facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang Institute of High Energy Physics Chinese Academy of Science Details of committees and sponsors are available in the PDF

  3. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Full Text Available Soft computing techniques such as Artificial Neural Networks (ANN) have improved prediction capability and have found application in geotechnical engineering. The aim of this research is to utilize soft computing techniques and Multiple Regression Models (MLR) for forecasting the California bearing ratio (CBR) of soil from its index properties. The CBR of soil can be predicted from various soil characterization parameters with the assistance of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in the soft computing techniques based on artificial neural networks. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the developed models were examined in terms of regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model. In particular, an ANN model with all input parameters reveals better outcomes than the other ANN models.
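
    A multiple regression model of the kind compared in the paper can be set up in a few lines; the feature columns and numbers below are purely illustrative stand-ins for the paper's 86-sample Basrah database.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative index properties per sample: liquid limit (%), plasticity
# index (%), maximum dry density (g/cm3), optimum moisture content (%)
X = np.array([[35.0, 18.0, 1.85, 12.5],
              [42.0, 22.0, 1.78, 14.0],
              [28.0, 12.0, 1.92, 10.8],
              [50.0, 27.0, 1.70, 16.2],
              [31.0, 15.0, 1.88, 11.9]])
y = np.array([8.5, 5.2, 11.3, 3.8, 9.7])   # illustrative CBR values (%)

mlr = LinearRegression().fit(X, y)
print(mlr.coef_, mlr.intercept_)
print(mlr.predict([[38.0, 20.0, 1.80, 13.0]]))   # predicted CBR
```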

  4. Evolutionary Trails of Plant Group II Pyridoxal Phosphate-Dependent Decarboxylase Genes.

    Science.gov (United States)

    Kumar, Rahul

    2016-01-01

    Type II pyridoxal phosphate-dependent decarboxylase (PLP_deC) enzymes play important metabolic roles during nitrogen metabolism. Recent evolutionary profiling of these genes revealed a sharp expansion of histidine decarboxylase genes in the members of Solanaceae family. In spite of the high sequence homology shared by PLP_deC orthologs, these enzymes display remarkable differences in their substrate specificities. Currently, limited information is available on the gene repertoires and substrate specificities of PLP_deCs which renders their precise annotation challenging and offers technical challenges in the immediate identification and biochemical characterization of their full gene complements in plants. Herein, we explored their evolutionary trails in a comprehensive manner by taking advantage of high-throughput data accessibility and computational approaches. We discussed the premise that has enabled an improved reconstruction of their evolutionary lineage and evaluated the factors offering constraints in their rapid functional characterization, till date. We envisage that the synthesized information herein would act as a catalyst for the rapid exploration of their biochemical specificity and physiological roles in more plant species.

  5. Social traits, social networks and evolutionary biology.

    Science.gov (United States)

    Fisher, D N; McAdam, A G

    2017-12-01

    The social environment is both an important agent of selection for most organisms, and an emergent property of their interactions. As an aggregation of interactions among members of a population, the social environment is a product of many sets of relationships and so can be represented as a network or matrix. Social network analysis in animals has focused on why these networks possess the structure they do, and whether individuals' network traits, representing some aspect of their social phenotype, relate to their fitness. Meanwhile, quantitative geneticists have demonstrated that traits expressed in a social context can depend on the phenotypes and genotypes of interacting partners, leading to influences of the social environment on the traits and fitness of individuals and the evolutionary trajectories of populations. Therefore, both fields are investigating similar topics, yet have arrived at these points relatively independently. We review how these approaches have diverged, and yet how they retain clear parallelism and so strong potential for complementarity. This demonstrates that, despite separate bodies of theory, advances in one might inform the other. Techniques in network analysis for quantifying social phenotypes, and for identifying community structure, should be useful for those studying the relationship between individual behaviour and group-level phenotypes. Entering social association matrices into quantitative genetic models may also reduce bias in heritability estimates, and allow the estimation of the influence of social connectedness on trait expression. Current methods for measuring natural selection in a social context explicitly account for the fact that a trait is not necessarily the property of a single individual, something the network approaches have not yet considered when relating network metrics to individual fitness. Harnessing evolutionary models that consider traits affected by genes in other individuals (i.e. indirect genetic

  6. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  7. Evolutionary modelling of transitions to sustainable development

    International Nuclear Information System (INIS)

    Safarzynska, K.

    2010-01-01

    This thesis has examined how evolutionary economics can contribute to modelling the micromechanisms that underlie transitions towards sustainable development. In general, transitions are fundamental or structural system changes. They involve, or even require, escaping lock-in of dominant, environmentally unsustainable technologies, introducing major technical or social innovations, and changing prevailing social practices and structures. Due to the complexity of socioeconomic interactions, it is not always possible to identify, and thus target with appropriate policy instruments, the causes of specific unsustainable patterns of behaviour. Formal modelling exercises can help improve our understanding of the interaction of various transition mechanisms which are otherwise difficult to grasp intuitively. They allow exploring the effects of policy interventions in complex systems. However, existing models of transitions focus on social phenomena and seldom address economic problems. By contrast, mainstream (neoclassical) economic models of technological change do not account for social interactions, or for the changing heterogeneity of users and their perspectives - even though all of these can influence the direction of innovations and patterns of socio-technological development. Evolutionary economics offers an approach that goes beyond neoclassical economics - in the sense of employing more realistic assumptions regarding the behaviour and heterogeneity of consumers, firms and investors. It can complement current transition models by providing them with a better understanding of the associated economic dynamics. In this thesis, formal models were proposed to illustrate the usefulness of a range of evolutionary-economic techniques for modelling transitions. The modelling exercises aimed to explain the core properties of socio-economic systems, such as lock-in, path-dependence, coevolution, group selection and recombinant innovation. The studies collected in this dissertation illustrate that

  8. Evolutionary design optimization of traffic signals applied to Quito city.

    Science.gov (United States)

    Armas, Rolando; Aguirre, Hernán; Daolio, Fabio; Tanaka, Kiyoshi

    2017-01-01

    This work applies evolutionary computation and machine learning methods to study the transportation system of Quito from a design optimization perspective. It couples an evolutionary algorithm with a microscopic transport simulator and uses the outcome of the optimization process to deepen our understanding of the problem and gain knowledge about the system. The work focuses on the optimization of a large number of traffic lights deployed on a wide area of the city and studies their impact on travel time, emissions and fuel consumption. An evolutionary algorithm with specialized mutation operators is proposed to search effectively in large decision spaces, evolving small populations for a small number of generations. The effects of the operators combined with a varying mutation schedule are studied, and an analysis of the parameters of the algorithm is also included. In addition, hierarchical clustering is performed on the best solutions found in several runs of the algorithm. An analysis of signal clusters and their geolocation, estimation of fuel consumption, spatial analysis of emissions, and an analysis of signal coordination provide an overall picture of the systemic effects of the optimization process.
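
    A toy sketch of the search loop this record describes: a small population of signal-timing vectors evolved under a mutation operator whose rate follows a varying schedule. travel_time() is a stub standing in for the microscopic transport simulator, and all constants are illustrative rather than the paper's settings.

        # Sketch of a small-population EA with a decaying mutation schedule over
        # traffic-signal timing vectors (green times in seconds).
        import random

        N_SIGNALS, POP, GENS = 50, 8, 30

        def travel_time(plan):              # stub fitness: replace with a simulator call
            return sum((g - 45) ** 2 for g in plan)

        def mutate(plan, rate):
            # specialized mutation: perturb a random subset of green times
            return [min(90, max(10, g + random.randint(-10, 10)))
                    if random.random() < rate else g for g in plan]

        pop = [[random.randint(10, 90) for _ in range(N_SIGNALS)] for _ in range(POP)]
        for gen in range(GENS):
            rate = 0.5 * (1 - gen / GENS) + 0.05   # varying mutation schedule
            pop.sort(key=travel_time)
            pop = pop[:POP // 2] + [mutate(p, rate) for p in pop[:POP // 2]]
        best = min(pop, key=travel_time)
        print("best plan cost:", travel_time(best))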

  9. Can An Evolutionary Process Create English Text?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
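
    For contrast, a common textbook sketch of the fixed-target, Dawkins-style demonstration the abstract criticizes: random gibberish converges on a pre-specified phrase under mutation and selection. Bailey's open-ended scheme, which has no fixed target, is not reproduced here.

        # Fixed-target "weasel"-style demonstration (after Dawkins).
        import random

        TARGET = "METHINKS IT IS LIKE A WEASEL"
        CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

        def score(s):
            return sum(a == b for a, b in zip(s, TARGET))

        parent = "".join(random.choice(CHARS) for _ in TARGET)
        generation = 0
        while parent != TARGET:
            offspring = ["".join(c if random.random() > 0.05 else random.choice(CHARS)
                                 for c in parent) for _ in range(100)]
            parent = max(offspring + [parent], key=score)  # keep the fittest
            generation += 1
        print("reached target in", generation, "generations")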

  10. On techniques of ATR lattice computation

    International Nuclear Information System (INIS)

    1997-08-01

    Lattice computation determines the average nuclear constants of a unit fuel lattice, which are required for computing core nuclear characteristics such as core power distribution and reactivity characteristics. The main nuclear constants are the infinite multiplication factor, migration area, cross sections for diffusion computation, local power distribution and isotope composition. The lattice computation code used is WIMS-ATR, which is based on the WIMS-D code developed in the U.K. and, to improve the accuracy of analysis, was enhanced by adding a heavy water scattering cross section whose temperature dependence follows the Honeck model. For the computation of neutron absorption by control rods, the LOIEL BLUE code is used. The extrapolation distance of neutron flux on control rod surfaces is computed using the THERMOS and DTF codes, and the lattice constants of adjoining lattices are computed using the WIMS-ATR code. The computation flow and nuclear data library of the WIMS-ATR code, and the computation flow of the LOIEL BLUE code, are explained. The local power distribution in fuel assemblies determined by the WIMS-ATR code was verified against measured data, and the results are reported. (K.I.)

  11. Attractive evolutionary equilibria

    OpenAIRE

    Roorda, Berend; Joosten, Reinoud

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary dynamics. For instance, each attractive evolutionarily stable strategy is an attractive evolutionarily stable equilibrium for certain barycentric ray-projection dynamics, and vice versa.

  12. Computational efficiency improvement with Wigner rotation technique in studying atoms in intense few-cycle circularly polarized pulses

    International Nuclear Information System (INIS)

    Yuan, Minghu; Feng, Liqiang; Lü, Rui; Chu, Tianshu

    2014-01-01

    We show that by introducing the Wigner rotation technique into the solution of the time-dependent Schrödinger equation in the length gauge, computational efficiency can be greatly improved in describing atoms in intense few-cycle circularly polarized laser pulses. The methodology with the Wigner rotation technique underlying our openMP parallel computational code for circularly polarized laser pulses is described. Results of test calculations to investigate the scaling property of the computational code with the number of electronic angular basis functions l, as well as the strong-field phenomena, are presented and discussed for the hydrogen atom.

  13. Electrostatic afocal-zoom lens design using computer optimization technique

    Energy Technology Data Exchange (ETDEWEB)

    Sise, Omer, E-mail: omersise@gmail.com

    2014-12-15

    Highlights: • We describe the detailed design of a five-element electrostatic afocal-zoom lens. • The simplex optimization is used to optimize lens voltages. • The method can be applied to multi-element electrostatic lenses. - Abstract: Electron optics is the key to the successful operation of electron collision experiments, where well designed electrostatic lenses are needed to drive the electron beam before and after the collision. In this work, the imaging properties and aberration analysis of an electrostatic afocal-zoom lens design were investigated using a computer optimization technique. We have found a whole new range of voltage combinations that had gone unnoticed until now. A full range of voltage ratios and spherical and chromatic aberration coefficients were systematically analyzed over a range of magnifications between 0.3 and 3.2. The grid-shadow evaluation was also employed to show the effect of spherical aberration. The technique is found to be useful for searching for the optimal configuration in a multi-element lens system.
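
    A minimal sketch of the simplex (Nelder-Mead) search named in the highlights, here via scipy; aberration() is an illustrative stand-in for a real ray-traced figure of merit, and the voltage-ratio values are hypothetical.

        # Simplex search over electrode voltage ratios of a multi-element lens.
        import numpy as np
        from scipy.optimize import minimize

        def aberration(v):                    # stub objective; replace with ray tracing
            target = np.array([1.0, 0.4, 2.5])    # hypothetical "good" voltage ratios
            return np.sum((v - target) ** 2) + 0.1 * np.sum(np.abs(v))

        v0 = np.array([0.8, 0.8, 0.8])        # initial voltage ratios V_i / V_1
        res = minimize(aberration, v0, method="Nelder-Mead")
        print(res.x, res.fun)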

  14. Computer vision techniques for the diagnosis of skin cancer

    CERN Document Server

    Celebi, M

    2014-01-01

    The goal of this volume is to summarize the state-of-the-art in the utilization of computer vision techniques in the diagnosis of skin cancer. Malignant melanoma is one of the most rapidly increasing cancers in the world. Early diagnosis is particularly important since melanoma can be cured with a simple excision if detected early. In recent years, dermoscopy has proved valuable in visualizing the morphological structures in pigmented lesions. However, it has also been shown that dermoscopy is difficult to learn and subjective. Newer technologies such as infrared imaging, multispectral imaging, and confocal microscopy, have recently come to the forefront in providing greater diagnostic accuracy. These imaging technologies presented in this book can serve as an adjunct to physicians and provide automated skin cancer screening. Although computerized techniques cannot as yet provide a definitive diagnosis, they can be used to improve biopsy decision-making as well as early melanoma detection, especially for pa...

  15. Evolutionary principles and their practical application.

    Science.gov (United States)

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-03-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology.

  16. A surrogate assisted evolutionary optimization method with application to the transonic airfoil design

    Science.gov (United States)

    Shahrokhi, Ava; Jahangirian, Alireza

    2010-06-01

    A multi-layer perceptron neural network (NN) method is used for efficient estimation of the expensive objective functions in the evolutionary optimization with the genetic algorithm (GA). The estimation capability of the NN is improved by dynamic retraining using the data from successive generations. In addition, the normal distribution of the training data variables is used to determine well-trained parts of the design space for the NN approximation. The efficiency of the method is demonstrated by two transonic airfoil design problems considering inviscid and viscous flow solvers. Results are compared with those of the simple GA and an alternative surrogate method. The total number of flow solver calls is reduced by about 40% using this fitness approximation technique, which in turn reduces the total computational time without influencing the convergence rate of the optimization algorithm. The accuracy of the NN estimation is considerably improved using the normal distribution approach compared with the alternative method.
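
    A compact sketch of the fitness-approximation loop described here: an MLP surrogate (scikit-learn) is retrained each generation on exactly evaluated designs and screens the rest, so only the most promising candidates reach the expensive solver. expensive_cfd() is a stub for the flow solver, and the population handling is deliberately crude.

        # Surrogate-assisted evolutionary step: the MLP estimates fitness so that
        # only the best-looking candidates are sent to the expensive evaluation.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        def expensive_cfd(x):                       # stub objective (e.g. drag)
            return float(np.sum((x - 0.3) ** 2))

        surrogate = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000)
        X_seen, y_seen = [], []
        pop = rng.random((20, 4))                   # 20 airfoil parameter vectors
        for gen in range(10):
            if len(X_seen) >= 10:                   # dynamic retraining on past data
                surrogate.fit(np.array(X_seen), np.array(y_seen))
                guess = surrogate.predict(pop)
                exact_idx = np.argsort(guess)[:5]   # only best-looking 5 go to CFD
            else:
                exact_idx = np.arange(len(pop))
            fit = {}
            for i in exact_idx:
                fit[i] = expensive_cfd(pop[i])
                X_seen.append(pop[i]); y_seen.append(fit[i])
            elite = pop[min(fit, key=fit.get)]
            pop = elite + 0.1 * rng.standard_normal(pop.shape)   # crude GA step
        print("best estimate:", expensive_cfd(elite))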

  17. X-ray computer tomography, ultrasound and vibrational spectroscopic evaluation techniques of polymer gel dosimeters

    International Nuclear Information System (INIS)

    Baldock, Clive

    2004-01-01

    Since Gore et al. published their paper on Fricke gel dosimetry, the predominant method of evaluation of both Fricke and polymer gel dosimeters has been magnetic resonance imaging (MRI). More recently, optical computed tomography (CT) has also become a favoured evaluation method. Other techniques have been explored and developed as potential evaluation techniques in gel dosimetry. This paper reviews these other developments.

  18. Head and neck computed tomography virtual endoscopy: evaluation of a new imaging technique.

    Science.gov (United States)

    Gallivan, R P; Nguyen, T H; Armstrong, W B

    1999-10-01

    To evaluate a new radiographic imaging technique: computed tomography virtual endoscopy (CTVE) for head and neck tumors. Twenty-one patients presenting with head and neck masses who underwent axial computed tomography (CT) scan with contrast were evaluated by CTVE. Comparisons were made with video-recorded images and operative records to evaluate the potential utility of this new imaging technique. Twenty-one patients with aerodigestive head and neck tumors were evaluated by CTVE. One patient had a nasal cylindrical cell papilloma; the remainder, squamous cell carcinomas distributed throughout the upper aerodigestive tract. Patients underwent complete head and neck examination, flexible laryngoscopy, axial CT with contrast, CTVE, and in most cases, operative endoscopy. Available clinical and radiographic evaluations were compared and correlated to CTVE findings. CTVE accurately demonstrated abnormalities caused by intraluminal tumor, but where there was apposition of normal tissue against tumor, inaccurate depictions of surface contour occurred. Contour resolution was limited, and mucosal irregularity could not be defined. There was very good overall correlation between virtual images, flexible laryngoscopic findings, rigid endoscopy, and operative evaluation in cases where oncological resections were performed. CTVE appears to be most accurate in evaluation of subglottic and nasopharyngeal anatomy in our series of patients. CTVE is a new radiographic technique that provides surface-contour details. The technique is undergoing rapid technical evolution, and although the image quality is limited in situations where there is apposition of tissue folds, there are a number of potential applications for this new imaging technique.

  19. EvolQG - An R package for evolutionary quantitative genetics [version 2; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Diogo Melo

    2016-06-01

    Full Text Available We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification.
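
    A numpy sketch of one correlation-based matrix comparison of the kind the package provides (random skewers): two covariance matrices are judged similar if they predict similar responses, via the Lande equation Δz = Gβ, to many random selection gradients. The matrices here are toy inputs, and this is a Python illustration rather than the package's R code.

        # Random-skewers comparison of two covariance matrices.
        import numpy as np

        def random_skewers(G1, G2, n=1000, seed=0):
            rng = np.random.default_rng(seed)
            p = G1.shape[0]
            corrs = []
            for _ in range(n):
                beta = rng.standard_normal(p)
                beta /= np.linalg.norm(beta)      # random unit selection gradient
                r1, r2 = G1 @ beta, G2 @ beta     # predicted responses (Lande equation)
                corrs.append(r1 @ r2 / (np.linalg.norm(r1) * np.linalg.norm(r2)))
            return float(np.mean(corrs))          # mean vector correlation of responses

        A = np.eye(4)
        B = np.eye(4) + 0.2                       # toy covariance matrices
        print(random_skewers(A, B))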

  20. Stochastic Computational Approach for Complex Nonlinear Ordinary Differential Equations

    International Nuclear Information System (INIS)

    Khan, Junaid Ali; Raja, Muhammad Asif Zahoor; Qureshi, Ijaz Mansoor

    2011-01-01

    We present an evolutionary computational approach for the solution of nonlinear ordinary differential equations (NLODEs). The mathematical modeling is performed by a feed-forward artificial neural network that defines an unsupervised error. The training of these networks is achieved by a hybrid intelligent algorithm, a combination of global search with a genetic algorithm and local search by a pattern search technique. The applicability of this approach ranges from single-order NLODEs to systems of coupled differential equations. We illustrate the method by solving a variety of model problems and present comparisons with solutions obtained by exact methods and classical numerical methods. Unlike the other numerical techniques, the solution is provided on a continuous finite time interval, at comparable accuracy. With the advent of neuroprocessors and digital signal processors the method becomes particularly interesting due to the expected substantial gains in execution speed. (general)
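
    A minimal sketch of the unsupervised-error idea for the toy problem y' = -y, y(0) = 1: a small tanh network enters a trial solution that satisfies the initial condition by construction, and the squared ODE residual at collocation points is minimized. For brevity a generic scipy optimizer stands in for the paper's GA-plus-pattern-search hybrid.

        # Neural trial solution y(t) = 1 + t*N(t,w); minimize the ODE residual y' + y.
        import numpy as np
        from scipy.optimize import minimize

        t = np.linspace(0.0, 1.0, 25)     # collocation points
        H = 5                             # hidden units

        def trial(w, t):
            a, b, v = w[:H], w[H:2*H], w[2*H:]
            net = np.tanh(np.outer(t, a) + b) @ v
            return 1.0 + t * net          # satisfies y(0) = 1 by construction

        def residual(w):
            eps = 1e-4                    # numerical derivative of the trial solution
            dy = (trial(w, t + eps) - trial(w, t - eps)) / (2 * eps)
            return np.sum((dy + trial(w, t)) ** 2)   # unsupervised error for y' = -y

        w0 = 0.5 * np.random.default_rng(1).standard_normal(3 * H)
        res = minimize(residual, w0, method="BFGS")
        print("max error vs exp(-t):", np.abs(trial(res.x, t) - np.exp(-t)).max())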

  1. Statistical mechanics of spatial evolutionary games

    International Nuclear Information System (INIS)

    Miekisz, Jacek

    2004-01-01

    We discuss the long-run behaviour of stochastic dynamics of many interacting players in spatial evolutionary games. In particular, we investigate the effect of the number of players and the noise level on the stochastic stability of Nash equilibria. We discuss similarities and differences between systems of interacting players maximizing their individual payoffs and particles minimizing their interaction energy. We use concepts and techniques of statistical mechanics to study game-theoretic models. In order to obtain results in the case of the so-called potential games, we analyse the thermodynamic limit of the appropriate models of interacting particles.

  2. Quasistatic zooming of FDTD E-field computations: the impact of down-scaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Van de Kamer, J.B.; Kroeze, H.; De Leeuw, A.A.C.; Lagendijk, J.J.W. [Department of Radiotherapy, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX, Utrecht (Netherlands)

    2001-05-01

    Due to current computer limitations, regional hyperthermia treatment planning (HTP) is practically limited to a resolution of 1 cm, whereas a millimetre resolution is desired. Using the centimetre resolution E-vector-field distribution, computed with, for example, the finite-difference time-domain (FDTD) method, and the millimetre resolution patient anatomy, it is possible to obtain a millimetre resolution SAR distribution in a volume of interest (VOI) by means of quasistatic zooming. To compute the required low-resolution E-vector-field distribution, a low-resolution dielectric geometry is needed, which is constructed by down-scaling the millimetre resolution dielectric geometry. In this study we have investigated which down-scaling technique results in a dielectric geometry that yields the best low-resolution E-vector-field distribution as input for quasistatic zooming. A segmented 2 mm resolution CT data set of a patient has been down-scaled to 1 cm resolution using three different techniques: 'winner-takes-all', 'volumetric averaging' and 'anisotropic volumetric averaging'. The E-vector-field distributions computed for those low-resolution dielectric geometries have been used as input for quasistatic zooming. The resulting zoomed-resolution SAR distributions were compared with a reference: the 2 mm resolution SAR distribution computed with the FDTD method. For both a simple phantom and the complex partial patient geometry, down-scaling using 'anisotropic volumetric averaging' resulted in zoomed-resolution SAR distributions that best approximated the corresponding high-resolution SAR distribution (correlations of 97% and 96%, and absolute averaged differences of 6% and 14%, respectively). (author)
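
    A numpy sketch of the two scalar down-scaling rules compared above, for a factor-of-5 coarsening (2 mm to 1 cm): 'winner-takes-all' keeps the most frequent tissue label per block, while 'volumetric averaging' averages a material property over the block. The anisotropic variant is omitted, and the grid and labels are toy data.

        # Block-wise down-scaling of a labeled 3D dielectric grid.
        import numpy as np

        def blocks(a, f):
            x, y, z = (s // f for s in a.shape)
            a = a[:x*f, :y*f, :z*f].reshape(x, f, y, f, z, f)
            return a.transpose(0, 2, 4, 1, 3, 5).reshape(x, y, z, -1)

        def winner_takes_all(labels, f=5):
            b = blocks(labels, f)
            # most frequent tissue label in each f^3 block
            return np.apply_along_axis(lambda v: np.bincount(v).argmax(), -1, b)

        def volumetric_average(prop, f=5):
            return blocks(prop, f).mean(axis=-1)

        labels = np.random.default_rng(0).integers(0, 4, (50, 50, 50))
        print(winner_takes_all(labels).shape, volumetric_average(labels.astype(float)).shape)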

  3. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Martinez B, M. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, Jose Gutierrez Abascal No. 2, E-28006 Madrid (Spain)], e-mail: morvymmyahoo@com.mx

    2009-10-15

    Neutron dosimetry is one of the most complicated tasks in radiation protection, because it is a complex technique whose results depend strongly on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometric system, which remains one of the most commonly used spectrometers. This system has disadvantages: the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction that requires an expert user, as well as the need for a reconstruction code such as BUNKIE, SAND, etc. These codes are based on iterative reconstruction algorithms, whose greatest inconvenience is that the user must supply an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have noted the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence have been reported, such as genetic algorithms, artificial neural networks and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, and it has been suggested that more research be conducted to resolve these disadvantages. Because they are emerging technologies, there are no tools for analyzing their results, so in this paper we first present the design of a computation tool that allows analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive and easy-to-operate graphical user environment. Program execution is fast, completing an analysis in a few seconds, and the obtained information can be stored and/or printed.

  4. A novel technique for presurgical nasoalveolar molding using computer-aided reverse engineering and rapid prototyping.

    Science.gov (United States)

    Yu, Quan; Gong, Xin; Wang, Guo-Min; Yu, Zhe-Yuan; Qian, Yu-Fen; Shen, Gang

    2011-01-01

    To establish a new method of presurgical nasoalveolar molding (NAM) using computer-aided reverse engineering and rapid prototyping techniques in infants with unilateral cleft lip and palate (UCLP). Five infants (2 males and 3 females, mean age 1.2 weeks) with complete UCLP were recruited. All patients were subjected to NAM before cleft lip repair. The upper denture casts were recorded using a three-dimensional laser scanner within 2 weeks after birth in the UCLP infants. A digital model was constructed and analyzed to simulate the NAM procedure with reverse engineering software. The digital geometrical data were exported to print the solid model with a rapid prototyping system. The whole set of appliances was fabricated based on these solid models. Laser scanning and digital model construction simplified the NAM procedure and the estimation of the treatment objective. The appliances were fabricated based on the rapid prototyping technique, and for each patient the complete set of appliances could be obtained at one time. By the end of the presurgical NAM treatment, the cleft was narrowed, and the malformation of the nasoalveolar segments was aligned normally. We have developed a novel technique of presurgical NAM based on computer-aided design. An accurate digital denture model of UCLP infants could be obtained with laser scanning. The treatment design and appliance fabrication could be simplified with computer-aided reverse engineering and rapid prototyping techniques.

  5. An evolutionary perspective on anti-tumor immunity

    Directory of Open Access Journals (Sweden)

    David John Klinke

    2013-01-01

    Full Text Available The challenges associated with demonstrating a durable response using molecular targeted therapies in cancer have sparked a renewed interest in viewing cancer from an evolutionary perspective. Evolutionary processes have three common traits: heterogeneity, dynamics, and a selective fitness landscape. Mutagens randomly alter the genome of host cells, creating a population of cells that contain different somatic mutations. This genomic rearrangement perturbs cellular homeostasis by changing how cells interact with their tissue microenvironment. To counterbalance the ability of mutated cells to outcompete others for limited resources, control structures are encoded within the cell and within the organ system, such as innate and adaptive immunity, to restore cellular homeostasis. These control structures shape the selective fitness landscape and determine whether a cell that harbors particular somatic mutations is retained or eliminated from a cell population. While next-generation sequencing has revealed the complexity and heterogeneity of oncogenic transformation, understanding the dynamics of oncogenesis and how cancer cells alter the selective fitness landscape remains unclear. In this technology review, we summarize how recent advances in technology have impacted our understanding of these three attributes of cancer as an evolutionary process. In particular, we focus on how advances in genome sequencing have enabled quantifying cellular heterogeneity, advances in computational power have enabled explicit testing of postulated intra- and intercellular control structures against the available data using simulation, and advances in proteomics have enabled identifying novel mechanisms of cellular cross-talk that cancer cells use to alter the fitness landscape.

  6. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulations, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
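
    A minimal sketch of the simplest algorithm in the toolbox, ABC rejection sampling: draw parameters from the prior, simulate data, and keep draws whose summary statistic lands within a tolerance of the observed one. The Poisson model, prior and tolerance are toy choices, not ABCtoolbox defaults.

        # ABC rejection sampling on a toy model.
        import numpy as np

        rng = np.random.default_rng(0)
        obs_mean = 2.0                       # observed summary statistic

        def simulate(theta, n=100):          # toy stand-in for a genetic simulator
            return rng.poisson(theta, n).mean()

        accepted = []
        while len(accepted) < 500:
            theta = rng.uniform(0.0, 10.0)   # draw from the prior
            if abs(simulate(theta) - obs_mean) < 0.1:   # tolerance check
                accepted.append(theta)

        post = np.array(accepted)            # approximate posterior sample
        print(post.mean(), np.quantile(post, [0.025, 0.975]))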

  7. Quantifying evolutionary dynamics from variant-frequency time series

    Science.gov (United States)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.
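
    The variance-stabilizing property that makes the transformation useful, in textbook Wright-Fisher notation (this is the standard identity, not the paper's full short-time density):

        % Pure drift, allele frequency p, effective population size N:
        \mathrm{Var}(\Delta p) \approx \frac{p(1-p)}{2N}, \qquad
        x = 2\arcsin\sqrt{p} \;\Rightarrow\;
        \mathrm{Var}(\Delta x) \approx \left(\frac{dx}{dp}\right)^{\!2}\mathrm{Var}(\Delta p)
        = \frac{1}{p(1-p)}\cdot\frac{p(1-p)}{2N} = \frac{1}{2N}.

    After the transformation the drift noise is approximately frequency-independent, which is what makes departures caused by selection easier to read off a time series.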

  8. More efficient evolutionary strategies for model calibration with watershed model for demonstration

    Science.gov (United States)

    Baggett, J. S.; Skahill, B. E.

    2008-12-01

    Evolutionary strategies allow automatic calibration of more complex models than traditional gradient based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but which when combined have been shown to dramatically decrease the number of model runs required for calibration of synthetic problems. To reduce the number of expensive model runs we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability and further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but which when selected near a smooth local minimum can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMA-ES; Hansen, 2006), that their synergetic effect is greater than that of their individual parts. This hybrid evolutionary strategy exploits smooth structure when it is present but degrades to an ordinary evolutionary strategy, at worst, if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMA-ES requires approximately 10%-25% of the model runs of ordinary CMA-ES. Preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.). Towards a new evolutionary computation. Advances in estimation of
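
    For orientation, the plain ask/tell loop of a standard CMA-ES implementation (Hansen's cma package), into which the surrogate screening and the occasional gradient individual described above would be inserted at the evaluation step; the sphere objective is a toy stand-in for a watershed model run.

        # Baseline CMA-ES loop; the paper's enhancements act where the objective
        # is evaluated (surrogate screening, gradient individual).
        import cma

        def objective(x):                       # toy stand-in for a model run
            return sum(xi ** 2 for xi in x)

        es = cma.CMAEvolutionStrategy(8 * [1.0], 0.5)   # x0, initial sigma
        while not es.stop():
            candidates = es.ask()               # sample the search distribution
            es.tell(candidates, [objective(x) for x in candidates])
        print(es.result.xbest)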

  9. An Evolutionary Formulation of the Crossing Number Problem

    Directory of Open Access Journals (Sweden)

    Che Sheng Gan

    2009-01-01

    Full Text Available A graph drawing algorithm is presented which produces drawings of complete graphs with minimum crossing numbers equal to those of Guy's conjecture. It is then generalized and formulated in an evolutionary algorithm (EA) to perform a constrained search for the crossing numbers. The main objective of this work is to present a suitable two-dimensional scheme which can greatly reduce the complexity of finding crossing numbers by computer. Program performance criteria are presented and discussed. It is shown that the EA implementation provides good confirmation of the predicted crossing numbers.
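
    For reference, the conjectured minimum that the evolved drawings attain (Guy's conjecture; proved only for small n):

        \mathrm{cr}(K_n) \;=\; \frac{1}{4}
        \left\lfloor\frac{n}{2}\right\rfloor
        \left\lfloor\frac{n-1}{2}\right\rfloor
        \left\lfloor\frac{n-2}{2}\right\rfloor
        \left\lfloor\frac{n-3}{2}\right\rfloor .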

  10. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation of users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through a wireless network and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attacks and encourage users to cooperate actively in application tasks.

  11. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation of users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through a wireless network and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attacks and encourage users to cooperate actively in application tasks.

  12. A Simple Technique for Securing Data at Rest Stored in a Computing Cloud

    Science.gov (United States)

    Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai

    "Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those application and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing Infrastructure. Our simple technique implemented with Open Source software solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical - a scanning application that validates external firewall implementations.

  13. 16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT)

    CERN Document Server

    Lokajicek, M; Tumova, N

    2015-01-01

    16th International workshop on Advanced Computing and Analysis Techniques in physics (ACAT). The ACAT workshop series, formerly AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), was created back in 1990. Its main purpose is to bring together researchers working on computing in physics, from both the physics and computer science sides, and to give them a chance to communicate with each other. It has established bridges between physics and computer science research, facilitating advances in our understanding of the Universe at its smallest and largest scales. With the Large Hadron Collider and many astronomy and astrophysics experiments collecting larger and larger amounts of data, such bridges are needed now more than ever. The 16th edition of ACAT aims to bring related researchers together, once more, to explore and confront the boundaries of computing, automatic data analysis and theoretical calculation technologies. It will create a forum for exchanging ideas among the fields an...

  14. Computer-aided auscultation learning system for nursing technique instruction.

    Science.gov (United States)

    Hou, Chun-Ju; Chen, Yen-Ting; Hu, Ling-Chen; Chuang, Chih-Chieh; Chiu, Yu-Hsien; Tsai, Ming-Shih

    2008-01-01

    Pulmonary auscultation is a physical assessment skill learned by nursing students for examining the respiratory system. Generally, a mannequin equipped with a sound simulator is used to teach auscultation techniques in group classroom demonstrations. However, nursing students cannot readily duplicate this learning environment for self-study. The advancement of electronic and digital signal processing technologies facilitates simulating this learning environment. This study aims to develop a computer-aided auscultation learning system for assisting teachers and nursing students in auscultation teaching and learning. The system provides teachers with signal recording and processing of lung sounds and immediate playback of lung sounds for students. A graphical user interface allows teachers to control the measuring device, draw lung sound waveforms, highlight lung sound segments of interest, and include descriptive text. Effects on learning lung sound auscultation were evaluated to verify the feasibility of the system. Fifteen nursing students voluntarily participated in the repeated experiment. The results of a paired t test showed that the auscultative abilities of the students were significantly improved by using the computer-aided auscultation learning system.

  15. Large fluctuations and fixation in evolutionary games

    International Nuclear Information System (INIS)

    Assaf, Michael; Mobilia, Mauro

    2010-01-01

    We study large fluctuations in evolutionary games belonging to the coordination and anti-coordination classes. The dynamics of these games, modeling cooperation dilemmas, is characterized by a coexistence fixed point separating two absorbing states. We are particularly interested in the problem of fixation that refers to the possibility that a few mutants take over the entire population. Here, the fixation phenomenon is induced by large fluctuations and is investigated by a semiclassical WKB (Wentzel–Kramers–Brillouin) theory generalized to treat stochastic systems possessing multiple absorbing states. Importantly, this method allows us to analyze the combined influence of selection and random fluctuations on the evolutionary dynamics beyond the weak selection limit often considered in previous works. We accurately compute, including pre-exponential factors, the probability distribution function in the long-lived coexistence state and the mean fixation time necessary for a few mutants to take over the entire population in anti-coordination games, and also the fixation probability in the coordination class. Our analytical results compare excellently with extensive numerical simulations. Furthermore, we demonstrate that our treatment is superior to the Fokker–Planck approximation when the selection intensity is finite.

  16. Quality-of-service sensitivity to bio-inspired/evolutionary computational methods for intrusion detection in wireless ad hoc multimedia sensor networks

    Science.gov (United States)

    Hortos, William S.

    2012-06-01

    In the author's previous work, a cross-layer protocol approach to wireless sensor network (WSN) intrusion detection and identification is created with multiple bio-inspired/evolutionary computational methods applied to the functions of the protocol layers, a single method to each layer, to improve the intrusion-detection performance of the protocol over that of one method applied to only a single layer's functions. The WSN cross-layer protocol design embeds GAs, anti-phase synchronization, ACO, and a trust model based on quantized data reputation at the physical, MAC, network, and application layer, respectively. The construct neglects to assess the net effect of the combined bio-inspired methods on the quality-of-service (QoS) performance for "normal" data streams, that is, streams without intrusions. Analytic expressions of throughput, delay, and jitter, coupled with simulation results for WSNs free of intrusion attacks, are the basis for sensitivity analyses of QoS metrics for normal traffic to the bio-inspired methods.

  17. Evolutionary Explanations of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Igor Kardum

    2008-12-01

    Full Text Available This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. In the second part, the selective pressures and evolved adaptations causing the contemporary epidemic of obesity, as well as differences in dietary regimes and life-style between modern humans and their ancestors, are described. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants derived from kin selection theory and the model of parental manipulation are elaborated. The sexual competition hypothesis of eating disorders, the adapted-to-flee-famine hypothesis, as well as explanations based on the concept of social attention holding power and the need to belong, are also explained. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.

  18. Evolutionary dynamics of fluctuating populations with strong mutualism

    Science.gov (United States)

    Chotibut, Thiparat; Nelson, David

    2013-03-01

    Evolutionary game theory with finite interacting populations is receiving increased attention, including subtle phenomena associated with number fluctuations, i.e., "genetic drift." Models of cooperation and competition often utilize a simplified Moran model, with a strictly fixed total population size. We explore a more general evolutionary model with independent fluctuations in the numbers of two distinct species, in a regime characterized by "strong mutualism." The model has two absorbing states, each corresponding to fixation of one of the two species, and allows exploration of the interplay between growth, competition, and mutualism. When mutualism is favored, number fluctuations eventually drive the system away from a stable fixed point, characterized by cooperation, to one of the absorbing states. Well-mixed populations will thus be taken over by a single species in a finite time, despite the bias towards cooperation. We calculate both the fixation probability and the mean fixation time as a function of the initial conditions and carrying capacities in the strong mutualism regime, using the method of matched asymptotic expansions. Our results are compared to computer simulations.

  19. The citation field of evolutionary economics

    NARCIS (Netherlands)

    Dolfsma, Wilfred; Leydesdorff, Loet

    2010-01-01

    Evolutionary economics has developed into an academic field of its own, institutionalized around, amongst others, the Journal of Evolutionary Economics (JEE). This paper analyzes the way and extent to which evolutionary economics has become an interdisciplinary journal, as its aim was: a journal

  20. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints that are useful in narrowing down the problem space for both cognition and neural mechanisms. I will provide two examples from our own studies that reinforce and extend Fitch's proposal.

  1. Towards a mechanistic foundation of evolutionary theory.

    Science.gov (United States)

    Doebeli, Michael; Ispolatov, Yaroslav; Simon, Burt

    2017-02-15

    Most evolutionary thinking is based on the notion of fitness and related ideas such as fitness landscapes and evolutionary optima. Nevertheless, it is often unclear what fitness actually is, and its meaning often depends on the context. Here we argue that fitness should not be a basal ingredient in verbal or mathematical descriptions of evolution. Instead, we propose that evolutionary birth-death processes, in which individuals give birth and die at ever-changing rates, should be the basis of evolutionary theory, because such processes capture the fundamental events that generate evolutionary dynamics. In evolutionary birth-death processes, fitness is at best a derived quantity, and owing to the potential complexity of such processes, there is no guarantee that there is a simple scalar, such as fitness, that would describe long-term evolutionary outcomes. We discuss how evolutionary birth-death processes can provide useful perspectives on a number of central issues in evolution.

  2. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    Science.gov (United States)

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging techniques and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  3. Evolutionary game theory for physical and biological scientists. I. Training and validating population dynamics equations.

    Science.gov (United States)

    Liao, David; Tlsty, Thea D

    2014-08-06

    Failure to understand evolutionary dynamics has been hypothesized as limiting our ability to control biological systems. An increasing awareness of similarities between macroscopic ecosystems and cellular tissues has inspired optimism that game theory will provide insights into the progression and control of cancer. To realize this potential, the ability to compare game theoretic models and experimental measurements of population dynamics should be broadly disseminated. In this tutorial, we present an analysis method that can be used to train parameters in game theoretic dynamics equations, used to validate the resulting equations, and used to make predictions to challenge these equations and to design treatment strategies. The data analysis techniques in this tutorial are adapted from the analysis of reaction kinetics using the method of initial rates taught in undergraduate general chemistry courses. Reliance on computer programming is avoided to encourage the adoption of these methods as routine bench activities.

  4. Evolutionary optimization methods for accelerator design

    Science.gov (United States)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as: ease of implementation, modest requirements on the objective function, a good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe GATool, the evolutionary algorithm and software package used in this work, in detail. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained

  5. Evolutionary thinking: "A conversation with Carter Phipps about the role of evolutionary thinking in modern culture".

    Science.gov (United States)

    Hunt, Tam

    2014-12-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution-both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place-has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps' book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging "integral" or "evolutionary" cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps.

  6. Where Evolutionary Psychology Meets Cognitive Neuroscience: A Précis to Evolutionary Cognitive Neuroscience

    Directory of Open Access Journals (Sweden)

    Austen L. Krill

    2007-01-01

    Full Text Available Cognitive neuroscience, the study of brain-behavior relationships, has long attempted to map the brain. The discipline is flourishing, with an increasing number of functional neuroimaging studies appearing in the scientific literature daily. Unlike biology and even psychology, the cognitive neurosciences have only recently begun to apply evolutionary meta-theory and methodological guidance. Approaching cognitive neuroscience from an evolutionary perspective allows scientists to apply biologically based theoretical guidance to their investigations, and such work can be conducted in both humans and nonhuman animals. In fact, several investigations of this sort are underway in laboratories around the world. This paper and two new volumes (Platek, Keenan, and Shackelford [Eds.], 2007; Platek and Shackelford [Eds.], under contract) represent the first formal attempts to document the burgeoning field of evolutionary cognitive neuroscience. Here, we briefly review the current state of the science of evolutionary cognitive neuroscience, the methods available to the evolutionary cognitive neuroscientist, and what we foresee as the future directions of the discipline.

  7. Culture belief based multi-objective hybrid differential evolutionary algorithm in short term hydrothermal scheduling

    International Nuclear Information System (INIS)

    Zhang Huifeng; Zhou Jianzhong; Zhang Yongchuan; Lu Youlin; Wang Yongqiang

    2013-01-01

    Highlights: ► Culture belief is integrated into multi-objective differential evolution. ► A chaotic sequence is introduced to improve evolutionary population diversity. ► An advantage in convergence rate is demonstrated on the hydrothermal problem. ► The results show the quality and potential of the proposed algorithm. - Abstract: A culture belief based multi-objective hybrid differential evolution (CB-MOHDE) is presented to solve the short term hydrothermal optimal scheduling with economic emission (SHOSEE) problem. This problem is formulated for compromising thermal cost and emission while considering its complicated non-linear constraints with non-smooth and non-convex characteristics. The proposed algorithm integrates a modified multi-objective differential evolutionary algorithm into the computation model of the culture algorithm (CA), together with communication protocols between the population space and the belief space; three knowledge structures in the belief space are redefined according to the problem-solving characteristics, and in the differential evolution a chaotic factor is embedded into the mutation operator to avoid premature convergence by enlarging the search scale when the search trajectory reaches local optima. Furthermore, a new heuristic constraint-handling technique is utilized to handle the complex equality and inequality constraints of the SHOSEE problem. After application to the hydrothermal scheduling system, the efficiency and stability of the proposed CB-MOHDE are verified by its more desirable results in comparison with other methods established recently, and the simulation results also reveal that CB-MOHDE can be a promising alternative for solving SHOSEE.
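
    A sketch of the chaotic-factor idea from the highlights: a logistic-map sequence modulates the DE/rand/1 scaling factor so the search scale keeps being perturbed, helping escape local optima. This is a minimal single-objective DE step with a toy cost, not the full CB-MOHDE with its belief space.

        # DE/rand/1 with a chaotic (logistic-map) scaling factor.
        import numpy as np

        rng = np.random.default_rng(0)
        def cost(x):                              # toy stand-in for cost/emission
            return np.sum(x ** 2 + 10 * np.sin(x))

        pop = rng.uniform(-5, 5, (20, 3))
        z = 0.37                                  # logistic-map state
        for gen in range(100):
            z = 4.0 * z * (1.0 - z)               # chaotic sequence in (0, 1)
            F = 0.4 + 0.5 * z                     # chaotic scaling factor
            for i in range(len(pop)):
                a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
                trial = np.where(rng.random(3) < 0.9, a + F * (b - c), pop[i])
                if cost(trial) < cost(pop[i]):    # greedy selection
                    pop[i] = trial
        best = min(pop, key=cost)
        print(best, cost(best))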

  8. Essays on nonlinear evolutionary game dynamics

    NARCIS (Netherlands)

    Ochea, M.I.

    2010-01-01

    Evolutionary game theory has been viewed as an evolutionary repair of rational actor game theory in the hope that a population of boundedly rational players may attain convergence to classic rational solutions, such as the Nash Equilibrium, via some learning or evolutionary process. In this thesis

  9. Integrating genomics into evolutionary medicine.

    Science.gov (United States)

    Rodríguez, Juan Antonio; Marigorta, Urko M; Navarro, Arcadi

    2014-12-01

    The application of the principles of evolutionary biology into medicine was suggested long ago and is already providing insight into the ultimate causes of disease. However, a full systematic integration of medical genomics and evolutionary medicine is still missing. Here, we briefly review some cases where the combination of the two fields has proven profitable and highlight two of the main issues hindering the development of evolutionary genomic medicine as a mature field, namely the dissociation between fitness and health and the still considerable difficulties in predicting phenotypes from genotypes. We use publicly available data to illustrate both problems and conclude that new approaches are needed for evolutionary genomic medicine to overcome these obstacles. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Multi-objective optimization of an arch dam shape under static loads using an evolutionary game method

    Science.gov (United States)

    Meng, Rui; Cheong, Kang Hao; Bao, Wei; Wong, Kelvin Kian Loong; Wang, Lu; Xie, Neng-gang

    2018-06-01

    This article attempts to evaluate the safety and economic performance of an arch dam under the action of static loads. The geometric description of a crown cantilever section and the horizontal arch ring is presented. A three-objective optimization model of arch dam shape is established based on the arch dam volume, maximum principal tensile stress and total strain energy. The evolutionary game method is then applied to obtain the optimal solution. In the evolutionary game technique, a novel and more efficient exploration method of the game players' strategy space, named the 'sorting partition method under the threshold limit', is presented, with the game profit functions constructed according to both competitive and cooperative behaviour. By way of example, all three optimization goals show improvements over the initial solutions. In particular, the evolutionary game method has potentially faster convergence. This provides a preliminary proof of principle of the evolutionary game method.

  11. Form of an evolutionary tradeoff affects eco-evolutionary dynamics in a predator-prey system.

    Science.gov (United States)

    Kasada, Minoru; Yamamichi, Masato; Yoshida, Takehito

    2014-11-11

    Evolution on a time scale similar to ecological dynamics has been increasingly recognized for the last three decades. Selection mediated by ecological interactions can change heritable phenotypic variation (i.e., evolution), and evolution of traits, in turn, can affect ecological interactions. Hence, ecological and evolutionary dynamics can be tightly linked and important to predict future dynamics, but our understanding of eco-evolutionary dynamics is still in its infancy and there is a significant gap between theoretical predictions and empirical tests. Empirical studies have demonstrated that the presence of genetic variation can dramatically change ecological dynamics, whereas theoretical studies predict that eco-evolutionary dynamics depend on the details of the genetic variation, such as the form of a tradeoff among genotypes, which can be more important than the presence or absence of the genetic variation. Using a predator-prey (rotifer-algal) experimental system in laboratory microcosms, we studied how different forms of a tradeoff between prey defense and growth affect eco-evolutionary dynamics. Our experimental results show for the first time to our knowledge that different forms of the tradeoff produce remarkably divergent eco-evolutionary dynamics, including near fixation, near extinction, and coexistence of algal genotypes, with quantitatively different population dynamics. A mathematical model, parameterized from completely independent experiments, explains the observed dynamics. The results suggest that knowing the details of heritable trait variation and covariation within a population is essential for understanding how evolution and ecology will interact and what form of eco-evolutionary dynamics will result.
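
    A minimal sketch of how the form of a tradeoff can be varied in such a model (all parameters and functional forms here are invented for illustration, not the paper's fitted model): two prey genotypes sit on a defense-growth curve whose exponent q controls whether the tradeoff is concave (q < 1) or convex (q > 1):

        import numpy as np
        from scipy.integrate import solve_ivp

        r_max, a, e, m, q = 1.0, 0.8, 0.5, 0.3, 2.0   # compare q = 0.5 vs q = 2.0
        defense = np.array([0.1, 0.9])                # genotype defense levels
        growth = r_max * (1.0 - defense**q)           # tradeoff: defense costs growth

        def rhs(t, y):
            n1, n2, p = y                             # two prey genotypes, one predator
            n = np.array([n1, n2])
            attack = a * (1.0 - defense)              # defense reduces predation
            dn = growth * n * (1 - (n1 + n2)) - attack * n * p
            dp = e * np.sum(attack * n) * p - m * p
            return [dn[0], dn[1], dp]

        sol = solve_ivp(rhs, (0, 400), [0.4, 0.4, 0.1], max_step=0.1)

    Rerunning with different q changes whether one genotype nearly fixes, nearly goes extinct, or both coexist, which is the qualitative point of the study.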

  12. Development of computational technique for labeling magnetic flux-surfaces

    International Nuclear Information System (INIS)

    Nunami, Masanori; Kanno, Ryutaro; Satake, Shinsuke; Hayashi, Takaya; Takamaru, Hisanori

    2006-03-01

    In recent Large Helical Device (LHD) experiments, radial profiles of ion temperature, electric field, etc. are measured in the m/n=1/1 magnetic island produced by island control coils, where m is the poloidal mode number and n the toroidal mode number. When the transport of the plasma in the radial profiles is numerically analyzed, an average over a magnetic flux-surface in the island is a very useful concept for understanding the transport. On averaging, a proper labeling of the flux-surfaces is necessary. In general, it is not easy to label the flux-surfaces in a magnetic field with an island, compared with the case of a magnetic field configuration having nested flux-surfaces. In the present paper, we have developed a new computational technique to label the magnetic flux-surfaces. This technique is constructed by using a well-known optimization algorithm, the simulated annealing method. The flux-surfaces are discerned by using two labels: one is a classification of the magnetic field structure, i.e., core, island, ergodic, and outside regions, and the other is the value of the toroidal magnetic flux. We have applied the technique to an LHD configuration with the m/n=1/1 island, and successfully obtained the discrimination of the magnetic field structure. (author)
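
    For reference, the skeleton of a simulated-annealing search of the kind mentioned is short (a generic sketch; the actual cost function over flux-surface labels is specific to the paper and is passed in here as a callable):

        import math, random

        def anneal(state, cost, neighbor, t0=1.0, cooling=0.995, steps=20000):
            """Generic simulated annealing: minimize cost(state)."""
            best = cur = state
            cur_cost = best_cost = cost(cur)
            t = t0
            for _ in range(steps):
                cand = neighbor(cur)                      # random local move
                dc = cost(cand) - cur_cost
                if dc < 0 or random.random() < math.exp(-dc / t):
                    cur, cur_cost = cand, cur_cost + dc   # accept (possibly uphill)
                    if cur_cost < best_cost:
                        best, best_cost = cur, cur_cost
                t *= cooling                              # geometric cooling schedule
            return best, best_cost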

  13. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from the intensive computational requirements of detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel computing version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of the future flow conditions at the site, aiding in the assessment of proposed repository performance.

  14. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    Science.gov (United States)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), this year the motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciations go to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  15. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
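
    The contingency part of such a workload parallelizes naturally, since each outage case is an independent power flow solve. A hedged sketch of that pattern (the paper's implementation targets thousands of cores; `screen_contingency` below is an illustrative placeholder, not their API):

        from multiprocessing import Pool

        def screen_contingency(outage):
            """Placeholder: remove the outaged element, re-solve the power
            flow, and return any limit violations found."""
            return outage, []   # stub result: no violations

        def run_contingency_analysis(outages, workers=8):
            # Each case is independent, so a process pool scales near-linearly;
            # guard the call with `if __name__ == "__main__":` when run as a script.
            with Pool(workers) as pool:
                return dict(pool.map(screen_contingency, outages))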

  16. Computer vision techniques applied to the quality control of ceramic plates

    OpenAIRE

    Silveira, Joaquim; Ferreira, Manuel João Oliveira; Santos, Cristina; Martins, Teresa

    2009-01-01

    This paper presents a system, based on computer vision techniques, that detects and quantifies different types of defects in ceramic plates. It was developed in collaboration with the industrial ceramic sector and consequently it was focused on the defects that are considered more quality depreciating by the Portuguese industry. They are of three main types: cracks; granules and relief surface. For each type the development was specific as far as image processing techn...

  17. Evolutionary and Comparative Genomics to Drive Rational Drug Design, with Particular Focus on Neuropeptide Seven-Transmembrane Receptors.

    Science.gov (United States)

    Furlong, Michael; Seong, Jae Young

    2017-01-01

    Seven transmembrane receptors (7TMRs), also known as G protein-coupled receptors, are popular targets of drug development, particularly 7TMR systems that are activated by peptide ligands. Although many pharmaceutical drugs have been discovered via conventional bulk analysis techniques, the increasing availability of structural and evolutionary data is facilitating a change to rational, targeted drug design. This article discusses the appeal of neuropeptide-7TMR systems as drug targets and provides an overview of concepts in the evolution of vertebrate genomes and gene families. Subsequently, methods that use evolutionary concepts and comparative analysis techniques to aid in gene discovery, gene function identification, and novel drug design are provided along with case study examples.

  18. Nano-computed tomography. Technique and applications; Nanocomputertomografie. Technik und Applikationen

    Energy Technology Data Exchange (ETDEWEB)

    Kampschulte, M.; Sender, J.; Litzlbauer, H.D.; Althoehn, U.; Schwab, J.D.; Alejandre-Lafont, E.; Martels, G.; Krombach, G.A. [University Hospital Giessen (Germany). Dept. of Diagnostic and Interventional Radiology; Langheinirch, A.C. [BG Trauma Hospital Frankfurt/Main (Germany). Dept. of Diagnostic and Interventional Radiology

    2016-02-15

    Nano-computed tomography (nano-CT) is an emerging, high-resolution cross-sectional imaging technique and represents a technical advancement of the established micro-CT technology. Based on the application of a transmission target X-ray tube, the focal spot size can be decreased down to diameters of less than 400 nanometers (nm). Together with specific detectors and examination protocols, a superior spatial resolution of up to 400 nm (10% MTF) can be achieved, thereby exceeding the resolution capacity of typical micro-CT systems. The technical concept of nano-CT imaging as well as the basics of specimen preparation are demonstrated exemplarily. Characteristics of atherosclerotic plaques (intraplaque hemorrhage and calcifications) in a murine model of atherosclerosis (ApoE(-/-)/LDLR(-/-) double-knockout mouse) are demonstrated in the context of superior spatial resolution in comparison to micro-CT. Furthermore, this article presents the application of nano-CT for imaging cerebral microcirculation (murine), lung structures (porcine), and trabecular microstructure (ovine) in contrast to micro-CT imaging. This review shows the potential of nano-CT as a radiological method in biomedical basic research and discusses the application of experimental, high-resolution CT techniques in consideration of other high-resolution cross-sectional imaging techniques.

  19. Efficient technique for computational design of thermoelectric materials

    Science.gov (United States)

    Núñez-Valdez, Maribel; Allahyari, Zahed; Fan, Tao; Oganov, Artem R.

    2018-01-01

    Efficient thermoelectric materials are highly desirable, and the quest for finding them has intensified as they could be promising alternatives to fossil energy sources. Here we present a general first-principles approach to predict, in multicomponent systems, efficient thermoelectric compounds. The method combines a robust evolutionary algorithm, a Pareto multiobjective optimization, density functional theory and a Boltzmann semi-classical calculation of thermoelectric efficiency. To test the performance and reliability of our overall framework, we use the well-known system Bi2Te3-Sb2Te3.
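
    The Pareto step of such a framework reduces, at its core, to a non-dominated filter over candidate compounds. A minimal sketch (the two objectives and their orientation are illustrative assumptions, not the paper's exact figure of merit):

        import numpy as np

        def pareto_front(objs):
            """objs: (n, m) array of objectives, all to be minimized.
            Returns a boolean mask selecting the non-dominated candidates."""
            n = objs.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                # candidate j dominates i if it is no worse everywhere
                # and strictly better somewhere
                dominates_i = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
                keep[i] = not dominates_i.any()
            return keep

        # e.g. columns: (-power_factor, lattice_thermal_conductivity), both minimized
        candidates = np.random.rand(50, 2)
        front = candidates[pareto_front(candidates)]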

  20. 17th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2016)

    International Nuclear Information System (INIS)

    2016-01-01

    Preface The 2016 edition of the International Workshop on Advanced Computing and Analysis Techniques in Physics Research took place on January 18-22, 2016, at the Universidad Técnica Federico Santa Maria -UTFSM- in Valparaiso, Chile. The present volume of IOP Conference Series is devoted to the selected scientific contributions presented at the workshop. In order to guarantee the scientific quality of the Proceedings, all papers were thoroughly peer-reviewed by an ad-hoc Editorial Committee with the help of many careful reviewers. The ACAT Workshop series has a long tradition starting in 1990 (Lyon, France), and takes place at intervals of a year and a half. Formerly these workshops were known under the name AIHENP (Artificial Intelligence for High Energy and Nuclear Physics). Each edition brings together experimental and theoretical physicists and computer scientists/experts from particle and nuclear physics, astronomy and astrophysics, in order to exchange knowledge and experience in computing and data analysis in physics. Three tracks cover the main topics: Computing technology: languages and system architectures. Data analysis: algorithms and tools. Theoretical Physics: techniques and methods. Although most contributions and discussions are related to particle physics and computing, other fields like condensed matter physics, earth physics and biophysics are often addressed, in the hope of sharing approaches and visions. The workshop created a forum for exchanging ideas among fields, exploring and promoting cutting-edge computing technologies and debating hot topics. (paper)

  1. Soft computing techniques toward modeling the water supplies of Cyprus.

    Science.gov (United States)

    Iliadis, L; Maris, F; Tachos, S

    2011-10-01

    This research effort aims at the application of soft computing techniques to water resources management. More specifically, the target is the development of reliable soft computing models capable of estimating the water supply for the case of the "Germasogeia" mountainous watersheds in Cyprus. Initially, ε-Regression Support Vector Machines (ε-RSVM) and fuzzy weighted ε-RSVM models have been developed that accept five input parameters. At the same time, reliable artificial neural networks have been developed to perform the same job. The 5-fold cross validation approach has been employed in order to eliminate bad local behaviors and to produce a more representative training data set. Thus, the fuzzy weighted Support Vector Regression (SVR) combined with the fuzzy partition has been employed in an effort to enhance the quality of the results. Several rational and reliable models have been produced that can enhance the efficiency of water policy designers. Copyright © 2011 Elsevier Ltd. All rights reserved.
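
    A hedged sketch of the modelling pipeline described, using scikit-learn's ε-SVR with 5-fold cross-validation (the five features and the data below are stand-ins, not the Cyprus watershed data):

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVR

        X = np.random.rand(120, 5)   # stand-ins for the five watershed inputs
        y = np.random.rand(120)      # stand-in for observed water supply

        # epsilon-SVR with standardized inputs, scored by 5-fold CV
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", epsilon=0.1, C=10.0))
        scores = cross_val_score(model, X, y, cv=5, scoring="r2")
        print(scores.mean())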

  2. A Novel Evolutionary Engineering Design Approach for Mixed-Domain Systems

    DEFF Research Database (Denmark)

    Fan, Zhun; Hu, J.; Seo, K.

    2004-01-01

    This paper presents an approach to engineering design of mixed-domain dynamic systems. The approach aims at system-level design and has two key features: first, it generates engineering designs that satisfy predefined specifications in an automatic manner; second, it can design systems belonging ... often encountered in evolutionary computation, an HFC (Hierarchical Fair Competition) model is adopted in this work. Examples of an analog filter design and a MEM filter design illustrate the application of the approach.

  3. An Evolutionary Real-Time 3D Route Planner for Aircraft

    Institute of Scientific and Technical Information of China (English)

    郑昌文; 丁明跃; 周成平

    2003-01-01

    A novel evolutionary route planner for aircraft is proposed in this paper. In the new planner, individual candidates are evaluated with respect to the workspace, thus the computation of the configuration space is not required. By using problem-specific chromosome structure and genetic operators, the routes are generated in real time, with different mission constraints such as minimum route leg length and flying altitude, maximum turning angle, maximum climbing/diving angle and route distance constraint taken into account.

  4. Calculating evolutionary dynamics in structured populations.

    Directory of Open Access Journals (Sweden)

    Charles G Nathanson

    2009-12-01

    Full Text Available Evolution is shaping the world around us. At the core of every evolutionary process is a population of reproducing individuals. The outcome of an evolutionary process depends on population structure. Here we provide a general formula for calculating evolutionary dynamics in a wide class of structured populations. This class includes the recently introduced "games in phenotype space" and "evolutionary set theory." There can be local interactions for determining the relative fitness of individuals, but we require global updating, which means all individuals compete uniformly for reproduction. We study the competition of two strategies in the context of an evolutionary game and determine which strategy is favored in the limit of weak selection. We derive an intuitive formula for the structure coefficient, sigma, and provide a method for efficient numerical calculation.
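
    The structure coefficient enters through a standard weak-selection condition: for a game with payoff matrix [[a, b], [c, d]], strategy A is favored over B when sigma*a + b > c + sigma*d, with sigma = 1 recovering a well-mixed population. A small worked check (the payoff values are illustrative):

        def a_is_favored(a, b, c, d, sigma):
            # Weak-selection condition with structure coefficient sigma
            return sigma * a + b > c + sigma * d

        # Prisoner's Dilemma with benefit 3, cost 1:
        # A = cooperate, B = defect; payoffs a = b-c, b = -c, c = b, d = 0
        b_, c_ = 3.0, 1.0
        payoff = dict(a=b_ - c_, b=-c_, c=b_, d=0.0)
        print(a_is_favored(sigma=1.0, **payoff))   # False: not favored when well-mixed
        print(a_is_favored(sigma=3.0, **payoff))   # True: enough structure favors it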

  5. Evolutionary Multiplayer Games

    OpenAIRE

    Gokhale, Chaitanya S.; Traulsen, Arne

    2014-01-01

    Evolutionary game theory has become one of the most diverse and far reaching theories in biology. Applications of this theory range from cell dynamics to social evolution. However, many applications make it clear that inherent non-linearities of natural systems need to be taken into account. One way of introducing such non-linearities into evolutionary games is by the inclusion of multiple players. An example is of social dilemmas, where group benefits could e.g. increase less than linearly wi...

  6. Generator Approach to Evolutionary Optimization of Catalysts and its Integration with Surrogate Modeling

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin; Linke, D.; Rodemerck, U.

    2011-01-01

    Roč. 159, č. 1 (2011), s. 84-95 ISSN 0920-5861 R&D Projects: GA ČR GA201/08/0802 Institutional research plan: CEZ:AV0Z10300504 Keywords : optimization of catalytic materials * evolutionary optimization * surrogate modeling * artificial neural networks * multilayer perceptron * regression boosting Subject RIV: IN - Informatics, Computer Science Impact factor: 3.407, year: 2011

  7. A Multi Agent System for Flow-Based Intrusion Detection Using Reputation and Evolutionary Computation

    Science.gov (United States)

    2011-03-01

    A pertinent example of the application of Evolutionary Algorithms to pattern recognition comes from Radtke et al., who apply a multi-objective memetic algorithm for intelligent feature extraction.

  8. Asymmetric Evolutionary Games

    Science.gov (United States)

    McAvoy, Alex; Hauert, Christoph

    2015-01-01

    Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner’s Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games. PMID:26308326

  9. Multi-agent evolutionary systems for the generation of complex virtual worlds

    Directory of Open Access Journals (Sweden)

    J. Kruse

    2016-01-01

    Full Text Available Modern films, games and virtual reality applications are dependent on convincing computer graphics. Highly complex models are a requirement for the successful delivery of many scenes and environments. While workflows such as rendering, compositing and animation have been streamlined to accommodate increasing demands, modelling complex models is still a laborious task. This paper introduces the computational benefits of an Interactive Genetic Algorithm (IGA) to computer graphics modelling while compensating for the effects of user fatigue, a common issue with Interactive Evolutionary Computation. An intelligent agent is used in conjunction with an IGA that offers the potential to reduce the effects of user fatigue by learning from the choices made by the human designer and directing the search accordingly. This workflow accelerates the layout and distribution of basic elements to form complex models. It captures the designer’s intent through interaction, and encourages playful discovery.
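
    A hedged sketch of the workflow described: a learned stand-in (here a simple nearest-neighbour scorer, an assumption of this sketch rather than the paper's agent) rates most candidates, and the human designer is queried only occasionally, limiting fatigue:

        import numpy as np

        def predicted_score(candidate, rated, scores, k=3):
            """Score a candidate from the k nearest previously user-rated designs."""
            d = np.linalg.norm(rated - candidate, axis=1)
            return scores[np.argsort(d)[:k]].mean()

        def iga_generation(pop, rated, scores, ask_user, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            fitness = np.array([predicted_score(ind, rated, scores) for ind in pop])
            i = rng.integers(len(pop))
            fitness[i] = ask_user(pop[i])       # one real designer query per generation
            parents = pop[np.argsort(fitness)[-(len(pop) // 2):]]   # keep the top half
            children = parents + rng.normal(0, 0.1, parents.shape)  # mutate
            return np.vstack([parents, children])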

  10. Computer aided production of manufacturing CAMAC-wired boards by the multiwire-technique

    Energy Technology Data Exchange (ETDEWEB)

    Martini, M; Brehmer, W

    1975-10-01

    The multiwire-technique is a computer controlled wiring method for the manufacturing of circuit boards with insulated conductors. The technical data for production are dimensional drawings of the board and a list of all points which are to be connected. The listing must be in absolute co-ordinates including a list of all soldering points for component parts and a reproducible print pattern for inscription. For this wiring method a CAMAC standard board, a layout plan with alpha-numeric symbols, and a computer program which produces the essential technical data were developed. A description of the alpha-numeric symbols, the quality of the program, recognition and checking of these symbols, and the produced technical data is presented. (auth)

  11. Evolutionary global optimization, manifolds and applications

    CERN Document Server

    Aguiar e Oliveira Junior, Hime

    2016-01-01

    This book presents powerful techniques for solving global optimization problems on manifolds by means of evolutionary algorithms, and shows in practice how these techniques can be applied to solve real-world problems. It describes recent findings and well-known key facts in general and differential topology, revisiting them all in the context of application to current optimization problems. Special emphasis is put on game theory problems. Here, these problems are reformulated as constrained global optimization tasks and solved with the help of Fuzzy ASA. In addition, more abstract examples, including minimizations of well-known functions, are also included. Although the Fuzzy ASA approach has been chosen as the main optimizing paradigm, the book suggests that other metaheuristic methods could be used as well. Some of them are introduced, together with their advantages and disadvantages. Readers should possess some knowledge of linear algebra, and of basic concepts of numerical analysis and probability theory....

  12. Artificial intelligence in pharmaceutical product formulation: neural computing

    Directory of Open Access Journals (Sweden)

    Svetlana Ibrić

    2009-10-01

    Full Text Available The properties of a formulation are determined not only by the ratios in which the ingredients are combined but also by the processing conditions. Although the relationships between the ingredient levels, processing conditions, and product performance may be known anecdotally, they can rarely be quantified. In the past, formulators tended to use statistical techniques to model their formulations, relying on response surfaces to provide a mechanism for optimization. However, the optimization by such a method can be misleading, especially if the formulation is complex. More recently, advances in mathematics and computer science have led to the development of alternative modeling and data mining techniques which work with a wider range of data sources: neural networks (an attempt to mimic the processing of the human brain); genetic algorithms (an attempt to mimic the evolutionary process by which biological systems self-organize and adapt); and fuzzy logic (an attempt to mimic the ability of the human brain to draw conclusions and generate responses based on incomplete or imprecise information). In this review the current technology will be examined, as well as its application in pharmaceutical formulation and processing. The challenges, benefits and future possibilities of neural computing will be discussed.

  13. Core principles of evolutionary medicine: A Delphi study.

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. For fourteen core principles, at least 80% of the panelists agreed or strongly agreed that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further.

  14. Conceptual Barriers to Progress Within Evolutionary Biology.

    Science.gov (United States)

    Laland, Kevin N; Odling-Smee, John; Feldman, Marcus W; Kendal, Jeremy

    2009-08-01

    In spite of its success, Neo-Darwinism is faced with major conceptual barriers to further progress, deriving directly from its metaphysical foundations. Most importantly, neo-Darwinism fails to recognize a fundamental cause of evolutionary change, "niche construction". This failure restricts the generality of evolutionary theory, and introduces inaccuracies. It also hinders the integration of evolutionary biology with neighbouring disciplines, including ecosystem ecology, developmental biology, and the human sciences. Ecology is forced to become a divided discipline, developmental biology is stubbornly difficult to reconcile with evolutionary theory, and the majority of biologists and social scientists are still unhappy with evolutionary accounts of human behaviour. The incorporation of niche construction as both a cause and a product of evolution removes these disciplinary boundaries while greatly generalizing the explanatory power of evolutionary theory.

  15. Computer vision techniques for rotorcraft low-altitude flight

    Science.gov (United States)

    Sridhar, Banavar; Cheng, Victor H. L.

    1988-01-01

    A description is given of research that applies techniques from computer vision to the automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data, however, presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. Some comments are made on future work and how research in this area relates to the guidance of other autonomous vehicles.

  16. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    Full Text Available The definition of an investment plan to implement in a distribution power system is a task constantly faced by utilities. This work presents a methodology for determining a short-term investment plan for a distribution power system, using as the criterion for evaluating investment projects their associated costs and the benefit customers obtain from their implementation. Given the number of projects carried out annually on the system, defining an investment plan requires computational tools to evaluate, among a set of possibilities, the one that best suits the present needs of the system and yields the best results. That is why this work implements the multi-objective evolutionary algorithm SPEA (Strength Pareto Evolutionary Algorithm), which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of the operators on the performance of the evolutionary algorithm and present results.
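
    As a reference point for SPEA's fitness assignment, here is a minimal sketch of the published scheme (investment plans reduced to objective vectors under a minimization convention; lower fitness is better): each non-dominated (archive) plan gets a strength proportional to how many candidate plans it dominates, and each candidate accumulates the strengths of the archive members dominating it.

        import numpy as np

        def dominates(u, v):
            # u dominates v: no worse in every objective, strictly better in one
            return bool(np.all(u <= v) and np.any(u < v))

        def spea_fitness(archive, pop):
            n = len(pop)
            strength = [sum(dominates(a, p) for p in pop) / (n + 1.0)
                        for a in archive]
            return np.array([1.0 + sum(s for a, s in zip(archive, strength)
                                       if dominates(a, p))
                             for p in pop])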

  17. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    These discussions included, among others, the possible consequences of non-DNA-based inheritance (epigenetics and cultural evolution), niche construction, and developmental mechanisms for our understanding of the evolutionary process, speciation, complexity in biology, and the construction of a formal evolutionary theory.

  18. Full parallax three-dimensional computer generated hologram with occlusion effect using ray casting technique

    International Nuclear Information System (INIS)

    Zhang, Hao; Tan, Qiaofeng; Jin, Guofan

    2013-01-01

    Holographic display is capable of reconstructing the whole optical wave field of a three-dimensional (3D) scene. It is the only one among all the 3D display techniques that can produce all the depth cues. With the development of computing technology and spatial light modulators, computer generated holograms (CGHs) can now be used to produce dynamic 3D images of synthetic objects. Computational holography becomes highly complicated and demanding when it is employed to produce real 3D images. Here we present a novel algorithm for generating a full parallax 3D CGH with occlusion effect, which is an important property of 3D perception but has often been neglected in fully computed hologram synthesis. The ray casting technique, which is widely used in computer graphics, is introduced to handle the occlusion issue of CGH computation. Horizontally and vertically distributed rays are projected from each hologram sample to the 3D objects to obtain the complex amplitude distribution. The occlusion issue is handled by performing ray casting calculations for all the hologram samples. The proposed algorithm has no restriction on or approximation to the 3D objects, and hence it can produce reconstructed images with correct shading effect and no visible artifacts. A programmable graphics processing unit (GPU) is used to perform parallel calculation; this is possible because each hologram sample can be computed independently. To demonstrate the performance of our proposed algorithm, an optical experiment is performed to reconstruct the 3D scene by using a phase-only spatial light modulator. We can easily perceive the accommodation cue by focusing our eyes on different depths of the scene and the motion parallax cue with occlusion effect by moving our eyes around. The experiment result confirms that the CGHs produced by our algorithm can successfully reconstruct 3D images with all the depth cues.
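
    A schematic sketch of the occlusion handling (greatly simplified relative to the paper: ray directions are binned coarsely, and sampling density and normalization are ignored): for each hologram sample, the nearest scene point along a given ray direction contributes to the complex field and blocks points behind it.

        import numpy as np

        def hologram_sample(sample_xy, points, amps, wavelength):
            """Complex field at one hologram sample from the visible scene points."""
            k = 2 * np.pi / wavelength
            origin = np.array([sample_xy[0], sample_xy[1], 0.0])
            vecs = points - origin                    # rays toward every scene point
            r = np.linalg.norm(vecs, axis=1)
            dirs = vecs / r[:, None]
            field = 0.0 + 0.0j
            claimed = set()
            for i in np.argsort(r):                   # nearest point first
                ray = tuple(np.round(dirs[i], 2))     # coarse angular bin = one "ray"
                if ray in claimed:
                    continue                          # occluded by a closer point
                claimed.add(ray)
                field += amps[i] * np.exp(1j * k * r[i]) / r[i]
            return field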

  19. NNLO computational techniques: The cases H→γγ and H→gg

    Science.gov (United States)

    Actis, Stefano; Passarino, Giampiero; Sturm, Christian; Uccirati, Sandro

    2009-04-01

    A large set of techniques needed to compute decay rates at the two-loop level are derived and systematized. The main emphasis of the paper is on the two Standard Model decays H→γγ and H→gg. The techniques, however, have a much wider range of application: they give practical examples of general rules for two-loop renormalization; they introduce simple recipes for handling internal unstable particles in two-loop processes; they illustrate simple procedures for the extraction of collinear logarithms from the amplitude. The latter is particularly relevant to show cancellations, e.g. cancellation of collinear divergencies. Furthermore, the paper deals with the proper treatment of non-enhanced two-loop QCD and electroweak contributions to different physical (pseudo-)observables, showing how they can be transformed in a way that allows for a stable numerical integration. Numerical results for the two-loop percentage corrections to H→γγ,gg are presented and discussed. When applied to the process pp→gg+X→H+X, the results show that the electroweak scaling factor for the cross section is between -4% and +6% in the range 100 GeV < M_H < 500 GeV, without incongruent large effects around the physical electroweak thresholds, thereby showing that only a complete implementation of the computational scheme keeps two-loop corrections under control.

  20. An innovative privacy preserving technique for incremental datasets on cloud computing.

    Science.gov (United States)

    Aldeen, Yousra Abdul Alsahib S; Salleh, Mazleena; Aljeroudi, Yazan

    2016-08-01

    Cloud computing (CC) is a magnificent service-based delivery model with gigantic computer processing power and data storage across connected communication channels. It has given overwhelming technological impetus to the internet (web) mediated IT industry, where users can easily share private data for further analysis and mining. Furthermore, user-friendly CC services enable sundry applications to be deployed economically. Meanwhile, simple data sharing has invited various phishing attacks and malware-assisted security threats. Some privacy-sensitive applications, like health services on the cloud, that are built with several economic and operational benefits necessitate enhanced security. Thus, absolute cyberspace security and mitigation against phishing attacks became mandatory to protect overall data privacy. Typically, datasets of diverse applications are anonymized to give better privacy to their owners, but without providing all secrecy requirements to newly added records. Some proposed techniques have addressed this issue by re-anonymizing the datasets from scratch. The utmost privacy protection over incremental datasets on CC is far from being achieved. Certainly, the distribution of huge dataset volumes across multiple storage nodes limits privacy preservation. In this view, we propose a new anonymization technique to attain better privacy protection with high data utility over distributed and incremental datasets on CC. The proficiency of data privacy preservation and improved confidentiality requirements is demonstrated through performance evaluation. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Self-Organized Criticality and Mass Extinction in Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Krink, Thiemo; Thomsen, Rene

    2001-01-01

    The gaps in the fossil record gave rise to the hypothesis that evolution proceeded in long periods of stasis, which alternated with occasional, rapid changes that yielded evolutionary progress. One mechanism that could cause these punctuated bursts is the re-colonization of changing and deserted... at a critical state between chaos and order, known as self-organized criticality (SOC). Based on this background, we used SOC to control the size of spatial extinction zones in a diffusion model. The SOC selection process was easy to implement and implied only negligible computational costs. Our results show...
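
    A hedged sketch of the mechanism (here avalanche sizes are drawn directly from a power law, the statistical signature of SOC, rather than from a full sandpile simulation; all parameter choices are assumptions of this sketch): the avalanche size sets the radius of a spatial extinction zone in a grid-structured population, which is then re-colonized from neighbouring cells in the next generation.

        import numpy as np

        def soc_extinction(grid_fitness, rng=None, alpha=2.0):
            """Wipe a square zone whose size follows a power-law distribution."""
            rng = np.random.default_rng() if rng is None else rng
            h, w = grid_fitness.shape
            size = int(min(h, w) * rng.pareto(alpha))      # power-law avalanche size
            cy, cx = rng.integers(h), rng.integers(w)
            y0, y1 = max(0, cy - size), min(h, cy + size + 1)
            x0, x1 = max(0, cx - size), min(w, cx + size + 1)
            grid_fitness[y0:y1, x0:x1] = -np.inf           # extinction zone
            return grid_fitness                            # re-colonized next generation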

  2. An Evolutionary Approach for Robust Layout Synthesis of MEMS

    DEFF Research Database (Denmark)

    Fan, Zhun; Wang, Jiachuan; Goodman, Erik

    2005-01-01

    The paper introduces a robust design method for layout synthesis of MEM resonators subject to inherent geometric uncertainties such as the fabrication error on the sidewall of the structure. The robust design problem is formulated as a multi-objective constrained optimisation problem after certain... assumptions and treated with a multiobjective genetic algorithm (MOGA), a special type of evolutionary computing approach. A case study based on layout synthesis of a comb-driven MEM resonator shows that the approach proposed in this paper can lead to design results that meet the target performance and are less...

  3. Research traditions and evolutionary explanations in medicine.

    Science.gov (United States)

    Méthot, Pierre-Olivier

    2011-02-01

    In this article, I argue that distinguishing 'evolutionary' from 'Darwinian' medicine will help us assess the variety of roles that evolutionary explanations can play in a number of medical contexts. Because the boundaries of evolutionary and Darwinian medicine overlap to some extent, however, they are best described as distinct 'research traditions' rather than as competing paradigms. But while evolutionary medicine does not stand out as a new scientific field of its own, Darwinian medicine is united by a number of distinctive theoretical and methodological claims. For example, evolutionary medicine and Darwinian medicine can be distinguished with respect to the styles of evolutionary explanations they employ. While the former primarily involves 'forward looking' explanations, the latter depends mostly on 'backward looking' explanations. A forward looking explanation tries to predict the effects of ongoing evolutionary processes on human health and disease in contemporary environments (e.g., hospitals). In contrast, a backward looking explanation typically applies evolutionary principles from the vantage point of humans' distant biological past in order to assess present states of health and disease. Both approaches, however, are concerned with the prevention and control of human diseases. In conclusion, I raise some concerns about the claim that 'nothing in medicine makes sense except in the light of evolution'.

  4. Applications of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.; Puranam, Krishna Kishore; Ravi Kumar Jain B.

    2008-01-01

    This paper is written as the first chapter of an edited volume on evolutionary economics and economic geography (Frenken, K., editor, Applied Evolutionary Economics and Economic Geography, Cheltenham: Edward Elgar, expected publication date February 2007). The paper reviews empirical applications of

  5. Development of a technique for three-dimensional image reconstruction from emission computed tomograms (ECT)

    International Nuclear Information System (INIS)

    Gerischer, R.

    1987-01-01

    The described technique for three-dimensional image reconstruction from ECT sections is based on a simple procedure, which can be carried out with the aid of any standard-type computer used in nuclear medicine and requires no sophisticated arithmetic approach. (TRV)

  6. Home and away- the evolutionary dynamics of homing endonucleases

    Directory of Open Access Journals (Sweden)

    Barzel Adi

    2011-11-01

    Full Text Available Abstract. Background: Homing endonucleases (HEases) are a large and diverse group of site-specific DNAases. They reside within self-splicing introns and inteins, and promote their horizontal dissemination. In recent years, HEases have been the focus of extensive research due to their promising potential use in gene targeting procedures for the treatment of genetic diseases and for the genetic engineering of crops, animal models and cell lines. Results: Using mathematical analysis and computational modeling, we present here a novel account of the evolution and population dynamics of HEase genes (HEGs). We describe HEGs as paradoxical selfish elements whose long-term persistence in a single population relies on low transmission rates and a positive correlation between transmission efficiency and toxicity. Conclusion: Plausible conditions allow HEGs to persist at high frequency through long evolutionary periods, with the endonuclease frequency being either at equilibrium or periodically oscillating.

  7. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    Science.gov (United States)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using the mind evolutionary algorithm to find optimal network weights and thresholds for a BP neural network can overcome the BP network's tendency to become trapped in local minima. The optimized network is used for time-series prediction and for same-month prediction, yielding two predictive values. These two predictive values are then fed into a neural network to obtain the final forecast value. The effectiveness of the method was verified by experiments with the energy consumption of three buildings in Hefei.
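
    A hedged sketch of the hybrid scheme described (the grouped elite-plus-perturbation step below is a crude stand-in for the mind evolutionary algorithm's similartaxis/dissimilation operators; the network size and data are invented): an evolutionary search selects initial weights that minimize prediction error before conventional BP/gradient training.

        import numpy as np

        def mlp_forward(w, X, hidden=8):
            """One-hidden-layer MLP; w packs W1, b1, W2, b2 in a flat vector."""
            d = X.shape[1]
            W1 = w[:d * hidden].reshape(d, hidden)
            b1 = w[d * hidden:d * hidden + hidden]
            W2 = w[d * hidden + hidden:-1]
            b2 = w[-1]
            return np.tanh(X @ W1 + b1) @ W2 + b2

        def evolve_init_weights(X, y, hidden=8, pop=40, gens=200, rng=None):
            rng = np.random.default_rng() if rng is None else rng
            n_w = X.shape[1] * hidden + hidden + hidden + 1
            P = rng.normal(0, 0.5, (pop, n_w))
            for _ in range(gens):
                err = np.array([np.mean((mlp_forward(w, X, hidden) - y) ** 2) for w in P])
                elite = P[np.argsort(err)[: pop // 4]]    # group winners survive
                P = np.vstack([elite + rng.normal(0, 0.1, elite.shape) for _ in range(4)])
            err = [np.mean((mlp_forward(w, X, hidden) - y) ** 2) for w in P]
            return P[int(np.argmin(err))]                 # seed for BP training

        X = np.random.rand(200, 4); y = np.sin(X.sum(axis=1))   # toy stand-in data
        w0 = evolve_init_weights(X, y)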

  8. Multislice Spiral Computed Tomography of the Heart: Technique, Current Applications, and Perspective

    International Nuclear Information System (INIS)

    Mahnken, Andreas H.; Wildberger, Joachim E.; Koos, Ralf; Guenther, Rolf W.

    2005-01-01

    Multislice spiral computed tomography (MSCT) is a rapidly evolving, noninvasive technique for cardiac imaging. Knowledge of the principle of electrocardiogram-gated MSCT and its limitations in clinical routine are needed to optimize image quality. Therefore, the basic technical principle including essentials of image postprocessing is described. Cardiac MSCT imaging was initially focused on coronary calcium scoring, MSCT coronary angiography, and analysis of left ventricular function. Recent studies also evaluated the ability of cardiac MSCT to visualize myocardial infarction and assess valvular morphology. In combination with experimental approaches toward the assessment of aortic valve function and myocardial viability, cardiac MSCT holds the potential for a comprehensive examination of the heart using one single examination technique.

  9. Measurement of mesothelioma on thoracic CT scans: A comparison of manual and computer-assisted techniques

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Oxnard, Geoffrey R.; MacMahon, Heber; Vogelzang, Nicholas J.; Kindler, Hedy L.; Kocherginsky, Masha; Starkey, Adam

    2004-01-01

    Our purpose in this study was to evaluate the variability of manual mesothelioma tumor thickness measurements in computed tomography (CT) scans and to assess the relative performance of six computerized measurement algorithms. The CT scans of 22 patients with malignant pleural mesothelioma were collected. In each scan, an initial observer identified up to three sites in each of three CT sections at which tumor thickness measurements were to be made. At each site, five observers manually measured tumor thickness through a computer interface. Three observers repeated these measurements during three separate sessions. Inter- and intra-observer variability in the manual measurement of tumor thickness was assessed. Six automated measurement algorithms were developed based on the geometric relationship between a specified measurement site and the automatically extracted lung regions. Computer-generated measurements were compared with manual measurements. The tumor thickness measurements of different observers were highly correlated (r≥0.99); however, the 95% limits of agreement for relative inter-observer difference spanned a range of 30%. Tumor thickness measurements generated by the computer algorithms also correlated highly with the average of observer measurements (r≥0.93). We have developed computerized techniques for the measurement of mesothelioma tumor thickness in CT scans. These techniques achieved varying levels of agreement with measurements made by human observers.

  10. Mathematics and evolutionary biology make bioinformatics education comprehensible

    Science.gov (United States)

    Weisstein, Anton E.

    2013-01-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes—the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software—the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a ‘two-culture’ problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, the curriculum too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses. PMID:23821621

  11. Mathematics and evolutionary biology make bioinformatics education comprehensible.

    Science.gov (United States)

    Jungck, John R; Weisstein, Anton E

    2013-09-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes-the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software-the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a 'two-culture' problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, the curriculum too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses.

  12. Bidirectional Dynamic Diversity Evolutionary Algorithm for Constrained Optimization

    Directory of Open Access Journals (Sweden)

    Weishang Gao

    2013-01-01

    Full Text Available Evolutionary algorithms (EAs) have been shown to be effective for complex constrained optimization problems. However, inflexible exploration-exploitation and improper penalties in EAs with penalty functions can lead to losing a global optimum near or on the constrained boundary. Determining an appropriate penalty coefficient is also difficult in most studies. In this paper, we propose a bidirectional dynamic diversity evolutionary algorithm (Bi-DDEA) with multiple agents guiding exploration-exploitation through local extrema to the global optimum in suitable steps. In Bi-DDEA, potential advantage is detected by three kinds of agents. The scale and the density of agents change dynamically according to the emergence of potentially optimal areas, which plays an important role in flexible exploration-exploitation. Meanwhile, a novel double optimum estimation strategy with objective fitness and penalty fitness is suggested to compute, respectively, the dominance trend of agents in the feasible region and the forbidden region. This bidirectional evolution with multiple agents can not only effectively avoid the problem of determining the penalty coefficient but also quickly converge to a global optimum near or on the constrained boundary. By examining the rapidity and veracity of Bi-DDEA across benchmark functions, the proposed method is shown to be effective.

  13. Evolutionary psychology: new perspectives on cognition and motivation.

    Science.gov (United States)

    Cosmides, Leda; Tooby, John

    2013-01-01

    Evolutionary psychology is the second wave of the cognitive revolution. The first wave focused on computational processes that generate knowledge about the world: perception, attention, categorization, reasoning, learning, and memory. The second wave views the brain as composed of evolved computational systems, engineered by natural selection to use information to adaptively regulate physiology and behavior. This shift in focus--from knowledge acquisition to the adaptive regulation of behavior--provides new ways of thinking about every topic in psychology. It suggests a mind populated by a large number of adaptive specializations, each equipped with content-rich representations, concepts, inference systems, and regulatory variables, which are functionally organized to solve the complex problems of survival and reproduction encountered by the ancestral hunter-gatherers from whom we are descended. We present recent empirical examples that illustrate how this approach has been used to discover new features of attention, categorization, reasoning, learning, emotion, and motivation.

  14. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used for searching the geometrical parameters of a model of the detector by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window, and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a model of the detector which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  15. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used to search the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli or Petri beakers, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been obtained and validated.
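
    The optimization loop described in the two records above can be sketched compactly. In the sketch below, a toy two-parameter efficiency curve stands in for the Monte Carlo transport code and each source geometry is reduced to a single scale factor; the energies, parameter ranges and scale factors are all invented for illustration, so this shows only the shape of the method, not the authors' implementation.

        import math, random

        # Toy stand-in for the Monte Carlo code: a two-parameter efficiency
        # curve per source geometry (illustrative only, not physics).
        def simulated_fepes(params, energies, geom_factor):
            a, b = params
            return [geom_factor * a * math.exp(-b * e / 1000.0) for e in energies]

        def mismatch(params, energies, geom_factor, reference):
            sim = simulated_fepes(params, energies, geom_factor)
            return sum(((s - r) / r) ** 2 for s, r in zip(sim, reference))

        def dominates(fa, fb):  # Pareto dominance for two-objective minimisation
            return all(x <= y for x, y in zip(fa, fb)) and any(x < y for x, y in zip(fa, fb))

        energies = [60.0, 120.0, 660.0, 1330.0]                   # keV, invented
        true_params = (0.35, 0.9)                                 # "unknown" geometry
        ref_small = simulated_fepes(true_params, energies, 1.00)  # beaker on the window
        ref_large = simulated_fepes(true_params, energies, 0.80)  # beaker wrapping the detector

        random.seed(1)
        archive = []   # non-dominated (params, objectives) pairs
        pop = [(random.uniform(0.1, 0.6), random.uniform(0.2, 1.5)) for _ in range(40)]
        for _ in range(200):
            parent = random.choice(pop)
            child = tuple(max(1e-3, p + random.gauss(0, 0.05)) for p in parent)
            f = (mismatch(child, energies, 1.00, ref_small),
                 mismatch(child, energies, 0.80, ref_large))
            if not any(dominates(fo, f) for _, fo in archive):
                archive = [(p, fo) for p, fo in archive if not dominates(f, fo)]
                archive.append((child, f))
                pop.append(child)

        best = min(archive, key=lambda e: sum(e[1]))
        print("recovered parameters:", best[0], "objectives:", best[1])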

  16. Evolutionary fuzzy ARTMAP neural networks for classification of semiconductor defects.

    Science.gov (United States)

    Tan, Shing Chiang; Watada, Junzo; Ibrahim, Zuwairie; Khalid, Marzuki

    2015-05-01

    Wafer defect detection using an intelligent system is an approach to quality improvement in semiconductor manufacturing that aims to enhance process stability, increase production capacity, and improve yields. Occasionally, only a few records indicating defective units are available, and these are classified as a minority group in a large database. Such a situation leads to an imbalanced data set problem, which poses a great challenge for machine-learning techniques to obtain effective solutions. In addition, the database may comprise overlapping samples of different classes. This paper introduces two models of evolutionary fuzzy ARTMAP (FAM) neural networks to deal with imbalanced data set problems in semiconductor manufacturing operations. In particular, the FAM models and hybrid genetic algorithms are integrated in the proposed evolutionary artificial neural networks (EANNs) to classify an imbalanced data set. In addition, one of the proposed EANNs incorporates a facility to learn overlapping samples of different classes from the imbalanced data environment. The classification results of the proposed evolutionary FAM neural networks are presented, compared, and analyzed using several classification metrics. The outcomes positively indicate the effectiveness of the proposed networks in handling classification problems with imbalanced data sets.

  17. Evolutionary disarmament in interspecific competition.

    Science.gov (United States)

    Kisdi, E; Geritz, S A

    2001-12-22

    Competitive asymmetry, which is the advantage of having a larger body or stronger weaponry than a contestant, drives spectacular evolutionary arms races in intraspecific competition. Similar asymmetries are well documented in interspecific competition, yet they seldom lead to exaggerated traits. Here we demonstrate that two species with substantially different size may undergo parallel coevolution towards a smaller size under the same ecological conditions where a single species would exhibit an evolutionary arms race. We show that disarmament occurs for a wide range of parameters in an ecologically explicit model of competition for a single shared resource; disarmament also occurs in a simple Lotka-Volterra competition model. A key property of both models is the interplay between evolutionary dynamics and population density. The mechanism does not rely on very specific features of the model. Thus, evolutionary disarmament may be widespread and may help to explain the lack of interspecific arms races.

  18. Learning-based computing techniques in geoid modeling for precise height transformation

    Science.gov (United States)

    Erol, B.; Erol, S.

    2013-03-01

    Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained from properly distributed benchmarks having both GNSS and leveling observations, using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study evaluates learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural network (WNN) approach in geoid surface approximation. These algorithms were developed in parallel with advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications. However, they are rather new to the precise modeling of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performances of the methods were assessed considering the validation results of the geoid models at the observation points. In conclusion, the ANFIS and WNN revealed higher prediction accuracy than the ANN and MPRE methods. Besides their prediction capabilities, these methods were also compared and discussed from a practical point of view in the conclusions.
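
    For reference, the classical MPRE baseline mentioned in this record amounts to an ordinary least-squares fit of a polynomial surface to the benchmark data. A minimal sketch, with invented coordinates and undulation values standing in for the GNSS/leveling benchmarks:

        import numpy as np

        # Fit a second-order polynomial surface N(x, y) to geoid undulations
        # at benchmark points (synthetic data for illustration).
        rng = np.random.default_rng(0)
        x, y = rng.uniform(-1, 1, 50), rng.uniform(-1, 1, 50)
        N = 36.0 + 0.8 * x - 1.2 * y + 0.3 * x * y + 0.05 * rng.standard_normal(50)

        # Design matrix for N = a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2
        A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
        coef, *_ = np.linalg.lstsq(A, N, rcond=None)

        def predict(px, py):
            return coef @ np.array([1.0, px, py, px**2, px * py, py**2])

        print("fitted coefficients:", np.round(coef, 3))
        print("undulation at (0.2, -0.4):", round(float(predict(0.2, -0.4)), 3), "m")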

  19. Incorporating evolutionary principles into environmental management and policy

    DEFF Research Database (Denmark)

    Lankau, Richard; Jørgensen, Peter Søgaard; Harris, David J.

    2011-01-01

    As policymakers and managers work to mitigate the effects of rapid anthropogenic environmental changes, they need to consider organisms’ responses. In light of recent evidence that evolution can be quite rapid, this now includes evolutionary responses. Evolutionary principles have a long history...... in conservation biology, and the necessary next step for the field is to consider ways in which conservation policy makers and managers can proactively manipulate evolutionary processes to achieve their goals. In this review, we aim to illustrate the potential conservation benefits of an increased understanding...... of evolutionary history and prescriptive manipulation of three basic evolutionary factors: selection, variation, and gene flow. For each, we review and propose ways that policy makers and managers can use evolutionary thinking to preserve threatened species, combat pest species, or reduce undesirable evolutionary...

  20. Knowledge Generation as Natural Computation

    Directory of Open Access Journals (Sweden)

    Gordana Dodig-Crnkovic

    2008-04-01

    Full Text Available Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework knowledge is seen as a result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during the adaptive interplay with the environment, which clearly presents a developmental advantage by increasing the agent's ability to cope with the situation dynamics. This paper addresses the mechanism of knowledge generation, a process that may be modeled as natural computation in order to be better understood and improved.

  1. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    Science.gov (United States)

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  2. Evolutionary economics and industry location

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2003-01-01

    This paper aims to provide the outlines of an evolutionary economic geography of industry location. We discuss two evolutionary explanations of industry location, that is, one that concentrates on spin-offs, and one that focuses attention on knowledge and agglomeration economies. We claim that both

  3. Evolutionary institutionalism.

    Science.gov (United States)

    Fürstenberg, Dr Kai

    Institutions are hard to define and hard to study. Long prominent in political science have been two theories: Rational Choice Institutionalism (RCI) and Historical Institutionalism (HI). Arising from the life sciences is now a third: Evolutionary Institutionalism (EI). Comparative strengths and weaknesses of these three theories warrant review, and the value-to-be-added by expanding the third beyond Darwinian evolutionary theory deserves consideration. Should evolutionary institutionalism expand to accommodate new understanding in ecology, such as might apply to the emergence of stability, and in genetics, such as might apply to political behavior? Core arguments are reviewed for each theory with more detailed exposition of the third, EI. Particular attention is paid to EI's gene-institution analogy; to variation, selection, and retention of institutional traits; to endogeneity and exogeneity; to agency and structure; and to ecosystem effects, institutional stability, and empirical limitations in behavioral genetics. RCI, HI, and EI are distinct but complementary. Institutional change, while amenable to rational-choice analysis and, retrospectively, to critical-juncture and path-dependency analysis, is also, and importantly, ecological. Stability, like change, is an emergent property of institutions, which tend to stabilize after change in a manner analogous to allopatric speciation. EI is more than metaphorically biological in that institutional behaviors are driven by human behaviors whose evolution long preceded the appearance of institutions themselves.

  4. Evolutionary foundations for cancer biology.

    Science.gov (United States)

    Aktipis, C Athena; Nesse, Randolph M

    2013-01-01

    New applications of evolutionary biology are transforming our understanding of cancer. The articles in this special issue provide many specific examples, such as microorganisms inducing cancers, the significance of within-tumor heterogeneity, and the possibility that lower dose chemotherapy may sometimes promote longer survival. Underlying these specific advances is a large-scale transformation, as cancer research incorporates evolutionary methods into its toolkit, and asks new evolutionary questions about why we are vulnerable to cancer. Evolution explains why cancer exists at all, how neoplasms grow, why cancer is remarkably rare, and why it occurs despite powerful cancer suppression mechanisms. Cancer exists because of somatic selection; mutations in somatic cells result in some dividing faster than others, in some cases generating neoplasms. Neoplasms grow, or do not, in complex cellular ecosystems. Cancer is relatively rare because of natural selection; our genomes were derived disproportionally from individuals with effective mechanisms for suppressing cancer. Cancer occurs nonetheless for the same six evolutionary reasons that explain why we remain vulnerable to other diseases. These four principles--cancers evolve by somatic selection, neoplasms grow in complex ecosystems, natural selection has shaped powerful cancer defenses, and the limitations of those defenses have evolutionary explanations--provide a foundation for understanding, preventing, and treating cancer.

  5. An Improved Method for Reconfiguring and Optimizing Electrical Active Distribution Network Using Evolutionary Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Nur Faziera Napis

    2018-05-01

    Full Text Available The presence of optimized distributed generation (DG) with suitable distribution network reconfiguration (DNR) in the electrical distribution network offers advantages for voltage support, power loss reduction, deferment of new transmission lines and distribution structures, and system stability improvement. However, installation of a DG unit at a non-optimal size with non-optimal DNR may lead to higher power losses, power quality problems, voltage instability and increased operational cost. Thus, appropriate DG and DNR planning is essential and is the objective of this research. An effective heuristic optimization technique named improved evolutionary particle swarm optimization (IEPSO) is proposed in this research. The objective function is formulated to minimize the total power losses (TPL) and to improve the voltage stability index (VSI). The voltage stability index is determined for three load demand levels, namely light, nominal, and heavy load, with proper optimal DNR and DG sizing. The performance of the proposed technique is compared with other optimization techniques, namely particle swarm optimization (PSO) and iteration particle swarm optimization (IPSO). Four case studies on the IEEE 33-bus and IEEE 69-bus distribution systems have been conducted to validate the effectiveness of the proposed IEPSO. The optimization results show that the best performance is achieved by the IEPSO technique, with a power loss reduction of up to 79.26% and a 58.41% improvement in the voltage stability index. Moreover, IEPSO has the shortest computational time for all load conditions compared to the other algorithms.
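
    The abstract does not spell out the IEPSO variant itself, but the canonical PSO loop it builds on is short enough to show. In this sketch the "total power loss" is a toy convex surrogate, not a load-flow model, and all constants are illustrative defaults:

        import random

        def total_power_loss(x):             # toy surrogate objective
            return sum((xi - 0.3) ** 2 for xi in x)

        random.seed(0)
        dim, n, iters = 4, 20, 100
        w, c1, c2 = 0.7, 1.5, 1.5             # inertia and acceleration weights
        pos = [[random.uniform(0, 1) for _ in range(dim)] for _ in range(n)]
        vel = [[0.0] * dim for _ in range(n)]
        pbest = [p[:] for p in pos]
        pbest_f = [total_power_loss(p) for p in pos]
        gbest_f = min(pbest_f)
        gbest = pbest[pbest_f.index(gbest_f)][:]

        for _ in range(iters):
            for i in range(n):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                f = total_power_loss(pos[i])
                if f < pbest_f[i]:
                    pbest[i], pbest_f[i] = pos[i][:], f
                    if f < gbest_f:
                        gbest, gbest_f = pos[i][:], f

        print("best configuration (toy):", [round(v, 3) for v in gbest],
              "loss:", round(gbest_f, 6))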

  6. Iterative reconstruction techniques for computed tomography Part 1: Technical principles

    International Nuclear Information System (INIS)

    Willemink, Martin J.; Jong, Pim A. de; Leiner, Tim; Nievelstein, Rutger A.J.; Schilham, Arnold M.R.; Heer, Linda M. de; Budde, Ricardo P.J.

    2013-01-01

    To explain the technical principles of and differences between commercially available iterative reconstruction (IR) algorithms for computed tomography (CT) in non-mathematical terms for radiologists and clinicians. Technical details of the different proprietary IR techniques were distilled from available scientific articles and manufacturers' white papers and were verified by the manufacturers. Clinical results were obtained from a literature search spanning January 2006 to January 2012, including only original research papers concerning IR for CT. IR for CT iteratively reduces noise and artefacts in either image space or raw data, or both. Reported dose reductions ranged from 23 % to 76 % compared to locally used default filtered back-projection (FBP) settings, with similar noise, artefacts, subjective, and objective image quality. IR has the potential to allow radiation dose reduction while preserving image quality. Disadvantages of IR include blotchy image appearance and longer computational time. Future studies need to address differences between IR algorithms for clinical low-dose CT. • Iterative reconstruction technology for CT is presented in non-mathematical terms. (orig.)
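
    The vendors' algorithms are proprietary, but the iterative principle can be illustrated with the textbook Kaczmarz (ART) update, which repeatedly projects the image estimate onto the constraint of each measured ray sum. The system matrix and data below are random stand-ins, not CT geometry:

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.random((40, 16))                         # toy system matrix (rays x pixels)
        x_true = rng.random(16)                          # flattened "patient" image
        b = A @ x_true + 0.01 * rng.standard_normal(40)  # noisy sinogram

        x = np.zeros(16)                                 # start from an empty image
        for sweep in range(50):
            for i in range(A.shape[0]):
                a = A[i]
                x += (b[i] - a @ x) / (a @ a) * a        # project onto ray i's constraint

        print("reconstruction error:", round(float(np.linalg.norm(x - x_true)), 4))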

  7. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  8. Evolutionary principles and their practical application

    DEFF Research Database (Denmark)

    Hendry, A. P.; Kinnison, M. T.; Heino, M.

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles...... are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design...... of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently...

  9. Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments

    Directory of Open Access Journals (Sweden)

    Jose M. Moya

    2012-08-01

    Full Text Available Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.

  10. Ubiquitous green computing techniques for high demand applications in Smart environments.

    Science.gov (United States)

    Zapater, Marina; Sanchez, Cesar; Ayala, Jose L; Moya, Jose M; Risco-Martín, José L

    2012-01-01

    Ubiquitous sensor network deployments, such as the ones found in Smart cities and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the use of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption. The latter problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These non-optimal allocation policies reduce the energy consumed by the whole infrastructure and the total execution time.
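
    A minimal sketch of the redistribution idea described in these two records, assuming invented task demands and per-node energy costs: low-demand tasks are pushed to cheap idle WSN nodes, with the data centre as the fallback for heavy tasks. This greedy rule only illustrates the goal, not the papers' allocation policy:

        tasks = [("aggregate", 2), ("route-plan", 3), ("render-map", 9)]   # (name, demand)
        nodes = [
            {"name": "datacenter", "capacity": 100, "joules_per_unit": 5.0},
            {"name": "wsn-node-1", "capacity": 4, "joules_per_unit": 1.2},
            {"name": "wsn-node-2", "capacity": 5, "joules_per_unit": 1.5},
        ]

        assignment, total_energy = {}, 0.0
        for name, demand in sorted(tasks, key=lambda t: t[1]):
            # cheapest node with enough spare capacity; the data centre always fits
            feasible = [node for node in nodes if node["capacity"] >= demand]
            best = min(feasible, key=lambda node: node["joules_per_unit"] * demand)
            best["capacity"] -= demand
            assignment[name] = best["name"]
            total_energy += best["joules_per_unit"] * demand

        print(assignment)               # low-demand tasks land on WSN nodes
        print("energy (J):", total_energy)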

  11. Power system dynamic state estimation using prediction based evolutionary technique

    International Nuclear Information System (INIS)

    Basetti, Vedik; Chandel, Ashwani K.; Chandel, Rajeevan

    2016-01-01

    In this paper, a new robust LWS (least winsorized square) estimator is proposed for dynamic state estimation of a power system. One of the main advantages of this estimator is that it has an inbuilt bad data rejection property and is less sensitive to bad data measurements. In the proposed approach, Brown's double exponential smoothing technique has been utilised for its reliable performance at the prediction step. The state estimation problem is solved as an optimisation problem using a new jDE (self-adaptive differential evolution) technique with prediction-based population re-initialisation at the filtering step. This new stochastic search technique has been embedded with different state scenarios using the predicted state. The effectiveness of the proposed LWS technique is validated under different conditions, namely normal operation, bad data, sudden load change, and loss of a transmission line, on three different IEEE test bus systems. The performance of the proposed approach is compared with the conventional extended Kalman filter. On the basis of various performance indices, the results thus obtained show that the proposed technique increases the accuracy and robustness of power system dynamic state estimation. - Highlights: • To estimate the states of the power system under a dynamic environment. • The performance of the EKF method is degraded during anomaly conditions. • The proposed method remains robust towards anomalies. • The proposed method provides precise state estimates even in the presence of anomalies. • The results show that prediction accuracy is enhanced by using the proposed model.
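
    Brown's double exponential smoothing, used here at the prediction step, is compact enough to state in full. The state trajectory below is made up and the smoothing constant is arbitrary:

        # Brown's method: two exponential smoothings give level and trend,
        # from which an m-step-ahead forecast is formed.
        def brown_forecast(series, alpha=0.5, horizon=1):
            s1 = s2 = series[0]
            for x in series[1:]:
                s1 = alpha * x + (1 - alpha) * s1    # first smoothing
                s2 = alpha * s1 + (1 - alpha) * s2   # second smoothing
            level = 2 * s1 - s2
            trend = alpha / (1 - alpha) * (s1 - s2)
            return level + trend * horizon

        voltage_angle = [0.10, 0.12, 0.15, 0.19, 0.24]   # invented state history
        print("predicted next state:", round(brown_forecast(voltage_angle), 4))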

  12. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    Science.gov (United States)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. First, evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level results in different population-level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes and finds that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in 2 schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules and of making sense of observed trends are of a different character. Builders notice rules through the available blocks-based primitives, often bypassing their enactment, while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre- and post-tests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly. Explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both

  13. Gender Inequality in Interaction--An Evolutionary Account

    Science.gov (United States)

    Hopcroft, Rosemary L.

    2009-01-01

    In this article I argue that evolutionary theorizing can help sociologists and feminists better understand gender inequality. Evolutionary theory explains why control of the sexuality of young women is a priority across most human societies both past and present. Evolutionary psychology has extended our understanding of male violence against…

  14. Ensemble Architecture for Prediction of Enzyme-ligand Binding Residues Using Evolutionary Information.

    Science.gov (United States)

    Pai, Priyadarshini P; Dattatreya, Rohit Kadam; Mondal, Sukanta

    2017-11-01

    Enzyme interactions with ligands are crucial for various biochemical reactions governing life. Over many years, attempts to identify these residues for biotechnological manipulation have been made using experimental and computational techniques. The computational approaches have gathered impetus with the growing availability of sequence and structure information, and are broadly classified into template-based and de novo methods. One of the predominant de novo methods using sequence information involves the application of biological properties for supervised machine learning. Here, we propose a support vector machine-based ensemble for the prediction of protein-ligand interacting residues using one of the most important discriminative properties of the interacting residue neighbourhood, i.e., evolutionary information in the form of a position-specific scoring matrix (PSSM). The study has been performed on a non-redundant dataset comprising 9269 interacting and 91773 non-interacting residues for prediction model generation and further evaluation. Of the various PSSM-based models explored, the proposed method, named ROBBY (pRediction Of Biologically relevant small molecule Binding residues on enzYmes), shows an accuracy of 84.0 %, a Matthews correlation coefficient of 0.343 and an F-measure of 39.0 % on 78 test enzymes. Further, the scope of adding domain knowledge such as pocket information has also been investigated; the results showed significant enhancement in method precision. These findings are hoped to boost the reliability of small-molecule ligand interaction prediction for enzyme applications and drug design. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
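
    PSSM-based residue prediction typically encodes each residue by the PSSM rows in a window around it, zero-padded at the termini, and feeds the resulting vectors to the classifier. The sketch below uses random values in place of PSI-BLAST output and a window width chosen arbitrarily; the paper's exact encoding may differ:

        import numpy as np

        rng = np.random.default_rng(0)
        L, w = 12, 2                                            # sequence length, half-window
        pssm = rng.integers(-5, 8, size=(L, 20)).astype(float)  # stand-in PSSM

        def window_features(pssm, i, w):
            rows = []
            for j in range(i - w, i + w + 1):
                rows.append(pssm[j] if 0 <= j < len(pssm) else np.zeros(20))
            return np.concatenate(rows)                         # length (2w+1)*20

        X = np.stack([window_features(pssm, i, w) for i in range(L)])
        print(X.shape)   # (12, 100): one feature vector per residue, ready for an SVM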

  15. Making evolutionary biology a basic science for medicine

    Science.gov (United States)

    Nesse, Randolph M.; Bergstrom, Carl T.; Ellison, Peter T.; Flier, Jeffrey S.; Gluckman, Peter; Govindaraju, Diddahally R.; Niethammer, Dietrich; Omenn, Gilbert S.; Perlman, Robert L.; Schwartz, Mark D.; Thomas, Mark G.; Stearns, Stephen C.; Valle, David

    2010-01-01

    New applications of evolutionary biology in medicine are being discovered at an accelerating rate, but few physicians have sufficient educational background to use them fully. This article summarizes suggestions from several groups that have considered how evolutionary biology can be useful in medicine, what physicians should learn about it, and when and how they should learn it. Our general conclusion is that evolutionary biology is a crucial basic science for medicine. In addition to looking at established evolutionary methods and topics, such as population genetics and pathogen evolution, we highlight questions about why natural selection leaves bodies vulnerable to disease. Knowledge about evolution provides physicians with an integrative framework that links otherwise disparate bits of knowledge. It replaces the prevalent view of bodies as machines with a biological view of bodies shaped by evolutionary processes. Like other basic sciences, evolutionary biology needs to be taught both before and during medical school. Most introductory biology courses are insufficient to establish competency in evolutionary biology. Premedical students need evolution courses, possibly ones that emphasize medically relevant aspects. In medical school, evolutionary biology should be taught as one of the basic medical sciences. This will require a course that reviews basic principles and specific medical applications, followed by an integrated presentation of evolutionary aspects that apply to each disease and organ system. Evolutionary biology is not just another topic vying for inclusion in the curriculum; it is an essential foundation for a biological understanding of health and disease. PMID:19918069

  16. Methods and experimental techniques in computer engineering

    CERN Document Server

    Schiaffonati, Viola

    2014-01-01

    Computing and science reveal a synergic relationship. On the one hand, it is widely evident that computing plays an important role in the scientific endeavor. On the other hand, the role of scientific method in computing is getting increasingly important, especially in providing ways to experimentally evaluate the properties of complex computing systems. This book critically presents these issues from a unitary conceptual and methodological perspective by addressing specific case studies at the intersection between computing and science. The book originates from, and collects the experience of, a course for PhD students in Information Engineering held at the Politecnico di Milano. Following the structure of the course, the book features contributions from some researchers who are working at the intersection between computing and science.

  17. Context dependent DNA evolutionary models

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    This paper is about stochastic models for the evolution of DNA. For a set of aligned DNA sequences, connected in a phylogenetic tree, the models should be able to explain - in probabilistic terms - the differences seen in the sequences. From the estimates of the parameters in the model one can...... start to make biological interpretations and conclusions concerning the evolutionary forces at work. In parallel with the increase in computing power, models have become more complex. Starting with Markov processes on a space with 4 states, and extended to Markov processes with 64 states, we are today...... studying models on spaces with 4^n (or 64^n) states with n well above one hundred, say. For such models it is no longer possible to calculate the transition probability analytically, and often Markov chain Monte Carlo is used in connection with likelihood analysis. This is also the approach taken
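
    The simplest member of this model family, the 4-state Jukes-Cantor process, makes the setup concrete: transition probabilities follow from the rate matrix as P(t) = exp(Qt). Context-dependent models replace the 4 states by 4^n joint states, which is exactly where this direct computation breaks down. Rate and branch length below are arbitrary:

        import numpy as np
        from scipy.linalg import expm

        mu = 1.0
        Q = mu / 4.0 * (np.ones((4, 4)) - 4 * np.eye(4))   # off-diagonal mu/4, diagonal -3mu/4
        P = expm(Q * 0.5)                                  # branch length t = 0.5

        print(np.round(P, 4))    # transition probabilities on {A, C, G, T}
        print(P.sum(axis=1))     # each row sums to 1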

  18. A Computer Based Moire Technique To Measure Very Small Displacements

    Science.gov (United States)

    Sciammarella, Cesar A.; Amadshahi, Mansour A.; Subbaraman, B.

    1987-02-01

    The accuracy that can be achieved in the measurement of very small displacements with techniques such as moire, holography and speckle is limited by the noise inherent in the optical devices used. To reduce the noise-to-signal ratio, the moire method can be utilized. Two systems of carrier fringes are introduced: an initial system before the load is applied and a final system when the load is applied. The moire pattern of these two systems contains the sought displacement information, and the noise common to the two patterns is eliminated. The whole process is performed by a computer on digitized versions of the patterns. Examples of application are given.
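
    A digital analogue of the described operation, with synthetic patterns: combining a carrier-fringe image recorded before loading with one recorded after a small displacement produces a slowly varying moire term proportional to the displacement, which a low-pass filter would isolate. Pitch, displacement field and noise level are invented, and the product used here is only one way to form the moire:

        import numpy as np

        h, w, pitch = 256, 256, 8.0
        y, x = np.mgrid[0:h, 0:w].astype(float)
        u = 0.002 * (x - w / 2) ** 2 / w                 # small displacement field (pixels)
        noise = 0.2 * np.random.default_rng(0).standard_normal((h, w))

        before = np.cos(2 * np.pi * x / pitch) + noise           # initial carrier pattern
        after = np.cos(2 * np.pi * (x + u) / pitch) + noise      # pattern under load

        # The product contains 0.5*cos(2*pi*u/pitch), a low-frequency moire
        # term carrying u; a boxcar average would separate it from the
        # carrier-frequency terms and attenuate the noise.
        moire = before * after
        print(moire.shape, round(float(moire.mean()), 4))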

  19. APPLICATION OF OBJECT ORIENTED PROGRAMMING TECHNIQUES IN FRONT END COMPUTERS

    International Nuclear Information System (INIS)

    SKELLY, J.F.

    1997-01-01

    The Front End Computer (FEC) environment imposes special demands on software, beyond real time performance and robustness. FEC software must manage a diverse inventory of devices with individualistic timing requirements and hardware interfaces. It must implement network services which export device access to the control system at large, interpreting a uniform network communications protocol into the specific control requirements of the individual devices. Object oriented languages provide programming techniques which neatly address these challenges, and also offer benefits in terms of maintainability and flexibility. Applications are discussed which exhibit the use of inheritance, multiple inheritance and inheritance trees, and polymorphism to address the needs of FEC software

  20. Automated computation of femoral angles in dogs from three-dimensional computed tomography reconstructions: Comparison with manual techniques.

    Science.gov (United States)

    Longo, F; Nicetto, T; Banzato, T; Savio, G; Drigo, M; Meneghello, R; Concheri, G; Isola, M

    2018-02-01

    The aim of this ex vivo study was to test a novel three-dimensional (3D) automated computer-aided design (CAD) method (aCAD) for the computation of femoral angles in dogs from 3D reconstructions of computed tomography (CT) images. The repeatability and reproducibility of three methods (manual radiography, manual measurements on CT reconstructions and the aCAD method) for the measurement of three femoral angles were evaluated: (1) anatomical lateral distal femoral angle (aLDFA); (2) femoral neck angle (FNA); and (3) femoral torsion angle (FTA). Femoral angles of 22 femurs obtained from 16 cadavers were measured by three blinded observers. Measurements were repeated three times by each observer for each diagnostic technique. Femoral angle measurements were analysed using a mixed effects linear model for repeated measures to determine the levels of intra-observer agreement (repeatability) and inter-observer agreement (reproducibility). Repeatability and reproducibility of measurements using the aCAD method were excellent (intra-class coefficients, ICCs≥0.98) for all three angles assessed. Manual radiography and CT exhibited excellent agreement for the aLDFA measurement (ICCs≥0.90). However, FNA repeatability and reproducibility were poor. In conclusion, the computation with the 3D aCAD method provided the highest repeatability and reproducibility among the tested methodologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Multi-step-prediction of chaotic time series based on co-evolutionary recurrent neural network

    International Nuclear Information System (INIS)

    Ma Qianli; Zheng Qilun; Peng Hong; Qin Jiangwei; Zhong Tanwei

    2008-01-01

    This paper proposes a co-evolutionary recurrent neural network (CERNN) for the multi-step prediction of chaotic time series. It estimates the proper parameters of phase space reconstruction and optimizes the structure of the recurrent neural network by a co-evolutionary strategy. The search space is separated into two subspaces and the individuals are trained in a parallel computational procedure. The approach dynamically combines the embedding method with the capability of recurrent neural networks to incorporate past experience due to internal recurrence. The effectiveness of CERNN is evaluated using three benchmark chaotic time series data sets: the Lorenz series, the Mackey-Glass series and the real-world sunspot series. The simulation results show that CERNN improves the performance of multi-step prediction of chaotic time series.
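
    The phase-space reconstruction whose parameters the CERNN estimates is the standard time-delay embedding. The sketch below applies it to a chaotic logistic map standing in for the Lorenz or Mackey-Glass series, with the delay and dimension fixed by hand rather than co-evolved:

        import numpy as np

        def embed(series, m, tau):
            n = len(series) - (m - 1) * tau
            return np.stack([series[i : i + n] for i in range(0, m * tau, tau)], axis=1)

        x = [0.4]
        for _ in range(499):
            x.append(3.9 * x[-1] * (1 - x[-1]))   # chaotic logistic map
        x = np.asarray(x)

        X = embed(x, m=3, tau=2)      # (496, 3) reconstructed state vectors
        y = x[(3 - 1) * 2 + 1:]       # one-step-ahead targets for X[:-1]
        print(X[:-1].shape, y.shape)  # training pairs for a recurrent predictor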

  2. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    We are delighted to bring to the readers a set of peer-reviewed papers on evolutionary biology, published as a special issue of the Journal of Genetics. These papers emanated from ruminations upon, and discussions at, the Foundations of Evolutionary Theory: the Ongoing Synthesis meeting at Coorg, India, in February ...

  3. Toward a method for tracking virus evolutionary trajectory applied to the pandemic H1N1 2009 influenza virus.

    Science.gov (United States)

    Squires, R Burke; Pickett, Brett E; Das, Sajal; Scheuermann, Richard H

    2014-12-01

    In 2009 a novel pandemic H1N1 influenza virus (H1N1pdm09) emerged as the first official influenza pandemic of the 21st century. Early genomic sequence analysis pointed to the swine origin of the virus. Here we report a novel computational approach to determine the evolutionary trajectory of viral sequences that uses data-driven estimations of nucleotide substitution rates to track the gradual accumulation of observed sequence alterations over time. Phylogenetic analysis and multiple sequence alignments show that sequences belonging to the resulting evolutionary trajectory of the H1N1pdm09 lineage exhibit a gradual accumulation of sequence variations and tight temporal correlations in the topological structure of the phylogenetic trees. These results suggest that our evolutionary trajectory analysis (ETA) can more effectively pinpoint the evolutionary history of viruses, including the host and geographical location traversed by each segment, when compared against either BLAST or traditional phylogenetic analysis alone. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Nanoscopical dissection of ancestral nucleoli in Archaea: a case of study in Evolutionary Cell Biology

    KAUST Repository

    Islas Morales, Parsifal

    2018-04-01

    Is the nucleolus a sine qua non condition of eukaryotes? The present project starts from this central question to contribute to our knowledge about the origin and the evolution of cells. The nucleolus is a cryptic organelle that plays a central role in cell function. It is responsible for the orchestration of ribosomal RNA expression, maturation and modification in the regulatory context of cellular homeostasis. Ribosomal expression is undoubtedly the greatest transcriptional and regulatory activity of any cell. The nucleolus is not just a conventional, membrane-limited organelle, but a magnificent transcriptional puff: a dichotomy between structure and process, form and function. What is the minimum nucleolus? Evolution should bring some light into these questions. Evolutionary cell biology (ECB) has attracted increasing attention in recent decades. Is this a new discipline and an historical opportunity to combine functional and evolutionary biology towards the insight that cell evolution underlies organismic complexity? In the post-genomic era, we have developed the potential of combining high throughput acquisition of data with functional in situ and in silico approaches: integration understood as omics approaches. Can this provide a real consilience between evolutionary and functional approaches? The reduced number of model organisms and cultivation techniques still excludes the majority of the extant diversity of cells from the scope of experimental inquiry. Furthermore, at the conceptual level, the simplification of evolutionary processes in biosciences still limits the conformation of a successful disciplinary link between functional and evolutionary biology. This limits the formulation of questions and experiments that properly address the mechanistic nature of cellular events that underlie microbial and organismic diversity and evolution. Here we provide a critical and comparative review of the historical background of ECB. This project takes the

  5. Evaluation of computer-based NDE techniques and regional support of inspection activities

    International Nuclear Information System (INIS)

    Taylor, T.T.; Kurtz, R.J.; Heasler, P.G.; Doctor, S.R.

    1991-01-01

    This paper describes the technical progress during fiscal year 1990 for the program entitled 'Evaluation of Computer-Based nondestructive evaluation (NDE) Techniques and Regional Support of Inspection Activities.' Highlights of the technical progress include: development of a seminar to provide basic knowledge required to review and evaluate computer-based systems; review of a typical computer-based field procedure to determine compliance with applicable codes, ambiguities in procedure guidance, and overall effectiveness and utility; design and fabrication of a series of three test blocks for NRC staff use for training or audit of UT systems; technical assistance in reviewing (1) San Onofre ten year reactor pressure vessel inservice inspection activities and (2) the capability of a proposed phased array inspection of the feedwater nozzle at Oyster Creek; completion of design calculations to determine the feasibility and significance of various sizes of mockup assemblies that could be used to evaluate the effectiveness of eddy current examinations performed on steam generators; and discussion of initial mockup design features and methods for fabricating flaws in steam generator tubes

  6. Evolutionary multimodal optimization using the principle of locality

    KAUST Repository

    Wong, Kachun; Wu, Chunho; Mok, Ricky; Peng, Chengbin; Zhang, Zhaolei

    2012-01-01

    The principle of locality is one of the most widely used concepts in designing computing systems. To explore the principle in evolutionary computation, crowding differential evolution is incorporated with locality for multimodal optimization. Instead of generating trial vectors randomly, the first method proposed takes advantage of spatial locality to generate trial vectors. Temporal locality is also adopted to help generate offspring in the second method proposed. Temporal and spatial locality are then applied together in the third method proposed. Numerical experiments are conducted to compare the proposed methods with the state-of-the-art methods on benchmark functions. Experimental analysis is undertaken to observe the effect of locality and the synergy between temporal locality and spatial locality. Further experiments are also conducted on two application problems. One is the varied-line-spacing holographic grating design problem, while the other is the protein structure prediction problem. The numerical results demonstrate the effectiveness of the methods proposed. © 2012 Elsevier Inc. All rights reserved.

  7. Evolutionary multimodal optimization using the principle of locality

    KAUST Repository

    Wong, Kachun

    2012-07-01

    The principle of locality is one of the most widely used concepts in designing computing systems. To explore the principle in evolutionary computation, crowding differential evolution is incorporated with locality for multimodal optimization. Instead of generating trial vectors randomly, the first method proposed takes advantage of spatial locality to generate trial vectors. Temporal locality is also adopted to help generate offspring in the second method proposed. Temporal and spatial locality are then applied together in the third method proposed. Numerical experiments are conducted to compare the proposed methods with the state-of-the-art methods on benchmark functions. Experimental analysis is undertaken to observe the effect of locality and the synergy between temporal locality and spatial locality. Further experiments are also conducted on two application problems. One is the varied-line-spacing holographic grating design problem, while the other is the protein structure prediction problem. The numerical results demonstrate the effectiveness of the methods proposed. © 2012 Elsevier Inc. All rights reserved.
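
    Crowding differential evolution, the base algorithm into which locality is incorporated in these two records, can be sketched in one dimension: a DE trial vector replaces its nearest population member only when it is fitter, which lets the population hold several optima at once. The test function and constants are illustrative, and the papers' spatial and temporal locality mechanisms are not reproduced:

        import random

        def f(x):                      # bimodal test function, maxima at x = -1 and x = +1
            return -(x**2 - 1) ** 2

        random.seed(0)
        F, CR = 0.5, 0.9
        pop = [random.uniform(-2, 2) for _ in range(20)]
        for _ in range(300):
            for i in range(len(pop)):
                a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                trial = a + F * (b - c) if random.random() < CR else pop[i]  # DE/rand/1
                nearest = min(range(len(pop)), key=lambda j: abs(pop[j] - trial))
                if f(trial) > f(pop[nearest]):           # crowding replacement
                    pop[nearest] = trial

        print(sorted(round(p, 2) for p in pop))          # clusters near both -1 and +1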

  8. Estimating the ratios of the stationary distribution values for Markov chains modeling evolutionary algorithms.

    Science.gov (United States)

    Mitavskiy, Boris; Cannings, Chris

    2009-01-01

    The evolutionary algorithm stochastic process is well-known to be Markovian. These have been under investigation in much of the theoretical evolutionary computing research. When the mutation rate is positive, the Markov chain modeling of an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution. Rather little is known about the stationary distribution. In fact, the only quantitative facts established so far tell us that the stationary distributions of Markov chains modeling evolutionary algorithms concentrate on uniform populations (i.e., those populations consisting of a repeated copy of the same individual). At the same time, knowing the stationary distribution may provide some information about the expected time it takes for the algorithm to reach a certain solution, assessment of the biases due to recombination and selection, and is of importance in population genetics to assess what is called a "genetic load" (see the introduction for more details). In the recent joint works of the first author, some bounds have been established on the rates at which the stationary distribution concentrates on the uniform populations. The primary tool used in these papers is the "quotient construction" method. It turns out that the quotient construction method can be exploited to derive much more informative bounds on ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all the three main stages of an evolutionary algorithm: namely, selection, recombination, and mutation.
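
    For a chain small enough to write down, the stationary distribution reasoned about here can be computed directly as the left eigenvector of the transition matrix for eigenvalue 1. The 3-state matrix below is a toy, not an actual evolutionary-algorithm state space:

        import numpy as np

        P = np.array([[0.90, 0.08, 0.02],
                      [0.05, 0.90, 0.05],
                      [0.02, 0.08, 0.90]])

        vals, vecs = np.linalg.eig(P.T)               # left eigenvectors of P
        pi = np.real(vecs[:, np.argmax(np.real(vals))])
        pi = pi / pi.sum()                            # normalise to a distribution

        print(np.round(pi, 4))                        # mass concentrates on "sticky" states
        print(np.allclose(pi @ P, pi))                # stationarity check: pi P = pi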

  9. Parameterless evolutionary algorithm applied to the nuclear reload problem

    International Nuclear Information System (INIS)

    Caldas, Gustavo Henrique Flores; Schirru, Roberto

    2008-01-01

    In this work, an evolutionary algorithm with no parameters called FPBIL (parameter-free PBIL) is developed based on PBIL (population-based incremental learning). The analysis reveals how the parameters of PBIL can be replaced by self-adaptive mechanisms that emerge from the radically different way in which evolution is processed. Despite its advantages, FPBIL remains compact and relatively modest in its use of computational resources. FPBIL is then applied to the nuclear reload problem. The experimental results are compared with those of other works and confirm the superiority of the new algorithm
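
    For contrast with the parameter-free variant, baseline PBIL (whose learning rate and population size FPBIL replaces with self-adaptive mechanisms) fits in a few lines. Here it maximises OneMax; all constants are textbook choices, not values from the paper:

        import random

        n_bits, pop_size, lr, gens = 20, 30, 0.1, 100
        p = [0.5] * n_bits                    # probability vector over bit positions
        random.seed(0)
        for _ in range(gens):
            samples = [[1 if random.random() < pi else 0 for pi in p]
                       for _ in range(pop_size)]
            best = max(samples, key=sum)      # fitness = number of ones (OneMax)
            p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, best)]   # nudge toward best

        print("probability vector:", [round(pi, 2) for pi in p])
        print("best sample fitness:", sum(best))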

  10. The use of automatic programming techniques for fault tolerant computing systems

    Science.gov (United States)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection as well as the automatic generation of assertions and test cases from abstract data type specifications is outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  11. A Survey on Evolutionary Algorithm Based Hybrid Intelligence in Bioinformatics

    Directory of Open Access Journals (Sweden)

    Shan Li

    2014-01-01

    Full Text Available With the rapid advances in genomics, proteomics, metabolomics, and other types of omics technologies during the past decades, a tremendous amount of data related to molecular biology has been produced. It is becoming a big challenge for bioinformaticians to analyze and interpret these data with conventional intelligent techniques, for example, support vector machines. Recently, hybrid intelligent methods, which integrate several standard intelligent approaches, are becoming more and more popular due to their robustness and efficiency. Specifically, hybrid intelligent approaches based on evolutionary algorithms (EAs) are widely used in various fields due to the efficiency and robustness of EAs. In this review, we give an introduction to the applications of hybrid intelligent methods, in particular those based on evolutionary algorithms, in bioinformatics. In particular, we focus on their applications to three common problems that arise in bioinformatics, that is, feature selection, parameter estimation, and reconstruction of biological networks.

  12. The impact of computers on the nuclear utility industry

    International Nuclear Information System (INIS)

    Taylor, J.J.

    1984-01-01

    The applications of computer technology to the nuclear utility industry are discussed in light of recent phenomenal growth of computer hardware and software. Computer applications in existence in the power plants are presented, as well as potential future development for plant design, construction, operation, maintenance and retrofit. Utility concerns are addressed. The study concludes that the applications of computer technology to the nuclear utility industry are highly promising and evolutionary in nature

  13. Archaeogenetics in evolutionary medicine.

    Science.gov (United States)

    Bouwman, Abigail; Rühli, Frank

    2016-09-01

    Archaeogenetics is the study of ancient DNA (aDNA), that is, DNA more than 70 years old. It is an important part of wider studies of many different areas of our past, including animal, plant and pathogen evolution and domestication events. Here, we specifically address the impact of research in archaeogenetics on the broader field of evolutionary medicine. Studies of ancient hominid genomes help us to understand even modern health patterns. Human genetic microevolution, e.g. related to the ability of post-weaning milk consumption, and specifically genetic adaptation in disease susceptibility, e.g. towards malaria and other infectious diseases, are of the utmost importance in the contributions of archaeogenetics to the evolutionary understanding of human health and disease. With the increase in both the understanding of modern medical genetics and the ability to deep-sequence ancient genetic information, the field of archaeogenetic evolutionary medicine is blossoming.

  14. EPR by AREVA. An evolutionary reactor

    International Nuclear Information System (INIS)

    Horstmann, Marion

    2010-01-01

    The EPR development goals are as follows: 1. Evolutionary design to fully capitalize on the design, construction and operating experience of the 86 AREVA PWRs operating worldwide; 2. Enhanced safety compared to operating PWRs: reduce the core damage frequency (CDF), accommodate severe accidents with no long-term population effect, and withstand a large airplane crash (APC); 3. High availability; 4. Simplified operation and maintenance; and 5. A generation cost at least 10 % lower than that of the 1500 MWe series in operation. The design builds on the achievements of the N4 and Konvoi reactors. The main plant data are tabulated. The PWR structure is shown as an example of the stepwise improvement. The focus of the presentation is on construction techniques, the supply chain, and project delivery. (P.A.)

  15. Evolutionary relevance facilitates visual information processing.

    Science.gov (United States)

    Jackson, Russell E; Calvillo, Dusti P

    2013-11-03

    Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component to evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  16. Is evolutionary psychology a metatheory for psychology? A discussion of four major issues in psychology from an evolutionary developmental perspective

    NARCIS (Netherlands)

    Ploeger, A.; van der Maas, H.L.J.; Raijmakers, M.E.J.

    2008-01-01

    Evolutionary psychology has been proposed as a metatheoretical framework for psychology. We argue that evolutionary psychology should be expanded if it is to offer new insights regarding the major issues in psychology. Evolutionary developmental biology can provide valuable new insights into issues

  17. Evolutionary optimization and game strategies for advanced multi-disciplinary design applications to aeronautics and UAV design

    CERN Document Server

    Periaux, Jacques; Lee, Dong Seop Chris

    2015-01-01

    Many complex aeronautical design problems can be formulated with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolution tools in the solution, or the set of solutions, of single or multi-disciplinary optimization. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models, including precise, intermediate and approximate models, with each node belonging to a different hierarchical layer handled by a different evolutionary algorithm. The efficiency of evolutionary algorithms for both single and multi-objective optimization problems is significantly improved by the coupling of EAs with games, and in particular by a new dynamic methodology named "Hybridized Nash-Pareto games". Multi-objective optimization techniques and robust design problems taking into account uncertainties are introduced and explained in detail. Several applications dealing with c

  18. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held in Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems such as parameter settings for controlling processes in bioreactors and other processes, resource-constrained project scheduling, infection distribution, molecular distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, localization of the abrupt atmospheric contamination source, and so on. It shows how to develop algorithms for them based on new metaheuristic methods such as evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  19. Computation techniques and computer programs to analyze Stirling cycle engines using characteristic dynamic energy equations

    Science.gov (United States)

    Larson, V. H.

    1982-01-01

    The basic equations that are used to describe the physical phenomena in a Stirling cycle engine are the general energy equations and the equations for the conservation of mass and momentum. These equations, together with the equation of state, an analytical expression for the gas velocity, and an equation for mesh temperature, are used in this computer study of Stirling cycle characteristics. The partial differential equations describing the physical phenomena that occur in a Stirling cycle engine are of the hyperbolic type. Hyperbolic equations have real characteristic lines. By utilizing appropriate points along these curved lines, the partial differential equations can be reduced to ordinary differential equations. These equations are solved numerically using a fourth-fifth order Runge-Kutta integration technique.
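
    A classical fourth-order Runge-Kutta step of the family the study employs, applied to a generic scalar ODE rather than the Stirling characteristic equations themselves:

        # One RK4 step for dy/dt = f(t, y)
        def rk4_step(f, t, y, h):
            k1 = f(t, y)
            k2 = f(t + h / 2, y + h / 2 * k1)
            k3 = f(t + h / 2, y + h / 2 * k2)
            k4 = f(t + h, y + h * k3)
            return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

        # Integrate dy/dt = -y from y(0) = 1; the exact solution is exp(-t).
        f = lambda t, y: -y
        t, y, h = 0.0, 1.0, 0.1
        for _ in range(10):                   # ten steps from t = 0 to t = 1
            y = rk4_step(f, t, y, h)
            t += h

        print(round(y, 6))                    # close to exp(-1) = 0.367879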

  20. 3rd International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Biswal, Bhabendra; Udgata, Siba; Mandal, JK

    2015-01-01

    Volume 1 contains 95 papers presented at FICTA 2014: Third International Conference on Frontiers in Intelligent Computing: Theory and Applications. The conference was held during 14-15 November 2014 at Bhubaneswar, Odisha, India. This volume contains papers mainly focused on Data Warehousing and Mining, Machine Learning, Mobile and Ubiquitous Computing, AI, E-commerce & Distributed Computing and Soft Computing, Evolutionary Computing, Bio-inspired Computing and its Applications.