WorldWideScience

Sample records for computational evolutionary methodology

  1. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has the potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socio-economic impacts, combat the disease, and avert a pandemic.

  2. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seungwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Giovanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
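
    To make the abstract's iterative minimization concrete, here is a small simulated-annealing sketch in Python; the two-parameter forward model, step size, and cooling schedule are invented for illustration and stand in for the far richer radiative models behind the actual software.

    import math
    import random

    def synthetic_spectrum(params, wavelengths):
        """Toy forward model mapping retrieval parameters to a synthetic spectrum."""
        concentration, center = params
        return [concentration * math.exp(-(w - center) ** 2) for w in wavelengths]

    def fitness(params, observed, wavelengths):
        """Dissimilarity between observed and synthetic spectra (lower is better)."""
        synthetic = synthetic_spectrum(params, wavelengths)
        return sum((o - s) ** 2 for o, s in zip(observed, synthetic))

    def retrieve(observed, wavelengths, steps=5000, t0=1.0):
        current = [random.uniform(0, 2), random.uniform(0, 2)]
        best = list(current)
        for step in range(steps):
            temp = t0 * (1 - step / steps) + 1e-9              # linear cooling
            candidate = [p + random.gauss(0, 0.05) for p in current]
            delta = (fitness(candidate, observed, wavelengths)
                     - fitness(current, observed, wavelengths))
            if delta < 0 or random.random() < math.exp(-delta / temp):
                current = candidate        # accept downhill; uphill with Boltzmann prob.
            if fitness(current, observed, wavelengths) < fitness(best, observed, wavelengths):
                best = list(current)
        return best

    wavelengths = [i / 10 for i in range(20)]
    observed = synthetic_spectrum([1.2, 0.8], wavelengths)     # pretend measurements
    print(retrieve(observed, wavelengths))                     # should approach [1.2, 0.8]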

  3. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as ...

  4. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals, or several groups of individuals, can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly, so as to achieve the desired and preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to treat evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamentals from this perspective. This paper is a first step towards that objective, implementing a strategy equilibrium solution (such as the Nash equilibrium) in an evolutionary computation algorithm.
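
    As a rough illustration of the strategy-selection idea (not the paper's formal Nash framework), the sketch below treats each individual as an agent choosing between two mutation strategies in proportion to the payoff each strategy has accumulated; the objective, step sizes, and payoff rule are all assumptions made for the example.

    import random

    def f(x):                                      # objective to minimize (sphere)
        return sum(v * v for v in x)

    strategies = {"small": 0.05, "large": 0.5}     # two step-size "strategies"
    payoff = {name: 1.0 for name in strategies}    # optimistic initial payoffs

    population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
    for generation in range(200):
        for i, individual in enumerate(population):
            # each agent picks a strategy with probability proportional to payoff
            name = random.choices(list(strategies),
                                  weights=[payoff[s] for s in strategies])[0]
            child = [v + random.gauss(0, strategies[name]) for v in individual]
            improvement = f(individual) - f(child)
            if improvement > 0:                    # selection keeps the better point
                population[i] = child
                payoff[name] += improvement        # reward the strategy that helped
    print(min(f(ind) for ind in population), payoff)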

  5. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware, and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world, in both industry and academia, to share state-of-the-art results, to explore new areas of research and development, and to discuss emerging issues in intelligent computation and evolutionary computation.

  6. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting-edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: covers all the aspects ...

  7. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.

  8. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today's highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located in different (geographical or managerial) network clusters, may have limited access to the system's services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above-mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high-performance "green computing" systems. The recent evolutionary and general metaheuristic-based solutions ...

  9. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest performing members are removed from the population, and remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing: egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
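
    The assess-cull-reproduce cycle described above fits in a few lines; the one-parameter objective below is a placeholder standing in for, say, an egg-shape score, not a model from any of the cited case studies.

    import random

    def objective(genome):
        """Placeholder fitness: closeness of a shape parameter to a hidden optimum."""
        return -abs(genome - 0.37)

    population = [random.random() for _ in range(30)]
    for generation in range(100):                   # stopping condition: generation cap
        population.sort(key=objective, reverse=True)
        survivors = population[:15]                 # poorest performers are removed
        offspring = [p + random.gauss(0, 0.02) for p in survivors]   # reproduce + mutate
        population = survivors + offspring
    print(max(population, key=objective))           # best member found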

  10. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexities into simple formulations, thus largely reducing development efforts. This book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  11. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    ... of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, and parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and – to some extent – ready to use. Some more general topics, like the estimation of distribution algorithms and indicator-based selection, are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most ... kinds of evolutionary algorithms, have been prudently analyzed; this analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been ...

  12. An evolutionary computing frame work toward object extraction from satellite images

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Image interpretation domains have witnessed the application of many intelligent methodologies over the past decade; however, the effective use of evolutionary computing techniques for feature detection has been less explored. In this paper, we critically analyze the possibility of using cellular neural networks for accurate feature detection. Contextual knowledge has been effectively represented by incorporating spectral and spatial aspects using an adaptive kernel strategy. The developed methodology has been compared with traditional approaches in an object-based context, and investigations revealed that considerable success has been achieved with the procedure. Intelligent interpretation, automatic interpolation, and effective contextual representations are the features of the system.

  13. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    ... failure probability. Multiobjective Evolutionary Computation algorithms (MOEAs) are well-suited for multiobjective task scheduling in heterogeneous environments. Two multi-objective evolutionary algorithms, the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with ...

  14. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C.; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández De Vega, Francisco; Di Caro, G.A.; Drechsler, R.; Ekárt, Aniko; Esparcia-Alcázar, Anna I.; Farooq, M.; Langdon, W.B.; Merelo-Guervós, J.J.; Preuss, M.; Richter, O.-M.H.; Silva, Sara; Simões, Anabela; Tarantino, Ernesto; Tettamanzi, Andrea G.B.; Togelius, J.; Urquhart, Neil; Uyar, A.S.; Yannakakis, G.N.; Smith, Stephen L.; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M.; De Falco, Ivanoe; Cioppa, Antonio Della; Diwold, Konrad; Sim, Kevin; Haasdijk, Evert; Zhang, Mengjie; Eiben, A.E.; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical ...

  15. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  16. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable-speed wind turbine by using a bio-inspired scheme. The methodology is based on exploiting two proficient swarm-intelligence-based evolutionary soft computational procedures. The particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller intended for the small damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller designs for the DFIG-based WECS using PSO and BFO are described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage are slightly improved with the evolutionary soft computational procedures. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
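
    A minimal PSO sketch of the tuning idea, with the DFIG replaced by a toy first-order plant under a PI control law; the plant model, PSO coefficients, and cost function are assumptions for illustration, not the paper's system.

    import random

    def step_response_cost(gains, steps=200, dt=0.01):
        """Integral of absolute error for a unit step command on a toy plant."""
        kp, ki = gains
        y, integral, cost = 0.0, 0.0, 0.0
        for _ in range(steps):
            error = 1.0 - y
            integral += error * dt
            u = kp * error + ki * integral      # PI control law
            y += dt * (-y + u)                  # plant: dy/dt = -y + u
            cost += abs(error) * dt
        return cost

    n = 15
    pos = [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = min(pbest, key=step_response_cost)
    for _ in range(60):
        for i in range(n):
            for d in range(2):                  # canonical PSO velocity update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if step_response_cost(pos[i]) < step_response_cost(pbest[i]):
                pbest[i] = list(pos[i])
        gbest = min(pbest, key=step_response_cost)
    print(gbest, step_response_cost(gbest))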

  17. Evolutionary-Simulative Methodology in the Management of Social and Economic Systems

    Directory of Open Access Journals (Sweden)

    Konyavskiy V.A.

    2017-01-01

    Full Text Available The article outlines the main provisions of the evolutionary-simulative methodology (ESM), a methodology for the mathematical modeling of equilibrium random processes (CPR) that is widely used in the economy. It discusses the basic directions for using ESM solutions in the management of social and economic systems.

  18. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces, ...

  19. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    Full Text Available A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin's theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm in random generation from multivariate probability distributions, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
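
    One reviewed application, variable selection in regression, is easy to sketch: a genetic algorithm searches over inclusion masks and an AIC-style score serves as fitness. The synthetic data and penalty form below are illustrative assumptions.

    import random
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 200, 10
    X = rng.normal(size=(n, p))
    y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(size=n)   # only variables 0 and 3 matter

    def fitness(mask):
        cols = [j for j in range(p) if mask[j]]
        if not cols:
            return -np.inf
        beta, rss, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        rss = float(rss[0]) if len(rss) else float(((X[:, cols] @ beta - y) ** 2).sum())
        return -(n * np.log(rss / n) + 2 * len(cols))    # AIC-like: fit vs. model size

    population = [[random.randint(0, 1) for _ in range(p)] for _ in range(40)]
    for _ in range(60):
        population.sort(key=fitness, reverse=True)
        parents = population[:20]
        children = []
        for _ in range(20):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, p)                 # one-point crossover
            child = a[:cut] + b[cut:]
            j = random.randrange(p)                      # single-bit mutation
            child[j] = 1 - child[j]
            children.append(child)
        population = parents + children
    print(max(population, key=fitness))                  # expect 1s at positions 0 and 3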

  20. Evolutionary computing in Nuclear Engineering Institute/CNEN-Brazil

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.; Lapa, Nelbia da Silva; Mol, Antonio C.

    2000-01-01

    This paper discusses the importance of evolutionary computation (EC) for nuclear engineering and the development of this area at the Instituto de Engenharia Nuclear (IEN) in recent years. The applications realized at this institute by the EC technical group are briefly described, for example: nuclear reactor core design optimization, preventive maintenance scheduling optimization, and nuclear reactor transient identification. A novel computational tool for the implementation of genetic algorithms, developed at this institute and applied in those works, is also shown. Some results are presented and the gains obtained with evolutionary computation are discussed. (author)

  1. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality, in a dynamic, multivariable system, in real time.

  2. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  3. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up ...

  4. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by a former Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  5. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a...

  6. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Full Text Available Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem which is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may be easily trapped at a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolution strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, immune algorithms, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are also discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptation of fuzzy systems is also detailed.

  7. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  8. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as ...

  9. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind, there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning-based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a "virtual" student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks post cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.

  10. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  11. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    Science.gov (United States)

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is tailored to be target performance design-oriented, quick rough estimation of transformer design specifics may be inferred. Testing of the suggested approach revealed significant qualitative and quantitative match with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
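
    The multi-objective core of such a methodology can be shown with a Pareto-dominance selection loop; the two objectives below (a made-up cost/loss trade-off over a "core size" and "turns" variable) are placeholders, not the paper's transformer models.

    import random

    def objectives(design):
        """Hypothetical trade-off: larger core lowers loss but raises material cost."""
        core, turns = design
        return 2.0 * core + 0.5 * turns, 1.0 / (core * turns)   # (cost, loss)

    def dominates(a, b):
        """a Pareto-dominates b: no worse in every objective, better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    population = [(random.uniform(0.1, 2), random.uniform(1, 20)) for _ in range(50)]
    for _ in range(500):
        parent = random.choice(population)
        child = tuple(max(0.1, v + random.gauss(0, 0.1)) for v in parent)
        for i, member in enumerate(population):          # replace a dominated member
            if dominates(objectives(child), objectives(member)):
                population[i] = child
                break
    front = [x for x in population
             if not any(dominates(objectives(o), objectives(x)) for o in population)]
    print(sorted(objectives(x) for x in front))          # approximate Pareto front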

  12. A hybrid finite element analysis and evolutionary computation method for the design of lightweight lattice components with optimized strut diameter

    DEFF Research Database (Denmark)

    Salonitis, Konstantinos; Chantzis, Dimitrios; Kappatos, Vasileios

    2017-01-01

    ... approaches or with the use of topology optimization methodologies. An optimization approach utilizing multipurpose optimization algorithms has not been proposed yet. This paper presents a novel user-friendly method for the design optimization of lattice components towards weight minimization, which combines finite element analysis and evolutionary computation. The proposed method utilizes the cell homogenization technique in order to reduce the computational cost of the finite element analysis and a genetic algorithm in order to search for the most lightweight lattice configuration. A bracket consisting ...

  13. Evolutionary computation techniques a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted in order to compare its performance to other popular EC methods.

  14. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    Science.gov (United States)

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
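
    The "computationally expensive nested procedure" the authors improve upon is easy to see in miniature: every upper-level candidate is evaluated by solving a complete lower-level problem. The toy objectives and random-search solvers below are assumptions for illustration only.

    import random

    def lower_objective(y, x):
        return (y - x) ** 2                      # follower: track the leader's x

    def solve_lower(x, trials=200):
        """Lower-level task solved (approximately) by random search."""
        return min((random.uniform(-5, 5) for _ in range(trials)),
                   key=lambda y: lower_objective(y, x))

    def upper_objective(x):
        y = solve_lower(x)                       # feasibility: y must be lower-optimal
        return x ** 2 + (y - 1) ** 2             # leader anticipates the follower

    best_x = min((random.uniform(-5, 5) for _ in range(100)), key=upper_objective)
    print(best_x, solve_lower(best_x))           # expect roughly x = y = 0.5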

  15. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    Genetic and Evolutionary Computing This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by The Waseda University in Japan, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to the Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the res...

  16. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang is the capital of Jiangxi Province in southeastern China, located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains and on the east by Poyang Lake, it is famous for its scenery, rich history and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in Southern China. The conference is intended as an international forum for the researchers and professionals in all areas of genetic and evolutionary computing.

  17. Coevolution of Artificial Agents Using Evolutionary Computation in Bargaining Game

    Directory of Open Access Journals (Sweden)

    Sangwook Lee

    2015-01-01

    Full Text Available Analysis of the bargaining game using evolutionary computation is an essential issue in the field of game theory. This paper investigates the interaction and coevolutionary process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. In particular, the game performance with regard to payoff through the interaction and coevolution of agents is studied. We present three kinds of EC-based agents (EC-agents) participating in the bargaining game: genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE). The agents' performance with regard to changing conditions is compared. From the simulation results it is found that the PSO-agent is superior to the other agents.

  18. Optimizing a reconfigurable material via evolutionary computation

    Science.gov (United States)

    Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.

    2015-08-01

    Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer controlled laboratory experiment in which a 6×6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
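
    The search loop itself is an ordinary generational GA; in this sketch the force sensor is replaced by a synthetic measure_force stand-in, and the population and generation counts are chosen to mimic the roughly 1500 evaluations reported.

    import random

    HIDDEN_OPTIMUM = [random.randint(0, 1) for _ in range(36)]   # 6x6 magnet states

    def measure_force(pattern):
        """Stand-in for the physical measurement: lower near the hidden optimum."""
        return sum(a != b for a, b in zip(pattern, HIDDEN_OPTIMUM))

    population = [[random.randint(0, 1) for _ in range(36)] for _ in range(24)]
    for generation in range(60):                 # 24 x 60 = 1440 "measurements"
        population.sort(key=measure_force)
        parents = population[:12]
        children = []
        for _ in range(12):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            i = random.randrange(36)
            child[i] = 1 - child[i]              # flip one electromagnet state
            children.append(child)
        population = parents + children
    print(measure_force(min(population, key=measure_force)))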

  19. Predicting protein complexes from weighted protein-protein interaction graphs with a novel unsupervised methodology: Evolutionary enhanced Markov clustering.

    Science.gov (United States)

    Theofilatos, Konstantinos; Pavlopoulou, Niki; Papasavvas, Christoforos; Likothanassis, Spiros; Dimitrakopoulos, Christos; Georgopoulos, Efstratios; Moschopoulos, Charalampos; Mavroudi, Seferina

    2015-03-01

    Proteins are considered to be the most important individual components of biological systems and they combine to form physical protein complexes which are responsible for certain molecular functions. Despite the large availability of protein-protein interaction (PPI) information, not much information is available about protein complexes. Experimental methods are limited in terms of time, efficiency, cost and performance constraints. Existing computational methods have provided encouraging preliminary results, but they face certain disadvantages as they require parameter tuning, some of them cannot handle weighted PPI data and others do not allow a protein to participate in more than one protein complex. In the present paper, we propose a new fully unsupervised methodology for predicting protein complexes from weighted PPI graphs. The proposed methodology is called evolutionary enhanced Markov clustering (EE-MC) and it is a hybrid combination of an adaptive evolutionary algorithm and a state-of-the-art clustering algorithm named enhanced Markov clustering. EE-MC was compared with state-of-the-art methodologies when applied to datasets from the human and the yeast Saccharomyces cerevisiae organisms. Using publicly available datasets, EE-MC outperformed existing methodologies (in some datasets the separation metric was increased by 10-20%). Moreover, when applied to new human datasets its performance was encouraging in the prediction of protein complexes which consist of proteins with high functional similarity. Specifically, 5737 protein complexes were predicted and 72.58% of them are enriched for at least one gene ontology (GO) function term. EE-MC is by design able to overcome intrinsic limitations of existing methodologies such as their inability to handle weighted PPI networks, their constraint to assign every protein to exactly one cluster and the difficulties they face concerning parameter tuning. This fact was experimentally validated and moreover, new ...
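
    EE-MC builds on Markov clustering (MCL); below is a sketch of plain MCL on a weighted adjacency matrix, with the adaptive evolutionary tuning that constitutes the paper's contribution deliberately omitted.

    import numpy as np

    def mcl(adjacency, inflation=2.0, iterations=50):
        """Basic Markov clustering: alternate expansion and inflation steps."""
        m = adjacency + np.eye(len(adjacency))   # self-loops stabilize the walk
        m = m / m.sum(axis=0)                    # column-stochastic matrix
        for _ in range(iterations):
            m = m @ m                            # expansion: two-step random walk
            m = m ** inflation                   # inflation: strengthen strong flows
            m = m / m.sum(axis=0)
        clusters = [set(map(int, np.flatnonzero(row > 1e-6)))
                    for row in m if row.max() > 1e-6]
        return [c for i, c in enumerate(clusters) if c not in clusters[:i]]

    # two weighted triangles joined by a weak edge -> expect two clusters
    a = np.zeros((6, 6))
    for i, j, w in [(0, 1, 1), (0, 2, 1), (1, 2, 1),
                    (3, 4, 1), (3, 5, 1), (4, 5, 1), (2, 3, 0.1)]:
        a[i, j] = a[j, i] = w
    print(mcl(a))                                # -> [{0, 1, 2}, {3, 4, 5}]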

  20. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  1. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, ''Classified Computer Security,'' requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  2. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  3. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the ...

  4. Regulatory RNA design through evolutionary computation and strand displacement.

    Science.gov (United States)

    Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso

    2015-01-01

    The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.

  5. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    ... means of their computer information systems. Disrupt: this type of attack focuses on disrupting, as "attackers might surreptitiously reprogram enemy ..." by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective ... between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that ...

  6. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and Meacham. The quartet distance between two unrooted evolutionary trees is the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. In this paper we present an algorithm for computing the quartet distance between two unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n^2).
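
    For intuition, the quantity can be computed naively by enumerating every four-species subset and reading each quartet topology off the four-point condition (the pairing with the smallest sum of path lengths); the sketch below does exactly that, far slower than the paper's O(n log n) algorithm.

    from itertools import combinations

    def leaf_distances(adjacency, leaves):
        """All leaf-to-leaf path lengths via BFS from each leaf (unit edge lengths)."""
        dist = {}
        for s in leaves:
            depth, frontier = {s: 0}, [s]
            while frontier:
                nxt = []
                for v in frontier:
                    for w in adjacency[v]:
                        if w not in depth:
                            depth[w] = depth[v] + 1
                            nxt.append(w)
                frontier = nxt
            for t in leaves:
                dist[s, t] = depth[t]
        return dist

    def quartet_topology(dist, a, b, c, d):
        """Four-point condition: the induced pairing has the smallest distance sum."""
        pairings = [((a, b), (c, d)), ((a, c), (b, d)), ((a, d), (b, c))]
        return min(pairings, key=lambda p: dist[p[0]] + dist[p[1]])

    def quartet_distance(adj1, adj2, leaves):
        d1, d2 = leaf_distances(adj1, leaves), leaf_distances(adj2, leaves)
        return sum(quartet_topology(d1, *q) != quartet_topology(d2, *q)
                   for q in combinations(leaves, 4))

    # two 5-leaf trees, ((1,2),3,(4,5)) versus ((1,3),2,(4,5)): distance 2
    t1 = {1: [6], 2: [6], 3: [7], 4: [8], 5: [8],
          6: [1, 2, 7], 7: [6, 3, 8], 8: [7, 4, 5]}
    t2 = {1: [6], 3: [6], 2: [7], 4: [8], 5: [8],
          6: [1, 3, 7], 7: [6, 2, 8], 8: [7, 4, 5]}
    print(quartet_distance(t1, t2, [1, 2, 3, 4, 5]))     # -> 2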

  7. Predicting human miRNA target genes using a novel evolutionary methodology

    KAUST Repository

    Korfiati, Aigli; Kleftogiannis, Dimitrios A.; Theofilatos, Konstantinos; Likothanassis, Spiros; Tsakalidis, Athanasios; Mavroudi, Seferina

    2012-01-01

    The discovery of miRNAs had great impacts on traditional biology. Typically, miRNAs have the potential to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. The experimental identification of their targets has many drawbacks, including cost, time, and low specificity, and these are the reasons why many computational approaches have been developed so far. However, existing computational approaches do not include any advanced feature selection technique, and they face problems concerning their classification performance and their interpretability. In the present paper, we propose a novel hybrid methodology which combines genetic algorithms and support vector machines in order to locate the optimal feature subset while achieving high classification performance. The proposed methodology was compared with two of the most promising existing methodologies in the problem of predicting human miRNA targets. Our approach outperforms existing methodologies in terms of classification performance while selecting a much smaller feature subset. © 2012 Springer-Verlag.
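
    A compact sketch of the hybrid idea (a genetic algorithm searching feature subsets while a support vector machine scores each subset) on synthetic data; the encoding, operators, and size penalty are simplifications, not the paper's exact setup.

    import random
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=20,
                               n_informative=4, random_state=0)

    def fitness(mask):
        cols = np.flatnonzero(mask)
        if cols.size == 0:
            return 0.0
        accuracy = cross_val_score(SVC(), X[:, cols], y, cv=3).mean()
        return accuracy - 0.005 * cols.size      # favor small, accurate subsets

    population = [np.random.randint(0, 2, 20) for _ in range(20)]
    for _ in range(15):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]
        children = []
        for _ in range(10):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 20)
            child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
            child[random.randrange(20)] ^= 1             # bit-flip mutation
            children.append(child)
        population = parents + children
    best = max(population, key=fitness)
    print(np.flatnonzero(best), fitness(best))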

  9. Practical Applications of Evolutionary Computation to Financial Engineering Robust Techniques for Forecasting, Trading and Hedging

    CERN Document Server

    Iba, Hitoshi

    2012-01-01

    “Practical Applications of Evolutionary Computation to Financial Engineering” presents the state of the art techniques in Financial Engineering using recent results in Machine Learning and Evolutionary Computation. This book bridges the gap between academics in computer science and traders and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge on either of the fields. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field describing each problem in detail and focusing on solutions based on evolutio...

  10. Evolutionary Robotics: What, Why, and Where to

    Directory of Open Access Journals (Sweden)

    Stephane Doncieux

    2015-03-01

    Full Text Available Evolutionary robotics applies the selection, variation, and heredity principles of natural evolution to the design of robots with embodied intelligence. It can be considered as a subfield of robotics that aims to create more robust and adaptive robots. A pivotal feature of the evolutionary approach is that it considers the whole robot at once, and enables the exploitation of robot features in a holistic manner. Evolutionary robotics can also be seen as an innovative approach to the study of evolution based on a new kind of experimentalism. The use of robots as a substrate can help address questions that are difficult, if not impossible, to investigate through computer simulations or biological studies. In this paper we consider the main achievements of evolutionary robotics, focusing particularly on its contributions to both engineering and biology. We briefly elaborate on methodological issues, review some of the most interesting findings, and discuss important open issues and promising avenues for future work.

  11. EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.

    Science.gov (United States)

    Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D

    2012-01-01

    Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g., sequence conservation, orthology, synteny, ...) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.

  12. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To avoid the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt automated reverse engineering procedures instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks by evolutionary algorithms, two important issues must be addressed: premature convergence and high computational cost. To tackle the former problem and to enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel model evolutionary algorithms. To overcome the latter and to speed up the computation, it is advocated to adopt the mechanism of cloud computing as a promising solution; the most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks, by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and they perform better than the widely-used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel model population-based optimization method and the parallel computational framework, high
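
    A hedged, serial sketch of the hybrid GA-PSO idea described above (not the authors' implementation: the placeholder fitness model, the operators and all parameters are assumptions, and the Hadoop MapReduce layer that would fan out the fitness evaluations is omitted):

      import numpy as np

      rng = np.random.default_rng(0)

      def fitness(params, target):
          # Stub objective: squared error between a candidate model's
          # predicted expression profile and the target profile. A real
          # system would run a gene network model here, and MapReduce
          # would distribute these evaluations to worker nodes.
          return np.sum((np.tanh(params) - target) ** 2)

      def hybrid_ga_pso(target, pop_size=40, generations=100,
                        w=0.7, c1=1.5, c2=1.5, mut_sigma=0.1):
          dim = len(target)
          pos = rng.normal(size=(pop_size, dim))
          vel = np.zeros_like(pos)
          pbest = pos.copy()
          pbest_f = np.array([fitness(p, target) for p in pos])
          for _ in range(generations):
              gbest = pbest[np.argmin(pbest_f)]
              # PSO step: pull particles toward personal and global bests.
              r1, r2 = rng.random((2, pop_size, dim))
              vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
              pos = pos + vel
              # GA step: rebuild the worst half by uniform crossover and
              # Gaussian mutation of parents drawn from the best half.
              f = np.array([fitness(p, target) for p in pos])
              order = np.argsort(f)
              elite, worst = order[:pop_size // 2], order[pop_size // 2:]
              for idx in worst:
                  a, b = pos[rng.choice(elite, 2, replace=False)]
                  mask = rng.random(dim) < 0.5
                  pos[idx] = np.where(mask, a, b) + rng.normal(0, mut_sigma, dim)
                  f[idx] = fitness(pos[idx], target)
              improved = f < pbest_f
              pbest[improved], pbest_f[improved] = pos[improved], f[improved]
          return pbest[np.argmin(pbest_f)], pbest_f.min()

      best, err = hybrid_ga_pso(np.linspace(-0.5, 0.5, 10))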

  13. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design

    International Nuclear Information System (INIS)

    Menges, Achim

    2012-01-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies. (paper)

  14. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of Estimation of Distribution Algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods in optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs but also fills an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network based EDAs are reviewed in the book. Hot current researc...

  15. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  16. Methodology of Implementation of Computer Forensics

    OpenAIRE

    Gelev, Saso; Golubovski, Roman; Hristov, Risto; Nikolov, Elenior

    2013-01-01

    Compared to other sciences, computer forensics (digital forensics) is a relatively young discipline. It was established in 1999 and it has been an irreplaceable tool in sanctioning cybercrime ever since. Good knowledge of computer forensics can be really helpful in uncovering a committed crime. Not adhering to the methodology of computer forensics, however, makes the obtained evidence invalid/irrelevant and as such it cannot be used in legal proceedings. This paper is to explain the methodolo...

  17. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  18. Multi-objective optimization of HVAC system with an evolutionary computation algorithm

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Tang, Fan; Xu, Guanglin

    2011-01-01

    A data-mining approach for the optimization of an HVAC (heating, ventilation, and air conditioning) system is presented. A predictive model of the HVAC system is derived by data-mining algorithms, using a dataset collected from an experiment conducted at a research facility. To minimize the energy use while maintaining the corresponding IAQ (indoor air quality) within a user-defined range, a multi-objective optimization model is developed. The solutions of this model are set points of the control system derived with an evolutionary computation algorithm. The controllable input variables - supply air temperature and supply air duct static pressure set points - are generated to reduce the energy use. The results produced by the evolutionary computation algorithm show that the control strategy saves energy by optimizing the operation of an HVAC system. -- Highlights: → A data-mining approach for the optimization of a heating, ventilation, and air conditioning (HVAC) system is presented. → The data used in the project have been collected from an experiment conducted at an energy research facility. → The approach presented in the paper leads to accomplishing significant energy savings without compromising the indoor air quality. → The energy savings are accomplished by computing set points for the supply air temperature and the supply air duct static pressure.
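
    A minimal sketch of the set-point optimization loop described above, assuming hypothetical surrogate models for energy and IAQ (the paper derives these by data mining) and a simple penalty-based (mu + lambda) evolution strategy rather than the authors' exact algorithm:

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical surrogates standing in for the data-mined models.
      def energy(x):   # x = [supply air temp (C), duct static pressure (Pa)]
          return 0.04 * (x[0] - 13.0) ** 2 + 0.002 * x[1]

      def iaq(x):      # e.g. predicted CO2 concentration (ppm)
          return 880.0 - 12.0 * (x[0] - 13.0) + 0.05 * x[1]

      IAQ_RANGE = (650.0, 850.0)            # user-defined comfort band
      BOUNDS = np.array([[10.0, 18.0],      # temperature set point
                         [200.0, 600.0]])   # static pressure set point

      def penalized(x):
          # Energy plus a penalty whenever IAQ leaves the allowed range.
          lo, hi = IAQ_RANGE
          q = iaq(x)
          return energy(x) + 10.0 * (max(0.0, lo - q) + max(0.0, q - hi))

      # (mu + lambda) evolution strategy over the two set points.
      pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(20, 2))
      for _ in range(200):
          children = np.clip(pop + rng.normal(0, [0.2, 10.0], pop.shape),
                             BOUNDS[:, 0], BOUNDS[:, 1])
          both = np.vstack([pop, children])
          pop = both[np.argsort([penalized(x) for x in both])[:20]]

      best = pop[0]
      print("set points:", best, "energy:", energy(best), "IAQ:", iaq(best))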

  19. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT) which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often require the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...

  20. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES-2015) held at Velammal Engineering College (VEC), Chennai, India during 22 – 23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the fields of Communication, Computing and Power Technologies.

  1. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

    The volume is a collection of high-quality peer-reviewed research papers presented at the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016) held at SRM University, Chennai, Tamil Nadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements and explore the future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the fields of communication, computing and power technologies.

  2. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    Science.gov (United States)

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrates, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  3. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro; Genetic and Evolutionary Computing

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Science in Taiwan, Fujian University of Technology in China and VSB-Technical University of Ostrava. ICGEC 2015 was held from 26–28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city engulfed by its rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning SHWEDAGON Pagoda is the center piece of Yangon city, which itself is famous for the best British colonial era architecture. Of particular interest in many shops of Bogyoke Aung San Market,...

  4. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used for searching the geometrical parameters of a model of the detector by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries, a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
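
    A schematic of the two-objective fitness used in such a characterization, with a placeholder standing in for the Monte Carlo transport run and made-up geometry parameters; only the structure (one FEPE-discrepancy objective per reference geometry, plus a Pareto filter) reflects the record above:

      import numpy as np

      def fepe_mc(params, energies, geometry):
          # Placeholder for the Monte Carlo transport run returning
          # Full Energy Peak Efficiencies for a candidate detector model
          # (a real implementation would launch e.g. a GEANT4 job).
          dead_layer, mount_cap = params
          atten = 0.5 if geometry == "large_beaker" else 0.2
          return np.exp(-(dead_layer + atten * mount_cap)) / np.sqrt(energies)

      def objectives(params, energies, ref_small, ref_large):
          # One discrepancy term per reference geometry: the two
          # objectives the evolutionary search minimizes simultaneously.
          e_s = np.mean((fepe_mc(params, energies, "small_beaker") - ref_small) ** 2)
          e_l = np.mean((fepe_mc(params, energies, "large_beaker") - ref_large) ** 2)
          return np.array([e_s, e_l])

      def nondominated(front):
          # Indices of Pareto-optimal candidates (minimization).
          keep = []
          for i, fi in enumerate(front):
              if not any(np.all(fj <= fi) and np.any(fj < fi)
                         for j, fj in enumerate(front) if j != i):
                  keep.append(i)
          return keep

      energies = np.array([60.0, 662.0, 1332.0])           # keV
      ref_s = fepe_mc((0.04, 0.12), energies, "small_beaker")
      ref_l = fepe_mc((0.04, 0.12), energies, "large_beaker")
      cands = [(0.05, 0.10), (0.02, 0.30), (0.08, 0.05)]
      print(nondominated([objectives(c, energies, ref_s, ref_l) for c in cands]))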

  5. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used for searching the geometrical parameters of a model of detector by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries, a beaker of small diameter laid over the detector window and a beaker of large capacity which wrap the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal or smaller than the crystal diameter, so that the crystal mount cap (which surround the lateral surface of the crystal), was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, like for instance, Marinellis, Petris, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation have to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a model of detector which has been successfully verificated for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  6. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing general constrained optimization problems using evolutionary algorithms. Broadly, the topics covered include constraint handling for single and multi-objective optimization; penalty function based methodology; multi-objective based methodology; new constraint handling mechanisms; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mixed variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environments. There is also a separate chapter on hybrid optimization, which is gaining a lot of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novices, and experts alike. The book will also be useful...

  7. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real-world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented, including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  8. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    Full Text Available The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on the theoretical framework, the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity - A&C course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application supported by ICT using learning objects. The course aforementioned was built on the Blackboard platform, evaluating the operation of the methodologies. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by students, which verifies that the methodologies are functional.

  9. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating the soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally, and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of the memory hierarchy and estimates th...
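
    In the spirit of the record above (not the authors' exact formulation), a toy sketch contrasting naive SER accumulation with a timing-aware estimate that weights each memory by the fraction of time its contents are live; all names and numbers are invented:

      # All memories and numbers below are invented for illustration.
      memories = [
          # (name, raw SER in FIT, fraction of time cells hold live data)
          ("L1-cache",    120.0, 0.35),
          ("registers",    15.0, 0.60),
          ("main-memory", 900.0, 0.05),
      ]

      naive = sum(ser for _, ser, _ in memories)
      effective = sum(ser * live for _, ser, live in memories)
      print(f"naive: {naive:.0f} FIT, timing-aware: {effective:.0f} FIT")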

  10. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in the IoT enabled cyber physical systems, from connected supply chain, Big Data produced by huge amount of IoT devices, to industry control systems. Evolutionary computation combining with other computational intelligence will play an important role for cybersecurity, such as ...

  11. A Design Methodology for Computer Security Testing

    OpenAIRE

    Ramilli, Marco

    2013-01-01

    The field of "computer security" is often considered something in between Art and Science. This is partly due to the lack of widely agreed and standardized methodologies to evaluate the degree of the security of a system. This dissertation intends to contribute to this area by investigating the most common security testing strategies applied nowadays and by proposing an enhanced methodology that may be effectively applied to different threat scenarios with the same degree of effectiveness. ...

  12. Protein 3D structure computed from evolutionary sequence variation.

    Directory of Open Access Journals (Sweden)

    Debora S Marks

    Full Text Available The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7-4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of
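
    For orientation, a minimal Python sketch of the baseline correlation measure mentioned above: mutual information between alignment columns. This is the noisy signal the paper's maximum entropy model improves on, since MI mixes direct and transitive couplings; the toy alignment is invented:

      import numpy as np
      from collections import Counter

      def mutual_information(col_i, col_j):
          # MI between two alignment columns; it conflates direct and
          # indirect couplings, which is exactly the noise the maximum
          # entropy model is designed to remove.
          n = len(col_i)
          pi, pj = Counter(col_i), Counter(col_j)
          pij = Counter(zip(col_i, col_j))
          return sum((c / n) * np.log((c / n) / (pi[a] * pj[b] / n**2))
                     for (a, b), c in pij.items())

      def mi_matrix(msa):
          # msa: list of equal-length sequences (alignment rows).
          cols = list(zip(*msa))
          L = len(cols)
          m = np.zeros((L, L))
          for i in range(L):
              for j in range(i + 1, L):
                  m[i, j] = m[j, i] = mutual_information(cols[i], cols[j])
          return m

      msa = ["ACDKL", "ACDRL", "GCEKL", "GCERL"]  # invented toy alignment
      print(np.round(mi_matrix(msa), 2))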

  13. Parallel evolutionary computation in bioinformatics applications.

    Science.gov (United States)

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of Bioinformatics require methods able to handle their inherent complexity (e.g. NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java-based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of the parallelism-related modules allows the user to easily configure the environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  14. Applied evolutionary economics and economic geography

    NARCIS (Netherlands)

    Frenken, K.

    2007-01-01

    Applied Evolutionary Economics and Economic Geography aims to further advance empirical methodologies in evolutionary economics, with a special emphasis on geography and firm location. It does so by bringing together a select group of leading scholars including economists, geographers and

  15. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    Science.gov (United States)

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient's data set from an intensive care burn unit and a standard data set from a machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case
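
    A small sketch of the hypervolume metric used above for comparing Pareto fronts, for the two-objective minimization case (e.g. classification error versus number of rules); the sweep assumes a reference point dominated by every solution, and the example front is invented:

      def hypervolume_2d(front, ref):
          # Area dominated by a 2-D Pareto front under minimization,
          # measured against a reference point `ref`. Sort by the first
          # objective, then stack the newly dominated rectangles.
          pts = sorted(p for p in front if p[0] <= ref[0] and p[1] <= ref[1])
          hv, prev_f2 = 0.0, ref[1]
          for f1, f2 in pts:
              if f2 < prev_f2:  # skip points dominated within the front
                  hv += (ref[0] - f1) * (prev_f2 - f2)
                  prev_f2 = f2
          return hv

      front = [(0.10, 9), (0.14, 5), (0.25, 3)]  # (error, number of rules)
      print(hypervolume_2d(front, ref=(1.0, 12)))  # -> 7.64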

  16. Reversible logic synthesis methodologies with application to quantum computing

    CERN Document Server

    Taha, Saleem Mohammed Ridha

    2016-01-01

    This book opens the door to a new, interesting and ambitious world of reversible and quantum computing research. It presents the state of the art required to travel around that world safely. Top world universities, companies and government institutions are in a race to develop new methodologies, algorithms and circuits on reversible logic, quantum logic, reversible and quantum computing and nano-technologies. In this book, twelve reversible logic synthesis methodologies are presented for the first time in a single volume, together with some new proposals. The sequential reversible logic circuits are also discussed for the first time in a book. Reversible logic plays an important role in quantum computing. Any progress in the domain of reversible logic can be directly applied to quantum logic. One of the goals of this book is to show the application of reversible logic in quantum computing. A new implementation of wavelet and multiwavelet transforms using quantum computing is performed for this purpose. Rese...

  17. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm, together with a Monte Carlo simulation code. The evolutionary algorithm is used for searching the geometrical parameters of a model of the detector by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries, a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.

  18. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    Science.gov (United States)

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  19. SHIPBUILDING PRODUCTION PROCESS DESIGN METHODOLOGY USING COMPUTER SIMULATION

    OpenAIRE

    Marko Hadjina; Nikša Fafandjel; Tin Matulja

    2015-01-01

    In this research, a shipbuilding production process design methodology using computer simulation is suggested. The suggested methodology is expected to provide a better and more efficient tool for the design of complex shipbuilding production processes. In the first part of this research, existing practice for production process design in shipbuilding is discussed and its shortcomings and problems are emphasized. In continuation, the discrete event simulation modelling method, as the basis of the sugge...

  20. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  1. A simple methodology for characterization of germanium coaxial detectors by using Monte Carlo simulation and evolutionary algorithms

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Gil, J.M.; Rodríguez, R.; Martel, P.; Bolivar, J.P.

    2015-01-01

    Determining the activity concentration of a specific radionuclide in a sample by gamma spectrometry requires knowledge of the full energy peak efficiency (FEPE) for the energy of interest. The difficulties related to the experimental calibration make it advisable to have alternative methods for FEPE determination, such as simulating the transport of photons in the crystal by the Monte Carlo method, which requires an accurate knowledge of the characteristics and geometry of the detector. The characterization process is mainly carried out by Canberra Industries Inc. using proprietary techniques and methodologies developed by that company. It is a costly procedure (due to shipping and to the cost of the process itself) and for some research laboratories an alternative in situ procedure can be very useful. The main goal of this paper is to find an alternative to this costly characterization process, by establishing a method for optimizing the parameters characterizing the detector through a computational procedure which could be reproduced at a standard research lab. This method consists of determining the detector's geometric parameters by using Monte Carlo simulation in parallel with an optimization process, based on evolutionary algorithms, starting from a set of reference FEPEs determined experimentally or computationally. The proposed method has proven to be effective and simple to implement. It provides a set of characterization parameters which has been successfully validated for different source-detector geometries, and also for a wide range of environmental samples and certified materials. - Highlights: • A computational method for characterizing an HPGe spectrometer has been developed. • The detector is characterized using as reference photopeak efficiencies obtained experimentally or by Monte Carlo calibration. • The characterization obtained has been validated for samples with different geometries and compositions. • Good agreement

  2. Evolutionary Cell Computing: From Protocells to Self-Organized Computing

    Science.gov (United States)

    Colombano, Silvano; New, Michael H.; Pohorille, Andrew; Scargle, Jeffrey; Stassinopoulos, Dimitris; Pearson, Mark; Warren, James

    2000-01-01

    On the path from inanimate to animate matter, a key step was the self-organization of molecules into protocells - the earliest ancestors of contemporary cells. Studies of the properties of protocells and the mechanisms by which they maintained themselves and reproduced are an important part of astrobiology. These studies also have the potential to greatly impact research in nanotechnology and computer science. Previous studies of protocells have focussed on self-replication. In these systems, Darwinian evolution occurs through a series of small alterations to functional molecules whose identities are stored. Protocells, however, may have been incapable of such storage. We hypothesize that under such conditions, the replication of functions and their interrelationships, rather than the precise identities of the functional molecules, is sufficient for survival and evolution. This process is called non-genomic evolution. Recent breakthroughs in experimental protein chemistry have opened the gates for experimental tests of non-genomic evolution. On the basis of these achievements, we have developed a stochastic model for examining the evolutionary potential of non-genomic systems. In this model, the formation and destruction (hydrolysis) of bonds joining amino acids in proteins occur through catalyzed, albeit possibly inefficient, pathways. Each protein can act as a substrate for polymerization or hydrolysis, or as a catalyst of these chemical reactions. When a protein is hydrolyzed to form two new proteins, or two proteins are joined into a single protein, the catalytic abilities of the product proteins are related to the catalytic abilities of the reactants. We will demonstrate that the catalytic capabilities of such a system can increase. Its evolutionary potential is dependent upon the competition between the formation of bond-forming and bond-cutting catalysts. The degree to which hydrolysis preferentially affects bonds in less efficient, and therefore less well

  3. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of design using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  4. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction is given to plasma simulation using computers and to the difficulties encountered on currently available computers. Through the use of an analyzing and measuring methodology - SARA - the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential type simulation model, an array/pipeline type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  5. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas such as artificial intelligence, computability, heuristics and pattern formation.  Nowadays at the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in areas of artificial intelligence, evolutionary computation and metaheuristics, and all these areas can be traced back to Turing's pioneer work. Topics include Turing test, Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, nature-inspired algorithms such as bat algorithm and cuckoo sear...

  6. Discovering Unique, Low-Energy Transition States Using Evolutionary Molecular Memetic Computing

    DEFF Research Database (Denmark)

    Ellabaan, Mostafa M Hashim; Ong, Y.S.; Handoko, S.D.

    2013-01-01

    In the last few decades, identification of transition states has experienced significant growth in research interests from various scientific communities. As per the transition states theory, reaction paths and landscape analysis as well as many thermodynamic properties of biochemical systems can be accurately identified through the transition states. Transition states describe the paths of molecular systems in transiting across stable states. In this article, we present the discovery of unique, low-energy transition states and showcase the efficacy of their identification using the memetic computing paradigm under a Molecular Memetic Computing (MMC) framework. In essence, the MMC is equipped with the tree-based representation of non-cyclic molecules and the covalent-bond-driven evolutionary operators, in addition to the typical backbone of memetic algorithms. Herein, we employ genetic algorithm
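
    A generic memetic-computing skeleton in Python, illustrating the global-evolution-plus-local-refinement pattern the MMC framework builds on; the toy energy function and hill-climbing "meme" are stand-ins, not the tree-based molecular representation or covalent-bond operators of the paper:

      import numpy as np

      rng = np.random.default_rng(2)

      def energy(x):
          # Toy multi-minima stand-in for a molecular energy surface.
          return float(np.sum(x ** 2 - 2.0 * np.cos(3.0 * x)))

      def local_search(x, step=0.05, iters=50):
          # The "meme": greedy refinement of one candidate, playing the
          # role a gradient-based transition state refinement would.
          fx = energy(x)
          for _ in range(iters):
              y = x + rng.normal(0, step, x.shape)
              fy = energy(y)
              if fy < fx:
                  x, fx = y, fy
          return x

      def memetic(dim=4, pop_size=16, generations=60):
          pop = rng.normal(0, 1, (pop_size, dim))
          for _ in range(generations):
              # Global step: recombine random parent pairs, then mutate.
              parents = pop[rng.integers(0, pop_size, (pop_size, 2))]
              children = parents.mean(axis=1) + rng.normal(0, 0.3, (pop_size, dim))
              # Local (memetic) step: refine every child.
              refined = np.array([local_search(c) for c in children])
              pool = np.vstack([pop, refined])
              pop = pool[np.argsort([energy(x) for x in pool])[:pop_size]]
          return pop[0], energy(pop[0])

      print(memetic())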

  7. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

    The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery assuming various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. Each of the two phases constitute a computer module, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively

  8. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  9. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase the assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were prepared to determine the environmental impact category. A sustainability analysis was conducted using the software SolidWorks. As a result, high-density PE gives the lowest amount in the environmental category while retaining a high maximum stress value.

  10. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.
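
    A compact sketch of Cartesian Genetic Programming itself - the feed-forward node grid and the (1+4) evolution strategy it is classically searched with; the toy regression fitness merely stands in for the simulated peer-review workflow score the authors would have used:

      import random

      random.seed(3)

      FUNCS = [lambda a, b: a + b,
               lambda a, b: a - b,
               lambda a, b: a * b,
               lambda a, b: a if a > b else b]  # max
      N_IN, N_NODES = 2, 10

      def random_genome():
          genes = []
          for node in range(N_NODES):
              n_avail = N_IN + node  # feed-forward: earlier nodes only
              genes.append((random.randrange(len(FUNCS)),
                            random.randrange(n_avail),
                            random.randrange(n_avail)))
          return genes, random.randrange(N_IN + N_NODES)

      def evaluate(genome, inputs):
          genes, out = genome
          values = list(inputs)
          for f, a, b in genes:
              values.append(FUNCS[f](values[a], values[b]))
          return values[out]

      def mutate(genome, rate=0.2):
          genes, out = genome
          new = []
          for node, gene in enumerate(genes):
              if random.random() < rate:
                  n_avail = N_IN + node
                  gene = (random.randrange(len(FUNCS)),
                          random.randrange(n_avail),
                          random.randrange(n_avail))
              new.append(gene)
          if random.random() < rate:
              out = random.randrange(N_IN + N_NODES)
          return new, out

      def fitness(g):
          # Toy regression target 2x + y^2, standing in for a simulated
          # peer-review workflow score.
          return sum(abs(evaluate(g, (x, y)) - (2 * x + y * y))
                     for x in range(-3, 4) for y in range(-3, 4))

      # The classic (1+4) evolution strategy used with CGP.
      parent = random_genome()
      for _ in range(2000):
          parent = min([mutate(parent) for _ in range(4)] + [parent], key=fitness)
      print(fitness(parent))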

  11. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Directory of Open Access Journals (Sweden)

    Maciej J Mrowinski

    Full Text Available With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  12. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence

    OpenAIRE

    Kiwia, D; Dehghantanha, A; Choo, K-KR; Slaughter, J

    2017-01-01

    Malware such as banking Trojans are popular with financially-motivated cybercriminals. Detection of banking Trojans remains a challenging task, due to the constant evolution of techniques used to obfuscate and circumvent existing detection and security solutions. Having a malware taxonomy can facilitate the design of mitigation strategies such as those based on evolutionary computational intelligence. Specifically, in this paper, we propose a cyber kill chain based taxonomy of banking Trojans...

  13. A New Methodology for Fuel Mass Computation of an operating Aircraft

    Directory of Open Access Journals (Sweden)

    M Souli

    2016-03-01

    Full Text Available The paper presents a new computational methodology for the accurate computation of the fuel mass inside an aircraft wing during flight. The computation is carried out using hydrodynamic equations, classically known to the CFD community as the Navier-Stokes equations. For this purpose, computational software is developed that computes the fuel mass inside the tank based on experimental data from pressure gauges inserted in the fuel tank. In practice, and for safety reasons, an optical fiber sensor is used for fluid level detection. The optical system consists of an optically controlled acoustic transceiver system which measures the fuel level inside each compartment of the fuel tank. The system computes the fuel volume inside the tank and needs the density to compute the total fuel mass, so with the optical sensing technique a density measurement inside the tank is required. The method developed in the paper requires pressure measurements in each tank compartment; the density is then computed based on the pressure measurements and hydrostatic assumptions. The methodology is tested using a fuel tank provided by Airbus for the time history of the refueling process.
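
    The hydrostatic step described above reduces to a one-line formula: with two gauges separated by a known height dz in the same compartment, dp = rho * g * dz gives the density, which the measured volume converts to mass. A sketch with invented gauge readings (roughly kerosene-like density):

      G = 9.81  # gravitational acceleration, m/s^2

      def fuel_density(p_lower, p_upper, dz):
          # Hydrostatics between two gauges mounted a vertical distance
          # dz apart in the same compartment: dp = rho * g * dz,
          # hence rho = dp / (g * dz).
          return (p_lower - p_upper) / (G * dz)

      def fuel_mass(compartments):
          # compartments: (p_lower [Pa], p_upper [Pa], dz [m], volume [m^3])
          return sum(fuel_density(pl, pu, dz) * vol
                     for pl, pu, dz, vol in compartments)

      # Invented gauge readings giving a kerosene-like 800 kg/m^3:
      tank = [(105000.0, 101076.0, 0.50, 1.2),
              (106748.0, 101254.4, 0.70, 0.8)]
      print(f"total fuel mass: {fuel_mass(tank):.1f} kg")  # -> 1600.0 kg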

  14. Optimization and Assessment of Wavelet Packet Decompositions with Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Schell Thomas

    2003-01-01

    Full Text Available In image compression, the wavelet transformation is a state-of-the-art component. Recently, wavelet packet decomposition has received quite an interest. A popular approach for wavelet packet decomposition is the near-best-basis algorithm using nonadditive cost functions. In contrast to additive cost functions, the wavelet packet decomposition of the near-best-basis algorithm is only suboptimal. We apply methods from the field of evolutionary computation (EC) to test the quality of the near-best-basis results. We observe a phenomenon: the results of the near-best-basis algorithm are inferior in terms of cost-function optimization but are superior in terms of rate/distortion performance compared to EC methods.
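
    For contrast with the nonadditive costs studied above, a sketch of the classical additive best-basis pruning (Coifman-Wickerhauser) on a hand-rolled Haar packet tree; with an additive cost the bottom-up compare-and-prune recursion is exactly optimal, which is what the near-best-basis algorithm gives up:

      import numpy as np

      def haar_split(x):
          # One Haar analysis step: approximation and detail halves.
          return ((x[0::2] + x[1::2]) / np.sqrt(2),
                  (x[0::2] - x[1::2]) / np.sqrt(2))

      def cw_entropy(c, eps=1e-12):
          # Coifman-Wickerhauser additive entropy cost.
          c2 = c ** 2
          return float(-np.sum(c2 * np.log(np.maximum(c2, eps))))

      def best_basis(x, depth):
          # Keep a node if its own cost beats the summed cost of its
          # children; with an additive cost this pruning is optimal.
          if depth == 0 or len(x) < 2:
              return [x], cw_entropy(x)
          a, d = haar_split(x)
          left, cl = best_basis(a, depth - 1)
          right, cr = best_basis(d, depth - 1)
          here = cw_entropy(x)
          if here <= cl + cr:
              return [x], here
          return left + right, cl + cr

      rng = np.random.default_rng(4)
      signal = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * rng.normal(size=64)
      leaves, cost = best_basis(signal, depth=3)
      print(len(leaves), "leaf bands, cost", round(cost, 2))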

  15. Emission computed tomography: methodology and applications

    International Nuclear Information System (INIS)

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development and use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes 18F-2-fluoro-2-deoxyglucose ([18F]-FDG). In this method [18F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [18F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)
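
    A hedged sketch of the steady-state three-compartment arithmetic behind the [18F]-FDG method (a simplified form of the Sokoloff operational equation, not the full version with plasma integrals; the rate constants and lumped constant below are illustrative assumptions):

      def cmrglc(cp_glucose, k1, k2, k3, lumped_constant=0.52):
          # Steady-state three-compartment estimate of local cerebral
          # glucose metabolism from FDG rate constants:
          #   CMRglc = (Cp / LC) * K1 * k3 / (k2 + k3)
          # cp_glucose in umol/ml, K1 in ml/(g*min), k2 and k3 in 1/min,
          # so the result is in umol/(g*min).
          return (cp_glucose / lumped_constant) * k1 * k3 / (k2 + k3)

      # Illustrative (assumed) gray-matter-like values:
      print(cmrglc(cp_glucose=5.0, k1=0.10, k2=0.13, k3=0.06))  # ~0.3 umol/(g*min)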

  16. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...

  17. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.
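
    A hedged sketch of the batch-processing workflow the abstract describes: the -a/-d/-o flag pattern (analysis-options .mao file exported from MEGA-Proto, input data, output target) follows the MEGA-CC documentation, but the file names and directory layout below are assumptions; check `megacc -h` for the installed version.

```python
import subprocess
from pathlib import Path

MAO = "distance_estimation.mao"          # hypothetical settings from MEGA-Proto

# Run the same analysis over every alignment in a directory.
Path("results").mkdir(exist_ok=True)
for fasta in sorted(Path("alignments").glob("*.fasta")):
    out = Path("results") / fasta.stem
    subprocess.run(["megacc", "-a", MAO, "-d", str(fasta), "-o", str(out)],
                   check=True)           # stop the batch on the first failure
```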

  18. Why is economic geography not an evolutionary science? Towards an evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.; Martin, R.

    2008-01-01

    The paper explains the commonalities and differences between neoclassical, institutional and evolutionary approaches that have been influential in economic geography during the last couple of decades. By separating the three approaches in terms of theoretical content and research methodology, we can

  19. Why is economic geography not an evolutionary science? Towards an evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2006-01-01

    The paper explains the commonalities and differences between neoclassical, institutional and evolutionary approaches that have been influential in economic geography during the last couple of decades. By separating the three approaches in terms of theoretical content and research methodology, we can

  20. Applied evolutionary economics and economic geography

    OpenAIRE

    Peter Sunley

    2008-01-01

    Applied Evolutionary Economics and Economic Geography aims to further advance empirical methodologies in evolutionary economics, with a special emphasis on geography and firm location. It does so by bringing together a select group of leading scholars including economists, geographers and sociologists, all of whom share an interest in explaining the uneven distribution of economic activities in space and the historical processes that have produced these patterns.

  1. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.

  2. Where Evolutionary Psychology Meets Cognitive Neuroscience: A Précis to Evolutionary Cognitive Neuroscience

    Directory of Open Access Journals (Sweden)

    Austen L. Krill

    2007-01-01

    Full Text Available Cognitive neuroscience, the study of brain-behavior relationships, has long attempted to map the brain. The discipline is flourishing, with an increasing number of functional neuroimaging studies appearing in the scientific literature daily. Unlike biology and even psychology, the cognitive neurosciences have only recently begun to apply evolutionary meta-theory and methodological guidance. Approaching cognitive neuroscience from an evolutionary perspective allows scientists to apply biologically based theoretical guidance to their investigations, which can be conducted in both humans and nonhuman animals. In fact, several investigations of this sort are underway in laboratories around the world. This paper and two new volumes (Platek, Keenan, and Shackelford [Eds.], 2007; Platek and Shackelford [Eds.], under contract) represent the first formal attempts to document the burgeoning field of evolutionary cognitive neuroscience. Here, we briefly review the current state of the science of evolutionary cognitive neuroscience, the methods available to the evolutionary cognitive neuroscientist, and what we foresee as the future directions of the discipline.

  3. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Background and objectives: Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology: The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results: Fourteen core principles led at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications: This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  4. Solution of Fractional Order System of Bagley-Torvik Equation Using Evolutionary Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Zahoor Raja

    2011-01-01

    Full Text Available A stochastic technique has been developed for the solution of the fractional-order system represented by the Bagley-Torvik equation. The mathematical model of the equation was developed with the help of feed-forward artificial neural networks. The networks were trained with evolutionary computational intelligence based on a genetic algorithm hybridized with a pattern-search technique. The designed scheme was successfully applied to different forms of the equation. Results are compared with standard approximate analytic solutions, stochastic numerical solvers and exact solutions.
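
    A simplified, self-contained sketch of the idea: a small tanh network serves as the trial solution and an evolutionary loop minimizes the equation residual. The test problem (chosen so the exact solution is y = t² with zero initial conditions), the Grünwald-Letnikov discretization of D^{3/2}, the network size, and the bare (1+1) evolution strategy standing in for the paper's GA/pattern-search hybrid are all assumptions for illustration:

```python
import numpy as np
from math import sqrt, pi

# Assumed test problem: y'' + D^{3/2} y + y = f(t) on [0, 1], y(0) = y'(0) = 0,
# with f chosen so that the exact solution is y(t) = t^2.
h = 1.0 / 64
t = np.arange(0.0, 1.0 + h / 2, h)
f = 2.0 + (4.0 / sqrt(pi)) * np.sqrt(t) + t**2

def gl_fractional_derivative(y, alpha, h):
    """Gruenwald-Letnikov approximation of D^alpha y on a uniform grid."""
    w = np.empty(len(y))
    w[0] = 1.0
    for k in range(1, len(y)):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)   # (-1)^k * binom(alpha, k)
    return np.array([w[: i + 1] @ y[i::-1] for i in range(len(y))]) / h**alpha

def network(params, t):
    """Trial solution: one-hidden-layer tanh network with 10 neurons."""
    a, w, b = params.reshape(3, -1)
    return np.tanh(np.outer(t, w) + b) @ a

def fitness(params):
    y = network(params, t)
    ypp = np.gradient(np.gradient(y, h), h)           # y'' by finite differences
    residual = ypp + gl_fractional_derivative(y, 1.5, h) + y - f
    ics = y[0] ** 2 + ((y[1] - y[0]) / h) ** 2        # penalty for violated ICs
    return np.mean(residual**2) + 10.0 * ics

# A bare (1+1) evolution strategy stands in for the GA + pattern-search hybrid;
# it drives the residual down, just far less efficiently.
rng = np.random.default_rng(0)
best = 0.1 * rng.standard_normal(30)
best_fit = fitness(best)
for _ in range(2000):
    child = best + 0.05 * rng.standard_normal(30)
    child_fit = fitness(child)
    if child_fit < best_fit:
        best, best_fit = child, child_fit
print(f"residual fitness {best_fit:.4f}, "
      f"max |y - t^2| = {np.abs(network(best, t) - t**2).max():.3f}")
```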

  5. The integration of Darwinism and evolutionary morphology: Alexej Nikolajevich Sewertzoff (1866-1936) and the developmental basis of evolutionary change.

    Science.gov (United States)

    Levit, George S; Hossfeld, Uwe; Olsson, Lennart

    2004-07-15

    The growth of evolutionary morphology in the late 19th and early 20th centuries was inspired by the work of Carl Gegenbaur (1826-1903) and his protégé and friend Ernst Haeckel (1834-1919). However, neither of them succeeded in creating and applying a strictly Darwinian (selectionist) methodology. This task was left to the next generation of evolutionary morphologists. In this paper we present a relatively unknown researcher, Alexej Nikolajevich Sewertzoff (1866-1936) who made important contributions towards a synthesis of Darwinism and evolutionary morphology. Copyright 2004 Wiley-Liss, Inc.

  6. Darwinian foundations for evolutionary economics

    NARCIS (Netherlands)

    Stoelhorst, J.W.

    2008-01-01

    This paper engages with the methodological debate on the contribution of Darwinism to Veblen's (1898) evolutionary research program for economics. I argue that ontological continuity, generalized Darwinism, and multi-level selection are necessary building blocks for an explanatory framework that can

  7. Evolutionary Sound Synthesis Controlled by Gestural Data

    Directory of Open Access Journals (Sweden)

    Jose Fornari

    2011-05-01

    Full Text Available This article focuses on interdisciplinary research involving Computer Music and Generative Visual Art. We describe the implementation of two interactive artistic systems based on principles of Gestural Data retrieval (WILSON, 2002) and self-organization (MORONI, 2003) to control an Evolutionary Sound Synthesis method (ESSynth). The first implementation uses, as gestural data, image mapping of handmade drawings. The second uses gestural data from dynamic body movements in dance. The resulting computer output is generated by an interactive system implemented in Pure Data (PD). This system uses principles of Evolutionary Computation (EC), which yields the generation of a synthetic adaptive population of sound objects. Considering that music can be seen as "organized sound", the contribution of our study is to develop a system that aims to generate "self-organized sound" – a method that uses evolutionary computation to bridge between gesture, sound and music.

  8. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has arisen for tools that can aid in their design. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been...
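
    The core of NSGA-II is non-dominated sorting; a compact sketch of Pareto dominance and front peeling (the O(n²) bookkeeping of the fast non-dominated sort), with a toy controller trade-off as data (the objective values are hypothetical):

```python
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return np.all(a <= b) and np.any(a < b)

def non_dominated_sort(F):
    """Rank objective vectors into Pareto fronts, as in the first stage of NSGA-II."""
    n = len(F)
    dominated_by = [[] for _ in range(n)]   # indices that i dominates
    counts = np.zeros(n, dtype=int)         # how many vectors dominate i
    for i in range(n):
        for j in range(n):
            if i != j and dominates(F[i], F[j]):
                dominated_by[i].append(j)
            elif i != j and dominates(F[j], F[i]):
                counts[i] += 1
    fronts, current = [], [i for i in range(n) if counts[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                counts[j] -= 1
                if counts[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

# Toy controller trade-off: objective 1 = tracking error, objective 2 = control effort.
F = np.array([[0.1, 5.0], [0.2, 3.0], [0.3, 3.5], [0.05, 9.0], [0.25, 2.0]])
print(non_dominated_sort(F))   # the first front holds the non-dominated designs
```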

  9. An efficient hysteresis modeling methodology and its implementation in field computation applications

    Energy Technology Data Exchange (ETDEWEB)

    Adly, A.A., E-mail: adlyamr@gmail.com [Electrical Power and Machines Dept., Faculty of Engineering, Cairo University, Giza 12613 (Egypt); Abd-El-Hafiz, S.K. [Engineering Mathematics Department, Faculty of Engineering, Cairo University, Giza 12613 (Egypt)

    2017-07-15

    Highlights: • An approach to simulate hysteresis while taking shape anisotropy into consideration. • Utilizing the ensemble of triangular sub-region hysteresis models in field computation. • A novel tool capable of carrying out field computation while keeping track of hysteresis losses. • The approach may be extended to 3D tetrahedra sub-volumes. - Abstract: Field computation in media exhibiting hysteresis is crucial to a variety of applications such as magnetic recording processes and accurate determination of core losses in power devices. Recently, Hopfield neural networks (HNN) have been successfully configured to construct scalar and vector hysteresis models. This paper presents an efficient hysteresis modeling methodology and its implementation in field computation applications. The methodology is based on the application of the integral-equation approach to discretized triangular magnetic sub-regions. Within every triangular sub-region, hysteresis properties are realized using a 3-node HNN. Details of the approach and sample computation results are given in the paper.

  10. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point at which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.

  11. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.
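
    The SAX half of the approach is easy to sketch: z-normalize the series, average over segments (PAA), then discretize against breakpoints that are equiprobable under a Gaussian. The segment count, alphabet size, and synthetic price path below are illustrative; the GA that evolves rules over the resulting symbol words is not shown:

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments=8, alphabet_size=4):
    """Symbolic Aggregate approXimation of a time series as a short word."""
    x = (series - series.mean()) / series.std()
    paa = x.reshape(n_segments, -1).mean(axis=1)    # assumes len % n_segments == 0
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    symbols = np.searchsorted(breakpoints, paa)     # equiprobable Gaussian bins
    return "".join(chr(ord("a") + s) for s in symbols)

rng = np.random.default_rng(0)
prices = np.cumsum(rng.standard_normal(64)) + 100.0   # synthetic price path
print(sax(prices))                                    # e.g. an 8-letter SAX word
```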

  12. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    Full Text Available The definition of an investment plan to implement in a power distribution system is a task that utilities constantly face. This work presents a methodology for determining a short-term investment plan for a power distribution system, using as criteria for evaluating investment projects their associated costs and the benefit customers derive from their implementation. Given the number of projects carried out annually on the system, the definition of an investment plan requires the use of computational tools to find, among a set of possibilities, the one that best suits the present needs of the system and yields the best results. That is why, in this work, the multi-objective evolutionary algorithm SPEA (Strength Pareto Evolutionary Algorithm) is implemented, which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of the operators on the performance of the evolutionary algorithm and on the results.

  13. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    Science.gov (United States)

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
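
    A minimal sketch of the data-parallel pattern described: shards of the training set are evaluated in parallel and partial gradients are summed by a master process. Least squares on a linear model stands in for the colonoscopy networks, and Python multiprocessing stands in for the parallel virtual machine:

```python
import numpy as np
from multiprocessing import Pool

def partial_gradient(args):
    """Gradient of 0.5 * ||Xw - y||^2 on one shard of the training set."""
    w, X, y = args
    return X.T @ (X @ w - y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, w_true = rng.standard_normal((4000, 10)), rng.standard_normal(10)
    y = X @ w_true
    shards = np.array_split(np.arange(4000), 4)   # partition across 4 workers
    w = np.zeros(10)
    with Pool(4) as pool:
        for _ in range(200):
            grads = pool.map(partial_gradient, [(w, X[s], y[s]) for s in shards])
            w -= 1e-4 * np.sum(grads, axis=0)     # synchronous master update
    print("parameter error:", np.linalg.norm(w - w_true))
```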

  14. Open Issues in Evolutionary Robotics.

    Science.gov (United States)

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  15. On the Existence of Evolutionary Learning Equilibriums

    Directory of Open Access Journals (Sweden)

    Masudul Alam Choudhury

    2011-12-01

    Full Text Available The usual kinds of Fixed-Point Theorems formalized on the existence of competitive equilibrium that explain much of economic theory at the core of economics can operate only on bounded and closed sets with convex mappings. But these conditions are hardly true of the real world of economic and financial complexities and perturbations. The category of learning sets explained by continuous fields of interactive, integrative and evolutionary behaviour caused by dynamic preferences at the individual and institutional and social levels cannot maintain the assumption of closed, bounded and convex sets. Thus learning sets and multi-system inter-temporal relations explained by pervasive complementarities and  participation between variables and entities, and evolution by learning, have evolutionary equilibriums. Such a study requires a new methodological approach. This paper formalizes such a methodology for evolutionary equilibriums in learning spaces. It briefly points out the universality of learning equilibriums in all mathematical structures. For a particular case though, the inter-systemic interdependence between sustainable development and ethics and economics in the specific understanding of learning domain is pointed out.

  16. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is a user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven to make it easier for the use of both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  17. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
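
    A toy Monte Carlo in this spirit, with every distribution and geometric constant hypothetical (not TORMIS's data-based models): sample a wind event, inject a missile, fly it ballistically, and count target strikes, reporting the binomial standard error of the estimate:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

wind = rng.weibull(2.0, N) * 40.0            # tornado wind speed sample, m/s
v0 = 0.4 * wind                              # crude missile injection speed model
angle = rng.uniform(0.05, 0.6, N)            # launch angle, rad
launch_x = rng.uniform(-300.0, -50.0, N)     # launch point relative to target, m

g = 9.81
flight = v0**2 * np.sin(2 * angle) / g       # ballistic range of each missile
impact_x = launch_x + flight
hits = (impact_x > -5.0) & (impact_x < 5.0)  # 10 m wide target band

p = hits.mean()
se = np.sqrt(p * (1 - p) / N)                # binomial standard error
print(f"P(impact) ~ {p:.2e} +/- {se:.1e}")
```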

  18. Exploring Tradeoffs in Demand-Side and Supply-Side Management of Urban Water Resources Using Agent-Based Modeling and Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Lufthansa Kanta

    2015-11-01

    Full Text Available Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger: (1 increases in the volume of water pumped through inter-basin transfers from an external reservoir; and (2 drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.

  19. Research traditions and evolutionary explanations in medicine.

    Science.gov (United States)

    Méthot, Pierre-Olivier

    2011-02-01

    In this article, I argue that distinguishing 'evolutionary' from 'Darwinian' medicine will help us assess the variety of roles that evolutionary explanations can play in a number of medical contexts. Because the boundaries of evolutionary and Darwinian medicine overlap to some extent, however, they are best described as distinct 'research traditions' rather than as competing paradigms. But while evolutionary medicine does not stand out as a new scientific field of its own, Darwinian medicine is united by a number of distinctive theoretical and methodological claims. For example, evolutionary medicine and Darwinian medicine can be distinguished with respect to the styles of evolutionary explanations they employ. While the former primarily involves 'forward looking' explanations, the latter depends mostly on 'backward looking' explanations. A forward looking explanation tries to predict the effects of ongoing evolutionary processes on human health and disease in contemporary environments (e.g., hospitals). In contrast, a backward looking explanation typically applies evolutionary principles from the vantage point of humans' distant biological past in order to assess present states of health and disease. Both approaches, however, are concerned with the prevention and control of human diseases. In conclusion, I raise some concerns about the claim that 'nothing in medicine makes sense except in the light of evolution'.

  20. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain premature phenomena of the quantum evolutionary algorithm. The proposed algorithm adopts a chaotic initialization method to generate the initial population, which will form a pe... tests. The presented algorithm is applied to urban traffic signal timing optimization and the effect is satisfactory.
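
    The chaotic-initialization ingredient is simple to sketch: successive iterates of the fully chaotic logistic map seed the population, exploiting the map's ergodic, non-repeating trajectory over (0, 1). The map choice and population shape below are illustrative assumptions:

```python
import numpy as np

def chaotic_population(pop_size, dim, x0=0.7):
    """Initial population from logistic-map iterates x <- 4x(1-x)."""
    pop = np.empty((pop_size, dim))
    x = x0                                   # any seed avoiding fixed points
    for i in range(pop_size):
        for j in range(dim):
            x = 4.0 * x * (1.0 - x)          # fully chaotic logistic map
            pop[i, j] = x
    return pop

print(chaotic_population(4, 3).round(3))     # 4 individuals, 3 genes each
```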

  1. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Contents: Introduction (A Small Sample of Ideas; The Philosophy of Natural Computing; The Three Branches: A Brief Overview; When to Use Natural Computing Approaches); Conceptualization (General Concepts); PART I - COMPUTING INSPIRED BY NATURE: Evolutionary Computing (Problem Solving as a Search Task; Hill Climbing and Simulated Annealing; Evolutionary Biology; Evolutionary Computing; The Other Main Evolutionary Algorithms; From Evolutionary Biology to Computing; Scope of Evolutionary Computing); Neurocomputing (The Nervous System; Artif...

  2. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  3. Conversion Rate Optimization through Evolutionary Computation

    OpenAIRE

    Miikkulainen, Risto; Iscoe, Neil; Shagrin, Aaron; Cordell, Ron; Nazari, Sam; Schoolland, Cory; Brundage, Myles; Epstein, Jonathan; Dean, Randy; Lamba, Gurmeet

    2017-01-01

    Conversion optimization means designing a web interface so that as many users as possible take a desired action on it, such as register or purchase. Such design is usually done by hand, testing one change at a time through A/B testing, or a limited number of combinations through multivariate testing, making it possible to evaluate only a small fraction of designs in a vast design space. This paper describes Sentient Ascend, an automatic conversion optimization system that uses evolutionary op...

  4. The human dark side: evolutionary psychology and original sin.

    Science.gov (United States)

    Lee, Joseph; Theol, M

    2014-04-01

    Human nature has a dark side, something important to religions. Evolutionary psychology has been used to illuminate this human shadow side, although as a discipline it has attracted criticism. This article seeks to examine evolutionary psychology's understanding of human nature and to propose an unexpected dialog with an enduring account of human evil known as original sin. Two cases are briefly considered: murder and rape. To further the exchange, numerous theoretical and methodological criticisms of evolutionary psychology, and replies to them, are explored jointly with original sin. Evolutionary psychology can partner with original sin since they share some theoretical likenesses, and together they offer insights into the nature of what it means to be human.

  5. Evolutionary accounts of human behavioural diversity

    Science.gov (United States)

    Brown, Gillian R.; Dickins, Thomas E.; Sear, Rebecca; Laland, Kevin N.

    2011-01-01

    Human beings persist in an extraordinary range of ecological settings, in the process exhibiting enormous behavioural diversity, both within and between populations. People vary in their social, mating and parental behaviour and have diverse and elaborate beliefs, traditions, norms and institutions. The aim of this theme issue is to ask whether, and how, evolutionary theory can help us to understand this diversity. In this introductory article, we provide a background to the debate surrounding how best to understand behavioural diversity using evolutionary models of human behaviour. In particular, we examine how diversity has been viewed by the main subdisciplines within the human evolutionary behavioural sciences, focusing in particular on the human behavioural ecology, evolutionary psychology and cultural evolution approaches. In addition to differences in focus and methodology, these subdisciplines have traditionally varied in the emphasis placed on human universals, ecological factors and socially learned behaviour, and on how they have addressed the issue of genetic variation. We reaffirm that evolutionary theory provides an essential framework for understanding behavioural diversity within and between human populations, but argue that greater integration between the subfields is critical to developing a satisfactory understanding of diversity. PMID:21199836

  6. Literary study and evolutionary theory : A review essay.

    Science.gov (United States)

    Carroll, J

    1998-09-01

    Several recent books have claimed to integrate literary study with evolutionary biology. All of the books here considered, except Robert Storey's, adopt conceptions of evolutionary theory that are in some way marginal to the Darwinian adaptationist program. All the works attempt to connect evolutionary study with various other disciplines or methodologies: for example, with cultural anthropology, cognitive psychology, the psychology of emotion, neurobiology, chaos theory, or structuralist linguistics. No empirical paradigm has yet been established for this field, but important steps have been taken, especially by Storey, in formulating basic principles, identifying appropriate disciplinary connections, and marking out lines of inquiry. Reciprocal efforts are needed from biologists and social scientists.

  7. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. ... discovered and optimised using a large-scale parallel computational device and relevant soft-

  8. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time- and labor-consuming process. In this research work, this challenge is taken up, and a methodology and algorithms for the automated design of intelligent, integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid prototyping of such systems under realization constraints and, additionally, includes features of system-instance-specific self-correction for sustained operation in large volumes and in dynamically changing environments. The extension of these concepts to a reconfigurable hardware platform yields so-called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. With our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  9. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today, computing technologies are becoming the backbone of organizations and support individual work; to make computing devices useful, we must add software. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development has become a key and successful business nowadays, and without software all hardware is useless. The collective steps performed in development are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive models are fixed in advance, like the WATERFALL, Spiral, Prototype and V-shaped models, while adaptive models include agile Scrum. The methodologies of both kinds have their own procedures and steps. Predictive models are static and adaptive models are dynamic, meaning that changes cannot be made in predictive models while adaptive models have the capability to change. The purpose of this study is to become familiar with all of these models and discuss their uses and development steps. This discussion will be helpful in deciding which model to use in which circumstances.

  10. A computational methodology for formulating gasoline surrogate fuels with accurate physical and chemical kinetic properties

    KAUST Repository

    Ahmed, Ahfaz; Goteng, Gokop; Shankar, Vijai; Al-Qurashi, Khalid; Roberts, William L.; Sarathy, Mani

    2015-01-01

    simpler molecular composition that represent real fuel behavior in one or more aspects are needed to enable repeatable experimental and computational combustion investigations. This study presents a novel computational methodology for formulating

  11. Calculation and evaluation methodology of the flawed pipe and the compute program development

    International Nuclear Information System (INIS)

    Liu Chang; Qian Hao; Yao Weida; Liang Xingyun

    2013-01-01

    Background: In a pressurized pipe, a crack will grow gradually under alternating load even when the load is below the fatigue strength limit. Purpose: Both a calculation and an evaluation methodology for a flawed pipe detected during in-service inspection are elaborated here, based on Elastic-Plastic Fracture Mechanics (EPFM) criteria. Methods: In the computation, the interaction of flaw depth and length has been considered, and a computer program was developed in Visual C++. Results: The fluctuating loads of the Reactor Coolant System transients, the initial flaw shape, and the initial flaw orientation are all accounted for. Conclusions: The calculation and evaluation methodology presented here is an important basis for deciding whether continued operation is acceptable. (authors)

  12. MicroComputed Tomography: Methodology and Applications

    International Nuclear Information System (INIS)

    Stock, Stuart R.

    2009-01-01

    Due to the availability of commercial laboratory systems and the emergence of user facilities at synchrotron radiation sources, studies using microcomputed tomography (microCT) have increased exponentially. MicroComputed Tomography: Methodology and Applications provides a complete introduction to the technology, describing how to use it effectively and how to understand its results. The first part of the book focuses on methodology, covering experimental methods, data analysis, and visualization approaches. The second part addresses various microCT applications, including porous solids, microstructural evolution, soft tissue studies, multimode studies, and indirect analyses. The author presents a sufficient amount of fundamental material so that those new to the field can develop a working understanding of how to design their own microCT studies. One of the first full-length references dedicated to microCT, this book provides an accessible introduction to the field, supplemented with application examples and color images.

  13. Mean-Potential Law in Evolutionary Games

    Science.gov (United States)

    Nałecz-Jawecki, Paweł; Miekisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
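
    For comparison with the potential-function route, fixation probabilities in a birth-death chain with two absorbing states also have the standard closed form rho = 1 / (1 + sum_k prod_{j<=k} gamma_j), with gamma_j = T-(j)/T+(j) the backward/forward transition ratio at j mutants. A sketch, checked against neutral drift and the classic constant-fitness Moran result (not the Letter's own method):

```python
import numpy as np

def fixation_probability(N, gamma):
    """Fixation probability of a single mutant in a birth-death chain with
    absorbing states 0 and N; gamma(j) = T-(j)/T+(j) at j mutants."""
    products = np.cumprod([gamma(j) for j in range(1, N)])
    return 1.0 / (1.0 + products.sum())

# Neutral drift: gamma = 1 everywhere, so the answer must be exactly 1/N.
print(fixation_probability(100, lambda j: 1.0))        # -> 0.01

# Constant relative fitness r > 1 reproduces the classic Moran formula
# (1 - 1/r) / (1 - 1/r^N), since gamma(j) = 1/r for every j.
r = 1.1
print(fixation_probability(100, lambda j: 1.0 / r))    # ~ 0.0909
```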

  14. Statistical Methodologies to Integrate Experimental and Computational Research

    Science.gov (United States)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

    Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and refine computational models. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  15. The Methodology of Expert Audit in the Cloud Computing System

    Directory of Open Access Journals (Sweden)

    Irina Vladimirovna Mashkina

    2013-12-01

    Full Text Available The problem of information security audit in cloud computing systems is discussed. The methodology of the expert audit is described; it allows one to estimate not only the value of the information security risk level but also the operative value of the information security risk level. Fuzzy cognitive maps and an artificial neural network are used to solve this problem.

  16. Evolutionary cell biology: two origins, one objective.

    Science.gov (United States)

    Lynch, Michael; Field, Mark C; Goodson, Holly V; Malik, Harmit S; Pereira-Leal, José B; Roos, David S; Turkewitz, Aaron P; Sazer, Shelley

    2014-12-02

    All aspects of biological diversification ultimately trace to evolutionary modifications at the cellular level. This central role of cells frames the basic questions as to how cells work and how cells come to be the way they are. Although these two lines of inquiry lie respectively within the traditional provenance of cell biology and evolutionary biology, a comprehensive synthesis of evolutionary and cell-biological thinking is lacking. We define evolutionary cell biology as the fusion of these two eponymous fields with the theoretical and quantitative branches of biochemistry, biophysics, and population genetics. The key goals are to develop a mechanistic understanding of general evolutionary processes, while specifically infusing cell biology with an evolutionary perspective. The full development of this interdisciplinary field has the potential to solve numerous problems in diverse areas of biology, including the degree to which selection, effectively neutral processes, historical contingencies, and/or constraints at the chemical and biophysical levels dictate patterns of variation for intracellular features. These problems can now be examined at both the within- and among-species levels, with single-cell methodologies even allowing quantification of variation within genotypes. Some results from this emerging field have already had a substantial impact on cell biology, and future findings will significantly influence applications in agriculture, medicine, environmental science, and synthetic biology.

  17. A Computer-Based System Integrating Instruction and Information Retrieval: A Description of Some Methodological Considerations.

    Science.gov (United States)

    Selig, Judith A.; And Others

    This report, summarizing the activities of the Vision Information Center (VIC) in the field of computer-assisted instruction from December 1966 to August 1967, describes the methodology used to load a large body of information--a programmed text on basic ophthalmology--onto a computer for subsequent information retrieval and computer-assisted…

  18. 11th International Conference on Computer and Information Science

    CERN Document Server

    Computer and Information 2012

    2012-01-01

    The series "Studies in Computational Intelligence" (SCI) publishes new developments and advances in the various areas of computational intelligence – quickly and with a high quality. The intent is to cover the theory, applications, and design methods of computational intelligence, as embedded in the fields of engineering, computer science, physics and life science, as well as the methodologies behind them. The series contains monographs, lecture notes and edited volumes in computational intelligence spanning the areas of neural networks, connectionist systems, genetic algorithms, evolutionary computation, artificial intelligence, cellular automata, self-organizing systems, soft computing, fuzzy systems, and hybrid intelligent systems. Critical to both contributors and readers are the short publication time and world-wide distribution - this permits a rapid and broad dissemination of research results.   The purpose of the 11th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2012...

  19. Economic and evolutionary hypotheses for cross-population variation in parochialism

    OpenAIRE

    Daniel Jacob Hruschka; Joseph Henrich

    2013-01-01

    Human populations differ reliably in the degree to which people favor family, friends and community members over strangers and outsiders. In the last decade, researchers have begun to propose several economic and evolutionary hypotheses for these cross-population differences in parochialism. In this paper, we outline major current theories and review recent attempts to test them. We also discuss the key methodological challenges in assessing these diverse economic and evolutionary theories...

  20. An Artificial Immune System-Inspired Multiobjective Evolutionary Algorithm with Application to the Detection of Distributed Computer Network Intrusions

    Science.gov (United States)

    2007-03-01

    Optimization: Coello, Van Veldhuizen, and Lamont define global optimization as "the process of finding the global minimum within some search space S" [CVL02] ... Technology, Shapes Markets, and Manages People, Simon & Schuster, New York, 1995. [CVL02] Coello, C., Van Veldhuizen, D., Lamont, G.B., Evolutionary... Anomaly Detection, Technical Report CS-2003-02, Computer Science Department, Florida Institute of Technology, 2003. [Marmelstein99] Marmelstein, R., Van...

  1. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    Accurate, detailed 3D neutron transport analysis for Gen-IV reactors is still time-consuming, regardless of the advanced computational hardware available in developed countries. This paper introduces a new concept that addresses the computational time while preserving detailed and accurate modeling: a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The performance of the AGENT methodology was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel part was then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von-Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced, and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor obtain more than a 20-fold speedup with its working frequency much lower than the CPU frequency. (authors)

  2. Avoiding Local Optima with Interactive Evolutionary Robotics

    Science.gov (United States)

    2012-07-09

    ...the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will... The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the...

  3. Just-in-Time Compilation-Inspired Methodology for Parallelization of Compute Intensive Java Code

    Directory of Open Access Journals (Sweden)

    GHULAM MUSTAFA

    2017-01-01

    Full Text Available Compute-intensive programs generally spend a significant fraction of their execution time in a small amount of repetitive code, commonly known as hotspot code. We observed that compute-intensive hotspots often possess exploitable loop-level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code for efficient execution. Using a similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. The proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated into the front-end of a JIT compiler to parallelize sequential code just before native translation; however, compilation to native code is out of the scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine the parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system.

  4. Just-in-time compilation-inspired methodology for parallelization of compute intensive java code

    International Nuclear Information System (INIS)

    Mustafa, G.; Ghani, M.U.

    2017-01-01

    Compute intensive programs generally consume significant fraction of execution time in a small amount of repetitive code. Such repetitive code is commonly known as hotspot code. We observed that compute intensive hotspots often possess exploitable loop level parallelism. A JIT (Just-in-Time) compiler profiles a running program to identify its hotspots. Hotspots are then translated into native code, for efficient execution. Using similar approach, we propose a methodology to identify hotspots and exploit their parallelization potential on multicore systems. Proposed methodology selects and parallelizes each DOALL loop that is either contained in a hotspot method or calls a hotspot method. The methodology could be integrated in front-end of a JIT compiler to parallelize sequential code, just before native translation. However, compilation to native code is out of scope of this work. As a case study, we analyze eighteen JGF (Java Grande Forum) benchmarks to determine parallelization potential of hotspots. Eight benchmarks demonstrate a speedup of up to 7.6x on an 8-core system. (author)

  5. A new methodology for the computer-aided construction of fault trees

    International Nuclear Information System (INIS)

    Salem, S.L.; Apostolakis, G.E.; Okrent, D.

    1977-01-01

    A methodology for systematically constructing fault trees for general complex systems is developed. A means of modeling component behaviour via decision tables is presented, along with a procedure for constructing and editing fault trees, either manually or by computer. The techniques employed result in a complete fault tree in standard form. In order to demonstrate the methodology, the computer program CAT was developed and is used to construct trees for a nuclear system. By analyzing and comparing these fault trees, several conclusions are reached. First, such an approach can be used to produce fault trees that accurately describe system behaviour. Second, multiple trees can be rapidly produced by defining various TOP events, including system success. Finally, the accuracy and utility of such trees is shown to depend upon the careful development of the decision table models by the analyst, and of the overall system definition itself. Thus the method is seen to be a tool for assisting in the work of fault tree construction rather than a replacement for the careful work of the fault tree analyst. (author)
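
    A toy illustration of the decision-table idea (hypothetical valve component, not the CAT program's actual format): rows whose output is a failed state translate directly into an OR-of-ANDs gate:

```python
# Each row maps a set of input-state conditions to an output state; collecting
# the rows whose output is "no_flow" yields that component's failure gate.
VALVE_TABLE = [
    ({"power": "lost"},                 "no_flow"),
    ({"control": "stuck"},              "no_flow"),
    ({"power": "ok", "control": "ok"},  "flow"),
]

def failure_gate(table, failed_output):
    """Return the fault-tree gate: OR over rows, AND over conditions in a row."""
    rows = [cond for cond, out in table if out == failed_output]
    return {"OR": [{"AND": [f"{k}={v}" for k, v in cond.items()]} for cond in rows]}

print(failure_gate(VALVE_TABLE, "no_flow"))
# {'OR': [{'AND': ['power=lost']}, {'AND': ['control=stuck']}]}
```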

  6. New computational methodology for large 3D neutron transport problems

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    We present a new computational methodology, based on the 3D method of characteristics, dedicated to solving very large 3D problems without spatial homogenization. In order to eliminate the input/output bottleneck that occurs when solving these large problems, we set up a new computing scheme that requires more CPU resources than the usual scheme, which is based on sweeps over large tracking files. The huge storage capacity needed for some problems, and the related I/O queries issued by the characteristics solver, are replaced by on-the-fly recalculation of tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be used efficiently. (authors)

  7. Evolutionary Studies in Business: A Presentation of a New Journal

    OpenAIRE

    Fernández Pérez, Paloma; Valls Pasola, Jaume

    2016-01-01

    The Journal of Evolutionary Studies in Business is a new open access journal led by an international, interdisciplinary team of scholars located at eight institutions on three continents, who want to attract contributions that help shed light on the new questions, challenges, methodologies and realities faced by businesses from an evolutionary perspective. The journal calls particularly for review essays that deal with new research topics in business and provide useful overviews of...

  8. 6th International Workshop Soft Computing Applications

    CERN Document Server

    Jain, Lakhmi; Kovačević, Branko

    2016-01-01

    These volumes constitute the Proceedings of the 6th International Workshop on Soft Computing Applications, or SOFA 2014, held on 24-26 July 2014 in Timisoara, Romania. This edition was organized by the University of Belgrade, Serbia in conjunction with the Romanian Society of Control Engineering and Technical Informatics (SRAIT) - Arad Section, The General Association of Engineers in Romania - Arad Section, Institute of Computer Science, Iasi Branch of the Romanian Academy and IEEE Romanian Section. The Soft Computing concept was introduced by Lotfi Zadeh in 1991 and serves to highlight the emergence of computing methodologies in which the accent is on exploiting the tolerance for imprecision and uncertainty to achieve tractability, robustness and low solution cost. Soft computing facilitates the use of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing in combination, leading to the concept of hybrid intelligent systems. The combination of ...

  9. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set oriented numerics and evolutionary computing, and to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo in different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 make a contribution to this goal.

  10. A Comparison of the Methodological Quality of Articles in Computer Science Education Journals and Conference Proceedings

    Science.gov (United States)

    Randolph, Justus J.; Julnes, George; Bednarik, Roman; Sutinen, Erkki

    2007-01-01

    In this study we empirically investigate the claim that articles published in computer science education journals are more methodologically sound than articles published in computer science education conference proceedings. A random sample of 352 articles was selected from those articles published in major computer science education forums between…

  11. Evaluation of dose from kV cone-beam computed tomography during radiotherapy: a comparison of methodologies

    Science.gov (United States)

    Buckley, J.; Wilkinson, D.; Malaroda, A.; Metcalfe, P.

    2017-01-01

    Three alternative methodologies to the Computed Tomography Dose Index (CTDI) for evaluating Cone-Beam Computed Tomography dose are compared: the Cone-Beam Dose Index (CBDI), the methodology recommended in IAEA Human Health Report No. 5, and the methodology recommended by AAPM Task Group 111 (TG-111). The protocols were evaluated for Pelvis and Thorax scan modes on Varian® On-Board Imager (OBI) and Truebeam kV XI imaging systems. The weighted planar average dose was highest for the AAPM methodology across all scans, with the CBDI the second highest overall. For the XI system, decreases of 17.96% and 1.14% from the TG-111 protocol to the IAEA and CBDI protocols, respectively, were observed for the Pelvis mode, and decreases of 18.15% and 13.10% for the Thorax mode. For the OBI system, the corresponding variations were 16.46% and 7.14% for the Pelvis mode, and 15.93% to the CBDI protocol for the Thorax mode.

  12. Evolutionary impact assessment: accounting for evolutionary consequences of fishing in an ecosystem approach to fisheries management.

    Science.gov (United States)

    Laugen, Ane T; Engelhard, Georg H; Whitlock, Rebecca; Arlinghaus, Robert; Dankel, Dorothy J; Dunlop, Erin S; Eikeset, Anne M; Enberg, Katja; Jørgensen, Christian; Matsumura, Shuichi; Nusslé, Sébastien; Urbach, Davnah; Baulier, Loїc; Boukal, David S; Ernande, Bruno; Johnston, Fiona D; Mollet, Fabian; Pardoe, Heidi; Therkildsen, Nina O; Uusi-Heikkilä, Silva; Vainikka, Anssi; Heino, Mikko; Rijnsdorp, Adriaan D; Dieckmann, Ulf

    2014-03-01

    Managing fisheries resources to maintain healthy ecosystems is one of the main goals of the ecosystem approach to fisheries (EAF). While a number of international treaties call for the implementation of EAF, there are still gaps in the underlying methodology. One aspect that has received substantial scientific attention recently is fisheries-induced evolution (FIE). Increasing evidence indicates that intensive fishing has the potential to exert strong directional selection on life-history traits, behaviour, physiology, and morphology of exploited fish. Of particular concern is that reversing evolutionary responses to fishing can be much more difficult than reversing demographic or phenotypically plastic responses. Furthermore, like climate change, multiple agents cause FIE, with effects accumulating over time. Consequently, FIE may alter the utility derived from fish stocks, which in turn can modify the monetary value living aquatic resources provide to society. Quantifying and predicting the evolutionary effects of fishing is therefore important for both ecological and economic reasons. An important reason this is not happening is the lack of an appropriate assessment framework. We therefore describe the evolutionary impact assessment (EvoIA) as a structured approach for assessing the evolutionary consequences of fishing and evaluating the predicted evolutionary outcomes of alternative management options. EvoIA can contribute to EAF by clarifying how evolution may alter stock properties and ecological relations, support the precautionary approach to fisheries management by addressing a previously overlooked source of uncertainty and risk, and thus contribute to sustainable fisheries.

  13. Economic and evolutionary hypotheses for cross-population variation in parochialism.

    Science.gov (United States)

    Hruschka, Daniel J; Henrich, Joseph

    2013-09-11

    Human populations differ reliably in the degree to which people favor family, friends, and community members over strangers and outsiders. In the last decade, researchers have begun to propose several economic and evolutionary hypotheses for these cross-population differences in parochialism. In this paper, we outline major current theories and review recent attempts to test them. We also discuss the key methodological challenges in assessing these diverse economic and evolutionary theories for cross-population differences in parochialism.

  14. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interactions among their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines the Weighed-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs from protein sequences. The major improvement of our method lies in an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), and in employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .
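
    A minimal sketch of a weighted extreme learning machine classifier of the general kind named in the abstract, assuming the usual formulation: a random hidden layer, with class-weighted regularized least squares for the output weights. The PSSM/LAG feature extraction is not reproduced; X is any fixed-length feature matrix, and the data below are random stand-ins.

```python
import numpy as np

def welm_train(X, y, n_hidden=200, C=1.0, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random biases
    H = np.tanh(X @ W + b)                        # hidden-layer outputs
    # Per-sample weights: inverse class frequency, so the minority class
    # (typically the self-interacting proteins) is not swamped.
    counts = np.bincount(y)
    s = 1.0 / counts[y]
    T = np.where(y == 1, 1.0, -1.0)
    # Weighted ridge solution: beta = (H' S H + I/C)^-1 H' S T
    HS = H * s[:, None]
    beta = np.linalg.solve(HS.T @ H + np.eye(n_hidden) / C, HS.T @ T)
    return W, b, beta

def welm_predict(X, W, b, beta):
    return (np.tanh(X @ W + b) @ beta > 0).astype(int)

# Toy usage with random data standing in for PSSM-derived features.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 40))
y = (rng.random(300) < 0.2).astype(int)           # imbalanced labels
W, b, beta = welm_train(X, y)
print("training accuracy:", (welm_predict(X, W, b, beta) == y).mean())
```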

  15. Evolutionary paths of Old French reciprocal markers. New and old methodological tools

    Directory of Open Access Journals (Sweden)

    Mikołaj Nkollo

    2012-01-01

    Full Text Available The present paper revolves around unusual paths of grammaticalization of Old French (12th century) reciprocal markers. This methodological framework requires medieval means of expressing reciprocity to be compared both with their parent forms in Classical Latin and with the markers introduced in subsequent stages of the history of the French language. The first hypothesis deals with how parallel markers, i.e. ones that have a common origin and that are used inside the same area of grammar (se… entre- and entre eux < Lat. inter se), differ from each other. This path is claimed to materialize provided one of the two terms begins to serve a particular function not performed by the other. The second hypothesis accounts for the means used to prevent ongoing reflexive / reciprocal homonymy. This task happened to be provisionally fulfilled by cors a cors and coste a coste until the advent of adverbs ending in -ment in the 14th century. As a consequence, body-part nouns lost most of their grammatical potential. The third hypothesis, formulated in terms of exaptation, explains how and why languages are likely to recycle erstwhile peripheral lexical elements. This evolutionary path takes place in response to a need to convey a given meaning unambiguously. Attempts at avoiding reflexive / reciprocal homonymy prompted the revival of the seemingly forgotten Latin items reciprocus / mutuus. Concluding remarks address the problem of whether the concept of exaptation is useful in historical linguistics and contain a proposal for constraining its scope. As for current views of grammaticalization, this notion seems to call for further refinement as well.

  16. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    Science.gov (United States)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid of a back-propagation neural network and an enhanced evolutionary algorithm. The input to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of the input letters. The evolutionary algorithm performs the global search of the search space, while the back-propagation algorithm performs the local search. The results have been computed by applying this approach to the recognition of the 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying results with relatively scarce input data, and a promising improvement in the convergence of the hybrid evolutionary back-propagation algorithm is exhibited.
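
    A schematic sketch of the global/local division of labor the abstract describes (assumptions only, not the authors' exact design): an evolutionary algorithm searches weight space globally, then gradient descent refines the best individual locally. It is shown on a tiny one-hidden-layer network fitting XOR, with a finite-difference gradient standing in for back-propagation for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def unpack(w):   # flat vector -> (W1, b1, W2, b2) for a 2-3-1 network
    return w[:6].reshape(2, 3), w[6:9], w[9:12].reshape(3, 1), w[12:]

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def loss(w):
    return float(np.mean((forward(w) - y) ** 2))

# Global phase: simple (mu + lambda) evolution of flat weight vectors.
pop = rng.normal(size=(30, 13))
for gen in range(200):
    children = pop + rng.normal(scale=0.3, size=pop.shape)
    merged = np.vstack([pop, children])
    pop = merged[np.argsort([loss(w) for w in merged])[:30]]
best = pop[0].copy()

# Local phase: numerical-gradient descent standing in for back-propagation.
for step in range(500):
    g = np.zeros_like(best)
    for i in range(len(best)):
        e = np.zeros_like(best)
        e[i] = 1e-5
        g[i] = (loss(best + e) - loss(best - e)) / 2e-5
    best -= 0.5 * g

print("final loss:", loss(best), "outputs:", forward(best).ravel().round(2))
```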

  17. Genomes, Phylogeny, and Evolutionary Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Monica

    2005-03-25

    With the completion of the human genome and the growing number of diverse genomes being sequenced, a new age of evolutionary research is currently taking shape. The myriad of technological breakthroughs in biology that are leading to the unification of broad scientific fields such as molecular biology, biochemistry, physics, mathematics and computer science are now known as systems biology. Here I present an overview, with an emphasis on eukaryotes, of how the postgenomics era is adopting comparative approaches that go beyond comparisons among model organisms to shape the nascent field of evolutionary systems biology.

  18. Economic and evolutionary hypotheses for cross-population variation in parochialism

    Directory of Open Access Journals (Sweden)

    Daniel Jacob Hruschka

    2013-09-01

    Full Text Available Human populations differ reliably in the degree to which people favor family, friends and community members over strangers and outsiders. In the last decade, researchers have begun to propose several economic and evolutionary hypotheses for these cross-population differences in parochialism. In this paper, we outline major current theories and review recent attempts to test them. We also discuss the key methodological challenges in assessing these diverse economic and evolutionary theories for cross-population differences in parochialism.

  19. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    Science.gov (United States)

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  20. Computationally based methodology for reengineering the high-level waste planning process at SRS

    International Nuclear Information System (INIS)

    Paul, P.K.; Gregory, M.V.; Wells, M.N.

    1997-01-01

    The Savannah River Site (SRS) has started processing its legacy of 34 million gallons of high-level radioactive waste into its final disposable form. The SRS high-level waste (HLW) complex consists of 51 waste storage tanks, 3 evaporators, 6 waste treatment operations, and 2 waste disposal facilities. It is estimated that processing the wastes to clean up all tanks will take 30+ yr of operation. Integrating all of the highly interactive facility operations over the entire life cycle in an optimal fashion, while meeting all budgetary, regulatory, and operational constraints and priorities, is a complex and challenging planning task. The waste complex operating plan for the entire time span is periodically published as an SRS report. A computationally based integrated methodology has been developed that has streamlined the planning process while showing how to run the operations at economically and operationally optimal conditions. The integrated computational model replaced a host of disconnected spreadsheet calculations and the analysts' trial-and-error solutions using various scenario choices. This paper presents the important features of the integrated computational methodology and highlights the parameters that are core components of the planning process

  1. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms for solving real engineering problems has grown in recent years, particularly where intense computational calculations are required, as when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts with evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  2. Evolutionary search for new high-k dielectric materials: methodology and applications to hafnia-based oxides.

    Science.gov (United States)

    Zeng, Qingfeng; Oganov, Artem R; Lyakhov, Andriy O; Xie, Congwei; Zhang, Xiaodong; Zhang, Jin; Zhu, Qiang; Wei, Bingqing; Grigorenko, Ilya; Zhang, Litong; Cheng, Laifei

    2014-02-01

    High-k dielectric materials are important as gate oxides in microelectronics and as potential dielectrics for capacitors. In order to enable computational discovery of novel high-k dielectric materials, we propose a fitness model (energy storage density) that includes the dielectric constant, bandgap, and intrinsic breakdown field. This model, used as a fitness function in conjunction with first-principles calculations and the global optimization evolutionary algorithm USPEX, efficiently leads to practically important results. We found a number of high-fitness structures of SiO2 and HfO2, some of which correspond to known phases and some of which are new. The results allow us to propose characteristics (genes) common to high-fitness structures: the coordination polyhedra and their degree of distortion. Our variable-composition searches in the HfO2-SiO2 system uncovered several high-fitness states. This hybrid algorithm opens up a new avenue for discovering novel high-k dielectrics with both fixed and variable compositions, and will speed up the process of materials discovery.
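
    A hedged sketch of an energy-storage-density fitness function of the kind the abstract describes. The textbook estimate for a linear dielectric is u = (1/2) ε0 εr E_bd²; whether the paper uses exactly this form, or how it estimates the breakdown field from the bandgap, is not stated here, so treat the inputs below as placeholder assumptions with illustrative numbers only.

```python
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def fitness(eps_r, e_breakdown_v_per_m):
    """Energy storage density in J/m^3 for a linear dielectric."""
    return 0.5 * EPS0 * eps_r * e_breakdown_v_per_m ** 2

# Toy comparison: a high-permittivity oxide with a modest breakdown field vs a
# low-permittivity oxide with a high one (illustrative numbers only).
print(fitness(eps_r=25.0, e_breakdown_v_per_m=4e8))   # HfO2-like
print(fitness(eps_r=3.9,  e_breakdown_v_per_m=1e9))   # SiO2-like
```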

  3. Evolution of teaching and evaluation methodologies: The experience in the computer programming course at the Universidad Nacional de Colombia

    Directory of Open Access Journals (Sweden)

    Jonatan Gomez Perdomo

    2014-05-01

    Full Text Available In this paper, we present the evolution of a computer-programming course at the Universidad Nacional de Colombia (UNAL). The teaching methodology has evolved from a linear and non-standardized methodology to a flexible, non-linear and student-centered methodology. Our methodology uses an e-learning platform that supports the learning process by offering students and professors custom navigation between the content and material in an interactive way (book chapters, exercises, videos). Moreover, the platform is open access, and approximately 900 students from the university take this course each term. In addition, our evaluation methodology has evolved from static evaluations based on paper tests to an online process based on computer adaptive testing (CAT), which chooses the questions to ask a student and assigns the student a grade according to the student’s ability.
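
    An illustrative sketch of the core loop of computer adaptive testing under a simple one-parameter (Rasch) model. The course platform's actual item-selection and grading rules are not described in the abstract, so this is a generic textbook version, not the UNAL system.

```python
import math, random

ITEMS = [{"difficulty": d} for d in (-2, -1, -0.5, 0, 0.5, 1, 2)]

def p_correct(ability, difficulty):
    return 1 / (1 + math.exp(-(ability - difficulty)))

def run_cat(true_ability, n_questions=5, step=0.6):
    est = 0.0                                   # start from an average estimate
    unused = list(ITEMS)
    for _ in range(n_questions):
        # Pick the unused item whose difficulty is closest to the estimate:
        # under the Rasch model, that item is the most informative.
        item = min(unused, key=lambda it: abs(it["difficulty"] - est))
        unused.remove(item)
        correct = random.random() < p_correct(true_ability, item["difficulty"])
        est += step if correct else -step       # crude ability update
    return est

random.seed(0)
print("estimated ability:", run_cat(true_ability=1.0))
```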

  4. A methodology for the design of experiments in computational intelligence with multiple regression models.

    Science.gov (United States)

    Fernandez-Lozano, Carlos; Gestal, Marcos; Munteanu, Cristian R; Dorado, Julian; Pazos, Alejandro

    2016-01-01

    The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.
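
    A small sketch of the kind of comparison workflow the abstract describes: several regression models evaluated under a common cross-validation design, followed by a statistical test of whether their score differences are significant. RRegrs itself is an R package; this is an analogous Python illustration, not its API.

```python
import numpy as np
from scipy import stats
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=10, noise=10, random_state=0)
cv = KFold(n_splits=10, shuffle=True, random_state=0)  # same folds for all models

models = {
    "linear": LinearRegression(),
    "lasso": Lasso(alpha=1.0),
    "forest": RandomForestRegressor(n_estimators=100, random_state=0),
}
scores = {name: cross_val_score(m, X, y, cv=cv, scoring="r2")
          for name, m in models.items()}

for name, s in scores.items():
    print(f"{name}: mean R2 = {s.mean():.3f}")

# Paired test on per-fold scores of two models (a nonparametric Wilcoxon test
# is a common choice when normality is in doubt).
stat, p = stats.wilcoxon(scores["linear"], scores["forest"])
print("Wilcoxon p-value:", p)
```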

  5. A methodology for the design of experiments in computational intelligence with multiple regression models

    Directory of Open Access Journals (Sweden)

    Carlos Fernandez-Lozano

    2016-12-01

    Full Text Available The design of experiments and the validation of the results achieved with them are vital in any research study. This paper focuses on the use of different Machine Learning approaches for regression tasks in the field of Computational Intelligence, and especially on a correct comparison between the different results provided by different methods, as those techniques are complex systems that require further study to be fully understood. A methodology commonly accepted in Computational Intelligence is implemented in an R package called RRegrs. This package includes ten simple and complex regression models to carry out predictive modeling using Machine Learning and well-known regression algorithms. The framework for experimental design presented herein is evaluated and validated against RRegrs. Our results differ for three out of five state-of-the-art simple datasets, and it can be stated that the selection of the best model according to our proposal is statistically significant and relevant. It is of relevance to use a statistical approach to indicate whether the differences are statistically significant when using this kind of algorithm. Furthermore, our results with three real complex datasets report different best models than the previously published methodology. Our final goal is to provide a complete methodology for the use of different steps in order to compare the results obtained in Computational Intelligence problems, as well as in other fields, such as bioinformatics, cheminformatics, etc., given that our proposal is open and modifiable.

  6. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models, such as computational expense, existing observation data, and high parameter dimensionality. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established D-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification, and a heuristic methodology based on the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
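
    A sketch of greedy D-optimal design selection, the general idea named in the abstract (the authors' actual implementation couples established data-worth tools with Null-Space Monte Carlo and a minimax criterion; none of that is reproduced here). Starting from an empty design, repeatedly add the candidate observation whose sensitivity row most increases the log determinant of the Fisher information matrix J'J.

```python
import numpy as np

def greedy_d_optimal(J, n_select, ridge=1e-6):
    """J: (n_candidates x n_parameters) sensitivity matrix."""
    n_cand, n_par = J.shape
    chosen = []
    for _ in range(n_select):
        best_gain, best_i = -np.inf, None
        for i in range(n_cand):
            if i in chosen:
                continue
            rows = J[chosen + [i]]
            # Ridge term keeps the matrix invertible while the design is small.
            sign, logdet = np.linalg.slogdet(rows.T @ rows + ridge * np.eye(n_par))
            if logdet > best_gain:
                best_gain, best_i = logdet, i
        chosen.append(best_i)
    return chosen

rng = np.random.default_rng(0)
J = rng.normal(size=(50, 5))          # 50 candidate observations, 5 parameters
print("selected observations:", greedy_d_optimal(J, n_select=5))
```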

  7. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
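
    A toy sketch of the evolutionary-multitasking idea: one population, a unified search space, and a "skill factor" assigning each individual to the task it is evaluated on, so that crossover can transfer genetic material across tasks. This is a bare-bones single-objective illustration of the paradigm, not the authors' multiobjective algorithm.

```python
import random

TASKS = [lambda x: (x - 0.2) ** 2,      # task 0
         lambda x: (x - 0.7) ** 2]      # task 1

random.seed(0)
pop = [{"x": random.random(), "task": i % 2} for i in range(40)]

for gen in range(100):
    children = []
    for _ in range(40):
        a, b = random.sample(pop, 2)    # parents may belong to different tasks
        x = (a["x"] + b["x"]) / 2 + random.gauss(0, 0.05)  # blend + mutation
        # Child inherits the skill factor (task) of a random parent.
        children.append({"x": min(max(x, 0), 1),
                         "task": random.choice([a, b])["task"]})
    merged = pop + children
    # Environmental selection: keep the best individuals per task,
    # each judged only on its own task.
    pop = []
    for t in (0, 1):
        group = [ind for ind in merged if ind["task"] == t]
        group.sort(key=lambda ind: TASKS[t](ind["x"]))
        pop += group[:20]

for t in (0, 1):
    best = min((i for i in pop if i["task"] == t), key=lambda i: TASKS[t](i["x"]))
    print(f"task {t}: best x = {best['x']:.3f}")
```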

  8. Resistance and relatedness on an evolutionary graph

    Science.gov (United States)

    Maciejewski, Wes

    2012-01-01

    When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph, with individuals as nodes and potential interactions as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area, as highly technical mathematics is often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more, however, to the study of population genetics than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
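
    The effective resistance side of this correspondence is easy to compute: with L the graph Laplacian and L+ its Moore-Penrose pseudoinverse, R(i, j) = L+[i, i] + L+[j, j] - 2 L+[i, j]. The sketch below computes R for a small cycle graph; how resistance then maps to relatedness follows the paper's derivation, which is not reproduced here.

```python
import numpy as np

def effective_resistance(adjacency):
    A = np.asarray(adjacency, float)
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
    Lp = np.linalg.pinv(L)                  # Moore-Penrose pseudoinverse
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

# 4-cycle with unit edges: resistance between opposite nodes is 1 ohm
# (two 2-ohm paths in parallel); between adjacent nodes, 0.75 ohm.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
print(effective_resistance(C4).round(3))
```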

  9. Multi-objective evolutionary optimisation for product design and manufacturing

    CERN Document Server

    2011-01-01

    Presents state-of-the-art research in the area of multi-objective evolutionary optimisation for integrated product design and manufacturing. Provides a comprehensive review of the literature. Gives in-depth descriptions of recently developed innovative and novel methodologies, algorithms and systems in the area of modelling, simulation and optimisation.

  10. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm optimization.

  11. Sensitivity versus accuracy in multiclass problems using memetic Pareto evolutionary neural networks.

    Science.gov (United States)

    Fernández Caballero, Juan Carlos; Martínez, Francisco José; Hervás, César; Gutiérrez, Pedro Antonio

    2010-05-01

    This paper proposes a multiclassification algorithm using multilayer perceptron neural network models. It tries to boost two conflicting main objectives of multiclassifiers: a high correct classification rate and a high classification rate for each class. This last objective is not usually optimized in classification, but is considered here given the need to obtain high precision in each class in real problems. To solve this machine learning problem, we use a Pareto-based multiobjective optimization methodology based on a memetic evolutionary algorithm. We consider a memetic Pareto evolutionary approach based on the NSGA2 evolutionary algorithm (MPENSGA2). Once the Pareto front is built, two strategies for automatic individual selection are used: the best model in accuracy and the best model in sensitivity (the extremes of the Pareto front). These methodologies are applied to solve 17 classification benchmark problems obtained from the University of California at Irvine (UCI) repository and one complex real classification problem. The models obtained show high accuracy and a high classification rate for each class.
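
    A sketch of the two conflicting objectives the abstract describes: overall correct classification rate (accuracy) and the per-class rate, summarized by its worst case (minimum sensitivity). Any multiobjective optimizer, here the paper's memetic NSGA2 variant, would then maximize both; only the objective computation is shown.

```python
import numpy as np

def objectives(y_true, y_pred, n_classes):
    acc = float(np.mean(y_true == y_pred))
    sens = []
    for c in range(n_classes):
        mask = y_true == c
        sens.append(float(np.mean(y_pred[mask] == c)) if mask.any() else 0.0)
    return acc, min(sens)   # (accuracy, minimum per-class sensitivity)

y_true = np.array([0, 0, 0, 0, 0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 0, 0, 0, 0, 0, 1, 0, 2, 0])   # weak on classes 1 and 2
print(objectives(y_true, y_pred, 3))   # high accuracy, low minimum sensitivity
```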

  12. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  13. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter, characterized by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures, and try to put into evidence the role and the emergence of network structure in such systems.

  14. Assessing fluctuating evolutionary pressure in yeast and mammal evolutionary rate covariation using bioinformatics of meiotic protein genetic sequences

    Science.gov (United States)

    Dehipawala, Sunil; Nguyen, A.; Tremberger, G.; Cheung, E.; Holden, T.; Lieberman, D.; Cheung, T.

    2013-09-01

    The evolutionary rate co-variation in meiotic proteins has been reported for yeast and mammals using phylogenetic branch lengths which assess retention, duplication and mutation. The bioinformatics of the corresponding DNA sequences can be classified in a diagram of fractal dimension and Shannon entropy. Results from biomedical gene research provide examples of the diagram methodology. The identification of adaptive selection using an entropy marker, and of functional-structural diversity using fractal dimension, would support a regression analysis in which the coefficient of determination would serve as an evolutionary pathway marker for DNA sequences and be an important component for the astrobiology community. Comparisons between biomedical genes such as EEF2 (elongation factor 2: human, mouse, etc.), WDR85 in epigenetics, HAR1 in human specificity, the clinical-trial targeted cancer gene CD47, SIRT6 in spermatogenesis, and HLA-C in mosquito bite immunology demonstrate the diagram classification methodology. Comparisons of the SEPT4-XIAP pair in stem cell apoptosis, the testes-expressed taste gene pair TAS1R3-GNAT3, and the amyloid beta pair APLP1-APLP2 with the yeast-mammal DNA sequences for the meiotic protein pairs RAD50-MRE11 and NCAPD2-ICK have accounted for the observed fluctuating evolutionary pressure systematically. Regression with high R-sq values, or a triangular-like cluster pattern for concordant pairs in co-variation among the studied species, could serve as evidence for the possible location of common ancestors in the entropy-fractal dimension diagram, consistent with an example from the human-chimp common ancestor study using the FOXP2-regulated genes reported in a human fetal brain study. The Deinococcus radiodurans R1 Rad-A could be viewed as an outlier in the RAD50 diagram and also in the free energy versus fractal dimension regression Cook's distance, consistent with a non-Earth source for this radiation-resistant bacterium. Convergent and divergent fluctuating evolutionary
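
    A sketch of the first of the two sequence descriptors named in the abstract, Shannon entropy, plus the purine/pyrimidine "DNA walk" embedding commonly used as a first step before fractal-dimension estimation. The entropy below is the standard base-2 entropy over nucleotide frequencies; fractal dimension can be estimated from such a walk in many ways, and none of those estimators (nor the authors' specific choice) is reproduced here.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def dna_walk(seq):
    """Cumulative purine(+1)/pyrimidine(-1) walk, a common 1-D embedding."""
    y, walk = 0, []
    for base in seq:
        y += 1 if base in "AG" else -1
        walk.append(y)
    return walk

seq = "ATGGCGTACGTTAGCCGATCGATCGGGTACGATCGTAGCTAGC"
print("entropy (bits/symbol):", round(shannon_entropy(seq), 3))
print("walk endpoint:", dna_walk(seq)[-1])
```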

  15. Computer-Aided Methodology for Syndromic Strabismus Diagnosis.

    Science.gov (United States)

    Sousa de Almeida, João Dallyson; Silva, Aristófanes Corrêa; Teixeira, Jorge Antonio Meireles; Paiva, Anselmo Cardoso; Gattass, Marcelo

    2015-08-01

    Strabismus is a pathology that affects approximately 4% of the population, causing aesthetic problems that are reversible at any age and irreversible sensory alterations that modify the vision mechanism. The Hirschberg test is one type of examination for detecting this pathology. Computer-aided detection/diagnosis is being used with relative success to aid health professionals. Nevertheless, the routine use of high-tech devices to aid ophthalmological diagnosis and therapy is not a reality within the subspecialty of strabismus. Thus, this work presents a methodology to aid in the diagnosis of syndromic strabismus through digital imaging. Two hundred images belonging to 40 patients previously diagnosed by a specialist were tested. The method was demonstrated to be 88% accurate in identifying esotropias (ET), 100% for exotropias (XT), 80.33% for hypertropias (HT), and 83.33% for hypotropias (HoT). The overall average error was 5.6Δ and 3.83Δ for horizontal and vertical deviations, respectively, against the measures presented by the specialist.

  16. Economic modeling using evolutionary algorithms : the effect of binary encoding of strategies

    NARCIS (Netherlands)

    Waltman, L.R.; Eck, van N.J.; Dekker, Rommert; Kaymak, U.

    2011-01-01

    We are concerned with evolutionary algorithms that are employed for economic modeling purposes. We focus in particular on evolutionary algorithms that use a binary encoding of strategies. These algorithms, commonly referred to as genetic algorithms, are popular in agent-based computational economics

  17. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    Science.gov (United States)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that, more than three decades later, we still largely depend on experience or an ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well-accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility, i.e., whether or not emulating natural population dynamics improves the EC performance; (ii) comparatively study the underlying mechanisms, i.e., why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve the general goal [27][30]-[32]. In the following, I start with an extremely brief introduction to the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three

  18. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist acts prevention measures and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability and efficiency related issues, such as reference database search speed, errors in recognizing biometric information, or automating biometric feature extraction. Current scientific investigations show that the application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done in the sphere of evolutionary algorithm application for biometric system parameter improvement.

  19. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    Full Text Available The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter, the emission of CO2, in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, a proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, as well as monitoring of vehicle directions at the crossroads. Finally, based on the experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimator outputs with experimental data. The presented results are a true indicator of the implemented method's usability. [Project of the Ministry of Science of the Republic of Serbia, No. III42008-2/2011: Evaluation of Energy Performances, and No. TR35016/2011: Indoor Environment Quality of Educational Buildings in Serbia with Impact on Health, and Research of MHD Flows around Bodies, in Tip Clearances and Channels, and Application in MHD Pump Development]

  20. Phylogenetic inference with weighted codon evolutionary distances.

    Science.gov (United States)

    Criscuolo, Alexis; Michel, Christian J

    2009-04-01

    We develop a new approach to estimate a matrix of pairwise evolutionary distances from a codon-based alignment under a codon evolutionary model. The method first computes a standard distance matrix for each of the three codon positions. These three distance matrices are then weighted according to an estimate of the global evolutionary rate of each codon position and averaged into a unique distance matrix. Using a large set of both real and simulated codon-based alignments of nucleotide sequences, we show that this approach leads to distance matrices that have a significantly better treelikeness compared to those obtained by standard nucleotide evolutionary distances. We also propose an alternative weighting to eliminate the part of the noise often associated with some codon positions, particularly the third position, which is known to induce a fast evolutionary rate. Simulation results show that fast distance-based tree reconstruction algorithms on distance matrices based on this codon position weighting can lead to phylogenetic trees that are at least as accurate as, if not better than, those inferred by maximum likelihood. Finally, a well-known multigene dataset composed of eight yeast species and 106 codon-based alignments is reanalyzed and shows that our codon evolutionary distances allow building a phylogenetic tree which is similar to those obtained by non-distance-based methods (e.g., maximum parsimony and maximum likelihood) and also significantly improved compared to standard nucleotide evolutionary distance estimates.
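
    A sketch of the distance-combination step the abstract describes: one distance matrix per codon position, then a weighted average with weights reflecting each position's estimated global evolutionary rate. The per-position distances here are simple p-distances for brevity; the paper uses model-based estimates, and its exact weighting scheme is not reproduced.

```python
import numpy as np

def p_distance_matrix(seqs):
    n = len(seqs)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            diffs = sum(a != b for a, b in zip(seqs[i], seqs[j]))
            D[i, j] = D[j, i] = diffs / len(seqs[i])
    return D

def weighted_codon_distance(alignment, weights):
    # Split the codon alignment into its three codon positions.
    positions = [[seq[k::3] for seq in alignment] for k in range(3)]
    w = np.asarray(weights, float) / sum(weights)
    return sum(wk * p_distance_matrix(pos) for wk, pos in zip(w, positions))

alignment = ["ATGGCTAAA",
             "ATGGCGAAA",
             "ATAGCTAGA"]          # three sequences, three codons each
rates = [1.0, 0.5, 2.5]            # stand-in rate estimates per codon position
print(weighted_codon_distance(alignment, rates).round(3))
```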

  1. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    Full Text Available The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. For this reason, the paper underlines the necessity of developing a methodology for computer-based learning of probability theory and stochastic processes for pre-service engineers. The results of the experimental study analyzing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expansion of the range of purposes for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of pre-service engineers is determined at these stages. For this purpose, the methodology for testing the students' learning motivation in the chosen specialty is presented in the paper. An increase in the intrinsic learning motivation of the experimental group students (E group) against the control group students (C group) is demonstrated.

  2. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh; Ravva, Mahesh Kumar; Wang, Tonghui; Bredas, Jean-Luc

    2016-01-01

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus

  3. Langley's CSI evolutionary model: Phase O

    Science.gov (United States)

    Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) are documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real-time computers, as well as finite element and state space models of major components.

  4. An Efficient Evolutionary Based Method For Image Segmentation

    OpenAIRE

    Aslanzadeh, Roohollah; Qazanfari, Kazem; Rahmati, Mohammad

    2017-01-01

    The goal of this paper is to present a new efficient image segmentation method based on evolutionary computation, which is a model inspired by human behavior. Based on this model, a four-layer process for image segmentation is proposed using the split/merge approach. In the first layer, an image is split into numerous regions using the watershed algorithm. In the second layer, a co-evolutionary process is applied to form the centers of the final segments by merging similar primary regions. In the t...
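
    A toy illustration of the split/merge idea only (not the paper's algorithm): split an image into small blocks, then greedily merge neighboring blocks whose mean intensities are similar. The paper's split step is a watershed and its merge step is co-evolutionary; both are replaced by the simplest possible stand-ins here.

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[:, 16:] = 1.0
img += rng.normal(scale=0.05, size=img.shape)        # two regions plus noise

# Split: 8x8 blocks, each starting as its own segment with its mean intensity.
labels = np.arange(16).reshape(4, 4)                  # one label per block
means = np.array([[img[i*8:(i+1)*8, j*8:(j+1)*8].mean() for j in range(4)]
                  for i in range(4)])

# Merge: union adjacent blocks whose means differ by less than a threshold,
# using a tiny union-find structure over block labels.
parent = list(range(16))
def find(x):
    while parent[x] != x:
        x = parent[x]
    return x

for i in range(4):
    for j in range(4):
        for di, dj in ((0, 1), (1, 0)):
            ni, nj = i + di, j + dj
            if ni < 4 and nj < 4 and abs(means[i, j] - means[ni, nj]) < 0.1:
                parent[find(labels[ni, nj])] = find(labels[i, j])

segments = {find(labels[i, j]) for i in range(4) for j in range(4)}
print("final number of segments:", len(segments))    # expect 2
```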

  5. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, the development of Real Time Embedded Systems faces new challenges. On the one hand, time-to-market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we presented our Model Based Engineering methodology addressing those issues. In this paper, we focus on the design and analysis of Models of Computation. We illustrate our approach on the development of a Cognitive Radio System implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  6. First insights into the evolutionary history of the Davallia repens complex

    NARCIS (Netherlands)

    Chen, C.-W.; Ngan, L.T.; Hidayat, A.; Evangelista, L.; Nooteboom, H.P.; Chiou, W.-L.

    2014-01-01

    Davallia repens and its close relatives are identified as a species complex in this study because of the existence of continuous morphological variation. To decipher its evolutionary history, integrated methodologies were applied in this study, including morphology, cytology, reproductive

  7. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  8. Methodological Reflections on Working with Young Children

    DEFF Research Database (Denmark)

    Korn, Matthias

    2009-01-01

    This paper provides methodological reflections on an evolutionary and participatory software development process for designing interactive systems with children of very young age. The approach was put into practice for the design of a software environment for self-directed project management...

  9. An alternative methodology for planning computer class where teaching means are used

    Directory of Open Access Journals (Sweden)

    Maria del Carmen Carrillo Hernández

    2016-06-01

    Full Text Available The subject Informatics II is taught in the fourth year of the Informatics-Labor Education teaching career. One of its objectives is to develop skills that allow students to plan and structure, independently, originally and creatively, the computer class in which the use of computer-based teaching means is the guiding element through which students acquire knowledge. Limitations in this regard have been identified in professional practice. With the aim of contributing to the development of these skills, the authors of this article propose an alternative methodology to guide teachers of this subject in leading the teaching-learning process so that the objectives of the program are met.

  10. Thermal sensation prediction by soft computing methodology.

    Science.gov (United States)

    Jović, Srđan; Arsić, Nebojša; Vilimonović, Jovana; Petković, Dalibor

    2016-12-01

    Thermal comfort in open urban areas is a very important factor from an environmental point of view. It is therefore necessary to meet the demands for suitable thermal comfort during urban planning and design. Thermal comfort can be modeled based on climatic parameters and other factors, which vary throughout the year and the day. There is therefore a need to establish an algorithm for thermal comfort prediction from the input variables; the prediction results could be used for planning the times at which urban areas are used. Since this is a highly nonlinear task, a soft computing methodology was applied in this investigation to predict thermal comfort. The main goal was to apply an extreme learning machine (ELM) for forecasting of physiological equivalent temperature (PET) values. Temperature, pressure, wind speed and irradiance were used as inputs. The prediction results are compared with some benchmark models. Based on the results, the ELM can be used effectively in forecasting of PET. Copyright © 2016 Elsevier Ltd. All rights reserved.
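
    A minimal extreme learning machine regression sketch of the kind the abstract applies to PET forecasting: a random hidden layer whose output weights are fit in one shot by least squares. The four inputs mirror those named in the abstract; the data below are synthetic stand-ins, not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic inputs: temperature, pressure, wind speed, irradiance (scaled).
X = rng.normal(size=(500, 4))
pet = 2.0 * X[:, 0] - 0.5 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(scale=0.1, size=500)

n_hidden = 50
W = rng.normal(size=(4, n_hidden))       # random input weights, never trained
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                   # hidden-layer activations

beta, *_ = np.linalg.lstsq(H, pet, rcond=None)   # one-shot output weights

pred = H @ beta
print("training RMSE:", np.sqrt(np.mean((pred - pet) ** 2)).round(4))
```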

  11. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single and Multi-Objective Evolutionary Computation (MOEA),  Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product. Also, to introduce the successful application of soft computing technique to solve many hard problem encountered during the design of embedded hardware designs. Reconfigurable em...

  12. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. As in most other research on heuristic scheduling, the primary aim of this research has been deterministic formulations of the problems. This is in contrast to real world...... scheduling problems, which are usually not deterministic. Usually, at the time the schedule is made, some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling...... environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly

  13. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

    Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site, and will access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab

  14. Evolutionary speed limited by water in arid Australia.

    Science.gov (United States)

    Goldie, Xavier; Gillman, Len; Crisp, Mike; Wright, Shane

    2010-09-07

    The covariation of biodiversity with climate is a fundamental pattern in nature. However, despite the ubiquity of this relationship, a consensus on the ultimate cause remains elusive. The evolutionary speed hypothesis posits direct mechanistic links between ambient temperature, the tempo of micro-evolution and, ultimately, species richness. Previous research has demonstrated faster rates of molecular evolution in warmer climates for a broad range of poikilothermic and homeothermic organisms, in both terrestrial and aquatic environments. In terrestrial systems, species richness increases with both temperature and water availability and the interaction of those terms: productivity. However, the influence of water availability as an independent variable on micro-evolutionary processes has not been examined previously. Here, using methodology that limits the potentially confounding role of cladogenetic and demographic processes, we report, to our knowledge, the first evidence that woody plants living in the arid Australian Outback are evolving more slowly than related species growing at similar latitudes in moist habitats on the mesic continental margins. These results support a modified evolutionary speed explanation for the relationship between the water-energy balance and plant diversity patterns.

  15. PECULIARITIES OF USING THE METHODOLOGY DISTANCE LEARNING OF THE SUBJECT «ENGINEERING AND COMPUTER GRAPHICS» FOR STUDENTS STUDYING BY CORRESPONDENCE

    OpenAIRE

    Olena V. Slobodianiuk

    2010-01-01

    A great part of the distance course for the subject «Engineering and Computer Graphics» (ECG) placed on the Internet looks like an electronic manual. But the distance training process has a complicated structure and combines not only the study of theoretical material but also collaboration between students and a teacher, and work in groups. A methodology for distance learning of ECG is proposed. This methodology was developed and researched at the Faculty of Engineering and Computer Graphics of Vinnitsa National Te...

  16. Optimality and stability of symmetric evolutionary games with applications in genetic selection.

    Science.gov (United States)

    Huang, Yuanyuan; Hao, Yiping; Wang, Min; Zhou, Wen; Wu, Zhijun

    2015-06-01

    Symmetric evolutionary games, i.e., evolutionary games with symmetric fitness matrices, have important applications in population genetics, where they can be used to model for example the selection and evolution of the genotypes of a given population. In this paper, we review the theory for obtaining optimal and stable strategies for symmetric evolutionary games, and provide some new proofs and computational methods. In particular, we review the relationship between the symmetric evolutionary game and the generalized knapsack problem, and discuss the first and second order necessary and sufficient conditions that can be derived from this relationship for testing the optimality and stability of the strategies. Some of the conditions are given in different forms from those in previous work and can be verified more efficiently. We also derive more efficient computational methods for the evaluation of the conditions than conventional approaches. We demonstrate how these conditions can be applied to justifying the strategies and their stabilities for a special class of genetic selection games including some in the study of genetic disorders.
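
    The first-order condition reviewed in the record has a compact computational counterpart: at an interior equilibrium, every strategy with positive frequency earns exactly the mean fitness x^T A x. The following minimal sketch (not the authors' algorithm, with an invented symmetric fitness matrix) locates such a candidate equilibrium by iterating discrete-time replicator dynamics:

        # Minimal sketch (not the authors' algorithm): discrete-time replicator
        # dynamics for a symmetric evolutionary game with fitness matrix A.
        import numpy as np

        def replicator(A, x0, steps=2000, dt=0.01):
            """Iterate x_i <- x_i + dt * x_i * ((A x)_i - x^T A x) on the simplex."""
            x = np.asarray(x0, dtype=float)
            for _ in range(steps):
                f = A @ x                 # per-strategy fitness
                avg = x @ f               # mean population fitness x^T A x
                x = x + dt * x * (f - avg)
                x = np.clip(x, 0.0, None)
                x /= x.sum()              # renormalize onto the simplex
            return x

        # Invented symmetric fitness matrix (e.g., a toy one-locus viability model).
        A = np.array([[1.0, 1.5, 0.5],
                      [1.5, 1.0, 1.0],
                      [0.5, 1.0, 2.0]])
        x_star = replicator(A, x0=[0.4, 0.3, 0.3])
        print("candidate equilibrium:", x_star.round(4))
        print("per-strategy fitness at x*:", (A @ x_star).round(4))

    At the printed fixed point, the fitnesses of all strategies carried at positive frequency coincide, which is the equilibrium condition the paper's knapsack formulation tests.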

  17. Biology Needs Evolutionary Software Tools: Let’s Build Them Right

    Science.gov (United States)

    Team, Galaxy; Goecks, Jeremy; Taylor, James

    2018-01-01

    Abstract Research in population genetics and evolutionary biology has always provided a computational backbone for the life sciences as a whole. Today, evolutionary and population biology reasoning is essential for the interpretation of the large, complex datasets that are characteristic of all domains of today's life sciences, ranging from cancer biology to microbial ecology. This situation makes the algorithms and software tools developed by our community more important than ever before. This means that we, the developers of software tools for molecular evolutionary analyses, now have a shared responsibility to make these tools accessible using modern technological developments, as well as to provide adequate documentation and training. PMID:29688462

  18. A proposed methodology for computational fluid dynamics code verification, calibration, and validation

    Science.gov (United States)

    Aeschliman, D. P.; Oberkampf, W. L.; Blottner, F. G.

    Verification, calibration, and validation (VCV) of Computational Fluid Dynamics (CFD) codes is an essential element of the code development process. The exact manner in which code VCV activities are planned and conducted, however, is critically important. It is suggested that the way in which code validation, in particular, is often conducted--by comparison to published experimental data obtained for other purposes--is in general difficult and unsatisfactory, and that a different approach is required. This paper describes a proposed methodology for CFD code VCV that meets the technical requirements and is philosophically consistent with code development needs. The proposed methodology stresses teamwork and cooperation between code developers and experimentalists throughout the VCV process, and takes advantage of certain synergisms between CFD and experiment. A novel approach to uncertainty analysis is described which can both distinguish between and quantify various types of experimental error, and whose attributes are used to help define an appropriate experimental design for code VCV experiments. The methodology is demonstrated with an example of laminar, hypersonic, near perfect gas, 3-dimensional flow over a sliced sphere/cone of varying geometrical complexity.

  19. Network-level architecture and the evolutionary potential of underground metabolism.

    Science.gov (United States)

    Notebaart, Richard A; Szappanos, Balázs; Kintses, Bálint; Pál, Ferenc; Györkei, Ádám; Bogos, Balázs; Lázár, Viktória; Spohn, Réka; Csörgő, Bálint; Wagner, Allon; Ruppin, Eytan; Pál, Csaba; Papp, Balázs

    2014-08-12

    A central unresolved issue in evolutionary biology is how metabolic innovations emerge. Low-level enzymatic side activities are frequent and can potentially be recruited for new biochemical functions. However, the role of such underground reactions in adaptation toward novel environments has remained largely unknown and out of reach of computational predictions, not least because these issues demand analyses at the level of the entire metabolic network. Here, we provide a comprehensive computational model of the underground metabolism in Escherichia coli. Most underground reactions are not isolated and 45% of them can be fully wired into the existing network and form novel pathways that produce key precursors for cell growth. This observation allowed us to conduct an integrated genome-wide in silico and experimental survey to characterize the evolutionary potential of E. coli to adapt to hundreds of nutrient conditions. We revealed that underground reactions allow growth in new environments when their activity is increased. We estimate that at least ∼20% of the underground reactions that can be connected to the existing network confer a fitness advantage under specific environments. Moreover, our results demonstrate that the genetic basis of evolutionary adaptations via underground metabolism is computationally predictable. The approach used here has potential for various application areas from bioengineering to medical genetics.
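
    To make the "wiring an underground reaction into the network" idea concrete, here is a heavily simplified flux-balance sketch (a toy three-reaction network, not the paper's genome-scale E. coli model): growth is infeasible until a single underground reaction connecting two metabolites is switched on.

        # Hedged toy sketch of the idea: test whether enabling an "underground"
        # reaction lets a tiny stoichiometric network produce biomass, using
        # flux balance analysis posed as a linear program.
        import numpy as np
        from scipy.optimize import linprog

        # Rows: metabolites A, B. Columns: v_uptake (-> A),
        # v_biomass (B -> biomass), v_underground (A -> B).
        S = np.array([[1.0,  0.0, -1.0],
                      [0.0, -1.0,  1.0]])

        def max_growth(underground_on):
            ub = 10.0 if underground_on else 0.0
            bounds = [(0, 10), (0, 10), (0, ub)]
            res = linprog(c=[0, -1, 0], A_eq=S, b_eq=np.zeros(2),
                          bounds=bounds, method="highs")   # maximize biomass flux
            return res.x[1]

        print("growth without underground reaction:", max_growth(False))  # 0.0
        print("growth with underground reaction:  ", max_growth(True))    # 10.0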

  20. At the crossroads of evolutionary computation and music: self-programming synthesizers, swarm orchestras and the origins of melody.

    Science.gov (United States)

    Miranda, Eduardo Reck

    2004-01-01

    This paper introduces three approaches to using Evolutionary Computation (EC) in Music (namely, engineering, creative and musicological approaches) and discusses examples of representative systems that have been developed within the last decade, with emphasis on more recent and innovative works. We begin by reviewing engineering applications of EC in Music Technology such as Genetic Algorithms and Cellular Automata sound synthesis, followed by an introduction to applications where EC has been used to generate musical compositions. Next, we introduce ongoing research into EC models to study the origins of music and detail our own research work on modelling the evolution of melody. Copyright 2004 Massachusetts Institute of Technology
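
    As a flavor of the engineering approach, the sketch below is an illustrative toy (not one of Miranda's systems): a genetic algorithm that evolves an eight-note melody toward a target pitch contour, using truncation selection, one-point crossover, and random pitch mutation.

        # Illustrative only: tiny GA evolving a melody toward a target contour.
        import random

        TARGET = [60, 62, 64, 65, 67, 65, 64, 62]       # MIDI pitches

        def fitness(m):                                  # negative distance to target
            return -sum(abs(a - b) for a, b in zip(m, TARGET))

        def mutate(m, rate=0.2):
            return [p + random.choice([-2, -1, 1, 2]) if random.random() < rate else p
                    for p in m]

        def crossover(a, b):
            cut = random.randrange(1, len(a))            # one-point crossover
            return a[:cut] + b[cut:]

        pop = [[random.randint(55, 72) for _ in range(8)] for _ in range(50)]
        for gen in range(200):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:10]                           # truncation selection
            pop = parents + [mutate(crossover(random.choice(parents),
                                              random.choice(parents)))
                             for _ in range(40)]
        print("best melody:", max(pop, key=fitness))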

  1. Handbook of research on P2P and grid systems for service-oriented computing : models, methodologies, and applications

    NARCIS (Netherlands)

    Antonopoulos, N.; Exarchakos, G.; Li, Maozhen; Liotta, A.

    2010-01-01

    Introduction: Service-oriented computing is a popular design methodology for large scale business computing systems. A significant number of companies aim to reap the benefit of cost reduction by realizing B2B and B2C processes on large-scale SOA-compliant software system platforms. Peer-to-Peer

  2. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    Science.gov (United States)

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  3. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine, which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as the MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, home-made software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium. (paper)

  4. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine, which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as the MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, home-made software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
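
    The two records above describe a mesh-to-voxel-to-transport-code pipeline. The fragment below is a heavily simplified sketch of the voxelization and export step only: the geometry is a stand-in analytic ellipsoid rather than a MakeHuman/Blender mesh, and the output is a plain text lattice of material IDs; the actual MCNPX lattice and fill card syntax produced by the authors' home-made converter is omitted.

        # Heavily simplified sketch of the voxelization step (not the authors'
        # converter; real MCNPX card syntax is omitted).
        import numpy as np

        nx, ny, nz, voxel_mm = 40, 30, 60, 5.0
        z, y, x = np.mgrid[0:nz, 0:ny, 0:nx]

        # Stand-in geometry: an ellipsoidal "torso" of soft tissue (material 1)
        # in air (material 0); a mesh phantom would be rasterized the same way.
        torso = (((x - nx / 2) / 15) ** 2 + ((y - ny / 2) / 10) ** 2
                 + ((z - nz / 2) / 25) ** 2) <= 1.0
        grid = np.where(torso, 1, 0).astype(np.int8)

        with open("phantom_lattice.txt", "w") as f:
            f.write(f"# {nx} {ny} {nz} voxels, {voxel_mm} mm pitch\n")
            for k in range(nz):                 # one z-slice per block of rows
                for j in range(ny):
                    f.write(" ".join(map(str, grid[k, j])) + "\n")
                f.write("\n")
        print("tissue voxels:", int(grid.sum()))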

  5. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions (perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects) underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be
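
    A one-function sketch of f-granularity, for an assumed granule "about room temperature": the granule has unsharp boundaries (membership rises and falls gradually) and groups many temperatures into one clump.

        # Minimal sketch of a fuzzy granule as a trapezoidal membership function.
        def trapezoid(x, a, b, c, d):
            """Membership rises on [a, b], is 1 on [b, c], falls on [c, d]."""
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            return (x - a) / (b - a) if x < b else (d - x) / (d - c)

        for t in (15, 18, 21, 24, 27):
            print(t, "degC ->", round(trapezoid(t, 16, 19, 23, 26), 2))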

  6. Spore: Spawning Evolutionary Misconceptions?

    Science.gov (United States)

    Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.

    2010-10-01

    The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.

  7. Women, behavior, and evolution: understanding the debate between feminist evolutionists and evolutionary psychologists.

    Science.gov (United States)

    Liesen, Laurette T

    2007-03-01

    Often since the early 1990s, feminist evolutionists have criticized evolutionary psychologists, finding fault in their analyses of human male and female reproductive behavior. Feminist evolutionists have criticized various evolutionary psychologists for perpetuating gender stereotypes, using questionable methodology, and exhibiting a chill toward feminism. Though these criticisms have been raised many times, the conflict itself has not been fully analyzed. Therefore, I reconsider this conflict, both in its origins and its implications. I find that the approaches and perspectives of feminist evolutionists and evolutionary psychologists are distinctly different, leading many of the former to work in behavioral ecology, primatology, and evolutionary biology. Invitingly to feminist evolutionists, these three fields emphasize social behavior and the influences of environmental variables; in contrast, evolutionary psychology has come to rely on assumptions deemphasizing the pliability of psychological mechanisms and the flexibility of human behavior. In behavioral ecology, primatology, and evolutionary biology, feminist evolutionists have found old biases easy to correct and new hypotheses practical to test, offering new insights into male and female behavior, explaining the emergence and persistence of patriarchy, and potentially bringing closer a prime feminist goal, sexual equality.

  8. Evolutionary Nephrology.

    Science.gov (United States)

    Chevalier, Robert L

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as "maladaptive." In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ~40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (that provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  9. Evolutionary Nephrology

    Directory of Open Access Journals (Sweden)

    Robert L. Chevalier

    2017-05-01

    Full Text Available Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as “maladaptive.” In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or from evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ∼40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (which provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout the life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming, and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  10. EVOLUTIONARY THEORY AND THE MARKET COMPETITION

    Directory of Open Access Journals (Sweden)

    SIRGHI Nicoleta

    2014-12-01

    Full Text Available Evolutionary theory studies the processes that transform the economy, for firms, institutions, industries, employment, production, trade and growth, through the actions of diverse agents who learn from experience and interaction, using evolutionary methodology. Evolutionary theory analyses the unleashing of a process of technological and institutional innovation by generating and testing a diversity of ideas which discover and accumulate more survival value for the costs incurred than competing alternatives. This paper studies the behavior of firms on the market using evolutionary theory. It presents in full the developments that have led to the re-assessment of theories of the firm, starting from the criticism of Coase's theory based on its lack of testable hypotheses and its non-operative definition of transaction costs. In the literature in the field, studies of firms were allotted a secondary place for a long period of time; to date, the new theories of the firm hold a dominant place in the economic analysis of firms. In an article published in 1937, Ronald H. Coase identified the main sources of the cost of using the market mechanism. The theory of the firm is an issue intensively studied in the literature in the field, regarding the survival, competitiveness and innovation of firms on the market. The research of Nelson and Winter, “An Evolutionary Theory of Economic Change” (1982), is the starting point for a modern literature in the field which approaches the theory of the firm from an evolutionary perspective. Nelson and Winter showed that the “orthodox” theory is objectionable primarily because the hypothesis of profit maximization has a normative character and is not valid in every situation. Nelson and Winter reconsidered microeconomic analysis, showing that excessive attention should not be paid to market equilibrium but rather to dynamic processes resulting from irreversible

  11. Evolutionary molecular medicine.

    Science.gov (United States)

    Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S

    2012-05-01

    Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.

  12. Evolutionary Computation Techniques for Predicting Atmospheric Corrosion

    Directory of Open Access Journals (Sweden)

    Amine Marref

    2013-01-01

    Full Text Available Corrosion occurs in many engineering structures such as bridges, pipelines, and refineries; it destroys materials in a gradual manner and thus shortens their lifespan. It is therefore crucial to assess the structural integrity of engineering structures which are approaching or exceeding their designed lifespan, in order to ensure their correct functioning, for example, carrying ability and safety. An understanding of corrosion and an ability to predict the corrosion rate of a material in a particular environment play a vital role in evaluating the residual life of the material. In this paper we investigate the use of genetic programming and genetic algorithms in the derivation of corrosion-rate expressions for steel and zinc. Genetic programming is used to automatically evolve corrosion-rate expressions, while a genetic algorithm is used to evolve the parameters of an already engineered corrosion-rate expression. We show that both evolutionary techniques yield corrosion-rate expressions that have good accuracy.
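
    The genetic-algorithm half of the approach can be sketched compactly. Assuming the commonly used power-law corrosion model C = A * t^n as the "already engineered" expression, and with invented observations, a small GA with blend crossover and Gaussian mutation recovers the parameters:

        # Sketch of the GA half: evolve parameters (A, n) of C = A * t**n.
        # Data are synthetic; the paper fits measured rates for steel and zinc.
        import random

        t_obs = [1, 2, 4, 8, 16]                    # exposure time, years
        c_obs = [30.0, 39.0, 52.0, 68.5, 90.0]      # synthetic corrosion depth, um

        def error(ind):
            A, n = ind
            return sum((A * t ** n - c) ** 2 for t, c in zip(t_obs, c_obs))

        pop = [[random.uniform(1, 60), random.uniform(0, 1)] for _ in range(60)]
        for gen in range(300):
            pop.sort(key=error)
            elite = pop[:12]

            def child():
                p, q = random.sample(elite, 2)
                w = random.random()
                kid = [w * g1 + (1 - w) * g2 for g1, g2 in zip(p, q)]  # blend crossover
                # Gaussian mutation, per-gene step sizes
                return [max(1e-6, g + random.gauss(0, s))
                        for g, s in zip(kid, (1.0, 0.02))]

            pop = elite + [child() for _ in range(48)]
        A, n = min(pop, key=error)
        print(f"C(t) ~ {A:.2f} * t^{n:.3f}")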

  13. Evidence Combination From an Evolutionary Game Theory Perspective.

    Science.gov (United States)

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

    Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
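
    For background, Dempster's rule itself is short to implement. The sketch below (mass functions represented as frozenset-keyed dicts, an assumed encoding) reproduces the classic high-conflict paradox that methods such as the ECR aim to suppress: two nearly contradictory sources end up assigning all mass to a proposition both consider almost impossible.

        # Dempster's rule of combination for two mass functions.
        def dempster(m1, m2):
            """Combine mass functions given as {frozenset: mass} dicts."""
            combined, conflict = {}, 0.0
            for a, ma in m1.items():
                for b, mb in m2.items():
                    inter = a & b
                    if inter:
                        combined[inter] = combined.get(inter, 0.0) + ma * mb
                    else:
                        conflict += ma * mb       # mass on empty intersections
            return {s: v / (1.0 - conflict) for s, v in combined.items()}

        A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
        m1 = {A: 0.99, B: 0.01}
        m2 = {C: 0.99, B: 0.01}
        print(dempster(m1, m2))   # all mass lands on B despite near-total conflict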

  14. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, from the requirement capture phase to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  15. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, from the requirement capture phase to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  16. An Angiotensin II type 1 receptor activation switch patch revealed through Evolutionary Trace analysis

    DEFF Research Database (Denmark)

    Bonde, Marie Mi; Yao, Rong; Ma, Jian-Nong

    2010-01-01

    to be completely resolved. Evolutionary Trace (ET) analysis is a computational method which identifies clusters of functionally important residues by integrating information on evolutionarily important residue variations with receptor structure. Combined with known mutational data, ET predicted a patch of residues ... displayed phenotypes associated with changed activation state, such as increased agonist affinity or basal activity, promiscuous activation, or constitutive internalization, highlighting the importance of testing different signaling pathways. We conclude that this evolutionarily important patch mediates...

  17. Wind turbine power coefficient estimation by soft computing methodologies: Comparative study

    International Nuclear Information System (INIS)

    Shamshirband, Shahaboddin; Petković, Dalibor; Saboohi, Hadi; Anuar, Nor Badrul; Inayat, Irum; Akib, Shatirah; Ćojbašić, Žarko; Nikolić, Vlastimir; Mat Kiah, Miss Laiha; Gani, Abdullah

    2014-01-01

    Highlights: • Variable speed operation of wind turbine to increase power generation. • Changeability and fluctuation of wind have to be accounted for. • To build an effective prediction model of wind turbine power coefficient. • The impact of the variation in the blade pitch angle and tip speed ratio. • Support vector regression methodology applied as the predictive methodology. - Abstract: Wind energy has become a large contender of traditional fossil fuel energy, particularly with the successful operation of multi-megawatt sized wind turbines. However, reasonable wind speed is not adequately sustainable everywhere to build an economical wind farm. In wind energy conversion systems, one of the operational problems is the changeability and fluctuation of wind. In most cases, wind speed can vacillate rapidly. Hence, quality of produced energy becomes an important problem in wind energy conversion plants. Several control techniques have been applied to improve the quality of power generated from wind turbines. In this study, the polynomial and radial basis function (RBF) are applied as the kernel function of support vector regression (SVR) to estimate the optimal power coefficient value of the wind turbines. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR approach compared to other soft computing methodologies.
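
    A minimal sketch of the SVR_rbf variant with scikit-learn, on synthetic data (the stand-in Cp surface and all parameter values below are assumptions, not the paper's measurements): regress the power coefficient on tip speed ratio and blade pitch angle.

        # Sketch: RBF-kernel support vector regression of a power coefficient.
        import numpy as np
        from sklearn.svm import SVR

        rng = np.random.default_rng(0)
        tsr = rng.uniform(2, 12, 300)            # tip speed ratio
        pitch = rng.uniform(0, 20, 300)          # blade pitch angle, degrees
        # Stand-in Cp surface (a real study would use turbine measurements):
        cp = 0.45 * np.exp(-((tsr - 8) / 3) ** 2) * np.exp(-pitch / 15)

        X = np.column_stack([tsr, pitch])
        model = SVR(kernel="rbf", C=10.0, epsilon=0.005).fit(X, cp)
        print("predicted Cp at TSR=8, pitch=2 deg:",
              float(model.predict([[8.0, 2.0]])[0]))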

  18. Computational Vibrational Spectroscopy of glycine in aqueous solution - Fundamental considerations towards feasible methodologies

    Science.gov (United States)

    Lutz, Oliver M. D.; Messner, Christoph B.; Hofer, Thomas S.; Canaval, Lorenz R.; Bonn, Guenther K.; Huck, Christian W.

    2014-05-01

    In this work, the mid-infrared spectrum of aqueous glycine is predicted by a number of computational approaches. Velocity autocorrelation functions are applied to ab initio QMCF-MD and QM/MM-MD simulations in order to obtain IR power spectra. Furthermore, continuum solvation model augmented geometry optimizations are studied by anharmonic calculations relying on the PT2-VSCF and the VPT2 formalism. In this context, the potential based EFP hydration technique is discussed and the importance of a Monte Carlo search in conjunction with PT2-VSCF calculations is critically assessed. All results are directly compared to newly recorded experimental FT-IR spectroscopic data, elucidating the qualities of the respective methodology. Moreover, the computational approaches are discussed regarding their usefulness for the interpretation of experimental spectra.
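
    The velocity-autocorrelation route to an IR power spectrum is straightforward to sketch: autocorrelate the atomic velocities (here via the Wiener-Khinchin FFT identity), window, and Fourier-transform. The demo trajectory below is synthetic, not QMCF-MD output.

        # Sketch of the power-spectrum step applied to an MD velocity trajectory.
        import numpy as np

        def ir_power_spectrum(vel, dt_fs):
            """vel: (n_steps, n_atoms, 3) velocities; returns (freq_THz, intensity)."""
            n = vel.shape[0]
            v = vel.reshape(n, -1)
            # VACF over all atoms/components via FFT-based autocorrelation.
            fv = np.fft.rfft(v - v.mean(axis=0), n=2 * n, axis=0)
            acf = np.fft.irfft(np.abs(fv) ** 2, axis=0)[:n].sum(axis=1)
            acf /= acf[0]
            spectrum = np.abs(np.fft.rfft(acf * np.hanning(n)))
            freq_THz = np.fft.rfftfreq(n, d=dt_fs * 1e-3)   # fs -> ps gives THz
            return freq_THz, spectrum

        # Synthetic demo: one "atom" oscillating at 10 THz, sampled every 0.5 fs.
        t = np.arange(20000) * 0.5
        vel = np.cos(2 * np.pi * 0.01 * t).reshape(-1, 1, 1) * np.ones((1, 1, 3))
        f, s = ir_power_spectrum(vel, dt_fs=0.5)
        print("peak at %.1f THz" % f[np.argmax(s[1:]) + 1])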

  19. An extra-memetic empirical methodology to accompany theoretical memetics

    OpenAIRE

    Gill, Jameson

    2012-01-01

    Abstract\\ud \\ud Purpose: The paper describes the difficulties encountered by researchers who are looking to operationalise theoretical memetics and provides a methodological avenue for studies that can test meme theory.\\ud \\ud Design/Methodology/Approach: The application of evolutionary theory to organisations is reviewed by critically reflecting on the validity of its truth claims. To focus the discussion a number of applications of meme theory are reviewed to raise specific issues which oug...

  20. An Evolutionary Complex Systems Decision-Support Tool for the Management of Operations

    International Nuclear Information System (INIS)

    Baldwin, J S; Allen, P M; Ridgway, K

    2011-01-01

    This research aimed to add both to the development of complex systems thinking in the subject area of Operations and Production Management and to the limited number of applications of computational models and simulations from the science of complex systems. The latter potentially offer helpful decision-support tools for operations and production managers. A mechanical engineering firm was used as a case study where a combined qualitative and quantitative methodological approach was employed to extract the required data from four senior managers. Company performance measures as well as firm technologies, practices and policies, and their relation and interaction with one another, were elicited. The data were subjected to an evolutionary complex systems model resulting in a series of simulations. The findings included both reassuring and some unexpected results. The simulation based on the CEO's opinions led to the most cohesive and synergistic collection of practices describing the firm, closely followed by the Marketing and R and D Managers. The Manufacturing Manager's responses led to the most extreme evolutionary trajectory where the integrity of the entire firm came into question, particularly when considering how employees were utilised. By drawing directly from the opinions and views of managers rather than from logical 'if-then' rules and averaged mathematical representations of agents that characterise agent-based and other self-organisational models, this work builds on previous applications by capturing a micro-level description of diversity and a learning effect that has been problematical not only in terms of theory but also in application. This approach can be used as a decision-support tool for operations and other managers providing a forum with which to explore a) the strengths, weaknesses and consequences of different decision-making capacities within the firm; b) the introduction of new manufacturing technologies, practices and policies; and, c) the

  1. An Evolutionary Complex Systems Decision-Support Tool for the Management of Operations

    Science.gov (United States)

    Baldwin, J. S.; Allen, P. M.; Ridgway, K.

    2011-12-01

    This research aimed to add both to the development of complex systems thinking in the subject area of Operations and Production Management and to the limited number of applications of computational models and simulations from the science of complex systems. The latter potentially offer helpful decision-support tools for operations and production managers. A mechanical engineering firm was used as a case study where a combined qualitative and quantitative methodological approach was employed to extract the required data from four senior managers. Company performance measures as well as firm technologies, practices and policies, and their relation and interaction with one another, were elicited. The data were subjected to an evolutionary complex systems model resulting in a series of simulations. The findings included both reassuring and some unexpected results. The simulation based on the CEO's opinions led to the most cohesive and synergistic collection of practices describing the firm, closely followed by the Marketing and R&D Managers. The Manufacturing Manager's responses led to the most extreme evolutionary trajectory where the integrity of the entire firm came into question, particularly when considering how employees were utilised. By drawing directly from the opinions and views of managers rather than from logical 'if-then' rules and averaged mathematical representations of agents that characterise agent-based and other self-organisational models, this work builds on previous applications by capturing a micro-level description of diversity and a learning effect that has been problematical not only in terms of theory but also in application. This approach can be used as a decision-support tool for operations and other managers providing a forum with which to explore a) the strengths, weaknesses and consequences of different decision-making capacities within the firm; b) the introduction of new manufacturing technologies, practices and policies; and, c) the

  2. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    Science.gov (United States)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing in sight the long-sought goal of designing material properties according to need. Advances in computer technology and their availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to uncover further the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds. As we know and will also

  3. Bio-inspired algorithms applied to molecular docking simulations.

    Science.gov (United States)

    Heberlé, G; de Azevedo, W F

    2011-01-01

    Nature as a source of inspiration has been shown to have a great beneficial impact on the development of new computational methodologies. In this scenario, analyses of the interactions between a protein target and a ligand can be simulated by biologically inspired algorithms (BIAs). These algorithms mimic biological systems to create new paradigms for computation, such as neural networks, evolutionary computing, and swarm intelligence. This review provides a description of the main concepts behind BIAs applied to molecular docking simulations. Special attention is devoted to evolutionary algorithms, guided-directed evolutionary algorithms, and Lamarckian genetic algorithms. Recent applications of these methodologies to protein targets identified in the Mycobacterium tuberculosis genome are described.
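
    The Lamarckian ingredient is the write-back of locally optimized phenotypes into the genotype. Below is a conceptual toy (a 2-D quadratic stands in for a docking scoring function; this is not AutoDock's implementation):

        # Conceptual sketch of a Lamarckian GA: offspring undergo local search,
        # and the improved solution replaces the genome.
        import random

        def score(x, y):                         # stand-in for a docking energy
            return (x - 1.0) ** 2 + (y + 2.0) ** 2

        def local_search(ind, step=0.05, iters=20):
            """Greedy coordinate descent; the result replaces the genome (Lamarck)."""
            x, y = ind
            for _ in range(iters):
                for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
                    if score(x + dx, y + dy) < score(x, y):
                        x, y = x + dx, y + dy
            return [x, y]

        pop = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(30)]
        for gen in range(40):
            pop = [local_search(ind) for ind in pop]          # Lamarckian step
            pop.sort(key=lambda ind: score(*ind))
            elite = pop[:6]
            pop = elite + [[g + random.gauss(0, 0.3) for g in random.choice(elite)]
                           for _ in range(24)]
        best = min(pop, key=lambda ind: score(*ind))
        print("best pose parameters:", [round(g, 3) for g in best])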

  4. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure–morphology–property relationships when used in tandem with experimental results.

  5. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mills) is the heart of coal-fired power plants. The complex nature of a milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, vertical spindle coal mills (bowl mills), which are widely used in coal-fired power plants, are considered for the model development, and the pulverised fuel flow rate is computed using the model. For the steady state coal mill model development, plant measurements such as air-flow rate and differential pressure across the mill are considered as inputs/outputs. The mathematical model is derived from analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady state coal mill dynamics. This coal mill model is being implemented on-line in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust and will work well in power plants for system monitoring. Therefore, the model can be used for online monitoring, fault detection, and control to improve the efficiency of combustion.
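
    The parameter-identification step can be sketched generically. The mill relation and data below are invented for illustration; differential evolution (one common evolutionary computation technique, here via SciPy) fits the unknown coefficients to "plant" measurements:

        # Hedged sketch: identify coefficients of an assumed steady-state mill
        # relation pf_flow = k1 * primary_air_flow + k2 * dP_mill + k0.
        import numpy as np
        from scipy.optimize import differential_evolution

        air = np.array([40.0, 45.0, 50.0, 55.0, 60.0])   # primary air flow, kg/s
        dp = np.array([2.1, 2.4, 2.8, 3.1, 3.5])         # mill differential pressure
        pf = np.array([5.9, 6.7, 7.6, 8.3, 9.2])         # measured PF flow, kg/s

        def sse(k):
            k1, k2, k0 = k
            return float(np.sum((k1 * air + k2 * dp + k0 - pf) ** 2))

        result = differential_evolution(sse, bounds=[(-1, 1), (-5, 5), (-5, 5)],
                                        seed=1, tol=1e-10)
        print("identified parameters:", result.x.round(3),
              "SSE:", round(result.fun, 5))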

  6. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
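
    For the stationary (algebraic) case the record refers to, the response covariance P of a linear system dx = A x dt + B dW driven by white noise of intensity W solves A P + P A^T + B W B^T = 0, which standard solvers handle directly. The single-degree-of-freedom oscillator below is an illustrative stand-in, not the paper's precise integration method (which also covers the differential, evolutionary case).

        # Sketch of the algebraic Lyapunov route to stationary response variances.
        import numpy as np
        from scipy.linalg import solve_continuous_lyapunov

        # SDOF oscillator (states: displacement, velocity), white-noise force input.
        wn, zeta = 2.0 * np.pi, 0.05
        A = np.array([[0.0, 1.0],
                      [-wn ** 2, -2.0 * zeta * wn]])
        B = np.array([[0.0], [1.0]])
        W = np.array([[1.0]])                       # noise intensity

        Q = B @ W @ B.T
        P = solve_continuous_lyapunov(A, -Q)        # solves A P + P A^T = -Q
        print("displacement variance:", P[0, 0])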

  7. Optimum dispatching by using evolutionary programming; Despacho optimo utilizando programacao evolucionaria

    Energy Technology Data Exchange (ETDEWEB)

    Proenca, Luis Miguel; Pinto, Joao Luis; Matos, Manuel Antonio [Instituto de Engenharia de Sistemas e Computadores (INESC), Porto (Portugal). E-mail: lproenca@inescn.pt; jpinto@duque.inescn.pt; mmatos@inescn.pt

    1999-07-01

    This paper presents a methodology based on Evolutionary Programming for solving the optimum dispatching problem. A software module based on this approach is also described, integrated in a real system for power system control with a high contribution from renewable resources, in the framework of the CARE European project.
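
    A minimal sketch of the underlying idea, with invented cost data: evolutionary programming (Gaussian mutation, no crossover, survivor selection over parents plus offspring) for a three-unit economic dispatch, with the demand balance enforced by a quadratic penalty.

        # Toy EP economic dispatch (quadratic unit costs invented for illustration).
        import random

        units = [(0.008, 7.0, 200.0), (0.009, 6.3, 180.0), (0.007, 6.8, 140.0)]
        pmin, pmax, demand = 50.0, 250.0, 450.0

        def cost(p):
            c = sum(a * pi ** 2 + b * pi + c0
                    for pi, (a, b, c0) in zip(p, units))
            return c + 1e3 * (sum(p) - demand) ** 2     # demand-balance penalty

        pop = [[random.uniform(pmin, pmax) for _ in range(3)] for _ in range(40)]
        for gen in range(500):
            # EP: each parent spawns one Gaussian-mutated offspring; the best
            # half of parents + offspring survive (no crossover).
            kids = [[min(pmax, max(pmin, g + random.gauss(0, 5.0))) for g in p]
                    for p in pop]
            pop = sorted(pop + kids, key=cost)[:40]
        print("dispatch:", [round(p, 1) for p in pop[0]],
              "cost:", round(cost(pop[0]), 1))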

  8. A computational methodology for formulating gasoline surrogate fuels with accurate physical and chemical kinetic properties

    KAUST Repository

    Ahmed, Ahfaz

    2015-03-01

    Gasoline is the most widely used fuel for light duty automobile transportation, but its molecular complexity makes it intractable to experimentally and computationally study the fundamental combustion properties. Therefore, surrogate fuels with a simpler molecular composition that represent real fuel behavior in one or more aspects are needed to enable repeatable experimental and computational combustion investigations. This study presents a novel computational methodology for formulating surrogates for FACE (fuels for advanced combustion engines) gasolines A and C by combining regression modeling with physical and chemical kinetics simulations. The computational methodology integrates simulation tools executed across different software platforms. Initially, the palette of surrogate species and carbon types for the target fuels were determined from a detailed hydrocarbon analysis (DHA). A regression algorithm implemented in MATLAB was linked to REFPROP for simulation of distillation curves and calculation of physical properties of surrogate compositions. The MATLAB code generates a surrogate composition at each iteration, which is then used to automatically generate CHEMKIN input files that are submitted to homogeneous batch reactor simulations for prediction of research octane number (RON). The regression algorithm determines the optimal surrogate composition to match the fuel properties of FACE A and C gasoline, specifically hydrogen/carbon (H/C) ratio, density, distillation characteristics, carbon types, and RON. The optimal surrogate fuel compositions obtained using the present computational approach were compared to the real fuel properties, as well as with surrogate compositions available in the literature. Experiments were conducted within a Cooperative Fuels Research (CFR) engine operating under controlled autoignition (CAI) mode to compare the formulated surrogates against the real fuels. Carbon monoxide measurements indicated that the proposed surrogates
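
    The regression core of such a methodology can be sketched in a few lines. The component properties below are approximate textbook values for a three-component palette, and linear-by-mole blending is a deliberate simplification of the REFPROP/CHEMKIN property evaluations used in the paper:

        # Skeleton of the property-matching step: pick mole fractions of a
        # palette to hit target H/C ratio and RON (illustrative data only).
        import numpy as np
        from scipy.optimize import minimize

        #                  n-heptane  iso-octane  toluene
        hc_ratio = np.array([2.29, 2.25, 1.14])    # H/C per component (approx.)
        ron      = np.array([0.0, 100.0, 120.0])   # blending RON (approx.)
        target = {"hc": 1.95, "ron": 87.0}

        def mismatch(x):
            return ((x @ hc_ratio - target["hc"]) / target["hc"]) ** 2 \
                 + ((x @ ron - target["ron"]) / target["ron"]) ** 2

        res = minimize(mismatch, x0=np.ones(3) / 3, method="SLSQP",
                       bounds=[(0, 1)] * 3,
                       constraints=[{"type": "eq",
                                     "fun": lambda x: x.sum() - 1}])
        print("mole fractions [n-C7, i-C8, toluene]:", res.x.round(3))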

  9. From philosophy to science (to natural philosophy): evolutionary developmental perspectives.

    Science.gov (United States)

    Love, Alan C

    2008-03-01

    This paper focuses on abstraction as a mode of reasoning that facilitates a productive relationship between philosophy and science. Using examples from evolutionary developmental biology, I argue that there are two areas where abstraction can be relevant to science: reasoning explication and problem clarification. The value of abstraction is characterized in terms of methodology (modeling or data gathering) and epistemology (explanatory evaluation or data interpretation).

  10. Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games

    Science.gov (United States)

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237
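
    The bipartite construction is easy to express with networkx. In the sketch below (arbitrary toy parameters), individuals and groups form the two node sets, and each group plays an N-person public goods game among its members, so a player bridging two groups collects payoffs from both.

        # Toy bipartite population structure for a public goods game.
        import networkx as nx

        B = nx.Graph()
        players = ["p1", "p2", "p3", "p4", "p5"]
        groups = ["g1", "g2"]
        B.add_nodes_from(players, bipartite=0)
        B.add_nodes_from(groups, bipartite=1)
        B.add_edges_from([("p1", "g1"), ("p2", "g1"), ("p3", "g1"),
                          ("p3", "g2"), ("p4", "g2"), ("p5", "g2")])  # p3 bridges

        strategy = {"p1": 1, "p2": 0, "p3": 1, "p4": 0, "p5": 0}      # 1 = cooperate
        r, cost = 3.0, 1.0               # enhancement factor, contribution

        payoff = dict.fromkeys(players, 0.0)
        for g in groups:
            members = list(B.neighbors(g))
            pot = r * cost * sum(strategy[p] for p in members)
            for p in members:
                payoff[p] += pot / len(members) - cost * strategy[p]
        print(payoff)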

  11. Evolutionary and Modern Image Content Differentially Influence the Processing of Emotional Pictures

    Directory of Open Access Journals (Sweden)

    Matthias Dhum

    2017-08-01

    Full Text Available From an evolutionary perspective, environmental threats relevant for survival constantly challenged human beings. Current research suggests the evolution of a fear processing module in the brain to cope with these threats. Recently, humans increasingly encountered modern threats (e.g., guns or car accidents) in addition to evolutionary threats (e.g., snakes or predators), which presumably required an adaptation of perception and behavior. However, the neural processes underlying the perception of these different threats remain to be elucidated. We investigated the effect of image content (i.e., evolutionary vs. modern threats) on the activation of neural networks of emotion processing. During functional magnetic resonance imaging (fMRI), 41 participants watched affective pictures displaying evolutionary-threatening, modern-threatening, evolutionary-neutral and modern-neutral content. Evolutionary-threatening stimuli evoked stronger activations than modern-threatening stimuli in left inferior frontal gyrus and thalamus, right middle frontal gyrus and parietal regions, as well as bilaterally in parietal regions, fusiform gyrus and bilateral amygdala. We observed the opposite effect, i.e., higher activity for modern-threatening than for evolutionary-threatening stimuli, bilaterally in the posterior cingulate and the parahippocampal gyrus. We found no differences in subjective arousal ratings between the two threatening conditions. On the valence scale, though, subjects rated modern-threatening pictures significantly more negative than evolutionary-threatening pictures, indicating a higher level of perceived threat. The majority of previous studies show a positive relationship between arousal rating and amygdala activity. However, comparing fMRI results with behavioral findings we provide evidence that neural activity in fear processing areas is not only driven by arousal or valence, but presumably also by the evolutionary content of the stimulus. This has

  12. Evolutionary thinking

    Science.gov (United States)

    Hunt, Tam

    2014-01-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution—both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place—has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps’ book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging “integral” or “evolutionary” cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps. PMID:26478766

  13. Determination of phase diagrams via computer simulation: methodology and applications to water, electrolytes and proteins

    International Nuclear Information System (INIS)

    Vega, C; Sanz, E; Abascal, J L F; Noya, E G

    2008-01-01

    In this review we focus on the determination of phase diagrams by computer simulation, with particular attention to the fluid-solid and solid-solid equilibria. The methodology to compute the free energy of solid phases will be discussed. In particular, the Einstein crystal and Einstein molecule methodologies are described in a comprehensive way. It is shown that both methodologies yield the same free energies and that free energies of solid phases present noticeable finite size effects. In fact, this is the case for hard spheres in the solid phase. Finite size corrections can be introduced, although in an approximate way, to correct for the dependence of the free energy on the size of the system. The computation of free energies of solid phases can be extended to molecular fluids. The procedure to compute free energies of solid phases of water (ices) will be described in detail. The free energies of ices Ih, II, III, IV, V, VI, VII, VIII, IX, XI and XII will be presented for the SPC/E and TIP4P models of water. Initial coexistence points leading to the determination of the phase diagram of water for these two models will be provided. Other methods to estimate the melting point of a solid, such as the direct fluid-solid coexistence or simulations of the free surface of the solid, will be discussed. It will be shown that the melting points of ice Ih for several water models, obtained from free energy calculations, direct coexistence simulations and free surface simulations agree within their statistical uncertainty. Phase diagram calculations can indeed help to improve potential models of molecular fluids. For instance, for water, the potential model TIP4P/2005 can be regarded as an improved version of TIP4P. Here we will review some recent work on the phase diagram of the simplest ionic model, the restricted primitive model. Although originally devised to describe ionic liquids, the model is becoming quite popular to describe the behavior of charged colloids

  14. Computational optimization of biodiesel combustion using response surface methodology

    Directory of Open Access Journals (Sweden)

    Ganji Prabhakara Rao

    2017-01-01

    Full Text Available The present work focuses on the optimization of biodiesel combustion phenomena through a parametric approach using response surface methodology. The physical properties of biodiesel play a vital role in accurate simulations of the fuel spray, atomization, combustion, and emission formation processes. Typically, methyl-based biodiesel consists of five main types of esters in its composition: methyl palmitate, methyl oleate, methyl stearate, methyl linoleate, and methyl linolenate. Based on the amounts of these methyl esters, the properties of pongamia biodiesel and its blends were estimated. The CONVERGE™ computational fluid dynamics software was used to simulate the fuel spray, turbulence, and combustion phenomena. The simulation responses, such as indicated specific fuel consumption, NOx, and soot, were analyzed using design of experiments. Regression equations were developed for each of these responses. The optimum parameters were found to be: compression ratio – 16.75, start of injection – 21.9° before top dead center, and exhaust gas re-circulation – 10.94%. The results were compared with the baseline case.
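
    To make the response-surface idea concrete, here is a minimal sketch of fitting a full second-order model over the three factors named in the abstract (compression ratio, start of injection, EGR) and grid-searching the fitted surface for a minimum. All design points, responses, ranges, and variable names are invented for illustration; the paper's actual regression equations and CONVERGE™ simulations are not reproduced here.

```python
# Illustrative response-surface fit; all data below are made up for demonstration.
import numpy as np

# Hypothetical design-of-experiments matrix: [CR, SOI (deg BTDC), EGR (%)]
X = np.array([[15.0, 18.0,  5.0], [17.0, 18.0, 15.0], [15.0, 24.0, 15.0],
              [17.0, 24.0,  5.0], [16.0, 21.0, 10.0], [16.5, 22.0, 12.0],
              [15.5, 20.0,  8.0], [17.5, 23.0, 14.0], [16.0, 19.0,  6.0],
              [15.5, 23.5, 11.0]])
y = np.array([240.1, 236.4, 238.2, 235.7, 234.9,
              235.2, 237.0, 236.1, 238.8, 236.5])   # e.g., ISFC in g/kWh

def quad_terms(x):
    """Full second-order model: intercept, linear, squared, and interaction terms."""
    cr, soi, egr = x
    return [1, cr, soi, egr, cr*cr, soi*soi, egr*egr, cr*soi, cr*egr, soi*egr]

A = np.array([quad_terms(x) for x in X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)        # regression coefficients

def predict(x):
    return float(np.dot(quad_terms(x), beta))

# Crude grid search of the fitted surface for the minimizing factor combination
grids = [np.linspace(15, 18, 31), np.linspace(18, 24, 31), np.linspace(5, 15, 31)]
best = min(((predict((cr, soi, egr)), (cr, soi, egr))
            for cr in grids[0] for soi in grids[1] for egr in grids[2]))
print("predicted optimum (response, factors):", best)
```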

  15. Evolutionary enhancement of the SLIM-MAUD method of estimating human error rates

    International Nuclear Information System (INIS)

    Zamanali, J.H.; Hubbard, F.R.; Mosleh, A.; Waller, M.A.

    1992-01-01

    The methodology described in this paper assigns plant-specific dynamic human error rates (HERs) for individual plant examinations based on procedural difficulty, on configuration features, and on the time available to perform the action. This methodology is an evolutionary improvement of the success likelihood index methodology (SLIM-MAUD) for use in systemic scenarios. It is based on the assumption that the HER in a particular situation depends on the combined effects of a comprehensive set of performance-shaping factors (PSFs) that influence the operator's ability to perform the action successfully. The PSFs relate the details of the systemic scenario in which the action must be performed to the operator's psychological and cognitive condition

  16. Steps Towards Operationalising an Evolutionary Archaeological Definition of Culture

    DEFF Research Database (Denmark)

    Riede, Felix

    2011-01-01

    This paper examines the definition of archaeological cultures/techno-complexes from an evolutionary perspective, in which culture is defined as a system of social information transmission. A formal methodology is presented through which the concept of a culture can be operationalised, at least... within this approach. It has already been argued that in order to study material culture evolution in a manner similar to how palaeontologists study biological change over time, we need explicitly constructed archaeological taxonomic units. In palaeontology, the definition of such taxonomic units – most... are constructed, it is possible to define an archaeological culture at any given point in this hierarchy, depending on the scale of analysis. A brief example from the Late Glacial in Southern Scandinavia is presented, and it is shown that this approach can be used to operationalise an evolutionary definition...

  17. Evolutionary Stable Strategy

    Indian Academy of Sciences (India)

    Evolutionary Stable Strategy: Application of Nash Equilibrium in Biology. General Article, Resonance – Journal of Science Education, Volume 21, Issue 9, September 2016, pp 803– ... Keywords: evolutionary game theory, evolutionary stable state, conflict, cooperation, biological games.

  18. Self-organized modularization in evolutionary algorithms.

    Science.gov (United States)

    Dauscher, Peter; Uthmann, Thomas

    2005-01-01

    The principle of modularization has proven to be extremely successful in the field of technical applications and particularly for Software Engineering purposes. The question to be answered within the present article is whether mechanisms can also be identified within the framework of Evolutionary Computation that cause a modularization of solutions. We will concentrate on processes, where modularization results only from the typical evolutionary operators, i.e. selection and variation by recombination and mutation (and not, e.g., from special modularization operators). This is what we call Self-Organized Modularization. Based on a combination of two formalizations by Radcliffe and Altenberg, some quantitative measures of modularity are introduced. Particularly, we distinguish Built-in Modularity as an inherent property of a genotype and Effective Modularity, which depends on the rest of the population. These measures can easily be applied to a wide range of present Evolutionary Computation models. It will be shown, both theoretically and by simulation, that under certain conditions, Effective Modularity (as defined within this paper) can be a selection factor. This causes Self-Organized Modularization to take place. The experimental observations emphasize the importance of Effective Modularity in comparison with Built-in Modularity. Although the experimental results have been obtained using a minimalist toy model, they can lead to a number of consequences for existing models as well as for future approaches. Furthermore, the results suggest a complex self-amplification of highly modular equivalence classes in the case of respected relations. Since the well-known Holland schemata are just the equivalence classes of respected relations in most Simple Genetic Algorithms, this observation emphasizes the role of schemata as Building Blocks (in comparison with arbitrary subsets of the search space).

  19. Evolutionary process of deep-sea Bathymodiolus mussels.

    Directory of Open Access Journals (Sweden)

    Jun-Ichi Miyazaki

    Full Text Available BACKGROUND: Since the discovery of deep-sea chemosynthesis-based communities, much work has been done to clarify their organismal and environmental aspects. However, major topics remain to be resolved, including when and how organisms invade and adapt to deep-sea environments; whether strategies for invasion and adaptation are shared by different taxa or unique to each taxon; how organisms extend their distribution and diversity; and how they become isolated to speciate in continuous waters. Deep-sea mussels are among the dominant organisms in chemosynthesis-based communities; thus, investigations of their origin and evolution contribute to resolving questions about life in those communities. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the worldwide phylogenetic relationships of deep-sea Bathymodiolus mussels and their mytilid relatives by analyzing nucleotide sequences of the mitochondrial cytochrome c oxidase subunit I (COI) and NADH dehydrogenase subunit 4 (ND4) genes. Phylogenetic analysis of the concatenated sequence data showed that mussels of the subfamily Bathymodiolinae from vents and seeps were divided into four groups, that mussels of the subfamily Modiolinae from sunken wood and whale carcasses assumed the outgroup position, and that shallow-water modioline mussels were positioned more distantly to the bathymodioline mussels. We provisionally hypothesized the evolutionary history of Bathymodiolus mussels by estimating evolutionary time under a relaxed molecular clock model. Diversification of bathymodioline mussels was initiated in the early Miocene, and subsequent diversification of the groups occurred in the early to middle Miocene. CONCLUSIONS/SIGNIFICANCE: The phylogenetic relationships support the "Evolutionary stepping stone hypothesis," in which mytilid ancestors exploited sunken wood and whale carcasses in their progressive adaptation to deep-sea environments. This hypothesis is also supported by the evolutionary transition of

  20. Ecological and evolutionary impacts of changing climatic variability.

    Science.gov (United States)

    Vázquez, Diego P; Gianoli, Ernesto; Morris, William F; Bozinovic, Francisco

    2017-02-01

    While average temperature is likely to increase in most locations on Earth, many places will simultaneously experience higher variability in temperature, precipitation, and other climate variables. Although ecologists and evolutionary biologists widely recognize the potential impacts of changes in average climatic conditions, relatively little attention has been paid to the potential impacts of changes in climatic variability and extremes. We review the evidence on the impacts of increased climatic variability and extremes on physiological, ecological and evolutionary processes at multiple levels of biological organization, from individuals to populations and communities. Our review indicates that climatic variability can have profound influences on biological processes at multiple scales of organization. Responses to increased climatic variability and extremes are likely to be complex and cannot always be generalized, although our conceptual and methodological toolboxes allow us to make informed predictions about the likely consequences of such climatic changes. We conclude that climatic variability represents an important component of climate that deserves further attention. © 2015 Cambridge Philosophical Society.

  1. Cash Management Policies By Evolutionary Models: A Comparison Using The MILLER-ORR Model

    Directory of Open Access Journals (Sweden)

    Marcelo Botelho da Costa Moraes

    2013-10-01

    Full Text Available This work aims to apply genetic algorithms (GA) and particle swarm optimization (PSO) to cash balance management, comparing the performance of these computational models against the Miller-Orr model. The paper proposes the application of computational evolutionary models to minimize the total cost of maintaining a cash balance, obtaining the parameters for a cash management policy, using assumptions presented in the literature regarding the maintenance cost and the opportunity cost of cash. To this end, we developed computational experiments, implementing the algorithms on simulated cash flows. As a control, an algorithm was developed that uses the Miller-Orr model with a defined lower-bound parameter, which is not obtained by the original model. The results indicate that the evolutionary algorithms outperform the Miller-Orr model, with the PSO algorithm prevailing.
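
    For reference, the baseline policy the evolutionary algorithms are compared against can be sketched directly: the standard Miller-Orr model fixes a return point Z and an upper bound H from the transaction cost, the cash-flow variance, and the daily opportunity rate, given a lower bound L that, as the abstract notes, the original model does not determine. The parameter values below are illustrative only.

```python
# Minimal sketch of the Miller-Orr control-limit policy; L is an input here.
import math

def miller_orr(fixed_cost, daily_variance, daily_rate, lower_bound):
    """Return (return point Z, upper bound H) of the Miller-Orr policy."""
    spread = 3.0 * (0.75 * fixed_cost * daily_variance / daily_rate) ** (1.0 / 3.0)
    z = lower_bound + spread / 3.0
    h = lower_bound + spread
    return z, h

def apply_policy(balance, z, h, lower_bound):
    """When the balance hits a control limit, transfer cash back to the return point."""
    if balance >= h or balance <= lower_bound:
        return z          # buy/sell short-term securities to restore Z
    return balance        # otherwise leave the balance alone

z, h = miller_orr(fixed_cost=50.0, daily_variance=1000.0 ** 2,
                  daily_rate=0.12 / 365, lower_bound=2000.0)
print(f"return point Z = {z:,.0f}, upper bound H = {h:,.0f}")
```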

  2. Public safety investigations-A new evolutionary step in safety enhancement?

    International Nuclear Information System (INIS)

    Stoop, John; Roed-Larsen, Sverre

    2009-01-01

    A historical overview highlights the evolutionary nature of developments in accident investigations in the transport industry. Based on a series of major events outside transportation, the concept of accident investigations has broadened to other domains, with a widening of the scope of the investigation. Consequently, existing investigation boards are forced to adapt their mandates, missions and methods. With the introduction of social risk perception and the application of the concept of safety investigation in the public sector, a change of focus towards the aftermath and non-technical issues of a more generic nature emerges. This expansion has also gained the interest of the social sciences and public governance, generating new underlying models and theories on risk and responsibility. The evolutionary development of safety investigations is demonstrated by the various organisational forms which shaped accident investigations in different countries. Underneath these organisational differences, a need for a common methodology and a reflection on fundamental notions is discussed. In particular, differences among human operator models, the allocation of responsibilities in design concepts, and methodological issues are elaborated. The needs and opportunities for a transition from accident prevention towards systems change are indicated. At present, the situation is ambiguous. An encompassing inventory can only provide a general oversight over emerging trends and lacks analytic rigor on specific topics. The societal dimensions, institutional changes at the level of governance and control, and the powers that advocate or challenge investigations are not yet fully described. Therefore, in the conclusions a small number of critical challenges and threats are identified that should be open to scrutiny in order to facilitate a new, evolutionary step in safety enhancement.

  3. Intelligent systems/software engineering methodology - A process to manage cost and risk

    Science.gov (United States)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  4. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is “are there any algorithms that can design evolutionary algorithms automatically?” A more complete formulation of the question is “can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?” In this paper, a novel evolutionary algorithm based on the automatic design of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems were conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which indicates that algorithms designed automatically by computers can compete with algorithms designed by human beings.
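
    The idea of searching an operator space alongside the solution space can be illustrated, in a much weaker form, by classical self-adaptation, where each individual carries its own mutation parameter that evolves together with the solution. The sketch below shows that generic scheme on a toy benchmark; it is not the paper's automatic operator-designing algorithm.

```python
# Generic self-adaptive evolution strategy: the mutation step size is itself evolved.
import random, math

def sphere(x):                       # minimization benchmark
    return sum(v * v for v in x)

def evolve(dim=10, pop_size=20, generations=200):
    # each individual is (solution genes, its own mutation step size sigma)
    pop = [([random.uniform(-5, 5) for _ in range(dim)],
            random.uniform(0.1, 1.0)) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for genes, sigma in pop:
            # self-adapt the operator parameter, then apply the operator it defines
            new_sigma = max(1e-3, sigma * math.exp(0.2 * random.gauss(0, 1)))
            child = [g + new_sigma * random.gauss(0, 1) for g in genes]
            offspring.append((child, new_sigma))
        # (mu + lambda) truncation selection on solution quality only
        pop = sorted(pop + offspring, key=lambda ind: sphere(ind[0]))[:pop_size]
    return pop[0]

best_genes, best_sigma = evolve()
print("best fitness:", sphere(best_genes), "evolved step size:", best_sigma)
```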

  5. Beam standardization and dosimetric methodology in computed tomography

    International Nuclear Information System (INIS)

    Maia, Ana Figueiredo

    2005-01-01

    Special ionization chambers, named pencil ionization chambers, are used in dosimetric procedures in computed tomography (CT) beams. In this work, an extensive study of pencil ionization chambers was performed as a contribution to the accuracy of dosimetric procedures in CT beams. The international scientific community has recently been discussing the need to establish a specific calibration procedure for CT ionization chambers, since these chambers present special characteristics that differentiate them from other ionization chambers used in diagnostic radiology beams. In this work, an adequate calibration procedure for pencil ionization chambers was established at the Calibration Laboratory of the Instituto de Pesquisas Energéticas e Nucleares, in accordance with the most recent international recommendations. Two calibration methodologies were tested and analyzed in comparative studies. Moreover, a new extended-length parallel plate ionization chamber, with a transversal section very similar to that of pencil ionization chambers, was developed. The operational characteristics of this chamber were determined, and the results obtained showed that its behaviour is adequate as a reference system in CT standard beams. Two other studies were performed during this work, both using CT ionization chambers. The first study concerned the performance of a pencil ionization chamber in standard radiation beams of several types and energies, and the results showed that this chamber presents satisfactory behaviour in other radiation qualities, such as those of diagnostic radiology, mammography and radiotherapy. In the second study, a tandem system for the verification of half-value layer variations in CT equipment, using a pencil ionization chamber, was developed. Because of the X-ray tube rotation, the determination of half-value layers in computed tomography equipment is not an easy task, and it is usually not performed within quality control programs. (author)

  6. Evolutionary game theory using agent-based methods.

    Science.gov (United States)

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods in which each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example, in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations. Copyright © 2016 Elsevier B.V. All rights reserved.
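
    As a minimal illustration of the agent-based approach the abstract contrasts with equation-based analysis, the sketch below evolves a finite population of unconditional cooperators and defectors in a Prisoner's Dilemma under death-birth updating with a non-vanishing mutation rate. The payoff matrix, population size, and rates are generic textbook choices, not values from the paper.

```python
# Agent-based Prisoner's Dilemma in a finite population with mutation.
import random

PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def step(pop, mu=0.01):
    # each agent plays everyone else; fitness = average payoff
    fitness = []
    for i, s in enumerate(pop):
        total = sum(PAYOFF[(s, t)] for j, t in enumerate(pop) if j != i)
        fitness.append(total / (len(pop) - 1))
    # death-birth updating: a random agent dies, a fitness-proportional parent reproduces
    dead = random.randrange(len(pop))
    parent = random.choices(range(len(pop)), weights=fitness, k=1)[0]
    child = pop[parent]
    if random.random() < mu:                 # non-vanishing mutation rate
        child = 'C' if child == 'D' else 'D'
    pop[dead] = child

pop = ['C'] * 15 + ['D'] * 15
for _ in range(5000):
    step(pop)
print("cooperators remaining:", pop.count('C'), "/", len(pop))
```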

  7. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279

  8. Evolutionary optimization of neural networks with heterogeneous computation: study and implementation

    OpenAIRE

    FE, JORGE DEOLINDO; Aliaga Varea, Ramón José; Gadea Gironés, Rafael

    2015-01-01

    When optimizing artificial neural networks (ANNs) via evolutionary algorithms, and implementing the training needed to evaluate the objective function, there is often a trade-off between efficiency and flexibility. Pure software solutions on general-purpose processors tend to be slow because they do not take advantage of the inherent parallelism, whereas hardware realizations usually rely on optimizations that reduce the range of applicable network topologies, or they...

  9. Methodology and computer program for applying improved, inelastic ERR for the design of mine layouts on planar reefs.

    CSIR Research Space (South Africa)

    Spottiswoode, SM

    2002-08-01

    Full Text Available and the visco-plastic models of Napier and Malan (1997) and Malan (2002). Methodologies and a computer program (MINF) are developed during this project that write synthetic catalogues of seismic events to simulate the rock response to mining...

  10. Ancient Biomolecules and Evolutionary Inference.

    Science.gov (United States)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske

    2018-04-25

    Over the last decade, studies of ancient biomolecules – particularly ancient DNA, proteins, and lipids – have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  11. Effectively Tackling Reinsurance Problems by Using Evolutionary and Swarm Intelligence Algorithms

    Directory of Open Access Journals (Sweden)

    Sancho Salcedo-Sanz

    2014-04-01

    Full Text Available This paper is focused on solving different hard optimization problems that arise in the field of insurance and, more specifically, in reinsurance problems. In this area, the complexity of the models and assumptions considered in the definition of the reinsurance rules and conditions produces hard black-box optimization problems (problems in which the objective function does not have an algebraic expression but is the output of a system, usually a computer program), which must be solved in order to obtain the optimal output of the reinsurance. The application of traditional optimization approaches is not possible in this kind of mathematical problem, so new computational paradigms must be applied to solve these problems. In this paper, we show the performance of two evolutionary and swarm intelligence techniques (evolutionary programming and particle swarm optimization). We provide an analysis of three black-box optimization problems in reinsurance, in which the proposed approaches exhibit excellent behavior, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
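
    One of the two techniques, particle swarm optimization, can be sketched in a few lines; in the reinsurance setting, the placeholder function black_box below would be replaced by a call to the simulation program whose output defines the objective. All hyperparameters are generic textbook values, not the authors' settings.

```python
# Bare-bones particle swarm optimization on a stand-in black-box objective.
import random

def black_box(x):                        # placeholder for the reinsurance simulator
    return sum((v - 0.3) ** 2 for v in x)

def pso(dim=5, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(0, 1) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                      # per-particle best positions
    pbest_val = [black_box(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # swarm-wide best
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = black_box(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, value = pso()
print("best objective found:", value)
```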

  12. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Here, 'large-scale' means that the simulations involve such variety of scales and physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Analysis of the uncertainty included in simulations is needed to reveal the sensitivity to uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  13. Evolutionary Studies in Business: A Presentation of a New Journal

    Directory of Open Access Journals (Sweden)

    Paloma Fernández Pérez

    2016-01-01

    Full Text Available The Journal of Evolutionary Studies in Business is a new open access journal led by an international, interdisciplinary team of scholars, located in eight institutions on three continents, who want to attract contributions that help shed light on the new questions, challenges, methodologies, and realities faced by businesses in an evolutionary perspective. The journal calls particularly for review essays that deal with new research topics about business and provide useful overviews of the key ideas, scholars, and debates concerning important research topics on business and its environment. The strategic areas of interest for submissions are: Management Challenges, Entrepreneurship, Science and Business, Creative Industries, International Business, Business History, and Latin American Businesses. JESB will also publish articles about relevant online resources that contain information of interest to academic scholars and business practitioners.

  14. On the numerical treatment of selected oscillatory evolutionary problems

    Science.gov (United States)

    Cardone, Angelamaria; Conte, Dajana; D'Ambrosio, Raffaele; Paternoster, Beatrice

    2017-07-01

    We focus on evolutionary problems whose qualitative behaviour is known a priori and can be exploited in order to provide efficient and accurate numerical schemes. For classical numerical methods, depending on constant coefficients, the required computational effort can be quite heavy, due to the very small stepsizes needed to accurately reproduce the qualitative behaviour of the solution. In these situations, it may be convenient to use special purpose formulae, i.e., non-polynomially fitted formulae based on basis functions adapted to the problem (see [16, 17] and references therein). We show examples of special purpose strategies to solve two families of evolutionary problems exhibiting periodic solutions, i.e., partial differential equations and Volterra integral equations.

  15. Simplified Drift Analysis for Proving Lower Bounds in Evolutionary Computation

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2011-01-01

    Drift analysis is a powerful tool used to bound the optimization time of evolutionary algorithms (EAs). Various previous works apply a drift theorem going back to Hajek in order to show exponential lower bounds on the optimization time of EAs. However, this drift theorem is tedious to read... and to apply, since it requires two bounds on the moment-generating (exponential) function of the drift. A recent work identifies a specialization of this drift theorem that is much easier to apply. Nevertheless, it is not as simple and not as general as possible. The present paper picks up Hajek’s line...
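
    To illustrate what a drift theorem buys, the sketch below applies the (simpler) multiplicative drift theorem to the textbook example of the (1+1) EA on OneMax: the number of zero-bits decreases in expectation by a factor of at least (1 - 1/(e*n)) per step, so the expected optimization time is at most (1 + ln n) * e * n. This is a standard upper-bound example for intuition, not the lower-bound machinery the paper develops.

```python
# Empirical check of a multiplicative drift bound for the (1+1) EA on OneMax.
import math, random

def one_plus_one_ea(n):
    x = [random.randint(0, 1) for _ in range(n)]
    steps = 0
    while sum(x) < n:
        # standard bit mutation: flip each bit independently with probability 1/n
        y = [b if random.random() >= 1.0 / n else 1 - b for b in x]
        if sum(y) >= sum(x):          # elitist acceptance
            x = y
        steps += 1
    return steps

n = 100
runs = [one_plus_one_ea(n) for _ in range(20)]
delta = 1.0 / (math.e * n)            # per-step multiplicative drift of the zero count
bound = (1 + math.log(n)) / delta     # multiplicative drift theorem (worst-case start)
print(f"empirical mean: {sum(runs) / len(runs):.0f} steps, drift bound: {bound:.0f}")
```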

  16. Evolutionary inference across eukaryotes identifies specific pressures favoring mitochondrial gene retention

    OpenAIRE

    Williams, Ben; Johnston, Iain

    2016-01-01

    Since their endosymbiotic origin, mitochondria have lost most of their genes. Although many selective mechanisms underlying the evolution of mitochondrial genomes have been proposed, a data-driven exploration of these hypotheses is lacking, and a quantitatively supported consensus remains absent. We developed HyperTraPS, a methodology coupling stochastic modelling with Bayesian inference, to identify the ordering of evolutionary events and suggest their causes. Using 2015 complete mitochondri...

  17. A dose to curie conversion methodology

    International Nuclear Information System (INIS)

    Stowe, P.A.

    1987-01-01

    Development of the computer code RadCAT (Radioactive waste Classification And Tracking) has led to a simple dose-rate-to-curie-content conversion methodology for containers with internally distributed radioactive material. It was determined early on that, if possible, the computerized dose rate to curie evaluation model employed in RadCAT should yield the same results as the hand method utilized and specified in plant procedures. A review of current industry practices indicated that two distinct types of computational methodology are presently in use. The most common methods are computer-based calculations utilizing complex mathematical models specifically established for various container geometries. This type of evaluation is tedious, however, and does not lend itself to repetition by hand. The second method of evaluation therefore consists of simplified expressions that sacrifice accuracy for ease of computation and generally overestimate container curie content. To meet the aforementioned criterion, current computer-based models were deemed unacceptably complex and hand computational methods too inaccurate for serious consideration. The contact dose rate/curie content analysis methodology presented herein provides an equation that is easy to use in hand calculations yet achieves accuracy equivalent to that of other computer-based computations

  18. An integrated impact assessment and weighting methodology: evaluation of the environmental consequences of computer display technology substitution.

    Science.gov (United States)

    Zhou, Xiaoying; Schoenung, Julie M

    2007-04-01

    Computer display technology is currently in a state of transition, as the traditional technology of cathode ray tubes is being replaced by liquid crystal display flat-panel technology. Technology substitution and process innovation require the evaluation of the trade-offs among environmental impact, cost, and engineering performance attributes. General impact assessment methodologies, decision analysis and management tools, and optimization methods commonly used in engineering cannot efficiently address the issues needed for such evaluation. The conventional Life Cycle Assessment (LCA) process often generates results that can be subject to multiple interpretations, although the advantages of the LCA concept and framework obtain wide recognition. In the present work, the LCA concept is integrated with Quality Function Deployment (QFD), a popular industrial quality management tool, which is used as the framework for the development of our integrated model. The problem of weighting is addressed by using pairwise comparison of stakeholder preferences. Thus, this paper presents a new integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), to assess the environmental behavior of alternative technologies in correlation with their performance and economic characteristics. Computer display technology is used as the case study to further develop our methodology through the modification and integration of various quality management tools (e.g., process mapping, prioritization matrix) and statistical methods (e.g., multi-attribute analysis, cluster analysis). Life cycle thinking provides the foundation for our methodology, as we utilize a published LCA report, which stopped at the characterization step, as our starting point. Further, we evaluate the validity and feasibility of our methodology by considering uncertainty and conducting sensitivity analysis.

  19. Equipment qualification testing methodology research at Sandia Laboratories

    International Nuclear Information System (INIS)

    Jeppesen, D.

    1983-01-01

    The Equipment Qualification Research Testing (EQRT) program is an evolutionary outgrowth of the Qualification Testing Evaluation (QTE) program at Sandia. The primary emphasis of the program has been qualification methodology research. The EQRT program offers to the industry a research-oriented perspective on qualification-related component performance, as well as refinements to component testing standards which are based upon actual component testing research

  20. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then

  1. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  2. Population genetics of non-genetic traits: Evolutionary roles of stochasticity in gene expression

    KAUST Repository

    Mineta, Katsuhiko

    2015-05-01

    The role of stochasticity in evolutionary genetics has long been debated. To date, however, the potential roles of non-genetic traits in evolutionary processes have been largely neglected. In molecular biology, growing evidence suggests that stochasticity in gene expression (SGE) is common and that SGE has major impacts on phenotypes and fitness. Here, we provide a general overview of the potential effects of SGE on population genetic parameters, arguing that SGE can indeed have a profound effect on evolutionary processes. Our analyses suggest that SGE potentially alters the fate of mutations by influencing effective population size and fixation probability. In addition, a genetic control of SGE magnitude could evolve under certain conditions, if the fitness of the less-fit individual increases due to SGE and environmental fluctuation. Although empirical evidence for our arguments is yet to come, methodological developments for precisely measuring SGE in living organisms will further advance our understanding of SGE-driven evolution.

  3. Population genetics of non-genetic traits: Evolutionary roles of stochasticity in gene expression

    KAUST Repository

    Mineta, Katsuhiko; Matsumoto, Tomotaka; Osada, Naoki; Araki, Hitoshi

    2015-01-01

    The role of stochasticity in evolutionary genetics has long been debated. To date, however, the potential roles of non-genetic traits in evolutionary processes have been largely neglected. In molecular biology, growing evidence suggests that stochasticity in gene expression (SGE) is common and that SGE has major impacts on phenotypes and fitness. Here, we provide a general overview of the potential effects of SGE on population genetic parameters, arguing that SGE can indeed have a profound effect on evolutionary processes. Our analyses suggest that SGE potentially alters the fate of mutations by influencing effective population size and fixation probability. In addition, a genetic control of SGE magnitude could evolve under certain conditions, if the fitness of the less-fit individual increases due to SGE and environmental fluctuation. Although empirical evidence for our arguments is yet to come, methodological developments for precisely measuring SGE in living organisms will further advance our understanding of SGE-driven evolution.

  4. Embodied artificial evolution: Artificial evolutionary systems in the 21st Century.

    Science.gov (United States)

    Eiben, A E; Kernbach, S; Haasdijk, Evert

    2012-12-01

    Evolution is one of the major omnipresent powers in the universe that has been studied for about two centuries. Recent scientific and technical developments make it possible to make the transition from passively understanding to actively using evolutionary processes. Today this is possible in Evolutionary Computing, where human experimenters can design and manipulate all components of evolutionary processes in digital spaces. We argue that in the near future it will be possible to implement artificial evolutionary processes outside such imaginary spaces and make them physically embodied. In other words, we envision the "Evolution of Things", rather than just the evolution of digital objects, leading to a new field of Embodied Artificial Evolution (EAE). The main objective of this paper is to present a unifying vision in order to aid the development of this high potential research area. To this end, we introduce the notion of EAE, discuss a few examples and applications, and elaborate on the expected benefits as well as the grand challenges this developing field will have to address.

  5. Genetic networks and soft computing.

    Science.gov (United States)

    Mitra, Sushmita; Das, Ranajit; Hayashi, Yoichi

    2011-01-01

    The analysis of gene regulatory networks provides enormous information on various fundamental cellular processes involving growth, development, hormone secretion, and cellular communication. Their extraction from available gene expression profiles is a challenging problem. Such reverse engineering of genetic networks offers insight into cellular activity toward prediction of adverse effects of new drugs or possible identification of new drug targets. Tasks such as classification, clustering, and feature selection enable efficient mining of knowledge about gene interactions in the form of networks. It is known that biological data is prone to different kinds of noise and ambiguity. Soft computing tools, such as fuzzy sets, evolutionary strategies, and neurocomputing, have been found to be helpful in providing low-cost, acceptable solutions in the presence of various types of uncertainties. In this paper, we survey the role of these soft methodologies and their hybridizations, for the purpose of generating genetic networks.

  6. Polymorphic Evolutionary Games.

    Science.gov (United States)

    Fishman, Michael A

    2016-06-07

    In this paper, I present an analytical framework for polymorphic evolutionary games suitable for explicitly modeling evolutionary processes in diploid populations with sexual reproduction. The principal aspect of the proposed approach is adding diploid genetics cum sexual recombination to a traditional evolutionary game, and switching from phenotypes to haplotypes as the new game's pure strategies. Here, the payoff of a pure strategy is derived by summing the payoffs of all the phenotypes capable of producing gametes containing that particular haplotype, weighted by the pertinent probabilities. The resulting game is structurally identical to the familiar Evolutionary Games with non-linear pure strategy payoffs (Hofbauer and Sigmund, 1998. Cambridge University Press), and can be analyzed in terms of an established analytical framework for such games. These results can then be translated into terms of genotypic, and hence phenotypic, evolutionary stability pertinent to the original game. Copyright © 2016 Elsevier Ltd. All rights reserved.
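
    The haplotype-payoff construction can be made concrete in the simplest case of one locus with two alleles under random mating: the payoff of an allele is the payoff of each genotype able to transmit it, weighted by the probability that a random gamete carrying the allele came from that genotype. The sketch below iterates the resulting allele-frequency dynamics for illustrative genotype fitnesses with heterozygote advantage, which produces a stable interior polymorphism; the numbers are invented for demonstration.

```python
# One-locus, two-allele illustration of haplotype (allele) payoffs under random mating.
W = {('A', 'A'): 1.0, ('A', 'a'): 1.2, ('a', 'a'): 0.8}   # genotype fitnesses

def allele_payoff(p, allele):
    """Marginal fitness of an allele when A has frequency p (a has frequency 1-p)."""
    q = 1.0 - p
    if allele == 'A':
        return p * W[('A', 'A')] + q * W[('A', 'a')]
    return q * W[('a', 'a')] + p * W[('A', 'a')]

def update(p):
    """One generation of selection: replicator-style allele-frequency change."""
    wA, wa = allele_payoff(p, 'A'), allele_payoff(p, 'a')
    mean_w = p * wA + (1 - p) * wa
    return p * wA / mean_w

p = 0.05
for _ in range(200):
    p = update(p)
# heterozygote advantage -> stable interior equilibrium (here p* = 2/3)
print("equilibrium frequency of A:", round(p, 3))
```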

  7. Verification of a hybrid adjoint methodology in Titan for single photon emission computed tomography - 316

    International Nuclear Information System (INIS)

    Royston, K.; Haghighat, A.; Yi, C.

    2010-01-01

    The hybrid deterministic transport code TITAN is being applied to a Single Photon Emission Computed Tomography (SPECT) simulation of a myocardial perfusion study. The TITAN code's hybrid methodology allows the use of a discrete ordinates solver in the phantom region and a characteristics method solver in the collimator region. Here we seek to validate the adjoint methodology in TITAN for this application using a SPECT model created in the MCNP5 Monte Carlo code. The TITAN methodology was examined based on the response of a single-voxel detector placed in front of the heart, with and without collimation. For the case without collimation, the TITAN response for the single voxel-sized detector had a -9.96% difference relative to the MCNP5 response. To simulate collimation, the adjoint source was specified in directions located within the collimator acceptance angle. For a single collimator hole with a diameter matching the voxel dimension, a difference of -0.22% was observed. Comparisons to groupings of smaller collimator holes of two different sizes resulted in relative differences of 0.60% and 0.12%. The number of adjoint source directions within an acceptance angle was increased and showed no significant change in accuracy. Our results indicate that the hybrid adjoint methodology of TITAN yields accurate solutions more than a factor of two faster than MCNP5. (authors)

  8. Remembering the evolutionary Freud.

    Science.gov (United States)

    Young, Allan

    2006-03-01

    Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded as a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.

  9. Recent advances in evolutionary multi-objective optimization

    CERN Document Server

    Datta, Rituparna; Gupta, Abhishek

    2017-01-01

    This book covers the most recent advances in the field of evolutionary multiobjective optimization. With the aim of drawing the attention of up-and-coming scientists towards exciting prospects at the forefront of computational intelligence, the authors have made an effort to ensure that the ideas conveyed herein are accessible to the widest audience. The book begins with a summary of the basic concepts in multi-objective optimization. This is followed by brief discussions of various algorithms that have been proposed over the years for solving such problems, ranging from classical (mathematical) approaches to sophisticated evolutionary ones that are capable of seamlessly tackling practical challenges such as non-convexity, multi-modality, the presence of multiple constraints, etc. Thereafter, some of the key emerging aspects that are likely to shape future research directions in the field are presented. These include: optimization in dynamic environments, multi-objective bilevel programming, handling high ...

  10. Understanding the mind from an evolutionary perspective: an overview of evolutionary psychology.

    Science.gov (United States)

    Shackelford, Todd K; Liddle, James R

    2014-05-01

    The theory of evolution by natural selection provides the only scientific explanation for the existence of complex adaptations. The design features of the brain, like any organ, are the result of selection pressures operating over deep time. Evolutionary psychology posits that the human brain comprises a multitude of evolved psychological mechanisms, adaptations to specific and recurrent problems of survival and reproduction faced over human evolutionary history. Although some mistakenly view evolutionary psychology as promoting genetic determinism, evolutionary psychologists appreciate and emphasize the interactions between genes and environments. This approach to psychology has led to a richer understanding of a variety of psychological phenomena, and has provided a powerful foundation for generating novel hypotheses. Critics argue that evolutionary psychologists resort to storytelling, but as with any branch of science, empirical testing is a vital component of the field, with hypotheses standing or falling with the weight of the evidence. Evolutionary psychology is uniquely suited to provide a unifying theoretical framework for the disparate subdisciplines of psychology. An evolutionary perspective has provided insights into several subdisciplines of psychology, while simultaneously demonstrating the arbitrary nature of dividing psychological science into such subdisciplines. Evolutionary psychologists have amassed a substantial empirical and theoretical literature, but as a relatively new approach to psychology, many questions remain, with several promising directions for future research. For further resources related to this article, please visit the WIREs website. The authors have declared no conflicts of interest for this article. © 2014 John Wiley & Sons, Ltd.

  11. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended commerce or ubiquitous commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario in which a shopping mall is presented as an intelligent environment where customers use the NFC capabilities of their smartphones to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans that minimize the overall shopping cost by looking for the best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions to the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer depends on his or her previous purchases. In particular, the work provides details of the Shopping Plan software prototype and some experimental results showing the overall performance of the genetic algorithm.
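
    A drastically simplified version of the shopping-plan optimization might look like the sketch below: a genetic algorithm assigns each item on a shopping list to a shop so as to minimize total price net of coupon discounts. The encoding, prices, and coupon rule are invented for illustration; the paper's cost model additionally depends on each customer's purchase history and on the NFC/linked-data infrastructure.

```python
# Toy genetic algorithm for a coupon-aware shopping plan.
import random

# PRICES[item][shop]: price of each item at each of three hypothetical shops
PRICES = [[4.0, 3.5, 3.8], [10.0, 9.0, 11.0], [2.0, 2.5, 1.8], [7.0, 6.5, 6.9]]
COUPON = {1: 1.5}   # e.g., shop 1 grants a 1.5 discount if visited at all

def cost(plan):
    total = sum(PRICES[item][shop] for item, shop in enumerate(plan))
    total -= sum(d for shop, d in COUPON.items() if shop in plan)
    return total

def ga(pop_size=30, generations=100, pm=0.1):
    n_items, n_shops = len(PRICES), len(PRICES[0])
    pop = [[random.randrange(n_shops) for _ in range(n_items)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_items)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [random.randrange(n_shops) if random.random() < pm else g
                     for g in child]                  # per-gene mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = ga()
print("plan (shop per item):", best, "cost:", round(cost(best), 2))
```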

  12. On the methodology of feeding ecology in fish

    Directory of Open Access Journals (Sweden)

    Saikia Surjya Kumar

    2016-06-01

    Full Text Available Feeding ecology explains a predator's preference for some prey over others in its habitat, and the competition that results. The subject, as a branch of functional and applied biology, is highly neglected, and in the case of fish a uniform and consistent methodology is absent. The currently practiced methods are largely centred on mathematical indices and are highly error-prone because of their non-uniform outcomes. The subject therefore needs to be revisited to elucidate functional contributions and to become a more comparable and comprehensive science. In this article, approachable methodological strategies are put forward in three hierarchical steps, namely food occurrence, feeding biology, and interpretative ecology. All of these steps involve a wide range of techniques, within but not limited to the scope of ecology, and traverse from narrative to functional evolutionary ecology. The first step is an assumption-observation practice to assess the food of fish, followed by feeding biology, which links morphological, histological, cytological, bacteriological or enzymological correlations to preferred food in the environment. Interpretative ecology is the highest level of analysis, in which the outcomes are tested and discussed against evolutionary theories. A description of possible pedagogical approaches to the methods of feeding ecology studies is also provided.

  13. Fixed Parameter Evolutionary Algorithms and Maximum Leaf Spanning Trees: A Matter of Mutations

    DEFF Research Database (Denmark)

    Kratsch, Stefan; Lehre, Per Kristian; Neumann, Frank

    2011-01-01

    Evolutionary algorithms have been shown to be very successful for a wide range of NP-hard combinatorial optimization problems. We investigate the NP-hard problem of computing a spanning tree that has a maximal number of leaves by evolutionary algorithms in the context of fixed parameter tractability... For two common mutation operators, we show that an operator related to spanning tree problems leads to an FPT running time, in contrast to a general mutation operator that does not have this property.

  14. A New Methodology for Solving Trajectory Planning and Dynamic Load-Carrying Capacity of a Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Wanjin Guo

    2016-01-01

    Full Text Available A new methodology using a direct method for obtaining the best found trajectory planning and maximum dynamic load-carrying capacity (DLCC is presented for a 5-degree of freedom (DOF hybrid robot manipulator. A nonlinear constrained multiobjective optimization problem is formulated with four objective functions, namely, travel time, total energy involved in the motion, joint jerks, and joint acceleration. The vector of decision variables is defined by the sequence of the time-interval lengths associated with each two consecutive via-points on the desired trajectory of the 5-DOF robot generalized coordinates. Then this vector of decision variables is computed in order to minimize the cost function (which is the weighted sum of these four objective functions subject to constraints on joint positions, velocities, acceleration, jerks, forces/torques, and payload mass. Two separate approaches are proposed to deal with the trajectory planning problem and the maximum DLCC calculation for the 5-DOF robot manipulator using an evolutionary optimization technique. The adopted evolutionary algorithm is the elitist nondominated sorting genetic algorithm (NSGA-II. A numerical application is performed for obtaining best found solutions of trajectory planning and maximum DLCC calculation for the 5-DOF hybrid robot manipulator.
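
    To make the decision-variable encoding concrete, the sketch below evaluates such a weighted-sum cost for a toy set of via-points: the decision vector holds the time-interval lengths between consecutive via-points, and velocities, accelerations and jerks follow by finite differences. Via-points, weights and the energy proxy are hypothetical placeholders, not the paper's robot model, and constraint handling is omitted.

```python
# Weighted-sum trajectory cost over time-interval decision variables.
import numpy as np

via = np.array([[0.0, 0.0], [0.3, 0.5], [0.9, 0.4], [1.2, 1.0]])  # joint coords

def cost(dt, w=(1.0, 0.1, 0.01, 0.01)):
    """dt[k] is the time interval between via-points k and k+1 (all > 0)."""
    v = np.diff(via, axis=0) / dt[:, None]      # segment velocities
    a = np.diff(v, axis=0) / dt[1:, None]       # accelerations at interior points
    j = np.diff(a, axis=0) / dt[2:, None]       # jerks
    travel_time = dt.sum()
    energy = (v ** 2).sum()                     # crude kinetic-energy proxy
    return (w[0] * travel_time + w[1] * energy
            + w[2] * (a ** 2).sum() + w[3] * (j ** 2).sum())

print(cost(np.array([0.5, 0.4, 0.6])))
```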

  15. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the cumulative probability distribution functions (CPDF) and probability density functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). These functions have also been compared to the results obtained from different input parameter spaces.

  16. Methodology for Computational Fluid Dynamic Validation for Medical Use: Application to Intracranial Aneurysm.

    Science.gov (United States)

    Paliwal, Nikhil; Damiano, Robert J; Varble, Nicole A; Tutino, Vincent M; Dou, Zhongwang; Siddiqui, Adnan H; Meng, Hui

    2017-12-01

    Computational fluid dynamics (CFD) is a promising tool to aid in clinical diagnoses of cardiovascular diseases. However, it relies on assumptions that simplify the complexities of real cardiovascular flow. Given the high stakes in the clinical setting, it is critical to quantify the effect of these assumptions on CFD simulation results. Existing CFD validation approaches, however, do not quantify the error in simulation results due to the CFD solver's modeling assumptions; instead, they directly compare CFD simulation results against validation data. Thus, to quantify the accuracy of a CFD solver, we developed a validation methodology that calculates the CFD model error (arising from modeling assumptions). Our methodology identifies independent error sources in CFD and validation experiments, and calculates the model error by parsing out other sources of error inherent in simulation and experiments. To demonstrate the method, we simulated the flow field of a patient-specific intracranial aneurysm (IA) in the commercial CFD software Star-CCM+. Particle image velocimetry (PIV) provided validation datasets for the flow field on two orthogonal planes. The average model error in the Star-CCM+ solver was 5.63 ± 5.49% along the intersecting validation line of the orthogonal planes. Furthermore, we demonstrated that our validation method is superior to existing validation approaches by applying three representative existing validation techniques to our CFD and experimental dataset and comparing the validation results. Our validation methodology offers a streamlined workflow to extract the "true" accuracy of a CFD solver.
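
    As a rough sketch of this kind of error bookkeeping (framed here in the spirit of validation standards such as ASME V&V 20 rather than the paper's exact procedure), the comparison error between simulation and experiment can be computed pointwise while the non-model error sources are carried as uncertainties; all numbers below are made up.

```python
# Back-of-envelope comparison-error bookkeeping for CFD validation.
import numpy as np

S = np.array([0.42, 0.55, 0.61])   # simulated velocities along a validation line
D = np.array([0.40, 0.58, 0.57])   # PIV-measured velocities (same locations)

u_num, u_input, u_exp = 0.01, 0.02, 0.03   # assumed standard uncertainties

E = S - D                                   # comparison error at each point
u_val = np.sqrt(u_num**2 + u_input**2 + u_exp**2)   # non-model uncertainty
rel_model_error = 100 * np.abs(E) / np.abs(D)       # percent

print("comparison error:", E)
print("validation uncertainty:", u_val)
print("mean model error (%):", rel_model_error.mean())
```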

  17. Multilevel Hierarchy of Economic Space: Formation of Evolutionary Taxonomy

    Directory of Open Access Journals (Sweden)

    Daniil Petrovich Frolov

    2013-12-01

    Full Text Available The article considers methodological problems of the hierarchical structuring of economic space. A survey of the evolution of multilevel analysis concepts reveals the dominant role of two-level (micro-macro) neoclassical models, owing to the path-dependence effect. In institutional and evolutionary theories the application of mesoanalysis and three-level models is gradually becoming more active, but conventions in the field of taxonomy are extremely inert. The main methodological problems of a hierarchical taxonomization of economic space include the taxonomic "rupture" between the subject and the method of economics, the identification of the level (rank) and scale of economic phenomena, the identification of subjects and business locations, and terminological unification. The author's hierarchical model of economic space is developed in the context of the generalized evolutionary theory on the basis of multilevel population thinking. The model differentiates industrial and territorial (spatial) division and cooperation of labour and, more widely, of economic activity. Branches and generations are treated as objects of industrial analysis, while populations and ecocenoses are objects of spatial analysis, which allows the reintegration of spatial formations into the system of economic analysis. The study of mesolevels and interlevel relations is particularly important. Institutionalism can be considered a metanarrative, i.e. one of the universal languages of economics. The scales and ranks of the functions assigned to subjects and objects of transactions define the level differentiation of institutional forms in economic space.

  18. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    International Nuclear Information System (INIS)

    Vianna Neto, Julio Xavier; Andrade Bernert, Diego Luis de; Santos Coelho, Leandro dos

    2011-01-01

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.
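
    To make the Q-bit machinery concrete, here is a minimal quantum-inspired EA on a toy OneMax objective standing in for the dispatch cost: each individual is a vector of angles, observation collapses each angle to a bit with probability sin^2(theta), and a rotation-gate step nudges the angles toward the best observed solution. The step size and population size are illustrative, and the paper's diversity-based improvement is not reproduced.

```python
# Minimal quantum-inspired EA (QEA) sketch on a toy OneMax problem.
import math, random

N, POP, STEP = 16, 10, 0.05 * math.pi

def observe(angles):
    """Collapse each Q-bit angle to a classical bit."""
    return [1 if random.random() < math.sin(t) ** 2 else 0 for t in angles]

def fitness(bits):          # OneMax stands in for the dispatch cost here
    return sum(bits)

pop = [[math.pi / 4] * N for _ in range(POP)]   # unbiased superposition
best_bits, best_fit = None, -1
for gen in range(100):
    for angles in pop:
        bits = observe(angles)
        f = fitness(bits)
        if f > best_fit:
            best_bits, best_fit = bits, f
        for i in range(N):                       # rotate toward the best solution
            if best_bits[i] == 1:
                angles[i] = min(angles[i] + STEP, math.pi / 2)
            else:
                angles[i] = max(angles[i] - STEP, 0.0)
print(best_fit, best_bits)
```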

  19. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier, E-mail: julio.neto@onda.com.b [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Andrade Bernert, Diego Luis de, E-mail: dbernert@gmail.co [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Santos Coelho, Leandro dos, E-mail: leandro.coelho@pucpr.b [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature.

  20. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Julio Xavier Vianna [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

    The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions in EDPs. Research on merging evolutionary computation and quantum computation has been ongoing since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. From the results for the benchmark problem, it is observed that the proposed IQEA approach provides promising results when compared to various methods available in the literature. (author)

  1. [Evolutionary medicine].

    Science.gov (United States)

    Wjst, M

    2013-12-01

    Evolutionary medicine allows new insights into long-standing medical problems. Are we really "Stone Agers in the fast lane"? This insight might have enormous consequences and will allow new answers that could never be provided by traditional anthropology. Only now has this become possible, using data from molecular medicine and systems biology. Evolutionary medicine thereby takes a leap from a merely theoretical discipline to practical fields: reproductive, nutritional and preventive medicine, as well as microbiology, immunology and psychiatry. Evolutionary medicine is not another "just so story" but a serious candidate for the medical curriculum, providing a universal understanding of health and disease based on our biological origin. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Multi-objective evolutionary emergency response optimization for major accidents

    International Nuclear Information System (INIS)

    Georgiadou, Paraskevi S.; Papazoglou, Ioannis A.; Kiranoudis, Chris T.; Markatos, Nikolaos C.

    2010-01-01

    Emergency response planning in case of a major accident (hazardous material event, nuclear accident) is very important for the protection of the public and of workers' safety and health. In this context, several protective actions can be performed, such as evacuation of an area, protection of the population in buildings, and use of personal protective equipment. The best solution is not unique when multiple criteria are taken into consideration (e.g. health consequences, social disruption, economic cost). This paper presents a methodology for multi-objective optimization of emergency response planning in case of a major accident. The emergency policy with regard to the protective actions to be implemented is optimized. An evolutionary algorithm has been used as the optimization tool. Case studies demonstrating the methodology and its application to emergency response decision-making for accidents related to hazardous materials installations are presented. The methodology, with appropriate modification, is also suitable for supporting decisions in assessing emergency response procedures in other cases (nuclear accidents, transportation of hazardous materials) or for land-use planning issues.

  3. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Directory of Open Access Journals (Sweden)

    Afnizanfaizal Abdullah

    Full Text Available The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.
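
    A toy sketch of such a firefly/differential-evolution hybrid is shown below, minimizing a stand-in sphere objective instead of a real model-fitting residual; the constants (beta0, gamma, alpha, F) and the update schedule are illustrative assumptions, not the authors' tuned settings.

```python
# Firefly moves plus a DE-style neighbourhood search, on a toy objective.
import numpy as np

rng = np.random.default_rng(0)
DIM, POP, GENS = 5, 20, 200
beta0, gamma, alpha, F = 1.0, 1.0, 0.1, 0.5

def objective(x):                 # placeholder for a model-vs-data residual
    return np.sum(x ** 2)

X = rng.uniform(-5, 5, size=(POP, DIM))
for gen in range(GENS):
    f = np.array([objective(x) for x in X])
    order = np.argsort(f)
    for i in range(POP):          # firefly move: toward every brighter firefly
        for j in order:
            if f[j] < f[i]:
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                X[i] += beta * (X[j] - X[i]) + alpha * rng.normal(size=DIM)
    for i in range(POP):          # DE-style neighbourhood search on each point
        a, b, c = X[rng.choice(POP, size=3, replace=False)]
        trial = a + F * (b - c)
        if objective(trial) < objective(X[i]):
            X[i] = trial
best = min(X, key=objective)
print(objective(best))
```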

  4. An evolutionary firefly algorithm for the estimation of nonlinear biological model parameters.

    Science.gov (United States)

    Abdullah, Afnizanfaizal; Deris, Safaai; Anwar, Sohail; Arjunan, Satya N V

    2013-01-01

    The development of accurate computational models of biological processes is fundamental to computational systems biology. These models are usually represented by mathematical expressions that rely heavily on the system parameters. The measurement of these parameters is often difficult. Therefore, they are commonly estimated by fitting the predicted model to the experimental data using optimization methods. The complexity and nonlinearity of the biological processes pose a significant challenge, however, to the development of accurate and fast optimization methods. We introduce a new hybrid optimization method incorporating the Firefly Algorithm and the evolutionary operation of the Differential Evolution method. The proposed method improves solutions by neighbourhood search using evolutionary procedures. Testing our method on models for the arginine catabolism and the negative feedback loop of the p53 signalling pathway, we found that it estimated the parameters with high accuracy and within a reasonable computation time compared to well-known approaches, including Particle Swarm Optimization, Nelder-Mead, and Firefly Algorithm. We have also verified the reliability of the parameters estimated by the method using an a posteriori practical identifiability test.

  5. Aggregate meta-models for evolutionary multiobjective and many-objective optimization

    Czech Academy of Sciences Publication Activity Database

    Pilát, Martin; Neruda, Roman

    Roč. 116, 20 September (2013), s. 392-402 ISSN 0925-2312 R&D Projects: GA ČR GAP202/11/1368 Institutional support: RVO:67985807 Keywords : evolutionary algorithms * multiobjective optimization * many-objective optimization * surrogate models * meta-models * memetic algorithm Subject RIV: IN - Informatics, Computer Science Impact factor: 2.005, year: 2013

  6. 4th International Joint Conference on Computational Intelligence

    CERN Document Server

    Correia, António; Rosa, Agostinho; Filipe, Joaquim

    2015-01-01

    The present book includes extended and revised versions of a set of selected papers from the Fourth International Joint Conference on Computational Intelligence (IJCCI 2012), held in Barcelona, Spain, from 5 to 7 October 2012. The conference was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and was organized in cooperation with the Association for the Advancement of Artificial Intelligence (AAAI). The conference brought together researchers, engineers and practitioners in computational technologies, especially those related to the areas of fuzzy computation, evolutionary computation and neural computation. It is composed of three co-located conferences, each specialized in one of the aforementioned knowledge areas. Namely: - International Conference on Evolutionary Computation Theory and Applications (ECTA) - International Conference on Fuzzy Computation Theory and Applications (FCTA) - International Conference on Neural Computation Theory and Applications (NCTA)

  7. Estimating true evolutionary distances under the DCJ model.

    Science.gov (United States)

    Lin, Yu; Moret, Bernard M E

    2008-07-01

    Modern techniques can yield the ordering and strandedness of genes on each chromosome of a genome; such data already exists for hundreds of organisms. The evolutionary mechanisms through which the set of the genes of an organism is altered and reordered are of great interest to systematists, evolutionary biologists, comparative genomicists and biomedical researchers. Perhaps the most basic concept in this area is that of evolutionary distance between two genomes: under a given model of genomic evolution, how many events most likely took place to account for the difference between the two genomes? We present a method to estimate the true evolutionary distance between two genomes under the 'double-cut-and-join' (DCJ) model of genome rearrangement, a model under which a single multichromosomal operation accounts for all genomic rearrangement events: inversion, transposition, translocation, block interchange and chromosomal fusion and fission. Our method relies on a simple structural characterization of a genome pair and is both analytically and computationally tractable. We provide analytical results to describe the asymptotic behavior of genomes under the DCJ model, as well as experimental results on a wide variety of genome structures to exemplify the very high accuracy (and low variance) of our estimator. Our results provide a tool for accurate phylogenetic reconstruction from multichromosomal gene rearrangement data as well as a theoretical basis for refinements of the DCJ model to account for biological constraints. All of our software is available in source form under GPL at http://lcbb.epfl.ch.

  8. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  9. Applying Evolutionary Anthropology

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. PMID:25684561

  10. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session in the CHEP'94 conference. All the contributions to methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  11. Divergent evolutionary processes associated with colonization of offshore islands.

    Science.gov (United States)

    Martínková, Natália; Barnett, Ross; Cucchi, Thomas; Struchen, Rahel; Pascal, Marine; Pascal, Michel; Fischer, Martin C; Higham, Thomas; Brace, Selina; Ho, Simon Y W; Quéré, Jean-Pierre; O'Higgins, Paul; Excoffier, Laurent; Heckel, Gerald; Hoelzel, A Rus; Dobney, Keith M; Searle, Jeremy B

    2013-10-01

    Oceanic islands have been a test ground for evolutionary theory, but here, we focus on the possibilities for evolutionary study created by offshore islands. These can be colonized through various means and by a wide range of species, including those with low dispersal capabilities. We use morphology, modern and ancient sequences of cytochrome b (cytb) and microsatellite genotypes to examine colonization history and evolutionary change associated with occupation of the Orkney archipelago by the common vole (Microtus arvalis), a species found in continental Europe but not in Britain. Among possible colonization scenarios, our results are most consistent with human introduction at least 5100 bp (confirmed by radiocarbon dating). We used approximate Bayesian computation of population history to infer the coast of Belgium as the possible source and estimated the evolutionary timescale using a Bayesian coalescent approach. We showed substantial morphological divergence of the island populations, including a size increase presumably driven by selection and reduced microsatellite variation likely reflecting founder events and genetic drift. More surprisingly, our results suggest that a recent and widespread cytb replacement event in the continental source area purged cytb variation there, whereas the ancestral diversity is largely retained in the colonized islands as a genetic 'ark'. The replacement event in the continental M. arvalis was probably triggered by anthropogenic causes (land-use change). Our studies illustrate that small offshore islands can act as field laboratories for studying various evolutionary processes over relatively short timescales, informing about the mainland source area as well as the island. © 2013 John Wiley & Sons Ltd.

  12. Evolutionary Inference across Eukaryotes Identifies Specific Pressures Favoring Mitochondrial Gene Retention.

    Science.gov (United States)

    Johnston, Iain G; Williams, Ben P

    2016-02-24

    Since their endosymbiotic origin, mitochondria have lost most of their genes. Although many selective mechanisms underlying the evolution of mitochondrial genomes have been proposed, a data-driven exploration of these hypotheses is lacking, and a quantitatively supported consensus remains absent. We developed HyperTraPS, a methodology coupling stochastic modeling with Bayesian inference, to identify the ordering of evolutionary events and suggest their causes. Using 2,015 complete mitochondrial genomes, we inferred evolutionary trajectories of mtDNA gene loss across the eukaryotic tree of life. We find that proteins comprising the structural cores of the electron transport chain are preferentially encoded within mitochondrial genomes across eukaryotes. A combination of high GC content and high protein hydrophobicity is required to explain patterns of mtDNA gene retention; a model that accounts for these selective pressures can also predict the success of artificial gene transfer experiments in vivo. This work provides a general method for data-driven inference of the ordering of evolutionary and progressive events, here identifying the distinct features shaping mitochondrial genomes of present-day species. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Evolutionary dynamics on graphs: Efficient method for weak selection

    Science.gov (United States)

    Fu, Feng; Wang, Long; Nowak, Martin A.; Hauert, Christoph

    2009-04-01

    Investigating the evolutionary dynamics of game theoretical interactions in populations where individuals are arranged on a graph can be challenging in terms of computation time. Here, we propose an efficient method to study any type of game on arbitrary graph structures for weak selection. In this limit, evolutionary game dynamics represents a first-order correction to neutral evolution. Spatial correlations can be empirically determined under neutral evolution and provide the basis for formulating the game dynamics as a discrete Markov process by incorporating a detailed description of the microscopic dynamics based on the neutral correlations. This framework is then applied to one of the most intriguing questions in evolutionary biology: the evolution of cooperation. We demonstrate that the degree heterogeneity of a graph impedes cooperation and that the success of tit for tat depends not only on the number of rounds but also on the degree of the graph. Moreover, considering the mutation-selection equilibrium shows that the symmetry of the stationary distribution of states under weak selection is skewed in favor of defectors for larger selection strengths. In particular, degree heterogeneity—a prominent feature of scale-free networks—generally results in a more pronounced increase in the critical benefit-to-cost ratio required for evolution to favor cooperation as compared to regular graphs. This conclusion is corroborated by an analysis of the effects of population structures on the fixation probabilities of strategies in general 2×2 games for different types of graphs. Computer simulations confirm the predictive power of our method and illustrate the improved accuracy as compared to previous studies.
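
    For intuition, the following toy simulation runs a death-birth update of a donation game on a regular versus a scale-free graph and estimates how often a lone cooperator fixates. Note this is a brute-force simulation at a fixed selection intensity, not the paper's weak-selection expansion, and all parameters are illustrative.

```python
# Death-birth dynamics of a donation game on two graph topologies.
import random
import networkx as nx

b, c, w = 5.0, 1.0, 0.1                 # benefit, cost, selection intensity

def payoff(G, strat, v):
    nb = list(G.neighbors(v))
    return sum(b * strat[u] for u in nb) / len(nb) - c * strat[v]

def fixates(G, max_steps=50000):
    """Death-birth updating from one random cooperator; True if C fixates."""
    strat = {v: 0 for v in G}           # 0 = defector, 1 = cooperator
    strat[random.choice(list(G))] = 1
    for _ in range(max_steps):
        s = sum(strat.values())
        if s == 0 or s == len(strat):
            return s > 0
        v = random.choice(list(G))      # a random individual dies ...
        nb = list(G.neighbors(v))
        fit = [1 - w + w * payoff(G, strat, u) for u in nb]
        strat[v] = strat[random.choices(nb, weights=fit)[0]]  # ... neighbours compete
    return False

for G in (nx.random_regular_graph(4, 30, seed=1),
          nx.barabasi_albert_graph(30, 2, seed=1)):
    runs = 100
    print(sum(fixates(G) for _ in range(runs)) / runs)
```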

  14. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab eRoy

    2016-04-01

    Full Text Available One common research goal in the systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback of this approach is that much information is lost by averaging heterogeneous voxels, so a functional relationship between an ROI pair that evolves at a spatial scale much finer than the ROIs remains undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair, using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain or loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  15. Computer Presentation Programs and Teaching Research Methodologies

    OpenAIRE

    Motamedi, Vahid

    2015-01-01

    Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems that contribute to student errors while taking class notes, such as mistranscribed numbers on the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer pres...

  16. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments, because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained.

  17. Evolutionary Awareness

    Directory of Open Access Journals (Sweden)

    Gregory Gorelik

    2014-10-01

    Full Text Available In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.

  18. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results generally demonstrate small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantom. This effect was quite evident for organ cross-irradiation from electrons. The determination of spatial dose distributions demonstrated the possibility of evaluating more detailed dose data than those obtained with conventional methods, which will provide important information for clinical analysis in therapeutic procedures and in radiobiological studies of the human body. (author)

  19. Evolutionary optimization and game strategies for advanced multi-disciplinary design applications to aeronautics and UAV design

    CERN Document Server

    Periaux, Jacques; Lee, Dong Seop Chris

    2015-01-01

    Many complex aeronautical design problems can be formulated with efficient multi-objective evolutionary optimization methods and game strategies. This book describes the role of advanced innovative evolution tools in obtaining the solution, or the set of solutions, of single- or multi-disciplinary optimization problems. These tools use the concepts of multi-population, asynchronous parallelization and hierarchical topology, which allow different models (precise, intermediate and approximate), with each node belonging to a different hierarchical layer handled by a different evolutionary algorithm. The efficiency of evolutionary algorithms for both single- and multi-objective optimization problems is significantly improved by coupling EAs with games, and in particular by a new dynamic methodology named "Hybridized Nash-Pareto games". Multi-objective optimization techniques and robust design problems taking into account uncertainties are introduced and explained in detail. Several applications dealing with c...

  20. EVOLUTIONARY FOUNDATIONS FOR MOLECULAR MEDICINE

    Science.gov (United States)

    Nesse, Randolph M.; Ganten, Detlev; Gregory, T. Ryan; Omenn, Gilbert S.

    2015-01-01

    Evolution has long provided a foundation for population genetics, but many major advances in evolutionary biology from the 20th century are only now being applied in molecular medicine. They include the distinction between proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are further transforming evolutionary biology and creating yet more opportunities for progress at the interface of evolution with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and others to speed the development of evolutionary molecular medicine. PMID:22544168

  1. A shape and mesh adaptive computational methodology for gamma ray dose from volumetric sources

    International Nuclear Information System (INIS)

    Mirza, N.M.; Ali, B.; Mirza, S.M.; Tufail, M.; Ahmad, N.

    1991-01-01

    Indoor external exposure of the population is dominated by gamma rays emitted from the walls and the floor of a room. A shape and mesh-size adaptive flux calculation approach has been developed for a typical wall source. Parametric studies of the effect of mesh size on flux calculations have been carried out. The optimum value of the mesh size is found to depend strongly on the distance from the source, the permissible limits on uncertainty in flux predictions, and the computer CPU time. To test the computations, a typical wall source was reduced to a point, a line, and an infinite volume source of finite thickness, and the computed flux values were compared with values from the corresponding analytical expressions for these sources. Results indicate that under optimum conditions the errors remain less than 6% for fluxes calculated with this approach when compared with the analytical values for the point and line source approximations. When the wall is simulated as an infinite volume source of finite thickness, the errors in the computed-to-analytical flux ratios remain large for smaller wall dimensions; however, they become less than 10% when the wall dimensions exceed ten mean free paths for 3 MeV gamma rays. Also, specific dose rates from this methodology remain within 15% of the values obtained by the Monte Carlo method. (author)
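
    The mesh-size study can be mimicked in a few lines: subdivide a rectangular wall into an n x n mesh of point kernels, sum the uncollided flux at a detector point, and watch the estimate converge as the mesh is refined. The geometry and source strength below are hypothetical, and attenuation/buildup factors are omitted for brevity.

```python
# Point-kernel mesh integration of uncollided flux from a wall surface source.
import numpy as np

S_A = 1.0e6            # surface source strength, photons / (m^2 s)  (assumed)
W, H = 4.0, 3.0        # wall width and height, m                    (assumed)
det = np.array([2.0, 1.5, 2.0])   # detector 2 m in front of the wall centre

def flux(n):
    xs = (np.arange(n) + 0.5) * W / n      # mesh-cell centres across the wall
    ys = (np.arange(n) + 0.5) * H / n
    X, Y = np.meshgrid(xs, ys)
    r2 = (X - det[0])**2 + (Y - det[1])**2 + det[2]**2
    dA = (W / n) * (H / n)
    return np.sum(S_A * dA / (4 * np.pi * r2))   # sum of point kernels

for n in (2, 8, 32, 128):
    print(n, flux(n))      # the flux estimate converges as the mesh is refined
```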

  2. Applying natural evolution for solving computational problems - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases of the evolutionary process. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computer programs represented by trees. Bloat control and distributed evaluation will be introduced.
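
    The phases the lecture refers to (evaluation, selection, variation, replacement) fit in a short generational loop; the toy sketch below maximizes a bitstring objective and is a generic illustration, not an ECJ example.

```python
# One minimal generational EA loop showing the standard phases.
import random

def evaluate(ind):                 # evaluation: number of 1-bits (OneMax)
    return sum(ind)

def select(pop, k=3):              # selection: k-way tournament
    return max(random.sample(pop, k), key=evaluate)

def vary(p1, p2, pm=0.05):         # variation: one-point crossover + bit flips
    cut = random.randrange(1, len(p1))
    child = p1[:cut] + p2[cut:]
    return [b ^ 1 if random.random() < pm else b for b in child]

pop = [[random.randint(0, 1) for _ in range(30)] for _ in range(40)]
for gen in range(60):              # replacement: full generational turnover
    pop = [vary(select(pop), select(pop)) for _ in range(len(pop))]
print(max(map(evaluate, pop)))
```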

  3. Applying natural evolution for solving computational problems - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Darwin’s natural evolution theory has inspired computer scientists to solve computational problems. In a similar way to how humans and animals have evolved over millions of years, computational problems can be solved by evolving a population of solutions through generations until a good solution is found. In the first lecture, the fundamentals of evolutionary computing (EC) will be described, covering the different phases of the evolutionary process. ECJ, a framework for research in this field, will also be explained. In the second lecture, genetic programming (GP) will be covered. GP is a sub-field of EC where solutions are actual computer programs represented by trees. Bloat control and distributed evaluation will be introduced.

  4. Attractive evolutionary equilibria

    NARCIS (Netherlands)

    Joosten, Reinoud A.M.G.; Roorda, Berend

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary

  5. Wavelet-Based Methodology for Evolutionary Spectra Estimation of Nonstationary Typhoon Processes

    Directory of Open Access Journals (Sweden)

    Guang-Dong Zhou

    2015-01-01

    Full Text Available Closed-form expressions are proposed to estimate the evolutionary power spectral density (EPSD) of nonstationary typhoon processes by employing the wavelet transform. Relying on the definition of the EPSD and the concept of the wavelet transform, the wavelet coefficients of a nonstationary typhoon process at a certain time instant are interpreted as the Fourier transform of a new nonstationary oscillatory process, whose modulating function is equal to the modulating function of the nonstationary typhoon process multiplied by the wavelet function in the time domain. The EPSD of nonstationary typhoon processes is then deduced in closed form and is formulated as a weighted sum of the squared moduli of time-dependent wavelet functions. The weighting coefficients are frequency-dependent functions defined by the wavelet coefficients of the nonstationary typhoon process and the overlapping area of two shifted wavelets. Compared with the EPSD defined in the literature as a sum of the squared moduli of the wavelets in the frequency domain, this paper provides an EPSD estimation method in the time domain. The theoretical results are verified with uniformly modulated and non-uniformly modulated nonstationary typhoon processes.
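
    For orientation, the definition that such estimators target is Priestley's evolutionary spectrum of an oscillatory process; the schematic textbook form is recalled below (the paper's wavelet-specific derivation is not reproduced).

```latex
% Oscillatory-process representation (Priestley):
%   X(t) = \int_{-\infty}^{\infty} A(t,\omega)\, e^{i\omega t}\, dZ(\omega),
% with orthogonal increments satisfying E\,\lvert dZ(\omega)\rvert^2 = S(\omega)\,d\omega.
% The evolutionary power spectral density is then the modulated spectrum
\[
  S_X(t,\omega) = \lvert A(t,\omega)\rvert^{2}\, S(\omega),
\]
% which the wavelet-based estimator above expresses as a weighted sum of
% squared moduli of time-dependent wavelet functions.
```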

  6. Smart learning objects for smart education in computer science theory, methodology and robot-based implementation

    CERN Document Server

    Stuikys, Vytautas

    2015-01-01

    This monograph presents the challenges, vision and context for designing smart learning objects (SLOs) through Computer Science (CS) education modelling and feature-model transformations. It presents the latest research on meta-programming-based generative learning objects (those with advanced features are treated as SLOs) and the use of educational robots in teaching CS topics. The introduced methodology covers the overall processes for developing SLOs and a smart educational environment (SEE) and integrates both into a real education setting to support the teaching of CS using constructivist a

  7. Methodological testing: Are fast quantum computers illusions?

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Steven [Tachyon Design Automation, San Francisco, CA (United States)

    2013-07-01

    Popularity of the idea of computers constructed from the principles of QM started with Feynman's 'Lectures On Computation', but he called the idea crazy and dependent on statistical mechanics. In 1987, Feynman published a paper in 'Quantum Implications - Essays in Honor of David Bohm' on negative probabilities, which he said gave him cultural shock. The problem with imagined fast quantum computers (QC) is that speed requires both statistical behavior and the truth of the mathematical formalism. The Swedish Royal Academy's 2012 Nobel Prize in physics press release touted the discovery of methods to control ''individual quantum systems'', to ''measure and control very fragile quantum states'', which enables ''first steps towards building a new type of super fast computer based on quantum physics.'' A number of examples where widely accepted mathematical descriptions have turned out to be problematic are examined: problems with the use of oracles in P=NP computational complexity, Paul Finsler's proof of the continuum hypothesis, and Turing's Enigma code breaking versus William Tutte's Colossus. I view QC research as faith in computational oracles with wished-for properties. Arthur Fine's interpretation, in 'The Shaky Game', of Einstein's skepticism toward QM is discussed. If Einstein's reality as space-time curvature is correct, then space-time computers will be the next type of super fast computer.

  8. AN EVOLUTIONARY ALGORITHM FOR FAST INTENSITY BASED IMAGE MATCHING BETWEEN OPTICAL AND SAR SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    P. Fischer

    2018-04-01

    Full Text Available This paper presents a hybrid evolutionary algorithm for fast intensity-based matching between satellite imagery from SAR and very high-resolution (VHR) optical sensor systems. The precise and accurate co-registration of image time series and of images from different sensors is a key task in multi-sensor image processing scenarios. The necessary preprocessing step of image matching and tie-point detection is divided into a search problem and a similarity measurement. Within this paper we evaluate the use of an evolutionary search strategy for establishing the spatial correspondence between satellite imagery from optical and radar sensors. The aim of the proposed algorithm is to decrease the computational costs during the search process by formulating the search as an optimization problem. Building upon the canonical evolutionary algorithm, the proposed algorithm is adapted for SAR/optical-imagery intensity-based matching. Extensions are introduced using techniques like hybridization (e.g. local search) and others to lower the number of objective function calls and refine the result. The algorithm significantly decreases the computational costs whilst finding the optimal solution in a reliable way.
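
    A minimal stand-in for this search formulation is a small (mu+lambda) evolution strategy that recovers an integer translation between two images by minimizing mean squared intensity difference; the synthetic scene, population sizes and step-size schedule below are illustrative assumptions, far simpler than real SAR/optical matching.

```python
# Evolutionary search for a 2-D translation between two synthetic images.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
X, Y = np.meshgrid(x, x)
ref = np.sin(3 * X) + np.cos(2 * Y) + 0.5 * np.sin(5 * X + 2 * Y)  # smooth scene
TRUE = (7, -4)                                    # hidden row/column offset
mov = np.roll(np.roll(ref, TRUE[0], axis=0), TRUE[1], axis=1)

def ssd(shift):
    """Mean squared difference after undoing an integer candidate shift."""
    dx, dy = int(round(shift[0])), int(round(shift[1]))
    back = np.roll(np.roll(mov, -dx, axis=0), -dy, axis=1)
    return np.mean((back - ref) ** 2)

MU, LAM, sigma = 5, 20, 4.0
pop = [rng.uniform(-16, 16, size=2) for _ in range(MU)]
for gen in range(40):
    kids = [pop[rng.integers(MU)] + rng.normal(scale=sigma, size=2)
            for _ in range(LAM)]
    pop = sorted(pop + kids, key=ssd)[:MU]        # (mu+lambda) survivor selection
    sigma = max(0.3, sigma * 0.93)                # simple step-size decay
print("estimated offset:", np.rint(pop[0]).astype(int), "true:", TRUE)
```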

  9. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in the sciences. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality, used to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. A new approach to estimating model errors based on EM is thereby proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it can realize the combination of statistics and dynamics to a certain extent.
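
    The twin-experiment setup described above is easy to reproduce schematically: integrate the classic Lorenz-63 system as the model, and the same system with a small periodic forcing as "reality". The forcing amplitude and frequency below are illustrative, and the EM-based error identification itself is not reproduced.

```python
# Lorenz-63 twin experiment: perfect model vs. periodically forced "reality".
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, forcing=0.0):
    x, y, z = s
    return [10.0 * (y - x),
            x * (28.0 - z) - y + forcing * np.sin(0.5 * t),  # "model error" term
            x * y - (8.0 / 3.0) * z]

s0, t_eval = [1.0, 1.0, 1.0], np.linspace(0, 10, 1001)
truth = solve_ivp(lorenz, (0, 10), s0, t_eval=t_eval, args=(0.5,))   # "reality"
model = solve_ivp(lorenz, (0, 10), s0, t_eval=t_eval, args=(0.0,))   # model
err = np.abs(truth.y - model.y).mean(axis=0)
print("mean absolute divergence at t=1, 5, 10:", err[100], err[500], err[1000])
```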

  10. Evolutionary engineering to enhance starter culture performance in food fermentations.

    Science.gov (United States)

    Bachmann, Herwig; Pronk, Jack T; Kleerebezem, Michiel; Teusink, Bas

    2015-04-01

    Microbial starter cultures are essential for consistent product quality and functional properties such as the flavor, texture, pH or alcohol content of various fermented foods. Strain improvement programs to achieve desired properties in starter cultures are diverse, but developments in next-generation sequencing have led to increased interest in evolutionary engineering of desired phenotypes. We here discuss recent developments in strain selection protocols and how computational approaches can assist such experimental design. Furthermore, the analysis of evolved phenotypes and the possibilities offered by complex consortia are highlighted. Studies carried out mainly with yeast and lactic acid bacteria demonstrate the power of evolutionary engineering to deliver strains with novel phenotypes as well as insight into the underlying mechanisms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Asteroseismology of pulsating DA white dwarfs with fully evolutionary models

    Directory of Open Access Journals (Sweden)

    Althaus L.G.

    2013-03-01

    Full Text Available We present a new approach to the asteroseismology of DA white dwarfs that consists in the employment of a large set of non-static, physically sound, fully evolutionary models representative of these stars. We have already applied this approach with success to pulsating PG1159 stars (GW Vir variables). Our white dwarf models, which cover a wide range of stellar masses, effective temperatures, and envelope thicknesses, are the result of fully evolutionary computations that take into account the complete history of the progenitor stars from the ZAMS. In particular, the models are characterized by self-consistent chemical structures from the centre to the surface, a crucial aspect of white dwarf asteroseismology. We apply this approach to an ensemble of 44 bright DAV (ZZ Ceti) stars.

  12. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.
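
    The fitness measures underlying both scenarios are standard and compact to compute; the sketch below evaluates nonlinearity via a fast Walsh-Hadamard transform, plus Hamming weight and first-order correlation immunity, for a small quadratic example function (a generic illustration, not the authors' evolutionary setup).

```python
# Cryptographic quality measures for a Boolean function's truth table.
import numpy as np

def walsh_hadamard(tt):
    """Fast WHT of f in +/-1 form; tt is the 0/1 truth table of length 2**n."""
    w = 1 - 2 * np.asarray(tt, dtype=int)        # 0/1 -> +1/-1
    h = 1
    while h < len(w):                             # in-place butterfly passes
        for i in range(0, len(w), 2 * h):
            a, b = w[i:i+h].copy(), w[i+h:i+2*h].copy()
            w[i:i+h], w[i+h:i+2*h] = a + b, a - b
        h *= 2
    return w

def nonlinearity(tt):
    return (len(tt) - np.max(np.abs(walsh_hadamard(tt)))) // 2

# Example: f(x0,x1,x2) = x0*x1 XOR x2 (index bit i of the table is x_i)
tt = [(x0 & x1) ^ x2 for x2 in (0, 1) for x1 in (0, 1) for x0 in (0, 1)]
W = walsh_hadamard(tt)
print("nonlinearity:", nonlinearity(tt))
print("Hamming weight:", sum(tt))
print("1st-order correlation-immune:", all(W[1 << i] == 0 for i in range(3)))
```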

  13. MUSICOLOGY AS AN EVOLUTIONARY SYSTEM OF THE MUSICAL THINKING CATEGORIES (PROPOSED FOR A METHODOLOGICAL-HERMENEUTICS APPROACH)

    Directory of Open Access Journals (Sweden)

    GARAZ OLEG

    2015-03-01

    Full Text Available The appearance of musicology as a science is due to Guido Adler's founding act of 1885. The dialectical formulation of musicology as an interaction between systematic and historical disciplines was to confer on it sufficient methodological force so that the newly invented science could become practicable as soon as possible. Historically a descendant of the German historical method (Leopold von Ranke) and contemporary with positivism (Auguste Comte), musicology quickly affirmed itself as an academic discipline (begun by Wilhelm von Humboldt) and took in both sociology (Max Weber) and hermeneutics (Hermann Kretzschmar) as constituent disciplines. In one sense, musicology presented itself as an open conceptual system, permeable to the evolutionary changes of the field of thinking and of musical practices; in a second sense, the methodology of systematic conceptual organization was oriented towards university teaching as an offer for pedagogical and research activities. In our text, we propose a hypothesis that approaches the conceptual system of musicology as a system consubstantial with both of its constituent parts, the systematic and the historical, thereby motivating and explaining its dialectical-evolutionary nature.

  14. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D [ORNL; Disney, Adam [University of Tennessee (UT); Singh, Susheela [North Carolina State University (NCSU), Raleigh; Bruer, Grant [University of Tennessee (UT); Mitchell, John Parker [University of Tennessee (UT); Klibisz, Aleksander [University of Tennessee (UT); Plank, James [University of Tennessee (UT)

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pi's. We also focus on how to improve the EO by evaluating commonality in higher performing neural networks, and present the result of a study that evaluates the EO performed by Titan.
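
    The master/worker structure such parallel EO typically uses can be sketched with Python's multiprocessing; the fitness function below is a placeholder for an expensive network simulation, and none of this reflects the actual DANNA/Titan code.

```python
# Generic master/worker parallel fitness evaluation for an EO loop.
import random
from multiprocessing import Pool

def fitness(genome):
    # Placeholder for an expensive network simulation.
    return -sum((g - 0.5) ** 2 for g in genome)

def main():
    pop = [[random.random() for _ in range(32)] for _ in range(64)]
    with Pool(processes=4) as pool:
        for gen in range(10):
            scores = pool.map(fitness, pop)          # parallel evaluation phase
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            elite = ranked[: len(pop) // 2]          # selection on the master
            pop = elite + [[x + random.gauss(0, 0.05) for x in g] for g in elite]
    print(max(map(fitness, pop)))

if __name__ == "__main__":
    main()
```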

  15. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  16. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    Energy Technology Data Exchange (ETDEWEB)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-07-01

    Idom has participated in the evaluation of the risk to Safety Important Class (SIC) structures from overpressures generated by a catastrophic failure of a cryogenic pressure vessel at the ITER plant site. The evaluation implements both analytical and computational methodologies, achieving consistent and robust results. (Author)

  17. Analytical and computational methodology to assess the over pressures generated by a potential catastrophic failure of a cryogenic pressure vessel

    International Nuclear Information System (INIS)

    Zamora, I.; Fradera, J.; Jaskiewicz, F.; Lopez, D.; Hermosa, B.; Aleman, A.; Izquierdo, J.; Buskop, J.

    2014-01-01

    Idom has participated in the evaluation of the risk to Safety Important Class (SIC) structures from overpressures generated by a catastrophic failure of a cryogenic pressure vessel at the ITER plant site. The evaluation implements both analytical and computational methodologies, achieving consistent and robust results. (Author)

  18. An Evolutionary Approach to Driving Tendency Recognition for Advanced Driver Assistance Systems

    Directory of Open Access Journals (Sweden)

    Lee Jong-Hyun

    2016-01-01

    Full Text Available Driving tendency recognition is important for constructing Advanced Driver Assistance Systems (ADAS). However, little research has used vehicle sensing data for this purpose, because driving tendency is difficult to define. In this paper, we attempt to improve the learning capability of a machine learning method using evolutionary computation. We propose a driving tendency recognition method that takes the characteristics of the data into consideration. A comparison of our classification system with conventional methods demonstrated its effectiveness, with accuracy above 92%. The experiments confirm that the proposed evolutionary approach improves the classification accuracy of the learning method through evolution.

  19. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
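
    The additive multiattribute utility form commonly used in MAU theory, U = sum_i k_i u_i(x_i) with scaling constants k_i and single-attribute utility functions u_i, is easy to illustrate. In the sketch below, the attributes, weights, utility functions, and program scores are all invented for the example and are not taken from the report:

```python
# Illustrative additive multiattribute utility calculation, in the spirit of
# the MAUP program described above. All attributes and values are invented.
attributes = {
    # attribute: (scaling constant k_i, utility function u_i on raw score)
    "risk_reduction":  (0.5, lambda x: x / 10.0),        # raw score 0..10
    "cost":            (0.3, lambda x: 1.0 - x / 5.0e6), # dollars, lower better
    "schedule_months": (0.2, lambda x: 1.0 - x / 60.0),  # months, lower better
}

def program_utility(scores: dict) -> float:
    """Additive MAU form: U = sum_i k_i * u_i(x_i)."""
    return sum(k * u(scores[name]) for name, (k, u) in attributes.items())

programs = {
    "ECCS reliability study": {"risk_reduction": 8, "cost": 2.0e6,
                               "schedule_months": 24},
    "fuel behavior tests":    {"risk_reduction": 5, "cost": 1.0e6,
                               "schedule_months": 12},
}
# Rank-order candidate R&D programs by overall utility.
for name, scores in sorted(programs.items(),
                           key=lambda p: program_utility(p[1]), reverse=True):
    print(f"{name}: U = {program_utility(scores):.3f}")
```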

  20. Research on evolutionary music composer system

    Institute of Scientific and Technical Information of China (English)

    汪镭; 郑晓妹; 申林

    2014-01-01

    Algorithmic composition is the most attractive research area in computer music, and genetic-algorithm-based evolutionary music composer systems have become a hot spot within it. This paper gives a structure for an evolutionary music composer system, analyzes the different compositional goals of such systems, and discusses two types of composer system from the perspective of fitness function design. Finally, several instances of evolutionary music composer systems are analyzed.
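
    The rule-based style of fitness design discussed above (as opposed to interactive, human-in-the-loop evaluation) can be sketched with a toy melody evolver; the consonance rules, pitch range, and parameters below are illustrative assumptions only:

```python
# Minimal sketch of a rule-based fitness function for evolutionary
# composition. All musical rules here are invented for illustration.
import random

CONSONANT = {0, 3, 4, 5, 7, 8, 9, 12}   # interval sizes treated as consonant

def fitness(melody):
    # Reward consonant melodic intervals, penalize large leaps.
    score = 0
    for a, b in zip(melody, melody[1:]):
        step = abs(a - b)
        score += 1 if step in CONSONANT else -1
        if step > 12:
            score -= 2
    return score

def evolve(length=16, pop_size=40, generations=200):
    pop = [[random.randint(60, 72) for _ in range(length)]   # MIDI pitches
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]                  # one-point crossover
            child[random.randrange(length)] = random.randint(60, 72)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())   # prints the best MIDI-pitch sequence found
```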

  1. Evolutionary algorithms applied to Landau-gauge fixing

    International Nuclear Information System (INIS)

    Markham, J.F.

    1998-01-01

    Current algorithms used to put a lattice gauge configuration into Landau gauge either suffer from the problem of critical slowing-down or involve an additional computational expense to overcome it. Evolutionary Algorithms (EAs), which have been widely applied to other global optimisation problems, may be of use in gauge fixing. Also, being global, they should not suffer from critical slowing-down as do local gradient-based algorithms. We apply EAs and also a Steepest Descent (SD) based method to the problem of Landau Gauge Fixing and compare their performance. (authors)

  2. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk-and-board instruction with computer delivery has been viewed positively by students, who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems that contribute to student errors while taking class notes, such as mistranscribed numbers on the board and the instructor's handwriting, can be resolved by careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  3. Evolutionary experience design – the case of Otopia

    DEFF Research Database (Denmark)

    Hansen, Kenneth

    The design of experiences is a complicated challenge. It might not even be possible to design such a “thing”, but only to design for it. If this is the case, an evolutionary approach could seem appropriate. This paper introduces such an approach to the design of new public-oriented experiences with the case of “Otopia”. “Otopia” is a large-scale, new media experiment, which combines the areas of computer games, sports and performance into a spectator-oriented concept; it premiered in a dome tent at the Roskilde Festival in Denmark in the summer of 2005. This paper presents and discusses … used as a means of specifying the basic immaterial design form. This discussion leads to the suggestion of a rule-based evolutionary model for the design of situations as a practical option for designers of new spectator-oriented experiences in the future. The project of Otopia was supported…

  4. A Multiagent Evolutionary Algorithm for the Resource-Constrained Project Portfolio Selection and Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Yongyi Shou

    2014-01-01

    Full Text Available A multiagent evolutionary algorithm is proposed to solve the resource-constrained project portfolio selection and scheduling problem. The proposed algorithm has a dual-level structure. In the upper level, a set of agents makes decisions to select appropriate project portfolios. Each agent selects its project portfolio independently. The neighborhood competition operator and self-learning operator are designed to improve the agent's energy, that is, the portfolio profit. In the lower level, the selected projects are scheduled simultaneously and completion times are computed to estimate the expected portfolio profit. A priority rule-based heuristic is used by each agent to solve the multiproject scheduling problem. A set of instances was generated systematically from the widely used Patterson set. Computational experiments confirmed that the proposed evolutionary algorithm is effective for the resource-constrained project portfolio selection and scheduling problem.

  5. A new decomposition-based computer-aided molecular/mixture design methodology for the design of optimal solvents and solvent mixtures

    DEFF Research Database (Denmark)

    Karunanithi, A.T.; Achenie, L.E.K.; Gani, Rafiqul

    2005-01-01

    This paper presents a novel computer-aided molecular/mixture design (CAMD) methodology for the design of optimal solvents and solvent mixtures. The molecular/mixture design problem is formulated as a mixed integer nonlinear programming (MINLP) model in which a performance objective is to be optimized subject to structural, property, and process constraints. The general molecular/mixture design problem is divided into two parts. For optimal single-compound design, the first part is solved. For mixture design, the single-compound design is first carried out to identify candidates and then the second part is solved to determine the optimal mixture. The decomposition of the CAMD MINLP model into relatively easy-to-solve subproblems is essentially a partitioning of the constraints from the original set. This approach is illustrated through two case studies. The first case study involves…

  6. Evolutionary Demography

    DEFF Research Database (Denmark)

    Levitis, Daniel

    2015-01-01

    Demography is the quantitative study of population processes, while evolution is a population process that influences all aspects of biological organisms, including their demography. Demographic traits common to all human populations are the products of biological evolution or of the interaction of biological and cultural evolution. Demographic variation within and among human populations is influenced by our biology, and therefore by natural selection and our evolutionary background. Demographic methods are necessary for studying populations of other species, and for quantifying evolutionary fitness…

  7. Proteomics in evolutionary ecology.

    Science.gov (United States)

    Baer, B; Millar, A H

    2016-03-01

    Evolutionary ecologists are traditionally gene-focused, as genes propagate phenotypic traits across generations and mutations and recombination in the DNA generate the genetic diversity required for evolutionary processes. As a consequence, the inheritance of changed DNA provides a molecular explanation for the functional changes associated with natural selection. A direct focus on proteins, on the other hand, the actual molecular agents responsible for the expression of a phenotypic trait, receives far less interest from ecologists and evolutionary biologists. This is partially due to the central dogma of molecular biology, which appears to define proteins as the 'dead-end of molecular information flow', as well as to technical limitations in identifying and studying proteins and their diversity in the field and in many of the more exotic genera often favored in ecological studies. Here we provide an overview of a newly forming field of research that we refer to as 'Evolutionary Proteomics'. We point out that the origins of cellular function are related to the properties of polypeptide and RNA and their interactions with the environment, rather than DNA descent, and that the critical role of horizontal gene transfer in evolution is more about coopting new proteins to impact cellular processes than it is about modifying gene function. Furthermore, post-transcriptional and post-translational processes generate a remarkable diversity of mature proteins from a single gene, and the properties of these mature proteins can also influence inheritance through genetic and perhaps epigenetic mechanisms. The influence of post-transcriptional diversification on evolutionary processes could provide a novel mechanistic underpinning for elements of rapid, directed evolutionary changes and adaptations as observed for a variety of evolutionary processes. Modern state-of-the-art technologies based on mass spectrometry are now available to identify and quantify peptides, proteins, protein…

  8. An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints

    Directory of Open Access Journals (Sweden)

    Jinmo Sung

    2014-01-01

    Full Text Available The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments.
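
    One generic way to guarantee feasibility over the generations, as the operators above are said to do, is to accept only precedence-respecting moves; the retry-based swap mutation below is a minimal sketch of that idea, not the paper's actual operator:

```python
# Sketch of a feasibility-preserving mutation for the precedence-constrained
# TSP. A tour is a permutation of cities; `precedence` holds (a, b) pairs
# meaning city a must be visited before city b.
import random

def is_feasible(tour, precedence):
    pos = {city: i for i, city in enumerate(tour)}
    return all(pos[a] < pos[b] for a, b in precedence)

def feasible_swap_mutation(tour, precedence, tries=50):
    """Swap two positions, keeping the tour feasible; give up after `tries`."""
    for _ in range(tries):
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
        if is_feasible(tour, precedence):
            return tour
        tour[i], tour[j] = tour[j], tour[i]   # undo the infeasible swap
    return tour

tour = list(range(8))
precedence = [(0, 3), (2, 5), (1, 7)]
print(feasible_swap_mutation(tour, precedence))
```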

  9. Transdisciplinary Perspectives in Bioethics: A Co-evolutionary Introduction from the Big History

    Directory of Open Access Journals (Sweden)

    Javier Collado-Ruano

    2016-10-01

    Full Text Available The main objective of this work is to expand the notion of bioethics expressed in Article 17 of the Universal Declaration on Bioethics and Human Rights, concerning the interconnections between human beings and other life forms. For this purpose, the transdisciplinary methodology is combined with the theoretical framework of the "Big History" to approach the co-evolutionary phenomena that life has been developing on Earth for some 3.8 billion years. As a result, the study introduces us to the unification, integration and inclusion of the history of the universe, the solar system, Earth, and life with the history of human beings. In conclusion, I consider that, to safeguard the cosmic miracle that the emergence of life represents, we must adopt new transdisciplinary perspectives in bioethics to address the ecosystem complexity of the co-evolutionary processes of life on Gaia as a whole.

  10. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial … problems. Classical single-objective optimization is examined first. The authors then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.
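
    A canonical example of the runtime results this line of analysis establishes is the (1+1) EA on the OneMax function, whose expected optimization time is Theta(n log n); the sketch below reproduces that setting empirically:

```python
# The (1+1) EA on OneMax, a standard object of runtime analysis for
# bioinspired computation. The script counts iterations until the optimum.
import random

def one_max(x):
    return sum(x)

def one_plus_one_ea(n, seed=0):
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    steps = 0
    while one_max(x) < n:
        # Standard bit mutation: flip each bit independently w.p. 1/n.
        y = [1 - b if rng.random() < 1 / n else b for b in x]
        if one_max(y) >= one_max(x):   # elitist (1+1) acceptance
            x = y
        steps += 1
    return steps

for n in (32, 64, 128):
    print(n, one_plus_one_ea(n))   # grows roughly like n * log n
```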

  11. Machine learning and evolutionary techniques in interplanetary trajectory design

    OpenAIRE

    Izzo, Dario; Sprague, Christopher; Tailor, Dharmesh

    2018-01-01

    After providing a brief historical overview on the synergies between artificial intelligence research, in the areas of evolutionary computations and machine learning, and the optimal design of interplanetary trajectories, we propose and study the use of deep artificial neural networks to represent, on-board, the optimal guidance profile of an interplanetary mission. The results, limited to the chosen test case of an Earth-Mars orbital transfer, extend the findings made previously for landing ...

  12. Evolutionary design assistants for architecture

    Directory of Open Access Journals (Sweden)

    N. Onur Sönmez

    2015-04-01

    Full Text Available In its parallel pursuit of increased competitiveness for design offices and more pleasurable and easier workflows for designers, artificial design intelligence is a technical, intellectual, and political challenge. While human-machine cooperation has become commonplace through Computer Aided Design (CAD) tools, improved collaboration and better support appear possible only through an endeavor into a kind of artificial design intelligence that is more sensitive to the human perception of affairs. Considered as part of the broader Computational Design studies, the research program of this quest can be called Artificial / Autonomous / Automated Design (AD). The currently available level of Artificial Intelligence (AI) for design is limited, and a viable aim for current AD would be to develop design assistants that are capable of producing drafts for various design tasks. Thus, the overall aim of this thesis is the development of approaches, techniques, and tools towards artificial design assistants that offer a capability for generating drafts for sub-tasks within design processes. The main technology explored for this aim is Evolutionary Computation (EC), and the target design domain is architecture. The two connected research questions of the study concern, first, the investigation of ways to develop an architectural design assistant, and second, the utilization of EC for the development of such assistants. While developing approaches, techniques, and computational tools for such an assistant, the study also carries out a broad theoretical investigation into the main problems, challenges, and requirements towards such assistants on a rather overall level. Therefore, the research is shaped as a parallel investigation of three main threads interwoven along several levels, moving from a more general level to specific applications. The three research threads comprise, first, theoretical discussions and speculations with regard to both…

  13. XTALOPT: An open-source evolutionary algorithm for crystal structure prediction

    Science.gov (United States)

    Lonie, David C.; Zurek, Eva

    2011-02-01

    The implementation and testing of XTALOPT, an evolutionary algorithm for crystal structure prediction, is outlined. We present our new periodic displacement (ripple) operator which is ideally suited to extended systems. It is demonstrated that hybrid operators, which combine two pure operators, reduce the number of duplicate structures in the search. This allows for better exploration of the potential energy surface of the system in question, while simultaneously zooming in on the most promising regions. A continuous workflow, which makes better use of computational resources as compared to traditional generation-based algorithms, is employed. Various parameters in XTALOPT are optimized using a novel benchmarking scheme. XTALOPT is available under the GNU Public License, has been interfaced with various codes commonly used to study extended systems, and has an easy-to-use, intuitive graphical interface. Program summary: Program title: XTALOPT. Catalogue identifier: AEGX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPL v2.1 or later [1]. No. of lines in distributed program, including test data, etc.: 36 849. No. of bytes in distributed program, including test data, etc.: 1 149 399. Distribution format: tar.gz. Programming language: C++. Computer: PCs, workstations, or clusters. Operating system: Linux. Classification: 7.7. External routines: QT [2], OpenBabel [3], AVOGADRO [4], SPGLIB [8] and one of: VASP [5], PWSCF [6], GULP [7]. Nature of problem: Predicting the crystal structure of a system from its stoichiometry alone remains a grand challenge in computational materials science, chemistry, and physics. Solution method: Evolutionary algorithms are stochastic search techniques which use concepts from biological evolution in order to locate the global minimum on the potential energy surface. Our evolutionary algorithm, XTALOPT, is freely available…

  14. Use of multiple objective evolutionary algorithms in optimizing surveillance requirements

    International Nuclear Information System (INIS)

    Martorell, S.; Carlos, S.; Villanueva, J.F.; Sanchez, A.I.; Galvan, B.; Salazar, D.; Cepin, M.

    2006-01-01

    This paper presents the development and application of a double-loop Multiple Objective Evolutionary Algorithm that uses a Multiple Objective Genetic Algorithm to perform the simultaneous optimization of periodic Test Intervals (TI) and Test Planning (TP). It takes into account the time-dependent effect of TP performed on stand-by safety-related equipment. TI and TP are part of the Surveillance Requirements within Technical Specifications at Nuclear Power Plants. It addresses the problem of multi-objective optimization in the space of decision variables, i.e. TI and TP, using a novel flexible structure of the optimization algorithm. Lessons learnt from the cases of application of the methodology to optimize TI and TP for the High-Pressure Injection System are given. The results show that the double-loop Multiple Objective Evolutionary Algorithm is able to find the Pareto set of solutions, a surface of non-dominated solutions that satisfy all the constraints imposed on the objective functions and decision variables. Decision makers can then adopt the best solution found according to their particular preference, e.g. minimum cost or minimum unavailability.
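
    At the core of any such search is the Pareto dominance test used to retain non-dominated solutions. A minimal sketch follows, with invented (cost, unavailability) pairs standing in for evaluated (TI, TP) alternatives, both objectives minimized:

```python
# Generic Pareto filter of the kind multiple-objective evolutionary
# algorithms rely on. The candidate values are illustrative only.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# (cost, unavailability) pairs for candidate (TI, TP) schedules
candidates = [(10.0, 0.05), (12.0, 0.03), (9.0, 0.09), (12.5, 0.03), (8.0, 0.20)]
print(pareto_front(candidates))   # the non-dominated trade-off solutions
```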

  15. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  16. Evolutionary Trails of Plant Group II Pyridoxal Phosphate-Dependent Decarboxylase Genes.

    Science.gov (United States)

    Kumar, Rahul

    2016-01-01

    Type II pyridoxal phosphate-dependent decarboxylase (PLP_deC) enzymes play important metabolic roles during nitrogen metabolism. Recent evolutionary profiling of these genes revealed a sharp expansion of histidine decarboxylase genes in members of the Solanaceae family. In spite of the high sequence homology shared by PLP_deC orthologs, these enzymes display remarkable differences in their substrate specificities. Currently, limited information is available on the gene repertoires and substrate specificities of PLP_deCs, which renders their precise annotation challenging and poses technical obstacles to the immediate identification and biochemical characterization of their full gene complements in plants. Herein, we explore their evolutionary trails in a comprehensive manner by taking advantage of high-throughput data accessibility and computational approaches. We discuss the premise that has enabled an improved reconstruction of their evolutionary lineage and evaluate the factors that have constrained their rapid functional characterization to date. We envisage that the information synthesized herein will act as a catalyst for the rapid exploration of their biochemical specificity and physiological roles in more plant species.

  17. Transdisciplinary Perspectives in Bioethics: A Co-evolutionary Introduction from the Big History

    OpenAIRE

    Javier Collado-Ruano

    2016-01-01

    The main objective of this work is to expand the notion of bioethics expressed in Article 17 of the Universal Declaration on Bioethics and Human Rights, concerning the interconnections between human beings and other life forms. For this purpose, the transdisciplinary methodology is combined with the theoretical framework of the "Big History" to approach the co-evolutionary phenomena that life has been developing on Earth for some 3.8 billion years. As a result, the study introduces us to…

  18. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract. Background: The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results: We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion: IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  19. Solving advanced multi-objective robust designs by means of multiple objective evolutionary algorithms (MOEA): A reliability application

    Energy Technology Data Exchange (ETDEWEB)

    Salazar A, Daniel E. [Division de Computacion Evolutiva (CEANI), Instituto de Sistemas Inteligentes y Aplicaciones Numericas en Ingenieria (IUSIANI), Universidad de Las Palmas de Gran Canaria. Canary Islands (Spain)]. E-mail: danielsalazaraponte@gmail.com; Rocco S, Claudio M. [Universidad Central de Venezuela, Facultad de Ingenieria, Caracas (Venezuela)]. E-mail: crocco@reacciun.ve

    2007-06-15

    This paper extends the approach proposed by the second author in [Rocco et al. Robust design using a hybrid-cellular-evolutionary and interval-arithmetic approach: a reliability application. In: Tarantola S, Saltelli A, editors. SAMO 2001: Methodological advances and useful applications of sensitivity analysis. Reliab Eng Syst Saf 2003;79(2):149-59 [special issue]]…

  20. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

    A general methodology for Computer Aided Product Design (CAPD) with specified property constraints which is capable of solving a large range of problems is presented. The methodology employs the group contribution approach, generates acyclic, cyclic and aromatic compounds of various degrees … liquid-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed and the results from five case studies highlighting various features of the methodology are presented.

  1. Evolutionary design optimization of traffic signals applied to Quito city.

    Science.gov (United States)

    Armas, Rolando; Aguirre, Hernán; Daolio, Fabio; Tanaka, Kiyoshi

    2017-01-01

    This work applies evolutionary computation and machine learning methods to study the transportation system of Quito from a design optimization perspective. It couples an evolutionary algorithm with a microscopic transport simulator and uses the outcome of the optimization process to deepen our understanding of the problem and gain knowledge about the system. The work focuses on the optimization of a large number of traffic lights deployed on a wide area of the city and studies their impact on travel time, emissions and fuel consumption. An evolutionary algorithm with specialized mutation operators is proposed to search effectively in large decision spaces, evolving small populations for a short number of generations. The effects of the operators combined with a varying mutation schedule are studied, and an analysis of the parameters of the algorithm is also included. In addition, hierarchical clustering is performed on the best solutions found in several runs of the algorithm. An analysis of signal clusters and their geolocation, estimation of fuel consumption, spatial analysis of emissions, and an analysis of signal coordination provide an overall picture of the systemic effects of the optimization process.

  2. Can An Evolutionary Process Create English Text?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
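
    A drastically simplified analogue of such a no-fixed-target scheme scores candidate strings by how closely their letter-pair statistics match a reference sample, so fitness defines a landscape rather than a target phrase. The tiny reference text and all parameters below are assumptions for illustration; the study's actual scheme is more sophisticated:

```python
# Text evolution against a fitness landscape rather than a fixed target.
# The short reference below stands in for the input Dickens corpus.
import random
from collections import Counter

REFERENCE = ("it was the best of times it was the worst of times "
             "it was the age of wisdom it was the age of foolishness")
ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def bigram_profile(text):
    pairs = Counter(zip(text, text[1:]))
    total = max(sum(pairs.values()), 1)
    return {k: v / total for k, v in pairs.items()}

REF_PROFILE = bigram_profile(REFERENCE)

def fitness(text):
    # Negative L1 distance between bigram frequency profiles.
    prof = bigram_profile(text)
    keys = set(REF_PROFILE) | set(prof)
    return -sum(abs(REF_PROFILE.get(k, 0) - prof.get(k, 0)) for k in keys)

def mutate(text, rate=0.05):
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in text)

pop = ["".join(random.choice(ALPHABET) for _ in range(60)) for _ in range(60)]
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:20] + [mutate(random.choice(pop[:20])) for _ in range(40)]
print(pop[0])   # gibberish drifting toward English-like pair statistics
```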

  3. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States). Dept. of Mathematics]; Anderson, D.R. [Sandia National Labs., Albuquerque, NM (United States). WIPP Performance Assessments Departments]; Baker, B.L. [Technadyne Engineering Consultants, Albuquerque, NM (United States)] [and others]

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs.
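
    Since each compliance probability is framed as a large numerical integration, the core idea reduces, in miniature, to Monte Carlo estimation over the uncertain inputs; the toy performance model and regulatory limit below are invented for illustration:

```python
# Miniature Monte Carlo version of a compliance-probability integral.
# The performance model and limit are stand-ins, not the WIPP model.
import random

def performance(sample):
    # Stand-in for the performance-assessment calculation: a "release"
    # measure as a simple function of uncertain inputs.
    permeability, solubility = sample
    return permeability * solubility

def compliance_probability(n=100_000, limit=0.25, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        sample = (rng.random(), rng.random())   # draws of the uncertain inputs
        if performance(sample) <= limit:        # regulatory limit satisfied
            hits += 1
    return hits / n

print(compliance_probability())
```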

  4. Computational implementation of a systems prioritization methodology for the Waste Isolation Pilot Plant: A preliminary example

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-04-01

    A systems prioritization methodology (SPM) is under development to provide guidance to the US DOE on experimental programs and design modifications to be supported in the development of a successful licensing application for the Waste Isolation Pilot Plant (WIPP) for the geologic disposal of transuranic (TRU) waste. The purpose of the SPM is to determine the probabilities that the implementation of different combinations of experimental programs and design modifications, referred to as activity sets, will lead to compliance. Appropriate tradeoffs between compliance probability, implementation cost and implementation time can then be made in the selection of the activity set to be supported in the development of a licensing application. Descriptions are given for the conceptual structure of the SPM and the manner in which this structure determines the computational implementation of an example SPM application. Due to the sophisticated structure of the SPM and the computational demands of many of its components, the overall computational structure must be organized carefully to provide the compliance probabilities for the large number of activity sets under consideration at an acceptable computational cost. Conceptually, the determination of each compliance probability is equivalent to a large numerical integration problem. 96 refs., 31 figs., 36 tabs

  5. Attractive evolutionary equilibria

    OpenAIRE

    Roorda, Berend; Joosten, Reinoud

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary dynamics. For instance, each attractive evolutionarily stable strategy is an attractive evolutionarily stable equilibrium for certain barycentric ray-projection dynamics, and vice versa.

  6. Multipurpose Water Reservoir Management: An Evolutionary Multiobjective Optimization Approach

    Directory of Open Access Journals (Sweden)

    Luís A. Scola

    2014-01-01

    Full Text Available The reservoirs that feed large hydropower plants should be managed in order to provide other uses for the water resources. Those uses include, for instance, flood control and avoidance, irrigation, navigability in the rivers, and other ones. This work presents an evolutionary multiobjective optimization approach for the study of multiple water usages in multiple interlinked reservoirs, including both power generation objectives and other objectives not related to energy generation. The classical evolutionary algorithm NSGA-II is employed as the basic multiobjective optimization machinery, being modified in order to cope with specific problem features. The case studies, which include the analysis of a problem which involves an objective of navigability on the river, are tailored in order to illustrate the usefulness of the data generated by the proposed methodology for decision-making on the problem of operation planning of multiple reservoirs with multiple usages. It is shown that it is even possible to use the generated data in order to determine the cost of any new usage of the water, in terms of the opportunity cost that can be measured on the revenues related to electric energy sales.

  7. Evolutionary principles and their practical application.

    Science.gov (United States)

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-03-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology.

  8. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST) from the broader area of evolutionary biology that are routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal and science policy as well as, more importantly, funding issues that need to be addressed for improving software engineering quality and ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  9. EvolQG - An R package for evolutionary quantitative genetics [version 2; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Diogo Melo

    2016-06-01

    Full Text Available We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification.

  10. Application of network methods for understanding evolutionary dynamics in discrete habitats.

    Science.gov (United States)

    Greenbaum, Gili; Fefferman, Nina H

    2017-06-01

    In populations occupying discrete habitat patches, gene flow between habitat patches may form an intricate population structure. In such structures, the evolutionary dynamics resulting from interaction of gene-flow patterns with other evolutionary forces may be exceedingly complex. Several models describing gene flow between discrete habitat patches have been presented in the population-genetics literature; however, these models have usually addressed relatively simple settings of habitable patches and have stopped short of providing general methodologies for addressing nontrivial gene-flow patterns. In the last decades, network theory - a branch of discrete mathematics concerned with complex interactions between discrete elements - has been applied to address several problems in population genetics by modelling gene flow between habitat patches using networks. Here, we present the idea and concepts of modelling complex gene flows in discrete habitats using networks. Our goal is to raise awareness to existing network theory applications in molecular ecology studies, as well as to outline the current and potential contribution of network methods to the understanding of evolutionary dynamics in discrete habitats. We review the main branches of network theory that have been, or that we believe potentially could be, applied to population genetics and molecular ecology research. We address applications to theoretical modelling and to empirical population-genetic studies, and we highlight future directions for extending the integration of network science with molecular ecology. © 2017 John Wiley & Sons Ltd.

  11. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki

    2009-01-01

    A new computational methodology for the sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with the phase of sodium as a reactant. One is the surface reaction model, in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat leads to vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium; this is described by the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent gas, and multiphase thermal hydraulics simulation method with compressibility (named the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, a maximum gas temperature of approximately 1,300°C is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum mass-weighted average temperature in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing theoretical and mechanistic modeling of secondary failure propagation of heat transfer tubes due to phenomena such as overheating rupture and wastage. (author)

  12. Behavioural, ecological and evolutionary responses to extreme climatic events: challenges and directions.

    Science.gov (United States)

    van de Pol, Martijn; Jenouvrier, Stéphanie; Cornelissen, Johannes H C; Visser, Marcel E

    2017-06-19

    More extreme climatic events (ECEs) are among the most prominent consequences of climate change. Despite a long-standing recognition of the importance of ECEs by paleo-ecologists and macro-evolutionary biologists, ECEs have only recently received strong interest in the wider ecological and evolutionary community. However, as with many rapidly expanding fields, the field lacks structure and cohesiveness, which strongly limits scientific progress. Furthermore, due to the descriptive and anecdotal nature of many ECE studies, it is still unclear what the most relevant questions and long-term consequences of ECEs are. To improve synthesis, we first discuss ways to define ECEs that facilitate comparison among studies. We then argue that biologists should adhere to more rigorous attribution and mechanistic methods to assess ECE impacts. Subsequently, we discuss conceptual and methodological links with climatology and disturbance-, tipping point- and paleo-ecology. These research fields have close linkages with ECE research, but differ in the identity and/or the relative severity of environmental factors. By summarizing the contributions to this theme issue we draw parallels between behavioural, ecological and evolutionary ECE studies, and suggest that an overarching challenge is that most empirical and theoretical evidence points towards responses being highly idiosyncratic, and thus predictability being low. Finally, we suggest a roadmap based on the proposition that an increased focus on the mechanisms behind the biological response function will be crucial for increased understanding and predictability of the impacts of ECEs. This article is part of the themed issue 'Behavioural, ecological and evolutionary responses to extreme climatic events'. © 2017 The Author(s).

  13. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be applied recursively to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details of each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
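
    The screening step (step 2 above) can be illustrated with a bare-bones correlation-based ranking of inputs against the output; PSUADE's actual screening techniques are more sophisticated, and the toy simulator below is an assumption:

```python
# Bare-bones parameter screening: sample the inputs, run the model, and
# rank parameters by strength of correlation with the output.
import random
import statistics

def model(x):
    # Toy simulator: output depends strongly on x0, weakly on x1, not on x2.
    return 5.0 * x[0] + 0.5 * x[1] ** 2 + 0.0 * x[2]

def screen(n_samples=500, n_params=3, seed=0):
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(n_params)] for _ in range(n_samples)]
    ys = [model(x) for x in xs]
    ranks = []
    for j in range(n_params):
        col = [x[j] for x in xs]
        r = statistics.correlation(col, ys)   # Pearson r (Python 3.10+)
        ranks.append((abs(r), f"x{j}"))
    return sorted(ranks, reverse=True)

for r, name in screen():
    print(f"{name}: |r| = {r:.2f}")   # x0 should rank first
```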

  14. An evolutionary perspective on anti-tumor immunity

    Directory of Open Access Journals (Sweden)

    David John Klinke

    2013-01-01

    Full Text Available The challenges associated with demonstrating a durable response using molecular targeted therapies in cancer has sparked a renewed interest in viewing cancer from an evolutionary perspective. Evolutionary processes have three common traits: heterogeneity, dynamics, and a selective fitness landscape. Mutagens randomly alter the genome of host cells creating a population of cells that contain different somatic mutations. This genomic rearrangement perturbs cellular homeostasis through changing how cells interact with their tissue microenvironment. To counterbalance the ability of mutated cells to outcompete for limited resources, control structures are encoded within the cell and within the organ system, such as innate and adaptive immunity, to restore cellular homeostasis. These control structures shape the selective fitness landscape and determine whether a cell that harbors particular somatic mutations is retained or eliminated from a cell population. While next-generation sequencing has revealed the complexity and heterogeneity of oncogenic transformation, understanding the dynamics of oncogenesis and how cancer cells alter the selective fitness landscape remain unclear. In this technology review, we will summarize how recent advances in technology have impacted our understanding of these three attributes of cancer as an evolutionary process. In particular, we will focus on how advances in genome sequencing have enabled quantifying cellular heterogeneity, advances in computational power have enabled explicit testing of postulated intra- and intercellular control structures against the available data using simulation, and advances in proteomics have enabled identifying novel mechanisms of cellular cross-talk that cancer cells use to alter the fitness landscape.

  15. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract. Background: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results: Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a Particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulations, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
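
    The rejection-sampling flavor of ABC that ABCtoolbox implements, among other algorithms, is simple enough to sketch. In the toy below, a Gaussian model stands in for a population-genetic simulator, and all constants are illustrative:

```python
# Minimal ABC rejection sampler: draw parameters from the prior, simulate
# data, keep draws whose summary statistic is close to the observed one.
import random
import statistics

OBSERVED_MEAN = 2.0          # summary statistic of the "observed" data
N_OBS = 100

def simulate(theta, rng):
    # Toy simulator replacing a population-genetic model.
    data = [rng.gauss(theta, 1.0) for _ in range(N_OBS)]
    return statistics.fmean(data)            # summary statistic

def abc_rejection(n_draws=20_000, tolerance=0.1, seed=0):
    rng = random.Random(seed)
    posterior = []
    for _ in range(n_draws):
        theta = rng.uniform(-5, 5)           # draw from a flat prior
        if abs(simulate(theta, rng) - OBSERVED_MEAN) < tolerance:
            posterior.append(theta)          # accepted draw
    return posterior

post = abc_rejection()
print(len(post), "accepted; posterior mean ~", round(statistics.fmean(post), 2))
```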

  16. Computing elastic-rebound-motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics-based simulators

    Science.gov (United States)

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
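
    The renewal-model arithmetic underlying such elastic-rebound probabilities can be sketched with a lognormal recurrence distribution, one common choice; the parameter values are invented, and the paper's along-fault averaging is not reproduced here:

```python
# Conditional earthquake probability under a lognormal renewal model:
# P(event in the next dT years | no event in the t_since years so far).
import math

def lognormal_cdf(t, mean, aperiodicity):
    if t <= 0:
        return 0.0
    # Convert the interval's mean and aperiodicity (coefficient of
    # variation) to lognormal parameters mu and sigma.
    sigma2 = math.log(1.0 + aperiodicity ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    z = (math.log(t) - mu) / math.sqrt(sigma2)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_probability(t_since, dT, mean, aperiodicity):
    F = lognormal_cdf
    survivor = 1.0 - F(t_since, mean, aperiodicity)
    return (F(t_since + dT, mean, aperiodicity)
            - F(t_since, mean, aperiodicity)) / survivor

# 30-year probability on a fault with a 200-year mean recurrence interval,
# 150 years after the last rupture (all values illustrative):
print(conditional_probability(150, 30, mean=200, aperiodicity=0.5))
```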

  17. Evolutionary autonomous agents and the nature of apraxia

    Directory of Open Access Journals (Sweden)

    Jin Frank

    2005-01-01

    Full Text Available Abstract. Background: Evolutionary autonomous agents are robots or robot simulations whose controller is a dynamical neural network and whose evolution occurs autonomously under the guidance of a fitness function, without the detailed or explicit direction of an external programmer. They are embodied agents with a simple neural network controller, and as such they provide the optimal forum by which sensorimotor interactions in a specified environment can be studied without the computational assumptions inherent in standard neuroscience. Methods: Evolutionary autonomous agents were evolved that were able to perform identical movements under two different contexts, one which represented an automatic movement and one which had a symbolic context. In an attempt to model the automatic-voluntary dissociation frequently seen in ideomotor apraxia, lesions were introduced into the neural network controllers, resulting in a behavioral dissociation with loss of the ability to perform the movement which had a symbolic context and preservation of the simpler, automatic movement. Results: Analysis of the changes in the hierarchical organization of the networks in the apractic EAAs demonstrated consistent changes in the network dynamics across all agents, with loss of longer-duration time scales in the network dynamics. Conclusion: The concepts of determinate motor programs and perceptual representations that are implicit in the present-day understanding of ideomotor apraxia are assumptions inherent in the computational understanding of brain function. The strength of the present study using EAAs to model one aspect of ideomotor apraxia is the absence of these assumptions and a grounding of all sensorimotor interactions in an embodied, autonomous agent. The consistency of the hierarchical changes in the network dynamics across all apractic agents demonstrates that this technique is tenable and will be a valuable adjunct to a computational formalism in the understanding…

  18. More efficient evolutionary strategies for model calibration with watershed model for demonstration

    Science.gov (United States)

    Baggett, J. S.; Skahill, B. E.

    2008-12-01

    Evolutionary strategies allow automatic calibration of more complex models than traditional gradient-based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but when combined have been shown to dramatically decrease the number of model runs required for calibration of synthetic problems. To reduce the number of expensive model runs we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability and further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but when selected near a smooth local minimum can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMAES; Hansen, 2006), that their synergetic effect is greater than the sum of their individual parts. This hybrid evolutionary strategy exploits smooth structure when it is present but degrades to an ordinary evolutionary strategy, at worst, if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMAES requires approximately 10%-25% of the model runs of ordinary CMAES. Preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.). Towards a new evolutionary computation. Advances in estimation of distribution algorithms. Springer.
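
    To make the pre-screening idea concrete, the sketch below shows a generic surrogate-assisted evolution strategy in which each generation is first ranked by a cheap surrogate and only a fraction of the offspring receives an expensive model evaluation. This is a minimal illustration of the general idea, not the modified CMAES of the abstract: the sphere objective stands in for a watershed model, the 1-nearest-neighbour surrogate stands in for the adaptive ranking strategy, and all constants are assumed values.

```python
import numpy as np

def expensive_model(x):          # placeholder for an expensive watershed-model run
    return float(np.sum(x ** 2))

def surrogate(x, archive_x, archive_f):
    d = np.linalg.norm(archive_x - x, axis=1)   # 1-nearest-neighbour lookup
    return archive_f[np.argmin(d)]

def es_with_surrogate(dim=5, mu=4, lam=16, screen_frac=0.5,
                      sigma=0.3, gens=60, seed=0):
    rng = np.random.default_rng(seed)
    mean = rng.normal(size=dim)
    ax, af = [mean.copy()], [expensive_model(mean)]
    n_runs = 1
    for _ in range(gens):
        cand = mean + sigma * rng.normal(size=(lam, dim))
        # rank all offspring on the cheap surrogate first ...
        sur = np.array([surrogate(c, np.array(ax), np.array(af)) for c in cand])
        keep = cand[np.argsort(sur)][: int(lam * screen_frac)]
        # ... and spend expensive runs only on the promising fraction
        f = np.array([expensive_model(c) for c in keep])
        n_runs += len(keep)
        ax.extend(keep); af.extend(f)
        elite = keep[np.argsort(f)][:mu]
        mean = elite.mean(axis=0)
    return mean, n_runs

best, runs = es_with_surrogate()
print("best point:", np.round(best, 3), "expensive runs:", runs)
```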

  19. An Evolutionary Formulation of the Crossing Number Problem

    Directory of Open Access Journals (Sweden)

    Che Sheng Gan

    2009-01-01

    Full Text Available A graph drawing algorithm is presented which produces drawings of complete graphs whose crossing counts equal those of Guy's conjecture. It is then generalized and formulated as an evolutionary algorithm (EA) to perform constrained search for the crossing numbers. The main objective of this work is to present a suitable two-dimensional scheme which can greatly reduce the complexity of finding crossing numbers by computer. Program performance criteria are presented and discussed. It is shown that the EA implementation provides good confirmation of the predicted crossing numbers.
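
    The conjectured minima against which the EA drawings are checked have a simple closed form, so the target values are easy to tabulate. The snippet below evaluates Guy's conjectured crossing number Z(n) for the complete graph K_n; only the output formatting is our own choice.

```python
# Guy's conjectured crossing number for the complete graph K_n:
#   Z(n) = (1/4) * floor(n/2) * floor((n-1)/2) * floor((n-2)/2) * floor((n-3)/2)
# An EA searching for drawings can use Z(n) as the target/stopping value.
def guy_conjecture(n: int) -> int:
    return (n // 2) * ((n - 1) // 2) * ((n - 2) // 2) * ((n - 3) // 2) // 4

for n in range(5, 13):
    print(f"K_{n}: conjectured minimum crossings = {guy_conjecture(n)}")
```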

  20. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation among users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party is introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate in application tasks.

  1. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation among users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party is introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate in application tasks.
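
    A minimal way to see how a supervision penalty can shift an evolutionary game toward cooperation is standard replicator dynamics over two user strategies, "cooperate" and "attack". The sketch below is an illustrative caricature, not the authors' model: the payoff constants and the penalty p charged to detected attackers are assumptions, and the paper's MATLAB simulation is replaced by a few lines of Python.

```python
# Replicator-dynamics sketch of a cooperation game with third-party
# supervision.  Payoff numbers b, c, a and the penalty p are illustrative
# assumptions, not values from Sun, Lin and Zhang.
def simulate(p, b=4.0, c=1.0, a=2.0, x0=0.4, dt=0.01, steps=20000):
    """x = share of cooperating users; attackers gain a but pay penalty p."""
    x = x0
    for _ in range(steps):
        f_coop = b * x - c          # cooperators: benefit from mutual sharing
        f_attk = a * x - p          # attackers: exploit cooperators, risk penalty
        x += dt * x * (1 - x) * (f_coop - f_attk)
        x = min(max(x, 0.0), 1.0)
    return x

for p in (0.0, 1.0, 3.0):
    print(f"penalty={p}: long-run cooperator share = {simulate(p):.2f}")
```

    With these numbers, cooperation collapses without a penalty and takes over once the penalty is large enough, which is the qualitative effect the abstract reports.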

  2. Large fluctuations and fixation in evolutionary games

    International Nuclear Information System (INIS)

    Assaf, Michael; Mobilia, Mauro

    2010-01-01

    We study large fluctuations in evolutionary games belonging to the coordination and anti-coordination classes. The dynamics of these games, modeling cooperation dilemmas, is characterized by a coexistence fixed point separating two absorbing states. We are particularly interested in the problem of fixation, which refers to the possibility that a few mutants take over the entire population. Here, the fixation phenomenon is induced by large fluctuations and is investigated by a semiclassical WKB (Wentzel–Kramers–Brillouin) theory generalized to treat stochastic systems possessing multiple absorbing states. Importantly, this method allows us to analyze the combined influence of selection and random fluctuations on the evolutionary dynamics beyond the weak selection limit often considered in previous works. We accurately compute, including pre-exponential factors, the probability distribution function in the long-lived coexistence state and the mean fixation time necessary for a few mutants to take over the entire population in anti-coordination games, and also the fixation probability in the coordination class. Our analytical results are in excellent agreement with extensive numerical simulations. Furthermore, we demonstrate that our treatment is superior to the Fokker–Planck approximation when the selection intensity is finite.
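
    As a concrete baseline for the fixation quantities discussed above, the exact fixation probability of m mutants in a standard frequency-dependent Moran process follows from the classic ratio-of-transition-rates formula. The sketch below evaluates it for an illustrative 2x2 game with assumed payoffs, population size N and selection intensity w; the WKB treatment of the paper goes well beyond this textbook computation.

```python
import numpy as np

# Exact fixation probability of m mutants in a frequency-dependent Moran
# process for the 2x2 game [[a, b], [c, d]].  Payoffs, N and w are
# illustrative assumptions.
def fixation_prob(N=50, m=1, a=2.0, b=1.0, c=3.0, d=1.5, w=0.1):
    gammas = []
    for k in range(1, N):
        # average payoffs with k mutants (excluding self-interaction)
        pi_A = (a * (k - 1) + b * (N - k)) / (N - 1)
        pi_B = (c * k + d * (N - k - 1)) / (N - 1)
        f_A = 1 - w + w * pi_A
        f_B = 1 - w + w * pi_B
        gammas.append(f_B / f_A)       # ratio of backward/forward rates
    cum = np.cumprod(gammas)
    denom = 1 + np.sum(cum)
    num = 1 + np.sum(cum[: m - 1]) if m > 1 else 1.0
    return num / denom

print(f"fixation probability of one mutant: {fixation_prob():.4g}")
print(f"neutral benchmark 1/N:              {1/50:.4g}")
```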

  3. COEVOLUTIONARY SEMANTICS OF TECHNOLOGICAL CIVILIZATION GENESIS AND EVOLUTIONARY RISK (BETWEEN THE BIOAESTHETICS AND BIOPOLITICS

    Directory of Open Access Journals (Sweden)

    V. T. Cheshko

    2016-12-01

    Full Text Available The purpose (meta-task) of the present work is to attempt a view of the problem of existential and anthropological risk posed by contemporary man-made civilization from the perspective of a comparison and confrontation of aesthetics, whose substrate is the emotional and metaphorical interpretation of individual subjective values, and politics, which feeds on the objectively rational interests of social groups. In both cases there is a semantic gap between the represented social reality and its representation in the perception of works of art and in political doctrines alike. The methodology of the research is evolutionary anthropological comparativistics. The originality of the conducted analysis lies in the following: as an antithesis to biological and social reductionism in interpreting the phenomenon of bio-power, a co-evolutionary semantic model is proposed, according to which the described semantic gap is of a substantial nature, related to the complex modular organization of a consistent and adaptive human strategy consisting of three associated but independently functioning modules (genetic, cultural and techno-rational). The evolutionary trajectory of all components of anthropogenesis, including civilizational, cultural and socio-political evolution, is determined by the proportion between two macro-variables: evolutionary effectiveness and evolutionary stability (sameness, i.e. preservation, throughout consequential transformations, of some invariants of the organization specific to the species Homo sapiens. It should be noted that, inasmuch as some modules of the human evolutionary (adaptive) strategy acquire attributes of self-reflection, it would be more correct to speak of evolutionary correctness, i.e. correspondence to some perfection. As a result, the future of human nature depends not only on the rationalist principles of ethics of the Homo species (the archaism of Jurgen Habermas), but also on the holistic and

  4. Quality-of-service sensitivity to bio-inspired/evolutionary computational methods for intrusion detection in wireless ad hoc multimedia sensor networks

    Science.gov (United States)

    Hortos, William S.

    2012-06-01

    In the author's previous work, a cross-layer protocol approach to wireless sensor network (WSN) intrusion detection and identification was created, with multiple bio-inspired/evolutionary computational methods applied to the functions of the protocol layers, a single method to each layer, to improve the intrusion-detection performance of the protocol over that of one method applied to only a single layer's functions. The WSN cross-layer protocol design embeds GAs, anti-phase synchronization, ACO, and a trust model based on quantized data reputation at the physical, MAC, network, and application layers, respectively. The construct neglects to assess the net effect of the combined bio-inspired methods on the quality-of-service (QoS) performance for "normal" data streams, that is, streams without intrusions. Analytic expressions of throughput, delay, and jitter, coupled with simulation results for WSNs free of intrusion attacks, are the basis for sensitivity analyses of QoS metrics for normal traffic to the bio-inspired methods.

  5. Evolutionary Explanations of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Igor Kardum

    2008-12-01

    Full Text Available This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. The second part describes the selective pressures and evolved adaptations behind the contemporary epidemic of obesity, as well as the differences in dietary regimes and lifestyle between modern humans and their ancestors. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants derived from kin selection theory and the model of parental manipulation are elaborated. The sexual competition hypothesis of eating disorders and the adapted-to-flee-famine hypothesis, as well as explanations based on the concepts of social attention holding power and the need to belong, are also explained. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.

  6. Evolutionary dynamics of fluctuating populations with strong mutualism

    Science.gov (United States)

    Chotibut, Thiparat; Nelson, David

    2013-03-01

    Evolutionary game theory with finite interacting populations is receiving increased attention, including subtle phenomena associated with number fluctuations, i.e., ``genetic drift.'' Models of cooperation and competition often utilize a simplified Moran model, with a strictly fixed total population size. We explore a more general evolutionary model with independent fluctuations in the numbers of two distinct species, in a regime characterized by ``strong mutualism.'' The model has two absorbing states, each corresponding to fixation of one of the two species, and allows exploration of the interplay between growth, competition, and mutualism. When mutualism is favored, number fluctuations eventually drive the system away from a stable fixed point, characterized by cooperation, to one of the absorbing states. Well-mixed populations will thus be taken over by a single species in a finite time, despite the bias towards cooperation. We calculate both the fixation probability and the mean fixation time as a function of the initial conditions and carrying capacities in the strong mutualism regime, using the method of matched asymptotic expansions. Our results are compared to computer simulations.
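
    The qualitative claim, that demographic noise eventually drives a mutualistic pair off its stable coexistence point and into fixation, can be reproduced with a toy birth-death simulation. The rates, carrying capacity and step budget below are illustrative assumptions, not the paper's model or its matched-asymptotics analysis.

```python
import random

# Toy two-species birth-death process: births get a mutualistic boost from
# the partner's share; deaths enforce an overall carrying capacity K.
def run(K=20, eps=0.2, max_steps=50_000, seed=0):
    rnd = random.Random(seed)
    n1 = n2 = K // 2
    for _ in range(max_steps):
        if n1 == 0 or n2 == 0:
            break                      # one species has fixed
        N = n1 + n2
        r = [n1 * (1 + eps * n2 / N),  # birth, species 1 (mutualistic boost)
             n2 * (1 + eps * n1 / N),  # birth, species 2
             n1 * N / K,               # density-dependent death, species 1
             n2 * N / K]               # density-dependent death, species 2
        u = rnd.random() * sum(r)
        if u < r[0]:                 n1 += 1
        elif u < r[0] + r[1]:        n2 += 1
        elif u < r[0] + r[1] + r[2]: n1 -= 1
        else:                        n2 -= 1
    return n1, n2

fixed = sum(0 in run(K=20, seed=s) for s in range(50))
print(f"{fixed}/50 runs lost one species within the step budget")
```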

  7. The citation field of evolutionary economics

    NARCIS (Netherlands)

    Dolfsma, Wilfred; Leydesdorff, Loet

    2010-01-01

    Evolutionary economics has developed into an academic field of its own, institutionalized around, amongst others, the Journal of Evolutionary Economics (JEE). This paper analyzes the way and extent to which evolutionary economics has become an interdisciplinary journal, as its aim was: a journal

  8. Learning, epigenetics, and computation: An extension on Fitch's proposal. Comment on “Toward a computational framework for cognitive biology: Unifying approaches from cognitive neuroscience and comparative cognition” by W. Tecumseh Fitch

    Science.gov (United States)

    Okanoya, Kazuo

    2014-09-01

    The comparative computational approach of Fitch [1] attempts to renew the classical David Marr paradigm of computation, algorithm, and implementation by introducing an evolutionary view of the relationship between neural architecture and cognition. This comparative evolutionary view provides constraints useful in narrowing down the problem space for both cognition and neural mechanisms. I will provide two examples from our own studies that reinforce and extend Fitch's proposal.

  9. Towards a mechanistic foundation of evolutionary theory.

    Science.gov (United States)

    Doebeli, Michael; Ispolatov, Yaroslav; Simon, Burt

    2017-02-15

    Most evolutionary thinking is based on the notion of fitness and related ideas such as fitness landscapes and evolutionary optima. Nevertheless, it is often unclear what fitness actually is, and its meaning often depends on the context. Here we argue that fitness should not be a basal ingredient in verbal or mathematical descriptions of evolution. Instead, we propose that evolutionary birth-death processes, in which individuals give birth and die at ever-changing rates, should be the basis of evolutionary theory, because such processes capture the fundamental events that generate evolutionary dynamics. In evolutionary birth-death processes, fitness is at best a derived quantity, and owing to the potential complexity of such processes, there is no guarantee that there is a simple scalar, such as fitness, that would describe long-term evolutionary outcomes. We discuss how evolutionary birth-death processes can provide useful perspectives on a number of central issues in evolution.

  10. Evolutionary thinking: "A conversation with Carter Phipps about the role of evolutionary thinking in modern culture".

    Science.gov (United States)

    Hunt, Tam

    2014-12-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution-both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place-has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps' book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps' book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging "integral" or "evolutionary" cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps.

  11. Essays on nonlinear evolutionary game dynamics

    NARCIS (Netherlands)

    Ochea, M.I.

    2010-01-01

    Evolutionary game theory has been viewed as an evolutionary repair of rational actor game theory in the hope that a population of boundedly rational players may attain convergence to classic rational solutions, such as the Nash Equilibrium, via some learning or evolutionary process. In this thesis

  12. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    Exchange of information and innovative ideas is necessary to accelerate the development of technology. With the advent of technology, intelligent and soft computing techniques came into existence with a wide scope of implementation in the engineering sciences. In keeping with this ideology, this book includes insights that reflect the 'Advances in Computer and Computational Sciences' from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers from the International Conference on Computer, Communication and Computational Sciences (ICCCCS 2016), held during 12-13 August 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover a variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in the internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  13. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    Full Text Available The estimation of the lightning performance of a power distribution network is of great importance to design its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding out the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performance.
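
    Point (i), choosing the minimum number of Monte Carlo runs, is often implemented as a stopping rule on the relative standard error of the estimated rate of dangerous events. The sketch below is one such rule under assumed numbers: a toy true exceedance probability stands in for the lightning computation, and the 5% error target and batch size are illustrative choices, not values from the paper.

```python
import numpy as np

# Stop the Monte Carlo procedure once the relative standard error of the
# estimated rate of dangerous events falls below a target.
def mc_until_stable(p_true=0.08, rel_err_target=0.05, batch=500,
                    max_runs=2_000_000, seed=1):
    rng = np.random.default_rng(seed)
    hits, n = 0, 0
    while n < max_runs:
        hits += rng.binomial(batch, p_true)   # stand-in for "overvoltage exceeded"
        n += batch
        p = hits / n
        if hits > 0:
            rel_err = np.sqrt(p * (1 - p) / n) / p
            if rel_err < rel_err_target:
                break
    return p, n

p_hat, runs = mc_until_stable()
print(f"estimated dangerous-event fraction {p_hat:.4f} after {runs} runs")
```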

  14. Computer-assisted detection (CAD) methodology for early detection of response to pharmaceutical therapy in tuberculosis patients

    Science.gov (United States)

    Lieberman, Robert; Kwong, Heston; Liu, Brent; Huang, H. K.

    2009-02-01

    The chest x-ray radiological features of tuberculosis patients are well documented, and the radiological features that change in response to successful pharmaceutical therapy can be followed with longitudinal studies over time. The patients can also be classified as either responsive or resistant to pharmaceutical therapy based on clinical improvement. We have retrospectively collected time series chest x-ray images of 200 patients diagnosed with tuberculosis receiving the standard pharmaceutical treatment. Computer algorithms can be created to utilize image texture features to assess the temporal changes in the chest x-rays of the tuberculosis patients. This methodology provides a framework for a computer-assisted detection (CAD) system that may provide physicians with the ability to detect poor treatment response earlier in pharmaceutical therapy. Early detection allows physicians to respond with more timely treatment alternatives and improved outcomes. Such a system has the potential to increase treatment efficacy for millions of patients each year.

  15. Integrating genomics into evolutionary medicine.

    Science.gov (United States)

    Rodríguez, Juan Antonio; Marigorta, Urko M; Navarro, Arcadi

    2014-12-01

    The application of the principles of evolutionary biology into medicine was suggested long ago and is already providing insight into the ultimate causes of disease. However, a full systematic integration of medical genomics and evolutionary medicine is still missing. Here, we briefly review some cases where the combination of the two fields has proven profitable and highlight two of the main issues hindering the development of evolutionary genomic medicine as a mature field, namely the dissociation between fitness and health and the still considerable difficulties in predicting phenotypes from genotypes. We use publicly available data to illustrate both problems and conclude that new approaches are needed for evolutionary genomic medicine to overcome these obstacles. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Form of an evolutionary tradeoff affects eco-evolutionary dynamics in a predator-prey system.

    Science.gov (United States)

    Kasada, Minoru; Yamamichi, Masato; Yoshida, Takehito

    2014-11-11

    Evolution on a time scale similar to ecological dynamics has been increasingly recognized for the last three decades. Selection mediated by ecological interactions can change heritable phenotypic variation (i.e., evolution), and evolution of traits, in turn, can affect ecological interactions. Hence, ecological and evolutionary dynamics can be tightly linked and important to predict future dynamics, but our understanding of eco-evolutionary dynamics is still in its infancy and there is a significant gap between theoretical predictions and empirical tests. Empirical studies have demonstrated that the presence of genetic variation can dramatically change ecological dynamics, whereas theoretical studies predict that eco-evolutionary dynamics depend on the details of the genetic variation, such as the form of a tradeoff among genotypes, which can be more important than the presence or absence of the genetic variation. Using a predator-prey (rotifer-algal) experimental system in laboratory microcosms, we studied how different forms of a tradeoff between prey defense and growth affect eco-evolutionary dynamics. Our experimental results show for the first time to our knowledge that different forms of the tradeoff produce remarkably divergent eco-evolutionary dynamics, including near fixation, near extinction, and coexistence of algal genotypes, with quantitatively different population dynamics. A mathematical model, parameterized from completely independent experiments, explains the observed dynamics. The results suggest that knowing the details of heritable trait variation and covariation within a population is essential for understanding how evolution and ecology will interact and what form of eco-evolutionary dynamics will result.
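
    A crude discrete-time caricature shows the kind of model involved: two algal clones trade off growth rate against vulnerability to rotifer predation. All parameters below are assumed for illustration and are not the paper's fitted values; the point is only the structure of the defense-growth tradeoff, not its calibrated dynamics.

```python
# Two algal clones (defended/slow vs undefended/fast) plus a rotifer
# predator, iterated as a simple discrete-time map.  All constants are
# illustrative assumptions.
def step(a1, a2, p, r1=1.1, r2=1.4, v1=0.001, v2=0.005,
         c=0.3, m=0.4, K=1000.0):
    A = a1 + a2
    crowd = 1.0 + A / K                              # shared resource limitation
    a1n = max(a1 * r1 / crowd - v1 * a1 * p, 0.0)    # defended, slow clone
    a2n = max(a2 * r2 / crowd - v2 * a2 * p, 0.0)    # undefended, fast clone
    eaten = (v1 * a1 + v2 * a2) * p
    pn = max(p * (1.0 - m) + c * eaten, 0.0)         # rotifer predator
    return a1n, a2n, pn

a1, a2, p = 200.0, 200.0, 5.0
for _ in range(300):
    a1, a2, p = step(a1, a2, p)
print(f"defended: {a1:.0f}  undefended: {a2:.0f}  rotifers: {p:.1f}")
```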

  17. A Novel Evolutionary Engineering Design Approach for Mixed-Domain Systems

    DEFF Research Database (Denmark)

    Fan, Zhun; Hu, J.; Seo, K.

    2004-01-01

    This paper presents an approach to engineering design of mixed-domain dynamic systems. The approach aims at system-level design and has two key features: first, it generates engineering designs that satisfy predefined specifications in an automatic manner; second, it can design systems belonging ...... often encountered in evolutionary computation, a HFC (Hierarchical Fair Competition) model is adopted in this work. Examples of an analog filter design and a MEM filter design illustrate the application of the approach....

  18. An Evolutionary Real-Time 3D Route Planner for Aircraft

    Institute of Scientific and Technical Information of China (English)

    郑昌文; 丁明跃; 周成平

    2003-01-01

    A novel evolutionary route planner for aircraft is proposed in this paper. In the new planner, individual candidates are evaluated with respect to the workspace, so the computation of the configuration space is not required. By using a problem-specific chromosome structure and genetic operators, the routes are generated in real time, with different mission constraints such as minimum route leg length and flying altitude, maximum turning angle, maximum climbing/diving angle and route distance taken into account.

  19. Calculating evolutionary dynamics in structured populations.

    Directory of Open Access Journals (Sweden)

    Charles G Nathanson

    2009-12-01

    Full Text Available Evolution is shaping the world around us. At the core of every evolutionary process is a population of reproducing individuals. The outcome of an evolutionary process depends on population structure. Here we provide a general formula for calculating evolutionary dynamics in a wide class of structured populations. This class includes the recently introduced "games in phenotype space" and "evolutionary set theory." There can be local interactions for determining the relative fitness of individuals, but we require global updating, which means all individuals compete uniformly for reproduction. We study the competition of two strategies in the context of an evolutionary game and determine which strategy is favored in the limit of weak selection. We derive an intuitive formula for the structure coefficient, sigma, and provide a method for efficient numerical calculation.
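
    In this line of work, the structure coefficient enters a simple weak-selection rule: for a 2x2 game [[a, b], [c, d]], strategy A is favoured over B when sigma*a + b > c + sigma*d, with sigma = 1 recovering the well-mixed condition a + b > c + d. A tiny check with assumed payoffs:

```python
# Weak-selection rule using the structure coefficient sigma.
def a_is_favoured(a, b, c, d, sigma):
    return sigma * a + b > c + sigma * d

# Example: cooperation in a Prisoner's Dilemma-like game (illustrative payoffs)
a, b, c, d = 3.0, 0.0, 4.0, 1.0
for sigma in (1.0, 2.0, 5.0):
    print(f"sigma={sigma}: cooperation favoured? {a_is_favoured(a, b, c, d, sigma)}")
```

    With these payoffs, cooperation is disfavoured in a well-mixed population (sigma = 1) but becomes favoured once the population structure pushes sigma high enough.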

  20. Evolutionary Multiplayer Games

    OpenAIRE

    Gokhale, Chaitanya S.; Traulsen, Arne

    2014-01-01

    Evolutionary game theory has become one of the most diverse and far reaching theories in biology. Applications of this theory range from cell dynamics to social evolution. However, many applications make it clear that inherent non-linearities of natural systems need to be taken into account. One way of introducing such non-linearities into evolutionary games is by the inclusion of multiple players. An example is social dilemmas, where group benefits could, e.g., increase less than linearly wi...

  1. Generator Approach to Evolutionary Optimization of Catalysts and its Integration with Surrogate Modeling

    Czech Academy of Sciences Publication Activity Database

    Holeňa, Martin; Linke, D.; Rodemerck, U.

    2011-01-01

    Roč. 159, č. 1 (2011), s. 84-95 ISSN 0920-5861 R&D Projects: GA ČR GA201/08/0802 Institutional research plan: CEZ:AV0Z10300504 Keywords : optimization of catalytic materials * evolutionary optimization * surrogate modeling * artificial neural networks * multilayer perceptron * regression boosting Subject RIV: IN - Informatics, Computer Science Impact factor: 3.407, year: 2011

  2. A Multi Agent System for Flow-Based Intrusion Detection Using Reputation and Evolutionary Computation

    Science.gov (United States)

    2011-03-01

    pertinent example of the application of Evolutionary Algorithms to pattern recognition comes from Radtke et al. [130]. The authors apply a Multi-Objective... Quittek, J., T. Zseby, B. Claise, and S. Zander, "Requirements for IP Flow Information Export (IPFIX)". Technical report, RFC 3917, October 2004. [130] Radtke ... hal.inria.fr/inria-00104200/en/. [131] Radtke, P.V.W., T. Wong, and R. Sabourin. "A multi-objective memetic algorithm for intelligent feature extraction"

  3. Asymmetric Evolutionary Games

    Science.gov (United States)

    McAvoy, Alex; Hauert, Christoph

    2015-01-01

    Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner’s Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games. PMID:26308326

  4. Multi-agent evolutionary systems for the generation of complex virtual worlds

    Directory of Open Access Journals (Sweden)

    J. Kruse

    2016-01-01

    Full Text Available Modern films, games and virtual reality applications are dependent on convincing computer graphics. Highly complex models are a requirement for the successful delivery of many scenes and environments. While workflows such as rendering, compositing and animation have been streamlined to accommodate increasing demands, modelling complex models is still a laborious task. This paper introduces the computational benefits of an Interactive Genetic Algorithm (IGA) for computer graphics modelling while compensating for the effects of user fatigue, a common issue with Interactive Evolutionary Computation. An intelligent agent is used in conjunction with an IGA that offers the potential to reduce the effects of user fatigue by learning from the choices made by the human designer and directing the search accordingly. This workflow accelerates the layout and distribution of basic elements to form complex models. It captures the designer's intent through interaction, and encourages playful discovery.

  5. Residual radioactive material guidelines: Methodology and applications

    International Nuclear Information System (INIS)

    Yu, C.; Yuan, Y.C.; Zielen, A.J.; Wallo, A. III.

    1989-01-01

    A methodology to calculate residual radioactive material guidelines was developed for the US Department of Energy (DOE). This methodology is coded in a menu-driven computer program, RESRAD, which can be run on IBM or IBM-compatible microcomputers. Seven pathways of exposure are considered: external radiation, inhalation, and ingestion of plant foods, meat, milk, aquatic foods, and water. The RESRAD code has been applied to several DOE sites to calculate soil cleanup guidelines. This experience has shown that the computer code is easy to use and very user-friendly. 3 refs., 8 figs

  6. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  7. Core principles of evolutionary medicine: A Delphi study.

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. For fourteen core principles, at least 80% of the panelists agreed or strongly agreed that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further.

  8. Conceptual Barriers to Progress Within Evolutionary Biology.

    Science.gov (United States)

    Laland, Kevin N; Odling-Smee, John; Feldman, Marcus W; Kendal, Jeremy

    2009-08-01

    In spite of its success, Neo-Darwinism is faced with major conceptual barriers to further progress, deriving directly from its metaphysical foundations. Most importantly, neo-Darwinism fails to recognize a fundamental cause of evolutionary change, "niche construction". This failure restricts the generality of evolutionary theory, and introduces inaccuracies. It also hinders the integration of evolutionary biology with neighbouring disciplines, including ecosystem ecology, developmental biology, and the human sciences. Ecology is forced to become a divided discipline, developmental biology is stubbornly difficult to reconcile with evolutionary theory, and the majority of biologists and social scientists are still unhappy with evolutionary accounts of human behaviour. The incorporation of niche construction as both a cause and a product of evolution removes these disciplinary boundaries while greatly generalizing the explanatory power of evolutionary theory.

  9. A representation-theoretic approach to the calculation of evolutionary distance in bacteria

    Science.gov (United States)

    Sumner, Jeremy G.; Jarvis, Peter D.; Francis, Andrew R.

    2017-08-01

    In the context of bacteria and models of their evolution under genome rearrangement, we explore a novel application of group representation theory to the inference of evolutionary history. Our contribution is to show, in a very general maximum likelihood setting, how to use elementary matrix algebra to sidestep intractable combinatorial computations and convert the problem into one of eigenvalue estimation amenable to standard numerical approximation techniques.

  10. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  11. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    These discussions included, among others, the possible consequences of non-DNA-based inheritance (epigenetics and cultural evolution), niche construction, and developmental mechanisms for our understanding of the evolutionary process, speciation, complexity in biology, and the construction of a formal evolutionary theory.

  12. Self-Organized Criticality and Mass Extinction in Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Krink, Thiemo; Thomsen, Rene

    2001-01-01

    The gaps in the fossil record gave rise to the hypothesis that evolution proceeded in long periods of stasis, which alternated with occasional, rapid changes that yielded evolutionary progress. One mechanism that could cause these punctuated bursts is the re-colonization of changing and deserted... at a critical state between chaos and order, known as self-organized criticality (SOC). Based on this background, we used SOC to control the size of spatial extinction zones in a diffusion model. The SOC selection process was easy to implement and implied only negligible computational costs. Our results show

  13. An Evolutionary Approach for Robust Layout Synthesis of MEMS

    DEFF Research Database (Denmark)

    Fan, Zhun; Wang, Jiachuan; Goodman, Erik

    2005-01-01

    The paper introduces a robust design method for layout synthesis of MEM resonators subject to inherent geometric uncertainties such as the fabrication error on the sidewall of the structure. The robust design problem is formulated as a multi-objective constrained optimisation problem after certain... assumptions and treated with a multiobjective genetic algorithm (MOGA), a special type of evolutionary computing approach. A case study based on layout synthesis of a comb-driven MEM resonator shows that the approach proposed in this paper can lead to design results that meet the target performance and are less...

  14. Applications of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.; Puranam, Krishna Kishore; Ravi Kumar Jain B., xx

    2008-01-01

    This paper is written as the first chapter of an edited volume on evolutionary economics and economic geography (Frenken, K., editor, Applied Evolutionary Economics and Economic Geography, Cheltenham: Edward Elgar, expected publication date February 2007). The paper reviews empirical applications of

  15. What the Current System Development Trends tell us about Systems Development Methodologies: Toward explaining SSADM, Agile and IDEF0 Methodologies

    Directory of Open Access Journals (Sweden)

    Abdulla F. Ally

    2015-03-01

    Full Text Available Systems integration, customization, and component-based development approaches are receiving increasing attention. This trend has drawn research attention to systems development methodologies as well. The availability of systems development tools, rapid changes in technology, the evolution of mobile computing and the growth of cloud computing have necessitated a move toward systems integration and customization rather than developing systems from scratch. This tendency encourages component-based development and discourages the traditional systems development approach. The paper presents and evaluates the SSADM, IDEF0 and Agile systems development methodologies. More specifically, it examines how well they fit into the current competitive market of systems development. From this perspective, it is anticipated that, despite its popularity, the SSADM methodology is becoming obsolete, while the Agile and IDEF0 methodologies are still gaining acceptance in the current competitive market of systems development. The present study is likely to enrich our understanding of systems development methodology concepts and draw attention to where current trends in systems development are heading.

  16. Home and away- the evolutionary dynamics of homing endonucleases

    Directory of Open Access Journals (Sweden)

    Barzel Adi

    2011-11-01

    Full Text Available Abstract Background Homing endonucleases (HEases) are a large and diverse group of site-specific DNases. They reside within self-splicing introns and inteins, and promote their horizontal dissemination. In recent years, HEases have been the focus of extensive research due to their promising potential use in gene targeting procedures for the treatment of genetic diseases and for the genetic engineering of crops, animal models and cell lines. Results Using mathematical analysis and computational modeling, we present here a novel account of the evolution and population dynamics of HEase genes (HEGs). We describe HEGs as paradoxical selfish elements whose long-term persistence in a single population relies on low transmission rates and a positive correlation between transmission efficiency and toxicity. Conclusion Plausible conditions allow HEGs to persist at high frequency through long evolutionary periods, with the endonuclease frequency being either at equilibrium or periodically oscillating. The predictions of our model may prove important not only for evolutionary theory but also for gene therapy and bio-engineering applications of HEases.

  17. Numerical methodologies for investigation of moderate-velocity flow using a hybrid computational fluid dynamics - molecular dynamics simulation approach

    International Nuclear Information System (INIS)

    Ko, Soon Heum; Kim, Na Yong; Nikitopoulos, Dimitris E.; Moldovan, Dorel; Jha, Shantenu

    2014-01-01

    Numerical approaches are presented to minimize the statistical errors inherently present due to finite sampling and the presence of thermal fluctuations in the molecular region of a hybrid computational fluid dynamics (CFD) - molecular dynamics (MD) flow solution. Near the fluid-solid interface the hybrid CFD-MD simulation approach provides a more accurate solution, especially in the presence of significant molecular-level phenomena, than the traditional continuum-based simulation techniques. It also involves less computational cost than the pure particle-based MD. Despite these advantages, the hybrid CFD-MD methodology has been applied mostly in flow studies at high velocities, mainly because of the higher statistical errors associated with low velocities. As an alternative to the costly increase of the size of the MD region to decrease statistical errors, we investigate a few numerical approaches that reduce sampling noise of the solution at moderate velocities. These methods are based on sampling of multiple simulation replicas and linear regression of multiple spatial/temporal samples. We discuss the advantages and disadvantages of each technique in the perspective of solution accuracy and computational cost.
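
    The simplest of these noise-reduction ideas, averaging over multiple simulation replicas, is easy to illustrate: the sampling error of a small mean velocity buried in thermal noise shrinks roughly as the square root of the number of replicas. The signal level and noise amplitude below are assumed toy values, not quantities from the paper.

```python
import numpy as np

# A small mean velocity buried in loud thermal noise: averaging R
# independent replicas shrinks the sampling error roughly as sqrt(R).
rng = np.random.default_rng(2)
true_u, sigma, n_samples = 0.05, 1.0, 400    # moderate velocity, loud noise
for replicas in (1, 4, 16, 64):
    samples = true_u + sigma * rng.normal(size=(replicas, n_samples))
    est = samples.mean()
    print(f"{replicas:>2} replicas: estimate {est:+.4f} (true {true_u})")
```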

  18. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    Science.gov (United States)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using a mind evolutionary algorithm to find optimal network weights and thresholds for a BP neural network overcomes the BP network's tendency to become trapped in local minima. The optimized network is used for time-series prediction and for same-month prediction, yielding two predictive values. The two predictive values are then fed into a neural network to obtain the final forecast. The effectiveness of the method was verified by experiments with the energy consumption of three buildings in Hefei.
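
    The overall idea, evolving network weights with a population-based search instead of pure gradient descent so the fit is less prone to poor local minima, can be sketched generically. The snippet uses a plain (mu + lambda) evolution loop on a tiny feed-forward network with synthetic data; it does not reproduce the mind evolutionary operators, the BP hybrid, or the Hefei building data, all of which are specific to the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))          # stand-in "building" features
y = np.sin(X @ np.array([1.0, -2.0, 0.5]))     # stand-in energy signal

def unpack(w, n_in=3, n_hid=5):
    W1 = w[: n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid : n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid : -1]
    b2 = w[-1]
    return W1, b1, W2, b2

def mse(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                   # one hidden tanh layer
    return float(np.mean((h @ W2 + b2 - y) ** 2))

dim = 3 * 5 + 5 + 5 + 1
pop = rng.normal(size=(20, dim))
for gen in range(200):
    kids = pop + 0.1 * rng.normal(size=pop.shape)       # Gaussian mutation
    everyone = np.vstack([pop, kids])
    fitness = np.array([mse(w) for w in everyone])
    pop = everyone[np.argsort(fitness)][:20]            # (mu + lambda) survival
print(f"best training MSE after evolution: {mse(pop[0]):.4f}")
```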

  19. Mathematics and evolutionary biology make bioinformatics education comprehensible

    Science.gov (United States)

    Weisstein, Anton E.

    2013-01-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes—the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software—the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a ‘two-culture’ problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, the curriculum too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses. PMID:23821621

  20. Mathematics and evolutionary biology make bioinformatics education comprehensible.

    Science.gov (United States)

    Jungck, John R; Weisstein, Anton E

    2013-09-01

    The patterns of variation within a molecular sequence data set result from the interplay between population genetic, molecular evolutionary and macroevolutionary processes-the standard purview of evolutionary biologists. Elucidating these patterns, particularly for large data sets, requires an understanding of the structure, assumptions and limitations of the algorithms used by bioinformatics software-the domain of mathematicians and computer scientists. As a result, bioinformatics often suffers a 'two-culture' problem because of the lack of broad overlapping expertise between these two groups. Collaboration among specialists in different fields has greatly mitigated this problem among active bioinformaticians. However, science education researchers report that much of bioinformatics education does little to bridge the cultural divide, the curriculum too focused on solving narrow problems (e.g. interpreting pre-built phylogenetic trees) rather than on exploring broader ones (e.g. exploring alternative phylogenetic strategies for different kinds of data sets). Herein, we present an introduction to the mathematics of tree enumeration, tree construction, split decomposition and sequence alignment. We also introduce off-line downloadable software tools developed by the BioQUEST Curriculum Consortium to help students learn how to interpret and critically evaluate the results of standard bioinformatics analyses.
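
    One of the enumeration facts at the heart of such material is how fast the number of tree topologies grows: for n labelled taxa there are (2n - 5)!! unrooted binary trees. A few lines make the growth, and hence the need for heuristic tree search, tangible:

```python
# Number of distinct unrooted binary tree topologies for n labelled taxa:
#   (2n - 5)!! = 1 * 3 * 5 * ... * (2n - 5)   for n >= 3.
def n_unrooted_trees(n: int) -> int:
    count = 1
    for k in range(3, 2 * n - 4, 2):
        count *= k
    return count

for n in (4, 6, 8, 10, 20):
    print(f"{n:>2} taxa: {n_unrooted_trees(n):,} unrooted trees")
```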

  1. Bidirectional Dynamic Diversity Evolutionary Algorithm for Constrained Optimization

    Directory of Open Access Journals (Sweden)

    Weishang Gao

    2013-01-01

    Full Text Available Evolutionary algorithms (EAs) have been shown to be effective for complex constrained optimization problems. However, inflexible exploration-exploitation and improper penalties in EAs with penalty functions can lead to losing a global optimum that lies near or on the constraint boundary, and determining an appropriate penalty coefficient is also difficult in most studies. In this paper, we propose a bidirectional dynamic diversity evolutionary algorithm (Bi-DDEA) with multiple agents guiding exploration-exploitation through local extrema to the global optimum in suitable steps. In Bi-DDEA, potential advantage is detected by three kinds of agents. The scale and the density of agents change dynamically according to the emergence of potentially optimal areas, which plays an important role in flexible exploration-exploitation. Meanwhile, a novel double optimum estimation strategy with objective fitness and penalty fitness is suggested to compute, respectively, the dominance trend of agents in the feasible region and the forbidden region. This bidirectional evolution with multiple agents not only effectively avoids the problem of determining a penalty coefficient but also converges quickly to the global optimum near or on the constraint boundary. By examining the speed and accuracy of Bi-DDEA across benchmark functions, the proposed method is shown to be effective.
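
    A well-known way to avoid a hand-tuned penalty coefficient, in the same spirit as Bi-DDEA's separate objective and penalty fitness, is a feasibility-based comparison rule (Deb's rule): feasible candidates are compared by objective value, infeasible ones by constraint violation, and feasible always beats infeasible. The sketch below shows that rule itself, not the Bi-DDEA algorithm; the objective and constraint are assumed examples.

```python
import numpy as np

# Deb-style penalty-coefficient-free comparison for constrained EAs.
def violation(x, constraints):
    return sum(max(0.0, g(x)) for g in constraints)     # g(x) <= 0 is feasible

def better(x1, x2, f, constraints):
    v1, v2 = violation(x1, constraints), violation(x2, constraints)
    if v1 == 0 and v2 == 0:
        return f(x1) < f(x2)        # both feasible: compare objective fitness
    if v1 == 0 or v2 == 0:
        return v1 == 0              # feasible beats infeasible
    return v1 < v2                  # both infeasible: compare penalty fitness

f = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2
gs = [lambda x: x[0] + x[1] - 2]    # feasible region: x0 + x1 <= 2
print(better(np.array([1.0, 1.0]), np.array([2.0, 1.0]), f, gs))  # True
```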

  2. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  3. Gender approaches to evolutionary multi-objective optimization using pre-selection of criteria

    Science.gov (United States)

    Kowalczuk, Zdzisław; Białaszewski, Tomasz

    2018-01-01

    A novel idea to perform evolutionary computations (ECs) for solving highly dimensional multi-objective optimization (MOO) problems is proposed. Following the general idea of evolution, it is proposed that information about gender is used to distinguish between various groups of objectives and identify the (aggregate) nature of optimality of individuals (solutions). This identification is drawn out of the fitness of individuals and applied during parental crossover in the processes of evolutionary multi-objective optimization (EMOO). The article introduces the principles of the genetic-gender approach (GGA) and virtual gender approach (VGA), which are not just evolutionary techniques, but constitute a completely new rule (philosophy) for use in solving MOO tasks. The proposed approaches are validated against principal representatives of the EMOO algorithms of the state of the art in solving benchmark problems in the light of recognized EC performance criteria. The research shows the superiority of the gender approach in terms of effectiveness, reliability, transparency, intelligibility and MOO problem simplification, resulting in the great usefulness and practicability of GGA and VGA. Moreover, an important feature of GGA and VGA is that they alleviate the 'curse' of dimensionality typical of many engineering designs.

  4. Evolutionary psychology: new perspectives on cognition and motivation.

    Science.gov (United States)

    Cosmides, Leda; Tooby, John

    2013-01-01

    Evolutionary psychology is the second wave of the cognitive revolution. The first wave focused on computational processes that generate knowledge about the world: perception, attention, categorization, reasoning, learning, and memory. The second wave views the brain as composed of evolved computational systems, engineered by natural selection to use information to adaptively regulate physiology and behavior. This shift in focus--from knowledge acquisition to the adaptive regulation of behavior--provides new ways of thinking about every topic in psychology. It suggests a mind populated by a large number of adaptive specializations, each equipped with content-rich representations, concepts, inference systems, and regulatory variables, which are functionally organized to solve the complex problems of survival and reproduction encountered by the ancestral hunter-gatherers from whom we are descended. We present recent empirical examples that illustrate how this approach has been used to discover new features of attention, categorization, reasoning, learning, emotion, and motivation.

  5. An improved shuffled frog leaping algorithm based evolutionary framework for currency exchange rate prediction

    Science.gov (United States)

    Dash, Rajashree

    2017-11-01

    Forecasting the purchasing power of one currency with respect to another is always an interesting topic in the field of financial time series prediction. Despite the existence of several traditional and computational models for currency exchange rate forecasting, there is always a need for simpler and more efficient models that produce better prediction capability. In this paper, an evolutionary framework is proposed that uses an improved shuffled frog leaping (ISFL) algorithm with a computationally efficient functional link artificial neural network (CEFLANN) for prediction of currency exchange rates. The model is validated by observing the monthly prediction measures obtained for three currency exchange data sets (USD/CAD, USD/CHF, and USD/JPY) accumulated over the same period of time. The model performance is also compared with two other evolutionary learning techniques, the shuffled frog leaping algorithm and particle swarm optimization. Practical analysis of the results suggests that the proposed model, developed using the ISFL algorithm with the CEFLANN network, is a promising predictor for currency exchange rates compared to the other models included in the study.
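
    For readers unfamiliar with the base algorithm, a bare-bones shuffled frog leaping step is shown below: the population is sorted, dealt into memeplexes, and the worst frog of each memeplex jumps toward the local best (or is re-randomised if the jump does not help). The sphere objective and all constants are assumptions; the paper's improvements and the CEFLANN predictor are not reproduced here.

```python
import numpy as np

def sfla(f, dim=5, n_frogs=30, n_memeplex=5, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    frogs = rng.uniform(-5, 5, size=(n_frogs, dim))
    for _ in range(iters):
        order = np.argsort([f(x) for x in frogs])       # best first
        frogs = frogs[order]
        for m in range(n_memeplex):
            plex = np.arange(m, n_frogs, n_memeplex)    # deal frogs like cards
            best, worst = plex[0], plex[-1]
            jump = rng.random() * (frogs[best] - frogs[worst])
            trial = frogs[worst] + jump
            if f(trial) < f(frogs[worst]):
                frogs[worst] = trial                    # accept improving jump
            else:
                frogs[worst] = rng.uniform(-5, 5, size=dim)  # censor and restart
    return min(frogs, key=f)

sphere = lambda x: float(np.sum(x ** 2))
best = sfla(sphere)
print("best objective found:", round(sphere(best), 6))
```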

  6. Evolutionary disarmament in interspecific competition.

    Science.gov (United States)

    Kisdi, E; Geritz, S A

    2001-12-22

    Competitive asymmetry, which is the advantage of having a larger body or stronger weaponry than a contestant, drives spectacular evolutionary arms races in intraspecific competition. Similar asymmetries are well documented in interspecific competition, yet they seldom lead to exaggerated traits. Here we demonstrate that two species with substantially different size may undergo parallel coevolution towards a smaller size under the same ecological conditions where a single species would exhibit an evolutionary arms race. We show that disarmament occurs for a wide range of parameters in an ecologically explicit model of competition for a single shared resource; disarmament also occurs in a simple Lotka-Volterra competition model. A key property of both models is the interplay between evolutionary dynamics and population density. The mechanism does not rely on very specific features of the model. Thus, evolutionary disarmament may be widespread and may help to explain the lack of interspecific arms races.

  7. ECG Signal Processing, Classification and Interpretation A Comprehensive Framework of Computational Intelligence

    CERN Document Server

    Pedrycz, Witold

    2012-01-01

    Electrocardiogram (ECG) signals are among the most important sources of diagnostic information in healthcare so improvements in their analysis may also have telling consequences. Both the underlying signal technology and a burgeoning variety of algorithms and systems developments have proved successful targets for recent rapid advances in research. ECG Signal Processing, Classification and Interpretation shows how the various paradigms of Computational Intelligence, employed either singly or in combination, can produce an effective structure for obtaining often vital information from ECG signals. Neural networks do well at capturing the nonlinear nature of the signals, information granules realized as fuzzy sets help to confer interpretability on the data and evolutionary optimization may be critical in supporting the structural development of ECG classifiers and models of ECG signals. The contributors address concepts, methodology, algorithms, and case studies and applications exploiting the paradigm of Comp...

  8. Incorporating evolutionary principles into environmental management and policy

    DEFF Research Database (Denmark)

    Lankau, Richard; Jørgensen, Peter Søgaard; Harris, David J.

    2011-01-01

    As policymakers and managers work to mitigate the effects of rapid anthropogenic environmental changes, they need to consider organisms’ responses. In light of recent evidence that evolution can be quite rapid, this now includes evolutionary responses. Evolutionary principles have a long history...... in conservation biology, and the necessary next step for the field is to consider ways in which conservation policy makers and managers can proactively manipulate evolutionary processes to achieve their goals. In this review, we aim to illustrate the potential conservation benefits of an increased understanding...... of evolutionary history and prescriptive manipulation of three basic evolutionary factors: selection, variation, and gene flow. For each, we review and propose ways that policy makers and managers can use evolutionary thinking to preserve threatened species, combat pest species, or reduce undesirable evolutionary...

  9. Knowledge Generation as Natural Computation

    Directory of Open Access Journals (Sweden)

    Gordana Dodig-Crnkovic

    2008-04-01

    Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework knowledge is seen as a result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during the adaptive interplay with the environment, which clearly presents a developmental advantage by increasing the agent's ability to cope with the situation dynamics. This paper addresses the mechanism of knowledge generation, a process that may be modeled as natural computation in order to be better understood and improved.

  10. Innovation dynamics of Salvadoran agri-food industry from an evolutionary perspective

    Energy Technology Data Exchange (ETDEWEB)

    Peraza Castaneda, E.H.; Aleixandre Mendizábal, G.

    2016-07-01

    This paper presents a holistic approach to analysing the innovation dynamics of a low-tech sector in a less developed economy, the agri-food industry in El Salvador, in the context of evolutionary economics. This requires using complementary quantitative and qualitative data and methodologies to better understand how the Salvadoran agri-food industry innovation system works and how STI public policies can improve the performance of a key sector in terms of national socioeconomic development. The work done so far shows a concentrated and vigorous sector with some upstream and downstream connections, whose firms innovate depending on firm size, age, R&D activities and use of industrial property rights. (Author)

  11. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.; Martinez B, M. R.; Gallego, E.

    2009-10-01

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique that is highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometer, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as the weight of its components, the low resolution of the spectrum, and a long, drawn-out spectrum reconstruction procedure that requires an expert user and a reconstruction code such as BUNKIE, SAND, etc. These codes are based on an iterative reconstruction algorithm whose greatest inconvenience is that the user must supply an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have pointed out the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence techniques have been reported, such as genetic algorithms, artificial neural networks, and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted to resolve these disadvantages. Because they are emerging technologies, there are no tools for analyzing their results, so in this paper we first present the design of a computation tool that allows the analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive, easy-to-operate graphical user environment. The program executes the analysis in a few seconds, and the obtained information can be stored and/or printed for later use.
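
    The record does not specify the network architecture or the genetic operators used, so the following is only a schematic illustration of the hybrid idea it refers to, a genetic algorithm evolving the weights of a small feed-forward network, with a toy regression target standing in for the detector-response/spectrum mapping.

```python
# Illustrative sketch (not the paper's tool): a genetic algorithm evolving the
# weights of a small feed-forward network. The toy target below stands in for
# a response-matrix/spectrum mapping; architecture and operators are assumed.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(64, 3))           # stand-in detector readings
y = np.sin(X.sum(axis=1))                      # stand-in spectral quantity

H = 8                                          # hidden units
n_w = 3 * H + H + H + 1                        # sizes of W1, b1, W2, b2

def forward(w, X):
    W1 = w[:3 * H].reshape(3, H); b1 = w[3 * H:4 * H]
    W2 = w[4 * H:5 * H]; b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(w):
    return np.mean((forward(w, X) - y) ** 2)

pop = rng.normal(0, 1, size=(60, n_w))
for gen in range(200):
    fit = np.array([mse(w) for w in pop])
    parents = pop[np.argsort(fit)[:20]]        # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(20, size=2)]
        cut = rng.integers(1, n_w)             # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0, 0.1, n_w) * (rng.random(n_w) < 0.2)  # mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])
print("best MSE:", min(mse(w) for w in pop))
```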

  12. Evolutionary economics and industry location

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2003-01-01

    This paper aims to provide the outlines of an evolutionary economic geography of industry location. We discuss two evolutionary explanations of industry location: one that concentrates on spin-offs, and one that focuses attention on knowledge and agglomeration economies. We claim that both explanations are complementary.

  13. Overview of the ISAM safety assessment methodology

    International Nuclear Information System (INIS)

    Simeonov, G.

    2003-01-01

    The ISAM safety assessment methodology consists of the following key components: specification of the assessment context; description of the disposal system; development and justification of scenarios; formulation and implementation of models; running of computer codes; and analysis and presentation of results. Common issues run through two or more of these assessment components, including the use of methodological and computer tools, the collation and use of data, the need to address various sources of uncertainty, and the building of confidence in the individual components as well as in the overall assessment. The importance of the iterative nature of the assessment should be recognised.

  14. Evolutionary institutionalism.

    Science.gov (United States)

    Fürstenberg, Dr Kai

    Institutions are hard to define and hard to study. Long prominent in political science have been two theories: Rational Choice Institutionalism (RCI) and Historical Institutionalism (HI). Arising from the life sciences is now a third: Evolutionary Institutionalism (EI). Comparative strengths and weaknesses of these three theories warrant review, and the value-to-be-added by expanding the third beyond Darwinian evolutionary theory deserves consideration. Should evolutionary institutionalism expand to accommodate new understanding in ecology, such as might apply to the emergence of stability, and in genetics, such as might apply to political behavior? Core arguments are reviewed for each theory with more detailed exposition of the third, EI. Particular attention is paid to EI's gene-institution analogy; to variation, selection, and retention of institutional traits; to endogeneity and exogeneity; to agency and structure; and to ecosystem effects, institutional stability, and empirical limitations in behavioral genetics. RCI, HI, and EI are distinct but complementary. Institutional change, while amenable to rational-choice analysis and, retrospectively, to critical-juncture and path-dependency analysis, is also, and importantly, ecological. Stability, like change, is an emergent property of institutions, which tend to stabilize after change in a manner analogous to allopatric speciation. EI is more than metaphorically biological in that institutional behaviors are driven by human behaviors whose evolution long preceded the appearance of institutions themselves.

  15. Evolutionary foundations for cancer biology.

    Science.gov (United States)

    Aktipis, C Athena; Nesse, Randolph M

    2013-01-01

    New applications of evolutionary biology are transforming our understanding of cancer. The articles in this special issue provide many specific examples, such as microorganisms inducing cancers, the significance of within-tumor heterogeneity, and the possibility that lower dose chemotherapy may sometimes promote longer survival. Underlying these specific advances is a large-scale transformation, as cancer research incorporates evolutionary methods into its toolkit, and asks new evolutionary questions about why we are vulnerable to cancer. Evolution explains why cancer exists at all, how neoplasms grow, why cancer is remarkably rare, and why it occurs despite powerful cancer suppression mechanisms. Cancer exists because of somatic selection; mutations in somatic cells result in some dividing faster than others, in some cases generating neoplasms. Neoplasms grow, or do not, in complex cellular ecosystems. Cancer is relatively rare because of natural selection; our genomes were derived disproportionally from individuals with effective mechanisms for suppressing cancer. Cancer occurs nonetheless for the same six evolutionary reasons that explain why we remain vulnerable to other diseases. These four principles-cancers evolve by somatic selection, neoplasms grow in complex ecosystems, natural selection has shaped powerful cancer defenses, and the limitations of those defenses have evolutionary explanations-provide a foundation for understanding, preventing, and treating cancer.

  16. Hybrid Microgrid Configuration Optimization with Evolutionary Algorithms

    Science.gov (United States)

    Lopez, Nicolas

    This dissertation explores the Renewable Energy Integration Problem, and proposes a Genetic Algorithm embedded with a Monte Carlo simulation to solve large instances of the problem that are impractical to solve via full enumeration. The Renewable Energy Integration Problem is defined as finding the optimum set of components to supply the electric demand of a hybrid microgrid. The components considered are solar panels, wind turbines, diesel generators, electric batteries, connections to the power grid, and converters, which can be inverters and/or rectifiers. The methodology developed is explained, as well as the combinatorial formulation. In addition, two case studies of a single-objective optimization version of the problem are presented, minimizing cost and minimizing global warming potential (GWP), followed by a multi-objective implementation of the proposed methodology that utilizes a non-dominated sorting Genetic Algorithm embedded with a Monte Carlo simulation. The method is validated by solving a small instance of the problem with known solution via a full enumeration algorithm developed by NREL in their software HOMER. The dissertation concludes that evolutionary algorithms embedded with Monte Carlo simulation, namely modified Genetic Algorithms, are an efficient way of solving the problem, finding approximate solutions in the case of single-objective optimization, and approximating the true Pareto front in the case of multi-objective optimization of the Renewable Energy Integration Problem.
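
    As a heavily simplified sketch of the single-objective version described, the following couples a genetic algorithm with a Monte Carlo fitness evaluation; all component costs, capacities, and the demand model are invented for illustration and do not come from the dissertation.

```python
# Toy sketch of the GA + Monte Carlo idea: fitness of a component count vector
# is capital cost plus an expected-shortfall penalty estimated by sampling
# uncertain renewable output. Every number below is an invented placeholder.
import numpy as np

rng = np.random.default_rng(42)
COST = np.array([300.0, 1200.0, 500.0])   # per panel / turbine / battery
CAP  = np.array([1.0, 3.0, 2.0])          # nominal kWh per unit
PENALTY = 50.0                            # cost per kWh of unmet demand

def fitness(design, n_samples=200):
    """Capital cost plus expected shortfall penalty, estimated by MC."""
    solar = rng.uniform(0.2, 1.0, n_samples) * CAP[0] * design[0]
    wind  = rng.weibull(2.0, n_samples) * CAP[1] * design[1]
    store = CAP[2] * design[2]             # toy: batteries always available
    demand = rng.normal(40.0, 8.0, n_samples)
    shortfall = np.clip(demand - (solar + wind + store), 0, None).mean()
    return COST @ design + PENALTY * shortfall

pop = rng.integers(0, 30, size=(40, 3))
for gen in range(100):
    fit = np.array([fitness(d) for d in pop])
    keep = pop[np.argsort(fit)[:10]]       # elitist truncation selection
    kids = keep[rng.integers(10, size=30)] + rng.integers(-2, 3, (30, 3))
    pop = np.vstack([keep, np.clip(kids, 0, None)])

best = pop[np.argmin([fitness(d) for d in pop])]
print("panels, turbines, batteries:", best)
```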

  17. Methodology and computational framework used for the US Department of Energy Environmental Restoration and Waste Management Programmatic Environmental Impact Statement accident analysis

    International Nuclear Information System (INIS)

    Mueller, C.; Roglans-Ribas, J.; Folga, S.; Huttenga, A.; Jackson, R.; TenBrook, W.; Russell, J.

    1994-01-01

    A methodology, computational framework, and integrated PC-based database have been developed to assess the risks of facility accidents in support of the US Department of Energy (DOE) Environmental Restoration and Waste Management Programmatic Environmental Impact Statement. The methodology includes the following interrelated elements: (1) screening of storage and treatment processes and related waste inventories to determine risk-dominant facilities across the DOE complex, (2) development and frequency estimation of the risk-dominant sequences of accidents, and (3) determination of the evolution of and final compositions of radiological or chemically hazardous source terms predicted to be released as a function of the storage inventory or treatment process throughput. The computational framework automates these elements to provide source term input for the second part of the analysis which includes (1) development or integration of existing site-specific demographics and meteorological data and calculation of attendant unit-risk factors and (2) assessment of the radiological or toxicological consequences of accident releases to the general public and to the occupational work force

  18. Evolutionary principles and their practical application

    DEFF Research Database (Denmark)

    Hendry, A. P.; Kinnison, M. T.; Heino, M.

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently across these different fields.

  19. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  20. Exploring methodological frameworks for a mental task-based near-infrared spectroscopy brain-computer interface.

    Science.gov (United States)

    Weyand, Sabine; Takehara-Nishiuchi, Kaori; Chau, Tom

    2015-10-30

    Near-infrared spectroscopy (NIRS) brain-computer interfaces (BCIs) enable users to interact with their environment using only cognitive activities. This paper presents the results of a comparison of four methodological frameworks used to select a pair of tasks to control a binary NIRS-BCI; specifically, three novel personalized task paradigms and the state-of-the-art prescribed task framework were explored. Three types of personalized task selection approaches were compared, including: user-selected mental tasks using weighted slope scores (WS-scores), user-selected mental tasks using pair-wise accuracy rankings (PWAR), and researcher-selected mental tasks using PWAR. These paradigms, along with the state-of-the-art prescribed mental task framework, where mental tasks are selected based on the most commonly used tasks in literature, were tested by ten able-bodied participants who took part in five NIRS-BCI sessions. The frameworks were compared in terms of their accuracy, perceived ease-of-use, computational time, user preference, and length of training. Most notably, researcher-selected personalized tasks resulted in significantly higher accuracies, while user-selected personalized tasks resulted in significantly higher perceived ease-of-use. It was also concluded that PWAR minimized the amount of data that needed to be collected; while, WS-scores maximized user satisfaction and minimized computational time. In comparison to the state-of-the-art prescribed mental tasks, our findings show that overall, personalized tasks appear to be superior to prescribed tasks with respect to accuracy and perceived ease-of-use. The deployment of personalized rather than prescribed mental tasks ought to be considered and further investigated in future NIRS-BCI studies. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Computational intelligence in automotive applications

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, Danil (ed.) [Toyota Motor Engineering and Manufacturing (TEMA), Ann Arbor, MI (United States). Toyota Technical Center

    2008-07-01

    What is computational intelligence (CI)? Traditionally, CI is understood as a collection of methods from the fields of neural networks (NN), fuzzy logic and evolutionary computation. This edited volume is the first of its kind, suitable for automotive researchers, engineers and students. It provides a representative sample of contemporary CI activities in the area of automotive technology. The volume consists of 13 chapters, including but not limited to these topics: vehicle diagnostics and vehicle system safety, control of vehicular systems, quality control of automotive processes, driver state estimation, safety of pedestrians, intelligent vehicles. All chapters contain overviews of the state of the art, and several chapters illustrate their methodologies on examples of real-world systems. About the Editor: Danil Prokhorov began his technical career in St. Petersburg, Russia, after graduating with Honors from Saint Petersburg State University of Aerospace Instrumentation in 1992 (MS in Robotics). He worked as a research engineer in the St. Petersburg Institute for Informatics and Automation, one of the institutes of the Russian Academy of Sciences. He came to the US in late 1993 for Ph.D. studies. He became involved in automotive research in 1995 when he was a Summer intern at Ford Scientific Research Lab in Dearborn, MI. Upon his graduation from the EE Department of Texas Tech University, Lubbock, in 1997, he joined Ford to pursue application-driven research on neural networks and other machine learning algorithms. While at Ford, he took part in several production-bound projects including neural network based engine misfire detection. Since 2005 he has been with Toyota Technical Center, Ann Arbor, MI, overseeing important mid- and long-term research projects in computational intelligence. (orig.)

  2. Building v/s Exploring Models: Comparing Learning of Evolutionary Processes through Agent-based Modeling

    Science.gov (United States)

    Wagh, Aditi

    Two strands of work motivate the three studies in this dissertation. First, evolutionary change can be viewed as a computational complex system in which a small set of rules operating at the individual level result in different population-level outcomes under different conditions. Extensive research has documented students' difficulties with learning about evolutionary change (Rosengren et al., 2012), particularly in terms of levels slippage (Wilensky & Resnick, 1999). Second, though building and using computational models is becoming increasingly common in K-12 science education, we know little about how these two modalities compare. This dissertation adopts agent-based modeling as a representational system to compare these modalities in the conceptual context of micro-evolutionary processes. Drawing on interviews, Study 1 examines middle-school students' productive ways of reasoning about micro-evolutionary processes and finds that the specific framing of traits plays a key role in whether slippage explanations are cued. Study 2, which was conducted in two schools with about 150 students, forms the crux of the dissertation. It compares learning processes and outcomes when students build their own models or explore a pre-built model. Analysis of Camtasia videos of student pairs reveals that builders' and explorers' ways of accessing rules, and their sense-making of observed trends, are of a different character. Builders notice rules through the available blocks-based primitives, often bypassing their enactment, while explorers attend to rules primarily through the enactment. Moreover, builders' sense-making of observed trends is more rule-driven while explorers' is more enactment-driven. Pre- and post-tests reveal that builders manifest a greater facility with accessing rules, providing explanations manifesting targeted assembly, whereas explorers use rules to construct explanations manifesting non-targeted assembly. Interviews reveal varying degrees of shifts away from slippage in both conditions.

  3. Gender Inequality in Interaction--An Evolutionary Account

    Science.gov (United States)

    Hopcroft, Rosemary L.

    2009-01-01

    In this article I argue that evolutionary theorizing can help sociologists and feminists better understand gender inequality. Evolutionary theory explains why control of the sexuality of young women is a priority across most human societies both past and present. Evolutionary psychology has extended our understanding of male violence against women.

  4. Making evolutionary biology a basic science for medicine

    Science.gov (United States)

    Nesse, Randolph M.; Bergstrom, Carl T.; Ellison, Peter T.; Flier, Jeffrey S.; Gluckman, Peter; Govindaraju, Diddahally R.; Niethammer, Dietrich; Omenn, Gilbert S.; Perlman, Robert L.; Schwartz, Mark D.; Thomas, Mark G.; Stearns, Stephen C.; Valle, David

    2010-01-01

    New applications of evolutionary biology in medicine are being discovered at an accelerating rate, but few physicians have sufficient educational background to use them fully. This article summarizes suggestions from several groups that have considered how evolutionary biology can be useful in medicine, what physicians should learn about it, and when and how they should learn it. Our general conclusion is that evolutionary biology is a crucial basic science for medicine. In addition to looking at established evolutionary methods and topics, such as population genetics and pathogen evolution, we highlight questions about why natural selection leaves bodies vulnerable to disease. Knowledge about evolution provides physicians with an integrative framework that links otherwise disparate bits of knowledge. It replaces the prevalent view of bodies as machines with a biological view of bodies shaped by evolutionary processes. Like other basic sciences, evolutionary biology needs to be taught both before and during medical school. Most introductory biology courses are insufficient to establish competency in evolutionary biology. Premedical students need evolution courses, possibly ones that emphasize medically relevant aspects. In medical school, evolutionary biology should be taught as one of the basic medical sciences. This will require a course that reviews basic principles and specific medical applications, followed by an integrated presentation of evolutionary aspects that apply to each disease and organ system. Evolutionary biology is not just another topic vying for inclusion in the curriculum; it is an essential foundation for a biological understanding of health and disease. PMID:19918069

  5. Context dependent DNA evolutionary models

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    This paper is about stochastic models for the evolution of DNA. For a set of aligned DNA sequences, connected in a phylogenetic tree, the models should be able to explain, in probabilistic terms, the differences seen in the sequences. From the estimates of the parameters in the model one can start to make biological interpretations and draw conclusions concerning the evolutionary forces at work. In parallel with the increase in computing power, models have become more complex. Starting with Markov processes on a space with 4 states, and extending to Markov processes with 64 states, we are today studying models on spaces with 4^n (or 64^n) states, with n well above one hundred, say. For such models it is no longer possible to calculate the transition probabilities analytically, and often Markov chain Monte Carlo is used in connection with likelihood analysis. This is also the approach taken here.
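
    For orientation, here is a minimal example of the 4-state starting point mentioned above: the Jukes-Cantor rate matrix Q and its transition probabilities P(t) = exp(Qt). For context-dependent models with 4^n states this direct matrix exponential is exactly what becomes infeasible, motivating the MCMC-based likelihood analysis.

```python
# Minimal sketch of a 4-state DNA substitution model: the Jukes-Cantor (JC69)
# rate matrix Q and its transition probability matrix P(t) = expm(Q t).
import numpy as np
from scipy.linalg import expm

mu = 1.0                                            # overall substitution rate
# off-diagonal rates mu/4, diagonal -3mu/4, so each row sums to zero
Q = (mu / 4.0) * (np.ones((4, 4)) - 4.0 * np.eye(4))

for t in (0.1, 1.0, 10.0):
    P = expm(Q * t)
    # analytic JC69 check: P(stay) = 1/4 + (3/4) * exp(-mu * t)
    print(f"t={t:>4}: P(A->A)={P[0, 0]:.4f}  P(A->C)={P[0, 1]:.4f}")
```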

  6. Multi-step-prediction of chaotic time series based on co-evolutionary recurrent neural network

    International Nuclear Information System (INIS)

    Ma Qianli; Zheng Qilun; Peng Hong; Qin Jiangwei; Zhong Tanwei

    2008-01-01

    This paper proposes a co-evolutionary recurrent neural network (CERNN) for the multi-step prediction of chaotic time series; it estimates the proper parameters of phase space reconstruction and optimizes the structure of the recurrent neural network by a co-evolutionary strategy. The search space is separated into two subspaces and the individuals are trained in a parallel computational procedure. The method dynamically combines the embedding method with the capability of recurrent neural networks to incorporate past experience through internal recurrence. The effectiveness of CERNN is evaluated using three benchmark chaotic time series data sets: the Lorenz series, the Mackey-Glass series, and the real-world sunspot series. The simulation results show that CERNN improves the performance of multi-step prediction of chaotic time series.
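
    A small sketch of the phase-space reconstruction step whose parameters the CERNN estimates: a time-delay embedding of a scalar series with embedding dimension m and delay tau. Here m and tau are fixed by hand on a synthetic stand-in series; optimizing such parameters jointly with the network is precisely the paper's contribution.

```python
# Time-delay (Takens-style) embedding: turn a scalar series into vectors
# [x[i], x[i+tau], ..., x[i+(m-1)tau]] usable as predictor inputs.
import numpy as np

def delay_embed(x, m, tau):
    """Return a matrix whose row i is [x[i], x[i+tau], ..., x[i+(m-1)tau]]."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[j * tau:j * tau + n] for j in range(m)])

# noisy sine wave as a stand-in for a chaotic benchmark series
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
X = delay_embed(x, m=3, tau=10)
print(X.shape)   # (1980, 3): inputs for one-step or multi-step prediction
```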

  7. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    We are delighted to bring to the readers a set of peer-reviewed papers on evolutionary biology, published as a special issue of the Journal of Genetics. These papers emanated from ruminations upon and discussions at the Foundations of Evolutionary Theory: the Ongoing Synthesis meeting at Coorg, India, in February ...

  8. Toward a method for tracking virus evolutionary trajectory applied to the pandemic H1N1 2009 influenza virus.

    Science.gov (United States)

    Squires, R Burke; Pickett, Brett E; Das, Sajal; Scheuermann, Richard H

    2014-12-01

    In 2009 a novel pandemic H1N1 influenza virus (H1N1pdm09) emerged as the first official influenza pandemic of the 21st century. Early genomic sequence analysis pointed to the swine origin of the virus. Here we report a novel computational approach to determine the evolutionary trajectory of viral sequences that uses data-driven estimations of nucleotide substitution rates to track the gradual accumulation of observed sequence alterations over time. Phylogenetic analysis and multiple sequence alignments show that sequences belonging to the resulting evolutionary trajectory of the H1N1pdm09 lineage exhibit a gradual accumulation of sequence variations and tight temporal correlations in the topological structure of the phylogenetic trees. These results suggest that our evolutionary trajectory analysis (ETA) can more effectively pinpoint the evolutionary history of viruses, including the host and geographical location traversed by each segment, when compared against either BLAST or traditional phylogenetic analysis alone. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Multidimensional Space-Time Methodology for Development of Planetary and Space Sciences, S-T Data Management and S-T Computational Tomography

    Science.gov (United States)

    Andonov, Zdravko

    This R&D represents an innovative multidimensional 6D-N(6n)D Space-Time (S-T) methodology, with 6D-6nD coordinate systems, 6D equations, and a new 6D strategy and technology for the development of planetary and space sciences, S-T data management and S-T computational tomography. The methodology is relevant for brand new RS microwave satellites and computational tomography systems, aimed at defending the sustainable evolution of the Earth, Moon and Sun system. Especially important are innovations for the monitoring and protection of the strategic trilateral system H-OH-H2O (hydrogen, hydroxyl and water), corresponding to RS VHRS (Very High Resolution Systems) of 1.420-1.657-22.089 GHz microwaves. One of the greatest paradoxes and challenges of world science is the "transformation" of the J. L. Lagrange 4D Space-Time (S-T) system into the H. Minkowski 4D S-T system (O-X,Y,Z,icT) for Einstein's theory of relativity. As a global result, contemporary advanced space sciences have no adequate 4D-6D space-time coordinate system and no 6D advanced cosmos strategy and methodology for multidimensional and multitemporal space-time data management and tomography. That is one of the top current S-T problems. The discovery of a simple and optimal nD S-T methodology is extremely important for all universities' space science education programs, for advances in space research, and especially for the R&D of all young space scientists. The top ten 21st-century challenges ahead of planetary and space sciences, space data management and computational space tomography, important for the successful development of young scientist generations, are the following: 1. R&D of W. R. Hamilton's general idea for the transformation of all space sciences into time sciences, beginning with the 6D eikonal for 6D anisotropic media and velocities; development of the IERS Earth and space systems (VLBI, LLR, GPS, SLR, DORIS, etc.) for planetary-space data management and computational planetary and space tomography. 2. R&D of S. W. Hawking's paradigm for 2D

  10. Evolutionary multimodal optimization using the principle of locality

    KAUST Repository

    Wong, Kachun; Wu, Chunho; Mok, Ricky; Peng, Chengbin; Zhang, Zhaolei

    2012-01-01

    The principle of locality is one of the most widely used concepts in designing computing systems. To explore the principle in evolutionary computation, crowding differential evolution is incorporated with locality for multimodal optimization. Instead of generating trial vectors randomly, the first method proposed takes advantage of spatial locality to generate trial vectors. Temporal locality is also adopted to help generate offspring in the second method proposed. Temporal and spatial locality are then applied together in the third method proposed. Numerical experiments are conducted to compare the proposed methods with the state-of-the-art methods on benchmark functions. Experimental analysis is undertaken to observe the effect of locality and the synergy between temporal locality and spatial locality. Further experiments are also conducted on two application problems. One is the varied-line-spacing holographic grating design problem, while the other is the protein structure prediction problem. The numerical results demonstrate the effectiveness of the methods proposed. © 2012 Elsevier Inc. All rights reserved.
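
    The spatial and temporal locality mechanisms are the paper's contribution and are not reproduced here; as a baseline reference, the following is a sketch of plain crowding differential evolution, in which a trial vector competes against its nearest population member so that multiple optima can be retained.

```python
# Crowding differential evolution (the multimodal baseline the paper builds
# on): DE/rand/1/bin trial vectors replace their *nearest* population member
# if better, which preserves several optima simultaneously.
import numpy as np

def crowding_de(f, dim, n=50, gens=200, F=0.5, CR=0.9, bounds=(-2, 2), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (n, dim))
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(gens):
        for i in range(n):
            r1, r2, r3 = rng.choice(n, 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])      # DE/rand/1
            cross = rng.random(dim) < CR                    # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            ft = f(trial)
            # crowding: compete against the nearest individual, not a random one
            j = np.argmin(np.linalg.norm(pop - trial, axis=1))
            if ft < fit[j]:
                pop[j], fit[j] = trial, ft
    return pop, fit

# multimodal test: minima of sin(5x)^2 spread across the interval
pop, fit = crowding_de(lambda x: np.sin(5 * x[0]) ** 2, dim=1)
print(np.sort(np.round(pop[fit < 1e-3].ravel(), 2)))  # several distinct minima
```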

  12. Evolutionary-driven support vector machines for determining the degree of liver fibrosis in chronic hepatitis C.

    Science.gov (United States)

    Stoean, Ruxandra; Stoean, Catalin; Lupsor, Monica; Stefanescu, Horia; Badea, Radu

    2011-01-01

    Hepatic fibrosis, the principal indicator of the development of liver disease in chronic hepatitis C, can be measured in several stages. The correct evaluation of its degree, based on recent non-invasive procedures, is of major current concern. The latest methodology for assessing it is the Fibroscan, and the effect of its employment is impressive. However, the complex interaction between its stiffness indicator and the other biochemical and clinical examinations towards a respective degree of liver fibrosis is hard to discover manually. In this respect, the novel, well-performing evolutionary-powered support vector machines are proposed for automated learning of the relationship between medical attributes and fibrosis levels. Traditional support vector machines have often been chosen for addressing hepatic fibrosis, while the evolutionary option has been validated on many real-world tasks and has proven flexible and well-performing. The evolutionary approach is simple and direct, resulting from the hybridization of the learning component of support vector machines and the optimization engine of evolutionary algorithms. It discovers the optimal coefficients of the surfaces that separate instances of distinct classes. Apart from a detached manner of establishing the fibrosis degree for new cases, the resulting formula also offers insight into the correspondence between the medical factors and the respective outcome. What is more, a feature selection genetic algorithm can be further embedded into the method structure, in order to dynamically concentrate the search on only the most relevant attributes. The data set refers to 722 patients with chronic hepatitis C infection and 24 indicators. The five possible degrees of fibrosis range from F0 (no fibrosis) to F4 (cirrhosis). Since standard support vector machines are among the most frequently used methods in recent artificial intelligence studies for hepatic fibrosis staging, they serve as the natural baseline against which the evolutionary approach is compared.
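
    The paper's evolutionary-powered SVM evolves the separating-surface coefficients directly, which is not reproduced here; as a simpler illustration of hybridizing evolutionary search with SVM learning, the sketch below evolves the (C, gamma) hyperparameters of a standard SVM on synthetic five-class data standing in for the fibrosis stages F0-F4.

```python
# A simpler hybrid than the paper's ESVM: a small evolution strategy searching
# SVM hyperparameters (log2 C, log2 gamma), scored by cross-validation on
# synthetic stand-in data (not the 722-patient clinical set).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=24, n_informative=10,
                           n_classes=5, random_state=0)
rng = np.random.default_rng(0)
pop = rng.normal(0, 2, size=(20, 2))            # individuals: (log2 C, log2 gamma)

def score(ind):
    clf = SVC(C=2.0 ** ind[0], gamma=2.0 ** ind[1])
    return cross_val_score(clf, X, y, cv=3).mean()

for gen in range(15):
    fit = np.array([score(p) for p in pop])
    parents = pop[np.argsort(fit)[-5:]]          # keep the 5 best
    kids = parents[rng.integers(5, size=15)] + rng.normal(0, 0.5, (15, 2))
    pop = np.vstack([parents, kids])

best = pop[np.argmax([score(p) for p in pop])]
print("C=%.3g  gamma=%.3g" % (2.0 ** best[0], 2.0 ** best[1]))
```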

  13. Estimating the ratios of the stationary distribution values for Markov chains modeling evolutionary algorithms.

    Science.gov (United States)

    Mitavskiy, Boris; Cannings, Chris

    2009-01-01

    The evolutionary algorithm stochastic process is well-known to be Markovian. These have been under investigation in much of the theoretical evolutionary computing research. When the mutation rate is positive, the Markov chain modeling of an evolutionary algorithm is irreducible and, therefore, has a unique stationary distribution. Rather little is known about the stationary distribution. In fact, the only quantitative facts established so far tell us that the stationary distributions of Markov chains modeling evolutionary algorithms concentrate on uniform populations (i.e., those populations consisting of a repeated copy of the same individual). At the same time, knowing the stationary distribution may provide some information about the expected time it takes for the algorithm to reach a certain solution, assessment of the biases due to recombination and selection, and is of importance in population genetics to assess what is called a "genetic load" (see the introduction for more details). In the recent joint works of the first author, some bounds have been established on the rates at which the stationary distribution concentrates on the uniform populations. The primary tool used in these papers is the "quotient construction" method. It turns out that the quotient construction method can be exploited to derive much more informative bounds on ratios of the stationary distribution values of various subsets of the state space. In fact, some of the bounds obtained in the current work are expressed in terms of the parameters involved in all the three main stages of an evolutionary algorithm: namely, selection, recombination, and mutation.

  14. Parameterless evolutionary algorithm applied to the nuclear reload problem

    International Nuclear Information System (INIS)

    Caldas, Gustavo Henrique Flores; Schirru, Roberto

    2008-01-01

    In this work, a parameterless evolutionary algorithm called FPBIL (parameter-free PBIL) is developed from PBIL (population-based incremental learning). The analysis reveals how the parameters of PBIL can be replaced by self-adaptive mechanisms that emerge from the radically different way in which the evolution is processed. Despite its advantages, FPBIL remains compact and relatively modest in its use of computational resources. FPBIL is then applied to the nuclear reload problem. The experimental results are compared with those of other works and corroborate the superiority of the new algorithm.
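
    For context, here is a minimal sketch of the standard PBIL that FPBIL departs from, applied to a toy bit-string objective; the learning rate, mutation probability, and shift shown are exactly the hand-set parameters that FPBIL replaces with self-adaptive mechanisms.

```python
# Standard PBIL on a toy bit-string problem: sample a population from a
# probability vector, pull the vector toward the best sample, and apply a
# small random drift. FPBIL's parameter-free updates are not reproduced.
import numpy as np

def pbil(f, n_bits, pop=50, gens=100, lr=0.1, mut=0.02, shift=0.05, seed=0):
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)                        # probability vector
    for _ in range(gens):
        samples = rng.random((pop, n_bits)) < p
        best = samples[np.argmax([f(s) for s in samples])]
        p = (1 - lr) * p + lr * best                # learn toward the best
        drift = rng.random(n_bits) < mut            # occasional random drift
        p[drift] = (1 - shift) * p[drift] + shift * rng.random(drift.sum())
    return p

# maximize the number of ones (OneMax) as a stand-in for a reload objective
p = pbil(lambda s: s.sum(), n_bits=32)
print(np.round(p, 2))
```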

  15. The impact of computers on the nuclear utility industry

    International Nuclear Information System (INIS)

    Taylor, J.J.

    1984-01-01

    The applications of computer technology to the nuclear utility industry are discussed in light of recent phenomenal growth of computer hardware and software. Computer applications in existence in the power plants are presented, as well as potential future development for plant design, construction, operation, maintenance and retrofit. Utility concerns are addressed. The study concludes that the applications of computer technology to the nuclear utility industry are highly promising and evolutionary in nature

  16. Archaeogenetics in evolutionary medicine.

    Science.gov (United States)

    Bouwman, Abigail; Rühli, Frank

    2016-09-01

    Archaeogenetics is the study of ancient DNA (aDNA) more than 70 years old. It is an important part of wider studies of many different areas of our past, including animal, plant and pathogen evolution and domestication events. Here, we address specifically the impact of archaeogenetics research on the broader field of evolutionary medicine. Studies of ancient hominid genomes help us to understand even modern health patterns. Human genetic microevolution, e.g. related to the ability of post-weaning milk consumption, and specifically genetic adaptation in disease susceptibility, e.g. towards malaria and other infectious diseases, are of the utmost importance in the contributions of archaeogenetics to the evolutionary understanding of human health and disease. With the increase in both the understanding of modern medical genetics and the ability to deep-sequence ancient genetic information, the field of archaeogenetic evolutionary medicine is blossoming.

  17. Evolutionary relevance facilitates visual information processing.

    Science.gov (United States)

    Jackson, Russell E; Calvillo, Dusti P

    2013-11-03

    Visual search of the environment is a fundamental human behavior that is powerfully affected by perceptual load. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and that evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component of the evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  18. Is evolutionary psychology a metatheory for psychology? A discussion of four major issues in psychology from an evolutionary developmental perspective

    NARCIS (Netherlands)

    Ploeger, A.; van der Maas, H.L.J.; Raijmakers, M.E.J.

    2008-01-01

    Evolutionary psychology has been proposed as a metatheoretical framework for psychology. We argue that evolutionary psychology should be expanded if it is to offer new insights regarding the major issues in psychology. Evolutionary developmental biology can provide valuable new insights into these issues.

  19. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held at Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems like parameter settings for controlling processes in bioreactors and other processes, resource-constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, and localization of abrupt atmospheric contamination sources. It shows how to develop algorithms for these problems based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  20. 3rd International Conference on Frontiers of Intelligent Computing : Theory and Applications

    CERN Document Server

    Biswal, Bhabendra; Udgata, Siba; Mandal, JK

    2015-01-01

    Volume 1 contains 95 papers presented at FICTA 2014: Third International Conference on Frontiers in Intelligent Computing: Theory and Applications. The conference was held during 14-15 November 2014 at Bhubaneswar, Odisha, India. This volume contains papers mainly focused on Data Warehousing and Mining, Machine Learning, Mobile and Ubiquitous Computing, AI, E-commerce & Distributed Computing and Soft Computing, Evolutionary Computing, Bio-inspired Computing and its Applications.

  1. Species co-evolutionary algorithm: a novel evolutionary algorithm based on the ecology and environments for optimization

    DEFF Research Database (Denmark)

    Li, Wuzhao; Wang, Lei; Cai, Xingjuan

    2015-01-01

    In nature, species coexist and affect each other in many ways. The relationships include competition, predation, parasitism, mutualism and pythogenesis. In this paper, we consider these five relationships between solutions to propose a co-evolutionary algorithm termed the species co-evolutionary algorithm (SCEA). In SCEA, five operators...

  2. Remedial action assessment system (RAAS) - A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    Buelt, J.L.; Stottlemyre, J.A.; White, M.K.

    1991-01-01

    Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). One of the greatest attributes of the RAAS project is that the computer interface with the user is being designed to be friendly, intuitive, and interactive. Consequently, the user interface employs menus, windows, help features, and graphical information while RAAS is in operation. During operation, each technology process option is represented by an "object" module. Object-oriented programming is then used to link these unit processes into remedial alternatives. In this way, various object modules representing technology process options can communicate so that a linked set of compatible processes forms an appropriate remedial alternative. Once the remedial alternatives are formed, they can be evaluated in terms of effectiveness, implementability, and cost.

  3. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Today, organizations are exposed to a huge diversity of information and information assets produced in different systems such as KMS, financial and accounting systems, official and industrial automation systems, and so on, and the protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model mean that organizations have a strong incentive to implement cloud computing. Maintaining and managing information security are the main challenges in developing and accepting this model. In this paper, first, following the design science research methodology and compatible with the design process in information systems research, a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, an appropriate methodology is presented for directing an organization in architecting its information security in a cloud computing environment. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model were discussed and verified according to the Delphi method and expert comments.

  4. An Improved Evolutionary Programming with Voting and Elitist Dispersal Scheme

    Science.gov (United States)

    Maity, Sayan; Gunjan, Kumar; Das, Swagatam

    Although initially conceived for evolving finite state machines, Evolutionary Programming (EP), in its present form, is largely used as a powerful real-parameter optimizer. For function optimization, EP relies mainly on its mutation operators. Over the past few years several mutation operators have been proposed to improve the performance of EP on a wide variety of numerical benchmarks. However, unlike real-coded GAs, there has been no fitness-induced bias in parent selection for mutation in EP: the i-th population member is selected deterministically for mutation and creation of the i-th offspring in each generation. In this article we present an improved EP variant called Evolutionary Programming with Voting and Elitist Dispersal (EPVE). The scheme encompasses a voting process which not only gives importance to the best solutions but also considers those solutions which are converging fast. The Elitist Dispersal Scheme maintains elitism by keeping the potential solutions intact while other solutions are perturbed accordingly, so that they can escape local minima. By applying these two techniques we can explore regions which have not been explored so far and that may contain optima. Comparison with the recent and best-known versions of EP over 25 benchmark functions from the CEC (Congress on Evolutionary Computation) 2005 test suite for real-parameter optimization reflects the superiority of the new scheme in terms of final accuracy, speed, and robustness.
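
    For reference, here is a sketch of the classical self-adaptive EP baseline described above, with the deterministic parent-to-offspring mapping and q-opponent tournament survival; the voting and elitist dispersal mechanisms of EPVE are the paper's contribution and are not reproduced.

```python
# Classical self-adaptive EP: each parent i deterministically produces
# offspring i by Gaussian mutation with a self-adapted step size, and
# survivors are chosen by a q-opponent tournament over parents + offspring.
import numpy as np

def ep(f, dim, mu=30, gens=300, q=10, bounds=(-5, 5), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (mu, dim))
    eta = np.full((mu, dim), 0.5)                   # self-adapted step sizes
    tau, tau_p = 1 / np.sqrt(2 * np.sqrt(dim)), 1 / np.sqrt(2 * dim)
    for _ in range(gens):
        # no fitness-induced parent selection: parent i -> offspring i
        eta_c = eta * np.exp(tau_p * rng.normal(size=(mu, 1))
                             + tau * rng.normal(size=(mu, dim)))
        x_c = np.clip(x + eta_c * rng.normal(size=(mu, dim)), lo, hi)
        X = np.vstack([x, x_c]); E = np.vstack([eta, eta_c])
        fit = np.apply_along_axis(f, 1, X)
        # q-opponent tournament: count wins against random opponents
        wins = np.array([(fit[i] <= fit[rng.integers(2 * mu, size=q)]).sum()
                         for i in range(2 * mu)])
        keep = np.lexsort((fit, -wins))[:mu]        # most wins, ties by fitness
        x, eta = X[keep], E[keep]
    return x[np.argmin(np.apply_along_axis(f, 1, x))]

print(ep(lambda v: np.sum(v ** 2), dim=5))
```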

  5. Fixation Time for Evolutionary Graphs

    Science.gov (United States)

    Nie, Pu-Yan; Zhang, Pei-Ai

    Evolutionary graph theory (EGT) was proposed by Lieberman et al. in 2005. EGT has been successful in explaining biological evolution and some social phenomena. It is extremely important to consider the time to fixation in EGT for many practical problems, including evolutionary theory and the evolution of cooperation. This study characterizes the time to asymptotically reach fixation.
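
    A minimal simulation of the quantity in question, for the simplest case EGT generalizes (a well-mixed population, i.e. a complete graph): the Moran process with a single mutant of relative fitness r, estimating the mean fixation time conditional on fixation.

```python
# Moran process on a complete graph: one mutant of relative fitness r among
# N individuals; estimate the mean conditional fixation time by simulation.
import numpy as np

def moran_fixation_time(N=50, r=1.5, trials=500, seed=0):
    rng = np.random.default_rng(seed)
    times = []
    for _ in range(trials):
        k, t = 1, 0                        # k = current number of mutants
        while 0 < k < N:
            # reproducer chosen proportionally to fitness, victim uniformly
            birth_mut = rng.random() < r * k / (r * k + (N - k))
            death_mut = rng.random() < k / N
            k += (birth_mut and not death_mut) - (death_mut and not birth_mut)
            t += 1
        if k == N:                         # condition on fixation
            times.append(t)
    return np.mean(times), len(times) / trials

mean_t, fix_prob = moran_fixation_time()
# sanity check: fixation prob should approach (1 - 1/r) / (1 - r**-N) ~ 0.33
print(f"mean fixation time ~ {mean_t:.0f} steps, fixation prob ~ {fix_prob:.2f}")
```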

  6. Discovery radiomics via evolutionary deep radiomic sequencer discovery for pathologically proven lung cancer detection.

    Science.gov (United States)

    Shafiee, Mohammad Javad; Chung, Audrey G; Khalvati, Farzad; Haider, Masoom A; Wong, Alexander

    2017-10-01

    While lung cancer is the second most diagnosed form of cancer in men and women, a sufficiently early diagnosis can be pivotal in patient survival rates. Imaging-based, or radiomics-driven, detection methods have been developed to aid diagnosticians, but largely rely on hand-crafted features that may not fully encapsulate the differences between cancerous and healthy tissue. Recently, the concept of discovery radiomics was introduced, where custom abstract features are discovered from readily available imaging data. We propose an evolutionary deep radiomic sequencer discovery approach based on evolutionary deep intelligence. Motivated by patient privacy concerns and the idea of operational artificial intelligence, the evolutionary deep radiomic sequencer discovery approach organically evolves increasingly more efficient deep radiomic sequencers that produce significantly more compact yet similarly descriptive radiomic sequences over multiple generations. As a result, this framework improves operational efficiency and enables diagnosis to be run locally at the radiologist's computer while maintaining detection accuracy. We evaluated the evolved deep radiomic sequencer (EDRS) discovered via the proposed evolutionary deep radiomic sequencer discovery framework against state-of-the-art radiomics-driven and discovery radiomics methods using clinical lung CT data with pathologically proven diagnostic data from the LIDC-IDRI dataset. The EDRS shows improved sensitivity (93.42%), specificity (82.39%), and diagnostic accuracy (88.78%) relative to previous radiomics approaches.

  7. MEASURING THE EVOLUTIONARY RATE OF COOLING OF ZZ Ceti

    Energy Technology Data Exchange (ETDEWEB)

    Mukadam, Anjum S.; Fraser, Oliver; Riecken, T. S.; Kronberg, M. E. [Department of Astronomy, University of Washington, Seattle, WA 98195 (United States); Bischoff-Kim, Agnes [Georgia College and State University, Milledgeville, GA 31061 (United States); Corsico, A. H. [Facultad de Ciencias Astronomicas y Geofisicas, Universidad Nacional de La Plata (Argentina); Montgomery, M. H.; Winget, D. E.; Hermes, J. J.; Winget, K. I.; Falcon, Ross E.; Reaves, D. [Department of Astronomy, University of Texas at Austin, Austin, TX 78759 (United States); Kepler, S. O.; Romero, A. D. [Universidade Federal do Rio Grande do Sul, Porto Alegre 91501-970, RS (Brazil); Chandler, D. W. [Meyer Observatory, Central Texas Astronomical Society, 3409 Whispering Oaks, Temple, TX 76504 (United States); Kuehne, J. W. [McDonald Observatory, Fort Davis, TX 79734 (United States); Sullivan, D. J. [Victoria University of Wellington, P.O. Box 600, Wellington (New Zealand); Von Hippel, T. [Embry-Riddle Aeronautical University, 600 South Clyde Morris Boulevard, Daytona Beach, FL 32114 (United States); Mullally, F. [SETI Institute, NASA Ames Research Center, MS 244-30, Moffet Field, CA 94035 (United States); Shipman, H. [Delaware Asteroseismic Research Center, Mt. Cuba Observatory, Greenville, DE 19807 (United States); and others

    2013-07-01

    We have finally measured the evolutionary rate of cooling of the pulsating hydrogen atmosphere (DA) white dwarf ZZ Ceti (Ross 548), as reflected by the drift rate of the 213.13260694 s period. Using 41 yr of time-series photometry from 1970 November to 2012 January, we determine the rate of change of this period with time to be dP/dt = (5.2 ± 1.4) × 10^-15 s s^-1 employing the O - C method and (5.45 ± 0.79) × 10^-15 s s^-1 using a direct nonlinear least squares fit to the entire lightcurve. We adopt the dP/dt obtained from the nonlinear least squares program as our final determination, but augment the corresponding uncertainty to a more realistic value, ultimately arriving at the measurement of dP/dt = (5.5 ± 1.0) × 10^-15 s s^-1. After correcting for proper motion, the evolutionary rate of cooling of ZZ Ceti is computed to be (3.3 ± 1.1) × 10^-15 s s^-1. This value is consistent within uncertainties with the measurement of (4.19 ± 0.73) × 10^-15 s s^-1 for another similar pulsating DA white dwarf, G 117-B15A. Measuring the cooling rate of ZZ Ceti helps us refine our stellar structure and evolutionary models, as cooling depends mainly on the core composition and stellar mass. Calibrating white dwarf cooling curves with this measurement will reduce the theoretical uncertainties involved in white dwarf cosmochronometry. Should the 213.13 s period be trapped in the hydrogen envelope, then our determination of its drift rate compared to the expected evolutionary rate suggests an additional source of stellar cooling. Attributing the excess cooling to the emission of axions imposes a constraint on the mass of the hypothetical axion particle.
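
    A small synthetic illustration of the O - C technique mentioned: a slow drift dP/dt makes pulse arrival times run quadratically ahead of a constant-period ephemeris, O - C = (1/2)(dP/dt / P) t^2, so the quadratic coefficient of a fit to the O - C curve recovers dP/dt. The numbers below mimic the paper's values, but the data are simulated, not the Ross 548 photometry.

```python
# Recover dP/dt from a synthetic O - C curve: O - C = 0.5 * (dP/dt / P) * t^2.
import numpy as np

P = 213.13260694                   # s, pulsation period
pdot_true = 5.5e-15                # s/s, the drift to recover
yr = 3.156e7                       # seconds per year

rng = np.random.default_rng(0)
t_yr = np.linspace(0, 41, 200)     # idealized epochs over 41 years
t = t_yr * yr
oc = 0.5 * (pdot_true / P) * t**2 + rng.normal(0, 0.5, t.size)  # + timing noise

a2 = np.polyfit(t_yr, oc, 2)[0]    # quadratic coefficient, in s / yr^2
pdot_fit = 2 * a2 * P / yr**2      # invert a2 = 0.5 * (dP/dt / P) * yr^2
print(f"recovered dP/dt = {pdot_fit:.2e} s/s (true {pdot_true:.2e})")
```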

  8. New methodology for fast prediction of wheel wear evolution

    Science.gov (United States)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps. The first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time-integration calculations) by a quasi-static calculation (solving the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow the conclusion that the proposed methodology is valid for an arbitrary vehicle running on an arbitrary track layout.

  9. Evolutionary Developmental Soft Robotics As a Framework to Study Intelligence and Adaptive Behavior in Animals and Plants

    Directory of Open Access Journals (Sweden)

    Francesco Corucci

    2017-07-01

    In this paper, a comprehensive methodology and simulation framework is reviewed, designed to study the emergence of adaptive and intelligent behavior in generic soft-bodied creatures. By incorporating artificial evolutionary and developmental processes, the system makes it possible to evolve complete creatures (brain, body, developmental properties, sensory and control systems, etc.) for different task environments. Whether the evolved creatures will resemble animals or plants is in general not known a priori, and depends on the specific task environment set up by the experimenter. In this regard, the system may offer a unique opportunity to explore differences and similarities between these two worlds. Different material properties can be simulated and optimized, from a continuum of soft/stiff materials to the interconnection of heterogeneous structures, both found in animals and plants alike. The adopted genetic encoding and simulation environment are particularly suitable for evolving distributed sensory and control systems, which play a particularly important role in plants. After a general description of the system, some case studies are presented, focusing on the emergent properties of the evolved creatures. Particular emphasis is placed on some unifying concepts that are thought to play an important role in the emergence of intelligent and adaptive behavior across both the animal and plant kingdoms, such as morphological computation and morphological developmental plasticity. Overall, with this paper, we hope to draw attention to a set of tools, methodologies, ideas and results which may be relevant to researchers interested in plant-inspired robotics and intelligence.

  10. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
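
    A hedged sketch of the Monte Carlo structure described above (TURMIS itself is not reproduced here): sample the generation-transport-impact event chain and estimate the damage probability from the fraction of damaging trials; all probabilities are illustrative.

    import random

    def missile_generated():       # assumed per-trial generation model
        return random.random() < 1e-3

    def impacts_structure():       # assumed conditional transport/impact model
        return random.random() < 0.05

    def damages_structure():       # assumed conditional damage model
        return random.random() < 0.3

    def estimate_damage_probability(n_trials=1_000_000):
        hits = sum(
            1 for _ in range(n_trials)
            if missile_generated() and impacts_structure() and damages_structure()
        )
        return hits / n_trials

    print(estimate_damage_probability())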

  11. Deterministic network interdiction optimization via an evolutionary approach

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve deterministic network interdiction problems. The network interdiction problem considered is the minimization of the maximum flow that can be transmitted between a source node and a sink node for a fixed network design, when there is a limited amount of resources available to interdict network links. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with its interdiction can vary from link to link. For this problem, the solution approach developed is based on three steps that use: (1) Monte Carlo simulation, to generate potential network interdiction strategies; (2) the Ford-Fulkerson algorithm for maximum s-t flow, to analyze each strategy's maximum source-sink flow; and (3) an evolutionary optimization technique to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different network sizes and behaviors are used throughout the paper to illustrate the approach. In terms of computational effort, the results illustrate that solutions are obtained from a significantly restricted solution search space. Finally, the authors discuss the need for a reliability perspective on network interdiction, so that the solutions developed address more realistic scenarios of the problem.
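
    A hedged sketch of the three-step loop described above, with networkx providing the maximum s-t flow step; the network, sampling probabilities, update rule, and budget handling are all simplified illustrations, not the paper's algorithm.

    import random
    import networkx as nx

    def build_network():
        G = nx.DiGraph()
        edges = [("s", "a", 10), ("s", "b", 8), ("a", "t", 7), ("b", "t", 9), ("a", "b", 3)]
        for u, v, cap in edges:
            G.add_edge(u, v, capacity=cap)
        return G

    def sample_strategy(edges, probs, budget):
        """Step 1: Monte Carlo generation of an interdiction strategy."""
        chosen = [e for e in edges if random.random() < probs[e]]
        return chosen[:budget]  # crude enforcement of the resource limit

    G = build_network()
    edges = list(G.edges())
    probs = {e: 0.5 for e in edges}  # evolved in step 3 of the real method

    best_strategy, best_flow = None, float("inf")
    for _ in range(200):
        strategy = sample_strategy(edges, probs, budget=2)
        H = G.copy()
        H.remove_edges_from(strategy)
        flow = nx.maximum_flow_value(H, "s", "t")     # step 2: Ford-Fulkerson
        if flow < best_flow:
            best_strategy, best_flow = strategy, flow
        # Step 3 (evolutionary update, simplified): bias probabilities toward
        # links that appear in low-flow strategies.
        for e in edges:
            delta = 0.01 if e in strategy and flow <= best_flow else -0.005
            probs[e] = min(0.95, max(0.05, probs[e] + delta))

    print(best_strategy, best_flow)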

  12. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation is appealing owing to the broad availability of measured air temperatures. In this study, the potential of soft computing techniques is evaluated for estimating daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation of three methodologies is performed: adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly). Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop the ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. For all techniques, the highest accuracies are achieved by models (5), using Tmax - Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. These results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
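
    A hedged sketch of the SVR-rbf variant with input combination (5), using scikit-learn on synthetic stand-in data (the real study uses measured Iranian temperature records; all values here are illustrative).

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    t_max = rng.uniform(15, 40, 500)
    t_min = t_max - rng.uniform(5, 15, 500)
    # Synthetic DHGSR loosely tied to the diurnal range, as a stand-in target.
    dhgsr = 0.7 * (t_max - t_min) + 0.2 * t_max + rng.normal(0, 1.0, 500)

    X = np.column_stack([t_max - t_min, t_max])   # input combination (5) above
    X_tr, X_te, y_tr, y_te = train_test_split(X, dhgsr, random_state=0)

    model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = np.sqrt(np.mean((pred - y_te) ** 2))
    print(f"RMSE: {rmse:.3f} MJ/m^2 (synthetic data)")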

  13. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skill, and a greatly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, across both the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  14. Evolutionary Relevance Facilitates Visual Information Processing

    Directory of Open Access Journals (Sweden)

    Russell E. Jackson

    2013-07-01

    Full Text Available Visual search of the environment is a fundamental human behavior that perceptual load powerfully affects. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and that evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component of the evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the highest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.

  15. Quantum Genetic Algorithms for Computer Scientists

    OpenAIRE

    Lahoz Beltrá, Rafael

    2016-01-01

    Genetic algorithms (GAs) are a class of evolutionary algorithms inspired by Darwinian natural selection. They are popular heuristic optimisation methods based on simulated genetic mechanisms such as mutation and crossover, and on population dynamical processes such as reproduction and selection. Over the last decade, the possibility to emulate a quantum computer (a computer using quantum-mechanical phenomena to perform operations on data) has led to a new class of GAs known as “Quantum Geneti...
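
    A hedged sketch of the classical GA machinery the abstract starts from (mutation, crossover, selection on bit strings); quantum GAs replace the fixed bit strings with qubit-amplitude encodings. The objective and parameters here are illustrative.

    import random

    def fitness(bits):                       # toy objective: count ones
        return sum(bits)

    def mutate(bits, rate=0.01):
        return [b ^ (random.random() < rate) for b in bits]

    def crossover(a, b):
        cut = random.randrange(1, len(a))    # single-point crossover
        return a[:cut] + b[cut:]

    def evolve(pop_size=40, n_bits=32, generations=100):
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]   # truncation selection
            children = [
                mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(pop_size - len(parents))
            ]
            pop = parents + children
        return max(pop, key=fitness)

    print(fitness(evolve()))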

  16. Evolutionary games on graphs

    Science.gov (United States)

    Szabó, György; Fáth, Gábor

    2007-07-01

    Game theory is one of the key paradigms behind many scientific disciplines, from biology to the behavioral sciences to economics. In its evolutionary form, and especially when the interacting agents are linked in a specific social network, the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first four sections introduce the necessary background in classical and evolutionary game theory, from the basic definitions to the most important results. The fifth section surveys the topological complications implied by non-mean-field-type social network structures in general. The next three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long-term behavioral patterns emerging in evolutionary games.
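
    A hedged sketch of the kind of model the review analyzes: a spatial Prisoner's Dilemma on a lattice with a deterministic imitate-the-best update; the payoff values and update rule are illustrative choices, not the review's specific parameterizations.

    import numpy as np

    rng = np.random.default_rng(1)
    N, b = 50, 1.6                      # lattice size, temptation payoff
    strat = rng.integers(0, 2, (N, N))  # 1 = cooperate, 0 = defect

    def payoffs(s):
        total = np.zeros_like(s, dtype=float)
        for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = np.roll(s, shift, axis=(0, 1))
            # Weak PD payoffs: (C,C)=1, (C,D)=0, (D,C)=b, (D,D)=0
            total += np.where((s == 1) & (nb == 1), 1.0, 0.0)
            total += np.where((s == 0) & (nb == 1), b, 0.0)
        return total

    for _ in range(200):
        p = payoffs(strat)
        best, best_p = strat.copy(), p.copy()
        for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb_s = np.roll(strat, shift, axis=(0, 1))
            nb_p = np.roll(p, shift, axis=(0, 1))
            better = nb_p > best_p
            best = np.where(better, nb_s, best)
            best_p = np.where(better, nb_p, best_p)
        strat = best                     # imitate the best-scoring neighbor

    print("cooperator fraction:", strat.mean())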

  17. Minimizing the symbol-error-rate for amplify-and-forward relaying systems using evolutionary algorithms

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-02-01

    In this paper, a new detector is proposed for an amplify-and-forward (AF) relaying system. The detector is designed to minimize the symbol-error-rate (SER) of the system. The SER surface is non-linear and may have multiple minima; therefore, designing an SER detector for cooperative communications becomes an optimization problem. Evolutionary algorithms have the capability to find the global minimum; therefore, evolutionary algorithms such as particle swarm optimization (PSO) and differential evolution (DE) are exploited to solve this optimization problem. The performance of the proposed detectors is compared with conventional detectors such as the maximum likelihood (ML) and minimum mean square error (MMSE) detectors. In the simulation results, it can be observed that the SER performance of the proposed detectors is within 2 dB of the ML detector. A significant improvement in SER performance is also observed compared with the MMSE detector. The computational complexity of the proposed detector is much lower than that of the ML and MMSE algorithms. Moreover, in contrast to the ML and MMSE detectors, the computational complexity of the proposed detectors increases linearly with the number of relays.
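
    A hedged sketch of PSO, the kind of evolutionary search used above to find the global minimum of a non-linear, multi-minima surface; the Rastrigin function stands in for the actual SER objective, and all coefficients are illustrative.

    import numpy as np

    def objective(x):                     # stand-in for the SER surface
        return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

    rng = np.random.default_rng(0)
    n_particles, dim, iters = 30, 2, 200
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), objective(pos)
    gbest = pbest[np.argmin(pbest_f)]

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia plus cognitive and social attraction terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        f = objective(pos)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)]

    print("minimum found:", gbest, objective(gbest))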

  18. A Note on Evolutionary Algorithms and Its Applications

    Science.gov (United States)

    Bhargava, Shifali

    2013-01-01

    This paper introduces evolutionary algorithms and their applications in multi-objective optimization. Elitist and non-elitist multiobjective evolutionary algorithms are discussed, with their advantages and disadvantages. We also discuss constrained multiobjective evolutionary algorithms and their applications in various areas.

  19. Upon Accounting for the Impact of Isoenzyme Loss, Gene Deletion Costs Anticorrelate with Their Evolutionary Rates.

    Directory of Open Access Journals (Sweden)

    Christopher Jacobs

    Full Text Available System-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.

  20. Upon Accounting for the Impact of Isoenzyme Loss, Gene Deletion Costs Anticorrelate with Their Evolutionary Rates.

    Science.gov (United States)

    Jacobs, Christopher; Lambourne, Luke; Xia, Yu; Segrè, Daniel

    2017-01-01

    System-level metabolic network models enable the computation of growth and metabolic phenotypes from an organism's genome. In particular, flux balance approaches have been used to estimate the contribution of individual metabolic genes to organismal fitness, offering the opportunity to test whether such contributions carry information about the evolutionary pressure on the corresponding genes. Previous failure to identify the expected negative correlation between such computed gene-loss cost and sequence-derived evolutionary rates in Saccharomyces cerevisiae has been ascribed to a real biological gap between a gene's fitness contribution to an organism "here and now" and the same gene's historical importance as evidenced by its accumulated mutations over millions of years of evolution. Here we show that this negative correlation does exist, and can be exposed by revisiting a broadly employed assumption of flux balance models. In particular, we introduce a new metric that we call "function-loss cost", which estimates the cost of a gene loss event as the total potential functional impairment caused by that loss. This new metric displays significant negative correlation with evolutionary rate, across several thousand minimal environments. We demonstrate that the improvement gained using function-loss cost over gene-loss cost is explained by replacing the base assumption that isoenzymes provide unlimited capacity for backup with the assumption that isoenzymes are completely non-redundant. We further show that this change of the assumption regarding isoenzymes increases the recall of epistatic interactions predicted by the flux balance model at the cost of a reduction in the precision of the predictions. In addition to suggesting that the gene-to-reaction mapping in genome-scale flux balance models should be used with caution, our analysis provides new evidence that evolutionary gene importance captures much more than strict essentiality.
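
    A hedged toy illustration of the contrast between gene-loss and function-loss cost (not the authors' pipeline, which uses genome-scale models): a three-reaction flux balance model solved with scipy, where reaction r1 is backed by an isozyme.

    import numpy as np
    from scipy.optimize import linprog

    # Toy network: r_in produces A, r1 converts A -> B, r_bio consumes B.
    # Rows: metabolites A, B; columns: r_in, r1, r_bio.
    S = np.array([
        [1.0, -1.0,  0.0],
        [0.0,  1.0, -1.0],
    ])
    c = np.array([0.0, 0.0, -1.0])     # maximize biomass flux (minimize -v_bio)

    def max_biomass(ub):
        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=[(0.0, u) for u in ub])
        return -res.fun

    wild_type = max_biomass([10.0, 10.0, 10.0])

    # Reaction r1 is catalyzed by gene g OR an isozyme (GPR rule "g or h").
    # Gene-loss cost: isozymes are assumed to provide unlimited backup, so
    # deleting g leaves r1 open and the cost is zero.
    gene_loss = wild_type - max_biomass([10.0, 10.0, 10.0])

    # Function-loss cost: isozymes are treated as completely non-redundant, so
    # the full functional capacity mapped to g (reaction r1) counts as impaired.
    function_loss = wild_type - max_biomass([10.0, 0.0, 10.0])

    print(gene_loss, function_loss)    # 0.0 vs 10.0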

  1. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    Progress report on steganography using the least significant bit (LSB) methodology. In computer science, steganography is the science of concealing information within other data. The report references the J. Fridrich, M. Goljan, and R. Du paper titled "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...).
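
    A hedged sketch of the LSB technique itself (not code from the report): message bits overwrite the least significant bit of each pixel byte, where the visual change is smallest; the cover data below is a synthetic stand-in for grayscale pixels.

    def embed(pixels: bytes, message: bytes) -> bytes:
        bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
        assert len(bits) <= len(pixels), "cover image too small"
        out = bytearray(pixels)
        for i, bit in enumerate(bits):
            out[i] = (out[i] & 0xFE) | bit      # overwrite the LSB only
        return bytes(out)

    def extract(pixels: bytes, n_bytes: int) -> bytes:
        bits = [p & 1 for p in pixels[: n_bytes * 8]]
        return bytes(
            sum(bit << (7 - i) for i, bit in enumerate(bits[k : k + 8]))
            for k in range(0, len(bits), 8)
        )

    cover = bytes(range(256))                   # stand-in for grayscale pixel data
    stego = embed(cover, b"hi")
    print(extract(stego, 2))                    # b'hi'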

  2. Neuroscience, brains, and computers

    Directory of Open Access Journals (Sweden)

    Giorno Maria Innocenti

    2013-07-01

    Full Text Available This paper addresses the role of the neurosciences in establishing what the brain is and how states of the brain relate to states of the mind. The brain is viewed as a computational device performing operations on symbols. However, the brain is a special-purpose computational device designed by evolution and development for survival and reproduction, in close interaction with the environment. The hardware of the brain (its structure) is very different from that of man-made computers. The computational style of the brain is also very different from traditional computers: the computational algorithms, instead of being sets of external instructions, are embedded in brain structure. Concerning the relationships between brain and mind, a number of questions lie ahead. One of them is why and how only the human brain grasped the notion of God, probably only at the evolutionary stage attained by Homo sapiens.

  3. Diversity-Guided Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    2002-01-01

    Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few... algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results...
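
    A hedged sketch of the phase-switching mechanism described above: the distance-to-average-point measure drives alternation between exploration (mutation) and exploitation (recombination and selection); the thresholds here are illustrative, not those of the DGEA paper.

    import numpy as np

    def diversity(pop, search_range):
        """Distance-to-average-point measure, normalized by the search range."""
        centroid = pop.mean(axis=0)
        return np.mean(np.linalg.norm(pop - centroid, axis=1)) / search_range

    def next_mode(current_mode, d, d_low=0.05, d_high=0.25):
        if d < d_low:
            return "explore"     # diversity collapsed: apply mutation only
        if d > d_high:
            return "exploit"     # diversity restored: recombination + selection
        return current_mode      # hysteresis band: keep the current phase

    pop = np.random.default_rng(0).uniform(-5.0, 5.0, (20, 3))
    d = diversity(pop, search_range=10.0)
    print(d, next_mode("exploit", d))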

  4. EVALUATION OF THE GRAI INTEGRATED METHODOLOGY AND THE IMAGIM SUPPORTWARE

    Directory of Open Access Journals (Sweden)

    J.M.C. Reid

    2012-01-01

    Full Text Available This paper describes the GRAI Integrated Methodology and identifies the need for computer tools to support enterprise modelling, design and integration. The IMAGIM tool is then evaluated in terms of its ability to support the GRAI Integrated Methodology. The GRAI Integrated Methodology is an enterprise integration methodology developed to support the design of CIM systems; it consists of the GRAI model and a structured approach. The latest addition to the methodology is the IMAGIM software tool, developed by the GRAI research group for the specific purpose of supporting the methodology.

  5. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling follows from current trends toward strengthening the general-education and worldview functions of computer science, which define the necessity of additional research of the…

  6. Methodology is more than research design and technology.

    Science.gov (United States)

    Proctor, Robert W

    2005-05-01

    The Society for Computers in Psychology has been at the forefront of disseminating information about advances in computer technology and their applications for psychologists. Although technological advances, as well as clean research designs, are key contributors to progress in psychological research, the justification of methodological rules for interpreting data and making theory choices is at least as important. Historically, methodological beliefs and practices have been justified through intuition and logic, an approach known as foundationism. However, naturalism, a modern approach in the philosophy of science inspired by the work of Thomas S. Kuhn, indicates that all aspects of scientific practice, including its methodology, should be evaluated empirically. This article examines implications of the naturalistic approach for psychological research methods in general and for the current debate that is often framed as one of qualitative versus quantitative methods.

  7. Evolutionary analyses of non-genealogical bonds produced by introgressive descent.

    Science.gov (United States)

    Bapteste, Eric; Lopez, Philippe; Bouchard, Frédéric; Baquero, Fernando; McInerney, James O; Burian, Richard M

    2012-11-06

    All evolutionary biologists are familiar with evolutionary units that evolve by vertical descent in a tree-like fashion in single lineages. However, many other kinds of processes contribute to evolutionary diversity. In vertical descent, the genetic material of a particular evolutionary unit is propagated by replication inside its own lineage. In what we call introgressive descent, the genetic material of a particular evolutionary unit propagates into different host structures and is replicated within these host structures. Thus, introgressive descent generates a variety of evolutionary units and leaves recognizable patterns in resemblance networks. We characterize six kinds of evolutionary units, of which five involve mosaic lineages generated by introgressive descent. To facilitate detection of these units in resemblance networks, we introduce terminology based on two notions, P3s (subgraphs of three nodes: A, B, and C) and mosaic P3s, and suggest an apparatus for systematic detection of introgressive descent. Mosaic P3s correspond to a distinct type of evolutionary bond that is orthogonal to the bonds of kinship and genealogy usually examined by evolutionary biologists. We argue that recognition of these evolutionary bonds stimulates radical rethinking of key questions in evolutionary biology (e.g., the relations among evolutionary players in very early phases of evolutionary history, the origin and emergence of novelties, and the production of new lineages). This line of research will expand the study of biological complexity beyond the usual genealogical bonds, revealing additional sources of biodiversity. It provides an important step to a more realistic pluralist treatment of evolutionary complexity.
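
    A hedged sketch of the P3 notion defined above, using networkx on a toy resemblance network (the node names are invented): a P3 is a three-node path A-B-C whose endpoints are not directly connected.

    import networkx as nx
    from itertools import combinations

    G = nx.Graph()
    G.add_edges_from([("geneA", "plasmid1"), ("plasmid1", "geneC"),
                      ("geneA", "geneD"), ("geneD", "geneC")])

    def p3s(graph):
        """Yield (A, B, C) where A-B and B-C are edges but A-C is not."""
        for b in graph:
            for a, c in combinations(graph[b], 2):
                if not graph.has_edge(a, c):
                    yield (a, b, c)

    print(list(p3s(G)))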

  8. Computer Class Role Playing Games, an innovative teaching methodology based on STEM and ICT: first experimental results

    Science.gov (United States)

    Maraffi, S.

    2016-12-01

    Context/Purpose: We experimented with a new teaching and learning technology: a Computer Class Role Playing Game (RPG) that delivers educational activity in classrooms through an interactive game. This approach is new; there is some experience with educational games, but mainly individual rather than class-based. Gaming together as a class, with a single goal for the whole class, enhances peer collaboration, cooperative problem solving and friendship. Methods: To perform the research we tested the games in several classes at different grade levels, collecting dedicated questionnaires from teachers and pupils. Results: Experimental results were outstanding: the RPG, our interactive activity, exceeded traditional lessons or PowerPoint-supported teaching by 50% in overall satisfaction. Interpretation: The appreciation of the RPG was in agreement with the class-level outcome identified by the teacher after the experimentation. Our work received excellent feedback from teachers on the efficacy of this new teaching methodology and on the results achieved. Using a methodology closer to the student's point of view improves the innovation and creative capacities of learners, and it supports the new role of the teacher as the learners' "coach". Conclusion: This paper presents the first experimental results of this new technology, based on a computer game that projects onto a wall in the class an adventure lived by the students. The plots of the adventures are designed for deeper learning of Science, Technology, Engineering, Mathematics (STEM) and Social Sciences & Humanities (SSH). The pupils participate by interacting with the game through their own tablets or smartphones. The game is based on a mixed-reality learning environment, giving the students the feeling of being IN the adventure.

  9. Research Methodology in Global Strategy Research

    DEFF Research Database (Denmark)

    Cuervo-Cazurra, Alvaro; Mudambi, Ram; Pedersen, Torben

    2017-01-01

    We review advances in research methodology used in global strategy research and provide suggestions on how researchers can improve their analyses and arguments. Methodological advances in the extraction of information, such as computer-aided text analysis, and in the analysis of datasets, such as differences-in-differences and propensity score matching, have helped deal with challenges (e.g., endogeneity and causality) that bedeviled earlier studies and resulted in conflicting findings. These methodological advances need to be considered as tools that complement theoretical arguments and well-explained logics and mechanisms, so that researchers can provide better and more relevant recommendations to managers designing the global strategies of their organizations.

  10. Evolutionary perspectives on ageing.

    Science.gov (United States)

    Reichard, Martin

    2017-10-01

    From an evolutionary perspective, ageing is a decrease in fitness with chronological age - expressed by an increase in mortality risk and/or decline in reproductive success and mediated by deterioration of functional performance. While this makes ageing intuitively paradoxical - detrimental to individual fitness - evolutionary theory offers answers as to why ageing has evolved. In this review, I first briefly examine the classic evolutionary theories of ageing and their empirical tests, and highlight recent findings that have advanced our understanding of the evolution of ageing (condition-dependent survival, positive pleiotropy). I then provide an overview of recent theoretical extensions and modifications that accommodate those new discoveries. I discuss the role of indeterminate (asymptotic) growth for lifetime increases in fecundity and ageing trajectories. I outline alternative views that challenge a universal existence of senescence - namely the lack of a germ-soma distinction and the ability of tissue replacement and retrogression to younger developmental stages in modular organisms. I argue that rejuvenation at the organismal level is plausible, but includes a return to a simple developmental stage. This may exempt a particular genotype from somatic defects but, correspondingly, removes any information acquired during development. A resolution of the question of whether a rejuvenated individual is the same entity is central to the recognition of whether current evolutionary theories of ageing, with their extensions and modifications, can explain the patterns of ageing across the Tree of Life. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Conjugate gradient based projection - A new explicit methodology for frictional contact

    Science.gov (United States)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention towards applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementarity formulations, involving a conjugate gradient based projection methodology, is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementarity formulations stems from an established search direction which is projected onto a feasible region determined by the non-negativity constraint; this direction is then applied in the Fletcher-Reeves conjugate gradient method, resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
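
    A hedged, simplified sketch (projected gradient rather than the paper's Fletcher-Reeves variant) of the core idea: iterate on the linear complementarity problem w = Mz + q, z >= 0, w >= 0, z.w = 0, projecting each iterate onto the non-negative region; the matrices are illustrative.

    import numpy as np

    def projected_gradient_lcp(M, q, steps=500, lr=0.05):
        z = np.zeros_like(q)
        for _ in range(steps):
            grad = M @ z + q                      # gradient of 0.5 z'Mz + q'z (M symmetric)
            z = np.maximum(z - lr * grad, 0.0)    # project onto the feasible region
        return z

    M = np.array([[2.0, 0.5], [0.5, 1.5]])        # SPD matrix, illustrative
    q = np.array([-1.0, 0.5])
    z = projected_gradient_lcp(M, q)
    print("z =", z, "w =", M @ z + q)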

  12. Methodology for the hybrid solution of systems of differential equations

    International Nuclear Information System (INIS)

    Larrinaga, E.F.; Lopez, M.A.

    1993-01-01

    This work presents a general methodology for solving systems of differential equations on hybrid computers. Based on this methodology, a mathematical model was elaborated; it offers wide possibilities for recording and handling results using the IBM-VIDAC 1224 hybrid system available at the ISCTN. The work also presents the results obtained when simulating a simple model of a nuclear reactor, which were used to validate the computational model

  13. Interpreting Evolutionary Diagrams: When Topology and Process Conflict

    Science.gov (United States)

    Catley, Kefyn M.; Novick, Laura R.; Shade, Courtney K.

    2010-01-01

    The authors argue that some diagrams in biology textbooks and the popular press presented as depicting evolutionary relationships suggest an inappropriate (anagenic) conception of evolutionary history. The goal of this research was to provide baseline data that begin to document how college students conceptualize the evolutionary relationships…

  14. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry

    International Nuclear Information System (INIS)

    Brady, S. L.; Kaufman, R. A.

    2012-01-01

    Purpose: The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ∼25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. Methods: The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Results: Calibration precision was measured to be better than 5%-7%, 3%-5%, and 2%-4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated between precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology, 26.7 ± 1.1 mV cGy(-1), versus the CT scatter phantom, 29.2 ± 1.0 mV cGy(-1), and FIA with x-ray, 29.9 ± 1.1 mV cGy(-1), methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ∼3000 mV. Conclusions: The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the
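
    As a small worked example of how such calibration coefficients are applied (the readout change below is hypothetical), the absorbed dose is recovered by dividing the MOSFET readout voltage change by the coefficient in mV per cGy:

    fia_ct_coeff = 26.7      # mV/cGy, FIA with CT methodology (from the abstract)
    phantom_coeff = 29.2     # mV/cGy, CT scatter-phantom methodology

    delta_v_mV = 62.0        # hypothetical MOSFET readout change for one scan

    for name, coeff in [("FIA with CT", fia_ct_coeff), ("scatter phantom", phantom_coeff)]:
        dose_cGy = delta_v_mV / coeff
        print(f"{name}: {dose_cGy:.2f} cGy")

    # The ~9% spread between the two coefficients propagates directly into the
    # reported dose, which is why the choice of calibration methodology matters.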

  15. Establishing a standard calibration methodology for MOSFET detectors in computed tomography dosimetry.

    Science.gov (United States)

    Brady, S L; Kaufman, R A

    2012-06-01

    The use of metal-oxide-semiconductor field-effect transistor (MOSFET) detectors for patient dosimetry has increased by ~25% since 2005. Despite this increase, no standard calibration methodology has been identified nor calibration uncertainty quantified for the use of MOSFET dosimetry in CT. This work compares three MOSFET calibration methodologies proposed in the literature, and additionally investigates questions relating to optimal time for signal equilibration and exposure levels for maximum calibration precision. The calibration methodologies tested were (1) free in-air (FIA) with radiographic x-ray tube, (2) FIA with stationary CT x-ray tube, and (3) within scatter phantom with rotational CT x-ray tube. Each calibration was performed at absorbed dose levels of 10, 23, and 35 mGy. Times of 0 min or 5 min were investigated for signal equilibration before or after signal read out. Calibration precision was measured to be better than 5%-7%, 3%-5%, and 2%-4% for the 10, 23, and 35 mGy respective dose levels, and independent of calibration methodology. No correlation was demonstrated for precision and signal equilibration time when allowing 5 min before or after signal read out. Differences in average calibration coefficients were demonstrated between the FIA with CT calibration methodology 26.7 ± 1.1 mV cGy(-1) versus the CT scatter phantom 29.2 ± 1.0 mV cGy(-1) and FIA with x-ray 29.9 ± 1.1 mV cGy(-1) methodologies. A decrease in MOSFET sensitivity was seen at an average change in read out voltage of ~3000 mV. The best measured calibration precision was obtained by exposing the MOSFET detectors to 23 mGy. No signal equilibration time is necessary to improve calibration precision. A significant difference between calibration outcomes was demonstrated for FIA with CT compared to the other two methodologies. If the FIA with a CT calibration methodology was used to create calibration coefficients for the eventual use for phantom dosimetry, a measurement error ~12

  16. Computer-Automated Evolution of Spacecraft X-Band Antennas

    Science.gov (United States)

    Lohn, Jason D.; Homby, Gregory S.; Linden, Derek S.

    2010-01-01

    A document discusses the use of computer-aided evolution in arriving at a design for X-band communication antennas for NASA's three Space Technology 5 (ST5) satellites, which were launched on March 22, 2006. Two evolutionary algorithms, incorporating different representations of the antenna design and different fitness functions, were used to automatically design and optimize an X-band antenna design. A set of antenna designs satisfying initial ST5 mission requirements was evolved by use of these algorithms. The two best antennas - one from each evolutionary algorithm - were built. During flight-qualification testing of these antennas, the mission requirements were changed. After minimal changes in the evolutionary algorithms - mostly in the fitness functions - new antenna designs satisfying the changed mission requirements were evolved; within one month of the change, two new antennas were designed, and prototypes were built and tested. One of these newly evolved antennas was approved for deployment on the ST5 mission, and flight-qualified versions of this design were built and installed on the spacecraft. At the time of writing, these antennas were the first computer-evolved hardware in outer space.

  17. Using Evolutionary Theory to Guide Mental Health Research.

    Science.gov (United States)

    Durisko, Zachary; Mulsant, Benoit H; McKenzie, Kwame; Andrews, Paul W

    2016-03-01

    Evolutionary approaches to medicine can shed light on the origins and etiology of disease. Such an approach may be especially useful in psychiatry, which frequently addresses conditions with heterogeneous presentation and unknown causes. We review several previous applications of evolutionary theory that highlight the ways in which psychiatric conditions may persist despite and because of natural selection. One lesson from the evolutionary approach is that some conditions currently classified as disorders (because they cause distress and impairment) may actually be caused by functioning adaptations operating "normally" (as designed by natural selection). Such conditions suggest an alternative illness model that may generate alternative intervention strategies. Thus, the evolutionary approach suggests that psychiatry should sometimes think differently about distress and impairment. The complexity of the human brain, including normal functioning and potential for dysfunctions, has developed over evolutionary time and has been shaped by natural selection. Understanding the evolutionary origins of psychiatric conditions is therefore a crucial component to a complete understanding of etiology. © The Author(s) 2016.

  18. Evolutionary engineering of industrial microorganisms-strategies and applications.

    Science.gov (United States)

    Zhu, Zhengming; Zhang, Juan; Ji, Xiaomei; Fang, Zhen; Wu, Zhimeng; Chen, Jian; Du, Guocheng

    2018-06-01

    Microbial cells have been widely used in industry to obtain various biochemical products, and evolutionary engineering is a common method in biological research for improving their traits, such as high environmental tolerance and improved product yield. To obtain better-integrated functions of microbial cells, evolutionary engineering combined with other biotechnologies has attracted more attention in recent years. Classical laboratory evolution has been proven effective in allowing more beneficial mutations to occur in different genes, but it also has some inherent limitations, such as a long evolutionary period and uncontrolled mutation frequencies. However, recent studies have shown that some new strategies may gradually overcome these limitations. In this review, we summarize the evolutionary strategies commonly used in industrial microorganisms and discuss the combination of evolutionary engineering with other biotechnologies such as systems biology and inverse metabolic engineering. Finally, we discuss the importance and application prospects of evolutionary engineering as a powerful tool, especially in the optimization of industrial microbial cell factories.

  19. Democratizing evolutionary biology, lessons from insects

    DEFF Research Database (Denmark)

    Dunn, Robert Roberdeau; Beasley, DeAnna E.

    2016-01-01

    The engagement of the public in the scientific process is an old practice. Yet with recent advances in technology, the role of the citizen scientist in studying evolutionary processes has increased. Insects provide ideal models for understanding these evolutionary processes at large scales. This ...

  20. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering the effect metrics of a component provides the hierarchy of processes in that component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
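
    A hedged numeric sketch of the FSM bookkeeping described above: each process receives an effect metric Ω = ωt, where ω is the transport rate divided by the content of the preserved quantity, and processes are ranked by the magnitude of Ω; all process names, rates and contents below are illustrative.

    processes = {
        # name: (transport rate [kg/s], content [kg], characteristic time [s])
        "break_flow":    (50.0, 2000.0, 100.0),
        "ECC_injection": (30.0, 2000.0, 100.0),
        "wall_heat":     (5.0,  2000.0, 100.0),
    }

    omegas = {name: rate / content for name, (rate, content, _) in processes.items()}
    effects = {name: omegas[name] * t for name, (_, _, t) in processes.items()}

    for name, Omega in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:14s} Omega = {Omega:.3f}")   # dominant processes rank first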

  1. Evolutionary theory and the naturalist fallacy

    DEFF Research Database (Denmark)

    Grodal, Torben Kragh

    2008-01-01

    The article is an invited response to a target article by Joseph Carroll entitled "An evolutionary paradigm for literary study". It argues that the target article misuses the fact that works of art are based on adaptations that were fitness-enhancing in the era of evolutionary adaptation to claim that great works of art are also automatically fitness-enhancing in the present-day environment, and that there are simple correlations between whether a work of art has a high aesthetic value and whether it is fitness-enhancing or not. Keywords: evolutionary aesthetics, film theory, literary theory

  2. Evolutionary public health: introducing the concept.

    Science.gov (United States)

    Wells, Jonathan C K; Nesse, Randolph M; Sear, Rebecca; Johnstone, Rufus A; Stearns, Stephen C

    2017-07-29

    The emerging discipline of evolutionary medicine is breaking new ground in understanding why people become ill. However, the value of evolutionary analyses of human physiology and behaviour is only beginning to be recognised in the field of public health. Core principles come from life history theory, which analyses the allocation of finite amounts of energy between four competing functions-maintenance, growth, reproduction, and defence. A central tenet of evolutionary theory is that organisms are selected to allocate energy and time to maximise reproductive success, rather than health or longevity. Ecological interactions that influence mortality risk, nutrient availability, and pathogen burden shape energy allocation strategies throughout the life course, thereby affecting diverse health outcomes. Public health interventions could improve their own effectiveness by incorporating an evolutionary perspective. In particular, evolutionary approaches offer new opportunities to address the complex challenges of global health, in which populations are differentially exposed to the metabolic consequences of poverty, high fertility, infectious diseases, and rapid changes in nutrition and lifestyle. The effect of specific interventions is predicted to depend on broader factors shaping life expectancy. Among the important tools in this approach are mathematical models, which can explore probable benefits and limitations of interventions in silico, before their implementation in human populations. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. A teleofunctional account of evolutionary mismatch.

    Science.gov (United States)

    Cofnas, Nathan

    When the environment in which an organism lives deviates in some essential way from that to which it is adapted, this is described as "evolutionary mismatch," or "evolutionary novelty." The notion of mismatch plays an important role, explicitly or implicitly, in evolution-informed cognitive psychology, clinical psychology, and medicine. The evolutionary novelty of our contemporary environment is thought to have significant implications for our health and well-being. However, scientists have generally been working without a clear definition of mismatch. This paper defines mismatch as deviations in the environment that render biological traits unable, or impaired in their ability, to produce their selected effects (i.e., to perform their proper functions in Neander's sense). The machinery developed by Millikan in connection with her account of proper function, and with her related teleosemantic account of representation, is used to identify four major types, and several subtypes, of evolutionary mismatch. While the taxonomy offered here does not in itself resolve any scientific debates, the hope is that it can be used to better formulate empirical hypotheses concerning the effects of mismatch. To illustrate, it is used to show that the controversial hypothesis that general intelligence evolved as an adaptation to handle evolutionary novelty can, contra some critics, be formulated in a conceptually coherent way.

  4. The evolutionary ecology of molecular replicators.

    Science.gov (United States)

    Nee, Sean

    2016-08-01

    By reasonable criteria, life on the Earth consists mainly of molecular replicators. These include viruses, transposons, transpovirons, coviruses and many more, with continuous new discoveries like Sputnik Virophage. Their study is inherently multidisciplinary, spanning microbiology, genetics, immunology and evolutionary theory, and the current view is that taking a unified approach has great power and promise. We support this with a new, unified, model of their evolutionary ecology, using contemporary evolutionary theory coupling the Price equation with game theory, studying the consequences of the molecular replicators' promiscuous use of each others' gene products for their natural history and evolutionary ecology. Even at this simple expository level, we can make a firm prediction of a new class of replicators exploiting viruses such as lentiviruses like SIVs, a family which includes HIV: these have been explicitly stated in the primary literature to be non-existent. Closely connected to this departure is the view that multicellular organism immunology is more about the management of chronic infections rather than the elimination of acute ones and new understandings emerging are changing our view of the kind of theatre we ourselves provide for the evolutionary play of molecular replicators. This study adds molecular replicators to bacteria in the emerging field of sociomicrobiology.

  5. Evolutionary heritage influences Amazon tree ecology

    Science.gov (United States)

    Coelho de Souza, Fernanda; Dexter, Kyle G.; Phillips, Oliver L.; Brienen, Roel J. W.; Chave, Jerome; Galbraith, David R.; Lopez Gonzalez, Gabriela; Monteagudo Mendoza, Abel; Pennington, R. Toby; Poorter, Lourens; Alexiades, Miguel; Álvarez-Dávila, Esteban; Andrade, Ana; Aragão, Luis E. O. C.; Araujo-Murakami, Alejandro; Arets, Eric J. M. M.; Aymard C, Gerardo A.; Baraloto, Christopher; Barroso, Jorcely G.; Bonal, Damien; Boot, Rene G. A.; Camargo, José L. C.; Comiskey, James A.; Valverde, Fernando Cornejo; de Camargo, Plínio B.; Di Fiore, Anthony; Erwin, Terry L.; Feldpausch, Ted R.; Ferreira, Leandro; Fyllas, Nikolaos M.; Gloor, Emanuel; Herault, Bruno; Herrera, Rafael; Higuchi, Niro; Honorio Coronado, Eurídice N.; Killeen, Timothy J.; Laurance, William F.; Laurance, Susan; Lloyd, Jon; Lovejoy, Thomas E.; Malhi, Yadvinder; Maracahipes, Leandro; Marimon, Beatriz S.; Marimon-Junior, Ben H.; Mendoza, Casimiro; Morandi, Paulo; Neill, David A.; Vargas, Percy Núñez; Oliveira, Edmar A.; Lenza, Eddie; Palacios, Walter A.; Peñuela-Mora, Maria C.; Pipoly, John J.; Pitman, Nigel C. A.; Prieto, Adriana; Quesada, Carlos A.; Ramirez-Angulo, Hirma; Rudas, Agustin; Ruokolainen, Kalle; Salomão, Rafael P.; Silveira, Marcos; ter Steege, Hans; Thomas-Caesar, Raquel; van der Hout, Peter; van der Heijden, Geertje M. F.; van der Meer, Peter J.; Vasquez, Rodolfo V.; Vieira, Simone A.; Vilanova, Emilio; Vos, Vincent A.; Wang, Ophelia; Young, Kenneth R.; Zagt, Roderick J.; Baker, Timothy R.

    2016-01-01

    Lineages tend to retain ecological characteristics of their ancestors through time. However, for some traits, selection during evolutionary history may have also played a role in determining trait values. To address the relative importance of these processes requires large-scale quantification of traits and evolutionary relationships among species. The Amazonian tree flora comprises a high diversity of angiosperm lineages and species with widely differing life-history characteristics, providing an excellent system to investigate the combined influences of evolutionary heritage and selection in determining trait variation. We used trait data related to the major axes of life-history variation among tropical trees (e.g. growth and mortality rates) from 577 inventory plots in closed-canopy forest, mapped onto a phylogenetic hypothesis spanning more than 300 genera including all major angiosperm clades to test for evolutionary constraints on traits. We found significant phylogenetic signal (PS) for all traits, consistent with evolutionarily related genera having more similar characteristics than expected by chance. Although there is also evidence for repeated evolution of pioneer and shade tolerant life-history strategies within independent lineages, the existence of significant PS allows clearer predictions of the links between evolutionary diversity, ecosystem function and the response of tropical forests to global change. PMID:27974517

  6. Evolutionary heritage influences Amazon tree ecology.

    Science.gov (United States)

    Coelho de Souza, Fernanda; Dexter, Kyle G; Phillips, Oliver L; Brienen, Roel J W; Chave, Jerome; Galbraith, David R; Lopez Gonzalez, Gabriela; Monteagudo Mendoza, Abel; Pennington, R Toby; Poorter, Lourens; Alexiades, Miguel; Álvarez-Dávila, Esteban; Andrade, Ana; Aragão, Luis E O C; Araujo-Murakami, Alejandro; Arets, Eric J M M; Aymard C, Gerardo A; Baraloto, Christopher; Barroso, Jorcely G; Bonal, Damien; Boot, Rene G A; Camargo, José L C; Comiskey, James A; Valverde, Fernando Cornejo; de Camargo, Plínio B; Di Fiore, Anthony; Elias, Fernando; Erwin, Terry L; Feldpausch, Ted R; Ferreira, Leandro; Fyllas, Nikolaos M; Gloor, Emanuel; Herault, Bruno; Herrera, Rafael; Higuchi, Niro; Honorio Coronado, Eurídice N; Killeen, Timothy J; Laurance, William F; Laurance, Susan; Lloyd, Jon; Lovejoy, Thomas E; Malhi, Yadvinder; Maracahipes, Leandro; Marimon, Beatriz S; Marimon-Junior, Ben H; Mendoza, Casimiro; Morandi, Paulo; Neill, David A; Vargas, Percy Núñez; Oliveira, Edmar A; Lenza, Eddie; Palacios, Walter A; Peñuela-Mora, Maria C; Pipoly, John J; Pitman, Nigel C A; Prieto, Adriana; Quesada, Carlos A; Ramirez-Angulo, Hirma; Rudas, Agustin; Ruokolainen, Kalle; Salomão, Rafael P; Silveira, Marcos; Stropp, Juliana; Ter Steege, Hans; Thomas-Caesar, Raquel; van der Hout, Peter; van der Heijden, Geertje M F; van der Meer, Peter J; Vasquez, Rodolfo V; Vieira, Simone A; Vilanova, Emilio; Vos, Vincent A; Wang, Ophelia; Young, Kenneth R; Zagt, Roderick J; Baker, Timothy R

    2016-12-14

    Lineages tend to retain ecological characteristics of their ancestors through time. However, for some traits, selection during evolutionary history may have also played a role in determining trait values. To address the relative importance of these processes requires large-scale quantification of traits and evolutionary relationships among species. The Amazonian tree flora comprises a high diversity of angiosperm lineages and species with widely differing life-history characteristics, providing an excellent system to investigate the combined influences of evolutionary heritage and selection in determining trait variation. We used trait data related to the major axes of life-history variation among tropical trees (e.g. growth and mortality rates) from 577 inventory plots in closed-canopy forest, mapped onto a phylogenetic hypothesis spanning more than 300 genera including all major angiosperm clades to test for evolutionary constraints on traits. We found significant phylogenetic signal (PS) for all traits, consistent with evolutionarily related genera having more similar characteristics than expected by chance. Although there is also evidence for repeated evolution of pioneer and shade tolerant life-history strategies within independent lineages, the existence of significant PS allows clearer predictions of the links between evolutionary diversity, ecosystem function and the response of tropical forests to global change. © 2016 The Authors.

  7. Intention recognition, commitment and their roles in the evolution of cooperation from artificial intelligence techniques to evolutionary game theory models

    CERN Document Server

    Han, The Anh

    2013-01-01

    This original and timely monograph describes a unique self-contained excursion that reveals to the readers the roles of two basic cognitive abilities, i.e. intention recognition and arranging commitments, in the evolution of cooperative behavior. This book analyses intention recognition, an important ability that helps agents predict others’ behavior, in its artificial intelligence and evolutionary computational modeling aspects, and proposes a novel intention recognition method. Furthermore, the book presents a new framework for intention-based decision making and illustrates several ways in which an ability to recognize intentions of others can enhance a decision making process. By employing the new intention recognition method and the tools of evolutionary game theory, this book introduces computational models demonstrating that intention recognition promotes the emergence of cooperation within populations of self-regarding agents. Finally, the book describes how commitment provides a pathway to the evol...

  8. How to Identify and Interpret Evolutionary Tree Diagrams

    Science.gov (United States)

    Kong, Yi; Anderson, Trevor; Pelaez, Nancy

    2016-01-01

    Evolutionary trees are key tools for modern biology and are commonly portrayed in textbooks to promote learning about biological evolution. However, many people have difficulty in understanding what evolutionary trees are meant to portray. In fact, some ideas that current professional biologists depict with evolutionary trees are neither clearly…

  9. 1st International Conference on Intelligent Computing and Communication

    CERN Document Server

    Satapathy, Suresh; Sanyal, Manas; Bhateja, Vikrant

    2017-01-01

    The book covers a wide range of topics in Computer Science and Information Technology including swarm intelligence, artificial intelligence, evolutionary algorithms, and bio-inspired algorithms. It is a collection of papers presented at the First International Conference on Intelligent Computing and Communication (ICIC2) 2016. The prime areas of the conference are Intelligent Computing, Intelligent Communication, Bio-informatics, Geo-informatics, Algorithm, Graphics and Image Processing, Graph Labeling, Web Security, Privacy and e-Commerce, Computational Geometry, Service Orient Architecture, and Data Engineering.

  10. Evolutionary biology of bacterial and fungal pathogens

    National Research Council Canada - National Science Library

    Baquero, F

    2008-01-01

    ... and Evolutionary Dynamics of Pathogens (Keith A. Crandall and Marcos Pérez-Losada). II. Evolutionary Genetics of Microbial Pathogens. 4. Environmental and Social Influences on Infectious Disea...

  11. Evolutionary image simplification for lung nodule classification with convolutional neural networks.

    Science.gov (United States)

    Lückehe, Daniel; von Voigt, Gabriele

    2018-05-29

    Understanding the decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute the relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn the structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the structures learned by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images with respect to the structures learned by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides the examples of simplified images, we analyze the development of the run time. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm with a learned convolutional neural network is well suited to the simplification task. From a research perspective, it is interesting to see which areas of the images are simplified and which parts are taken as relevant.
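
    A hedged sketch of the evolutionary simplification loop, with a trivial stand-in scoring function in place of the trained CNN: pixels are blanked whenever doing so leaves the classifier score essentially unchanged; all sizes and thresholds are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def cnn_score(img):
        """Stand-in for the trained CNN's class confidence on `img`."""
        return img[12:20, 12:20].mean()        # pretend only the center matters

    image = rng.random((32, 32))
    baseline = cnn_score(image)
    simplified = image.copy()

    for _ in range(5000):
        candidate = simplified.copy()
        i, j = rng.integers(0, 32, 2)
        candidate[i, j] = 0.0                  # mutation: blank one pixel
        if baseline - cnn_score(candidate) < 1e-3:
            simplified = candidate             # keep simplifications that leave
                                               # the (stand-in) score unchanged

    removed = (simplified == 0).mean()
    print(f"simplified {removed:.0%} of pixels; score {cnn_score(simplified):.3f}")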

  12. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing model predictions with experimental observations when only one computational model is available to evaluate system behavior. Both time-independent and time-dependent problems are investigated, with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. For the life prediction (time-dependent reliability) problem, however, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. When statistical uncertainty is present in the model, in addition to applying a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides decision-makers with a rational criterion for the acceptance or rejection of the computational model.
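
    The acceptance criterion can be illustrated with a toy Bayes-factor computation. All numbers are hypothetical, and the alternative hypothesis is taken as a simple diffuse uniform, which the paper's treatment refines considerably:

        import numpy as np
        from scipy import stats

        model_pred = 0.92   # reliability predicted by the computational model
        sigma = 0.03        # assumed measurement standard deviation
        obs = np.array([0.90, 0.94, 0.89, 0.93])  # experimental observations

        # H0: model valid, so data scatter around the prediction;
        # H1: diffuse alternative, uniform over the plausible range [0.5, 1.0].
        lik_h0 = stats.norm(model_pred, sigma).pdf(obs).prod()
        lik_h1 = stats.uniform(loc=0.5, scale=0.5).pdf(obs).prod()
        bayes_factor = lik_h0 / lik_h1

        print(f"Bayes factor: {bayes_factor:.1f}")
        print("accept model" if bayes_factor > 1.0 else "reject model")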

  13. Regional systems of innovation: an evolutionary perspective

    OpenAIRE

    P Cooke; M G Uranga; G Etxebarria

    1998-01-01

    The authors develop the concept of regional systems of innovation and relate it to preexisting research on national systems of innovation. They argue that work conducted in the 'new regional science' field is complementary to systems of innovation approaches. They seek to link new regional work to evolutionary economics, and argue for the development of evolutionary regional science. Common elements of interest to evolutionary innovation research and new regional science are important in unde...

  14. Evolutionary robotics

    Indian Academy of Sciences (India)

    In evolutionary robotics, a suitable robot control system is developed automatically through evolution, driven by the interactions between the robot and its environment. This is a complicated task, as the robot and the environment together constitute a highly dynamical system. Several methods have been tried by various investigators to ...
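
    In miniature, the evolutionary loop looks like the sketch below, where the "robot" is a one-dimensional point driven toward a target and the genome encodes two controller gains evaluated by simulating the robot-environment interaction. The simulation, genome layout, and mutation settings are illustrative assumptions, not any particular published method:

        import numpy as np

        rng = np.random.default_rng(2)

        def fitness(genome, steps=200, dt=0.05):
            """Score a genome by simulating the robot in its environment."""
            kp, kd = genome
            pos, vel, target = 0.0, 0.0, 1.0
            err_sum = 0.0
            for _ in range(steps):
                accel = kp * (target - pos) - kd * vel  # evolved control law
                vel += accel * dt
                pos += vel * dt
                err_sum += (target - pos) ** 2
            return -err_sum  # higher is better

        pop = rng.normal(0.0, 1.0, (20, 2))  # initial random genomes (kp, kd)
        for _ in range(100):
            scores = np.array([fitness(g) for g in pop])
            parents = pop[np.argsort(scores)[-5:]]            # keep the 5 fittest
            children = np.repeat(parents, 4, axis=0)
            children += rng.normal(0.0, 0.1, children.shape)  # Gaussian mutation
            pop = children
        best = pop[np.argmax([fitness(g) for g in pop])]
        print("evolved controller gains (kp, kd):", best.round(3))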

  15. Heterogeneous Compression of Large Collections of Evolutionary Trees.

    Science.gov (United States)

    Matthews, Suzanne J

    2015-01-01

    Compressing heterogeneous collections of trees is an open problem in computational phylogenetics. In a heterogeneous tree collection, each tree can contain a unique set of taxa. An ideal compression method would allow for the efficient archival of large tree collections and enable scientists to identify common evolutionary relationships across disparate analyses. In this paper, we extend TreeZip to compress heterogeneous collections of trees. TreeZip is the most efficient algorithm for compressing homogeneous tree collections. To the best of our knowledge, no other domain-based compression algorithm exists that handles large heterogeneous tree collections or enables their rapid analysis. Our experimental results indicate that TreeZip averages 89.03 percent (72.69 percent) space savings on unweighted (weighted) collections of trees when the level of heterogeneity in a collection is moderate. The organization of the TRZ file allows for efficient computations over heterogeneous data; for example, consensus trees can be computed in mere seconds. Lastly, combining the TreeZip compressed (TRZ) file with general-purpose compression yields average space savings of 97.34 percent (81.43 percent) on unweighted (weighted) collections of trees. Our results lead us to believe that TreeZip will prove invaluable for the efficient archival of tree collections and will enable scientists to develop novel methods for relating heterogeneous collections of trees.
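
    TreeZip's actual TRZ format is not reproduced here, but the core domain-based idea (decompose each tree into the bipartitions its internal edges induce, then store every unique bipartition once together with the trees that contain it) can be sketched as follows; trees are written as nested tuples rather than Newick strings to keep the parser out of the example:

        from collections import defaultdict

        def bipartitions(tree):
            """Return the taxon set on one side of each internal edge.
            A tree is a nested tuple whose leaves are taxon-name strings."""
            found = []
            def clade_of(node):
                if isinstance(node, str):
                    return frozenset([node])
                clade = frozenset().union(*(clade_of(c) for c in node))
                found.append(clade)
                return clade
            all_taxa = clade_of(tree)
            return [c for c in found if c != all_taxa]  # drop the trivial split

        trees = [  # heterogeneous: the two trees contain different taxon sets
            ((("A", "B"), "C"), "D"),
            ((("A", "B"), "D"), "E"),
        ]
        index = defaultdict(list)  # unique bipartition -> trees containing it
        for i, t in enumerate(trees):
            for clade in bipartitions(t):
                index[clade].append(i)
        for clade, ids in index.items():
            print(sorted(clade), "appears in trees", ids)

    Shared structure (here the clade {A, B}) is stored once no matter how many trees contain it, which is where the space savings on large collections come from.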

  16. A methodology for direct quantification of over-ranging length in helical computed tomography with real-time dosimetry.

    Science.gov (United States)

    Tien, Christopher J; Winslow, James F; Hintenlang, David E

    2011-01-31

    In helical computed tomography (CT), reconstruction information from volumes adjacent to the clinical volume of interest (VOI) is required for proper reconstruction. Previous studies have relied upon either operator console readings or indirect extrapolation of measurements in order to determine the over-ranging length of a scan. This paper presents a methodology for the direct quantification of over-ranging dose contributions using real-time dosimetry. A Siemens SOMATOM Sensation 16 multislice helical CT scanner is used with a novel real-time "point" fiber-optic dosimeter system with 10 ms temporal resolution to measure over-ranging length, which is also expressed as dose-length product (DLP). Film was used to benchmark the exact length of over-ranging. Over-ranging length varied from 4.38 cm at a pitch of 0.5 to 6.72 cm at a pitch of 1.5, which corresponds to DLPs of 131 to 202 mGy-cm. The dose-extrapolation method of Van der Molen et al. yielded results within 3%, while the console reading method of Tzedakis et al. yielded consistently larger over-ranging lengths. From film measurements, it was determined that Tzedakis et al. overestimated over-ranging lengths by one-half of the beam collimation width. Over-ranging length measured as a function of reconstruction slice thickness produced two linear regions similar to previous publications. Over-ranging is quantified in terms of both absolute length and DLP; it contributes about 60 mGy-cm, or about 10% of the DLP, for a routine abdominal scan. This paper presents a direct physical measurement of over-ranging length within 10% of previous methodologies. Current uncertainties are less than 1%, in comparison with 5% for other methodologies. Clinical implementation can be simplified by using only one dosimeter if codependence with console readings is acceptable, with an uncertainty of 1.1%. This methodology will be applied to different vendors, models, and postprocessing methods--which have been shown to produce over-ranging lengths
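
    The arithmetic behind the direct measurement is simple enough to sketch: the beam-on time picked out of the 10 ms dose-rate trace by a threshold, multiplied by the table speed, gives the irradiated length, and the excess over the programmed scan length is the over-ranging. All numbers below (the trace shape, pitch, collimation, and scan length) are invented for illustration:

        import numpy as np

        dt = 0.010                     # dosimeter sampling interval [s]
        pitch, collimation = 1.0, 2.4  # pitch and total beam collimation [cm]
        rotation_time = 0.5            # gantry rotation time [s]
        table_speed = pitch * collimation / rotation_time  # [cm/s]

        t = np.arange(0.0, 8.0, dt)    # fake real-time dose-rate trace [mGy/s]
        dose_rate = np.where((t > 1.0) & (t < 6.2), 5.0, 0.02)

        beam_on = dose_rate > 0.5      # threshold isolates beam-on samples
        irradiated_len = beam_on.sum() * dt * table_speed
        planned_len = 20.0             # programmed scan length [cm]
        over_ranging = irradiated_len - planned_len
        print(f"irradiated length: {irradiated_len:.2f} cm")
        print(f"over-ranging length: {over_ranging:.2f} cm")
        # Multiplying the over-ranging length by CTDIvol would give the
        # corresponding extra dose-length product (DLP).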

  17. Conducting Computer Security Assessments at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    Computer security is increasingly recognized as a key component of nuclear security. As technology advances, it is anticipated that computers and computing systems will be used to an even greater degree in all aspects of plant operations, including safety and security systems. A rigorous and comprehensive assessment process can assist in strengthening the effectiveness of the computer security programme. This publication outlines a methodology for conducting computer security assessments at nuclear facilities. The methodology can likewise be easily adapted to provide assessments at facilities with other radioactive materials.

  18. Evolutionary mysteries in meiosis.

    Science.gov (United States)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E; Wijnker, Erik; Haag, Christoph R

    2016-10-19

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these often 'weird' features. We discuss the origin of meiosis (the origin of ploidy reduction and recombination, two-step meiosis), its secondary modifications (in polyploids or asexuals, inverted meiosis), its importance in punctuating life cycles (meiotic arrests, epigenetic resetting, meiotic asymmetry, meiotic fairness) and the features associated with recombination (disjunction constraints, heterochiasmy, crossover interference and hotspots). We present the various evolutionary scenarios and selective pressures that have been proposed to account for these features, and we highlight that their evolutionary significance often remains largely mysterious. Resolving these mysteries will likely provide decisive steps towards understanding why sex and recombination are found in the majority of eukaryotes. This article is part of the themed issue 'Weird sex: the underappreciated diversity of sexual reproduction'. © 2016 The Author(s).

  19. Harmonic elimination in diode-clamped multilevel inverter using evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Barkati, Said [Laboratoire d' analyse des Signaux et Systemes (LASS), Universite de M' sila, BP. 166, rue Ichbilia 28000 M' sila (Algeria); Baghli, Lotfi [Groupe de Recherche en Electrotechnique et Electronique de Nancy (GREEN), CNRS UMR 7030, Universite Henri Poincare Nancy 1, BP. 239, 54506 Vandoeuvre-les-Nancy (France); Berkouk, El Madjid; Boucherit, Mohamed-Seghir [Laboratoire de Commande des Processus (LCP), Ecole Nationale Polytechnique, BP. 182, 10 Avenue Hassen Badi, 16200 El Harrach, Alger (Algeria)

    2008-10-15

    This paper describes two evolutionary algorithms for the optimized harmonic stepped-waveform technique. Genetic algorithms and particle swarm optimization are applied to compute the switching angles in a three-phase seven-level inverter that produce the required fundamental voltage while eliminating specified harmonics. These algorithms are also used to solve the starting-point problem of the conventional Newton-Raphson method, and this combination provides a very effective harmonic elimination method. The strategy is applicable to different seven-level inverter structures; the diode-clamped topology is considered in this study. (author)
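
    The underlying optimization problem is compact: for a seven-level (three-angle) stepped waveform, choose angles 0 < t1 < t2 < t3 < pi/2 such that cos(t1) + cos(t2) + cos(t3) equals three times the desired modulation index while the sums of cos(5 ti) and cos(7 ti) vanish. The particle swarm sketch below minimizes the squared residuals of these three equations; the swarm size, coefficients, and modulation index are illustrative settings, not the paper's:

        import numpy as np

        rng = np.random.default_rng(1)
        m = 0.8  # target modulation index (per unit)

        def residual(theta):
            """Squared residuals of the fundamental and 5th/7th harmonic equations."""
            t = np.sort(theta, axis=-1)
            f1 = np.cos(t).sum(axis=-1) - 3 * m  # fundamental amplitude target
            f5 = np.cos(5 * t).sum(axis=-1)      # 5th harmonic should vanish
            f7 = np.cos(7 * t).sum(axis=-1)      # 7th harmonic should vanish
            return f1 ** 2 + f5 ** 2 + f7 ** 2

        n = 40
        pos = rng.uniform(0.0, np.pi / 2, (n, 3))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), residual(pos)
        gbest = pbest[pbest_val.argmin()].copy()

        for _ in range(500):
            r1, r2 = rng.random((2, n, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, 1e-3, np.pi / 2 - 1e-3)
            val = residual(pos)
            better = val < pbest_val
            pbest[better], pbest_val[better] = pos[better], val[better]
            gbest = pbest[pbest_val.argmin()].copy()

        print("switching angles (deg):", np.degrees(np.sort(gbest)).round(2))
        print("final residual:", float(residual(gbest)))

    A Newton-Raphson polish seeded with the swarm's best particle, as the paper proposes, would then drive the residual to machine precision.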

  20. Methodology of Computer-Aided Design of Variable Guide Vanes of Aircraft Engines

    Science.gov (United States)

    Falaleev, Sergei V.; Melentjev, Vladimir S.; Gvozdev, Alexander S.

    2016-01-01

    The paper presents a methodology that helps to avoid a great amount of costly experimental research. This methodology includes the thermo-gas-dynamic design of the engine and its mounts, the profiling of the compressor flow path, and the cascade design of the guide vanes. Employing a method elaborated by Howell, we provide a theoretical solution to the task of…