WorldWideScience

Sample records for evolutionary computation combined

  1. Applications of Evolutionary Computation

    NARCIS (Netherlands)

    Mora, Antonio M.; Squillero, Giovanni; Di Chio, C.; Agapitos, Alexandros; Cagnoni, Stefano; Cotta, Carlos; Fernández de Vega, Francisco; Di Caro, G. A.; Drechsler, R.; Ekárt, Anikó; Esparcia-Alcázar, Anna I.; Farooq, M.; Langdon, W. B.; Merelo-Guervós, J. J.; Preuss, M.; Richter, O.-M. H.; Silva, Sara; Simões, Anabela; Tarantino, Ernesto; Tettamanzi, Andrea G. B.; Togelius, J.; Urquhart, Neil; Uyar, A. S.; Yannakakis, G. N.; Smith, Stephen L.; Caserta, Marco; Ramirez, Adriana; Voß, Stefan; Burelli, Paolo; Jan, Mathieu; Matthias, M.; De Falco, Ivanoe; Della Cioppa, Antonio; Diwold, Konrad; Sim, Kevin; Haasdijk, Evert; Zhang, Mengjie; Eiben, A. E.; Glette, Kyrre; Rohlfshagen, Philipp; Schaefer, Robert

    2015-01-01

    The application of genetic and evolutionary computation to problems in medicine has increased rapidly over the past five years, but there are specific issues and challenges that distinguish it from other real-world applications. Obtaining reliable and coherent patient data, establishing the clinical

  2. Part E: Evolutionary Computation

    DEFF Research Database (Denmark)

    2015-01-01

    …of Computational Intelligence. First, comprehensive surveys of genetic algorithms, genetic programming, evolution strategies, and parallel evolutionary algorithms are presented, which are readable and constructive so that a large audience might find them useful and, to some extent, ready to use. Some more general kinds of evolutionary algorithms have been prudently analyzed. This analysis was followed by a thorough analysis of various issues involved in stochastic local search algorithms. An interesting survey of various technological and industrial applications in mechanical engineering and design has been presented. Topics like the estimation of distribution algorithms, indicator-based selection, etc., are also discussed. An important problem, from a theoretical and practical point of view, of learning classifier systems is presented in depth. Multiobjective evolutionary algorithms, which constitute one of the most…

  3. Electricity demand and spot price forecasting using evolutionary computation combined with chaotic nonlinear dynamic model

    International Nuclear Information System (INIS)

    Unsihuay-Vila, C.; Zambroni de Souza, A.C.; Marangon-Lima, J.W.; Balestrassi, P.P.

    2010-01-01

    This paper proposes a new hybrid approach based on nonlinear chaotic dynamics and evolutionary strategy to forecast electricity loads and prices. The main idea is to develop a new training or identification stage in a nonlinear chaotic dynamic based predictor. In the training stage, five optimal parameters for a chaos-based predictor are searched through an optimization model based on evolutionary strategy. The objective function of the optimization model is the minimization of the mismatch between the multi-step-ahead forecasts of the predictor and the observed data, as is done in identification problems. The first contribution of this paper is that the proposed approach is capable of capturing the complex dynamics of the demand and price time series considered, resulting in more accurate forecasting. The second contribution is that the proposed approach runs in an on-line manner, i.e. the optimal set of parameters and the prediction are computed automatically, so it can be used for prediction in real time; this is an advantage in comparison with other models, where the choice of input parameters is carried out off-line, following qualitative/experience-based recipes. A case study of load and price forecasting is presented using data from New England, Alberta, and Spain. A comparison with other methods such as autoregressive integrated moving average (ARIMA) and artificial neural network (ANN) models is shown. The results show that the proposed approach provides more accurate and effective forecasting than the ARIMA and ANN methods. (author)
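
    The identification stage described above has a compact canonical form: an evolution strategy proposes candidate parameter vectors, scores each by the multi-step-ahead forecast mismatch, and keeps the best. Below is a minimal (mu+lambda) evolution strategy sketch in Python against a stand-in predictor; the predictor form, the five-parameter encoding, and the toy series are assumptions for illustration, not the paper's actual chaotic model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def predictor(params, history, steps):
        """Stand-in for the chaotic predictor (hypothetical form): iterates
        a damped nonlinear map whose five coefficients are being tuned."""
        a, b, c, d, e = params
        x = list(history[-2:])
        out = []
        for _ in range(steps):
            nxt = a * x[-1] + b * x[-2] + c * np.sin(d * x[-1]) + e
            out.append(nxt)
            x.append(nxt)
        return np.array(out)

    def mismatch(params, history, observed):
        """Objective of the identification stage: multi-step-ahead error."""
        return np.mean((predictor(params, history, len(observed)) - observed) ** 2)

    def evolution_strategy(history, observed, mu=10, lam=40, sigma=0.3, gens=200):
        pop = rng.normal(0.0, 1.0, size=(mu, 5))
        for _ in range(gens):
            parents = pop[rng.integers(0, mu, size=lam)]
            children = parents + rng.normal(0.0, sigma, size=(lam, 5))
            both = np.vstack([pop, children])           # (mu + lambda) selection
            scores = np.array([mismatch(p, history, observed) for p in both])
            pop = both[np.argsort(scores)[:mu]]
            sigma *= 0.99                               # slow step-size decay
        return pop[0]

    # Toy usage: fit the predictor to a synthetic "load" series.
    t = np.arange(200)
    series = np.sin(0.3 * t) + 0.1 * rng.normal(size=t.size)
    best = evolution_strategy(series[:150], series[150:170])
    print("best parameters:", best,
          "error:", mismatch(best, series[:150], series[150:170]))
    ```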

  4. Practical advantages of evolutionary computation

    Science.gov (United States)

    Fogel, David B.

    1997-10-01

    Evolutionary computation is becoming a common technique for solving difficult, real-world problems in industry, medicine, and defense. This paper reviews some of the practical advantages to using evolutionary algorithms as compared with classic methods of optimization or artificial intelligence. Specific advantages include the flexibility of the procedures, as well as their ability to self-adapt the search for optimum solutions on the fly. As desktop computers increase in speed, the application of evolutionary algorithms will become routine.

  5. Evolutionary computation for reinforcement learning

    NARCIS (Netherlands)

    Whiteson, S.; Wiering, M.; van Otterlo, M.

    2012-01-01

    Algorithms for evolutionary computation, which simulate the process of natural selection to solve optimization problems, are an effective tool for discovering high-performing reinforcement-learning policies. Because they can automatically find good representations, handle continuous action spaces,

  6. Markov Networks in Evolutionary Computation

    CERN Document Server

    Shakya, Siddhartha

    2012-01-01

    Markov networks and other probabilistic graphical models have recently received an upsurge in attention from the evolutionary computation community, particularly in the area of estimation of distribution algorithms (EDAs). EDAs have arisen as one of the most successful experiences in the application of machine learning methods to optimization, mainly due to their efficiency in solving complex real-world optimization problems and their suitability for theoretical analysis. This book focuses on the different steps involved in the conception, implementation and application of EDAs that use Markov networks, and undirected models in general. It can serve as a general introduction to EDAs but also covers an important current void in the study of these algorithms by explaining the specificities and benefits of modeling optimization problems by means of undirected probabilistic models. All major developments to date in the progressive introduction of Markov network based EDAs are reviewed in the book. Hot current research...
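
    Full Markov-network EDAs are involved to implement; as a first intuition for the estimate-and-sample loop they build on, here is a deliberately simplified univariate EDA (UMDA-style) sketch. It ignores the variable dependencies that undirected models capture, which is exactly the gap the book addresses; the OneMax objective is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def onemax(pop):                 # illustrative fitness: count of one-bits
        return pop.sum(axis=1)

    def umda(n_bits=40, pop_size=100, elite=30, gens=60):
        """Univariate EDA: a deliberate simplification of the Markov-network
        EDAs in the book, which also model dependencies between variables."""
        probs = np.full(n_bits, 0.5)                    # initial bitwise model
        for _ in range(gens):
            pop = (rng.random((pop_size, n_bits)) < probs).astype(int)
            best = pop[np.argsort(-onemax(pop))[:elite]]
            probs = best.mean(axis=0).clip(0.05, 0.95)  # re-estimate, keep diversity
        return probs

    print((umda() > 0.5).astype(int))   # the model concentrates on the optimum
    ```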

  7. Algorithmic Mechanism Design of Evolutionary Computation.

    Science.gov (United States)

    Pei, Yan

    2015-01-01

    We consider algorithmic design, enhancement, and improvement of evolutionary computation as a mechanism design problem. All individuals or several groups of individuals can be considered as self-interested agents. The individuals in evolutionary computation can manipulate parameter settings and operations by satisfying their own preferences, which are defined by an evolutionary computation algorithm designer, rather than by following a fixed algorithm rule. Evolutionary computation algorithm designers or self-adaptive methods should construct proper rules and mechanisms for all agents (individuals) to conduct their evolutionary behaviour correctly in order to achieve the desired and preset objective(s). As a case study, we propose a formal framework for parameter setting, strategy selection, and algorithmic design of evolutionary computation by considering the Nash strategy equilibrium of a mechanism design in the search process. The evaluation results demonstrate the efficiency of the framework. This primary principle can be implemented in any evolutionary computation algorithm that needs to consider strategy selection issues in its optimization process. The final objective of our work is to solve evolutionary computation design as an algorithmic mechanism design problem and to establish its fundamental aspects by taking this perspective. This paper is the first step towards achieving this objective, implementing a strategy equilibrium solution (such as Nash equilibrium) in an evolutionary computation algorithm.

  8. Prospective Algorithms for Quantum Evolutionary Computation

    OpenAIRE

    Sofge, Donald A.

    2008-01-01

    This effort examines the intersection of the emerging field of quantum computing and the more established field of evolutionary computation. The goal is to understand what benefits quantum computing might offer to computational intelligence and how computational intelligence paradigms might be implemented as quantum programs to be run on a future quantum computer. We critically examine proposed algorithms and methods for implementing computational intelligence paradigms, primarily focused on ...

  9. Optimizing a reconfigurable material via evolutionary computation

    Science.gov (United States)

    Wilken, Sam; Miskin, Marc Z.; Jaeger, Heinrich M.

    2015-08-01

    Rapid prototyping by combining evolutionary computation with simulations is becoming a powerful tool for solving complex design problems in materials science. This method of optimization operates in a virtual design space that simulates potential material behaviors and after completion needs to be validated by experiment. However, in principle an evolutionary optimizer can also operate on an actual physical structure or laboratory experiment directly, provided the relevant material parameters can be accessed by the optimizer and information about the material's performance can be updated by direct measurements. Here we provide a proof of concept of such direct, physical optimization by showing how a reconfigurable, highly nonlinear material can be tuned to respond to impact. We report on an entirely computer controlled laboratory experiment in which a 6 × 6 grid of electromagnets creates a magnetic field pattern that tunes the local rigidity of a concentrated suspension of ferrofluid and iron filings. A genetic algorithm is implemented and tasked to find field patterns that minimize the force transmitted through the suspension. Searching within a space of roughly 10^10 possible configurations, after testing only 1500 independent trials the algorithm identifies an optimized configuration of layered rigid and compliant regions.
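
    The notable point above is that the fitness evaluation is a physical measurement rather than a simulation. A minimal sketch of that loop follows, with the laboratory measurement replaced by a stub; the 36-bit on/off encoding of the 6 × 6 magnet grid and the GA operators shown are assumptions for illustration, not the exact setup of the experiment.

    ```python
    import random

    random.seed(0)
    N = 36  # 6 x 6 grid of electromagnets, encoded as on/off bits (an assumption)

    def measure_force(pattern):
        """Stand-in for the physical measurement: in the experiment this would
        trigger the impact test and read the transmitted force from a sensor."""
        return sum(b * (i % 7 - 3) ** 2 for i, b in enumerate(pattern)) + 10 * all(pattern)

    def ga(pop_size=30, gens=50, p_mut=1 / N):
        pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=measure_force)                  # lower force is better
            survivors = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                cut = random.randrange(1, N)
                child = a[:cut] + b[cut:]                # one-point crossover
                child = [1 - g if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = survivors + children
        return min(pop, key=measure_force)

    best = ga()
    print("optimized pattern:", best, "force:", measure_force(best))
    ```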

  10. Evolutionary Computation and Its Applications in Neural and Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Biaobiao Zhang

    2011-01-01

    Neural networks and fuzzy systems are two soft-computing paradigms for system modelling. Adapting a neural or fuzzy system requires solving two optimization problems: structural optimization and parametric optimization. Structural optimization is a discrete optimization problem which is very hard to solve using conventional optimization techniques. Parametric optimization can be solved using conventional optimization techniques, but the solution may be easily trapped in a bad local optimum. Evolutionary computation is a general-purpose stochastic global optimization approach under the universally accepted neo-Darwinian paradigm, which is a combination of the classical Darwinian evolutionary theory, the selectionism of Weismann, and the genetics of Mendel. Evolutionary algorithms are a major approach to adaptation and optimization. In this paper, we first introduce evolutionary algorithms with emphasis on genetic algorithms and evolutionary strategies. Other evolutionary algorithms such as genetic programming, evolutionary programming, particle swarm optimization, immune algorithms, and ant colony optimization are also described. Some topics pertaining to evolutionary algorithms are also discussed, and a comparison between evolutionary algorithms and simulated annealing is made. Finally, the application of EAs to the learning of neural networks as well as to the structural and parametric adaptation of fuzzy systems is detailed.

  11. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 was a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies was strongly encouraged. The forum aimed to bring together researchers, developers, and users from around the world, in both industry and academia, to share state-of-the-art results, to explore new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  12. Evolutionary computation in zoology and ecology.

    Science.gov (United States)

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows an objective function relevant to the problem of interest to be assessed. The poorest performing members are removed from the population, and the remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and reviewing the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
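
    The loop described above (assess an objective, cull the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) is compact enough to show directly. A minimal sketch follows, with a hypothetical "egg shape" objective standing in for any of the case-study objectives.

    ```python
    import random

    random.seed(2)

    def fitness(egg):
        """Hypothetical objective: deviation of (length, width) from an ideal
        shape for a given clutch size; stands in for any domain objective."""
        length, width = egg
        return -((length - 4.5) ** 2 + (width - 3.2) ** 2)

    def evolve(pop_size=50, cull=0.5, sigma=0.2, max_gens=100):
        pop = [(random.uniform(1, 8), random.uniform(1, 8)) for _ in range(pop_size)]
        for gen in range(max_gens):
            pop.sort(key=fitness, reverse=True)
            pop = pop[: int(pop_size * (1 - cull))]      # remove poorest members
            while len(pop) < pop_size:                   # survivors reproduce...
                parent = random.choice(pop)
                child = tuple(g + random.gauss(0, sigma) for g in parent)
                pop.append(child)                        # ...with mutation
            if -fitness(pop[0]) < 1e-4:                  # stopping condition
                break
        return pop[0], gen

    best, gens = evolve()
    print(f"best egg shape {best} after {gens} generations")
    ```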

  13. Massively parallel evolutionary computation on GPGPUs

    CERN Document Server

    Tsutsui, Shigeyoshi

    2013-01-01

    Evolutionary algorithms (EAs) are metaheuristics that learn from natural collective behavior and are applied to solve optimization problems in domains such as scheduling, engineering, bioinformatics, and finance. Such applications demand acceptable solutions with high-speed execution using finite computational resources. Therefore, there have been many attempts to develop platforms for running parallel EAs using multicore machines, massively parallel cluster machines, or grid computing environments. Recent advances in general-purpose computing on graphics processing units (GPGPU) have opened up...

  14. Soft computing integrating evolutionary, neural, and fuzzy systems

    CERN Document Server

    Tettamanzi, Andrea

    2001-01-01

    Soft computing encompasses various computational methodologies, which, unlike conventional algorithms, are tolerant of imprecision, uncertainty, and partial truth. Soft computing technologies offer adaptability as a characteristic feature and thus permit the tracking of a problem through a changing environment. Besides some recent developments in areas like rough sets and probabilistic networks, fuzzy logic, evolutionary algorithms, and artificial neural networks are core ingredients of soft computing, which are all bio-inspired and can easily be combined synergetically. This book presents a well-balanced integration of fuzzy logic, evolutionary computing, and neural information processing. The three constituents are introduced to the reader systematically and brought together in differentiated combinations step by step. The text was developed from courses given by the authors and offers numerous illustrations as

  15. Comparison of evolutionary computation algorithms for solving bi ...

    Indian Academy of Sciences (India)

    …failure probability. Multiobjective evolutionary computation algorithms (MOEAs) are well suited for multiobjective task scheduling in heterogeneous environments. Two multiobjective evolutionary algorithms, the Multiobjective Genetic Algorithm (MOGA) and Multiobjective Evolutionary Programming (MOEP), with…

  16. Function Follows Performance in Evolutionary Computational Processing

    DEFF Research Database (Denmark)

    Pasold, Anke; Foged, Isak Worre

    2011-01-01

    As the title ‘Function Follows Performance in Evolutionary Computational Processing’ suggests, this paper explores the potentials of employing multiple design and evaluation criteria within one processing model in order to account for a number of performative parameters desired within varied...

  17. Evolutionary Computing for Intelligent Power System Optimization and Control

    DEFF Research Database (Denmark)

    This new book focuses on how evolutionary computing techniques benefit engineering research and development tasks by converting practical problems of growing complexity into simple formulations, thus largely reducing development effort. The book begins with an overview of optimization theory and modern evolutionary computing techniques, and goes on to cover specific applications of evolutionary computing to power system optimization and control problems.

  18. Conversion Rate Optimization through Evolutionary Computation

    OpenAIRE

    Miikkulainen, Risto; Iscoe, Neil; Shagrin, Aaron; Cordell, Ron; Nazari, Sam; Schoolland, Cory; Brundage, Myles; Epstein, Jonathan; Dean, Randy; Lamba, Gurmeet

    2017-01-01

    Conversion optimization means designing a web interface so that as many users as possible take a desired action on it, such as registering or purchasing. Such design is usually done by hand, testing one change at a time through A/B testing, or a limited number of combinations through multivariate testing, making it possible to evaluate only a small fraction of designs in a vast design space. This paper describes Sentient Ascend, an automatic conversion optimization system that uses evolutionary op...

  19. Evolutionary computation techniques a comparative perspective

    CERN Document Server

    Cuevas, Erik; Oliva, Diego

    2017-01-01

    This book compares the performance of various evolutionary computation (EC) techniques when they are faced with complex optimization problems extracted from different engineering domains. Particularly focusing on recently developed algorithms, it is designed so that each chapter can be read independently. Several comparisons among EC techniques have been reported in the literature; however, they all suffer from one limitation: their conclusions are based on the performance of popular evolutionary approaches over a set of synthetic functions with exact solutions and well-known behaviors, without considering the application context or including recent developments. In each chapter, a complex engineering optimization problem is posed, and then a particular EC technique is presented as the best choice, according to its search characteristics. Lastly, a set of experiments is conducted in order to compare its performance to other popular EC methods.

  20. Evolutionary Computing Methods for Spectral Retrieval

    Science.gov (United States)

    Terrile, Richard; Fink, Wolfgang; Huntsberger, Terrance; Lee, Seungwon; Tisdale, Edwin; VonAllmen, Paul; Tinetti, Giovanna

    2009-01-01

    A methodology for processing spectral images to retrieve information on underlying physical, chemical, and/or biological phenomena is based on evolutionary and related computational methods implemented in software. In a typical case, the solution (the information that one seeks to retrieve) consists of parameters of a mathematical model that represents one or more of the phenomena of interest. The methodology was developed for the initial purpose of retrieving the desired information from spectral image data acquired by remote-sensing instruments aimed at planets (including the Earth). Examples of information desired in such applications include trace gas concentrations, temperature profiles, surface types, day/night fractions, cloud/aerosol fractions, seasons, and viewing angles. The methodology is also potentially useful for retrieving information on chemical and/or biological hazards in terrestrial settings. In this methodology, one utilizes an iterative process that minimizes a fitness function indicative of the degree of dissimilarity between observed and synthetic spectral and angular data. The evolutionary computing methods that lie at the heart of this process yield a population of solutions (sets of the desired parameters) within an accuracy represented by a fitness-function value specified by the user. The evolutionary computing methods (ECM) used in this methodology are Genetic Algorithms and Simulated Annealing, both of which are well-established optimization techniques and have also been described in previous NASA Tech Briefs articles. These are embedded in a conceptual framework, represented in the architecture of the implementing software, that enables automatic retrieval of spectral and angular data and analysis of the retrieved solutions for uniqueness.
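
    Of the two ECMs named above, simulated annealing is the simpler to sketch. Below, SA minimizes a fitness function measuring the dissimilarity between an observed and a synthetic spectrum; the one-line Gaussian-absorption forward model and all parameter names are assumptions for illustration, far simpler than a real radiative-transfer model.

    ```python
    import math
    import random

    random.seed(3)
    wavelengths = [i / 10 for i in range(100)]

    def synthetic_spectrum(depth, center, width):
        """Toy forward model (an assumption): a single Gaussian absorption line."""
        return [1.0 - depth * math.exp(-((w - center) / width) ** 2)
                for w in wavelengths]

    observed = synthetic_spectrum(0.6, 4.2, 0.8)    # stands in for instrument data

    def fitness(params):
        """Dissimilarity between observed and synthetic spectra (lower is better)."""
        return sum((o - m) ** 2 for o, m in zip(observed, synthetic_spectrum(*params)))

    def anneal(t0=1.0, cooling=0.995, steps=5000):
        current = [random.uniform(0, 1), random.uniform(0, 10), random.uniform(0.1, 2)]
        f_cur, t = fitness(current), t0
        for _ in range(steps):
            cand = [p + random.gauss(0, 0.05) for p in current]
            cand[2] = max(cand[2], 1e-3)            # keep the line width positive
            f_cand = fitness(cand)
            # accept improvements always; accept worse moves with Boltzmann probability
            if f_cand < f_cur or random.random() < math.exp((f_cur - f_cand) / t):
                current, f_cur = cand, f_cand
            t *= cooling
        return current, f_cur

    params, err = anneal()
    print("retrieved (depth, center, width):", params, "residual:", err)
    ```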

  1. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  2. Evolutionary Based Solutions for Green Computing

    CERN Document Server

    Kołodziej, Joanna; Li, Juan; Zomaya, Albert

    2013-01-01

    Today’s highly parameterized large-scale distributed computing systems may be composed of a large number of various components (computers, databases, etc.) and must provide a wide range of services. The users of such systems, located at different (geographical or managerial) network clusters, may have limited access to the system’s services and resources, and different, often conflicting, expectations and requirements. Moreover, the information and data processed in such dynamic environments may be incomplete, imprecise, fragmentary, and overloading. All of the above-mentioned issues require intelligent, scalable methodologies for the management of the whole complex structure, which unfortunately may increase the energy consumption of such systems. This book, in its eight chapters, addresses the fundamental issues related to energy usage and optimal low-cost system design in high performance “green computing” systems. The recent evolutionary and general metaheuristic-based solutions ...

  3. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect

  4. Parallel evolutionary computation in bioinformatics applications.

    Science.gov (United States)

    Pinho, Jorge; Sobral, João Luis; Rocha, Miguel

    2013-05-01

    A large number of optimization problems within the field of Bioinformatics require methods able to handle their inherent complexity (e.g. NP-hard problems) and also demand increased computational effort. In this context, the use of parallel architectures is a necessity. In this work, we propose ParJECoLi, a Java-based library that offers a large set of metaheuristic methods (such as Evolutionary Algorithms) and also addresses the issue of their efficient execution on a wide range of parallel architectures. The proposed approach focuses on ease of use, making the adaptation to distinct parallel environments (multicore, cluster, grid) transparent to the user. Indeed, this work shows how the development of the optimization library can proceed independently of its adaptation for several architectures, making use of Aspect-Oriented Programming. The pluggable nature of the parallelism-related modules allows the user to easily configure the environment, adding parallelism modules to the base source code when needed. The performance of the platform is validated with two case studies within biological model optimization. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
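
    ParJECoLi itself is a Java library built with Aspect-Oriented Programming, but the core idea (the same evaluation call site running serially or in parallel, with the backend pluggable) can be illustrated in a few lines of Python; the Rastrigin fitness below is a stand-in for an expensive biological model evaluation.

    ```python
    import math
    import random
    from multiprocessing import Pool

    def fitness(x):
        """Expensive model evaluation (stand-in for a biological model run)."""
        return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

    def evaluate(pop, pool=None):
        """Same call site works serially or in parallel: the parallel backend
        is pluggable, which is the idea ParJECoLi realizes via AOP in Java."""
        return pool.map(fitness, pop) if pool else list(map(fitness, pop))

    if __name__ == "__main__":
        random.seed(4)
        pop = [[random.uniform(-5, 5) for _ in range(20)] for _ in range(64)]
        with Pool() as pool:               # multicore; swap for a cluster backend
            scores = evaluate(pop, pool)
        print("best fitness in population:", min(scores))
    ```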

  5. Applications of evolutionary computation in image processing and pattern recognition

    CERN Document Server

    Cuevas, Erik; Perez-Cisneros, Marco

    2016-01-01

    This book presents the use of efficient Evolutionary Computation (EC) algorithms for solving diverse real-world image processing and pattern recognition problems. It provides an overview of the different aspects of evolutionary methods in order to enable the reader to reach a global understanding of the field and to conduct studies on specific evolutionary techniques that are related to applications in image processing and pattern recognition. It explains the basic ideas of the proposed applications in a way that can also be understood by readers outside of the field. Image processing and pattern recognition practitioners who are not evolutionary computation researchers will appreciate the discussed techniques beyond simple theoretical tools, since they have been adapted to solve significant problems that commonly arise in such areas. On the other hand, members of the evolutionary computation community can learn the way in which image processing and pattern recognition problems can be translated into an...

  6. Evolutionary Cell Computing: From Protocells to Self-Organized Computing

    Science.gov (United States)

    Colombano, Silvano; New, Michael H.; Pohorille, Andrew; Scargle, Jeffrey; Stassinopoulos, Dimitris; Pearson, Mark; Warren, James

    2000-01-01

    On the path from inanimate to animate matter, a key step was the self-organization of molecules into protocells - the earliest ancestors of contemporary cells. Studies of the properties of protocells and the mechanisms by which they maintained themselves and reproduced are an important part of astrobiology. These studies also have the potential to greatly impact research in nanotechnology and computer science. Previous studies of protocells have focussed on self-replication. In these systems, Darwinian evolution occurs through a series of small alterations to functional molecules whose identities are stored. Protocells, however, may have been incapable of such storage. We hypothesize that under such conditions, the replication of functions and their interrelationships, rather than the precise identities of the functional molecules, is sufficient for survival and evolution. This process is called non-genomic evolution. Recent breakthroughs in experimental protein chemistry have opened the gates for experimental tests of non-genomic evolution. On the basis of these achievements, we have developed a stochastic model for examining the evolutionary potential of non-genomic systems. In this model, the formation and destruction (hydrolysis) of bonds joining amino acids in proteins occur through catalyzed, albeit possibly inefficient, pathways. Each protein can act as a substrate for polymerization or hydrolysis, or as a catalyst of these chemical reactions. When a protein is hydrolyzed to form two new proteins, or two proteins are joined into a single protein, the catalytic abilities of the product proteins are related to the catalytic abilities of the reactants. We will demonstrate that the catalytic capabilities of such a system can increase. Its evolutionary potential is dependent upon the competition between the formation of bond-forming and bond-cutting catalysts. The degree to which hydrolysis preferentially affects bonds in less efficient, and therefore less well

  7. Integrated evolutionary computation neural network quality controller for automated systems

    Energy Technology Data Exchange (ETDEWEB)

    Patro, S.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    With increasing competition in the global market, more and more stringent quality standards and specifications are being demanded at lower costs. Manufacturing applications of computing power are becoming more common. The application of neural networks to the identification and control of dynamic processes has been discussed. The limitations of using neural networks for control purposes have been pointed out, and a different technique, evolutionary computation, has been discussed. The results of identifying and controlling an unstable, dynamic process using evolutionary computation methods have been presented. A framework for an integrated system, using both neural networks and evolutionary computation, has been proposed to identify the process and then control the product quality in a dynamic, multivariable system, in real time.

  8. Evolutionary computing in Nuclear Engineering Institute/CNEN-Brazil

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.; Lapa, Nelbia da Silva; Mol, Antonio C.

    2000-01-01

    This paper discusses the importance of evolutionary computation (EC) for nuclear engineering and the development of this area at the Instituto de Engenharia Nuclear (IEN) in recent years. The applications carried out at this institute by its EC technical group are briefly described, for example: nuclear reactor core design optimization, preventive maintenance scheduling optimization, and nuclear reactor transient identification. A novel computational tool for the implementation of genetic algorithms, developed at this institute and applied in those works, is also shown. Some results are presented and the gains obtained with evolutionary computation are discussed. (author)

  9. 10th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Wang, Chia-Hung; Jiang, Xin

    2017-01-01

    This book gathers papers presented at the 10th International Conference on Genetic and Evolutionary Computing (ICGEC 2016). The conference was co-sponsored by Springer, Fujian University of Technology in China, the University of Computer Studies in Yangon, University of Miyazaki in Japan, National Kaohsiung University of Applied Sciences in Taiwan, Taiwan Association for Web Intelligence Consortium, and VSB-Technical University of Ostrava, Czech Republic. The ICGEC 2016, which was held from November 7 to 9, 2016 in Fuzhou City, China, was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  10. From evolutionary computation to the evolution of things

    NARCIS (Netherlands)

    Eiben, A.E.; Smith, J.E.

    2015-01-01

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as

  11. Robust and Flexible Scheduling with Evolutionary Computation

    DEFF Research Database (Denmark)

    Jensen, Mikkel T.

    Over the last ten years, there have been numerous applications of evolutionary algorithms to a variety of scheduling problems. Like most other research on heuristic scheduling, the primary focus of this research has been on deterministic formulations of the problems. This is in contrast to real-world scheduling problems, which are usually not deterministic. Usually, at the time the schedule is made some information about the problem and processing environment is available, but this information is uncertain and likely to change during schedule execution. Changes frequently encountered in scheduling environments include machine breakdowns, uncertain processing times, workers getting sick, materials being delayed and the appearance of new jobs. These possible environmental changes mean that a schedule which was optimal for the information available at the time of scheduling can end up being highly…

  12. Evolutionary Computation Techniques for Predicting Atmospheric Corrosion

    Directory of Open Access Journals (Sweden)

    Amine Marref

    2013-01-01

    Corrosion occurs in many engineering structures such as bridges, pipelines, and refineries and leads to the destruction of materials in a gradual manner, thereby shortening their lifespan. It is therefore crucial to assess the structural integrity of engineering structures which are approaching or exceeding their designed lifespan in order to ensure their correct functioning, for example, carrying ability and safety. An understanding of corrosion and an ability to predict the corrosion rate of a material in a particular environment play a vital role in evaluating the residual life of the material. In this paper we investigate the use of genetic programming and genetic algorithms in the derivation of corrosion-rate expressions for steel and zinc. Genetic programming is used to automatically evolve corrosion-rate expressions while a genetic algorithm is used to evolve the parameters of an already engineered corrosion-rate expression. We show that both evolutionary techniques yield corrosion-rate expressions that have good accuracy.
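
    As a sketch of the genetic-algorithm half of the study (evolving the parameters of an already engineered expression), the snippet below fits the common power-law corrosion model C = A * t^n to synthetic exposure data; the choice of that expression and the data are assumptions for illustration, not necessarily the paper's exact model.

    ```python
    import random

    random.seed(5)

    # Synthetic exposure data for the power-law model C = A * t**n (assumed form).
    A_TRUE, N_TRUE = 30.0, 0.6
    data = [(t, A_TRUE * t ** N_TRUE * random.uniform(0.95, 1.05))
            for t in range(1, 21)]

    def error(ind):
        A, n = ind
        return sum((A * t ** n - c) ** 2 for t, c in data)

    def mutate(ind):
        # per-parameter step sizes, since A and n live on different scales
        return [ind[0] + random.gauss(0, 2.0), ind[1] + random.gauss(0, 0.05)]

    def ga(pop_size=40, gens=200):
        pop = [[random.uniform(0, 100), random.uniform(0, 1)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=error)
            elite = pop[: pop_size // 4]                # truncation selection
            pop = elite + [mutate(random.choice(elite))
                           for _ in range(pop_size - len(elite))]
        return min(pop, key=error)

    A, n = ga()
    print(f"evolved corrosion-rate expression: C = {A:.1f} * t**{n:.2f}")
    ```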

  13. Evidence Combination From an Evolutionary Game Theory Perspective.

    Science.gov (United States)

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

    Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize the replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
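
    For readers unfamiliar with the combination rule being repaired, the following sketch implements Dempster's rule over mass functions keyed by frozensets and reproduces Zadeh's classic counter-intuitive case, the kind of paradox the proposed ECR is designed to suppress.

    ```python
    from itertools import product

    def dempster(m1, m2):
        """Dempster's rule of combination for mass functions keyed by frozensets."""
        combined, conflict = {}, 0.0
        for (a, x), (b, y) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + x * y
            else:
                conflict += x * y                 # mass assigned to the empty set
        if conflict == 1.0:
            raise ValueError("total conflict: evidences cannot be combined")
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    # Zadeh's paradox: two doctors almost fully disagree, yet Dempster's rule
    # assigns certainty to the diagnosis both considered nearly impossible.
    A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
    m1 = {A: 0.99, C: 0.01}
    m2 = {B: 0.99, C: 0.01}
    print(dempster(m1, m2))   # {frozenset({'C'}): 1.0}, the counter-intuitive result
    ```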

  14. Coevolution of Artificial Agents Using Evolutionary Computation in Bargaining Game

    Directory of Open Access Journals (Sweden)

    Sangwook Lee

    2015-01-01

    The analysis of bargaining games using evolutionary computation is an essential issue in the field of game theory. This paper investigates the interaction and coevolutionary process among heterogeneous artificial agents using evolutionary computation (EC) in the bargaining game. In particular, the game performance with regard to payoff through the interaction and coevolution of agents is studied. We present three kinds of EC-based agents (EC-agents) participating in the bargaining game: genetic algorithm (GA), particle swarm optimization (PSO), and differential evolution (DE). The agents’ performance with regard to changing conditions is compared. From the simulation results it is found that the PSO-agent is superior to the other agents.

  15. Evolutionary Computation Methods and their applications in Statistics

    Directory of Open Access Journals (Sweden)

    Francesco Battaglia

    2013-05-01

    A brief discussion of the genesis of evolutionary computation methods, their relationship to artificial intelligence, and the contribution of genetics and Darwin’s theory of natural evolution is provided. Then, the main evolutionary computation methods are illustrated: evolution strategies, genetic algorithms, estimation of distribution algorithms, differential evolution, and a brief description of some evolutionary behavior methods such as ant colony and particle swarm optimization. We also discuss the role of the genetic algorithm in multivariate probability distribution random generation, rather than as a function optimizer. Finally, some relevant applications of genetic algorithms to statistical problems are reviewed: selection of variables in regression, time series model building, outlier identification, cluster analysis, and design of experiments.
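
    Of the methods surveyed, differential evolution has one of the most compact canonical forms (DE/rand/1/bin). A minimal sketch follows, applied to a small statistical task in the spirit of the regression applications reviewed: recovering linear-model coefficients by minimizing the sum of squared errors. The data and control parameters are illustrative.

    ```python
    import random

    random.seed(6)

    # Toy task: fit linear-model coefficients by least squares (illustrative data).
    xs = [[1.0, x, x * x] for x in range(10)]
    beta_true = [2.0, -1.0, 0.5]
    ys = [sum(b * f for b, f in zip(beta_true, row)) for row in xs]

    def sse(beta):
        return sum((sum(b * f for b, f in zip(beta, row)) - y) ** 2
                   for row, y in zip(xs, ys))

    def differential_evolution(dim=3, np_=20, F=0.7, CR=0.9, gens=300):
        pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(np_)]
        for _ in range(gens):
            for i in range(np_):
                a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
                j_rand = random.randrange(dim)
                trial = [a[j] + F * (b[j] - c[j])        # DE/rand/1 mutation
                         if random.random() < CR or j == j_rand else pop[i][j]
                         for j in range(dim)]
                if sse(trial) <= sse(pop[i]):            # greedy selection
                    pop[i] = trial
        return min(pop, key=sse)

    print("recovered coefficients:", differential_evolution())
    ```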

  16. Interior spatial layout with soft objectives using evolutionary computation

    NARCIS (Netherlands)

    Chatzikonstantinou, I.; Bengisu, E.

    2016-01-01

    This paper presents the design problem of furniture arrangement in a residential interior living space, and addresses it by means of evolutionary computation. Interior arrangement is an important and interesting problem that occurs commonly when designing living spaces. It entails determining the

  17. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  18. Biomimetic design processes in architecture: morphogenetic and evolutionary computational design

    International Nuclear Information System (INIS)

    Menges, Achim

    2012-01-01

    Design computation has profound impact on architectural design methods. This paper explains how computational design enables the development of biomimetic design processes specific to architecture, and how they need to be significantly different from established biomimetic processes in engineering disciplines. The paper first explains the fundamental difference between computer-aided and computational design in architecture, as the understanding of this distinction is of critical importance for the research presented. Thereafter, the conceptual relation and possible transfer of principles from natural morphogenesis to design computation are introduced and the related developments of generative, feature-based, constraint-based, process-based and feedback-based computational design methods are presented. This morphogenetic design research is then related to exploratory evolutionary computation, followed by the presentation of two case studies focusing on the exemplary development of spatial envelope morphologies and urban block morphologies. (paper)

  19. Regulatory RNA design through evolutionary computation and strand displacement.

    Science.gov (United States)

    Rostain, William; Landrain, Thomas E; Rodrigo, Guillermo; Jaramillo, Alfonso

    2015-01-01

    The discovery and study of a vast number of regulatory RNAs in all kingdoms of life over the past decades has allowed the design of new synthetic RNAs that can regulate gene expression in vivo. Riboregulators, in particular, have been used to activate or repress gene expression. However, to accelerate and scale up the design process, synthetic biologists require computer-assisted design tools, without which riboregulator engineering will remain a case-by-case design process requiring expert attention. Recently, the design of RNA circuits by evolutionary computation and adapting strand displacement techniques from nanotechnology has proven to be suited to the automated generation of DNA sequences implementing regulatory RNA systems in bacteria. Herein, we present our method to carry out such evolutionary design and how to use it to create various types of riboregulators, allowing the systematic de novo design of genetic control systems in synthetic biology.

  20. Statistical physics and computational methods for evolutionary game theory

    CERN Document Server

    Javarone, Marco Alberto

    2018-01-01

    This book presents an introduction to Evolutionary Game Theory (EGT) which is an emerging field in the area of complex systems attracting the attention of researchers from disparate scientific communities. EGT allows one to represent and study several complex phenomena, such as the emergence of cooperation in social systems, the role of conformity in shaping the equilibrium of a population, and the dynamics in biological and ecological systems. Since EGT models belong to the area of complex systems, statistical physics constitutes a fundamental ingredient for investigating their behavior. At the same time, the complexity of some EGT models, such as those realized by means of agent-based methods, often require the implementation of numerical simulations. Therefore, beyond providing an introduction to EGT, this book gives a brief overview of the main statistical physics tools (such as phase transitions and the Ising model) and computational strategies for simulating evolutionary games (such as Monte Carlo algor...

  1. 9th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Lin, Jerry; Pan, Jeng-Shyang; Tin, Pyke; Yokota, Mitsuhiro

    2016-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2015, the 9th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by the Ministry of Science and Technology, Myanmar, the University of Computer Studies, Yangon, the University of Miyazaki in Japan, Kaohsiung University of Applied Science in Taiwan, Fujian University of Technology in China and VSB-Technical University of Ostrava. ICGEC 2015 was held from 26-28 August 2015 in Yangon, Myanmar. Yangon, the most multiethnic and cosmopolitan city in Myanmar, is the main gateway to the country. Despite being the commercial capital of Myanmar, Yangon is a city engulfed by its rich history and culture, an integration of ancient traditions and spiritual heritage. The stunning SHWEDAGON Pagoda is the centerpiece of Yangon city, which itself is famous for the best British colonial era architecture. Of particular interest in many shops of Bogyoke Aung San Market,...

  2. 8th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Yang, Chin-Yu; Lin, Chun-Wei; Pan, Jeng-Shyang; Snasel, Vaclav; Abraham, Ajith

    2015-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2014, the 8th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Nanchang Institute of Technology in China, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2014 was held from 18-20 October 2014 in Nanchang, China. Nanchang, the capital of Jiangxi Province in southeastern China, is located in the north-central portion of the province. As it is bounded on the west by the Jiuling Mountains and on the east by Poyang Lake, it is famous for its scenery, rich history and cultural sites. Because of its central location relative to the Yangtze and Pearl River Delta regions, it is a major railroad hub in Southern China. The conference was intended as an international forum for researchers and professionals in all areas of genetic and evolutionary computing.

  3. 7th International Conference on Genetic and Evolutionary Computing

    CERN Document Server

    Krömer, Pavel; Snášel, Václav

    2014-01-01

    This volume of Advances in Intelligent Systems and Computing contains accepted papers presented at ICGEC 2013, the 7th International Conference on Genetic and Evolutionary Computing. The conference this year was technically co-sponsored by Waseda University in Japan, Kaohsiung University of Applied Science in Taiwan, and VSB-Technical University of Ostrava. ICGEC 2013 was held in Prague, Czech Republic. Prague is one of the most beautiful cities in the world, whose magical atmosphere has been shaped over ten centuries. Places of the greatest tourist interest are on the Royal Route running from the Powder Tower through Celetna Street to Old Town Square, then across Charles Bridge through the Lesser Town up to the Hradcany Castle. One should not miss the Jewish Town, and the National Gallery with its fine collection of Czech Gothic art, collection of old European art, and a beautiful collection of French art. The conference was intended as an international forum for the res...

  4. Recent advances in swarm intelligence and evolutionary computation

    CERN Document Server

    2015-01-01

    This timely review volume summarizes the state-of-the-art developments in nature-inspired algorithms and applications with the emphasis on swarm intelligence and bio-inspired computation. Topics include the analysis and overview of swarm intelligence and evolutionary computation, hybrid metaheuristic algorithms, bat algorithm, discrete cuckoo search, firefly algorithm, particle swarm optimization, and harmony search as well as convergent hybridization. Application case studies have focused on the dehydration of fruits and vegetables by the firefly algorithm and goal programming, feature selection by the binary flower pollination algorithm, job shop scheduling, single row facility layout optimization, training of feed-forward neural networks, damage and stiffness identification, synthesis of cross-ambiguity functions by the bat algorithm, web document clustering, truss analysis, water distribution networks, sustainable building designs and others. As a timely review, this book can serve as an ideal reference f...

  5. Protein 3D structure computed from evolutionary sequence variation.

    Directory of Open Access Journals (Sweden)

    Debora S Marks

    The evolutionary trajectory of a protein through sequence space is constrained by its function. Collections of sequence homologs record the outcomes of millions of evolutionary experiments in which the protein evolves according to these constraints. Deciphering the evolutionary record held in these sequences and exploiting it for predictive and engineering purposes presents a formidable challenge. The potential benefit of solving this challenge is amplified by the advent of inexpensive high-throughput genomic sequencing. In this paper we ask whether we can infer evolutionary constraints from a set of sequence homologs of a protein. The challenge is to distinguish true co-evolution couplings from the noisy set of observed correlations. We address this challenge using a maximum entropy model of the protein sequence, constrained by the statistics of the multiple sequence alignment, to infer residue pair couplings. Surprisingly, we find that the strength of these inferred couplings is an excellent predictor of residue-residue proximity in folded structures. Indeed, the top-scoring residue couplings are sufficiently accurate and well-distributed to define the 3D protein fold with remarkable accuracy. We quantify this observation by computing, from sequence alone, all-atom 3D structures of fifteen test proteins from different fold classes, ranging in size from 50 to 260 residues, including a G-protein coupled receptor. These blinded inferences are de novo, i.e., they do not use homology modeling or sequence-similar fragments from known structures. The co-evolution signals provide sufficient information to determine accurate 3D protein structure to 2.7-4.8 Å Cα-RMSD error relative to the observed structure, over at least two-thirds of the protein (method called EVfold, details at http://EVfold.org). This discovery provides insight into essential interactions constraining protein evolution and will facilitate a comprehensive survey of the universe of
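
    The paper's maximum-entropy couplings require inverting a global statistical model; as a rough feel for the raw signal they start from, the sketch below scores column pairs of a toy alignment by mutual information. This is only a local proxy: EVfold's contribution is precisely that it removes the transitive correlations plain MI leaves in.

    ```python
    import math
    from collections import Counter

    def mutual_information(msa, i, j):
        """MI between alignment columns i and j: a local co-variation score,
        not the paper's maximum-entropy couplings."""
        col_i = [seq[i] for seq in msa]
        col_j = [seq[j] for seq in msa]
        n = len(msa)
        pi, pj = Counter(col_i), Counter(col_j)
        pij = Counter(zip(col_i, col_j))
        return sum((c / n) * math.log((c / n) / (pi[a] / n * pj[b] / n))
                   for (a, b), c in pij.items())

    # Tiny illustrative alignment: columns 0 and 2 co-vary (a putative contact).
    msa = ["ACDE", "ACDE", "GCHE", "GCHE", "ACDF", "GCHF"]
    L = len(msa[0])
    pairs = sorted(((mutual_information(msa, i, j), i, j)
                    for i in range(L) for j in range(i + 1, L)), reverse=True)
    for mi, i, j in pairs[:3]:
        print(f"columns {i}-{j}: MI = {mi:.3f}")
    ```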

  6. Fundamentals of computational intelligence neural networks, fuzzy systems, and evolutionary computation

    CERN Document Server

    Keller, James M; Fogel, David B

    2016-01-01

    This book covers the three fundamental topics that form the basis of computational intelligence: neural networks, fuzzy systems, and evolutionary computation. The text focuses on inspiration, design, theory, and practical aspects of implementing procedures to solve real-world problems. While other books in the three fields that comprise computational intelligence are written by specialists in one discipline, this book is co-written by the current Editor-in-Chief of IEEE Transactions on Neural Networks and Learning Systems, a former Editor-in-Chief of IEEE Transactions on Fuzzy Systems, and the founding Editor-in-Chief of IEEE Transactions on Evolutionary Computation. The coverage across the three topics is both uniform and consistent in style and notation. Discusses single-layer and multilayer neural networks, radial-basis function networks, and recurrent neural networks. Covers fuzzy set theory, fuzzy relations, fuzzy logic inference, fuzzy clustering and classification, fuzzy measures and fuzz...

  7. Optimization and Assessment of Wavelet Packet Decompositions with Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Thomas Schell

    2003-01-01

    In image compression, the wavelet transformation is a state-of-the-art component. Recently, wavelet packet decomposition has received considerable interest. A popular approach for wavelet packet decomposition is the near-best-basis algorithm using nonadditive cost functions. In contrast to additive cost functions, the wavelet packet decomposition of the near-best-basis algorithm is only suboptimal. We apply methods from the field of evolutionary computation (EC) to test the quality of the near-best-basis results. We observe a phenomenon: the results of the near-best-basis algorithm are inferior in terms of cost-function optimization but are superior in terms of rate/distortion performance compared to EC methods.

  8. Crowd Computing as a Cooperation Problem: An Evolutionary Approach

    Science.gov (United States)

    Christoforou, Evgenia; Fernández Anta, Antonio; Georgiou, Chryssis; Mosteiro, Miguel A.; Sánchez, Angel

    2013-05-01

    Cooperation is one of the socio-economic issues that has received the most attention from the physics community. The problem has mostly been considered by studying games such as the Prisoner's Dilemma or the Public Goods Game. Here, we take a step forward by studying cooperation in the context of crowd computing. We introduce a model loosely based on Principal-agent theory in which people (workers) contribute to the solution of a distributed problem by computing answers and reporting to the problem proposer (master). To go beyond classical approaches involving the concept of Nash equilibrium, we work in an evolutionary framework in which both the master and the workers update their behavior through reinforcement learning. Using a Markov chain approach, we show theoretically that under certain, not very restrictive, conditions the master can ensure the reliability of the answer resulting from the process. Then, we study the model by numerical simulations, finding that convergence, meaning that the system reaches a point in which it always produces reliable answers, may in general be much faster than the upper bounds given by the theoretical calculation. We also discuss the effects of the master's level of tolerance to defectors, about which the theory does not provide information. The discussion shows that the system works even with very large tolerances. We conclude with a discussion of our results and possible directions to carry this research further.
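
    A minimal simulation of the model's flavor: workers repeatedly choose between computing honestly and defecting, the master audits with some probability, and both sides adjust their mixed strategies by reinforcement. All payoff values, learning rates, and update rules below are assumptions for illustration, not the paper's exact Principal-agent formulation.

    ```python
    import random

    random.seed(7)

    # Assumed payoffs: reward for an accepted answer, punishment when caught
    # cheating, and the cost of actually doing the computation.
    REWARD, PUNISH, COST = 1.0, 2.0, 0.3
    LR = 0.05                                  # reinforcement learning rate

    p_honest = [0.5] * 9                       # each worker's mixed strategy
    p_audit = 0.5                              # master's audit probability

    for _ in range(3000):
        honest = [random.random() < p for p in p_honest]
        audited = random.random() < p_audit
        for w, h in enumerate(honest):
            # honest work pays reward minus cost; cheating pays the full
            # reward unless the master audits this round and punishes it
            payoff = (REWARD - COST) if h else (-PUNISH if audited else REWARD)
            target = 1.0 if h else 0.0
            # reinforce toward the action just taken, scaled by its payoff
            p_honest[w] += LR * payoff * (target - p_honest[w])
            p_honest[w] = min(max(p_honest[w], 0.01), 0.99)
        # the master audits more after rounds where the majority answer failed
        majority_ok = sum(honest) > len(honest) / 2
        p_audit += LR * (-0.1 if majority_ok else 0.5)
        p_audit = min(max(p_audit, 0.05), 0.95)

    print("worker honesty probabilities:", [round(p, 2) for p in p_honest])
    print("master audit probability:", round(p_audit, 2))
    ```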

  9. The Security Challenges in the IoT Enabled Cyber-Physical Systems and Opportunities for Evolutionary Computing & Other Computational Intelligence

    OpenAIRE

    He, H.; Maple, C.; Watson, T.; Tiwari, A.; Mehnen, J.; Jin, Y.; Gabrys, Bogdan

    2016-01-01

    The Internet of Things (IoT) has given rise to the fourth industrial revolution (Industrie 4.0), and it brings great benefits by connecting people, processes and data. However, cybersecurity has become a critical challenge in IoT-enabled cyber-physical systems, from connected supply chains and the Big Data produced by huge numbers of IoT devices to industrial control systems. Evolutionary computation combined with other computational intelligence will play an important role in cybersecurity, such as ...

  10. An evolutionary computation approach to examine functional brain plasticity

    Directory of Open Access Journals (Sweden)

    Arnab Roy

    2016-04-01

    One common research goal in systems neurosciences is to understand how the functional relationship between a pair of regions of interest (ROIs) evolves over time. Examining neural connectivity in this way is well suited to the study of developmental processes, learning, and even recovery or treatment designs in response to injury. For most fMRI-based studies, the strength of the functional relationship between two ROIs is defined as the correlation between the average signals representing each region. The drawback to this approach is that much information is lost due to averaging heterogeneous voxels, and therefore functional relationships between a ROI pair that evolve at a spatial scale much finer than the ROIs remain undetected. To address this shortcoming, we introduce a novel evolutionary computation (EC) based voxel-level procedure to examine functional plasticity between an investigator-defined ROI pair by simultaneously using subject-specific BOLD-fMRI data collected from two sessions separated by a finite duration of time. This data-driven procedure detects a sub-region composed of spatially connected voxels from each ROI (a so-called sub-regional pair) such that the pair shows a significant gain/loss of functional relationship strength across the two time points. The procedure is recursive and iteratively finds all statistically significant sub-regional pairs within the ROIs. Using this approach, we examine functional plasticity between the default mode network (DMN) and the executive control network (ECN) during recovery from traumatic brain injury (TBI); the study includes 14 TBI and 12 healthy control subjects. We demonstrate that the EC-based procedure is able to detect functional plasticity where a traditional averaging-based approach fails. The subject-specific plasticity estimates obtained using the EC procedure are highly consistent across multiple runs. Group-level analyses using these plasticity estimates showed an increase in

  11. Combining Conflicting Environmental and Task Requirements in Evolutionary Robotics

    NARCIS (Netherlands)

    Haasdijk, E.W.

    2015-01-01

    The MONEE framework endows collective adaptive robotic systems with the ability to combine environment- and task-driven selection pressures: it enables distributed online algorithms for learning behaviours that ensure both survival and accomplishment of user-defined tasks. This paper explores the

  12. Investigating the Multi-memetic Mind Evolutionary Computation Algorithm Efficiency

    Directory of Open Access Journals (Sweden)

    M. K. Sakharov

    2017-01-01

    Full Text Available In solving practically significant problems of global optimization, the objective function is often of high dimensionality and computational complexity and of a nontrivial landscape as well. Studies show that one optimization method is often not enough for solving such problems efficiently; hybridization of several optimization methods is necessary. One of the most promising contemporary trends in this field are memetic algorithms (MA), which can be viewed as a combination of population-based search for a global optimum and procedures for local refinement of solutions (memes), with the gain provided by their synergy. Since there are relatively few theoretical studies concerning which MA configuration is advisable for solving black-box optimization problems, many researchers turn to adaptive algorithms, which select for the search the most efficient local optimization methods for certain domains of the search space. The article proposes a multi-memetic modification of the simple SMEC algorithm, using random hyper-heuristics. It presents the software implementation of the algorithm and the memes used (the Nelder-Mead method, the random hyper-sphere surface search, and the Hooke-Jeeves method), and reports a comparative study of the efficiency of the proposed algorithm depending on the set and the number of memes. The study was carried out using the Rastrigin, Rosenbrock, and Zakharov multidimensional test functions. Computational experiments were carried out for all possible combinations of memes and for each meme individually. According to the results of the study, conducted by the multi-start method, the combinations of memes comprising the Hooke-Jeeves method were the most successful. These results reflect the rapid convergence of that method to a local optimum in comparison with the other memes, since all methods perform at most a fixed number of iterations. The analysis of the average number of iterations shows that using the most efficient sets of memes allows us to find the optimal
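
    A compact sketch of the random hyper-heuristic idea: after each mutation step, a meme (local search) is drawn at random and applied to the candidate. The two memes below are simplified stand-ins for the hyper-sphere surface search and the Hooke-Jeeves pattern search; all step sizes and budgets are illustrative:

        import random, math

        def rastrigin(x):
            return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

        def hypersphere_search(x, f, radius=0.5, trials=30):
            # Sample points on a sphere of fixed radius around x, keep the best.
            best = list(x)
            for _ in range(trials):
                d = [random.gauss(0, 1) for _ in x]
                n = math.sqrt(sum(v * v for v in d)) or 1.0
                cand = [xi + radius * di / n for xi, di in zip(x, d)]
                if f(cand) < f(best):
                    best = cand
            return best

        def pattern_search(x, f, step=0.5, shrink=0.5, iters=20):
            # A bare-bones compass search in the spirit of Hooke-Jeeves.
            best = list(x)
            for _ in range(iters):
                improved = False
                for i in range(len(best)):
                    for s in (+step, -step):
                        cand = list(best)
                        cand[i] += s
                        if f(cand) < f(best):
                            best, improved = cand, True
                if not improved:
                    step *= shrink
            return best

        memes = [hypersphere_search, pattern_search]
        x = [random.uniform(-5, 5) for _ in range(4)]
        for _ in range(50):
            cand = [xi + random.gauss(0, 0.1) for xi in x]   # mutation step
            cand = random.choice(memes)(cand, rastrigin)     # random hyper-heuristic meme
            if rastrigin(cand) < rastrigin(x):               # elitist acceptance
                x = cand
        print(rastrigin(x))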

  13. Computing the Quartet Distance Between Evolutionary Trees in Time O(n log n)

    DEFF Research Database (Denmark)

    Brodal, Gerth Sølfting; Fagerberg, Rolf; Pedersen, Christian Nørgaard Storm

    2003-01-01

    Evolutionary trees describing the relationship for a set of species are central in evolutionary biology, and quantifying differences between evolutionary trees is therefore an important task. The quartet distance is a distance measure between trees previously proposed by Estabrook, McMorris, and Meacham. The quartet distance between two unrooted evolutionary trees is the number of quartet topology differences between the two trees, where a quartet topology is the topological subtree induced by four species. In this paper we present an algorithm for computing the quartet distance between two unrooted evolutionary trees of n species, where all internal nodes have degree three, in time O(n log n). The previous best algorithm for the problem uses time O(n^2).
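
    For contrast with the O(n log n) algorithm, the definition can be evaluated directly in O(n^4) time from path lengths, using the four-point condition to read off each quartet's topology. A sketch, assuming trees are given as adjacency dictionaries with unit edge lengths (the two example trees are made up):

        from itertools import combinations

        def dists(tree, leaves):
            # All-pairs path lengths by BFS from each leaf (unit edge lengths).
            d = {}
            for src in leaves:
                seen, frontier, depth = {src}, [src], 0
                d[src] = {src: 0}
                while frontier:
                    depth += 1
                    nxt = []
                    for u in frontier:
                        for v in tree[u]:
                            if v not in seen:
                                seen.add(v)
                                d[src][v] = depth
                                nxt.append(v)
                    frontier = nxt
            return d

        def quartet_topology(d, a, b, c, e):
            # Four-point condition: the pairing with the smallest sum is joined.
            sums = {"ab|ce": d[a][b] + d[c][e],
                    "ac|be": d[a][c] + d[b][e],
                    "ae|bc": d[a][e] + d[b][c]}
            return min(sums, key=sums.get)

        def quartet_distance(t1, t2, leaves):
            d1, d2 = dists(t1, leaves), dists(t2, leaves)
            return sum(quartet_topology(d1, *q) != quartet_topology(d2, *q)
                       for q in combinations(leaves, 4))

        # Two five-leaf unrooted binary trees that differ by swapping b and c.
        t1 = {"a": ["x"], "b": ["x"], "x": ["a", "b", "y"], "y": ["x", "c", "z"],
              "c": ["y"], "z": ["y", "d", "e"], "d": ["z"], "e": ["z"]}
        t2 = {"a": ["x"], "c": ["x"], "x": ["a", "c", "y"], "y": ["x", "b", "z"],
              "b": ["y"], "z": ["y", "d", "e"], "d": ["z"], "e": ["z"]}
        print(quartet_distance(t1, t2, "abcde"))   # 2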

  14. Simplified Drift Analysis for Proving Lower Bounds in Evolutionary Computation

    DEFF Research Database (Denmark)

    Oliveto, Pietro S.; Witt, Carsten

    2011-01-01

    Drift analysis is a powerful tool used to bound the optimization time of evolutionary algorithms (EAs). Various previous works apply a drift theorem going back to Hajek in order to show exponential lower bounds on the optimization time of EAs. However, this drift theorem is tedious to read and to apply since it requires two bounds on the moment-generating (exponential) function of the drift. A recent work identifies a specialization of this drift theorem that is much easier to apply. Nevertheless, it is not as simple and not as general as possible. The present paper picks up Hajek’s line...
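
    Schematically (a paraphrase for orientation, not the paper's exact statement), simplified negative-drift theorems take the following form: if the process drifts away from the target by at least a constant within an interval of length \ell = b - a, and step sizes have exponentially decaying tails, then the first hitting time T_a of the target is exponential in \ell with overwhelming probability:

        \mathbb{E}\left[X_{t+1} - X_t \mid a < X_t < b\right] \;\ge\; \varepsilon > 0,
        \qquad
        \Pr\left(|X_{t+1} - X_t| \ge j \mid X_t > a\right) \;\le\; \frac{r(\ell)}{(1+\delta)^{j}}
        \;\Longrightarrow\;
        \Pr\left(T_a \le 2^{c\ell}\right) = 2^{-\Omega(\ell)},

    where T_a = \min\{t \ge 0 : X_t \le a\} given X_0 \ge b, and c, \delta > 0 are constants.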

  15. Practical Applications of Evolutionary Computation to Financial Engineering Robust Techniques for Forecasting, Trading and Hedging

    CERN Document Server

    Iba, Hitoshi

    2012-01-01

    “Practical Applications of Evolutionary Computation to Financial Engineering” presents state-of-the-art techniques in financial engineering using recent results in machine learning and evolutionary computation. This book bridges the gap between academics in computer science and traders, and explains the basic ideas of the proposed systems and the financial problems in ways that can be understood by readers without previous knowledge of either field. To cement the ideas discussed in the book, software packages are offered that implement the systems described within. The book is structured so that each chapter can be read independently from the others. Chapters 1 and 2 describe evolutionary computation. The third chapter is an introduction to financial engineering problems for readers who are unfamiliar with this area. The following chapters each deal, in turn, with a different problem in the financial engineering field, describing each problem in detail and focusing on solutions based on evolutio...

  16. A hybrid finite element analysis and evolutionary computation method for the design of lightweight lattice components with optimized strut diameter

    DEFF Research Database (Denmark)

    Salonitis, Konstantinos; Chantzis, Dimitrios; Kappatos, Vasileios

    2017-01-01

    ...approaches or with the use of topology optimization methodologies. An optimization approach utilizing multipurpose optimization algorithms has not been proposed yet. This paper presents a novel user-friendly method for the design optimization of lattice components towards weight minimization, which combines finite element analysis and evolutionary computation. The proposed method utilizes the cell homogenization technique in order to reduce the computational cost of the finite element analysis, and a genetic algorithm in order to search for the most lightweight lattice configuration. A bracket consisting...

  17. Basic emotions and adaptation. A computational and evolutionary model.

    Science.gov (United States)

    Pacella, Daniela; Ponticorvo, Michela; Gigliotta, Onofrio; Miglino, Orazio

    2017-01-01

    The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then switching their behavior

  18. A security mechanism based on evolutionary game in fog computing.

    Science.gov (United States)

    Sun, Yan; Lin, Fuhong; Zhang, Nan

    2018-02-01

    Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation among users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate in application tasks.
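
    A minimal replicator-dynamics sketch of a cooperate/defect game with a supervision penalty of the kind a credible third party could impose; the payoff values and penalty levels are illustrative assumptions, not the paper's model:

        # Replicator dynamics for a 2-strategy game (cooperate vs. defect)
        # with an illustrative supervision penalty applied to defectors.
        def step(x, penalty, dt=0.01):
            # x: fraction of cooperators. Payoffs of a Prisoner's-Dilemma flavour:
            b, c = 3.0, 1.0                      # benefit and cost of cooperation
            f_coop = b * x - c                   # expected payoff of a cooperator
            f_def = b * x - penalty              # defector free-rides but risks the penalty
            avg = x * f_coop + (1 - x) * f_def
            return x + dt * x * (f_coop - avg)   # replicator equation

        for penalty in (0.5, 2.0):
            x = 0.2
            for _ in range(10000):
                x = step(x, penalty)
            print(f"penalty={penalty}: cooperator share -> {x:.2f}")

    With the penalty below the cost of cooperation the population drifts to full defection; above it, cooperation becomes the stable state, which is the qualitative point of introducing the supervising third party.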

  19. A security mechanism based on evolutionary game in fog computing

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2018-02-01

    Full Text Available Fog computing is a distributed computing paradigm at the edge of the network that requires cooperation among users and sharing of resources. When users in fog computing open their resources, their devices are easily intercepted and attacked because they are accessed through wireless networks and present an extensive geographical distribution. In this study, a credible third party was introduced to supervise the behavior of users and protect the security of user cooperation. A fog computing security mechanism based on the human nervous system is proposed, and the strategy for a stable system evolution is calculated. The MATLAB simulation results show that the proposed mechanism can effectively reduce the number of attack behaviors and stimulate users to cooperate in application tasks.

  20. Basic emotions and adaptation. A computational and evolutionary model.

    Directory of Open Access Journals (Sweden)

    Daniela Pacella

    Full Text Available The core principles of the evolutionary theories of emotions declare that affective states represent crucial drives for action selection in the environment and regulate the behavior and adaptation of natural agents in ancestrally recurrent situations. While many studies have used autonomous artificial agents to simulate emotional responses and the way these patterns can affect decision-making, few approaches have tried to analyze the evolutionary emergence of affective behaviors directly from the specific adaptive problems posed by the ancestral environment. A model of the evolution of affective behaviors is presented using simulated artificial agents equipped with neural networks and physically inspired by the architecture of the iCub humanoid robot. We use genetic algorithms to train populations of virtual robots across generations, and investigate the spontaneous emergence of basic emotional behaviors in different experimental conditions. In particular, we focus on studying the emotion of fear; therefore the environment explored by the artificial agents can contain stimuli that are safe or dangerous to pick. The simulated task is based on classical conditioning, and the agents are asked to learn a strategy to recognize whether the environment is safe or represents a threat to their lives and to select the correct action to perform in the absence of any visual cues. The simulated agents have special input units in their neural structure whose activation keeps track of their actual "sensations" based on the outcome of past behavior. We train five different neural network architectures and then test the best-ranked individuals, comparing their performances and analyzing the unit activations in each individual's life cycle. We show that the agents, regardless of the presence of recurrent connections, spontaneously evolved the ability to cope with a potentially dangerous environment by collecting information about the environment and then

  1. A sense of life: computational and experimental investigations with models of biochemical and evolutionary processes.

    Science.gov (United States)

    Mishra, Bud; Daruwala, Raoul-Sam; Zhou, Yi; Ugel, Nadia; Policriti, Alberto; Antoniotti, Marco; Paxia, Salvatore; Rejali, Marc; Rudra, Archisman; Cherepinsky, Vera; Silver, Naomi; Casey, William; Piazza, Carla; Simeoni, Marta; Barbano, Paolo; Spivak, Marina; Feng, Jiawu; Gill, Ofer; Venkatesh, Mysore; Cheng, Fang; Sun, Bing; Ioniata, Iuliana; Anantharaman, Thomas; Hubbard, E Jane Albert; Pnueli, Amir; Harel, David; Chandru, Vijay; Hariharan, Ramesh; Wigler, Michael; Park, Frank; Lin, Shih-Chieh; Lazebnik, Yuri; Winkler, Franz; Cantor, Charles R; Carbone, Alessandra; Gromov, Mikhael

    2003-01-01

    We collaborate in a research program aimed at creating a rigorous framework, experimental infrastructure, and computational environment for understanding, experimenting with, manipulating, and modifying a diverse set of fundamental biological processes at multiple scales and spatio-temporal modes. The novelty of our research is based on an approach that (i) requires coevolution of experimental science and theoretical techniques and (ii) exploits a certain universality in biology guided by a parsimonious model of evolutionary mechanisms operating at the genomic level and manifesting at the proteomic, transcriptomic, phylogenic, and other higher levels. Our current program in "systems biology" endeavors to marry large-scale biological experiments with the tools to ponder and reason about large, complex, and subtle natural systems. To achieve this ambitious goal, ideas and concepts are combined from many different fields: biological experimentation, applied mathematical modeling, computational reasoning schemes, and large-scale numerical and symbolic simulations. From a biological viewpoint, the basic issues are many: (i) understanding common and shared structural motifs among biological processes; (ii) modeling biological noise due to interactions among a small number of key molecules or loss of synchrony; (iii) explaining the robustness of these systems in spite of such noise; and (iv) cataloging multistatic behavior and adaptation exhibited by many biological processes.

  2. Computational Modeling of Teaching and Learning through Application of Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Richard Lamb

    2015-09-01

    Full Text Available Within the mind there are a myriad of ideas that make sense within the bounds of everyday experience but are not reflective of how the world actually exists; this is particularly true in the domain of science. Classroom learning with teacher explanation is a bridge through which these naive understandings can be brought in line with scientific reality. The purpose of this paper is to examine how the application of a Multiobjective Evolutionary Algorithm (MOEA) can work in concert with an existing computational model to effectively model critical thinking in the science classroom. An evolutionary algorithm is an algorithm that iteratively optimizes machine-learning based computational models. The research question is: does the application of an evolutionary algorithm provide a means to optimize the Student Task and Cognition Model (STAC-M), and does the optimized model sufficiently represent and predict teaching and learning outcomes in the science classroom? Within this computational study, the authors outline and simulate the effect of teaching on the ability of a “virtual” student to solve a Piagetian task. Using the Student Task and Cognition Model (STAC-M), a computational model of student cognitive processing in science class developed in 2013, the authors complete a computational experiment which examines the role of cognitive retraining on student learning. Comparison of the STAC-M and the STAC-M with inclusion of the Multiobjective Evolutionary Algorithm shows greater success in solving the Piagetian science tasks post cognitive retraining with the Multiobjective Evolutionary Algorithm. This illustrates the potential uses of cognitive and neuropsychological computational modeling in educational research. The authors also outline the limitations and assumptions of computational modeling.

  3. Designing a parallel evolutionary algorithm for inferring gene networks on the cloud computing environment.

    Science.gov (United States)

    Lee, Wei-Po; Hsiao, Yu-Ting; Hwang, Wei-Che

    2014-01-16

    To improve on the tedious task of reconstructing gene networks by experimentally testing the possible interactions between genes, it has become a trend to adopt an automated reverse-engineering procedure instead. Some evolutionary algorithms have been suggested for deriving network parameters. However, to infer large networks with an evolutionary algorithm, two important issues must be addressed: premature convergence and high computational cost. To tackle the former problem and enhance the performance of traditional evolutionary algorithms, it is advisable to use parallel-model evolutionary algorithms. To overcome the latter and speed up the computation, the mechanism of cloud computing is a promising solution: most popular is the MapReduce programming model, a fault-tolerant framework for implementing parallel algorithms to infer large gene networks. This work presents a practical framework to infer large gene networks by developing and parallelizing a hybrid GA-PSO optimization method. Our parallel method is extended to work with the Hadoop MapReduce programming model and is executed in different cloud computing environments. To evaluate the proposed approach, we use the well-known open-source software GeneNetWeaver to create several yeast S. cerevisiae sub-networks and use them to produce gene profiles. Experiments have been conducted and the results analyzed. They show that our parallel approach can be successfully used to infer networks with desired behaviors and that the computation time can be largely reduced. Parallel population-based algorithms can effectively determine network parameters and perform better than the widely used sequential algorithms in gene network inference. These parallel algorithms can be distributed to the cloud computing environment to speed up the computation. By coupling the parallel-model population-based optimization method and the parallel computational framework, high
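
    The parallelization idea can be sketched with Python's multiprocessing Pool standing in for the MapReduce layer: candidate parameter vectors are "mapped" to fitness evaluations in parallel, then "reduced" by selection. The fitness function below is a placeholder, not a real gene-network model, and the GA-PSO step is simplified (the personal-best term is omitted):

        import random
        from multiprocessing import Pool

        def fitness(params):
            # Placeholder for the expensive simulation of a candidate gene network
            # against observed expression profiles (e.g. an S-system model error).
            return sum((p - 0.5) ** 2 for p in params)

        def ga_pso_step(pop, velocities, gbest, w=0.7, c2=1.5):
            # PSO-style move toward the global best, plus GA-style mutation.
            new_pop = []
            for x, v in zip(pop, velocities):
                v[:] = [w * vi + c2 * random.random() * (g - xi)
                        for vi, xi, g in zip(v, x, gbest)]
                new_pop.append([xi + vi + random.gauss(0, 0.01)
                                for xi, vi in zip(x, v)])
            return new_pop

        if __name__ == "__main__":
            dim, size = 10, 40
            pop = [[random.random() for _ in range(dim)] for _ in range(size)]
            vel = [[0.0] * dim for _ in range(size)]
            with Pool() as pool:
                for _ in range(30):
                    scores = pool.map(fitness, pop)     # "map": parallel evaluation
                    gbest = pop[min(range(size), key=scores.__getitem__)]  # "reduce"
                    pop = ga_pso_step(pop, vel, gbest)
                print(min(pool.map(fitness, pop)))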

  4. A case study of evolutionary computation of biochemical adaptation

    International Nuclear Information System (INIS)

    François, Paul; Siggia, Eric D

    2008-01-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein–protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature

  5. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable-speed wind turbine using a bio-inspired scheme. The methodology exploits two proficient swarm-intelligence based evolutionary soft computational procedures: the particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller intended for the small-damping plant of the DFIG. An overview of wind energy and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller design for the DFIG based WECS using PSO and BFO is described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active and reactive power, and DC-link voltage improve slightly with the evolutionary soft computational procedures. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG based wind energy conversion system.
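
    To illustrate PSO-based controller tuning, here is a bare-bones particle swarm minimizing a cost for the PI gains (Kp, Ki) of a toy first-order plant; the plant, the cost function, and all constants are stand-ins, not the DFIG model:

        import random

        def cost(kp, ki):
            # Integral of squared error for a unit step on a toy first-order
            # plant dy/dt = -y + u, under PI control (stand-in for the DFIG plant).
            y = e_int = ise = 0.0
            dt = 0.01
            for _ in range(2000):
                e = 1.0 - y
                e_int += e * dt
                u = kp * e + ki * e_int
                y += dt * (-y + u)
                ise += e * e * dt
            return ise

        n, w, c1, c2 = 20, 0.7, 1.5, 1.5
        pos = [[random.uniform(0, 5), random.uniform(0, 5)] for _ in range(n)]
        vel = [[0.0, 0.0] for _ in range(n)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=lambda p: cost(*p))
        for _ in range(50):
            for i in range(n):
                for d in range(2):
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * random.random() * (pbest[i][d] - pos[i][d])
                                 + c2 * random.random() * (gbest[d] - pos[i][d]))
                    # Clamp gains to a plausible box to keep the toy plant stable.
                    pos[i][d] = min(10.0, max(0.0, pos[i][d] + vel[i][d]))
                if cost(*pos[i]) < cost(*pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest, key=lambda p: cost(*p))
        print("tuned gains:", gbest, "cost:", cost(*gbest))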

  6. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence

    OpenAIRE

    Kiwia, D; Dehghantanha, A; Choo, K-KR; Slaughter, J

    2017-01-01

    Malware such as banking Trojans are popular with financially-motivated cybercriminals. Detection of banking Trojans remains a challenging task, due to the constant evolution of techniques used to obfuscate and circumvent existing detection and security solutions. Having a malware taxonomy can facilitate the design of mitigation strategies such as those based on evolutionary computational intelligence. Specifically, in this paper, we propose a cyber kill chain based taxonomy of banking Trojans...

  7. OT-Combiners Via Secure Computation

    DEFF Research Database (Denmark)

    Harnik, Danny; Ishai, Yuval; Kushilevitz, Eyal

    2008-01-01

    An OT-combiner implements a secure oblivious transfer (OT) protocol using oracle access to n OT-candidates of which at most t may be faulty. We introduce a new general approach for combining OTs by making a simple and modular use of protocols for secure computation. Specifically, we obtain an OT-combiner tolerating a constant fraction of faulty candidates (t = Ω(n)). Previous OT-combiners required either ω(n) or poly(k) calls to the n candidates, where k is a security parameter, and produced only a single secure OT. Our approach can also strengthen the security and improve the efficiency of previous OT-combiners. In particular, we obtain the first constant-rate OT-combiners in which the number of secure OTs being produced is a constant fraction of the total number of calls to the OT-candidates, while still tolerating a constant fraction of faulty candidates. We demonstrate the usefulness of the latter result by presenting several applications that are of independent interest.

  8. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Vijayakumar, K; Panigrahi, Bijaya; Das, Swagatam

    2017-01-01

    The volume is a collection of high-quality peer-reviewed research papers presented in the International Conference on Artificial Intelligence and Evolutionary Computation in Engineering Systems (ICAIECES 2016) held at SRM University, Chennai, Tamilnadu, India. This conference is an international forum for industry professionals and researchers to deliberate and state their research findings, discuss the latest advancements and explore the future directions in the emerging areas of engineering and technology. The book presents original work and novel ideas, information, techniques and applications in the field of communication, computing and power technologies.

  9. International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems

    CERN Document Server

    Bhaskar, M; Panigrahi, Bijaya; Das, Swagatam

    2016-01-01

    The book is a collection of high-quality peer-reviewed research papers presented at the first International Conference on Artificial Intelligence and Evolutionary Computations in Engineering Systems (ICAIECES-2015), held at Velammal Engineering College (VEC), Chennai, India during 22 – 23 April 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry presented their original work and exchanged ideas, information, techniques and applications in the fields of communication, computing and power technologies.

  10. Artificial Intelligence, Evolutionary Computing and Metaheuristics In the Footsteps of Alan Turing

    CERN Document Server

    2013-01-01

    Alan Turing pioneered many research areas, such as artificial intelligence, computability, heuristics and pattern formation. Nowadays, in the information age, it is hard to imagine how the world would be without computers and the Internet. Without Turing's work, especially the core concept of the Turing Machine at the heart of every computer, mobile phone and microchip today, so many things on which we are so dependent would be impossible. 2012 is the Alan Turing year -- a centenary celebration of the life and work of Alan Turing. To celebrate Turing's legacy and follow in the footsteps of this brilliant mind, we take this golden opportunity to review the latest developments in the areas of artificial intelligence, evolutionary computation and metaheuristics, all of which can be traced back to Turing's pioneering work. Topics include the Turing test, the Turing machine, artificial intelligence, cryptography, software testing, image processing, neural networks, and nature-inspired algorithms such as the bat algorithm and cuckoo sear...

  11. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Science.gov (United States)

    Mrowinski, Maciej J; Fronczak, Piotr; Fronczak, Agata; Ausloos, Marcel; Nedic, Olgica

    2017-01-01

    With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  12. Artificial intelligence in peer review: How can evolutionary computation support journal editors?

    Directory of Open Access Journals (Sweden)

    Maciej J Mrowinski

    Full Text Available With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems.

  13. Genomicus 2018: karyotype evolutionary trees and on-the-fly synteny computing.

    Science.gov (United States)

    Nguyen, Nga Thi Thuy; Vincens, Pierre; Roest Crollius, Hugues; Louis, Alexandra

    2018-01-04

    Since 2010, the Genomicus web server has been available online at http://genomicus.biologie.ens.fr/genomicus. This graphical browser provides access to comparative genomic analyses in four different phyla (Vertebrates, Plants, Fungi, and non-vertebrate Metazoans). Users can analyse genomic information from extant species, as well as ancestral gene content and gene order for vertebrates and flowering plants, in an integrated evolutionary context. New analyses and visualization tools have recently been implemented in Genomicus Vertebrate. Karyotype structures from several genomes can now be compared along an evolutionary pathway (Multi-KaryotypeView), and synteny blocks can be computed and visualized between any two genomes (PhylDiagView). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Combining Environment-Driven Adaptation and Task-Driven Optimisation in Evolutionary Robotics

    NARCIS (Netherlands)

    Haasdijk, E.W.; Bredeche, Nicolas; Eiben, A.E.

    2014-01-01

    Embodied evolutionary robotics is a sub-field of evolutionary robotics that employs evolutionary algorithms on the robotic hardware itself, during the operational period, i.e., in an on-line fashion. This enables robotic systems that continuously adapt, and are therefore capable of (re-)adjusting

  15. Multi-objective optimization of HVAC system with an evolutionary computation algorithm

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Tang, Fan; Xu, Guanglin

    2011-01-01

    A data-mining approach for the optimization of an HVAC (heating, ventilation, and air conditioning) system is presented. A predictive model of the HVAC system is derived by data-mining algorithms, using a dataset collected from an experiment conducted at a research facility. To minimize the energy use while maintaining the corresponding IAQ (indoor air quality) within a user-defined range, a multi-objective optimization model is developed. The solutions of this model are set points of the control system derived with an evolutionary computation algorithm. The controllable input variables - supply air temperature and supply air duct static pressure set points - are generated to reduce the energy use. The results produced by the evolutionary computation algorithm show that the control strategy saves energy by optimizing the operation of the HVAC system. -- Highlights: → A data-mining approach for the optimization of a heating, ventilation, and air conditioning (HVAC) system is presented. → The data used in the project were collected from an experiment conducted at an energy research facility. → The approach presented in the paper leads to significant energy savings without compromising indoor air quality. → The energy savings are accomplished by computing set points for the supply air temperature and the supply air duct static pressure.
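
    The optimization step can be illustrated by a tiny random-sampling Pareto filter over the two set points; the energy and IAQ models, the IAQ band, and the units are hypothetical placeholders for the data-mined models:

        import random

        def energy(temp_sp, press_sp):
            # Placeholder energy model: cooler supply air and higher duct
            # pressure cost more energy (stand-in for the data-mined model).
            return (22.0 - temp_sp) ** 2 + 0.5 * press_sp ** 2

        def iaq(temp_sp, press_sp):
            # Placeholder indoor-air-quality index; higher is better.
            return 0.6 * press_sp - 0.1 * abs(temp_sp - 16.0)

        candidates = [(random.uniform(12, 22), random.uniform(0.5, 3.0))
                      for _ in range(500)]
        feasible = [c for c in candidates if iaq(*c) >= 0.8]   # user-defined IAQ band

        def dominated(c, others):
            ec, qc = energy(*c), iaq(*c)
            return any(energy(*o) <= ec and iaq(*o) >= qc and
                       (energy(*o) < ec or iaq(*o) > qc) for o in others)

        pareto = [c for c in feasible if not dominated(c, feasible)]
        for temp_sp, press_sp in sorted(pareto)[:5]:
            print(f"T={temp_sp:.1f} C  dP={press_sp:.2f}  "
                  f"energy={energy(temp_sp, press_sp):.1f}  "
                  f"iaq={iaq(temp_sp, press_sp):.2f}")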

  16. Solution of Fractional Order System of Bagley-Torvik Equation Using Evolutionary Computational Intelligence

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Zahoor Raja

    2011-01-01

    Full Text Available A stochastic technique has been developed for the solution of a fractional-order system represented by the Bagley-Torvik equation. The mathematical model of the equation was developed with the help of feed-forward artificial neural networks. The training of the networks was carried out with evolutionary computational intelligence based on a genetic algorithm hybridized with a pattern-search technique. The designed scheme was successfully applied to different forms of the equation. Results are compared with standard approximate analytic solutions, stochastic numerical solvers, and exact solutions.
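
    For reference, the Bagley-Torvik equation is the fractional differential equation

        A\,y''(t) + B\,D^{3/2}y(t) + C\,y(t) = f(t), \qquad y(0) = y_0,\quad y'(0) = y_1,

    where D^{3/2} denotes the fractional derivative of order 3/2 (in the Riemann-Liouville or Caputo sense) and A ≠ 0, B, C are constants.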

  17. An evolutionary computing framework toward object extraction from satellite images

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Image interpretation domains have witnessed the application of many intelligent methodologies over the past decade; however, the effective use of evolutionary computing techniques for feature detection has been less explored. In this paper, we critically analyze the possibility of using cellular neural networks for accurate feature detection. Contextual knowledge has been effectively represented by incorporating spectral and spatial aspects using an adaptive kernel strategy. The developed methodology has been compared with traditional approaches in an object-based context, and investigations revealed that considerable success has been achieved with the procedure. Intelligent interpretation, automatic interpolation, and effective contextual representations are features of the system.

  18. Combining evolutionary algorithms with oblique decision trees to detect bent-double galaxies

    Science.gov (United States)

    Cantu-Paz, Erick; Kamath, Chandrika

    2000-10-01

    Decision trees have long been popular in classification as they use simple and easy-to-understand tests at each node. Most variants of decision trees test a single attribute at a node, leading to axis-parallel trees, where the test results in a hyperplane which is parallel to one of the dimensions in the attribute space. These trees can be rather large and inaccurate in cases where the concept to be learned is best approximated by oblique hyperplanes. In such cases, it may be more appropriate to use an oblique decision tree, where the decision at each node is a linear combination of the attributes. Oblique decision trees have not gained wide popularity, in part due to the complexity of constructing good oblique splits and the tendency of existing splitting algorithms to get stuck in local minima. Several alternatives have been proposed to handle these problems, including randomization in conjunction with deterministic hill-climbing and the use of simulated annealing. In this paper, we use evolutionary algorithms (EAs) to determine the split. EAs are well suited for this problem because of their global search properties, their tolerance to noisy fitness evaluations, and their scalability to large-dimensional search spaces. We demonstrate our technique on a synthetic data set, and then apply it to a practical problem from astronomy, namely, the classification of galaxies with a bent-double morphology. In addition, we describe our experiences with several split evaluation criteria. Our results suggest that, in some cases, the evolutionary approach is faster and more accurate than existing oblique decision tree algorithms. However, for our astronomical data, the accuracy is not significantly different from that of axis-parallel trees.
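
    A sketch of the core search step, assuming a simple (1+λ) evolution strategy over the hyperplane coefficients scored by weighted Gini impurity of the induced split (the data and all settings are synthetic, and this is only one of many possible split criteria):

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic 2-class data best separated by an oblique hyperplane.
        X = rng.standard_normal((300, 2))
        y = (X[:, 0] + 0.7 * X[:, 1] > 0.2).astype(int)

        def gini(labels):
            if len(labels) == 0:
                return 0.0
            p = np.bincount(labels, minlength=2) / len(labels)
            return 1.0 - np.sum(p * p)

        def split_impurity(w):
            # w = (a, b, c) defines the oblique test a*x1 + b*x2 <= c at the node.
            left = X @ w[:2] <= w[2]
            nl = left.sum()
            return (nl * gini(y[left]) + (len(y) - nl) * gini(y[~left])) / len(y)

        # (1 + lambda) evolution strategy over the hyperplane coefficients.
        w = rng.standard_normal(3)
        for _ in range(200):
            offspring = w + 0.3 * rng.standard_normal((10, 3))
            scores = [split_impurity(o) for o in offspring]
            i = int(np.argmin(scores))
            if scores[i] <= split_impurity(w):
                w = offspring[i]
        print("oblique split:", w, "impurity:", split_impurity(w))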

  19. Investigating preferences for color-shape combinations with gaze driven optimization method based on evolutionary algorithms.

    Science.gov (United States)

    Holmes, Tim; Zanker, Johannes M

    2013-01-01

    Studying aesthetic preference is notoriously difficult because it targets individual experience. Eye movements provide a rich source of behavioral measures that directly reflect subjective choice. To determine individual preferences for simple composition rules, we here use fixation duration as the fitness measure in a Gaze Driven Evolutionary Algorithm (GDEA), which has been demonstrated as a tool to identify aesthetic preferences (Holmes and Zanker, 2012). In the present study, the GDEA was used to investigate the preferred combinations of color and shape which have been promoted in the Bauhaus arts school. We used the same three shapes (square, circle, triangle) used by Kandinsky (1923), with the three-color palette from the original experiment (A), an extended seven-color palette (B), and eight different shape orientations (C). Participants were instructed to look for their preferred circle, triangle or square in displays with eight stimuli of different shapes, colors and rotations, in an attempt to test for a strong preference for red squares, yellow triangles and blue circles in such an unbiased experimental design and with an extended set of possible combinations. We tested six participants extensively on the different conditions and found consistent preferences for color-shape combinations for individuals, but little evidence at the group level for a clear color/shape preference consistent with Kandinsky's claims, apart from some weak link between yellow and triangles. Our findings suggest substantial inter-individual differences in the presence of stable individual associations of color and shapes, but also that these associations are robust within a single individual. These individual differences go some way toward challenging the claims of a universal preference for color/shape combinations proposed by Kandinsky, but also indicate that a much larger sample size would be needed to confidently reject that hypothesis. Moreover, these experiments highlight the

  20. Discovering Unique, Low-Energy Transition States Using Evolutionary Molecular Memetic Computing

    DEFF Research Database (Denmark)

    Ellabaan, Mostafa M Hashim; Ong, Y.S.; Handoko, S.D.

    2013-01-01

    In the last few decades, the identification of transition states has experienced significant growth in research interest from various scientific communities. According to transition state theory, reaction paths and landscape analysis, as well as many thermodynamic properties of biochemical systems, can be accurately identified through the transition states. Transition states describe the paths of molecular systems in transiting across stable states. In this article, we present the discovery of unique, low-energy transition states and showcase the efficacy of their identification using the memetic computing paradigm under a Molecular Memetic Computing (MMC) framework. In essence, the MMC is equipped with a tree-based representation of non-cyclic molecules and covalent-bond-driven evolutionary operators, in addition to the typical backbone of memetic algorithms. Herein, we employ genetic algorithm

  1. Investigating preferences for colour-shape combinations with gaze driven optimization method based on evolutionary algorithms.

    Directory of Open Access Journals (Sweden)

    Tim Holmes

    2013-12-01

    Full Text Available Studying aesthetic preference is notoriously difficult because it targets individual experience. Eye movements provide a rich source of behavioural measures that directly reflect subjective choice. To determine individual preferences for simple composition rules, we here use fixation duration as the fitness measure in a Gaze Driven Evolutionary Algorithm (GDEA), which has been used as a tool to identify aesthetic preferences (Holmes & Zanker, 2012). In the present study, the GDEA was used to investigate the preferred combination of colour and shape which have been promoted in the Bauhaus arts school. We used the same three shapes (square, circle, triangle) used by Kandinsky (1923), with the three-colour palette from the original experiment (A), an extended seven-colour palette (B), and eight different shape orientations (C). Participants were instructed to look for their preferred circle, triangle or square in displays with eight stimuli of different shapes, colours and rotations, in an attempt to test for a strong preference for red squares, yellow triangles and blue circles in such an unbiased experimental design and with an extended set of possible combinations. We tested six participants extensively on the different conditions and found consistent preferences for individuals, but little evidence at the group level for preference consistent with Kandinsky’s claims, apart from some weak link between yellow and triangles. Our findings suggest substantial inter-individual differences in the presence of stable individual associations of colour and shapes, but also that these associations are robust within a single individual. These individual differences go some way towards challenging the claims of the universal preference for colour/shape combinations proposed by Kandinsky, but also indicate that a much larger sample size would be needed to confidently reject that hypothesis. Moreover, these experiments highlight the vast potential of the GDEA in experimental aesthetics

  2. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

    Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interaction amongst their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of predicting SIPs is how to exploit computational approaches for SIP detection based on the evolutionary information contained in the protein sequence. In this work, we present a novel computational approach named WELM-LAG, which combines a Weighed-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) features to predict SIPs based on the protein sequence. The major improvement of our method lies in an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), and then employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94% and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on the human and yeast datasets. Comparative results indicate that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/ .
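
    The feature-extraction step can be sketched as follows, under our assumption (for illustration only) that Local Average Group splits the PSSM's rows into a fixed number of contiguous groups and averages within each, yielding a fixed-length vector regardless of sequence length:

        import numpy as np

        def lag_features(pssm, n_groups=10):
            # Local Average Group (assumed form): average PSSM rows within
            # contiguous groups along the sequence; pssm has shape
            # (sequence_length, 20). Returns n_groups * 20 features.
            groups = np.array_split(pssm, n_groups, axis=0)
            return np.concatenate([g.mean(axis=0) for g in groups])

        # Example: a random stand-in for a PSI-BLAST PSSM of a 137-residue protein.
        pssm = np.random.default_rng(0).normal(size=(137, 20))
        x = lag_features(pssm)
        print(x.shape)   # (200,) -> fixed-length input for the WELM classifier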

  3. Combining Interactive Infrastructure Modeling and Evolutionary Algorithm Optimization for Sustainable Water Resources Design

    Science.gov (United States)

    Smith, R.; Kasprzyk, J. R.; Zagona, E. A.

    2013-12-01

    Population growth and climate change, combined with difficulties in building new infrastructure, motivate portfolio-based solutions to ensuring sufficient water supply. Powerful simulation models with graphical user interfaces (GUI) are often used to evaluate infrastructure portfolios; these GUI based models require manual modification of the system parameters, such as reservoir operation rules, water transfer schemes, or system capacities. Multiobjective evolutionary algorithm (MOEA) based optimization can be employed to balance multiple objectives and automatically suggest designs for infrastructure systems, but MOEA based decision support typically uses a fixed problem formulation (i.e., a single set of objectives, decisions, and constraints). This presentation suggests a dynamic framework for linking GUI-based infrastructure models with MOEA search. The framework begins with an initial formulation which is solved using a MOEA. Then, stakeholders can interact with candidate solutions, viewing their properties in the GUI model. This is followed by changes in the formulation which represent users' evolving understanding of exigent system properties. Our case study is built using RiverWare, an object-oriented, data-centered model that facilitates the representation of a diverse array of water resources systems. Results suggest that assumptions within the initial MOEA search are violated after investigating tradeoffs and reveal how formulations should be modified to better capture stakeholders' preferences.

  4. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic

  5. Towards a Population Dynamics Theory for Evolutionary Computing: Learning from Biological Population Dynamics in Nature

    Science.gov (United States)

    Ma, Zhanshan (Sam)

    In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on experience or an ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well-accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility, i.e., whether or not emulating natural population dynamics improves EC performance; (ii) comparatively study the underlying mechanisms, i.e., why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve this general goal [27][30]-[32]. In the following, I start with an extremely brief introduction to the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three

  6. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set-oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo in different domains. Summarizing, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 make a contribution to this goal.

  7. REAL TIME PULVERISED COAL FLOW SOFT SENSOR FOR THERMAL POWER PLANTS USING EVOLUTIONARY COMPUTATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    B. Raja Singh

    2015-01-01

    Full Text Available The pulverised coal preparation system (coal mill) is the heart of a coal-fired power plant. The complex nature of the milling process, together with the complex interactions between coal quality and mill conditions, leads to immense difficulties in obtaining an effective mathematical model of the milling process. In this paper, vertical spindle coal mills (bowl mills), which are widely used in coal-fired power plants, are considered for the model development, and the pulverised fuel flow rate is computed using the model. For the steady-state coal mill model development, plant measurements such as air flow rate, differential pressure across the mill, etc., are considered as inputs/outputs. The mathematical model is derived from an analysis of energy, heat and mass balances. An evolutionary computation technique is adopted to identify the unknown model parameters using on-line plant data. Validation results indicate that this model is accurate enough to represent the whole process of steady-state coal mill dynamics. This coal mill model is being implemented on-line in a 210 MW thermal power plant and the results obtained are compared with plant data. The model is found to be accurate and robust, and will work well in power plants for system monitoring. Therefore, the model can be used for on-line monitoring, fault detection, and control to improve the efficiency of combustion.
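
    Identification of unknown model parameters from on-line plant data can be sketched as an evolutionary search minimizing the squared error between model outputs and measurements; the one-line "mill model", the coefficients, and the data points below are placeholders for the real energy/heat/mass-balance equations:

        import random

        # Hypothetical operating points: (air_flow, diff_pressure) -> pf_flow.
        plant_data = [((70.0, 2.1), 40.2), ((85.0, 2.8), 49.1), ((95.0, 3.3), 55.1)]

        def mill_model(inputs, params):
            # Placeholder steady-state model: pf_flow = k1*air_flow + k2*dP + k0.
            (air, dp), (k1, k2, k0) = inputs, params
            return k1 * air + k2 * dp + k0

        def sse(params):
            return sum((mill_model(u, params) - y) ** 2 for u, y in plant_data)

        # (mu + lambda) evolution strategy over the unknown coefficients.
        pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(20)]
        for _ in range(300):
            children = [[p + random.gauss(0, 0.05) for p in random.choice(pop)]
                        for _ in range(40)]
            pop = sorted(pop + children, key=sse)[:20]
        print("identified params:", [round(p, 3) for p in pop[0]],
              "SSE:", round(sse(pop[0]), 4))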

  8. Combining environment-driven adaptation and task-driven optimisation in evolutionary robotics.

    Science.gov (United States)

    Haasdijk, Evert; Bredeche, Nicolas; Eiben, A E

    2014-01-01

    Embodied evolutionary robotics is a sub-field of evolutionary robotics that employs evolutionary algorithms on the robotic hardware itself, during the operational period, i.e., in an on-line fashion. This enables robotic systems that continuously adapt, and are therefore capable of (re-)adjusting themselves to previously unknown or dynamically changing conditions autonomously, without human oversight. This paper addresses one of the major challenges that such systems face, viz. that the robots must satisfy two sets of requirements. Firstly, they must continue to operate reliably in their environment (viability), and secondly they must competently perform user-specified tasks (usefulness). The solution we propose exploits the fact that evolutionary methods have two basic selection mechanisms: survivor selection and parent selection. This allows evolution to tackle the two sets of requirements separately: survivor selection is driven by the environment and parent selection is based on task performance. This idea is elaborated in the Multi-Objective aNd open-Ended Evolution (monee) framework, which we experimentally validate. Experiments with robotic swarms of 100 simulated e-pucks show that monee does indeed promote task-driven behaviour without compromising environmental adaptation. We also investigate an extension of the parent selection process with a 'market mechanism' that can ensure equitable distribution of effort over multiple tasks, a particularly pressing issue if the environment promotes specialisation in single tasks.

  9. Combining environment-driven adaptation and task-driven optimisation in evolutionary robotics.

    Directory of Open Access Journals (Sweden)

    Evert Haasdijk

    Full Text Available Embodied evolutionary robotics is a sub-field of evolutionary robotics that employs evolutionary algorithms on the robotic hardware itself, during the operational period, i.e., in an on-line fashion. This enables robotic systems that continuously adapt, and are therefore capable of (re-)adjusting themselves to previously unknown or dynamically changing conditions autonomously, without human oversight. This paper addresses one of the major challenges that such systems face, viz. that the robots must satisfy two sets of requirements. Firstly, they must continue to operate reliably in their environment (viability), and secondly they must competently perform user-specified tasks (usefulness). The solution we propose exploits the fact that evolutionary methods have two basic selection mechanisms: survivor selection and parent selection. This allows evolution to tackle the two sets of requirements separately: survivor selection is driven by the environment and parent selection is based on task performance. This idea is elaborated in the Multi-Objective aNd open-Ended Evolution (monee) framework, which we experimentally validate. Experiments with robotic swarms of 100 simulated e-pucks show that monee does indeed promote task-driven behaviour without compromising environmental adaptation. We also investigate an extension of the parent selection process with a 'market mechanism' that can ensure equitable distribution of effort over multiple tasks, a particularly pressing issue if the environment promotes specialisation in single tasks.

  10. Evolutionary Nephrology.

    Science.gov (United States)

    Chevalier, Robert L

    2017-05-01

    Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as "maladaptive." In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ~40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (that provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  11. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279
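
    As a toy illustration of the in silico approach argued for here, the sketch below runs a fitness-only (F-type) breeding cycle on simulated genomes: truncation selection on an additive phenotype, followed by random mating with one-point recombination and rare mutation. The trait model and all parameters are assumptions for illustration, not values from the paper.

```python
import random

random.seed(42)
LOCI, POP, GENERATIONS, KEEP = 50, 40, 30, 10
effects = [random.uniform(0.0, 1.0) for _ in range(LOCI)]  # additive QTL effects

def phenotype(genome):
    # Additive trait: sum of effects at loci carrying the favourable allele.
    return sum(e for e, allele in zip(effects, genome) if allele)

def cross(mum, dad):
    point = random.randrange(LOCI)            # one recombination point
    child = mum[:point] + dad[point:]
    # Rare mutation flips an allele with probability 1%.
    return [1 - g if random.random() < 0.01 else g for g in child]

population = [[random.randint(0, 1) for _ in range(LOCI)] for _ in range(POP)]
for generation in range(GENERATIONS):
    # Truncation selection: only the best phenotypes become parents.
    parents = sorted(population, key=phenotype, reverse=True)[:KEEP]
    population = [cross(*random.sample(parents, 2)) for _ in range(POP)]

print("best phenotype after selection:",
      round(max(map(phenotype, population)), 2))
```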

  12. Combining evolutionary game theory and network theory to analyze human cooperation patterns

    International Nuclear Information System (INIS)

    Scatà, Marialisa; Di Stefano, Alessandro; La Corte, Aurelio; Liò, Pietro; Catania, Emanuele; Guardo, Ermanno; Pagano, Salvatore

    2016-01-01

    Highlights: • We investigate the evolutionary dynamics of human cooperation in a social network. • We introduce the concepts of “Critical Mass”, centrality measure and homophily. • The emergence of cooperation is affected by the spatial choice of the “Critical Mass”. • Our findings show that homophily speeds up the convergence towards cooperation. • Centrality and “Critical Mass” spatial choice partially offset the impact of homophily. - Abstract: As natural systems continuously evolve, the human cooperation dilemma represents an increasingly challenging question. Humans cooperate in natural and social systems, but how this happens and which mechanisms rule the emergence of cooperation remain open and fascinating issues. In this work, we investigate the evolution of cooperation through the analysis of the evolutionary dynamics of behaviours within the social network, where nodes can choose to cooperate or defect following the classical social dilemmas represented by the Prisoner's Dilemma and Snowdrift games. To this aim, we introduce a sociological concept and statistical estimator, the “Critical Mass”, to detect the minimum initial seed of cooperators able to trigger the diffusion process, and a centrality measure to select these nodes within the social network. By selecting different spatial configurations of the Critical Mass nodes, we highlight how the emergence of cooperation can be influenced by this spatial choice of the initial core in the network. Moreover, we aim to shed light on how the concept of homophily, a social shaping factor by which “birds of a feather flock together”, can affect the evolutionary process. Our findings show that homophily speeds up the diffusion process and accelerates the convergence towards human cooperation, while the centrality measure, and thus the Critical Mass selection, plays a key role in the evolution, showing how the spatial configurations can create some hidden patterns that partially offset the impact of homophily.
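
    A schematic version of this kind of experiment (not the authors' model) fits in a few lines: seed a "Critical Mass" of cooperators in a ring network playing the Prisoner's Dilemma, once as a contiguous block and once scattered at random, and watch whether cooperation survives under an imitate-the-best update. Payoff values and sizes are assumptions; the temptation is kept mild so that a compact cluster can resist invasion.

```python
import random

N, SEED_SIZE, ROUNDS = 200, 20, 60
R, S, T, P = 3.0, 0.0, 3.5, 1.0   # PD ordering T > R > P > S, mild temptation
random.seed(1)

def neighbours(i):
    return [(i - 1) % N, (i + 1) % N]   # ring topology

def payoff(strategies, i):
    total = 0.0
    for j in neighbours(i):
        mine, theirs = strategies[i], strategies[j]
        total += R if mine and theirs else S if mine else T if theirs else P
    return total

def run(clustered):
    if clustered:
        coop = set(range(SEED_SIZE))                    # contiguous block
    else:
        coop = set(random.sample(range(N), SEED_SIZE))  # scattered seed
    strategies = [i in coop for i in range(N)]          # True = cooperate
    for _ in range(ROUNDS):
        pay = [payoff(strategies, i) for i in range(N)]
        # Synchronous update: imitate the best-earning neighbour (or self).
        strategies = [strategies[max(neighbours(i) + [i],
                                     key=lambda j: pay[j])]
                      for i in range(N)]
    return sum(strategies) / N

print("cooperation fraction: clustered seed ->", run(True),
      "| random seed ->", run(False))
```

    With these settings the clustered seed typically persists while the scattered seed collapses, which is the qualitative effect of the spatial choice of the initial core described above.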

  13. An Artificial Immune System-Inspired Multiobjective Evolutionary Algorithm with Application to the Detection of Distributed Computer Network Intrusions

    Science.gov (United States)

    2007-03-01

    Coello, Van Veldhuizen, and Lamont define global optimization as “the process of finding the global minimum within some search space S” [CVL02]. … Technology, Shapes Markets, and Manages People, Simon & Schuster, New York, 1995. [CVL02] Coello, C., Van Veldhuizen, D., Lamont, G.B., Evolutionary… Anomaly Detection, Technical Report CS-2003-02, Computer Science Department, Florida Institute of Technology, 2003. [Marmelstein99] Marmelstein, R., Van

  14. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Vega C, H. R.; Martinez B, M. R.; Gallego, E.

    2009-10-01

    Neutron dosimetry is one of the most complicated tasks in radiation protection, because it is a complex technique that is highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry, the Bonner sphere spectrometer, remains one of the most commonly used. This system has disadvantages: the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction, which requires an expert user and a reconstruction code such as BUNKI, SAND, etc. These codes are based on iterative reconstruction algorithms whose greatest inconvenience is that an initial spectrum, as close as possible to the spectrum to be obtained, must be supplied to the system. Consequently, researchers have noted the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence have been reported, such as genetic algorithms, artificial neural networks and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in nuclear science is not free of problems, and it has been suggested that more research be conducted to overcome these disadvantages. Because these are emerging technologies, there are no tools for analysing their results, so in this paper we first present the design of a computational tool that allows the analysis of the neutron spectra and equivalent doses obtained with the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive and easy-to-operate graphical user environment. The program operates quickly, executing the analysis in a few seconds, and the obtained information can be stored and/or printed.
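
    The hybrid ANN+GA idea mentioned above can be sketched independently of any spectrometric data: a genetic algorithm searches the weight vector of a small feedforward network instead of backpropagation. The training pairs below are synthetic stand-ins, not Bonner-sphere measurements, and the network size and GA settings are arbitrary.

```python
import math
import random

random.seed(1)
# Synthetic (input, target) pairs standing in for sphere counts -> dose.
data = [([x / 10], [math.sin(x / 10)]) for x in range(10)]
N_HID = 5
N_W = 1 * N_HID + N_HID + N_HID + 1   # 1-5-1 net: weights plus biases

def forward(w, x):
    # Hidden layer with tanh activation, linear output.
    h = [math.tanh(w[i] * x[0] + w[N_HID + i]) for i in range(N_HID)]
    return sum(w[2 * N_HID + i] * h[i] for i in range(N_HID)) + w[-1]

def mse(w):
    return sum((forward(w, x) - t[0]) ** 2 for x, t in data) / len(data)

# Genetic search over weight vectors: elitism plus Gaussian mutation.
pop = [[random.gauss(0, 1) for _ in range(N_W)] for _ in range(30)]
for generation in range(200):
    pop.sort(key=mse)
    elite = pop[:10]
    pop = elite + [[g + random.gauss(0, 0.1) for g in random.choice(elite)]
                   for _ in range(20)]

print("best MSE after evolution:", round(mse(min(pop, key=mse)), 5))
```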

  15. Evolutionary games combining two or three pair coordinations on a square lattice

    Science.gov (United States)

    Király, Balázs; Szabó, György

    2017-10-01

    We study multiagent logit-rule-driven evolutionary games on a square lattice whose pair interactions are composed of a maximal number of nonoverlapping elementary coordination games describing Ising-type interactions between just two of the available strategies. Using Monte Carlo simulations we investigate the macroscopic noise-level-dependent behavior of the two- and three-pair games and the critical properties of the continuous phase transitions these systems exhibit. The four-strategy game is shown to be equivalent to a system that consists of two independent and identical Ising models.
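
    A minimal sketch of the elementary unit such games are built from, with the noise level K and lattice size chosen arbitrarily: a single two-strategy coordination game on a square lattice evolved with the logit rule, which for a payoff of one unit per matching neighbour is exactly Glauber dynamics for the Ising model.

```python
import math
import random

L, K, SWEEPS = 32, 0.5, 100   # lattice size, noise level, MC sweeps (assumed)
random.seed(0)
spin = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]

def neighbour_sum(x, y):
    # Periodic boundaries on the square lattice.
    return (spin[(x - 1) % L][y] + spin[(x + 1) % L][y]
            + spin[x][(y - 1) % L] + spin[x][(y + 1) % L])

for _ in range(SWEEPS * L * L):
    x, y = random.randrange(L), random.randrange(L)
    # Payoff difference between playing +1 and -1 here, when every
    # matching neighbour pays one unit: payoff(+1) - payoff(-1).
    dpay = neighbour_sum(x, y)
    # Logit rule at noise level K: better replies are more likely,
    # but not certain, to be adopted.
    spin[x][y] = 1 if random.random() < 1 / (1 + math.exp(-dpay / K)) else -1

magnetisation = abs(sum(map(sum, spin))) / L ** 2
print(f"order parameter |m| at K={K}: {magnetisation:.2f}")
```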

  16. Computer aided system for parametric design of combination die

    Science.gov (United States)

    Naranje, Vishal G.; Hussein, H. M. A.; Kumar, S.

    2017-09-01

    In this paper, a computer-aided system for the parametric design of combination dies is presented. The system is developed using the knowledge-based system technique of artificial intelligence. It is capable of designing combination dies for the production of sheet metal parts that require punching and cupping operations. The system is coded in Visual Basic and interfaced with AutoCAD software. The low cost of the proposed system will help die designers in small- and medium-scale sheet metal industries design combination dies for similar types of products. The proposed system can reduce the design time and effort of die designers.

  17. Evolutionary Nephrology

    Directory of Open Access Journals (Sweden)

    Robert L. Chevalier

    2017-05-01

    Full Text Available Progressive kidney disease follows nephron loss, hyperfiltration, and incomplete repair, a process described as “maladaptive.” In the past 20 years, a new discipline has emerged that expands research horizons: evolutionary medicine. In contrast to physiologic (homeostatic) adaptation, evolutionary adaptation is the result of reproductive success that reflects natural selection. Evolutionary explanations for physiologically maladaptive responses can emerge from mismatch of the phenotype with environment or from evolutionary tradeoffs. Evolutionary adaptation to a terrestrial environment resulted in a vulnerable energy-consuming renal tubule and a hypoxic, hyperosmolar microenvironment. Natural selection favors successful energy investment strategy: energy is allocated to maintenance of nephron integrity through reproductive years, but this declines with increasing senescence after ∼40 years of age. Risk factors for chronic kidney disease include restricted fetal growth or preterm birth (life history tradeoff resulting in fewer nephrons), evolutionary selection for APOL1 mutations (which provide resistance to trypanosome infection, a tradeoff), and modern life experience (Western diet mismatch leading to diabetes and hypertension). Current advances in genomics, epigenetics, and developmental biology have revealed proximate causes of kidney disease, but attempts to slow kidney disease remain elusive. Evolutionary medicine provides a complementary approach by addressing ultimate causes of kidney disease. Marked variation in nephron number at birth, nephron heterogeneity, and changing susceptibility to kidney injury throughout the life history are the result of evolutionary processes. Combined application of molecular genetics, evolutionary developmental biology (evo-devo), developmental programming, and life history theory may yield new strategies for prevention and treatment of chronic kidney disease.

  18. High Efficiency Computation of the Variances of Structural Evolutionary Random Responses

    Directory of Open Access Journals (Sweden)

    J.H. Lin

    2000-01-01

    Full Text Available For structures subjected to stationary or evolutionary white/colored random noise, their various response variances satisfy algebraic or differential Lyapunov equations. The solution of these Lyapunov equations used to be very difficult. A precise integration method is proposed in the present paper, which solves such Lyapunov equations accurately and very efficiently.
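
    For concreteness, the stationary case takes the familiar textbook form (our notation, not reproduced from the paper): for a linear system driven by white noise, the response covariance satisfies an algebraic Lyapunov equation, and its evolutionary counterpart a differential one.

```latex
% Stationary (algebraic) and evolutionary (differential) Lyapunov
% equations for the covariance V(t) = E[x(t) x(t)^T] of a linear system
% \dot{x} = A x + B w(t) driven by white noise of intensity W:
\[
  A V + V A^{\mathsf{T}} + B W B^{\mathsf{T}} = 0 ,
  \qquad
  \dot{V}(t) = A V(t) + V(t) A^{\mathsf{T}} + B W(t) B^{\mathsf{T}} .
\]
```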

  19. Microscopes and computers combined for analysis of chromosomes

    Science.gov (United States)

    Butler, J. W.; Butler, M. K.; Stroud, A. N.

    1969-01-01

    Scanning machine CHLOE, developed for photographic use, is combined with a digital computer to obtain quantitative and statistically significant data on chromosome shapes, distribution, density, and pairing. CHLOE permits data on a chromosome complement to be acquired twice as fast as by manual pairing.

  20. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into a designated subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits of the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined.
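
    The procedure is easy to reproduce in miniature. The sketch below restates the STADIC idea in Python rather than FORTRAN; the two input distributions and the combining function are arbitrary examples, not anything prescribed by the original code.

```python
import random
import statistics

def combine(a, b):
    # User-defined combination function; a product is a typical example.
    return a * b

N = 100_000
random.seed(0)
# Draw one sample from each input distribution per trial and combine them.
samples = [combine(random.lognormvariate(0.0, 0.5),
                   random.uniform(0.8, 1.2))
           for _ in range(N)]
samples.sort()

mean = statistics.fmean(samples)
stdev = statistics.stdev(samples)
lo, hi = samples[int(0.05 * N)], samples[int(0.95 * N)]   # 90% interval
print(f"mean={mean:.3f} sd={stdev:.3f} 90% interval=({lo:.3f}, {hi:.3f})")
```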

  1. Evolutionary thinking

    Science.gov (United States)

    Hunt, Tam

    2014-01-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution—both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place—has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps’ book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps’ book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging “integral” or “evolutionary” cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps. PMID:26478766

  2. At the crossroads of evolutionary computation and music: self-programming synthesizers, swarm orchestras and the origins of melody.

    Science.gov (United States)

    Miranda, Eduardo Reck

    2004-01-01

    This paper introduces three approaches to using Evolutionary Computation (EC) in Music (namely, engineering, creative and musicological approaches) and discusses examples of representative systems that have been developed within the last decade, with emphasis on more recent and innovative works. We begin by reviewing engineering applications of EC in Music Technology such as Genetic Algorithms and Cellular Automata sound synthesis, followed by an introduction to applications where EC has been used to generate musical compositions. Next, we introduce ongoing research into EC models to study the origins of music and detail our own research work on modelling the evolution of melody. Copyright 2004 Massachusetts Institute of Technology

  3. A Multi Agent System for Flow-Based Intrusion Detection Using Reputation and Evolutionary Computation

    Science.gov (United States)

    2011-03-01

    pertinent example of the application of Evolutionary Algorithms to pattern recognition comes from Radtke et al. [130]. The authors apply Multi-Objective… J., T. Zseby, and B. Claise. S. Zander, “Requirements for IP Flow Information Export (IPFIX)”. Technical report, RFC 3917, October 2004. [130] Radtke… hal.inria.fr/inria-00104200/en/. [131] Radtke, P.V.W., T. Wong, and R. Sabourin. “A multi-objective memetic algorithm for intelligent feature extraction”

  4. Evolutionary optimization of neural networks with heterogeneous computation: study and implementation

    OpenAIRE

    FE, JORGE DEOLINDO; Aliaga Varea, Ramón José; Gadea Gironés, Rafael

    2015-01-01

    In the optimization of artificial neural networks (ANNs) via evolutionary algorithms and the implementation of the necessary training for the objective function, there is often a trade-off between efficiency and flexibility. Pure software solutions on general-purpose processors tend to be slow because they do not take advantage of the inherent parallelism, whereas hardware realizations usually rely on optimizations that reduce the range of applicable network topologies, or they...

  5. Automated Generation of User Guidance by Combining Computation and Deduction

    Directory of Open Access Journals (Sweden)

    Walther Neuper

    2012-02-01

    Full Text Available Herewith, a fairly old concept is published for the first time and named "Lucas Interpretation". This has been implemented in a prototype, which has been proved useful in educational practice and has gained academic relevance with an emerging generation of educational mathematics assistants (EMAs) based on Computer Theorem Proving (CTP). Automated Theorem Proving (ATP), i.e. deduction, is the most reliable technology used to check user input. However, ATP is inherently weak in automatically generating solutions for arbitrary problems in applied mathematics. This weakness is crucial for EMAs: when ATP checks user input as incorrect and the learner gets stuck, the system should be able to suggest possible next steps. The key idea of Lucas Interpretation is to compute the steps of a calculation following a program written in a novel CTP-based programming language, i.e. computation provides the next steps. User guidance is generated by combining deduction and computation: the latter is performed by a specific language interpreter, which works like a debugger and hands over control to the learner at breakpoints, i.e. tactics generating the steps of calculation. The interpreter also builds up logical contexts providing ATP with the data required for checking user input, thus combining computation and deduction. The paper describes the concepts underlying Lucas Interpretation so that open questions can adequately be addressed, and prerequisites for further work are provided.
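
    The control flow described here can be caricatured in a few lines, with no theorem prover involved: an interpreter walks a "program" of tactics, pausing at each step so the learner can either supply the next line or be given it. Everything below (the toy equation, the string-comparison "checker") is invented for illustration and stands in for the CTP-based machinery.

```python
# Highly schematic sketch of Lucas-style interpretation: a generator of
# tactics plays the role of the program; the session loop plays the role
# of the debugger-like interpreter with breakpoints.

def tactics(expr):
    # A toy "program": solve 2*x + 4 = 10 step by step.
    yield "2*x = 6", "subtract 4 from both sides"
    yield "x = 3", "divide both sides by 2"

def check(user_step, correct_step):
    # Stands in for ATP-based checking of user input against the context.
    return user_step.replace(" ", "") == correct_step.replace(" ", "")

def session(user_inputs):
    for correct, tactic in tactics("2*x + 4 = 10"):
        proposal = next(user_inputs, None)       # breakpoint: learner's turn
        if proposal is not None and check(proposal, correct):
            print(f"accepted: {proposal}")
        else:                                    # learner is stuck or wrong:
            print(f"hint ({tactic}): {correct}") # computation supplies the step

session(iter(["2*x = 6", "x = 33"]))   # second input is wrong -> hint
```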

  6. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately per individual flood defence section. The main advantage of the new approach is that it will result in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities over the following dimensions: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales and then combined. Besides the assessment of flood defence systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
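
    The combination step at the heart of such a framework is simple to state in the idealized case. The sketch below combines element failure probabilities for a series system under an independence assumption, together with the bounds that hold without it; Hydra-Ring itself treats dependence between mechanisms, sections and time periods explicitly, and the numbers below are invented.

```python
import math

# Annual failure probabilities of individual elements (invented values).
p_elements = {"dike segment A": 1e-4, "dune segment B": 5e-5, "sluice C": 2e-4}

# Series system: it fails if any element fails. Assuming independence:
p_system = 1.0 - math.prod(1.0 - p for p in p_elements.values())
print(f"system failure probability (independent) ≈ {p_system:.2e}")

# Bounds that hold regardless of the dependence structure:
# fully dependent elements give the lower bound, the union bound the upper.
print(f"bounds: {max(p_elements.values()):.2e} ... "
      f"{sum(p_elements.values()):.2e}")
```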

  7. MEGA-CC: computing core of molecular evolutionary genetics analysis program for automated and iterative data analysis.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Peterson, Daniel; Tamura, Koichiro

    2012-10-15

    There is a growing need in the research community to apply the molecular evolutionary genetics analysis (MEGA) software tool for batch processing a large number of datasets and to integrate it into analysis workflows. Therefore, we now make available the computing core of the MEGA software as a stand-alone executable (MEGA-CC), along with an analysis prototyper (MEGA-Proto). MEGA-CC provides users with access to all the computational analyses available through MEGA's graphical user interface version. This includes methods for multiple sequence alignment, substitution model selection, evolutionary distance estimation, phylogeny inference, substitution rate and pattern estimation, tests of natural selection and ancestral sequence inference. Additionally, we have upgraded the source code for phylogenetic analysis using the maximum likelihood methods for parallel execution on multiple processors and cores. Here, we describe MEGA-CC and outline the steps for using MEGA-CC in tandem with MEGA-Proto for iterative and automated data analysis. http://www.megasoftware.net/.

  8. Confidence Level Computation for Combining Searches with Small Statistics

    OpenAIRE

    Junk, Thomas

    1999-01-01

    This article describes an efficient procedure for computing approximate confidence levels for searches for new particles where the expected signal and background levels are small enough to require the use of Poisson statistics. The results of many independent searches for the same particle may be combined easily, regardless of the discriminating variables which may be measured for the candidate events. The effects of systematic uncertainty in the signal and background models are incorporated ...
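
    As commonly presented in the CLs literature (the article itself should be consulted for the treatment of discriminating variables and systematic uncertainties), the combination of n independent Poisson counting channels with expected signal s_i, background b_i and observed counts n_i rests on a likelihood-ratio test statistic that factorises over channels.

```latex
\[
  Q \;=\; \frac{\mathcal{L}_{s+b}}{\mathcal{L}_{b}}
    \;=\; \prod_{i=1}^{n} e^{-s_i}\Bigl(1 + \frac{s_i}{b_i}\Bigr)^{n_i},
  \qquad
  \mathrm{CL}_{s+b} = P_{s+b}\bigl(Q \le Q_{\mathrm{obs}}\bigr),
  \quad
  \mathrm{CL}_{b} = P_{b}\bigl(Q \le Q_{\mathrm{obs}}\bigr),
  \quad
  \mathrm{CL}_{s} = \frac{\mathrm{CL}_{s+b}}{\mathrm{CL}_{b}} .
\]
```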

  9. Application of Neural Network Optimized by Mind Evolutionary Computation in Building Energy Prediction

    Science.gov (United States)

    Song, Chen; Zhong-Cheng, Wu; Hong, Lv

    2018-03-01

    Building energy forecasting plays an important role in energy management and planning. Using a mind evolutionary algorithm to find the optimal network weights and thresholds with which to optimize a BP neural network can overcome the BP network's tendency to become trapped in local minima. The optimized network is used for time-series prediction and for a same-month forecast, yielding two predicted values. These two values are then fed into a neural network to obtain the final forecast value. The effectiveness of the method was verified by experiments with the energy values of three buildings in Hefei.

  10. Interactive Rhythm Learning System by Combining Tablet Computers and Robots

    Directory of Open Access Journals (Sweden)

    Chien-Hsing Chou

    2017-03-01

    Full Text Available This study proposes a percussion learning device that combines tablet computers and robots. This device comprises two systems: a rhythm teaching system, in which users can compose and practice rhythms by using a tablet computer, and a robot performance system. First, teachers compose the rhythm training contents on the tablet computer. Then, the learners practice these percussion exercises by using the tablet computer and a small drum set. The teaching system provides a new and user-friendly score editing interface for composing a rhythm exercise. It also provides a rhythm rating function to facilitate percussion training for children and improve the stability of rhythmic beating. To encourage children to practice percussion exercises, a robotic performance system is used to interact with the children; this system can perform percussion exercises for students to listen to and then help them practice the exercise. This interaction enhances children’s interest and motivation to learn and practice rhythm exercises. The results of the experimental course and field trials reveal that the proposed system not only increases students’ interest and efficiency in learning but also helps them understand musical rhythms through interaction and compose simple rhythms.

  11. Unscented Sampling Techniques For Evolutionary Computation With Applications To Astrodynamic Optimization

    Science.gov (United States)

    2016-09-01

    factors that can cause the variations in trajectory computation time. First of all, these cases are initially computed using the guess-free mode of DIDO… Goldberg [91]. This concept essentially states that fundamental building blocks, or lower-order schemata, are pieced together by the genetic algorithm in… in Section 3.13.2. While this idea is very straightforward and logical, Goldberg also later points out that there are deceptive problems where these

  12. Application of computational fluid dynamics and surrogate-coupled evolutionary computing to enhance centrifugal-pump performance

    Directory of Open Access Journals (Sweden)

    Sayed Ahmed Imran Bellary

    2016-01-01

    Full Text Available To reduce the total design and optimization time, numerical analysis with surrogate-based approaches is being used in turbomachinery optimization. In this work, multiple surrogates are coupled with an evolutionary genetic algorithm to find the Pareto-optimal fronts (PoFs) of two centrifugal pumps with different specifications in order to enhance their performance. The two pumps used were a centrifugal pump commonly used in industry (Case I) and an electrical submersible pump used in the petroleum industry (Case II). The objectives are to enhance the head and efficiency of the pumps at specific flow rates. Surrogates such as response surface approximation (RSA), Kriging (KRG), neural networks and weighted-average surrogates (WASs) were used to determine the PoFs. To obtain the objective function values and to understand the flow physics, Reynolds-averaged Navier–Stokes equations were solved. It is found that the WAS performs better for both objectives than any individual surrogate. The best individual surrogate, or best predicted error sum of squares (PRESS) surrogate (BPS), obtained from cross-validation (CV) error estimations produced better PoFs but was still unable to compete with the WAS. The surrogate with the highest CV error produced the worst PoFs. The performance improvement in this study is due to the change in flow pattern in the impeller passage of the pumps.
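
    The weighted-average surrogate (WAS) idea is easy to sketch with stand-in models: individual surrogates are combined with weights inversely proportional to their leave-one-out cross-validation error, so that better-validated models dominate the prediction. The polynomial fits below stand in for the RSA, Kriging and neural-network surrogates of the paper, and the data are synthetic; the inverse-error weighting is one common choice, not necessarily the exact scheme used in the work.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 20)   # stand-in pump response

degrees = (1, 3, 5)                                   # three surrogate "types"
surrogates = [np.poly1d(np.polyfit(x, y, d)) for d in degrees]

def loo_error(deg):
    # Leave-one-out cross-validation (PRESS-like) error of one surrogate.
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        model = np.poly1d(np.polyfit(x[mask], y[mask], deg))
        errs.append((model(x[i]) - y[i]) ** 2)
    return float(np.mean(errs))

press = np.array([loo_error(d) for d in degrees])
weights = (1 / press) / (1 / press).sum()             # inverse-error weights

def was(x_new):
    # Weighted-average surrogate prediction.
    return sum(w * s(x_new) for w, s in zip(weights, surrogates))

print("weights:", np.round(weights, 3), "| WAS(0.25) =", round(was(0.25), 3))
```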

  13. Genetic characterization and evolutionary inference of TNF-α through computational analysis

    Directory of Open Access Journals (Sweden)

    Gauri Awasthi

    Full Text Available TNF-α is an important human cytokine that imparts dualism in malaria pathogenicity. At high dosages, TNF-α is believed to provoke pathogenicity in cerebral malaria; while at lower dosages TNF-α is protective against severe human malaria. In order to understand the human TNF-α gene and to ascertain evolutionary aspects of its dualistic nature for malaria pathogenicity, we characterized this gene in detail in six different mammalian taxa. The avian taxon, Gallus gallus, was included in our study; as TNF-α is not present in birds, a tandemly placed duplicate of TNF-α (LT-α, or TNF-β) was included. A comparative study was made of nucleotide length variations, intron and exon sizes and number variations, differential compositions of coding to non-coding bases, etc., to look for similarities/dissimilarities in the TNF-α gene across all seven taxa. A phylogenetic analysis revealed the pattern found in other genes, as humans, chimpanzees and rhesus monkeys were placed in a single clade, and rats and mice in another; the chicken was in a clearly separate branch. We further focused on these three taxa and aligned the amino acid sequences; there were small differences between humans and chimpanzees; both were more different from the rhesus monkey. Further, comparison of coding and non-coding nucleotide length variations and coding to non-coding nucleotide ratio between TNF-α and TNF-β among these three mammalian taxa provided a first-hand indication of the role of the TNF-α gene, but not of TNF-β, in the dualistic nature of TNF-α in malaria pathogenicity.

  14. CMS results in the Combined Computing Readiness Challenge CCRC'08

    International Nuclear Information System (INIS)

    Bonacorsi, D.; Bauerdick, L.

    2009-01-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests, called the Computing, Software and Analysis challenge (CSA'08) - as well as CMS cosmic runs - was running at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model, with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at a sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of >200 links. Simultaneously, CMS also ran a large Tier-2 analysis exercise, where realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks up to 200k jobs/day. The achieved results in CCRC'08 - focussing on the distributed

  15. Helicopter fuselage drag - combined computational fluid dynamics and experimental studies

    Science.gov (United States)

    Batrakov, A.; Kusyumov, A.; Mikhailov, S.; Pakhov, V.; Sungatullin, A.; Valeev, M.; Zherekhov, V.; Barakos, G.

    2015-06-01

    In this paper, wind tunnel experiments are combined with Computational Fluid Dynamics (CFD) aiming to analyze the aerodynamics of realistic fuselage configurations. A development model of the ANSAT aircraft and an early model of the AKTAI light helicopter were employed. Both models were tested at the subsonic wind tunnel of KNRTU-KAI for a range of Reynolds numbers and pitch and yaw angles. The force balance measurements were complemented by particle image velocimetry (PIV) investigations for the cases where the experimental force measurements showed substantial unsteadiness. The CFD results were found to be in fair agreement with the test data and revealed some flow separation at the rear of the fuselages. Once confidence on the CFD method was established, further modifications were introduced to the ANSAT-like fuselage model to demonstrate drag reduction via small shape changes.

  16. Study of natural circulation for the design of a research reactor using computational fluid dynamics and evolutionary computation techniques

    International Nuclear Information System (INIS)

    Oliveira, Andre Felipe da Silva de

    2012-01-01

    Safety is one of the most important and desirable characteristics of a nuclear plant. Natural circulation cooling systems are noted for providing passive safety. These systems can be used as a mechanism for removing residual heat from the reactor, or even as the main cooling system for heated sections such as the core. In this work, a computational fluid dynamics (CFD) code called CFX is used to simulate the process of natural circulation in a research reactor pool after its shutdown. The physical model studied is similar to the Open Pool Australian Light water reactor (OPAL), and contains the core, cooling pool, reflecting tank, circulation pipes and chimney. For best computing performance, the core region was modeled as a porous medium, whose parameters were obtained from a separate, detailed CFD analysis. This work also aims to study the viability of implementing a Differential Evolution algorithm to optimize the physical and operational parameters that, obeying the laws of similarity, lead to a test section at a reduced scale of the reactor pool.
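
    For reference, the classic DE/rand/1/bin variant of Differential Evolution fits in a few lines. In the study described above each candidate would be scored by a CFD run; the quadratic objective below is a stand-in, and the control parameters are conventional defaults rather than values taken from the work.

```python
import random

DIM, NP, F, CR, GENS = 4, 20, 0.8, 0.9, 100   # conventional DE settings
random.seed(0)

def objective(v):
    # Stand-in for the CFD-based score of a candidate scale-model design.
    return sum((vi - 0.5) ** 2 for vi in v)

pop = [[random.random() for _ in range(DIM)] for _ in range(NP)]
for _ in range(GENS):
    for i in range(NP):
        # Mutation: difference vector between two random members added
        # to a third (DE/rand/1), all distinct from the target i.
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        jrand = random.randrange(DIM)   # ensure at least one mutated gene
        # Binomial crossover between target and mutant vectors.
        trial = [a[j] + F * (b[j] - c[j])
                 if (random.random() < CR or j == jrand) else pop[i][j]
                 for j in range(DIM)]
        # Greedy selection: the trial replaces the target only if no worse.
        if objective(trial) <= objective(pop[i]):
            pop[i] = trial

best = min(pop, key=objective)
print("best objective value:", round(objective(best), 6))
```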

  17. Evolutionary computation applied to the reconstruction of 3-D surface topography in the SEM.

    Science.gov (United States)

    Kodama, Tetsuji; Li, Xiaoyuan; Nakahira, Kenji; Ito, Dai

    2005-10-01

    A genetic algorithm has been applied to the line profile reconstruction from the signals of the standard secondary electron (SE) and/or backscattered electron detectors in a scanning electron microscope. This method solves the topographical surface reconstruction problem as one of combinatorial optimization. To extend this optimization approach for three-dimensional (3-D) surface topography, this paper considers the use of a string coding where a 3-D surface topography is represented by a set of coordinates of vertices. We introduce the Delaunay triangulation, which attains the minimum roughness for any set of height data to capture the fundamental features of the surface being probed by an electron beam. With this coding, the strings are processed with a class of hybrid optimization algorithms that combine genetic algorithms and simulated annealing algorithms. Experimental results on SE images are presented.

  18. Quality-of-service sensitivity to bio-inspired/evolutionary computational methods for intrusion detection in wireless ad hoc multimedia sensor networks

    Science.gov (United States)

    Hortos, William S.

    2012-06-01

    In the author's previous work, a cross-layer protocol approach to wireless sensor network (WSN) intrusion detection and identification is created with multiple bio-inspired/evolutionary computational methods applied to the functions of the protocol layers, a single method to each layer, to improve the intrusion-detection performance of the protocol over that of one method applied to only a single layer's functions. The WSN cross-layer protocol design embeds GAs, anti-phase synchronization, ACO, and a trust model based on quantized data reputation at the physical, MAC, network, and application layers, respectively. The construct neglects to assess the net effect of the combined bio-inspired methods on the quality-of-service (QoS) performance for "normal" data streams, that is, streams without intrusions. Analytic expressions of throughput, delay, and jitter, coupled with simulation results for WSNs free of intrusion attacks, are the basis for sensitivity analyses of QoS metrics for normal traffic to the bio-inspired methods.

  19. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    Full Text Available This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended commerce or ubiquitous commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use the NFC capabilities of their smartphones in order to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans, which minimize the overall shopping cost by looking for the best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions for the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer depends on his previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimentation results showing the overall performance of the genetic algorithm.

  20. A trial to combine heterogeneous computer systems in NIFS

    International Nuclear Information System (INIS)

    Emoto, M.; Watanabe, K.; Ohdachi, S.; Matsunami, S.; Yamaguchi, S.; Sudo, S.; Okumura, H.

    2000-01-01

    Several computer systems in NIFS (the National Institute for Fusion Science) are involved in the operation of the LHD (Large Helical Device) for plasma experiments. Because these systems are independent of each other, it is burdensome to the programmers and the researchers to use several of them for one project. Currently, the programmers must write low-level data access routines for each system, then the researchers, after having learned these routines, can install the applications to retrieve the data on each system. In order to improve this situation, we have been developing a new system. This system employs two technologies, the signed applet and HORB, to combine the independent systems. These technologies were chosen for the following reasons: (1) The signed applet is an applet free from the general restrictions of applets, and can connect to any server and save the retrieved data to client machines; and (2) HORB is the Java-based ORB (Object Request Broker), the use of which makes it easy to build a Java-based, distributed environment. When the system is completed, it will allow users to obtain data from the above independent systems by means of ordinary Web browsers, without having to install additional applications

  1. Toward computational cumulative biology by combining models of biological datasets.

    Science.gov (United States)

    Faisal, Ali; Peltonen, Jaakko; Georgii, Elisabeth; Rung, Johan; Kaski, Samuel

    2014-01-01

    A main challenge of data-driven sciences is how to make maximal use of the progressively expanding databases of experimental datasets in order to keep research cumulative. We introduce the idea of a modeling-based dataset retrieval engine designed for relating a researcher's experimental dataset to earlier work in the field. The search is (i) data-driven to enable new findings, going beyond the state of the art of keyword searches in annotations, (ii) modeling-driven, to include both biological knowledge and insights learned from data, and (iii) scalable, as it is accomplished without building one unified grand model of all data. Assuming each dataset has been modeled beforehand, by the researchers or automatically by database managers, we apply a rapidly computable and optimizable combination model to decompose a new dataset into contributions from earlier relevant models. By using the data-driven decomposition, we identify a network of interrelated datasets from a large annotated human gene expression atlas. While tissue type and disease were major driving forces for determining relevant datasets, the found relationships were richer, and the model-based search was more accurate than the keyword search; moreover, it recovered biologically meaningful relationships that are not straightforwardly visible from annotations-for instance, between cells in different developmental stages such as thymocytes and T-cells. Data-driven links and citations matched to a large extent; the data-driven links even uncovered corrections to the publication data, as two of the most linked datasets were not highly cited and turned out to have wrong publication entries in the database.

  2. Personal computer versus personal computer/mobile device combination users' preclinical laboratory e-learning activity.

    Science.gov (United States)

    Kon, Haruka; Kobayashi, Hiroshi; Sakurai, Naoki; Watanabe, Kiyoshi; Yamaga, Yoshiro; Ono, Takahiro

    2017-11-01

    The aim of the present study was to clarify differences between personal computer (PC)/mobile device combination and PC-only user patterns. We analyzed access frequency and time spent on a complete denture preclinical website in order to maximize website effectiveness. Fourth-year undergraduate students (N=41) in the preclinical complete denture laboratory course were invited to participate in this survey during the final week of the course to track login data. Students accessed video demonstrations and quizzes via our e-learning site/course program, and were instructed to view online demonstrations before classes. When the course concluded, participating students filled out a questionnaire about the program, their opinions, and devices they had used to access the site. Combination user access was significantly more frequent than PC-only during supplementary learning time, indicating that students with mobile devices studied during lunch breaks and before morning classes. Most students had favorable opinions of the e-learning site, but a few combination users commented that some videos were too long and that descriptive answers were difficult on smartphones. These results imply that mobile devices' increased accessibility encouraged learning by enabling more efficient time use between classes. They also suggest that e-learning system improvements should cater to mobile device users by reducing video length and including more short-answer questions. © 2016 John Wiley & Sons Australia, Ltd.

  3. An Angiotensin II type 1 receptor activation switch patch revealed through Evolutionary Trace analysis

    DEFF Research Database (Denmark)

    Bonde, Marie Mi; Yao, Rong; Ma, Jian-Nong

    2010-01-01

    to be completely resolved. Evolutionary Trace (ET) analysis is a computational method which identifies clusters of functionally important residues by integrating information on evolutionarily important residue variations with receptor structure. Combined with known mutational data, ET predicted a patch of residues (…) that displayed phenotypes associated with a changed activation state, such as increased agonist affinity or basal activity, promiscuous activation, or constitutive internalization, highlighting the importance of testing different signaling pathways. We conclude that this evolutionarily important patch mediates

  4. Statistical analysis and definition of blockages-prediction formulae for the wastewater network of Oslo by evolutionary computing.

    Science.gov (United States)

    Ugarelli, Rita; Kristensen, Stig Morten; Røstum, Jon; Saegrov, Sveinung; Di Federico, Vittorio

    2009-01-01

    Oslo Vann og Avløpsetaten (Oslo VAV) - the water/wastewater utility in the Norwegian capital city of Oslo - is assessing future strategies for the selection of the most reliable materials for wastewater networks, taking into account not only the technical performance of materials but also their performance under the operational conditions of the system. The research project undertaken by the SINTEF Group, the largest research organisation in Scandinavia, NTNU (Norges Teknisk-Naturvitenskapelige Universitet) and Oslo VAV adopts several approaches to understand the reasons for failures that may impact flow capacity, by analysing historical data for blockages in Oslo. The aim of the study was to understand whether there is a relationship between the performance of a pipeline and a number of specific attributes such as age, material and diameter, to name a few. This paper presents the characteristics of the available data set and discusses the results obtained by two different approaches: a traditional statistical analysis, segregating the pipes into classes, each with the same explanatory variables, and an Evolutionary Polynomial Regression (EPR) model, developed by the Technical University of Bari and the University of Exeter, to identify the possible influence of pipe attributes on the total number of predicted blockages in a period of time. Starting from a detailed analysis of the available data for the blockage events, the most important variables are identified and a classification scheme is adopted. From the statistical analysis, it can be stated that age, size and function do seem to have a marked influence on the proneness of a pipeline to blockages but, for the reduced sample available, it is difficult to say which variable is the most influential. Looking at the total number of blockages, the oldest class seems to be the most prone to blockages; looking at blockage rates (number of blockages per km per year), it is the youngest class that shows the highest blockage rate.

  5. Inductive reasoning and forecasting of population dynamics of Cylindrospermopsis raciborskii in three sub-tropical reservoirs by evolutionary computation.

    Science.gov (United States)

    Recknagel, Friedrich; Orr, Philip T; Cao, Hongqing

    2014-01-01

    Seven-day-ahead forecasting models of Cylindrospermopsis raciborskii in three warm-monomictic and mesotrophic reservoirs in south-east Queensland have been developed by means of water quality data from 1999 to 2010 and the hybrid evolutionary algorithm HEA. The resulting models, using either all measured variables as inputs or only electronically measurable variables as inputs, accurately forecasted the timing of overgrowth of C. raciborskii and matched the high and low magnitudes of observed bloom events well, with 0.45 ≤ r² ≤ 0.61 and 0.4 ≤ r² ≤ 0.57, respectively. The models also revealed relationships and thresholds triggering bloom events that provide valuable information on the synergism between water quality conditions and population dynamics of C. raciborskii. The best performing models based on all measured variables as inputs indicated electrical conductivity (EC) within the range of 206-280 mS m⁻¹ as the threshold above which fast growth and high abundances of C. raciborskii have been observed for the three lakes. The best models based on electronically measurable variables for Lakes Wivenhoe and Somerset indicated a water temperature (WT) range of 25.5-32.7 °C within which fast growth and high abundances of C. raciborskii can be expected. By contrast, the model for Lake Samsonvale highlighted a turbidity (TURB) level of 4.8 NTU as an indicator for mass developments of C. raciborskii. Experiments with online measured water quality data of Lake Wivenhoe from 2007 to 2010 resulted in predictive models with 0.61 ≤ r² ≤ 0.65, whereby again similar levels of EC and WT were discovered as thresholds for outgrowth of C. raciborskii. The highest validity of r² = 0.75 for an in situ data-based model was achieved after considering time lags of 7 days for EC and 1 day for dissolved oxygen. These time lags were discovered by a systematic screening of all possible combinations of time lags between 0 and 10 days for all electronically measurable variables.

  6. Combining dynamical decoupling with fault-tolerant quantum computation

    International Nuclear Information System (INIS)

    Ng, Hui Khoon; Preskill, John; Lidar, Daniel A.

    2011-01-01

    We study how dynamical decoupling (DD) pulse sequences can improve the reliability of quantum computers. We prove upper bounds on the accuracy of DD-protected quantum gates and derive sufficient conditions for DD-protected gates to outperform unprotected gates. Under suitable conditions, fault-tolerant quantum circuits constructed from DD-protected gates can tolerate stronger noise and have a lower overhead cost than fault-tolerant circuits constructed from unprotected gates. Our accuracy estimates depend on the dynamics of the bath that couples to the quantum computer and can be expressed either in terms of the operator norm of the bath's Hamiltonian or in terms of the power spectrum of bath correlations; we explain in particular how the performance of recursively generated concatenated pulse sequences can be analyzed from either viewpoint. Our results apply to Hamiltonian noise models with limited spatial correlations.

  7. Contention Bounds for Combinations of Computation Graphs and Network Topologies

    Science.gov (United States)

    2014-08-08

    … Google, Nokia, NVIDIA, Oracle, MathWorks and Samsung. Also funded by U.S. DOE Office of Science, Office of Advanced Scientific Computing Research. … the amount of data each processor must communicate is at least W_proc(P, M, N) = Ω(n³ / (P · M^(1/2))). Note that the local memory size M appears in the denominator of the expression above, which is why

  8. Optimum topology for radial networks by using evolutionary computer programming; Topologia optima de redes radiais utilizando programacao evolucionaria

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Joao Luis [Instituto de Engenhariade Sistemas e Computadores (INESC), Porto (Portugal). E-mail: jpinto@duque.inescn.pt; Proenca, Luis Miguel [Instituto Superior de Linguas e Administracao (ISLA), Gaia (Portugal). E-mail: lproenca@inescn.pt

    1999-07-01

    This paper describes the use of Evolutionary Programming techniques to determine the topology of radial electric networks, considering investment costs and losses. The work aims to demonstrate the particular ease of coding and implementation, as well as the parallelism implicit in the method, which delivers outstanding performance levels. As a test example, a network with 43 buses and 75 alternative lines has been used, and an implementation of the algorithm on an object-oriented platform is described.

  9. 3D computational mechanics elucidate the evolutionary implications of orbit position and size diversity of early amphibians.

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    Full Text Available For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls is used as a case study because its orbits position and size vary hugely within the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models producing new FEA results, allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check if the orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades as lydekkerinids, trematosaurs, capitosaurs or metoposaurs.

  10. 3D Computational Mechanics Elucidate the Evolutionary Implications of Orbit Position and Size Diversity of Early Amphibians

    Science.gov (United States)

    Marcé-Nogué, Jordi; Fortuny, Josep; De Esteban-Trivigno, Soledad; Sánchez, Montserrat; Gil, Lluís; Galobart, Àngel

    2015-01-01

    For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls is used as a case study because its orbits position and size vary hugely within the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models producing new FEA results, allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check if the orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades as lydekkerinids, trematosaurs, capitosaurs or metoposaurs. PMID:26107295

  11. A Hybrid Computational Intelligence Approach Combining Genetic Programming And Heuristic Classification for Pap-Smear Diagnosis

    DEFF Research Database (Denmark)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan

    2001-01-01

    The paper suggests the combined use of different computational intelligence (CI) techniques in a hybrid scheme, as an effective approach to medical diagnosis. Getting to know the advantages and disadvantages of each computational intelligence technique in recent years, the time has come

  12. Exergy, exergoeconomic and environmental analyses and evolutionary algorithm based multi-objective optimization of combined cycle power plants

    International Nuclear Information System (INIS)

    Ahmadi, Pouria; Dincer, Ibrahim; Rosen, Marc A.

    2011-01-01

    A comprehensive exergy, exergoeconomic and environmental impact analysis and optimization of several combined cycle power plants (CCPPs) is reported. In the first part, thermodynamic analyses based on energy and exergy of the CCPPs are performed, and the effect of supplementary firing on the natural gas-fired CCPP is investigated. The latter step includes the effect of supplementary firing on the performance of the bottoming cycle and on CO2 emissions, and utilizes the first and second laws of thermodynamics. In the second part, a multi-objective optimization is performed to determine the 'best' design parameters, accounting for exergetic, economic and environmental factors. The optimization considers three objective functions: CCPP exergy efficiency, total cost rate of the system products and CO2 emissions of the overall plant. The environmental impact in terms of CO2 emissions is integrated with the exergoeconomic objective function as a new objective function. The results of both exergy and exergoeconomic analyses show that the largest exergy destructions occur in the CCPP combustion chamber, and that increasing the gas turbine inlet temperature decreases the CCPP cost of exergy destruction. The optimization results demonstrate that CO2 emissions are reduced by selecting the best components and using a low fuel injection rate into the combustion chamber. -- Highlights: → Comprehensive thermodynamic modeling of a combined cycle power plant. → Exergy, economic and environmental analyses of the system. → Investigation of the role of multiobjective exergoenvironmental optimization as a tool for more environmentally-benign design.
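
    The core operation in any such evolutionary multi-objective optimization is extracting the non-dominated (Pareto) set for the three objectives: maximize exergy efficiency, minimize the product cost rate, minimize CO2 emissions. The sketch below does exactly that for a handful of invented candidate designs; it is a generic non-domination filter, not the algorithm of the paper.

```python
designs = [  # (name, efficiency [%], cost rate [$/h], CO2 [kg/MWh]) - invented
    ("A", 56.1, 3650, 410),
    ("B", 57.4, 3890, 395),
    ("C", 55.0, 3500, 430),
    ("D", 57.0, 4010, 392),
    ("E", 54.2, 3700, 425),
]

def dominates(u, v):
    # u dominates v: no worse in every objective, strictly better in one.
    no_worse = u[1] >= v[1] and u[2] <= v[2] and u[3] <= v[3]
    strictly_better = u[1] > v[1] or u[2] < v[2] or u[3] < v[3]
    return no_worse and strictly_better

pareto = [d for d in designs
          if not any(dominates(other, d) for other in designs)]
print("Pareto-optimal designs:", [d[0] for d in pareto])
```

    Design E is dominated by A (worse in all three objectives), so a decision maker only has to trade off among the remaining non-dominated candidates.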

  13. Evolutionary Demography

    DEFF Research Database (Denmark)

    Levitis, Daniel

    2015-01-01

    Demography is the quantitative study of population processes, while evolution is a population process that influences all aspects of biological organisms, including their demography. Demographic traits common to all human populations are the products of biological evolution or the interaction of biological and cultural evolution. Demographic variation within and among human populations is influenced by our biology, and therefore by natural selection and our evolutionary background. Demographic methods are necessary for studying populations of other species, and for quantifying evolutionary fitness...

  14. Evolutionary Statistical Procedures

    CERN Document Server

    Baragona, Roberto; Poli, Irene

    2011-01-01

    This proposed text appears to be a good introduction to evolutionary computation for use in applied statistics research. The authors draw from a vast base of knowledge about the current literature in both the design of evolutionary algorithms and statistical techniques. Modern statistical research is on the threshold of solving increasingly complex problems in high dimensions, and the generalization of its methodology to parameters whose estimators do not follow mathematically simple distributions is underway. Many of these challenges involve optimizing functions for which analytic solutions a

  15. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, J.G., E-mail: jglezg2002@gmail.es [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Rubiano, J.G. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Winter, G. [Instituto Universitario de Sistemas Inteligentes y Aplicaciones Numéricas en la Ingeniería, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P. [Departamento de Física, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Instituto Universitario de Estudios Ambientales y Recursos Naturales, Universidad de Las Palmas de Gran Canaria, 35001 Las Palmas de Gran Canaria (Spain); Bolivar, J.P. [Departamento de Física Aplicada, Universidad de Huelva, 21071 Huelva (Spain)

    2017-06-21

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been
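
    As described above, the evolutionary algorithm minimizes one FEPE discrepancy per reference geometry. The sketch below illustrates such a two-objective fitness; the stand-in "simulation" and all parameter names are hypothetical, since the real evaluation is a Monte Carlo transport run:

    ```python
    # Sketch of a two-objective fitness for fitting detector-model geometry:
    # one discrepancy objective per reference sample geometry. The stand-in
    # "simulation" below is purely illustrative, not physics.
    import numpy as np

    ENERGIES = np.array([59.5, 661.7, 1332.5])  # keV, illustrative only

    def simulate_fepe(params, geometry):
        """Stand-in for the Monte Carlo code: a smooth, parameter-dependent
        efficiency curve (hypothetical, for demonstration only)."""
        scale = params["crystal_radius"] * (0.8 if geometry == "large_beaker" else 1.0)
        return scale / np.sqrt(ENERGIES)

    def fitness(params, references):
        """One objective per geometry: RMS relative FEPE discrepancy."""
        objs = []
        for geometry, ref in references.items():
            rel = (simulate_fepe(params, geometry) - ref) / ref
            objs.append(float(np.sqrt(np.mean(rel ** 2))))
        return objs  # minimized jointly by the multi-objective EA

    refs = {"small_beaker": 3.0 / np.sqrt(ENERGIES),
            "large_beaker": 2.5 / np.sqrt(ENERGIES)}
    print(fitness({"crystal_radius": 3.1}, refs))
    ```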

  16. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    International Nuclear Information System (INIS)

    Guerra, J.G.; Rubiano, J.G.; Winter, G.; Guerra, A.G.; Alonso, H.; Arnedo, M.A.; Tejera, A.; Martel, P.; Bolivar, J.P.

    2017-01-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs. - Highlights: • A computational method for characterizing HPGe detectors has been generalized. • The new version is usable for a wider range of sample geometries. • It starts from reference FEPEs obtained through a standard calibration procedure. • A model of an HPGe XtRa detector has been

  17. Evolutionary Expectations

    DEFF Research Database (Denmark)

    Nash, Ulrik William

    2014-01-01

    The concept of evolutionary expectations descends from cue learning psychology, synthesizing ideas on rational expectations with ideas on bounded rationality, to provide support for these ideas simultaneously. Evolutionary expectations are rational, but within cognitive bounds. Moreover, they are correlated among people who share environments, because these individuals satisfice within their cognitive bounds by using cues in order of validity, as opposed to using cues arbitrarily. Any difference in expectations thereby arises from differences in cognitive ability, because two individuals with identical cognitive bounds will perceive business opportunities identically. In addition, because cues provide information about latent causal structures of the environment, changes in causality must be accompanied by changes in cognitive representations if adaptation is to be maintained...

  18. [Evolutionary medicine].

    Science.gov (United States)

    Wjst, M

    2013-12-01

    Evolutionary medicine allows new insights into long-standing medical problems. Are we really "stone agers in the fast lane"? This insight might have enormous consequences and will allow new answers that could never be provided by traditional anthropology. Only now is this made possible, using data from molecular medicine and systems biology. Evolutionary medicine is thereby taking the leap from a merely theoretical discipline to practical fields - reproductive, nutritional and preventive medicine, as well as microbiology, immunology and psychiatry. Evolutionary medicine is not another "just-so story" but a serious candidate for the medical curriculum, providing a universal understanding of health and disease based on our biological origin. © Georg Thieme Verlag KG Stuttgart · New York.

  19. Evolutionary Awareness

    Directory of Open Access Journals (Sweden)

    Gregory Gorelik

    2014-10-01

    Full Text Available In this article, we advance the concept of “evolutionary awareness,” a metacognitive framework that examines human thought and emotion from a naturalistic, evolutionary perspective. We begin by discussing the evolution and current functioning of the moral foundations on which our framework rests. Next, we discuss the possible applications of such an evolutionarily-informed ethical framework to several domains of human behavior, namely: sexual maturation, mate attraction, intrasexual competition, culture, and the separation between various academic disciplines. Finally, we discuss ways in which an evolutionary awareness can inform our cross-generational activities—which we refer to as “intergenerational extended phenotypes”—by helping us to construct a better future for ourselves, for other sentient beings, and for our environment.

  20. Industrial Applications of Evolutionary Algorithms

    CERN Document Server

    Sanchez, Ernesto; Tonda, Alberto

    2012-01-01

    This book is intended as a reference both for experienced users of evolutionary algorithms and for researchers that are beginning to approach these fascinating optimization techniques. Experienced users will find interesting details of real-world problems, and advice on solving issues related to fitness computation, modeling and setting appropriate parameters to reach optimal solutions. Beginners will find a thorough introduction to evolutionary computation, and a complete presentation of all evolutionary algorithms exploited to solve different problems. The book could fill the gap between the

  1. Computational analyses of an evolutionary arms race between mammalian immunity mediated by immunoglobulin A and its subversion by bacterial pathogens.

    Directory of Open Access Journals (Sweden)

    Ana Pinheiro

    Full Text Available IgA is the predominant immunoglobulin isotype in mucosal tissues and external secretions, playing important roles both in defense against pathogens and in maintenance of commensal microbiota. Considering the complexity of its interactions with the surrounding environment, IgA is a likely target for diversifying or positive selection. To investigate this possibility, the action of natural selection on IgA was examined in depth with six different methods: CODEML from the PAML package and the SLAC, FEL, REL, MEME and FUBAR methods implemented in the Datamonkey webserver. In considering just primate IgA, these analyses show that diversifying selection targeted five positions of the Cα1 and Cα2 domains of IgA. Extending the analysis to include other mammals identified 18 positively selected sites: ten in Cα1, five in Cα2 and three in Cα3. All but one of these positions display variation in polarity and charge. Their structural locations suggest they indirectly influence the conformation of sites on IgA that are critical for interaction with host IgA receptors and also with proteins produced by mucosal pathogens that prevent their elimination by IgA-mediated effector mechanisms. Demonstrating the plasticity of IgA in the evolution of different groups of mammals, only two of the eighteen positions selected across all mammals are among the five positions selected in primates. That IgA residues subject to positive selection impact sites targeted both by host receptors and subversive pathogen ligands highlights the evolutionary arms race playing out between mammals and pathogens, and further emphasizes the importance of IgA in protection against mucosal pathogens.
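
    With six site-selection methods in play, one practical way to summarize their output is a consensus call over the flagged sites. The sketch below is illustrative only: the site lists are invented, and the paper's own aggregation procedure may differ:

    ```python
    # Sketch: consensus call of positively selected sites across several
    # detection methods (site lists here are hypothetical examples).
    from collections import Counter

    detections = {
        "CODEML": {34, 51, 120},
        "SLAC":   {34, 120},
        "FEL":    {34, 51},
        "REL":    {34, 120, 200},
        "MEME":   {51, 120},
        "FUBAR":  {34, 51, 120},
    }

    def consensus_sites(detections, min_methods=3):
        """Return sites flagged by at least `min_methods` of the methods."""
        counts = Counter(site for sites in detections.values() for site in sites)
        return sorted(site for site, n in counts.items() if n >= min_methods)

    print(consensus_sites(detections))  # [34, 51, 120]
    ```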

  2. Providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Charles J.; Faraj, Daniel A.; Inglett, Todd A.; Ratterman, Joseph D.

    2018-01-30

    Methods, apparatus, and products are disclosed for providing full point-to-point communications among compute nodes of an operational group in a global combining network of a parallel computer, each compute node connected to each adjacent compute node in the global combining network through a link, that include: receiving a network packet in a compute node, the network packet specifying a destination compute node; selecting, in dependence upon the destination compute node, at least one of the links for the compute node along which to forward the network packet toward the destination compute node; and forwarding the network packet along the selected link to the adjacent compute node connected to the compute node through the selected link.
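
    Stripped of patent language, the claim describes per-hop routing: receive a packet, select an outgoing link as a function of the destination, forward. A toy sketch of that receive-select-forward behavior on a tree-shaped network follows; the topology and all names are hypothetical illustrations, not the patented mechanism:

    ```python
    # Toy sketch of per-hop forwarding in a tree-shaped combining network:
    # each node selects the adjacent link toward the destination and forwards.
    # The topology and helper names are hypothetical.

    TREE = {0: [1, 2], 1: [0, 3, 4], 2: [0, 5, 6], 3: [1], 4: [1], 5: [2], 6: [2]}

    def reaches(node, dest, parent):
        """True if `dest` lies in the subtree reached from `node` (away from parent)."""
        if node == dest:
            return True
        return any(reaches(link, dest, node) for link in TREE[node] if link != parent)

    def next_hop(node, dest, parent=None):
        """Select the adjacent node on the unique tree path toward `dest`."""
        if dest in TREE[node]:
            return dest
        for link in TREE[node]:
            if link != parent and reaches(link, dest, node):
                return link
        raise ValueError("destination unreachable")

    def route(src, dest):
        path, node, parent = [src], src, None
        while node != dest:
            nxt = next_hop(node, dest, parent)
            path.append(nxt)
            parent, node = node, nxt
        return path

    print(route(3, 6))  # [3, 1, 0, 2, 6]
    ```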

  3. Evolutionary robotics

    Indian Academy of Sciences (India)

    In evolutionary robotics, a suitable robot control system is developed automatically through evolution due to the interactions between the robot and its environment. It is a complicated task, as the robot and the environment constitute a highly dynamical system. Several methods have been tried by various investigators to ...

  4. Computational characterization of HPGe detectors usable for a wide variety of source geometries by using Monte Carlo simulation and a multi-objective evolutionary algorithm

    Science.gov (United States)

    Guerra, J. G.; Rubiano, J. G.; Winter, G.; Guerra, A. G.; Alonso, H.; Arnedo, M. A.; Tejera, A.; Martel, P.; Bolivar, J. P.

    2017-06-01

    In this work, we have developed a computational methodology for characterizing HPGe detectors by implementing in parallel a multi-objective evolutionary algorithm together with a Monte Carlo simulation code. The evolutionary algorithm is used to search for the geometrical parameters of a detector model by minimizing the differences between the efficiencies calculated by Monte Carlo simulation and two reference sets of Full Energy Peak Efficiencies (FEPEs) corresponding to two given sample geometries: a beaker of small diameter laid over the detector window and a beaker of large capacity which wraps the detector. This methodology is a generalization of a previously published work, which was limited to beakers placed over the window of the detector with a diameter equal to or smaller than the crystal diameter, so that the crystal mount cap (which surrounds the lateral surface of the crystal) was not considered in the detector model. The generalization has been accomplished not only by including such a mount cap in the model, but also by using multi-objective optimization instead of mono-objective optimization, with the aim of building a model sufficiently accurate for a wider variety of beakers commonly used for the measurement of environmental samples by gamma spectrometry, such as Marinelli beakers, Petri dishes, or any other beaker with a diameter larger than the crystal diameter, for which part of the detected radiation has to pass through the mount cap. The proposed methodology has been applied to an HPGe XtRa detector, providing a detector model which has been successfully verified for different source-detector geometries and materials and experimentally validated using CRMs.

  5. Applying evolutionary anthropology.

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. © 2015 Wiley Periodicals, Inc.

  6. Applying Evolutionary Anthropology

    Science.gov (United States)

    Gibson, Mhairi A; Lawson, David W

    2015-01-01

    Evolutionary anthropology provides a powerful theoretical framework for understanding how both current environments and legacies of past selection shape human behavioral diversity. This integrative and pluralistic field, combining ethnographic, demographic, and sociological methods, has provided new insights into the ultimate forces and proximate pathways that guide human adaptation and variation. Here, we present the argument that evolutionary anthropological studies of human behavior also hold great, largely untapped, potential to guide the design, implementation, and evaluation of social and public health policy. Focusing on the key anthropological themes of reproduction, production, and distribution we highlight classic and recent research demonstrating the value of an evolutionary perspective to improving human well-being. The challenge now comes in transforming relevance into action and, for that, evolutionary behavioral anthropologists will need to forge deeper connections with other applied social scientists and policy-makers. We are hopeful that these developments are underway and that, with the current tide of enthusiasm for evidence-based approaches to policy, evolutionary anthropology is well positioned to make a strong contribution. PMID:25684561

  7. Combined experimental and computational modelling studies of the solubility of nickel in strontium titanate

    NARCIS (Netherlands)

    Beale, A.M.; Paul, M.; Sankar, G.; Oldman, R.J.; Catlow, R.A.; French, S.; Fowles, M.

    2009-01-01

    A combination of X-ray techniques and atomistic computational modelling has been used to study the solubility of Ni in SrTiO3 in relation to the application of this material for the catalytic partial oxidation of methane. The experiments have demonstrated that low temperature, hydrothermal synthesis

  8. A City Parking Integration System Combined with Cloud Computing Technologies and Smart Mobile Devices

    Science.gov (United States)

    Yeh, Her-Tyan; Chen, Bing-Chang; Wang, Bo-Xun

    2016-01-01

    The current study applied cloud computing technology and smart mobile devices combined with a streaming server for parking lots to plan a city parking integration system. It is also equipped with a parking search system, parking navigation system, parking reservation service, and car retrieval service. With this system, users can quickly find…

  9. Combining Self-Explaining with Computer Architecture Diagrams to Enhance the Learning of Assembly Language Programming

    Science.gov (United States)

    Hung, Y.-C.

    2012-01-01

    This paper investigates the impact of combining self explaining (SE) with computer architecture diagrams to help novice students learn assembly language programming. Pre- and post-test scores for the experimental and control groups were compared and subjected to covariance (ANCOVA) statistical analysis. Results indicate that the SE-plus-diagram…

  10. Computational modeling of Repeat1 region of INI1/hSNF5: An evolutionary link with ubiquitin

    Science.gov (United States)

    Bhutoria, Savita

    2016-01-01

    Abstract The structure of a protein can be very informative of its function. However, determining protein structures experimentally can often be very challenging. Computational methods have been used successfully in modeling structures with sufficient accuracy. Here we have used computational tools to predict the structure of an evolutionarily conserved and functionally significant domain of Integrase interactor (INI)1/hSNF5 protein. INI1 is a component of the chromatin remodeling SWI/SNF complex, a tumor suppressor and is involved in many protein‐protein interactions. It belongs to SNF5 family of proteins that contain two conserved repeat (Rpt) domains. Rpt1 domain of INI1 binds to HIV‐1 Integrase, and acts as a dominant negative mutant to inhibit viral replication. Rpt1 domain also interacts with oncogene c‐MYC and modulates its transcriptional activity. We carried out an ab initio modeling of a segment of INI1 protein containing the Rpt1 domain. The structural model suggested the presence of a compact and well defined ββαα topology as core structure in the Rpt1 domain of INI1. This topology in Rpt1 was similar to PFU domain of Phospholipase A2 Activating Protein, PLAA. Interestingly, PFU domain shares similarity with Ubiquitin and has ubiquitin binding activity. Because of the structural similarity between Rpt1 domain of INI1 and PFU domain of PLAA, we propose that Rpt1 domain of INI1 may participate in ubiquitin recognition or binding with ubiquitin or ubiquitin related proteins. This modeling study may shed light on the mode of interactions of Rpt1 domain of INI1 and is likely to facilitate future functional studies of INI1. PMID:27261671

  11. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm opti

  12. Heat Transfer Computations of Internal Duct Flows With Combined Hydraulic and Thermal Developing Length

    Science.gov (United States)

    Wang, C. R.; Towne, C. E.; Hippensteele, S. A.; Poinsatte, P. E.

    1997-01-01

    This study investigated Navier-Stokes computations of the surface heat transfer coefficients of a transition duct flow. A transition duct, from an axisymmetric cross section to a non-axisymmetric cross section, is usually used to connect the turbine exit to the nozzle. As the gas turbine inlet temperature increases, the transition duct is subjected to the high temperature at the gas turbine exit. The transition duct flow undergoes combined hydraulic and thermal entry-length development. The design of the transition duct requires accurate surface heat transfer coefficients. The Navier-Stokes computational method can be used to predict the surface heat transfer coefficients of a transition duct flow. The Proteus three-dimensional Navier-Stokes numerical computational code was used in this study. The code was first applied to compute the turbulent developing flow properties within a circular duct and a square duct. The code was then used to compute the turbulent flow properties of a transition duct flow. The computational results for the surface pressure, the skin friction factor, and the surface heat transfer coefficient were described and compared with values obtained from theoretical analyses or experiments. The comparison showed that the Navier-Stokes computation can predict approximately the surface heat transfer coefficients of a transition duct flow.

  13. Evolutionary institutionalism.

    Science.gov (United States)

    Fürstenberg, Dr Kai

    Institutions are hard to define and hard to study. Long prominent in political science have been two theories: Rational Choice Institutionalism (RCI) and Historical Institutionalism (HI). Arising from the life sciences is now a third: Evolutionary Institutionalism (EI). Comparative strengths and weaknesses of these three theories warrant review, and the value-to-be-added by expanding the third beyond Darwinian evolutionary theory deserves consideration. Should evolutionary institutionalism expand to accommodate new understanding in ecology, such as might apply to the emergence of stability, and in genetics, such as might apply to political behavior? Core arguments are reviewed for each theory with more detailed exposition of the third, EI. Particular attention is paid to EI's gene-institution analogy; to variation, selection, and retention of institutional traits; to endogeneity and exogeneity; to agency and structure; and to ecosystem effects, institutional stability, and empirical limitations in behavioral genetics. RCI, HI, and EI are distinct but complementary. Institutional change, while amenable to rational-choice analysis and, retrospectively, to critical-juncture and path-dependency analysis, is also, and importantly, ecological. Stability, like change, is an emergent property of institutions, which tend to stabilize after change in a manner analogous to allopatric speciation. EI is more than metaphorically biological in that institutional behaviors are driven by human behaviors whose evolution long preceded the appearance of institutions themselves.

  14. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing (CS)-based reconstruction methods. However, these methods have some disadvantages, including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm to accelerate convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques can be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
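
    Two of the building blocks named above are compact enough to state directly: the soft-threshold operator at the heart of TDM-STF and the FISTA extrapolation step used for acceleration. The following is a schematic sketch of both in isolation, not the paper's implementation:

    ```python
    # Sketch of two building blocks mentioned above: soft-threshold filtering
    # and the FISTA momentum update (schematic, not the paper's code).
    import numpy as np

    def soft_threshold(x, tau):
        """Shrink values toward zero: the proximal operator of the L1 norm."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def fista_momentum(x_new, x_old, t_old):
        """One FISTA extrapolation step applied to successive iterates."""
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t_old ** 2)) / 2.0
        y = x_new + ((t_old - 1.0) / t_new) * (x_new - x_old)
        return y, t_new

    x = np.array([-0.3, 0.05, 1.2])
    print(soft_threshold(x, 0.1))  # [-0.2  0.   1.1]

    y, t = fista_momentum(np.array([1.0]), np.array([0.8]), 1.5)
    print(y)  # extrapolated iterate, slightly past x_new
    ```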

  15. [Computational fluid dynamics simulation of different impeller combinations in high viscosity fermentation and its application].

    Science.gov (United States)

    Dong, Shuhao; Zhu, Ping; Xu, Xiaoying; Li, Sha; Jiang, Yongxiang; Xu, Hong

    2015-07-01

    Agitators are one of the essential factors in realizing highly efficient fermentation of highly aerobic and viscous microorganisms, and the influence of different impeller combinations on the fermentation process is very important. Welan gum is a microbial exopolysaccharide produced by Alcaligenes sp. under highly aerobic and highly viscous conditions. Computational fluid dynamics (CFD) numerical simulation was used to analyze the distribution of velocity, shear rate and gas holdup in the welan fermentation reactor under six different impeller combinations. The best three combinations of impellers were applied to the fermentation of welan. Analysis of the fermentation performance showed that the MB-4-6 combination had a better effect on dissolved oxygen and velocity. The content of welan was increased by 13%. Furthermore, the viscosity of the product was also increased.

  16. Open Issues in Evolutionary Robotics.

    Science.gov (United States)

    Silva, Fernando; Duarte, Miguel; Correia, Luís; Oliveira, Sancho Moura; Christensen, Anders Lyhne

    2016-01-01

    One of the long-term goals in evolutionary robotics is to be able to automatically synthesize controllers for real autonomous robots based only on a task specification. While a number of studies have shown the applicability of evolutionary robotics techniques for the synthesis of behavioral control, researchers have consistently been faced with a number of issues preventing the widespread adoption of evolutionary robotics for engineering purposes. In this article, we review and discuss the open issues in evolutionary robotics. First, we analyze the benefits and challenges of simulation-based evolution and subsequent deployment of controllers versus evolution on real robotic hardware. Second, we discuss specific evolutionary computation issues that have plagued evolutionary robotics: (1) the bootstrap problem, (2) deception, and (3) the role of genomic encoding and genotype-phenotype mapping in the evolution of controllers for complex tasks. Finally, we address the absence of standard research practices in the field. We also discuss promising avenues of research. Our underlying motivation is the reduction of the current gap between evolutionary robotics and mainstream robotics, and the establishment of evolutionary robotics as a canonical approach for the engineering of autonomous robots.

  17. An online hybrid brain-computer interface combining multiple physiological signals for webpage browse.

    Science.gov (United States)

    Long Chen; Zhongpeng Wang; Feng He; Jiajia Yang; Hongzhi Qi; Peng Zhou; Baikun Wan; Dong Ming

    2015-08-01

    The hybrid brain-computer interface (hBCI) can provide a higher information transfer rate than classical BCIs. It includes more than one brain-computer or human-machine interaction paradigm, such as the combination of the P300 and SSVEP paradigms. This research first constructed independent subsystems for three different paradigms and tested each of them in online experiments. We then constructed a serial hybrid BCI system which combined these paradigms to achieve the functions of typing letters, moving and clicking a cursor, and switching among them for the purpose of browsing webpages. Five subjects were involved in this study. They all successfully realized these functions in the online tests. The subjects could achieve an accuracy above 90% after training, which met the requirement for operating the system efficiently. The results demonstrated that this is an efficient and robust system, which provides an approach for clinical application.

  18. Realization of the Evristic Combination Methods by Means of Computer Graphics

    Directory of Open Access Journals (Sweden)

    S. A. Novoselov

    2012-01-01

    Full Text Available The paper looks at ways of enhancing and stimulating the creative activity and initiative of pedagogy students - the prospective specialists called on to educate socially and professionally competent, original-thinking, versatile personalities. For developing their creative abilities the author recommends introducing heuristic combination methods, applied to facilitate engineering creativity; associative-synectic technology; and computer graphics tools. The paper contains a comparative analysis of the main heuristic method operations and the computer graphics editor in creating a visual composition. Examples of implementing the heuristic combination methods are described, along with extracts from the laboratory classes designed for developing creativity and its motivation. Testing of the given method in several universities confirms its prospects for enhancing students' learning and creative activities.

  19. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development, aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown, and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.
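
    The synthetic-Doppler step can be understood as projecting each computed velocity vector onto the ultrasound beam direction. The sketch below shows only that projection, as a simplifying assumption about the pipeline; real CDI simulation also involves beam sampling, aliasing and noise models:

    ```python
    # Sketch: project a CFD velocity field onto the ultrasound beam direction
    # to obtain synthetic Doppler velocities (simplified illustration).
    import numpy as np

    def synthetic_doppler(velocities, beam_direction):
        """velocities: (N, 3) array of velocity vectors at sample points.
        beam_direction: 3-vector pointing from the transducer into the field.
        Returns the signed along-beam velocity component at each point."""
        beam = np.asarray(beam_direction, dtype=float)
        beam /= np.linalg.norm(beam)
        return np.asarray(velocities) @ beam

    v = np.array([[0.0, 0.0, -1.5], [0.2, 0.1, -0.8]])  # m/s
    print(synthetic_doppler(v, [0.0, 0.0, -1.0]))       # [1.5 0.8]
    ```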

  20. Computational Identification of Potential Multi-drug Combinations for Reduction of Microglial Inflammation in Alzheimer Disease

    Directory of Open Access Journals (Sweden)

    Thomas J. Anastasio

    2015-06-01

    Full Text Available Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action.
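
    The 1024 combinations are simply all on/off subsets of the 10 drugs, so the screen is an exhaustive loop over 2^10 masks. In the sketch below the microglia model is replaced by a toy additive score with invented effect sizes; the study's real model is far richer and is what yields the 7 reported combinations:

    ```python
    # Sketch of the exhaustive screen over all 2**10 = 1024 on/off drug
    # combinations; the microglia model is replaced by a toy additive score
    # (drug names and effect sizes are hypothetical).
    from itertools import product

    EFFECT = {f"drug_{i}": w for i, w in enumerate(
        [0.20, 0.15, 0.05, 0.02, 0.18, 0.01, 0.03, 0.12, 0.04, 0.02])}

    def phenotype_shift(active):
        """Toy stand-in for the simulated microglia model: fraction of the
        neurotoxic-to-neuroprotective transition achieved (capped at 1.0)."""
        return min(1.0, sum(EFFECT[d] for d in active))

    hits = []
    for mask in product([0, 1], repeat=len(EFFECT)):      # 1024 combinations
        active = [d for d, on in zip(EFFECT, mask) if on]
        if phenotype_shift(active) >= 0.5:                # >= 50% shift
            hits.append(tuple(active))

    print(len(hits), "of", 2 ** len(EFFECT), "combinations pass the screen")
    ```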

  1. Computational identification of potential multi-drug combinations for reduction of microglial inflammation in Alzheimer disease.

    Science.gov (United States)

    Anastasio, Thomas J

    2015-01-01

    Like other neurodegenerative diseases, Alzheimer Disease (AD) has a prominent inflammatory component mediated by brain microglia. Reducing microglial inflammation could potentially halt or at least slow the neurodegenerative process. A major challenge in the development of treatments targeting brain inflammation is the sheer complexity of the molecular mechanisms that determine whether microglia become inflammatory or take on a more neuroprotective phenotype. The process is highly multifactorial, raising the possibility that a multi-target/multi-drug strategy could be more effective than conventional monotherapy. This study takes a computational approach in finding combinations of approved drugs that are potentially more effective than single drugs in reducing microglial inflammation in AD. This novel approach exploits the distinct advantages of two different computer programming languages, one imperative and the other declarative. Existing programs written in both languages implement the same model of microglial behavior, and the input/output relationships of both programs agree with each other and with data on microglia over an extensive test battery. Here the imperative program is used efficiently to screen the model for the most efficacious combinations of 10 drugs, while the declarative program is used to analyze in detail the mechanisms of action of the most efficacious combinations. Of the 1024 possible drug combinations, the simulated screen identifies only 7 that are able to move simulated microglia at least 50% of the way from a neurotoxic to a neuroprotective phenotype. Subsequent analysis shows that of the 7 most efficacious combinations, 2 stand out as superior both in strength and reliability. The model offers many experimentally testable and therapeutically relevant predictions concerning effective drug combinations and their mechanisms of action.

  2. Exploring Tradeoffs in Demand-Side and Supply-Side Management of Urban Water Resources Using Agent-Based Modeling and Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    Lufthansa Kanta

    2015-11-01

    Full Text Available Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger: (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir; and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for a historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.
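
    The optimized decisions are storage-level triggers: thresholds on reservoir storage that switch on inter-basin pumping and impose staged outdoor-use restrictions. A minimal sketch of that trigger logic, with entirely hypothetical threshold values, is:

    ```python
    # Sketch of storage-triggered policy decisions: inter-basin pumping and
    # drought stages keyed to the reservoir storage fraction. All threshold
    # values are hypothetical placeholders, not the study's optimized values.

    PUMP_TRIGGER = 0.70      # below this storage fraction, increase transfers
    DROUGHT_STAGES = [       # (storage fraction, allowed outdoor-use fraction)
        (0.60, 0.75),
        (0.45, 0.50),
        (0.30, 0.25),
    ]

    def policy(storage_fraction):
        """Return (pump_transfers_on, allowed_outdoor_use_fraction)."""
        pump = storage_fraction < PUMP_TRIGGER
        allowed_outdoor = 1.0
        for trigger, allowed in DROUGHT_STAGES:  # ordered loosest to tightest
            if storage_fraction < trigger:
                allowed_outdoor = allowed        # keep the tightest stage hit
        return pump, allowed_outdoor

    print(policy(0.40))  # (True, 0.5): transfers on, outdoor use halved
    ```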

  3. Evolutionary rescue and local adaptation under different rates of temperature increase: a combined analysis of changes in phenotype expression and genotype frequency in Paramecium microcosms.

    Science.gov (United States)

    Killeen, Joshua; Gougat-Barbera, Claire; Krenek, Sascha; Kaltz, Oliver

    2017-04-01

    Evolutionary rescue (ER) occurs when populations, which have declined due to rapid environmental change, recover through genetic adaptation. The success of this process and the evolutionary trajectory of the population strongly depend on the rate of environmental change. Here we investigated how different rates of temperature increase (from 23 to 32 °C) affect population persistence and evolutionary change in experimental microcosms of the protozoan Paramecium caudatum. Consistent with theory on ER, we found that those populations experiencing the slowest rate of temperature increase were the least likely to become extinct and tended to be the best adapted to the new temperature environment. All high-temperature populations were more tolerant to severe heat stress (35, 37 °C), indicating a common mechanism of heat protection. High-temperature populations also had superior growth rates at optimum temperatures, leading to the absence of a pattern of local adaptation to control (23 °C) and high-temperature (32 °C) environments. However, high-temperature populations had reduced growth at low temperatures (5-9 °C), causing a shift in the temperature niche. In part, the observed evolutionary change can be explained by selection from standing variation. Using mitochondrial markers, we found complete divergence between control and high-temperature populations in the frequencies of six initial founder genotypes. Our results confirm basic predictions of ER and illustrate how adaptation to an extreme local environment can produce positive as well as negative correlated responses to selection over the entire range of the ecological niche. © 2017 John Wiley & Sons Ltd.

  4. Genetical Genomics for Evolutionary Studies

    NARCIS (Netherlands)

    Prins, J.C.P.; Smant, G.; Jansen, R.C.

    2012-01-01

    Genetical genomics combines acquired high-throughput genomic data with genetic analysis. In this chapter, we discuss the application of genetical genomics for evolutionary studies, where new high-throughput molecular technologies are combined with mapping quantitative trait loci (QTL) on the genome

  5. Computational Combination of the Optical Properties of Fenestration Layers at High Directional Resolution

    Directory of Open Access Journals (Sweden)

    Lars Oliver Grobe

    2017-03-01

    Full Text Available Complex fenestration systems typically comprise co-planar, clear and scattering layers. As there are many ways to combine layers in fenestration systems, a common approach in building simulation is to store optical properties separately for each layer. System properties are then computed employing a fast matrix formalism, often based on a directional basis devised by J. H. Klems, comprising 145 incident and 145 outgoing directions. While this low directional resolution is found sufficient to predict illuminance and solar gains, it is too coarse to replicate the effects of directionality in the generation of imagery. For increased accuracy, a modification of the matrix formalism is proposed. The tensor-tree format of RADIANCE, employing an algorithm subdividing the hemisphere at variable resolutions, replaces the directional basis. The utilization of the tensor-tree with interfaces to simulation software allows sharing and re-use of data. The light scattering properties of two exemplary fenestration systems, as computed employing the matrix formalism at variable resolution, show good agreement with the results of ray-tracing. Computation times are reduced to between 0.4% and 2.5% of those for ray-tracing through co-planar layers. Imagery computed employing the method illustrates the effect of directional resolution. The method is expected to foster research in the field of daylighting, as well as applications in planning and design.
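
    At the heart of the matrix formalism, the directional transmission of a layered system is obtained by multiplying the layers' directional matrices. The numpy sketch below makes the simplifying assumptions that propagation terms are folded into each layer matrix and that interreflections between layers are neglected; the matrices themselves are random placeholders:

    ```python
    # Minimal sketch of the matrix formalism for combining fenestration layers:
    # with propagation terms folded into each layer matrix and interreflections
    # neglected, the front-to-back transmission of the stack is a matrix
    # product over the 145-direction Klems basis.
    import numpy as np

    N = 145  # Klems basis: 145 incident x 145 outgoing directions

    rng = np.random.default_rng(0)
    T_layer1 = rng.random((N, N)) / N   # placeholder layer transmission matrices
    T_layer2 = rng.random((N, N)) / N

    T_system = T_layer2 @ T_layer1      # outgoing of layer 1 feeds layer 2

    incident = np.zeros(N)
    incident[7] = 1.0                   # unit flux from one incident direction
    outgoing = T_system @ incident      # outgoing directional distribution
    print(outgoing.shape)               # (145,)
    ```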

  6. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    Science.gov (United States)

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.

  7. Computational Fluid Dynamic Modeling of Rocket Based Combined Cycle Engine Flowfields

    Science.gov (United States)

    Daines, Russell L.; Merkle, Charles L.

    1994-01-01

    Computational Fluid Dynamic techniques are used to study the flowfield of a fixed geometry Rocket Based Combined Cycle engine operating in rocket ejector mode. Heat addition resulting from the combustion of injected fuel causes the subsonic engine flow to choke and go supersonic in the slightly divergent combustor-mixer section. Reacting flow computations are undertaken to predict the characteristics of solutions where the heat addition is determined by the flowfield. Here, adaptive gridding is used to improve resolution in the shear layers. Results show that the sonic speed is reached in the unheated portions of the flow first, while the heated portions become supersonic later. Comparison with results from another code shows reasonable agreement. The coupled solutions show that the character of the combustion-based thermal choking phenomenon can be controlled reasonably well, such that there is an opportunity to optimize the length and expansion ratio of the combustor-mixer.

  8. Spore: Spawning Evolutionary Misconceptions?

    Science.gov (United States)

    Bean, Thomas E.; Sinatra, Gale M.; Schrader, P. G.

    2010-10-01

    The use of computer simulations as educational tools may afford the means to develop understanding of evolution as a natural, emergent, and decentralized process. However, special consideration of developmental constraints on learning may be necessary when using these technologies. Specifically, the essentialist (biological forms possess an immutable essence), teleological (assignment of purpose to living things and/or parts of living things that may not be purposeful), and intentionality (assumption that events are caused by an intelligent agent) biases may be reinforced through the use of computer simulations, rather than addressed with instruction. We examine the video game Spore for its depiction of evolutionary content and its potential to reinforce these cognitive biases. In particular, we discuss three pedagogical strategies to mitigate weaknesses of Spore and other computer simulations: directly targeting misconceptions through refutational approaches, targeting specific principles of scientific inquiry, and directly addressing issues related to models as cognitive tools.

  9. Computed tomography diagnosis of neonatal hypoxic ischemic encephalopathy combined with intracranial hemorrhage and clinical nursing treatment.

    Science.gov (United States)

    Zhang, Y; Zhang, J L; Li, Y

    2016-01-01

    Hypoxic ischemic encephalopathy (HIE), one of the common causes of neonatal disability, is likely to induce nervous system-associated sequelae and even intracranial hemorrhage in severe cases. The incidence rate of HIE has been rising in recent years. In order to study the clinical nursing effect for HIE combined with intracranial hemorrhage, 76 newborns diagnosed with HIE combined with intracranial hemorrhage by spiral computed tomography (CT) at Binzhou People's Hospital, Shandong, China were selected. They were divided into a control group and an intervention group. The control group received routine nursing, while the intervention group received comprehensive nursing intervention. The experimental results suggested that the mental developmental index (MDI) value and the psychomotor developmental index (PDI) value of patients in the intervention group were much higher than those of the control group, and the difference was significant (p < 0.05). Comprehensive nursing intervention thus helps newborns with HIE combined with intracranial hemorrhage recover more effectively, and is therefore worth applying.

  10. Simulation of computational fluid dynamics and comparison of cephalosporin C fermentation performance with different impeller combinations

    International Nuclear Information System (INIS)

    Duan, Shengbing; Ni, Weijia; Luo, Hongzhen; Shi, Zhongping; Liu, Fan; Yuan, Guoqiang; Zhao, Yanli

    2013-01-01

    Cephalosporin C (CPC) fermentation by Acremonium chrysogenum is an extremely high oxygen-consuming process, and the oxygen transfer rate in a bioreactor directly affects fermentation performance. In this study, fluid dynamics and oxygen transfer in a 7 L bioreactor with different impeller combinations were simulated by a computational fluid dynamics (CFD) model. Based on the simulation results, two impeller combinations with a higher oxygen transfer rate (K_La) were selected to conduct CPC fermentations, aiming at achieving high CPC concentration and low accumulation of the major by-product, deacetoxycephalosporin (DAOC). It was found that an impeller combination with a higher K_La and moderate shear force is the prerequisite for efficient CPC production in a stirred bioreactor. The best impeller combination, with a six-bladed turbine at the bottom layer and a four-pitched-blade turbine at the upper layer but with a shortened inter-impeller distance, produced the highest CPC concentration of 35.77 g/L and the lowest DAOC/CPC ratio of 0.5%.

  11. Simulation of computational fluid dynamics and comparison of cephalosporin C fermentation performance with different impeller combinations

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Shengbing; Ni, Weijia; Luo, Hongzhen; Shi, Zhongping; Liu, Fan [Jiangnan University, Wuxi (China); Yuan, Guoqiang; Zhao, Yanli [CSPC Hebei Zhongrun Pharmaceutical Co. Ltd., Shijiazhuang (China)

    2013-05-15

    Cephalosporin C (CPC) fermentation by Acremonium chrysogenum is an extremely high oxygen-consuming process, and the oxygen transfer rate in a bioreactor directly affects fermentation performance. In this study, fluid dynamics and oxygen transfer in a 7 L bioreactor with different impeller combinations were simulated by a computational fluid dynamics (CFD) model. Based on the simulation results, two impeller combinations with a higher oxygen transfer rate (K_La) were selected to conduct CPC fermentations, aiming at achieving high CPC concentration and low accumulation of the major by-product, deacetoxycephalosporin (DAOC). It was found that an impeller combination with a higher K_La and moderate shear force is the prerequisite for efficient CPC production in a stirred bioreactor. The best impeller combination, with a six-bladed turbine at the bottom layer and a four-pitched-blade turbine at the upper layer but with a shortened inter-impeller distance, produced the highest CPC concentration of 35.77 g/L and the lowest DAOC/CPC ratio of 0.5%.

  12. Optimal steel thickness combined with computed radiography for portal imaging of nasopharyngeal cancer patients

    International Nuclear Information System (INIS)

    Wu Shixiu; Jin Xiance; Xie Congying; Cao Guoquan

    2005-01-01

    The poor image quality of conventional metal screen-film portal imaging systems has long been of concern, and various methods have been investigated in an attempt to enhance the quality of portal images. Computed radiography (CR) used in combination with a steel plate provides image enhancement. The optimal thickness of the steel plate was studied by measuring modulation transfer function (MTF) characteristics. Portal images of nasopharyngeal carcinoma patients were taken with both a conventional metal screen-film system and this optimal steel plate-CR combination system. Compared with a conventional metal screen-film system, the CR-metal screen system achieves much higher image contrast. The measured MTF of the CR combination is greater than that of conventional film-screen portal imaging systems, and it also delivers superior imaging performance, as demonstrated by receiver operating characteristic (ROC) analysis. This optimal steel plate-CR portal imaging system is capable of conveniently producing high-contrast portal images.
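
    The MTF characterization mentioned above is conventionally computed as the normalized magnitude of the Fourier transform of a measured line spread function. The following generic sketch shows that computation; it is not the paper's specific measurement procedure:

    ```python
    # Generic sketch: MTF as the normalized magnitude of the Fourier transform
    # of a measured line spread function (not the paper's specific procedure).
    import numpy as np

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """Return spatial frequencies (cycles/mm) and the normalized MTF."""
        lsf = np.asarray(lsf, dtype=float)
        lsf = lsf / lsf.sum()                  # normalize area to 1
        spectrum = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
        return freqs, spectrum / spectrum[0]   # so that MTF(0) = 1

    lsf = np.exp(-0.5 * (np.linspace(-5, 5, 101) / 0.8) ** 2)  # toy Gaussian LSF
    freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.1)
    print(float(mtf[1]))  # MTF at the first nonzero spatial frequency
    ```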

  13. Exploring combinations of auditory and visual stimuli for gaze-independent brain-computer interfaces.

    Directory of Open Access Journals (Sweden)

    Xingwei An

    Full Text Available For Brain-Computer Interface (BCI) systems that are designed for users with severe impairments of the oculomotor system, an appropriate mode of presenting stimuli to the user is crucial. To investigate whether multi-sensory integration can be exploited in the gaze-independent event-related potentials (ERP) speller to enhance BCI performance, we designed a visual-auditory speller. We investigate the possibility of enhancing stimulus presentation by combining visual and auditory stimuli within gaze-independent spellers. In this study with N = 15 healthy users, two different ways of combining the two sensory modalities are proposed: simultaneous redundant streams (Combined-Speller) and interleaved independent streams (Parallel-Speller). Unimodal stimuli were applied as control conditions. The workload, ERP components, classification accuracy and resulting spelling speed were analyzed for each condition. The Combined-Speller showed a lower workload than unimodal paradigms, without sacrificing spelling performance. In addition, shorter latencies, lower amplitudes, and a shift of the temporal and spatial distribution of discriminative information were observed for the Combined-Speller. These results are important and can inspire future studies to investigate the reasons for these differences. For the more innovative and demanding Parallel-Speller, where the auditory and visual domains are independent of each other, a proof of concept was obtained: fifteen users could spell online with a mean accuracy of 87.7% (chance level <3%), showing a competitive average speed of 1.65 symbols per minute. The fact that it requires only one selection period per symbol makes it a good candidate for a fast communication channel. It brings new insight into truly multisensory stimulus paradigms. Novel approaches for combining two sensory modalities were designed here, which are valuable for the development of ERP-based BCI paradigms.

  14. Integrating genomics into evolutionary medicine.

    Science.gov (United States)

    Rodríguez, Juan Antonio; Marigorta, Urko M; Navarro, Arcadi

    2014-12-01

    The application of the principles of evolutionary biology into medicine was suggested long ago and is already providing insight into the ultimate causes of disease. However, a full systematic integration of medical genomics and evolutionary medicine is still missing. Here, we briefly review some cases where the combination of the two fields has proven profitable and highlight two of the main issues hindering the development of evolutionary genomic medicine as a mature field, namely the dissociation between fitness and health and the still considerable difficulties in predicting phenotypes from genotypes. We use publicly available data to illustrate both problems and conclude that new approaches are needed for evolutionary genomic medicine to overcome these obstacles. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Efficient Discovery of Novel Multicomponent Mixtures for Hydrogen Storage: A Combined Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Wolverton, Christopher [Northwestern Univ., Evanston, IL (United States). Dept. of Materials Science and Engineering; Ozolins, Vidvuds [Univ. of California, Los Angeles, CA (United States). Dept. of Materials Science and Engineering; Kung, Harold H. [Northwestern Univ., Evanston, IL (United States). Dept. of Chemical and Biological Engineering; Yang, Jun [Ford Scientific Research Lab., Dearborn, MI (United States); Hwang, Sonjong [California Inst. of Technology (CalTech), Pasadena, CA (United States). Dept. of Chemistry and Chemical Engineering; Shore, Sheldon [The Ohio State Univ., Columbus, OH (United States). Dept. of Chemistry and Biochemistry

    2016-11-28

    The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage which meet the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g. LiNH2+NH3BH3] and nitrogen-hydrogen based borohydrides [e.g. Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility.

  16. A review of combined experimental and computational procedures for assessing biopolymer structure-process-property relationships.

    Science.gov (United States)

    Gronau, Greta; Krishnaji, Sreevidhya T; Kinahan, Michelle E; Giesa, Tristan; Wong, Joyce Y; Kaplan, David L; Buehler, Markus J

    2012-11-01

    Tailored biomaterials with tunable functional properties are desirable for many applications ranging from drug delivery to regenerative medicine. To improve the predictability of biopolymer materials functionality, multiple design parameters need to be considered, along with appropriate models. In this article we review the state of the art of synthesis and processing related to the design of biopolymers, with an emphasis on the integration of bottom-up computational modeling in the design process. We consider three prominent examples of well-studied biopolymer materials - elastin, silk, and collagen - and assess their hierarchical structure, intriguing functional properties and categorize existing approaches to study these materials. We find that an integrated design approach in which both experiments and computational modeling are used has rarely been applied for these materials due to difficulties in relating insights gained on different length- and time-scales. In this context, multiscale engineering offers a powerful means to accelerate the biomaterials design process for the development of tailored materials that suit the needs posed by the various applications. The combined use of experimental and computational tools has a very broad applicability not only in the field of biopolymers, but can be exploited to tailor the properties of other polymers and composite materials in general. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. A review of combined experimental and computational procedures for assessing biopolymer structure–process–property relationships

    Science.gov (United States)

    Gronau, Greta; Krishnaji, Sreevidhya T.; Kinahan, Michelle E.; Giesa, Tristan; Wong, Joyce Y.; Kaplan, David L.; Buehler, Markus J.

    2013-01-01

    Tailored biomaterials with tunable functional properties are desirable for many applications ranging from drug delivery to regenerative medicine. To improve the predictability of biopolymer materials functionality, multiple design parameters need to be considered, along with appropriate models. In this article we review the state of the art of synthesis and processing related to the design of biopolymers, with an emphasis on the integration of bottom-up computational modeling in the design process. We consider three prominent examples of well-studied biopolymer materials – elastin, silk, and collagen – and assess their hierarchical structure, intriguing functional properties and categorize existing approaches to study these materials. We find that an integrated design approach in which both experiments and computational modeling are used has rarely been applied for these materials due to difficulties in relating insights gained on different length- and time-scales. In this context, multiscale engineering offers a powerful means to accelerate the biomaterials design process for the development of tailored materials that suit the needs posed by the various applications. The combined use of experimental and computational tools has a very broad applicability not only in the field of biopolymers, but can be exploited to tailor the properties of other polymers and composite materials in general. PMID:22938765

  18. Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.

    Science.gov (United States)

    Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting

    2018-05-12

    Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry, where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper, we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the model learned at the sensor device are communicated at longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real-world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, while the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that combining fog and cloud computing with distributed data modeling at the sensor device in wireless sensor networks can be beneficial for Industrial Internet of Things applications.
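
    As an illustration of the scheme this abstract describes, here is a minimal sketch of the sensor-side idea. The AR(1) model, window length, and parameter-packet format are assumptions chosen for illustration, not the paper's actual design.

```python
# A minimal sketch of the sensor-side scheme: fit a simple model to a window
# of raw readings and transmit only its parameters; the fog node replays
# (simulates) the stream from those parameters. The AR(1) model, window
# length, and packet format are assumptions, not the paper's actual design.
import numpy as np

class SensorModel:
    def __init__(self, window=100):
        self.window = window
        self.buffer = []

    def add_sample(self, x):
        """Buffer one reading; return a parameter packet once per window."""
        self.buffer.append(x)
        if len(self.buffer) < self.window:
            return None                      # nothing to transmit yet
        y = np.asarray(self.buffer)
        self.buffer = []
        # Least-squares fit of x[t] = a * x[t-1] + b + noise
        A = np.vstack([y[:-1], np.ones(len(y) - 1)]).T
        (a, b), *_ = np.linalg.lstsq(A, y[1:], rcond=None)
        return {"a": a, "b": b, "x0": y[-1], "n": len(y)}

def fog_simulate(packet):
    """Fog-side replay of the stream from one parameter packet."""
    x, out = packet["x0"], []
    for _ in range(packet["n"]):
        x = packet["a"] * x + packet["b"]
        out.append(x)
    return out

# One parameter packet now stands in for `window` raw packets.
```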

  19. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    Science.gov (United States)

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  20. Combining computer modelling and cardiac imaging to understand right ventricular pump function.

    Science.gov (United States)

    Walmsley, John; van Everdingen, Wouter; Cramer, Maarten J; Prinzen, Frits W; Delhaas, Tammo; Lumens, Joost

    2017-10-01

    Right ventricular (RV) dysfunction is a strong predictor of outcome in heart failure and is a key determinant of exercise capacity. Despite these crucial findings, the RV remains understudied in the clinical, experimental, and computer modelling literature. This review outlines how recent advances in using computer modelling and cardiac imaging synergistically help to understand RV function in health and disease. We begin by highlighting the complexity of interactions that make modelling the RV both challenging and necessary, and then summarize the multiscale modelling approaches used to date to simulate RV pump function in the context of these interactions. We go on to demonstrate how these modelling approaches in combination with cardiac imaging have improved understanding of RV pump function in pulmonary arterial hypertension, arrhythmogenic right ventricular cardiomyopathy, dyssynchronous heart failure and cardiac resynchronization therapy, hypoplastic left heart syndrome, and repaired tetralogy of Fallot. We conclude with a perspective on key issues to be addressed by computational models of the RV in the near future. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  1. Combining Topological Hardware and Topological Software: Color-Code Quantum Computing with Topological Superconductor Networks

    Science.gov (United States)

    Litinski, Daniel; Kesselring, Markus S.; Eisert, Jens; von Oppen, Felix

    2017-07-01

    We present a scalable architecture for fault-tolerant topological quantum computation using networks of voltage-controlled Majorana Cooper pair boxes and topological color codes for error correction. Color codes have a set of transversal gates which coincides with the set of topologically protected gates in Majorana-based systems, namely, the Clifford gates. In this way, we establish color codes as providing a natural setting in which advantages offered by topological hardware can be combined with those arising from topological error-correcting software for full-fledged fault-tolerant quantum computing. We provide a complete description of our architecture, including the underlying physical ingredients. We start by showing that in topological superconductor networks, hexagonal cells can be employed to serve as physical qubits for universal quantum computation, and we present protocols for realizing topologically protected Clifford gates. These hexagonal-cell qubits allow for a direct implementation of open-boundary color codes with ancilla-free syndrome read-out and logical T gates via magic-state distillation. For concreteness, we describe how the necessary operations can be implemented using networks of Majorana Cooper pair boxes, and we give a feasibility estimate for error correction in this architecture. Our approach is motivated by nanowire-based networks of topological superconductors, but it could also be realized in alternative settings such as quantum-Hall-superconductor hybrids.

  2. Combining Topological Hardware and Topological Software: Color-Code Quantum Computing with Topological Superconductor Networks

    Directory of Open Access Journals (Sweden)

    Daniel Litinski

    2017-09-01

    Full Text Available We present a scalable architecture for fault-tolerant topological quantum computation using networks of voltage-controlled Majorana Cooper pair boxes and topological color codes for error correction. Color codes have a set of transversal gates which coincides with the set of topologically protected gates in Majorana-based systems, namely, the Clifford gates. In this way, we establish color codes as providing a natural setting in which advantages offered by topological hardware can be combined with those arising from topological error-correcting software for full-fledged fault-tolerant quantum computing. We provide a complete description of our architecture, including the underlying physical ingredients. We start by showing that in topological superconductor networks, hexagonal cells can be employed to serve as physical qubits for universal quantum computation, and we present protocols for realizing topologically protected Clifford gates. These hexagonal-cell qubits allow for a direct implementation of open-boundary color codes with ancilla-free syndrome read-out and logical T gates via magic-state distillation. For concreteness, we describe how the necessary operations can be implemented using networks of Majorana Cooper pair boxes, and we give a feasibility estimate for error correction in this architecture. Our approach is motivated by nanowire-based networks of topological superconductors, but it could also be realized in alternative settings such as quantum-Hall–superconductor hybrids.

  3. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence, and suppress premature convergence in quantum evolutionary algorithms. The proposed algorithm adopts a chaotic initialization method to generate the initial population, which will form a pe... tests. The presented algorithm is applied to urban traffic signal timing optimization, and the results are satisfactory.
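
    The abstract is truncated, so the algorithm's details are not reproduced here. As an illustration of chaotic initialization in a quantum-inspired EA, here is a sketch under the assumption that the chaos source is the standard logistic map; the map choice and parameters are not taken from the paper.

```python
# A sketch of chaotic initialization for a quantum-inspired EA, assuming the
# chaos source is the standard logistic map (the abstract is truncated, so
# the paper's exact construction is not reproduced here). Each individual is
# a string of Q-bits (alpha, beta) with alpha^2 + beta^2 = 1; drawing the
# rotation angles from a chaotic orbit spreads the initial population.
import numpy as np

def logistic_sequence(n, x0=0.4567, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)          # fully chaotic logistic map
        xs.append(x)
    return np.array(xs)

def chaotic_qea_population(pop_size, n_genes):
    theta = 0.5 * np.pi * logistic_sequence(pop_size * n_genes)
    theta = theta.reshape(pop_size, n_genes)
    return np.cos(theta), np.sin(theta)    # (alpha, beta) amplitude arrays

alpha, beta = chaotic_qea_population(pop_size=20, n_genes=32)
# Observing gene j of individual i yields 1 with probability beta[i, j]**2.
```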

  4. Cardiac single-photon emission-computed tomography using combined cone-beam/fan-beam collimation

    International Nuclear Information System (INIS)

    Gullberg, Grant T.; Zeng, Gengsheng L.

    2004-01-01

    The objective of this work is to increase system sensitivity in cardiac single-photon emission-computed tomography (SPECT) studies without increasing patient imaging time. For imaging the heart, convergent collimation offers the potential of increased sensitivity over that of parallel-hole collimation. However, if a cone-beam collimated gamma camera is rotated in a planar orbit, the projection data obtained are not complete. Two cone-beam collimators and one fan-beam collimator are used with a three-detector SPECT system. The combined cone-beam/fan-beam collimation provides a complete set of data for image reconstruction. The imaging geometry is evaluated using data acquired from phantom and patient studies. For the Jaszczak cardiac torso phantom experiment, the combined cone-beam/fan-beam collimation provided 1.7 times greater sensitivity than standard parallel-hole collimation (low-energy high-resolution collimators). Also, phantom and patient comparison studies showed improved image quality. The combined cone-beam/fan-beam imaging geometry with appropriate weighting of the two data sets provides improved system sensitivity while measuring sufficient data for artifact-free cardiac images.

  5. Towards Adaptive Evolutionary Architecture

    DEFF Research Database (Denmark)

    Bak, Sebastian Holt; Rask, Nina; Risi, Sebastian

    2016-01-01

    This paper presents first results from an interdisciplinary project, in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation, called EvoCurtain, we investigate aspects of how...... to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. Architecture-as-it-could-be is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within...

  6. Evolutionary design assistants for architecture

    Directory of Open Access Journals (Sweden)

    N. Onur Sönmez

    2015-04-01

    evolutionary decisions. In this way, the Interleaved EA enables the use of different settings and operators for each of the objectives within an overall task, which would be the same for all objectives in a regular multi-objective EA. This property gives the algorithm a modular structure, which offers an improvable method for the utilization of domain-specific knowledge for each sub-task, i.e., objective. The Interleaved EA can be used by Evolutionary Computation (EC) researchers and by practitioners who employ EC for their tasks. As a third main output, the “Architectural Stem Cells Framework” is a conceptual framework for architectural design assistants. It proposes a dynamic and multi-layered method for combining a set of design assistants for larger tasks in architectural design. The first component of the framework is a layer-based, parallel task decomposition approach, which aims at obtaining a dynamic parallelization of sub-tasks within a more complicated problem. The second component of the framework is a conception for the development mechanisms for building drafts, i.e., Architectural Stem Cells (ASC). An ASC can be conceived as a semantically marked geometric structure, which contains the information that specifies the possibilities and constraints for how an abstract building may develop from an undetailed stage to a fully developed building draft. ASCs are required for re-integrating the separated task layers of an architectural problem through solution-based development. The ASC Framework brings together many of the ideas of this thesis for a practical research agenda and it is presented to the AD researchers in architecture. Finally, the “design_proxy.layout” (d_p.layout) is an architectural layout design assistant based on the design_proxy approach and the IEA. The system uses a relaxed problem definition (producing draft layouts) and a flexible layout representation that permits the overlapping of design units and boundaries. User interaction with the

  7. Fundamentals of natural computing basic concepts, algorithms, and applications

    CERN Document Server

    de Castro, Leandro Nunes

    2006-01-01

    Introduction A Small Sample of Ideas The Philosophy of Natural Computing The Three Branches: A Brief Overview When to Use Natural Computing Approaches Conceptualization General Concepts PART I - COMPUTING INSPIRED BY NATURE Evolutionary Computing Problem Solving as a Search Task Hill Climbing and Simulated Annealing Evolutionary Biology Evolutionary Computing The Other Main Evolutionary Algorithms From Evolutionary Biology to Computing Scope of Evolutionary Computing Neurocomputing The Nervous System Artif

  8. Self-organized modularization in evolutionary algorithms.

    Science.gov (United States)

    Dauscher, Peter; Uthmann, Thomas

    2005-01-01

    The principle of modularization has proven to be extremely successful in the field of technical applications and particularly for Software Engineering purposes. The question to be answered within the present article is whether mechanisms can also be identified within the framework of Evolutionary Computation that cause a modularization of solutions. We will concentrate on processes, where modularization results only from the typical evolutionary operators, i.e. selection and variation by recombination and mutation (and not, e.g., from special modularization operators). This is what we call Self-Organized Modularization. Based on a combination of two formalizations by Radcliffe and Altenberg, some quantitative measures of modularity are introduced. Particularly, we distinguish Built-in Modularity as an inherent property of a genotype and Effective Modularity, which depends on the rest of the population. These measures can easily be applied to a wide range of present Evolutionary Computation models. It will be shown, both theoretically and by simulation, that under certain conditions, Effective Modularity (as defined within this paper) can be a selection factor. This causes Self-Organized Modularization to take place. The experimental observations emphasize the importance of Effective Modularity in comparison with Built-in Modularity. Although the experimental results have been obtained using a minimalist toy model, they can lead to a number of consequences for existing models as well as for future approaches. Furthermore, the results suggest a complex self-amplification of highly modular equivalence classes in the case of respected relations. Since the well-known Holland schemata are just the equivalence classes of respected relations in most Simple Genetic Algorithms, this observation emphasizes the role of schemata as Building Blocks (in comparison with arbitrary subsets of the search space).

  9. Computational Fluid Dynamics Analysis Method Developed for Rocket-Based Combined Cycle Engine Inlet

    Science.gov (United States)

    1997-01-01

    Renewed interest in hypersonic propulsion systems has led to research programs investigating combined cycle engines that are designed to operate efficiently across the flight regime. The Rocket-Based Combined Cycle Engine is a propulsion system under development at the NASA Lewis Research Center. This engine integrates a high specific impulse, low thrust-to-weight, airbreathing engine with a low-impulse, high thrust-to-weight rocket. From takeoff to Mach 2.5, the engine operates as an air-augmented rocket. At Mach 2.5, the engine becomes a dual-mode ramjet; and beyond Mach 8, the rocket is turned back on. One Rocket-Based Combined Cycle Engine variation known as the "Strut-Jet" concept is being investigated jointly by NASA Lewis, the U.S. Air Force, Gencorp Aerojet, General Applied Science Labs (GASL), and Lockheed Martin Corporation. Work thus far has included wind tunnel experiments and computational fluid dynamics (CFD) investigations with the NPARC code. The CFD method was initiated by modeling the geometry of the Strut-Jet with the GRIDGEN structured grid generator. Grids representing a subscale inlet model and the full-scale demonstrator geometry were constructed. These grids modeled one-half of the symmetric inlet flow path, including the precompression plate, diverter, center duct, side duct, and combustor. After the grid generation, full Navier-Stokes flow simulations were conducted with the NPARC Navier-Stokes code. The Chien low-Reynolds-number k-e turbulence model was employed to simulate the high-speed turbulent flow. Finally, the CFD solutions were postprocessed with a Fortran code. This code provided wall static pressure distributions, pitot pressure distributions, mass flow rates, and internal drag. These results were compared with experimental data from a subscale inlet test for code validation; then they were used to help evaluate the demonstrator engine net thrust.

  10. Evolutionary Relationship of the Scale-Bearing Kraken (incertae sedis, Monadofilosa, Cercozoa, Rhizaria): Combining Ultrastructure Data and a Two-Gene Phylogeny.

    Science.gov (United States)

    Dumack, Kenneth; Mylnikov, Alexander P; Bonkowski, Michael

    2017-07-01

    The genus Kraken represents a distinct lineage of filose amoebae within the Cercozoa. Currently a single species, Kraken carinae, has been described. SSU rDNA phylogeny showed an affiliation to the Cercomonadida, branching with weak support at its base, close to Paracercomonas, Metabolomonas, and Brevimastigomonas. Light microscopical analyses showed several unique features of the genus Kraken, but ultrastructure data were lacking. In this study, K. carinae was examined by electron microscopy, and these data, combined with a two-gene phylogeny, were used to give more insight into the evolutionary relationship of the genus Kraken within the Cercozoa. The data confirmed the absence of flagella, but also revealed novel characteristics, such as the presence of extrusomes, osmiophilic bodies, and mitochondria with flat cristae. Surprising was the presence of single-tier scales carried by cell outgrowths, much as is expected of the last common ancestor of the class Imbricatea. The phylogenetic analyses, however, confirmed previous results, indicating Kraken as a sister group to Paracercomonas in the Sarcomonadea with an increased but still low support of 0.98 PP/63 BP. Based on the unique features of Kraken, we establish the Krakenidae fam. nov., which, due to contradictory results in morphology and phylogeny, we assign incertae sedis, Monadofilosa. Copyright © 2017 Elsevier GmbH. All rights reserved.

  11. Combined X-ray fluorescence and absorption computed tomography using a synchrotron beam

    International Nuclear Information System (INIS)

    Hall, C

    2013-01-01

    X-ray computed tomography (CT) and fluorescence X-ray computed tomography (FXCT) using synchrotron sources are both useful tools in biomedical imaging research. Synchrotron CT (SRCT) in its various forms is considered an important technique for biomedical imaging since the phase coherence of SR beams can be exploited to obtain images with high contrast resolution. Using a synchrotron as the source for FXCT ensures a fluorescence signal that is optimally detectable, by exploiting the beam monochromaticity and polarisation. The ability to combine these techniques, so that SRCT and FXCT images are collected simultaneously, would bring distinct benefits to certain biomedical experiments. Simultaneous image acquisition would alleviate some of the registration difficulties that come from collecting separate data, and it would provide increased information about the sample: functional X-ray images from the FXCT, with the morphological information from the SRCT. A method is presented for generating simultaneous SRCT and FXCT images. Proof-of-principle modelling has been used to show that it is possible to recover a fluorescence image of a point-like source from an SRCT apparatus by suitably modulating the illuminating planar X-ray beam. The projection image can be successfully used for reconstruction by removing the static modulation from the sinogram in the normal flat- and dark-field processing. Detection of the modulated fluorescence signal using an energy-resolving detector allows the position of a fluorescent marker to be obtained using inverse reconstruction techniques. Particular reconstruction methods that might be applied by utilising both the CT and FXCT data are discussed.
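
    The flat- and dark-field processing referred to above is the standard projection normalization step; because the static beam modulation is present in both the raw and the flat-field images, it cancels in the ratio. A one-function sketch (array names are illustrative, not from the paper):

```python
# The "flat and dark field processing" referred to above is the standard
# projection normalization; the static modulation imposed on the beam is
# present in both the raw and the flat-field images, so it cancels in the
# ratio. Array names are illustrative.
import numpy as np

def flat_dark_correct(raw, flat, dark, eps=1e-9):
    """Return the sample transmission with the static modulation removed."""
    return (raw - dark) / np.maximum(flat - dark, eps)
```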

  12. Positron emission tomography combined with computed tomography for diagnosis of synchronous and metachronous tumors

    International Nuclear Information System (INIS)

    Zlatareva, D.; Garcheva, M.; Hadjiiska, V.

    2013-01-01

    Full text: Introduction: Positron emission tomography combined with computed tomography (PET/CT) has proved to be the method of choice in oncology for diagnosis and staging, and for planning and determining the effect of treatment. The aim of the study was to determine the diagnostic capabilities of PET/CT for the detection of synchronous and metachronous tumors. Materials and Methods: The study was conducted with 18F-FDG on a Discovery scanner (GE Healthcare) under a standard protocol. 18F-FDG was dosed per kg of body weight and administered before a meal, with blood sugar within reference values. Scanning was performed 60 min after administration, and quantitative indicators were used in addition to visual assessment. Over a period of one year (2012), 1408 patients were studied. In 11 of them (2 men, 9 women), unsuspected synchronous and metachronous tumors were found. Results: The most common second tumors were processes in the head and neck, followed by lung cancer and colorectal cancer. In four of the cases, surgical or histological verification was performed. In the other cases, verification was not performed due to refusal or to advanced disease with indications for systemic therapy. Diagnosis of the second tumor changed the management of patients, and the therapeutic effect was followed in 3 patients over a period of nine months by repeated PET/CT studies. Conclusion: Hybrid PET/CT, combining information about structural changes (CT) and metabolic changes (PET), plays an important role in the diagnosis of synchronous and metachronous tumors. It can significantly change the therapeutic management and prognosis of patients.

  13. Bioremediation in marine ecosystems: a computational study combining ecological modelling and flux balance analysis

    Directory of Open Access Journals (Sweden)

    Marianna eTaffi

    2014-09-01

    Full Text Available The pressure to find effective bioremediation methodologies for contaminated ecosystems has led to the large-scale identification of microbial species and metabolic degradation pathways. However, less attention has been paid to the study of bioremediation in marine food webs and to the definition of integrated strategies for reducing bioaccumulation in species. We propose a novel computational framework for analysing the multiscale effects of bioremediation at the ecosystem level, based on coupling food web bioaccumulation models and metabolic models of degrading bacteria. The combination of techniques from synthetic biology and ecological network analysis allows the specification of arbitrary scenarios of contaminant removal and the evaluation of strategies based on natural or synthetic microbial strains. In this study, we derive a bioaccumulation model of polychlorinated biphenyls (PCBs) in the Adriatic food web, and we extend a metabolic reconstruction of Pseudomonas putida KT2440 (iJN746) with the aerobic pathway of PCB degradation. We assess the effectiveness of different bioremediation scenarios in reducing PCB concentrations in species, and we study indices of species centrality to measure their importance in contaminant diffusion via feeding links. The analysis of the Adriatic sea case study suggests that our framework could represent a practical tool in the design of effective remediation strategies, providing at the same time insights into the ecological role of microbial communities within food webs.
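
    The metabolic half of such a framework, flux balance analysis, reduces to a linear program: maximize a target flux subject to steady-state mass balance and flux bounds. A toy sketch with an illustrative three-reaction network (a stand-in, not the iJN746 reconstruction extended in the study):

```python
# Flux balance analysis (FBA) as a linear program: maximize a target flux
# subject to steady-state mass balance S v = 0 and flux bounds. The
# three-reaction network below is a toy stand-in, not iJN746.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0],     # uptake -> intermediate
              [0,  1, -1]])    # intermediate -> degradation sink
bounds = [(0, 10), (0, 10), (0, 10)]
c = np.zeros(3)
c[2] = -1.0                    # linprog minimizes, so negate the objective

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("max degradation flux:", -res.fun, "flux vector:", res.x)
```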

  14. Computer-Aided Grading of Gliomas Combining Automatic Segmentation and Radiomics

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2018-01-01

    Full Text Available Gliomas are the most common primary brain tumors, and objective grading is of great importance for treatment. This paper presents an automatic computer-aided diagnosis system for gliomas that combines automatic segmentation and radiomics, which can improve diagnostic ability. MRI data containing 220 high-grade gliomas and 54 low-grade gliomas are used to evaluate our system. A multiscale 3D convolutional neural network is trained to segment whole tumor regions. A wide range of radiomic features, including first-order, shape, and texture features, is extracted. Using support vector machines with recursive feature elimination for feature selection, a CAD system with an extreme gradient boosting classifier, evaluated by 5-fold cross-validation, is constructed for the grading of gliomas. Our CAD system is highly effective for the grading of gliomas, with an accuracy of 91.27%, a weighted macro-precision of 91.27%, a weighted macro-recall of 91.27%, and a weighted macro-F1 score of 90.64%. This demonstrates that the proposed CAD system can assist radiologists in the highly accurate grading of gliomas and has potential for clinical applications.
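
    A schematic of the feature-selection and classification stage described above, on placeholder data; the feature counts and hyperparameters are assumptions, and the xgboost package is assumed to be available:

```python
# Schematic of the grading stage: linear-SVM recursive feature elimination
# (RFE) for feature selection, then an extreme-gradient-boosting classifier
# scored by 5-fold cross-validation. Data and hyperparameters are
# placeholders, not the paper's.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(274, 120))    # 274 tumors x 120 radiomic features
y = rng.integers(0, 2, size=274)   # 0 = low grade, 1 = high grade

pipe = Pipeline([
    ("select", RFE(SVC(kernel="linear"), n_features_to_select=30)),
    ("classify", XGBClassifier(n_estimators=200, eval_metric="logloss")),
])
scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
print("5-fold accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```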

  15. Efficient computation of the elastography inverse problem by combining variational mesh adaption and a clustering technique

    International Nuclear Information System (INIS)

    Arnold, Alexander; Bruhns, Otto T; Reichling, Stefan; Mosler, Joern

    2010-01-01

    This paper is concerned with an efficient implementation suitable for the elastography inverse problem. More precisely, the novel algorithm computes the unknown stiffness distribution in soft tissue from the measured displacement field while considerably reducing the numerical cost compared to previous approaches. This is realized by combining, and further elaborating, variational mesh adaption with a clustering technique similar to those known from digital image compression. Within the variational mesh adaption, the underlying finite element discretization is only locally refined if this leads to a considerable improvement of the numerical solution. Additionally, the numerical complexity is reduced by the aforementioned clustering technique, in which the parameters describing the stiffness of the respective soft tissue are sorted into a predefined number of intervals. In this way, the number of unknowns associated with the elastography inverse problem can be chosen explicitly. A positive side effect of this method is the reduction of artificial noise in the data (smoothing of the solution). The performance and the rate of convergence of the resulting numerical formulation are critically analyzed by means of numerical examples.
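
    The clustering step can be pictured as quantizing the per-element stiffness values into a predefined number of intervals, so that the inverse problem only has to recover one value per interval. A minimal numpy sketch (the interval count and data are illustrative, not from the paper):

```python
# Quantize per-element stiffness values into a fixed number of intervals,
# reducing the unknowns of the inverse problem to one value per interval.
import numpy as np

def cluster_stiffness(mu, n_intervals=8):
    """Map each element's stiffness to the mean of its interval."""
    edges = np.linspace(mu.min(), mu.max(), n_intervals + 1)
    idx = np.clip(np.digitize(mu, edges) - 1, 0, n_intervals - 1)
    means = np.array([mu[idx == k].mean() if np.any(idx == k) else 0.0
                      for k in range(n_intervals)])
    return means[idx], idx   # smoothed field + cluster labels (the unknowns)

field = np.random.default_rng(1).uniform(1.0, 50.0, size=10_000)   # kPa
smoothed, labels = cluster_stiffness(field)
```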

  16. Combined multi-kernel head computed tomography images optimized for depicting both brain parenchyma and bone.

    Science.gov (United States)

    Takagi, Satoshi; Nagase, Hiroyuki; Hayashi, Tatsuya; Kita, Tamotsu; Hayashi, Katsumi; Sanada, Shigeru; Koike, Masayuki

    2014-01-01

    The hybrid convolution kernel technique for computed tomography (CT) is known to enable the depiction of an image set using different window settings. Our purpose was to decrease the number of artifacts in the hybrid convolution kernel technique for head CT and to determine whether our improved combined multi-kernel head CT images enable diagnosis as a substitute for both brain (low-pass kernel-reconstructed) and bone (high-pass kernel-reconstructed) images. Forty-four patients with nondisplaced skull fractures were included. Our improved multi-kernel images were generated so that pixels exceeding 100 Hounsfield units in both the brain and bone images take their CT values from the bone images, while all other pixels take their CT values from the brain images. Three radiologists compared the improved multi-kernel images with the bone images. The improved multi-kernel images and the brain images were displayed identically on the brain window settings. All three radiologists agreed that the improved multi-kernel images on the bone window settings were sufficient for diagnosing skull fractures in all patients. This improved multi-kernel technique has a simple algorithm and is practical for clinical use. Thus, simplified head CT examinations and fewer images to be stored can be expected.
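
    The compositing rule stated in this abstract is a per-pixel selection and can be transcribed directly; a short sketch:

```python
# Direct numpy transcription of the stated rule: where both reconstructions
# exceed 100 HU, take the bone-kernel value (sharp bony detail); elsewhere
# keep the brain-kernel value (low-noise parenchyma).
import numpy as np

def combine_kernels(brain_img, bone_img, threshold_hu=100):
    use_bone = (brain_img > threshold_hu) & (bone_img > threshold_hu)
    return np.where(use_bone, bone_img, brain_img)
```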

  17. Diagnostic performance of combined noninvasive coronary angiography and myocardial perfusion imaging using 320 row detector computed tomography

    DEFF Research Database (Denmark)

    Vavere, Andrea L; Simon, Gregory G; George, Richard T

    2013-01-01

    Multidetector coronary computed tomography angiography (CTA) is a promising modality for widespread clinical application because of its noninvasive nature and high diagnostic accuracy as found in previous studies using 64 to 320 simultaneous detector rows. It is, however, limited in its ability...... to detect myocardial ischemia. In this article, we describe the design of the CORE320 study ("Combined coronary atherosclerosis and myocardial perfusion evaluation using 320 detector row computed tomography"). This prospective, multicenter, multinational study is unique in that it is designed to assess...... the diagnostic performance of combined 320-row CTA and myocardial CT perfusion imaging (CTP) in comparison with the combination of invasive coronary angiography and single-photon emission computed tomography myocardial perfusion imaging (SPECT-MPI). The trial is being performed at 16 medical centers located in 8...

  18. Attractive evolutionary equilibria

    NARCIS (Netherlands)

    Joosten, Reinoud A.M.G.; Roorda, Berend

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary

  19. Combination of artificial intelligence and procedural language programs in a computer application system supporting nuclear reactor operations

    International Nuclear Information System (INIS)

    Town, G.G.; Stratton, R.C.

    1985-01-01

    A computer application system is described which provides nuclear reactor power plant operators with an improved decision support system. This system combines traditional computer applications such as graphics display with artificial intelligence methodologies such as reasoning and diagnosis so as to improve plant operability. This paper discusses the issues, and a solution, involved with the system integration of applications developed using traditional and artificial intelligence languages

  20. Combination of artificial intelligence and procedural language programs in a computer application system supporting nuclear reactor operations

    International Nuclear Information System (INIS)

    Stratton, R.C.; Town, G.G.

    1985-01-01

    A computer application system is described which provides nuclear reactor power plant operators with an improved decision support system. This system combines traditional computer applications such as graphics display with artificial intelligence methodologies such as reasoning and diagnosis so as to improve plant operability. This paper discusses the issues, and a solution, involved with the system integration of applications developed using traditional and artificial intelligence languages

  1. Evolutionary Stable Strategy

    Indian Academy of Sciences (India)

    Evolutionary Stable Strategy: Application of Nash Equilibrium in Biology. General Article, Resonance – Journal of Science Education, Volume 21, Issue 9, September 2016, pp 803– ... Keywords: evolutionary game theory, evolutionary stable state, conflict, cooperation, biological games.

  2. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride; edited by M.-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, and M.-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  3. Evolutionary experience design – the case of Otopia

    DEFF Research Database (Denmark)

    Hansen, Kenneth

    The design of experiences is a complicated challenge. It might not even be possible to design such a “thing”, but only to design for it. If this is the case, an evolutionary approach could seem appropriate. This paper introduces such an approach to the design of new public-oriented experiences with the case of “Otopia”. “Otopia” is a large-scale new media experiment, which combines the areas of computer games, sports and performance into a spectator-oriented concept; it premiered in a dome tent at the Roskilde Festival in Denmark in the summer of 2005. This paper presents and discusses... used as a means of specifying the basic immaterial design form. This discussion leads to the suggestion of a rule-based evolutionary model for the design of situations as a practical option for designers of new spectator-oriented experiences in the future. The project of Otopia was supported...

  4. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information in historical data to estimate model errors is one of the frontier research topics in science. In this study, we investigate such a problem using the classic Lorenz (1963) equations as the prediction model and the Lorenz equations with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in the historical data can be identified and extracted automatically by computer. Thereby, a new approach to estimating model errors based on EM is proposed in the present paper. Numerical tests demonstrate the ability of the new approach to correct model structural errors. In fact, it realizes a combination of statistics and dynamics to a certain extent.
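
    The twin-experiment setup is straightforward to reproduce; a sketch assuming a sinusoidal perturbation as the "periodic evolutionary function" (the paper's exact form is not given here) and an assumed step size:

```python
# Integrate a "true" Lorenz (1963) system perturbed by a periodic term to
# generate observations; the classic unperturbed equations serve as the
# imperfect prediction model. Perturbation form and step size are assumptions.
import numpy as np

def lorenz_rhs(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0, eps=0.0):
    x, y, z = s
    return np.array([
        sigma * (y - x),
        rho * x - y - x * z + eps * np.sin(0.5 * t),  # eps != 0: "reality"
        x * y - beta * z,
    ])

def integrate(s0, dt=0.01, n=5000, **kw):
    s, t = np.asarray(s0, float), 0.0
    traj = [s]
    for _ in range(n):                      # classical 4th-order Runge-Kutta
        k1 = lorenz_rhs(t, s, **kw)
        k2 = lorenz_rhs(t + dt / 2, s + dt / 2 * k1, **kw)
        k3 = lorenz_rhs(t + dt / 2, s + dt / 2 * k2, **kw)
        k4 = lorenz_rhs(t + dt, s + dt * k3, **kw)
        s = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt
        traj.append(s)
    return np.array(traj)

obs = integrate([1.0, 1.0, 1.0], eps=0.5)    # "observational data"
fcst = integrate([1.0, 1.0, 1.0])            # imperfect forecast model
```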

  5. Evolutionary games under incompetence.

    Science.gov (United States)

    Kleshnina, Maria; Filar, Jerzy A; Ejov, Vladimir; McKerral, Jody C

    2018-02-26

    The adaptation process of a species to a new environment is a significant area of study in biology. As part of natural selection, adaptation is a mutation process which improves survival skills and reproductive functions of species. Here, we investigate this process by combining the idea of incompetence with evolutionary game theory. In the sense of evolution, incompetence and training can be interpreted as a special learning process. With focus on the social side of the problem, we analyze the influence of incompetence on behavior of species. We introduce an incompetence parameter into a learning function in a single-population game and analyze its effect on the outcome of the replicator dynamics. Incompetence can change the outcome of the game and its dynamics, indicating its significance within what are inherently imperfect natural systems.
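
    For orientation, one common way to formalize this setup (our notation, following the incompetence framework of Beck and Filar on which this line of work builds, rather than this paper's exact equations): a player intending strategy i executes strategy j with probability q_ij, so selection acts on a perceived payoff matrix rather than on the payoff matrix A itself, and the replicator dynamics become

```latex
\dot{x}_i = x_i\left[\bigl(\tilde{A}(\lambda)\,x\bigr)_i - x^{\top}\tilde{A}(\lambda)\,x\right],
\qquad
\tilde{A}(\lambda) = Q(\lambda)\,A\,Q(\lambda)^{\top},
\qquad
Q(\lambda) = (1-\lambda)\,S + \lambda I,
```

    where the row-stochastic incompetence matrix Q(λ) interpolates, via the learning parameter λ ∈ [0, 1], between a fully incompetent starting behaviour S and full competence I.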

  6. Large fluctuations and fixation in evolutionary games

    International Nuclear Information System (INIS)

    Assaf, Michael; Mobilia, Mauro

    2010-01-01

    We study large fluctuations in evolutionary games belonging to the coordination and anti-coordination classes. The dynamics of these games, modeling cooperation dilemmas, is characterized by a coexistence fixed point separating two absorbing states. We are particularly interested in the problem of fixation that refers to the possibility that a few mutants take over the entire population. Here, the fixation phenomenon is induced by large fluctuations and is investigated by a semiclassical WKB (Wentzel–Kramers–Brillouin) theory generalized to treat stochastic systems possessing multiple absorbing states. Importantly, this method allows us to analyze the combined influence of selection and random fluctuations on the evolutionary dynamics beyond the weak selection limit often considered in previous works. We accurately compute, including pre-exponential factors, the probability distribution function in the long-lived coexistence state and the mean fixation time necessary for a few mutants to take over the entire population in anti-coordination games, and also the fixation probability in the coordination class. Our analytical results compare excellently with extensive numerical simulations. Furthermore, we demonstrate that our treatment is superior to the Fokker–Planck approximation when the selection intensity is finite
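
    For reference, the fixation probability being computed admits a standard closed form in any one-step birth-death description (a textbook identity, not this paper's WKB machinery): with T_n^± the transition rates from n to n ± 1 mutants in a population of size N, the probability that a single mutant takes over is

```latex
\phi_1 = \left( 1 + \sum_{k=1}^{N-1} \prod_{n=1}^{k} \frac{T_n^{-}}{T_n^{+}} \right)^{-1}.
```

    The WKB analysis supplies the exponentially small asymptotics of such quantities, including pre-exponential factors, beyond the weak-selection regime.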

  7. Algorithms for computing parsimonious evolutionary scenarios for genome evolution, the last universal common ancestor and dominance of horizontal gene transfer in the evolution of prokaryotes

    Directory of Open Access Journals (Sweden)

    Galperin Michael Y

    2003-01-01

    Full Text Available Abstract Background Comparative analysis of sequenced genomes reveals numerous instances of apparent horizontal gene transfer (HGT), at least in prokaryotes, and indicates that lineage-specific gene loss might have been even more common in evolution. This complicates the notion of a species tree, which needs to be re-interpreted as a prevailing evolutionary trend rather than the full depiction of evolution, and makes reconstruction of ancestral genomes a non-trivial task. Results We addressed the problem of constructing parsimonious scenarios for individual sets of orthologous genes given a species tree. The orthologous sets were taken from the database of Clusters of Orthologous Groups of proteins (COGs). We show that the phyletic patterns (patterns of presence-absence in completely sequenced genomes) of almost 90% of the COGs are inconsistent with the hypothetical species tree. Algorithms were developed to reconcile the phyletic patterns with the species tree by postulating gene loss, COG emergence and HGT (the latter two classes of events were collectively treated as gene gains). We prove that each of these algorithms produces a parsimonious evolutionary scenario, which can be represented as a mapping of loss and gain events on the species tree. The distribution of the evolutionary events among the tree nodes substantially depends on the underlying assumptions of the reconciliation algorithm, e.g. whether or not independent gene gains (gain after loss after gain) are permitted. Biological considerations suggest that, on average, gene loss might be a more likely event than gene gain. Therefore, different gain penalties were used, and the resulting series of reconstructed gene sets for the last universal common ancestor (LUCA) of the extant life forms were analysed. The number of genes in the reconstructed LUCA gene sets grows as the gain penalty increases. However, qualitative examination of the LUCA versions reconstructed with different gain penalties
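
    The reconciliation idea can be illustrated with a toy Sankoff-style dynamic program over one presence/absence pattern; the tree encoding, unit loss cost, and gain penalty g below are illustrative, and the paper's algorithms handle further constraints:

```python
# For one COG's presence/absence pattern at the leaves of a species tree,
# assign 0/1 states to internal nodes minimizing
# (gain penalty g) * (#gains) + (#losses), by dynamic programming.
INF = float("inf")

def branch_cost(parent, child, g):
    if (parent, child) == (0, 1):
        return g      # gene gain (emergence or HGT)
    if (parent, child) == (1, 0):
        return 1.0    # gene loss
    return 0.0

def subtree_cost(tree, leaf_state, g=2.0):
    """tree: leaf name (str) or ('name', left, right). Returns
    {state: minimal cost of the subtree given its root's state}."""
    if isinstance(tree, str):
        s = leaf_state[tree]
        return {0: 0.0 if s == 0 else INF, 1: 0.0 if s == 1 else INF}
    _, left, right = tree
    lc = subtree_cost(left, leaf_state, g)
    rc = subtree_cost(right, leaf_state, g)
    return {p: sum(min(branch_cost(p, s, g) + child[s] for s in (0, 1))
                   for child in (lc, rc))
            for p in (0, 1)}

tree = ("root", ("anc1", "A", "B"), ("anc2", "C", "D"))
pattern = {"A": 1, "B": 0, "C": 1, "D": 1}   # presence/absence of one COG
print(subtree_cost(tree, pattern))  # cost if LUCA lacked (0) or had (1) it
```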

  8. Reprint of: Energetics of 2- and 3-coumaranone isomers: A combined calorimetric and computational study

    International Nuclear Information System (INIS)

    Sousa, Clara C.S.; Matos, M. Agostinha R.; Santos, Luís M.N.B.F.; Morais, Victor M.F.

    2014-01-01

    Highlights: • Experimental standard molar enthalpies of formation, sublimation of 2- and 3-coumaranone. • Mini-bomb combustion calorimetry, sublimation Calvet microcalorimetry. • DFT methods and high level composite ab initio calculations. • Theoretical estimate of the enthalpy of formation of isobenzofuranone. • Chemical shift (NICS) and the relative stability of the isomers. - Abstract: Condensed phase standard (p° = 0.1 MPa) molar enthalpies of formation for 2-coumaranone and 3-coumaranone were derived from the standard molar enthalpies of combustion, in oxygen, at T = 298.15 K, measured by mini-bomb combustion calorimetry. Standard molar enthalpies of sublimation of both isomers were determined by Calvet microcalorimetry. These results were combined to derive the standard molar enthalpies of formation of the compounds, in gas phase, at T = 298.15 K. Additionally, accurate quantum chemical calculations have been performed using DFT methods and high level composite ab initio calculations. Theoretical estimates of the enthalpies of formation of the compounds are in good agreement with the experimental values thus supporting the predictions of the same parameters for isobenzofuranone, an isomer which has not been experimentally studied. The relative stability of these isomers has been evaluated by experimental and computational results. The importance of some stabilizing electronic intramolecular interactions has been studied and quantitatively evaluated through Natural Bonding Orbital (NBO) analysis of the wave functions and the nucleus independent chemical shift (NICS) of the studied systems have been calculated in order to study and establish the effect of electronic delocalization upon the relative stability of the isomers

  9. A combined experimental and computational thermodynamic study of fluorene-9-methanol and fluorene-9-carboxylic acid

    International Nuclear Information System (INIS)

    Oliveira, Juliana A.S.A.; Calvinho, Maria M.; Notario, R.; Monte, Manuel J.S.; Ribeiro da Silva, Maria D.M.C.

    2013-01-01

    Highlights: • A thermodynamic study of two fluorene derivatives is presented. • Vapour pressures and energies of combustion were measured. • Enthalpy, entropy and Gibbs energy of sublimation were derived. • Enthalpy and Gibbs energy of formation in crystal and gas phases were calculated. • Gas phase enthalpy of formation was also estimated by quantum chemical calculations. -- Abstract: This work reports an experimental and computational thermodynamic study performed on two 9-fluorene derivatives: fluorene-9-methanol and fluorene-9-carboxylic acid. The standard (p° = 0.1 MPa) molar enthalpies of formation in the crystalline phase of these compounds were derived from the standard molar energies of combustion, in oxygen, at T = 298.15 K, measured by static bomb combustion calorimetry. A static method, based on a capacitance diaphragm gauge, and a Knudsen effusion method were used to perform the vapour pressure study of the referred compounds, yielding accurate determination of the standard molar enthalpies and entropies of sublimation and vaporisation. For fluorene-9-carboxylic acid, the enthalpy of sublimation was also determined using Calvet microcalorimetry. The enthalpy of fusion of both compounds was derived indirectly from vapour pressure results and directly from DSC experiments. Combining the thermodynamic parameters of the compounds studied, the standard Gibbs energies of formation in the crystalline and gaseous phases were derived, as well as the standard molar enthalpies of formation in the gaseous phase. A theoretical study at the G3 and G4 levels has been carried out, and the calculated enthalpies of formation have been compared with the experimental values

  10. Energetics of 2- and 3-coumaranone isomers: A combined calorimetric and computational study

    International Nuclear Information System (INIS)

    Sousa, Clara C.S.; Matos, M. Agostinha R.; Santos, Luís M.N.B.F.; Morais, Victor M.F.

    2013-01-01

    Highlights: • Experimental standard molar enthalpies of formation, sublimation of 2- and 3-coumaranone. • Mini-bomb combustion calorimetry, sublimation Calvet microcalorimetry. • DFT methods and high level composite ab initio calculations. • Theoretical estimate of the enthalpy of formation of isobenzofuranone. • Chemical shift (NICS) and the relative stability of the isomers. -- Abstract: Condensed phase standard (p° = 0.1 MPa) molar enthalpies of formation for 2-coumaranone and 3-coumaranone were derived from the standard molar enthalpies of combustion, in oxygen, at T = 298.15 K, measured by mini-bomb combustion calorimetry. Standard molar enthalpies of sublimation of both isomers were determined by Calvet microcalorimetry. These results were combined to derive the standard molar enthalpies of formation of the compounds, in gas phase, at T = 298.15 K. Additionally, accurate quantum chemical calculations have been performed using DFT methods and high level composite ab initio calculations. Theoretical estimates of the enthalpies of formation of the compounds are in good agreement with the experimental values thus supporting the predictions of the same parameters for isobenzofuranone, an isomer which has not been experimentally studied. The relative stability of these isomers has been evaluated by experimental and computational results. The importance of some stabilizing electronic intramolecular interactions has been studied and quantitatively evaluated through Natural Bonding Orbital (NBO) analysis of the wave functions and the nucleus independent chemical shift (NICS) of the studied systems have been calculated in order to study and establish the effect of electronic delocalization upon the relative stability of the isomers

  11. Evolutionary molecular medicine.

    Science.gov (United States)

    Nesse, Randolph M; Ganten, Detlev; Gregory, T Ryan; Omenn, Gilbert S

    2012-05-01

    Evolution has long provided a foundation for population genetics, but some major advances in evolutionary biology from the twentieth century that provide foundations for evolutionary medicine are only now being applied in molecular medicine. They include the need for both proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, competition between alleles, co-evolution, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are transforming evolutionary biology in ways that create even more opportunities for progress at its interfaces with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and related principles to speed the development of evolutionary molecular medicine.

  12. T and D-Bench--Innovative Combined Support for Education and Research in Computer Architecture and Embedded Systems

    Science.gov (United States)

    Soares, S. N.; Wagner, F. R.

    2011-01-01

    Teaching and Design Workbench (T&D-Bench) is a framework aimed at education and research in the areas of computer architecture and embedded systems. It includes a set of features not found in other educational environments. This set of features is the result of an original combination of design requirements for T&D-Bench: that the…

  13. Evolutionary Sound Synthesis Controlled by Gestural Data

    Directory of Open Access Journals (Sweden)

    Jose Fornari

    2011-05-01

    Full Text Available This article focuses on interdisciplinary research involving Computer Music and Generative Visual Art. We describe the implementation of two interactive artistic systems based on principles of Gestural Data retrieval (WILSON, 2002) and self-organization (MORONI, 2003) to control an Evolutionary Sound Synthesis method (ESSynth). The first implementation uses, as gestural data, image mapping of handmade drawings. The second one uses gestural data from the dynamic body movements of dance. The resulting computer output is generated by an interactive system implemented in Pure Data (PD). This system uses principles of Evolutionary Computation (EC), which yields the generation of a synthetic adaptive population of sound objects. Considering that music can be seen as “organized sound”, the contribution of our study is to develop a system that aims to generate “self-organized sound” – a method that uses evolutionary computation to bridge between gesture, sound and music.

  14. Avoiding Local Optima with Interactive Evolutionary Robotics

    Science.gov (United States)

    2012-07-09

    The main bottleneck in evolutionary robotics has traditionally been the time required to evolve robot controllers. However, with the continued acceleration in computational resources, the... [only fragments of the remaining abstract survive:] ...the top of a flight of stairs selects for climbing; suspending the robot and the target object above the ground and creating rungs between the two will...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. Dual-Modality Imaging of the Human Finger Joint Systems by Using Combined Multispectral Photoacoustic Computed Tomography and Ultrasound Computed Tomography

    Directory of Open Access Journals (Sweden)

    Yubin Liu

    2016-01-01

    We developed a homemade dual-modality imaging system that combines multispectral photoacoustic computed tomography and ultrasound computed tomography for reconstructing the structural and functional information of human finger joint systems. The fused multispectral photoacoustic-ultrasound computed tomography (MPAUCT) system was examined by phantom and in vivo experimental tests. The imaging results indicate that the hard tissues such as the bones and the soft tissues including the blood vessels, the tendon, the skin, and the subcutaneous tissues in the finger joint systems can be effectively recovered by using our multimodality MPAUCT system. The developed MPAUCT system is able to provide us with more comprehensive information on the human finger joints, which shows its potential for characterization and diagnosis of bone or joint diseases.

  17. Evolutionary Algorithms for Boolean Functions in Diverse Domains of Cryptography.

    Science.gov (United States)

    Picek, Stjepan; Carlet, Claude; Guilley, Sylvain; Miller, Julian F; Jakobovic, Domagoj

    2016-01-01

    The role of Boolean functions is prominent in several areas including cryptography, sequences, and coding theory. Therefore, various methods for the construction of Boolean functions with desired properties are of direct interest. New motivations on the role of Boolean functions in cryptography with attendant new properties have emerged over the years. There are still many combinations of design criteria left unexplored and in this matter evolutionary computation can play a distinct role. This article concentrates on two scenarios for the use of Boolean functions in cryptography. The first uses Boolean functions as the source of the nonlinearity in filter and combiner generators. Although relatively well explored using evolutionary algorithms, it still presents an interesting goal in terms of the practical sizes of Boolean functions. The second scenario appeared rather recently where the objective is to find Boolean functions that have various orders of the correlation immunity and minimal Hamming weight. In both these scenarios we see that evolutionary algorithms are able to find high-quality solutions where genetic programming performs the best.
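
    To make the search problem above concrete, the sketch below shows a minimal (1+1) evolutionary loop that maximizes the nonlinearity of a Boolean function, computed from its Walsh-Hadamard spectrum. The bit-flip mutation and acceptance rule are illustrative assumptions, not the algorithms used in the article.

    ```python
    import random

    def walsh_spectrum(tt, n):
        """Fast Walsh-Hadamard transform of a truth table (list of 0/1)."""
        w = [1 - 2 * b for b in tt]                     # map 0/1 -> +1/-1
        step = 1
        while step < 2 ** n:
            for i in range(0, 2 ** n, 2 * step):
                for j in range(i, i + step):
                    w[j], w[j + step] = w[j] + w[j + step], w[j] - w[j + step]
            step *= 2
        return w

    def nonlinearity(tt, n):
        """Hamming distance to the nearest affine function."""
        return 2 ** (n - 1) - max(abs(v) for v in walsh_spectrum(tt, n)) // 2

    def evolve(n=8, generations=20000):
        """(1+1) hill climber over truth tables; a stand-in for GP/GA variants."""
        parent = [random.randint(0, 1) for _ in range(2 ** n)]
        best = nonlinearity(parent, n)
        for _ in range(generations):
            child = parent[:]
            child[random.randrange(len(child))] ^= 1   # single bit-flip mutation
            f = nonlinearity(child, n)
            if f >= best:
                parent, best = child, f
        return parent, best
    ```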

  18. Studies of Malagasy Eugenia – IV: Seventeen new endemic species, a new combination, and three lectotypifications; with comments on distribution, ecological and evolutionary patterns

    Directory of Open Access Journals (Sweden)

    Neil Snow

    2015-04-01

    Seventeen new endemic species of the genus Eugenia L. (Myrtaceae) are proposed from Madagascar, including: E. andapae N. Snow, E. barriei N. Snow, E. bemangidiensis N. Snow, E. calciscopulorum N. Snow, E. delicatissima N. Snow, Callm. & Phillipson, E. echinulata N. Snow, E. gandhii N. Snow, E. hazonjia N. Snow, E. iantarensis N. Snow, E. malcomberi N. Snow, E. manomboensis N. Snow, E. obovatifolia N. Snow, E. ranomafana N. Snow & D. Turk, E. ravelonarivoi N. Snow & Callm., E. razakamalalae N. Snow & Callm., E. tiampoka N. Snow & Callm., and E. wilsoniana N. Snow; one new combination, Eugenia richardii (Blume) N. Snow, Callm. & Phillipson, is also provided. Detailed descriptions, information on distribution and ecology, distribution maps, vernacular names (where known), digital images of types, and comparisons to morphologically similar species are given. Preliminary assessments of IUCN risk of extinction and conservation recommendations are provided, including Vulnerable (4 species), Endangered (2 species), and Critically Endangered (4 species). Lectotypes are designated for Eugenia hovarum H. Perrier, Eugenia nompa H. Perrier, and E. scottii H. Perrier.

  19. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  20. Comparison of {sup 18}F-fluorodeoxyglucose positron emission tomography/computed tomography, hydro-stomach computed tomography, and their combination for detecting primary gastric cancer

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Hye Young; Chung, Woo Suk; Song, E Rang; Kim, Jin Suk [Konyang University Myunggok Medical Research Institute, Konyang University Hospital, Konyang University College of Medicine, Daejeon (Korea, Republic of)

    2015-01-15

    To retrospectively compare the diagnostic accuracy for detecting primary gastric cancer on positron emission tomography/computed tomography (PET/CT) and hydro-stomach CT (S-CT), and to determine whether the combination of the two techniques improves diagnostic performance. A total of 253 patients with pathologically proven primary gastric cancer underwent PET/CT and S-CT for preoperative evaluation. Two radiologists independently reviewed the three sets (PET/CT set, S-CT set, and combined set) of PET/CT and S-CT in a random order. They graded the likelihood of the presence of primary gastric cancer on a 4-point scale. The diagnostic accuracy of the PET/CT set, the S-CT set, and the combined set was determined by the area under the alternative-free receiver operating characteristic curve, and sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated. Diagnostic accuracy, sensitivity, and NPV for detecting all gastric cancers and early gastric cancers (EGCs) were significantly higher with the combined set than with the PET/CT and S-CT sets. Specificity and PPV were significantly higher with the PET/CT set than with the combined and S-CT sets for detecting all gastric cancers and EGCs. The combination of PET/CT and S-CT is more accurate than S-CT alone, particularly for detecting EGCs.

  1. Comparison of 18F-fluorodeoxyglucose positron emission tomography/computed tomography, hydro-stomach computed tomography, and their combination for detecting primary gastric cancer

    International Nuclear Information System (INIS)

    Jang, Hye Young; Chung, Woo Suk; Song, E Rang; Kim, Jin Suk

    2015-01-01

    To retrospectively compare the diagnostic accuracy for detecting primary gastric cancer on positron emission tomography/computed tomography (PET/CT) and hydro-stomach CT (S-CT), and to determine whether the combination of the two techniques improves diagnostic performance. A total of 253 patients with pathologically proven primary gastric cancer underwent PET/CT and S-CT for preoperative evaluation. Two radiologists independently reviewed the three sets (PET/CT set, S-CT set, and combined set) of PET/CT and S-CT in a random order. They graded the likelihood of the presence of primary gastric cancer on a 4-point scale. The diagnostic accuracy of the PET/CT set, the S-CT set, and the combined set was determined by the area under the alternative-free receiver operating characteristic curve, and sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated. Diagnostic accuracy, sensitivity, and NPV for detecting all gastric cancers and early gastric cancers (EGCs) were significantly higher with the combined set than with the PET/CT and S-CT sets. Specificity and PPV were significantly higher with the PET/CT set than with the combined and S-CT sets for detecting all gastric cancers and EGCs. The combination of PET/CT and S-CT is more accurate than S-CT alone, particularly for detecting EGCs.

  2. A Computer-Aided Diagnosis System for Breast Cancer Combining Digital Mammography and Genomics

    Science.gov (United States)

    2006-05-01

    Huang, "Breast cancer diagnosis using self-organizing map for sonography." Ultrasound Med. Biol. 26, 405 (2000). 20 K. Horsch, M.L. Giger, L.A. Venta ...L.A. Venta , "Performance of computer-aided diagnosis in the interpretation of lesions on breast sonography." Acad Radiol 11, 272 (2004). 22 W. Chen...418. 27. Horsch K, Giger ML, Vyborny CJ, Venta LA. Performance of computer-aided diagnosis in the interpretation of lesions on breast sonography

  3. Remembering the evolutionary Freud.

    Science.gov (United States)

    Young, Allan

    2006-03-01

    Throughout his career as a writer, Sigmund Freud maintained an interest in the evolutionary origins of the human mind and its neurotic and psychotic disorders. In common with many writers then and now, he believed that the evolutionary past is conserved in the mind and the brain. Today the "evolutionary Freud" is nearly forgotten. Even among Freudians, he is regarded as a red herring, relevant only to the extent that he diverts attention from the enduring achievements of the authentic Freud. There are three ways to explain these attitudes. First, the evolutionary Freud's key work is the "Overview of the Transference Neurosis" (1915). But it was published at an inopportune moment, forty years after the author's death, during the so-called "Freud wars." Second, Freud eventually lost interest in the "Overview" and the prospect of a comprehensive evolutionary theory of psychopathology. The publication of The Ego and the Id (1923), introducing Freud's structural theory of the psyche, marked the point of no return. Finally, Freud's evolutionary theory is simply not credible. It is based on just-so stories and a thoroughly discredited evolutionary mechanism, Lamarckian use-inheritance. Explanations one and two are probably correct but also uninteresting. Explanation number three assumes that there is a fundamental difference between Freud's evolutionary narratives (not credible) and the evolutionary accounts of psychopathology that currently circulate in psychiatry and mainstream journals (credible). The assumption is mistaken but worth investigating.

  4. Diagnostic performance of combined single photon emission computed tomographic scintimammography and ultrasonography based on computer-aided diagnosis for breast cancer

    International Nuclear Information System (INIS)

    Hwang, Kyung Hoon; Choi, Duck Joo; Choe, Won Sick; Lee, Jun Gu; Kim, Jong Hyo; Lee, Hyung Ji; Om, Kyong Sik; Lee, Byeong Il

    2007-01-01

    We investigated whether the diagnostic performance of SPECT scintimammography (SMM) can be improved by adding computer-aided diagnosis (CAD) of ultrasonography (US). We reviewed breast SPECT SMM images and corresponding US images from 40 patients with breast masses (21 malignant and 19 benign tumors). The quantitative data of SPECT SMM were obtained as the uptake ratio of the lesion to the contralateral normal breast. The morphologic features of the breast lesions on US were extracted and quantitated using the automated CAD software program. The diagnostic performance of SPECT SMM and CAD of US alone was determined using receiver operating characteristic (ROC) curve analysis. The best discriminating parameter (D-value) combining SPECT SMM and the CAD of US was created. The sensitivity, specificity and accuracy of the two combined diagnostic modalities were compared to those of a single one. Both SPECT SMM and CAD of US showed a relatively good diagnostic performance (area under curve = 0.846 and 0.831, respectively). Combining the results of SPECT SMM and CAD of US resulted in improved diagnostic performance (area under curve = 0.860), but there was no statistical difference in sensitivity, specificity and accuracy between the combined method and a single modality. It seems that combining the results of SPECT SMM and CAD of breast US does not significantly improve the diagnostic performance for diagnosis of breast cancer, compared with that of SPECT SMM alone. However, SPECT SMM and CAD of US may complement each other in the differential diagnosis of breast cancer.

  5. Phylogenetic inference with weighted codon evolutionary distances.

    Science.gov (United States)

    Criscuolo, Alexis; Michel, Christian J

    2009-04-01

    We develop a new approach to estimate a matrix of pairwise evolutionary distances from a codon-based alignment based on a codon evolutionary model. The method first computes a standard distance matrix for each of the three codon positions. These three distance matrices are then weighted according to an estimate of the global evolutionary rate of each codon position and averaged into a unique distance matrix. Using a large set of both real and simulated codon-based alignments of nucleotide sequences, we show that this approach leads to distance matrices that have a significantly better treelikeness compared to those obtained by standard nucleotide evolutionary distances. We also propose an alternative weighting to eliminate the part of the noise often associated with some codon positions, particularly the third position, which is known to induce a fast evolutionary rate. Simulation results show that fast distance-based tree reconstruction algorithms applied to distance matrices based on this codon position weighting can lead to phylogenetic trees that are at least as accurate as, if not better than, those inferred by maximum likelihood. Finally, a well-known multigene dataset composed of eight yeast species and 106 codon-based alignments is reanalyzed and shows that our codon evolutionary distances allow building a phylogenetic tree which is similar to those obtained by non-distance-based methods (e.g., maximum parsimony and maximum likelihood) and also significantly improved compared to standard nucleotide evolutionary distance estimates.
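
    The weighting scheme described above is easy to sketch. In the fragment below a simple p-distance stands in for the paper's model-based codon distances, and the fixed weights are placeholders for the estimated per-position evolutionary rates.

    ```python
    from itertools import combinations

    def p_distance(a, b, pos):
        """Proportion of differing sites at codon position pos (0, 1 or 2).
        Assumes aligned, gap-free sequences of equal length (multiple of 3)."""
        sites = range(pos, len(a), 3)
        return sum(a[i] != b[i] for i in sites) / len(sites)

    def weighted_distance(a, b, weights=(0.2, 0.2, 0.6)):
        """Average the three positional distances with rate-derived weights."""
        return sum(w * p_distance(a, b, pos) for pos, w in enumerate(weights))

    def distance_matrix(seqs, weights=(0.2, 0.2, 0.6)):
        n = len(seqs)
        d = [[0.0] * n for _ in range(n)]
        for i, j in combinations(range(n), 2):
            d[i][j] = d[j][i] = weighted_distance(seqs[i], seqs[j], weights)
        return d
    ```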

  6. A combined experimental and computational investigation of excess molar enthalpies of (nitrobenzene + alkanol) mixtures

    International Nuclear Information System (INIS)

    Neyband, Razieh Sadat; Zarei, Hosseinali

    2015-01-01

    Highlights: • Excess molar enthalpies for binary mixtures of nitrobenzene + alkanols were measured. • The infinite-dilution excess partial molar enthalpies were calculated using ab initio methods. • PCM calculations were performed. • The computed excess partial molar enthalpies at infinite dilution were compared to experimental results. - Abstract: Excess molar enthalpies (H_m^E) for the binary mixtures {(nitrobenzene + ethanol), 1-propanol, 2-propanol, 1-butanol and 2-butanol} have been measured over the entire composition range at ambient pressure (81.5 kPa) and a temperature of 298 K using a Parr 1455 solution calorimeter. From the experimental results, the excess partial molar enthalpies (H_i^E) and the excess partial molar enthalpies at infinite dilution (H_i^{E,∞}) were calculated. The excess molar enthalpies (H_m^E) are positive for all {nitrobenzene (1) + alkanol (2)} mixtures over the entire composition range. A state-of-the-art computational strategy for the evaluation of excess partial molar enthalpies at infinite dilution was followed at the M05-2X/6-311++G** level of theory with the PCM model. The experimental excess partial molar enthalpies at infinite dilution have been compared to the computational ab initio data in the liquid phase. The integrated experimental and computational results help to clarify the nature of the intermolecular interactions in {nitrobenzene (1) + alkanol (2)} mixtures. The experimental and computational work done in this study complements and extends the general research on the computation of excess partial molar enthalpies at infinite dilution of binary mixtures.

  7. Computer aided molecular design with combined molecular modeling and group contribution

    DEFF Research Database (Denmark)

    Harper, Peter Mathias; Gani, Rafiqul; Kolar, Petr

    1999-01-01

    Computer-aided molecular design (CAMD) provides a means for determining molecules or mixtures of molecules (CAMMD) having a desirable set of physicochemical properties. The application range of CAMD is restricted due to limitations on the complexity of the generated molecular structures...

  8. Combined computational and experimental study of Ar beam induced defect formation in graphite

    International Nuclear Information System (INIS)

    Pregler, Sharon K.; Hayakawa, Tetsuichiro; Yasumatsu, Hisato; Kondow, Tamotsu; Sinnott, Susan B.

    2007-01-01

    Irradiation of graphite, commonly used in nuclear power plants, is known to produce structural damage. Here, experimental and computational methods are used to study defect formation in graphite during Ar irradiation at incident energies of 50 eV. The experimental samples are analyzed with scanning tunneling microscopy to quantify the size distribution of the defects that form. The computational approach uses classical molecular dynamics simulations that illustrate the mechanisms by which the defects are produced. The results indicate that defects in graphite grow in concentrated areas and are nucleated by the presence of existing defects.

  9. A Computer-Aided Diagnosis System for Breast Cancer Combining Mammography and Proteomics

    Science.gov (United States)

    2007-05-01

    [Abstract not recovered: the extracted text consists only of reference-list fragments citing work on computerized diagnosis of breast lesions on ultrasound and sonography (K. Horsch, M. L. Giger, C. J. Vyborny, L. A. Venta; Ultrasound Med. Biol.; Med. Phys.; Acad. Radiol.).]

  10. A Computationally-Efficient, Multi-Mechanism Based Framework for the Comprehensive Modeling of the Evolutionary Behavior of Shape Memory Alloys

    Science.gov (United States)

    Saleeb, Atef F.; Vaidyanathan, Raj

    2016-01-01

    The report summarizes the accomplishments made during the 4-year duration of the project. Here, the major emphasis is placed on the different tasks performed by the two research teams, i.e., the modeling activities by the University of Akron (UA) team and the experimental and neutron diffraction studies conducted by the University of Central Florida (UCF) team, during this 4-year period. Further technical details are given in the upcoming sections by UA and UCF for each of the milestones/years (together with the corresponding figures and captions). The project mainly involved the development, validation, and application of a general theoretical model that is capable of capturing the nonlinear hysteretic responses, including pseudoelasticity, shape memory effect, rate-dependency, multi-axiality, and asymmetry in the tension versus compression response of shape memory alloys. Among the targeted goals for the SMA model was its ability to account for the evolutionary character of the response (including transient and long-term behavior under sustained cycles) for both conventional and high-temperature (HT) SMAs, as well as to simulate some of the devices which exploit these unique material systems. This required extensive (uniaxial and multi-axial) experiments to guide us in calibrating and characterizing the model. Moreover, since the model is formulated on the theoretical notion of internal state variables (ISVs), neutron diffraction experiments were needed to establish the linkage between the micromechanical changes and these ISVs. In addition, the design of the model should allow easy implementation in large-scale finite element applications to study the behavior of devices making use of these SMA materials under different loading controls. A summary of the activities and progress/achievements made during this period is given below in detail for the University of Akron (Section 2.0) and the University of Central Florida (Section 3.0).

  11. On Combining Multiple-Instance Learning and Active Learning for Computer-Aided Detection of Tuberculosis

    NARCIS (Netherlands)

    Melendez Rodriguez, J.C.; Ginneken, B. van; Maduskar, P.; Philipsen, R.H.H.M.; Ayles, H.; Sanchez, C.I.

    2016-01-01

    The major advantage of multiple-instance learning (MIL) applied to a computer-aided detection (CAD) system is that it allows optimizing the latter with case-level labels instead of accurate lesion outlines as traditionally required for a supervised approach. As shown in previous work, a MIL-based

  12. Attractive evolutionary equilibria

    OpenAIRE

    Roorda, Berend; Joosten, Reinoud

    2011-01-01

    We present attractiveness, a refinement criterion for evolutionary equilibria. Equilibria surviving this criterion are robust to small perturbations of the underlying payoff system or the dynamics at hand. Furthermore, certain attractive equilibria are equivalent to others for certain evolutionary dynamics. For instance, each attractive evolutionarily stable strategy is an attractive evolutionarily stable equilibrium for certain barycentric ray-projection dynamics, and vice versa.

  13. Evolutionary Robotics: What, Why, and Where to

    Directory of Open Access Journals (Sweden)

    Stephane eDoncieux

    2015-03-01

    Evolutionary robotics applies the selection, variation, and heredity principles of natural evolution to the design of robots with embodied intelligence. It can be considered as a subfield of robotics that aims to create more robust and adaptive robots. A pivotal feature of the evolutionary approach is that it considers the whole robot at once, and enables the exploitation of robot features in a holistic manner. Evolutionary robotics can also be seen as an innovative approach to the study of evolution based on a new kind of experimentalism. The use of robots as a substrate can help address questions that are difficult, if not impossible, to investigate through computer simulations or biological studies. In this paper we consider the main achievements of evolutionary robotics, focusing particularly on its contributions to both engineering and biology. We briefly elaborate on methodological issues, review some of the most interesting findings, and discuss important open issues and promising avenues for future work.

  14. Mean-Potential Law in Evolutionary Games

    Science.gov (United States)

    Nałecz-Jawecki, Paweł; Miekisz, Jacek

    2018-01-01

    The Letter presents a novel way to connect random walks, stochastic differential equations, and evolutionary game theory. We introduce a new concept of a potential function for discrete-space stochastic systems. It is based on a correspondence between one-dimensional stochastic differential equations and random walks, which may be exact not only in the continuous limit but also in finite-state spaces. Our method is useful for the computation of fixation probabilities in discrete stochastic dynamical systems with two absorbing states. We apply it to evolutionary games, formulating two simple and intuitive criteria for the evolutionary stability of pure Nash equilibria in finite populations. In particular, we show that the 1/3 law of evolutionary games, introduced by Nowak et al. [Nature, 2004], follows from a more general mean-potential law.
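
    For reference, the 1/3 law mentioned above admits a compact statement; the formulation below is the standard one from the literature, not the Letter's mean-potential derivation.

    ```latex
    % 2x2 coordination game between strategies A and B (a > c, d > b):
    %          A    B
    %     A    a    b
    %     B    c    d
    % Interior equilibrium of the replicator dynamics:
    \[ x^{*} = \frac{d - b}{a - b - c + d} \]
    % 1/3 law (Nowak et al., 2004): under weak selection in a large finite
    % population of size N, the fixation probability \rho_A of a single
    % A mutant exceeds the neutral value 1/N if and only if
    \[ \rho_A > \frac{1}{N} \iff x^{*} < \frac{1}{3} \]
    ```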

  15. Polymorphic Evolutionary Games.

    Science.gov (United States)

    Fishman, Michael A

    2016-06-07

    In this paper, I present an analytical framework for polymorphic evolutionary games suitable for explicitly modeling evolutionary processes in diploid populations with sexual reproduction. The principal aspect of the proposed approach is adding diploid genetics cum sexual recombination to a traditional evolutionary game, and switching from phenotypes to haplotypes as the new game's pure strategies. Here, the relevant pure strategy's payoffs are derived by summing the payoffs of all the phenotypes capable of producing gametes containing that particular haplotype, weighted by the pertinent probabilities. The resulting game is structurally identical to the familiar Evolutionary Games with non-linear pure strategy payoffs (Hofbauer and Sigmund, 1998. Cambridge University Press), and can be analyzed in terms of an established analytical framework for such games. These results can then be translated into the terms of genotypic, and whence phenotypic, evolutionary stability pertinent to the original game. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Computational studies of a cut-wire pair and combined metamaterials

    International Nuclear Information System (INIS)

    Nguyen, Thanh Tung; Lievens, Peter; Lee, Young Pak; Vu, Dinh Lam

    2011-01-01

    The transfer-matrix method and finite-integration simulations show how the transmission properties of combined metamaterials, which consist of metallic cut-wire pairs and continuous wires, are affected by geometric parameters. The corresponding effective permittivity and permeability are retrieved from the complex scattering parameters using the standard retrieval procedure. The electromagnetic properties of the cut-wire pair as well as the left-handed behavior of the combined structure are understood by the effective medium theory. In addition, the dimensional dependence of the transmission properties, the shapes of the cut-wire pairs and continuous wires, and the impact of the dielectric spacer are examined. Finally, by expanding the results of previous research (Koschny et al 2003 Phys. Rev. Lett. 93 016608), we generalize the transmission picture of combined structures in terms of the correlation between electric and magnetic responses. (review)

  17. An effective chaos-geometric computational approach to analysis and prediction of evolutionary dynamics of the environmental systems: Atmospheric pollution dynamics

    Science.gov (United States)

    Buyadzhi, V. V.; Glushkov, A. V.; Khetselius, O. Yu; Bunyakova, Yu Ya; Florko, T. A.; Agayar, E. V.; Solyanikova, E. P.

    2017-10-01

    The present paper reports the results of a computational study of the dynamics of atmospheric pollutant (nitrogen dioxide, sulphurous anhydride, etc.) concentrations in the atmosphere of an industrial city (Odessa), using methods from dynamical systems and chaos theory. Chaotic behaviour in the nitrogen dioxide and sulphurous anhydride concentration time series at several sites of the Odessa city is numerically investigated. As usual, to reconstruct the corresponding attractor, the time delay and embedding dimension are needed. The former is determined by the methods of the autocorrelation function and average mutual information, and the latter is calculated by means of a correlation dimension method and the algorithm of false nearest neighbours. Further, the spectrum of Lyapunov exponents, the Kaplan-Yorke dimension and the Kolmogorov entropy are computed. The existence of low-dimensional chaos in the time series of the atmospheric pollutant concentrations has been found.
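
    The delay and embedding steps described above can be sketched in a few lines; the autocorrelation criterion below (first drop under 1/e) is one of several conventions and stands in for the mutual-information and false-nearest-neighbour estimators the authors used.

    ```python
    import numpy as np

    def delay_by_autocorrelation(x):
        """First lag at which the autocorrelation falls below 1/e."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        acf = np.correlate(x, x, mode="full")[len(x) - 1:]
        acf /= acf[0]
        below = np.where(acf < 1.0 / np.e)[0]
        return int(below[0]) if below.size else 1

    def embed(x, dim, tau):
        """Time-delay embedding for attractor reconstruction."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    ```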

  18. Reinforcement learning method combining demonstration data and evolutionary optimization

    Institute of Scientific and Technical Information of China (English)

    宋拴; 俞扬

    2014-01-01

    Reinforcement learning studies how an agent learns an optimal policy that maximizes long-term rewards from interactions with the environment. Since environment feedback is commonly delayed after a sequence of actions, reinforcement learning has to tackle the problem of searching a huge policy space, and an effective search is the key to a successful approach. Previous studies have explored policy search from multiple angles: on the search-algorithm side, results show that direct policy search based on evolutionary optimization can outperform traditional methods; on the side of external information, user-provided demonstrations can effectively help reinforcement learning improve its performance. The combination of these two effective approaches, however, has rarely been studied. This work investigates the combination of user demonstrations and evolutionary optimization, and proposes the iNEAT+Q approach, which uses the demonstration data both to pre-train a neural network and to shape the fitness function that guides the evolutionary optimization. Preliminary experiments show that iNEAT+Q clearly improves on NEAT+Q, a classical evolutionary reinforcement learning method that does not use demonstration data.

  19. First results from a combined analysis of CERN computing infrastructure metrics

    Science.gov (United States)

    Duellmann, Dirk; Nieke, Christian

    2017-10-01

    The IT Analysis Working Group (AWG) has been formed at CERN across individual computing units and the experiments to attempt a cross-cutting analysis of computing infrastructure and application metrics. In this presentation we will describe the first results obtained using medium/long-term data (1 month to 1 year) correlating box-level metrics, job-level metrics from LSF and HTCondor, IO metrics from the physics analysis disk pools (EOS), and networking and application-level metrics from the experiment dashboards. We will cover in particular the measurement of hardware performance and the prediction of job duration, the latency sensitivity of different job types, and a search for bottlenecks with the production job mix in the current infrastructure. The presentation will conclude with the proposal of a small set of metrics to simplify drawing conclusions also in the more constrained environment of public cloud deployments.

  20. A combined mechanistic and computational study of the gold(I)-catalyzed formation of substituted indenes.

    Science.gov (United States)

    Nun, Pierrick; Gaillard, Sylvain; Poater, Albert; Cavallo, Luigi; Nolan, Steven P

    2011-01-07

    Substituted indenes can be prepared via a [1,3] O-acyl shift / hydroarylation / [1,3] O-acyl shift sequence. Each step is catalyzed by a cationic NHC-gold(I) species generated in situ by reaction between [(IPr)AuOH] and HBF(4)·OEt(2). This silver-free route is fully supported by a computational study accounting for the formation of each intermediate.

  1. Evolutionary design optimization of traffic signals applied to Quito city.

    Science.gov (United States)

    Armas, Rolando; Aguirre, Hernán; Daolio, Fabio; Tanaka, Kiyoshi

    2017-01-01

    This work applies evolutionary computation and machine learning methods to study the transportation system of Quito from a design optimization perspective. It couples an evolutionary algorithm with a microscopic transport simulator and uses the outcome of the optimization process to deepen our understanding of the problem and gain knowledge about the system. The work focuses on the optimization of a large number of traffic lights deployed on a wide area of the city and studies their impact on travel time, emissions and fuel consumption. An evolutionary algorithm with specialized mutation operators is proposed to search effectively in large decision spaces, evolving small populations for a short number of generations. The effects of the operators combined with a varying mutation schedule are studied, and an analysis of the parameters of the algorithm is also included. In addition, hierarchical clustering is performed on the best solutions found in several runs of the algorithm. An analysis of signal clusters and their geolocation, estimation of fuel consumption, spatial analysis of emissions, and an analysis of signal coordination provide an overall picture of the systemic effects of the optimization process.
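
    A minimal sketch of the small-population loop with a decaying mutation schedule described above is given below; `simulate_travel_time` is a hypothetical stand-in for the coupled microscopic transport simulator, and the operator details are illustrative only.

    ```python
    import random

    def mutate(plan, rate):
        """Specialized mutation: perturb a fraction of signal timings (seconds)."""
        return [max(0, g + random.randint(-5, 5)) if random.random() < rate else g
                for g in plan]

    def evolve(seed_plan, simulate_travel_time, pop_size=8, generations=30):
        """Small population, few generations, decaying mutation schedule."""
        population = [mutate(seed_plan, 0.5) for _ in range(pop_size)]
        for g in range(generations):
            rate = 0.5 * (1.0 - g / generations)     # varying mutation schedule
            ranked = sorted(population, key=simulate_travel_time)
            elite = ranked[:pop_size // 2]
            population = elite + [mutate(random.choice(elite), rate) for _ in elite]
        return min(population, key=simulate_travel_time)
    ```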

  2. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Computer-vision-based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex, and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open source software. Source code for the data analysis is written in R; the equations used to calculate the image descriptors are also provided.
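
    The dimensionality-reduction step of such a pipeline is a standard principal component analysis over the extracted shape features. A minimal sketch with scikit-learn follows; the original analysis was implemented in R, and the feature matrix here is a random placeholder.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Placeholder: one row per rosette image, one column per extracted shape
    # feature (e.g. area, perimeter, circularity, eccentricity, solidity, ...).
    features = np.random.rand(120, 12)

    pca = PCA(n_components=5)              # ~5 PCs capture the shape variation
    scores = pca.fit_transform(StandardScaler().fit_transform(features))
    print(pca.explained_variance_ratio_)   # variance explained by each PC
    ```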

  3. Quantitative analysis of bowel gas by plain abdominal radiograph combined with computer image processing

    International Nuclear Information System (INIS)

    Gao Yan; Peng Kewen; Zhang Houde; Shen Bixian; Xiao Hanxin; Cai Juan

    2003-01-01

    Objective: To establish a method for quantitative analysis of bowel gas from plain abdominal radiographs combined with computer image processing. Methods: Plain abdominal radiographs in the supine position from 25 patients with irritable bowel syndrome (IBS) and 20 healthy controls were studied. A gastroenterologist and a radiologist independently conducted the following procedure on each radiograph. After the outline of the bowel gas was traced with a pen, the radiograph was digitized by a digital camera and transferred to the computer with Histogram software. The total gas area was determined as the pixel value on the images. The ratio of the bowel gas quantity to the pixel value of the region bounded by a horizontal line tangential to the superior pubic symphysis margin, a horizontal line tangential to the inferior margin of the tenth dorsal vertebra, and lateral lines tangential to the right and left anterosuperior iliac crests was defined as the gas volume score (GVS). To examine the sequential reproducibility, a second plain abdominal radiograph was performed in 5 normal controls 1 week later, and the GVS values were compared. Results: Bowel gas was easily identified on the plain abdominal radiographs. Both the large and small intestine were located in the selected region. Both observers could finish one radiographic measurement in less than 10 minutes. The correlation coefficient between the two observers was 0.986. There was no statistical difference in GVS between the two sequential radiographs in the 5 healthy controls. Conclusion: Quantification of bowel gas based on plain abdominal radiography and computer image processing is simple, rapid, and reliable.
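
    Once the gas outline and the bounded anatomical region are available as binary masks, the gas volume score reduces to a ratio of pixel counts. A minimal sketch, assuming the segmentation has already been done:

    ```python
    import numpy as np

    def gas_volume_score(gas_mask, region_mask):
        """GVS = gas pixels inside the bounded region / total region pixels.
        Both arguments are boolean arrays over the digitized radiograph."""
        gas_in_region = np.logical_and(gas_mask, region_mask)
        return gas_in_region.sum() / region_mask.sum()
    ```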

  4. Combining a Novel Computer Vision Sensor with a Cleaning Robot to Achieve Autonomous Pig House Cleaning

    DEFF Research Database (Denmark)

    Andersen, Nils Axel; Braithwaite, Ian David; Blanke, Mogens

    2005-01-01

    condition based cleaning. This paper describes how a novel sensor, developed for the purpose, and algorithms for classification and learning are combined with a commercial robot to obtain an autonomous system which meets the necessary quality attributes. These include features to make selective cleaning...

  5. Combining computational models for landslide hazard assessment of Guantánamo province, Cuba

    NARCIS (Netherlands)

    Castellanos Abella, E.A.

    2008-01-01

    As part of the Cuban system for landslide disaster management, a methodology was developed for regional scale landslide hazard assessment, which is a combination of different models. The method was applied in Guantánamo province at 1:100 000 scale. The analysis started with an extensive aerial

  6. Evolutionary medicine: its scope, interest and potential.

    Science.gov (United States)

    Stearns, Stephen C

    2012-11-07

    This review is aimed at readers seeking an introductory overview, at those teaching courses on evolutionary medicine, and at those interested in visionary ideas. It first describes the range of topics covered by evolutionary medicine, which include human genetic variation, mismatches to modernity, reproductive medicine, degenerative disease, host-pathogen interactions and insights from comparisons with other species. It then discusses priorities for translational research, basic research and health management. Its conclusions are that evolutionary thinking should not displace other approaches to medical science, such as molecular medicine and cell and developmental biology, but that evolutionary insights can combine with and complement established approaches to reduce suffering and save lives. Because we are on the cusp of so much new research and innovative insights, it is hard to estimate how much impact evolutionary thinking will have on medicine, but it is already clear that its potential is enormous.

  7. The use of combined single photon emission computed tomography and X-ray computed tomography to assess the fate of inhaled aerosol.

    Science.gov (United States)

    Fleming, John; Conway, Joy; Majoral, Caroline; Tossici-Bolt, Livia; Katz, Ira; Caillibotte, Georges; Perchet, Diane; Pichelin, Marine; Muellinger, Bernhard; Martonen, Ted; Kroneberg, Philipp; Apiou-Sbirlea, Gabriela

    2011-02-01

    Gamma camera imaging is widely used to assess pulmonary aerosol deposition. Conventional planar imaging provides limited information on its regional distribution. In this study, single photon emission computed tomography (SPECT) was used to describe deposition in three dimensions (3D) and combined with X-ray computed tomography (CT) to relate this to lung anatomy. Its performance was compared to planar imaging. Ten SPECT/CT studies were performed on five healthy subjects following carefully controlled inhalation of radioaerosol from a nebulizer, using a variety of inhalation regimes. The 3D spatial distribution was assessed using a central-to-peripheral ratio (C/P) normalized to lung volume and, for the right lung, was compared to planar C/P analysis. The deposition by airway generation was calculated for each lung and the conducting airways deposition fraction compared to 24-h clearance. The 3D normalized C/P ratio correlated more closely with 24-h clearance than the 2D ratio for the right lung [coefficient of variation (COV), 9% compared to 15%, p [...]]. Combined SPECT/CT imaging with computer analysis is a useful approach for applications requiring regional information on deposition.

  8. Monitoring of facial stress during space flight: Optical computer recognition combining discriminative and generative methods

    Science.gov (United States)

    Dinges, David F.; Venkataraman, Sundara; McGlinchey, Eleanor L.; Metaxas, Dimitris N.

    2007-02-01

    Astronauts are required to perform mission-critical tasks at a high level of functional capability throughout spaceflight. Stressors can compromise their ability to do so, making early objective detection of neurobehavioral problems in spaceflight a priority. Computer optical approaches offer a completely unobtrusive way to detect distress during critical operations in space flight. A methodology was developed and a study completed to determine whether optical computer recognition algorithms could be used to discriminate facial expressions during stress induced by performance demands. Stress recognition from a facial image sequence is a subject that has not received much attention, although it is an important problem for many applications beyond space flight (security, human-computer interaction, etc.). This paper proposes a comprehensive method to detect stress from facial image sequences by using a model-based tracker. The image sequences were captured as subjects underwent a battery of psychological tests under high- and low-stress conditions. A cue integration-based tracking system accurately captured the rigid and non-rigid parameters of different parts of the face (eyebrows, lips). The labeled sequences were used to train the recognition system, which consisted of generative (hidden Markov model) and discriminative (support vector machine) parts that yield results superior to using either approach individually. The current optical algorithms performed at a 68% accuracy rate in an experimental study of 60 healthy adults undergoing periods of high-stress versus low-stress performance demands. Accuracy and practical feasibility of the technique are being improved further with automatic multi-resolution selection for the discretization of the mask, and automated face detection and mask initialization algorithms.

  9. Computing the dynamics of biomembranes by combining conservative level set and adaptive finite element methods

    OpenAIRE

    Laadhari , Aymen; Saramito , Pierre; Misbah , Chaouqi

    2014-01-01

    The numerical simulation of the deformation of vesicle membranes under simple shear external fluid flow is considered in this paper. A new saddle-point approach is proposed for the imposition of the fluid incompressibility and the membrane inextensibility constraints, through Lagrange multipliers defined in the fluid and on the membrane respectively. Using a level set formulation, the problem is approximated by mixed finite elements combined with an automatic adaptive ...

  10. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiving operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
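
    The core of such a combination framework is a classifier trained on the CAD score together with the clinical features. The sketch below uses a generic random forest as a stand-in for the machine-learning combiner in the paper; the data are random placeholders.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Placeholder data: one row per record = [CAD score] + 12 clinical features.
    X = np.random.rand(392, 13)
    y = np.random.randint(0, 2, 392)      # 1 = active TB (placeholder labels)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
    ```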

  11. Combined diagnostic performance of coronary computed tomography angiography and computed tomography derived fractional flow reserve for the evaluation of myocardial ischemia: A meta-analysis.

    Science.gov (United States)

    Tan, Xiao Wei; Zheng, Qishi; Shi, Luming; Gao, Fei; Allen, John Carson; Coenen, Adriaan; Baumann, Stefan; Schoepf, U Joseph; Kassab, Ghassan S; Lim, Soo Teik; Wong, Aaron Sung Lung; Tan, Jack Wei Chieh; Yeo, Khung Keong; Chin, Chee Tang; Ho, Kay Woon; Tan, Swee Yaw; Chua, Terrance Siang Jin; Chan, Edwin Shih Yen; Tan, Ru San; Zhong, Liang

    2017-06-01

    To evaluate the combined diagnostic accuracy of coronary computed tomography angiography (CCTA) and computed tomography derived fractional flow reserve (FFRct) in patients with suspected or known coronary artery disease (CAD). PubMed, The Cochrane Library, Embase and OpenGray were searched to identify studies comparing the diagnostic accuracy of CCTA and FFRct. Diagnostic test measurements of FFRct were either extracted directly from the published papers or calculated from the provided information. Bivariate models were used to synthesize the diagnostic performance of combined CCTA and FFRct at both the "per-vessel" and "per-patient" levels. 7 articles were included for analysis. The combined diagnostic outcomes from the "both positive" strategy, i.e. a subject was considered "positive" only when both CCTA and FFRct were positive, demonstrated relatively high specificity (per-vessel: 0.91; per-patient: 0.81), a high positive likelihood ratio (LR+, per-vessel: 7.93; per-patient: 4.26), a high negative likelihood ratio (LR-, per-vessel: 0.30; per-patient: 0.24) and high accuracy (per-vessel: 0.91; per-patient: 0.81), while the "either positive" strategy, i.e. a subject was considered "positive" when either CCTA or FFRct was positive, demonstrated relatively high sensitivity (per-vessel: 0.97; per-patient: 0.98), a low LR+ (per-vessel: 1.50; per-patient: 1.17), a low LR- (per-vessel: 0.07; per-patient: 0.09) and low accuracy (per-vessel: 0.57; per-patient: 0.54). The "both positive" strategy showed better diagnostic performance for ruling in patients with non-significant stenosis compared to the "either positive" strategy, as it efficiently reduces the proportion of false-positive subjects. Copyright © 2017 Elsevier B.V. All rights reserved.
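
    The opposite behaviour of the two strategies is what one expects from AND/OR test combination. The toy calculation below assumes conditional independence of the two tests, which the meta-analysis itself does not; the input values are made up.

    ```python
    def and_rule(se1, sp1, se2, sp2):
        """'Both positive': sensitivity drops, specificity rises."""
        return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

    def or_rule(se1, sp1, se2, sp2):
        """'Either positive': sensitivity rises, specificity drops."""
        return 1 - (1 - se1) * (1 - se2), sp1 * sp2

    print(and_rule(0.90, 0.60, 0.85, 0.75))   # (sensitivity, specificity)
    print(or_rule(0.90, 0.60, 0.85, 0.75))
    ```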

  12. Evolutionary ecology of virus emergence.

    Science.gov (United States)

    Dennehy, John J

    2017-02-01

    The cross-species transmission of viruses into new host populations, termed virus emergence, is a significant issue in public health, agriculture, wildlife management, and related fields. Virus emergence requires overlap between host populations, alterations in virus genetics to permit infection of new hosts, and adaptation to novel hosts such that between-host transmission is sustainable, all of which are the purview of the fields of ecology and evolution. A firm understanding of the ecology of viruses and how they evolve is required for understanding how and why viruses emerge. In this paper, I address the evolutionary mechanisms of virus emergence and how they relate to virus ecology. I argue that, while virus acquisition of the ability to infect new hosts is not difficult, limited evolutionary trajectories to sustained virus between-host transmission and the combined effects of mutational meltdown, bottlenecking, demographic stochasticity, density dependence, and genetic erosion in ecological sinks limit most emergence events to dead-end spillover infections. Despite the relative rarity of pandemic emerging viruses, the potential of viruses to search evolutionary space and find means to spread epidemically and the consequences of pandemic viruses that do emerge necessitate sustained attention to virus research, surveillance, prophylaxis, and treatment. © 2016 New York Academy of Sciences.

  13. Computational Model of Primary Visual Cortex Combining Visual Attention for Action Recognition.

    Directory of Open Access Journals (Sweden)

    Na Shu

    Humans can easily understand other people's actions through visual systems, while computers cannot. Therefore, a new bio-inspired computational model is proposed in this paper aiming at automatic action recognition. The model focuses on the dynamic properties of neurons and neural networks in the primary visual cortex (V1) and simulates the procedure of information processing in V1, which consists of visual perception, visual attention and representation of human action. In our model, a family of three-dimensional spatiotemporal correlative Gabor filters is used to model the dynamic properties of the classical receptive field of V1 simple cells tuned to different speeds and orientations, for the detection of spatiotemporal information from video sequences. Based on the inhibitory effect of stimuli outside the classical receptive field, caused by lateral connections of spiking neural networks in V1, we propose a surround-suppression operator to further process the spatiotemporal information. A visual attention model based on perceptual grouping is integrated into our model to filter and group different regions. Moreover, in order to represent the human action, we consider a characteristic of the neural code: a mean motion map based on the analysis of spike trains generated by the spiking neurons. The experimental evaluation on publicly available action datasets and the comparison with state-of-the-art approaches demonstrate the superior performance of the proposed model.
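
    A three-dimensional spatiotemporal Gabor kernel of the kind used to model V1 simple-cell receptive fields can be written directly. The separable Gaussian envelope and drifting sinusoidal carrier below are one common parameterization, not necessarily the exact filter family of the paper.

    ```python
    import numpy as np

    def gabor_3d(size=15, frames=7, orientation=0.0, speed=1.0,
                 wavelength=6.0, sigma=3.0, sigma_t=2.0):
        """(x, y, t) Gabor kernel tuned to one orientation and speed."""
        r = np.arange(size) - size // 2
        t = np.arange(frames) - frames // 2
        y, x, tt = np.meshgrid(r, r, t, indexing="ij")
        xr = x * np.cos(orientation) + y * np.sin(orientation)
        envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2)
                          - tt**2 / (2 * sigma_t**2))
        carrier = np.cos(2 * np.pi * (xr - speed * tt) / wavelength)
        return envelope * carrier   # correlate with video patches per filter
    ```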

  14. O2: A novel combined online and offline computing system for the ALICE Experiment after 2018

    International Nuclear Information System (INIS)

    Ananya; Agrawal, N; Avasthi, A; Suaide, A Alarcon Do Passo; Prado, C Alves Garcia; Alt, T; Bach, M; Breitner, T; Aphecetche, L; Bala, R; Bhasin, A; Barnafoldi, G; Belikov, J; Bellini, F; Betev, L; Buncic, P; Carena, F; Carena, W; Chapeland, S; Barroso, V Chibante

    2014-01-01

    ALICE (A Large Ion Collider Experiment) is a detector dedicated to studies of heavy-ion collisions exploring the physics of strongly interacting nuclear matter and the quark-gluon plasma at the CERN LHC (Large Hadron Collider). After the second long shutdown of the LHC, the ALICE experiment will be upgraded to make high-precision measurements of rare probes at low pT, which cannot be selected with a trigger and therefore require a very large sample of events recorded on tape. The online computing system will be completely redesigned to address the major challenge of sampling the full 50 kHz Pb-Pb interaction rate, increasing the present limit by a factor of 100. This upgrade will also include the continuous un-triggered read-out of two detectors, the ITS (Inner Tracking System) and the TPC (Time Projection Chamber), producing a sustained throughput of 1 TB/s. This unprecedented data rate will be reduced by adopting an entirely new strategy where calibration and reconstruction are performed online, and only the reconstruction results are stored while the raw data are discarded. This system, already demonstrated in production on the TPC data since 2011, will be optimized for the online usage of reconstruction algorithms. This implies a much tighter coupling between the online and offline computing systems. An R and D program has been set up to meet this huge challenge. The object of this paper is to present this program and its first results.

  15. Evolutionary engineering of industrial microorganisms-strategies and applications.

    Science.gov (United States)

    Zhu, Zhengming; Zhang, Juan; Ji, Xiaomei; Fang, Zhen; Wu, Zhimeng; Chen, Jian; Du, Guocheng

    2018-06-01

    Microbial cells have been widely used in industry to obtain various biochemical products, and evolutionary engineering is a common method in biological research to improve their traits, such as high environmental tolerance and improved product yield. To better integrate the functions of microbial cells, evolutionary engineering combined with other biotechnologies has attracted more attention in recent years. Classical laboratory evolution has been proven effective at letting more beneficial mutations occur in different genes, but it also has some inherent limitations, such as a long evolutionary period and uncontrolled mutation frequencies. However, recent studies have shown that some new strategies may gradually overcome these limitations. In this review, we summarize the evolutionary strategies commonly used for industrial microorganisms and discuss the combination of evolutionary engineering with other biotechnologies such as systems biology and inverse metabolic engineering. Finally, we discuss the importance and application prospects of evolutionary engineering as a powerful tool, especially in the optimization of industrial microbial cell factories.

  16. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Directory of Open Access Journals (Sweden)

    Kathryn Nicholson

    2017-12-01

    Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems. To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states. As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity. Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada. This open-access computational program (JAVA code and executable file) was developed and tested to support the analysis of thousands of individual records and up to 100 disease diagnoses or categories. Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting. The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset. An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients. Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity. Their careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
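
    Counting mutually exclusive combinations (order-free) and permutations (order-sensitive) over record-level data is compactly expressed with hashable keys. The fragment below sketches the idea in Python; the Tool itself is a JAVA program and its record format will differ.

    ```python
    from collections import Counter

    # Each record: the ordered chronic disease categories of one patient.
    records = [
        ["diabetes", "hypertension"],
        ["hypertension", "diabetes"],
        ["diabetes", "hypertension", "arthritis"],
    ]

    combinations = Counter(frozenset(r) for r in records)   # order ignored
    permutations = Counter(tuple(r) for r in records)       # order preserved

    print(len(combinations), "unique combinations")          # -> 2
    print(len(permutations), "unique permutations")          # -> 3
    ```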

  17. Molluscan Evolutionary Genomics

    Energy Technology Data Exchange (ETDEWEB)

    Simison, W. Brian; Boore, Jeffrey L.

    2005-12-01

    In the last 20 years there have been dramatic advances in techniques of high-throughput DNA sequencing, most recently accelerated by the Human Genome Project, a program that has determined the three billion base pair code on which we are based. Now this tremendous capability is being directed at other genome targets that are being sampled across the broad range of life. This opens up opportunities as never before for evolutionary and organismal biologists to address questions of both processes and patterns of organismal change. We stand at the dawn of a new 'modern synthesis' period, paralleling that of the early 20th century when the fledgling field of genetics first identified the underlying basis for Darwin's theory. We must now unite the efforts of systematists, paleontologists, mathematicians, computer programmers, molecular biologists, developmental biologists, and others in the pursuit of discovering what genomics can teach us about the diversity of life. Genome-level sampling for mollusks to date has mostly been limited to mitochondrial genomes and it is likely that these will continue to provide the best targets for broad phylogenetic sampling in the near future. However, we are just beginning to see an inroad into complete nuclear genome sequencing, with several mollusks and other eutrochozoans having been selected for work about to begin. Here, we provide an overview of the state of molluscan mitochondrial genomics, highlight a few of the discoveries from this research, outline the promise of broadening this dataset, describe upcoming projects to sequence whole mollusk nuclear genomes, and challenge the community to prepare for making the best use of these data.

  18. A computer-based training system combining virtual reality and multimedia

    International Nuclear Information System (INIS)

    Stansfield, S.A.

    1993-01-01

    Training new users of complex machines is often an expensive and time-consuming process. This is particularly true for special-purpose systems, such as those frequently encountered in DOE applications. This paper presents a computer-based training system intended as a partial solution to this problem. The system extends the basic virtual reality (VR) training paradigm by adding a multimedia component which may be accessed during interaction with the virtual environment: the 3D model used to create the virtual reality is also used as the primary navigation tool through the associated multimedia. This method exploits the natural mapping between a virtual world and the real world that it represents to provide a more intuitive way for the student to interact with all forms of information about the system.

  19. Combined tangential-normal vector elements for computing electric and magnetic fields

    International Nuclear Information System (INIS)

    Sachdev, S.; Cendes, Z.J.

    1993-01-01

    A direct method for computing electric and magnetic fields in two dimensions is developed. This method determines both the fields and fluxes directly from Maxwell's curl and divergence equations without introducing potential functions. This allows both the curl and the divergence of the field to be set independently in all elements. The technique is based on a new type of vector finite element that simultaneously interpolates to the tangential component of the electric or magnetic field and the normal component of the electric or magnetic flux. Continuity conditions are imposed across element edges simply by setting like variables to be the same across element edges. This guarantees the continuity of the field and flux at the mid-point of each edge and ensures that, for all edges, the average values of the tangential component of the field and of the normal component of the flux are identical.

  20. Combining Distance and Face-To-Face Teaching and Learning in Spatial Computations

    Science.gov (United States)

    Gulland, E.-K.; Schut, A. G. T.; Veenendaal, B.

    2011-09-01

    Retention and passing rates, as well as student engagement, in computer programming and problem-solving units are a major concern in tertiary spatial science courses. A number of initiatives were implemented to improve this. A pilot study reviews the changes made to the teaching and learning environment, including the addition of new resources and modifications to assessments, and investigates their effectiveness. In particular, the study focuses on the differences between students studying in traditional, on-campus mode and distance, e-learning mode. Student results and retention rates from 2009-2011, data from in-lecture clicker response units and two anonymous surveys collected in 2011 were analysed. Early results indicate that grades improved for engaged students, but pass rates and grades of the struggling cohort of students did not improve significantly.

  1. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  2. Combined use of computational chemistry and chemoinformatics methods for chemical discovery

    Energy Technology Data Exchange (ETDEWEB)

    Sugimoto, Manabu, E-mail: sugimoto@kumamoto-u.ac.jp [Graduate School of Science and Technology, Kumamoto University, 2-39-1, Kurokami, Chuo-ku, Kumamoto 860-8555 (Japan); Institute for Molecular Science, 38 Nishigo-Naka, Myodaiji, Okazaki 444-8585 (Japan); CREST, Japan Science and Technology Agency, 4-1-8 Honcho, Kawaguchi, Saitama 332-0012 (Japan); Ideo, Toshihiro; Iwane, Ryo [Graduate School of Science and Technology, Kumamoto University, 2-39-1, Kurokami, Chuo-ku, Kumamoto 860-8555 (Japan)

    2015-12-31

    Data analysis is carried out on numerical results of computational chemistry calculations to obtain knowledge about molecules. A molecular database is developed to systematically store chemical, electronic-structure, and knowledge-based information. The database is used to find molecules related to the keyword “cancer”. Then electronic-structure calculations are performed to quantitatively evaluate the quantum chemical similarity of the molecules. Among the 377 compounds registered in the database, 24 molecules are found to be “cancer”-related. This set of molecules includes both carcinogens and anticancer drugs. The quantum chemical similarity analysis, carried out using numerical results of density-functional theory calculations, shows that, when some energy spectra are referred to, carcinogens are reasonably distinguished from anticancer drugs. These spectral properties are therefore considered important measures for classification.
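
    As a rough illustration of the kind of spectral comparison described above, the hedged sketch below scores two molecules by the cosine similarity of their energy spectra, assuming the spectra have already been discretized onto a common energy grid; the vectors and the similarity measure are illustrative assumptions, not the paper's exact protocol.

```python
# Illustrative sketch (assumed representation): quantum chemical similarity
# as cosine similarity between discretized energy spectra binned on a common
# energy grid. The spectra themselves would come from an external
# electronic-structure (DFT) code; the toy values below are invented.
import numpy as np

def spectral_similarity(spectrum_a: np.ndarray, spectrum_b: np.ndarray) -> float:
    """Cosine similarity of two spectra sampled on the same energy grid."""
    a = spectrum_a / np.linalg.norm(spectrum_a)
    b = spectrum_b / np.linalg.norm(spectrum_b)
    return float(np.dot(a, b))

# Toy spectra on a shared 5-point energy grid
molecule_a = np.array([0.1, 0.8, 0.3, 0.0, 0.2])
molecule_b = np.array([0.0, 0.2, 0.7, 0.9, 0.1])
print(f"similarity = {spectral_similarity(molecule_a, molecule_b):.3f}")
```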

  3. Experiment and computation: a combined approach to study the van der Waals complexes

    Directory of Open Access Journals (Sweden)

    Surin L.A.

    2017-01-01

    Full Text Available A review of recent results on the millimetre-wave spectroscopy of weakly bound van der Waals complexes, mostly those which contain H2 and He, is presented. In our work, we compared the experimental spectra to theoretical bound-state results, thus providing a critical test of the quality of the M–H2 and M–He potential energy surfaces (PESs), which are a key issue for reliable computations of the collisional excitation and de-excitation of molecules (M = CO, NH3, H2O) in the dense interstellar medium. The intermolecular interactions with He and H2 also play an important role in the high-resolution spectroscopy of helium or para-hydrogen clusters doped with a probe molecule (CO, HCN). Such experiments are aimed at detecting the superfluid response of molecular rotation in He and p-H2 clusters.

  4. Novel diode-based laser system for combined transcutaneous monitoring and computer-controlled intermittent treatment of jaundiced neonates

    Science.gov (United States)

    Hamza, Mostafa; El-Ahl, Mohammad H. S.; Hamza, Ahmad M.

    2001-06-01

    The high efficacy of laser phototherapy combined with transcutaneous monitoring of serum bilirubin provides optimal protection of jaundiced infants against the risk of bilirubin encephalopathy. In this paper the authors introduce the design and operating principles of a new laser system that can provide simultaneous monitoring and treatment of several jaundiced babies at one time. The new system incorporates diode-based laser sources oscillating at selected wavelengths to achieve both transcutaneous differential absorption measurements of bilirubin concentration and computer-controlled intermittent laser therapy through a network of optical fibers. The detailed description and operating characteristics of this system are presented.

  5. Microstructural characterization of dental zinc phosphate cements using combined small angle neutron scattering and microfocus X-ray computed tomography

    Czech Academy of Sciences Publication Activity Database

    Viani, Alberto; Sotiriadis, Konstantinos; Kumpová, Ivana; Mancini, L.; Appavou, M.-S.

    2017-01-01

    Roč. 33, č. 4 (2017), s. 402-417 ISSN 0109-5641 R&D Projects: GA MŠk(CZ) LO1219 Keywords: zinc phosphate cements * small angle neutron scattering * X-ray micro-computed tomography * X-ray powder diffraction * zinc oxide * acid-base cements Subject RIV: JJ - Other Materials OBOR OECD: Composites (including laminates, reinforced plastics, cermets, combined natural and synthetic fibre fabrics) Impact factor: 4.070, year: 2016 https://www.sciencedirect.com/science/article/pii/S0109564116305127

  6. Evaluation of models generated via hybrid evolutionary algorithms ...

    African Journals Online (AJOL)

    2016-04-02

    Apr 2, 2016 ... Evaluation of models generated via hybrid evolutionary algorithms for the prediction of Microcystis ... evolutionary algorithms (HEA) proved to be highly applicable to the hypertrophic reservoirs of South Africa. ... discovered and optimised using a large-scale parallel computational device and relevant soft-...

  7. Origins of evolutionary transitions

    Indian Academy of Sciences (India)

    2014-03-15

    Mar 15, 2014 ... ... of events: 'Entities that were capable of independent replication ... There have been many major evolutionary events that this definition of ... selection at level x to exclusive selection at x – will probably require a multiplicity ...

  8. Evolutionary relationships among Astroviridae

    NARCIS (Netherlands)

    Lukashov, Vladimir V.; Goudsmit, Jaap

    2002-01-01

    To study the evolutionary relationships among astroviruses, all available sequences for members of the family Astroviridae were collected. Phylogenetic analysis distinguished two deep-rooted groups: one comprising mammalian astroviruses, with ovine astrovirus being an outlier, and the other

  9. Evolutionary Multiplayer Games

    OpenAIRE

    Gokhale, Chaitanya S.; Traulsen, Arne

    2014-01-01

    Evolutionary game theory has become one of the most diverse and far-reaching theories in biology. Applications of this theory range from cell dynamics to social evolution. However, many applications make it clear that inherent non-linearities of natural systems need to be taken into account. One way of introducing such non-linearities into evolutionary games is by the inclusion of multiple players. An example is social dilemmas, where group benefits could, e.g., increase less than linearly wi...

  10. Nuclear fuel management optimization using adaptive evolutionary algorithms with heuristics

    International Nuclear Information System (INIS)

    Axmann, J.K.; Van de Velde, A.

    1996-01-01

    Adaptive evolutionary algorithms in combination with expert knowledge encoded in heuristics have proved to be a robust and powerful optimization method for the design of optimized PWR fuel loading patterns. Simple parallel algorithmic structures, coupled with a low amount of communication between the computer processor units in use, make it possible for workstation clusters to be employed efficiently. The extension of classic evolution strategies not only by new and alternative methods but also by the inclusion of heuristics that affect the exchange probabilities of the fuel assemblies at specific core positions leads to the RELOPAT optimization code of the Technical University of Braunschweig. In combination with the new neutron-physical 3D nodal core simulator PRISM developed by SIEMENS, the PRIMO loading pattern optimization system has been designed. Highly promising results in the recalculation of known reload plans for German PWRs now lead to a commercially usable program. (author)
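
    A minimal sketch of the general idea, under stated assumptions (it is not the RELOPAT/PRIMO code): a (1+λ) evolution strategy over a loading pattern represented as a permutation of assembly IDs, in which heuristic weights raise the probability that certain core positions take part in a swap. The fitness function is a placeholder for what would really be a call to a nodal core simulator.

```python
# Hedged sketch, not the RELOPAT/PRIMO code: a simple (1+lambda) evolution
# strategy over a fuel loading pattern (a permutation of assembly IDs), where
# heuristic weights raise the swap probability at core positions an expert
# considers worth perturbing. The fitness here is a stand-in.
import random

N_POSITIONS = 16
heuristic_swap_weight = [1.0] * N_POSITIONS
heuristic_swap_weight[0] = 3.0   # e.g., expert flags the core centre

def fitness(pattern):
    # Placeholder: a real evaluation would call a 3D nodal core simulator.
    return -sum(abs(a - i) for i, a in enumerate(pattern))

def mutate(pattern):
    child = pattern[:]
    i, j = random.choices(range(N_POSITIONS), weights=heuristic_swap_weight, k=2)
    child[i], child[j] = child[j], child[i]   # swap two assemblies
    return child

best = list(range(N_POSITIONS))
random.shuffle(best)
for _ in range(200):                          # (1+5)-ES main loop
    children = [mutate(best) for _ in range(5)]
    best = max(children + [best], key=fitness)
print(fitness(best), best)
```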

  11. A combined brain-computer interface based on P300 potentials and motion-onset visual evoked potentials.

    Science.gov (United States)

    Jin, Jing; Allison, Brendan Z; Wang, Xingyu; Neuper, Christa

    2012-04-15

    Brain-computer interfaces (BCIs) allow users to communicate via brain activity alone. Many BCIs rely on the P300 and other event-related potentials (ERPs) that are elicited when target stimuli flash. Although there has been considerable research exploring ways to improve P300 BCIs, surprisingly little work has focused on new ways to change visual stimuli to elicit more recognizable ERPs. In this paper, we introduce a "combined" BCI based on P300 potentials and motion-onset visual evoked potentials (M-VEPs) and compare it with BCIs based on each simple approach (P300 and M-VEP). Offline data suggested that performance would be best in the combined paradigm. Online tests with adaptive BCIs confirmed that our combined approach is practical in an online BCI, and yielded better performance than the other two approaches (P<0.05) without annoying or overburdening the subject. The highest mean classification accuracy (96%) and practical bit rate (26.7 bit/s) were obtained in the combined condition. Copyright © 2012 Elsevier B.V. All rights reserved.
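
    The combined paradigm can be pictured as feature-level fusion: each flash contributes both a P300 time window and a motion-onset window to one feature vector. The sketch below is an illustrative stand-in, not the authors' pipeline; the window sizes, the simulated data, and the choice of a linear discriminant are all assumptions.

```python
# Illustrative sketch, not the authors' pipeline: concatenate P300- and
# motion-onset-VEP time windows into one feature vector per flash and train
# a linear discriminant to separate target from non-target epochs.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_epochs, n_p300, n_mvep = 200, 40, 40        # samples in each ERP window

p300_win = rng.normal(size=(n_epochs, n_p300))    # e.g., post-flash window
mvep_win = rng.normal(size=(n_epochs, n_mvep))    # e.g., motion-onset window
y = rng.integers(0, 2, size=n_epochs)             # 1 = target flash

# Simulate a small target-related deflection in both components
p300_win[y == 1] += 0.5
mvep_win[y == 1] += 0.5

X = np.hstack([p300_win, mvep_win])               # "combined" feature vector
clf = LinearDiscriminantAnalysis().fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```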

  12. Advantages of 18F-fluorodeoxyglucose positron emission tomography combined with computed tomography in detecting post cardiac surgery infections.

    Science.gov (United States)

    Adjtoutah, Djamel; Azhari, Alaa; Larabi, Youcef; Dorigo, Enrica; Merlin, Charles; Marcaggi, Xavier; Nana, Armel Simplice; Camilleri, Lionel; Azarnoush, Kasra

    2014-01-01

    18F-fluorodeoxyglucose positron emission tomography combined with computed tomography (FDG-PET/CT) offers an excellent negative predictive value. Consequently, it is a reliable tool for excluding an infectious process in case of negativity. In cases of persistent fever of unknown origin after cardiac surgery, and in combination with other bacteriological examinations and medical imaging, we can rely on FDG-PET/CT to confirm or exclude deep infections and prosthetic endocarditis. For this reason, FDG-PET/CT should be considered among the examinations to be performed in cases of suspected infection after cardiac surgery. We report the case of a 76-year-old man who presented with a fever of unknown origin and recurrent septic shock after a biological Bentall procedure combined with left anterior descending (LAD) coronary artery revascularization by the left internal thoracic artery. We performed FDG-PET/CT, which showed external iliac vein and right common femoral vein hyperfixation with infiltration of adjacent soft tissues, highly suspicious for an infectious process. The aim of this case report is to show that FDG-PET/CT, in combination with other bacteriological examinations and medical imaging, can be extremely helpful in detecting deep infectious sources, even during the early postoperative period.

  13. Combined rTMS and virtual reality brain-computer interface training for motor recovery after stroke

    Science.gov (United States)

    Johnson, N. N.; Carey, J.; Edelman, B. J.; Doud, A.; Grande, A.; Lakshminarayan, K.; He, B.

    2018-02-01

    Objective. Combining repetitive transcranial magnetic stimulation (rTMS) with brain-computer interface (BCI) training can address motor impairment after stroke by down-regulating exaggerated inhibition from the contralesional hemisphere and encouraging ipsilesional activation. The objective was to evaluate the efficacy of combined rTMS + BCI, compared to sham rTMS + BCI, on motor recovery after stroke in subjects with lasting motor paresis. Approach. Three stroke subjects approximately one year post-stroke participated in three weeks of combined rTMS (real or sham) and BCI, followed by three weeks of BCI alone. Behavioral and electrophysiological differences were evaluated at baseline, after three weeks, and after six weeks of treatment. Main results. Motor improvements were observed in both the real rTMS + BCI and sham groups, but only the former showed significant alterations in inter-hemispheric inhibition in the desired direction and increased relative ipsilesional cortical activation from fMRI. In addition, significant improvements in BCI performance over time and adequate control of the virtual reality BCI paradigm were observed only in the former group. Significance. When combined, the results highlight the feasibility and efficacy of combined rTMS + BCI for motor recovery, demonstrated by increased ipsilesional motor activity and improvements in behavioral function for the real rTMS + BCI condition in particular. Our findings also demonstrate the utility of BCI training alone, as shown by behavioral improvements for the sham rTMS + BCI condition. This study is the first to evaluate combined rTMS and BCI training for motor rehabilitation and provides a foundation for continued work to evaluate the potential of both rTMS and virtual reality BCI training for motor recovery after stroke.

  14. Bipartite Graphs as Models of Population Structures in Evolutionary Multiplayer Games

    Science.gov (United States)

    Peña, Jorge; Rochat, Yannick

    2012-01-01

    By combining evolutionary game theory and graph theory, “games on graphs” study the evolutionary dynamics of frequency-dependent selection in population structures modeled as geographical or social networks. Networks are usually represented by means of unipartite graphs, and social interactions by two-person games such as the famous prisoner’s dilemma. Unipartite graphs have also been used for modeling interactions going beyond pairwise interactions. In this paper, we argue that bipartite graphs are a better alternative to unipartite graphs for describing population structures in evolutionary multiplayer games. To illustrate this point, we make use of bipartite graphs to investigate, by means of computer simulations, the evolution of cooperation under the conventional and the distributed N-person prisoner’s dilemma. We show that several implicit assumptions arising from the standard approach based on unipartite graphs (such as the definition of replacement neighborhoods, the intertwining of individual and group diversity, and the large overlap of interaction neighborhoods) can have a large impact on the resulting evolutionary dynamics. Our work provides a clear example of the importance of construction procedures in games on graphs, of the suitability of bigraphs and hypergraphs for computational modeling, and of the importance of concepts from social network analysis such as centrality, centralization and bipartite clustering for the understanding of dynamical processes occurring on networked population structures. PMID:22970237

  15. Evolutionary Models for Simple Biosystems

    Science.gov (United States)

    Bagnoli, Franco

    The concept of evolutionary development of structures constituted a real revolution in biology: it made it possible to understand how the very complex structures of life can arise in an out-of-equilibrium system. The investigation of such systems has shown that, indeed, systems under a flux of energy or matter can self-organize into complex patterns; think, for instance, of Rayleigh-Bénard convection, Liesegang rings, or the patterns formed by granular systems under shear. Following this line, one could characterize life as a state of matter distinguished by the slow, continuous process that we call evolution. In this paper we try to identify the organizational levels of life, which span several orders of magnitude from the elementary constituents to whole ecosystems. Although similar structures can be found in other contexts, like ideas (memes) in neural systems and self-replicating elements (computer viruses, worms, etc.) in computer systems, we shall concentrate on biological evolutionary structures and try to highlight the role and the emergence of network structure in such systems.

  16. Computational Biophysical, Biochemical, and Evolutionary Signature of Human R-Spondin Family Proteins, the Member of Canonical Wnt/β-Catenin Signaling Pathway

    Directory of Open Access Journals (Sweden)

    Ashish Ranjan Sharma

    2014-01-01

    Full Text Available In humans, the Wnt/β-catenin signaling pathway plays a significant role in cell growth, cell development, and disease pathogenesis. Four human R-spondins (Rspos) are known to activate the canonical Wnt/β-catenin signaling pathway. Presently, Rspos serve as therapeutic targets for several human diseases. Henceforth, a basic understanding of the molecular properties of Rspos is essential. We approached this issue by interpreting the biochemical and biophysical properties along with the molecular evolution of Rspos through computational algorithms. Our analysis shows that signal peptide length is roughly similar across the Rspo family, along with similarity in amino acid distribution patterns. In Rspo3, four N-glycosylation sites were noted. All members are hydrophilic in nature and showed approximately similar GRAVY values. Conversely, Rspo3 contains the most positively charged residues while Rspo4 contains the fewest. Four highly aligned blocks were recorded through Gblocks. Phylogenetic analysis shows that Rspo4 is rooted with Rspo2, and similarly Rspo3 and Rspo1 have a common point of origin. Through a phylogenomics study, we developed a phylogenetic tree of sixty proteins (n=60) with ortholog and paralog seed sequences. A protein-protein network was also illustrated. The results demonstrated in our study may help future researchers to unfold significant physiological and therapeutic properties of Rspos in various disease models.

  17. An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints

    Directory of Open Access Journals (Sweden)

    Jinmo Sung

    2014-01-01

    Full Text Available The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm to efficiently obtain good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated through computational experiments.
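
    One way to realize a feasibility-preserving operator is sketched below, under assumptions of our own (the paper's exact operators are not reproduced here): a mutation relocates a city only within the slot range bounded by its latest predecessor and earliest successor, so a precedence-feasible parent always yields a precedence-feasible child.

```python
# Minimal sketch of one feasibility-preserving mutation (details assumed, not
# taken from the paper): move a random city to a new slot chosen so that all
# of its predecessors stay before it and all of its successors after it.
import random

precedence = {2: {0}, 3: {0, 1}}   # city -> set of cities that must precede it

def feasible_relocate(tour):
    child = tour[:]
    city = random.choice(child)
    child.remove(city)
    preds = precedence.get(city, set())
    succs = {c for c, ps in precedence.items() if city in ps}
    # Legal insertion slots: after the last predecessor, before the first successor
    lo = max((child.index(p) + 1 for p in preds), default=0)
    hi = min((child.index(s) for s in succs), default=len(child))
    child.insert(random.randint(lo, hi), city)
    return child

tour = [0, 1, 2, 3, 4]
print(feasible_relocate(tour))   # still respects all precedence constraints
```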

  18. Nanoscopical dissection of ancestral nucleoli in Archaea: a case of study in Evolutionary Cell Biology

    KAUST Repository

    Islas Morales, Parsifal

    2018-01-01

    Evolutionary cell biology (ECB) has attracted increasing attention in the last decades. Is this a new discipline and a historical opportunity to combine functional and evolutionary biology towards the insight that cell

  19. Synthesis and characterization of sulfolane-based amino alcohols: A combined experimental and computational study

    Science.gov (United States)

    Palchykov, Vitalii A.; Zarovnaya, Iryna S.; Tretiakov, Serhii V.; Reshetnyak, Alyona V.; Omelchenko, Iryna V.; Shishkin, Oleg V.; Okovytyy, Sergiy I.

    2018-04-01

    Aminolysis of 3,4-epoxysulfolane in aqueous media leads to a very complex mixture of products with unresolved stereochemistry. Herein, we report a detailed theoretical and experimental mechanistic investigation of this reaction along with extensive spectroscopic characterization of the resulting amino alcohols, using 1D and 2D NMR techniques (1H, 13C, NOE, NOESY, COSY, HSQC, HMBC) as well as XRD analysis. In addition to simple amines such as ammonia and benzylamine, our study also employed the more sterically hindered endo-bicyclo[2.2.1]hept-5-en-2-ylmethanamine. The mechanism of the aminolysis of 3,4-epoxysulfolane by aqueous ammonia was studied in more detail using quantum chemical calculations at the M06-2X/6-31++G** level of theory. The computational results led us to conclude that the most probable initial transformation of the epoxide is a base-catalyzed rearrangement to the corresponding allylic alcohol. Subsequent formation of vicinal amino alcohols and diols proceeds via addition of ammonia or hydroxide anions to the activated C=C double bond with some preference for cis-attack. The detailed analytical data obtained in the course of our work will be useful for the stereochemical identification of new sulfolane derivatives.

  20. Uncovering stability mechanisms in microbial ecosystems - combining microcosm experiments, computational modelling and ecological theory in a multidisciplinary approach

    Science.gov (United States)

    Worrich, Anja; König, Sara; Banitz, Thomas; Centler, Florian; Frank, Karin; Kästner, Matthias; Miltner, Anja; Thullner, Martin; Wick, Lukas

    2015-04-01

    Although bacterial degraders in soil are commonly exposed to fluctuating environmental conditions, the functional performance of the biodegradation processes can often be maintained by resistance and resilience mechanisms. However, there is still a gap in the mechanistic understanding of key factors contributing to the stability of such an ecosystem service. Therefore we developed an integrated approach combining microcosm experiments, simulation models and ecological theory to directly make use of the strengths of these disciplines. In a continuous interplay process, data, hypotheses, and central questions are exchanged between disciplines to initiate new experiments and models to ultimately identify buffer mechanisms and factors providing functional stability. We focus on drying and rewetting-cycles in soil ecosystems, which are a major abiotic driver for bacterial activity. Functional recovery of the system was found to depend on different spatial processes in the computational model. In particular, bacterial motility is a prerequisite for biodegradation if either bacteria or substrate are heterogeneously distributed. Hence, laboratory experiments focussing on bacterial dispersal processes were conducted and confirmed this finding also for functional resistance. Obtained results will be incorporated into the model in the next step. Overall, the combination of computational modelling and laboratory experiments identified spatial processes as the main driving force for functional stability in the considered system, and has proved a powerful methodological approach.

  1. Steroid Hydroxylation by Basidiomycete Peroxygenases: a Combined Experimental and Computational Study

    Science.gov (United States)

    Babot, Esteban D.; del Río, José C.; Cañellas, Marina; Sancho, Ferran; Lucas, Fátima; Guallar, Víctor; Kalum, Lisbeth; Lund, Henrik; Gröbe, Glenn; Scheibner, Katrin; Ullrich, René; Hofrichter, Martin; Martínez, Angel T.

    2015-01-01

    The goal of this study is the selective oxyfunctionalization of steroids under mild and environmentally friendly conditions using fungal enzymes. With this purpose, peroxygenases from three basidiomycete species were tested for the hydroxylation of a variety of steroidal compounds, using H2O2 as the only cosubstrate. Two of them are wild-type enzymes from Agrocybe aegerita and Marasmius rotula, and the third one is a recombinant enzyme from Coprinopsis cinerea. The enzymatic reactions on free and esterified sterols, steroid hydrocarbons, and ketones were monitored by gas chromatography, and the products were identified by mass spectrometry. Hydroxylation at the side chain was preferred over the steroidal rings, with the 25-hydroxy derivatives predominating. Interestingly, antiviral and other biological activities of 25-hydroxycholesterol have been reported recently (M. Blanc et al., Immunity 38:106–118, 2013, http://dx.doi.org/10.1016/j.immuni.2012.11.004). However, hydroxylation in the ring moiety and terminal hydroxylation at the side chain were also observed for some steroids, the former favored by the absence of oxygenated groups at C-3 and by the presence of conjugated double bonds in the rings. To understand the yield and selectivity differences between the different steroids, a computational study was performed using the Protein Energy Landscape Exploration (PELE) software for dynamic ligand diffusion. These simulations showed that the active-site geometry and hydrophobicity favor the entrance of the steroid side chain, while the entrance of the ring is energetically penalized. Also, a direct correlation between the conversion rate and the side-chain entrance ratio could be established, which explains the various reaction yields observed. PMID:25862224

  2. Electronic nature of zwitterionic alkali metal methanides, silanides and germanides - a combined experimental and computational approach.

    Science.gov (United States)

    Li, H; Aquino, A J A; Cordes, D B; Hase, W L; Krempner, C

    2017-02-01

    Zwitterionic group 14 complexes of the alkali metals of formula [C(SiMe2OCH2CH2OMe)3M] (M-1), [Si(SiMe2OCH2CH2OMe)3M] (M-2), and [Ge(SiMe2OCH2CH2OMe)3M] (M-3), where M = Li, Na or K, have been prepared, structurally characterized and their electronic nature investigated by computational methods. Zwitterions M-2 and M-3 were synthesized via reactions of [Si(SiMe2OCH2CH2OMe)4] (2) and [Ge(SiMe2OCH2CH2OMe)4] (3) with MOBu-t (M = Li, Na or K), respectively, in almost quantitative yields, while M-1 were prepared by deprotonation of [HC(SiMe2OCH2CH2OMe)3] (1) with LiBu-t, NaCH2Ph and KCH2Ph, respectively. X-ray crystallographic studies and DFT calculations in the gas phase, including calculations of the NPA charges, confirm the zwitterionic nature of these compounds, with the alkali metal cations being rigidly locked and charge-separated from the anion by the internal OCH2CH2OMe donor groups. Natural bond orbital (NBO) analysis and second-order perturbation theory analysis of the NBOs reveal significant hyperconjugative interactions in M-1-M-3, primarily between the lone pair and the antibonding Si-O orbitals, the extent of which decreases in the order M-1 > M-2 > M-3. The experimental basicities and the calculated gas-phase basicities of M-1-M-3 reveal the zwitterionic alkali metal methanides M-1 to be significantly stronger bases than the analogous silanides M-2 and germanides M-3.

  3. A Combined Experimental and Computational Approach to Subject-Specific Analysis of Knee Joint Laxity

    Science.gov (United States)

    Harris, Michael D.; Cyr, Adam J.; Ali, Azhar A.; Fitzpatrick, Clare K.; Rullkoetter, Paul J.; Maletsky, Lorin P.; Shelburne, Kevin B.

    2016-01-01

    Modeling complex knee biomechanics is a continual challenge, which has resulted in many models of varying levels of quality, complexity, and validation. Beyond modeling healthy knees, accurately mimicking pathologic knee mechanics, such as after cruciate rupture or meniscectomy, is difficult. Experimental tests of knee laxity can provide important information about ligament engagement and overall contributions to knee stability for the development of subject-specific models that accurately simulate knee motion and loading. Our objective was to provide combined experimental tests and finite-element (FE) models of natural knee laxity that are subject-specific, have one-to-one experiment-to-model calibration, simulate ligament engagement in agreement with literature, and are adaptable for a variety of biomechanical investigations (e.g., cartilage contact, ligament strain, in vivo kinematics). Calibration involved perturbing ligament stiffness, initial ligament strain, and attachment location until model-predicted kinematics and ligament engagement matched experimental reports. Errors between model-predicted and experimental kinematics were small, and ligament engagement agreed with literature descriptions. These results demonstrate the ability of our constraint models to be customized for multiple individuals and simultaneously call attention to the need to verify that ligament engagement is in good general agreement with the literature. To facilitate further investigations of subject-specific or population-based knee joint biomechanics, the data collected during the experimental and modeling phases of this study are available for download by the research community. PMID:27306137

  4. Implementation of combined SVM-algorithm and computer-aided perception feedback for pulmonary nodule detection

    Science.gov (United States)

    Pietrzyk, Mariusz W.; Rannou, Didier; Brennan, Patrick C.

    2012-02-01

    This pilot study examines the effect of a novel decision support system in medical image interpretation. This system is based on combining image spatial frequency properties and eye-tracking data in order to recognize over- and under-calling errors. Thus, before it can be implemented as a detection-aiding scheme, training is required, during which the SVM-based algorithm learns to recognize FPs among all reported outcomes and FNs among all unreported regions with prolonged dwell. Eight radiologists inspected 50 PA chest radiographs with the specific task of identifying lung nodules. Twenty-five cases contained CT-proven subtle malignant lesions (5-20 mm), but prevalence was not known by the subjects, who took part in two sequential reading sessions, without and with support-system feedback. MRMC ROC DBM and JAFROC analyses were conducted and demonstrated significantly higher scores following feedback, with p values of 0.04 and 0.03 respectively, highlighting significant improvements in radiology performance once feedback was used. This positive effect on radiologists' performance might have important implications for future CAD-system development.
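
    The training idea can be sketched as follows; the feature set and the simulated data are hypothetical stand-ins for the spatial-frequency and dwell measurements described above, and the classifier settings are assumptions rather than the study's configuration.

```python
# Hedged sketch of the training idea (feature names are hypothetical): an SVM
# learns to separate false positives from true positives among reported
# locations, using gaze dwell time and local spatial-frequency energy.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Columns: [dwell_time_ms, local_high_freq_energy, local_low_freq_energy]
X = rng.normal(loc=[1000, 0.5, 0.5], scale=[300, 0.2, 0.2], size=(120, 3))
y = rng.integers(0, 2, size=120)        # 1 = true positive, 0 = false positive
X[y == 1, 0] += 400                     # toy effect: longer dwell on true lesions

clf = SVC(kernel="rbf", gamma="scale").fit(X[:90], y[:90])
print("held-out accuracy:", clf.score(X[90:], y[90:]))
```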

  5. Optimizing computed tomography pulmonary angiography using right atrium bolus monitoring combined with spontaneous respiration

    Energy Technology Data Exchange (ETDEWEB)

    Min, Wang; Jian, Li; Rui, Zhai [Jining No. 1 People' s Hospital, Department of Computed Tomography, Jining City, ShanDong Province (China); Wen, Li [Jining No. 1 People' s Hospital, Department of Gastroenterology, Jining, ShanDong (China); Dai, Lun-Hou [Shandong Chest Hospital, Department of Radiology, Jinan, ShanDong (China)

    2015-09-15

    CT pulmonary angiography (CTPA) aims to provide pulmonary arterial opacification in the absence of significant pulmonary venous filling. This requires accurate timing of the imaging acquisition to ensure synchronization with the peak pulmonary artery contrast concentration. This study was designed to test the utility of right atrium (RA) monitoring in ensuring optimal timing of CTPA acquisition. Sixty patients referred for CTPA were divided into two groups. Group A (n = 30): CTPA was performed using bolus triggering from the pulmonary trunk, suspended respiration and 70 ml of contrast agent (CA). Group B (n = 30): CTPA image acquisition was triggered using RA monitoring with spontaneous respiration and 40 ml of CA. Image quality was compared. Subjective image quality, average CT values of pulmonary arteries and density difference between artery and vein pairs were significantly higher whereas CT values of pulmonary veins were significantly lower in group B (all P < 0.05). There was no significant difference between the groups in the proportion of subjects where sixth grade pulmonary arteries were opacified (P > 0.05). RA monitoring combined with spontaneous respiration to trigger image acquisition in CTPA produces optimal contrast enhancement in pulmonary arterial structures with minimal venous filling even with reduced doses of CA. (orig.)

  6. Predicting Short-Term Electricity Demand by Combining the Advantages of ARMA and XGBoost in Fog Computing Environment

    Directory of Open Access Journals (Sweden)

    Chuanbin Li

    2018-01-01

    Full Text Available With the rapid development of the IoT, the disadvantages of the Cloud framework have been exposed, such as high latency, network congestion, and low reliability. Therefore, the Fog Computing framework has emerged, with an extended Fog Layer between the Cloud and terminals. In order to address real-time prediction of electricity demand, we propose an approach based on XGBoost and ARMA in a Fog Computing environment. Taking advantage of the Fog Computing framework, we first propose a prototype-based clustering algorithm to divide enterprise users into several categories based on their total electricity consumption; we then propose a model selection approach by analyzing users' historical records of electricity consumption and identifying the most important features. Generally speaking, if the historical records pass the tests of stationarity and white noise, ARMA is used to model the user's electricity consumption as a time series; otherwise, if the historical records do not pass the tests and some discrete features are the most important, such as weather and whether it is a weekend, XGBoost is used. The experimental results show that our proposed approach, combining the advantages of ARMA and XGBoost, is more accurate than the classical models.
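
    The dispatch logic, as described, can be sketched as below; the test thresholds, the ARMA order, and the XGBoost settings are assumptions, since the abstract does not specify them.

```python
# Sketch of the dispatch logic as read from the abstract (thresholds, orders
# and hyperparameters are assumptions): stationary series with significant
# autocorrelation go to ARMA; everything else goes to XGBoost with discrete
# calendar features such as a weekend flag.
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.arima.model import ARIMA
from xgboost import XGBRegressor

def choose_and_fit(consumption: np.ndarray, calendar_features: np.ndarray):
    adf_p = adfuller(consumption)[1]                      # H0: unit root
    lb = acorr_ljungbox(consumption, lags=[10], return_df=True)
    lb_p = lb["lb_pvalue"].iloc[0]                        # H0: white noise
    if adf_p < 0.05 and lb_p < 0.05:
        # Stationary and not white noise: ARMA(p, q) == ARIMA(p, 0, q)
        return ARIMA(consumption, order=(2, 0, 1)).fit()
    model = XGBRegressor(n_estimators=200, max_depth=4)
    return model.fit(calendar_features, consumption)

rng = np.random.default_rng(2)
y = rng.normal(size=300).cumsum() * 0.01 + np.sin(np.arange(300) / 7)
feats = np.stack([(np.arange(300) % 7 >= 5).astype(float)], axis=1)  # weekend
print(type(choose_and_fit(y, feats)).__name__)
```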

  7. The Raman spectrum of CaCO{sub 3} polymorphs calcite and aragonite: A combined experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    De La Pierre, Marco, E-mail: cedric.carteret@univ-lorraine.fr, E-mail: marco.delapierre@unito.it; Maschio, Lorenzo; Orlando, Roberto; Dovesi, Roberto [Dipartimento di Chimica, Università di Torino and NIS (Nanostructured Interfaces and Surfaces) Centre of Excellence, Via P. Giuria 7, 10125 Torino (Italy); Carteret, Cédric, E-mail: cedric.carteret@univ-lorraine.fr, E-mail: marco.delapierre@unito.it; André, Erwan [Laboratoire de Chimie Physique et Microbiologie pour l’Environnement (LCPME), UMR 7564, Université de Lorraine-CNRS, 405 rue de Vandoeuvre, 54601 Villers-lès-Nancy (France)

    2014-04-28

    Powder and single-crystal Raman spectra of the two most common phases of calcium carbonate are calculated with ab initio techniques (using a “hybrid” functional and a Gaussian-type basis set) and measured both at 80 K and room temperature. Frequencies of the Raman modes are in very good agreement between calculations and experiments: the mean absolute deviation at 80 K is 4 and 8 cm⁻¹ for calcite and aragonite, respectively. As regards intensities, the agreement is in general good, although the computed values overestimate the measured ones in many cases. The combined analysis permits the identification of almost all the fundamental experimental Raman peaks of the two compounds, with the exception of either modes with zero computed intensity or modes overlapping with more intense peaks. Additional peaks have been identified in both calcite and aragonite, which have been assigned to ¹⁸O satellite modes or overtones. The agreement between the computed and measured spectra is quite satisfactory; in particular, simulation permits a clear distinction between calcite and aragonite in the case of powder spectra, and among different polarization directions of each compound in the case of single-crystal spectra.

  8. Combined Model of Intrinsic and Extrinsic Variability for Computational Network Design with Application to Synthetic Biology

    Science.gov (United States)

    Toni, Tina; Tidor, Bruce

    2013-01-01

    Biological systems are inherently variable, with their dynamics influenced by intrinsic and extrinsic sources. These systems are often only partially characterized, with large uncertainties about specific sources of extrinsic variability and biochemical properties. Moreover, it is not yet well understood how different sources of variability combine and affect biological systems in concert. To successfully design biomedical therapies or synthetic circuits with robust performance, it is crucial to account for uncertainty and the effects of variability. Here we introduce an efficient modeling and simulation framework to study systems that are simultaneously subject to multiple sources of variability, and apply it to make design decisions on small genetic networks that play the role of basic design elements of synthetic circuits. Specifically, the framework was used to explore the effect of transcriptional and post-transcriptional autoregulation on fluctuations in protein expression in simple genetic networks. We found that autoregulation could either suppress or increase the output variability, depending on specific noise sources and network parameters. We showed that transcriptional autoregulation was more successful than post-transcriptional in suppressing variability across a wide range of intrinsic and extrinsic magnitudes and sources. We derived the following design principles to guide the design of circuits that best suppress variability: (i) high protein cooperativity and low miRNA cooperativity, (ii) imperfect complementarity between miRNA and mRNA is preferred to perfect complementarity, and (iii) correlated expression of mRNA and miRNA – for example, on the same transcript – is best for suppression of protein variability. Results further showed that correlations in kinetic parameters between cells affected the ability to suppress variability, and that variability in transient states did not necessarily follow the same principles as variability in the steady state.

  9. Diagnostic accuracy of combined coronary angiography and adenosine stress myocardial perfusion imaging using 320-detector computed tomography: pilot study

    Energy Technology Data Exchange (ETDEWEB)

    Nasis, Arthur; Ko, Brian S.; Leung, Michael C.; Antonis, Paul R.; Wong, Dennis T.; Kyi, Leo; Cameron, James D.; Meredith, Ian T.; Seneviratne, Sujith K. [Southern Health and Monash University, Monash Cardiovascular Research Centre, Monash Heart, Department of Medicine Monash Medical Centre (MMC), Melbourne (Australia); Nandurkar, Dee; Troupis, John M. [MMC, Southern Health, Department of Diagnostic Imaging, Melbourne (Australia)

    2013-07-15

    To determine the diagnostic accuracy of combined 320-detector row computed tomography coronary angiography (CTA) and adenosine stress CT myocardial perfusion imaging (CTP) in detecting perfusion abnormalities caused by obstructive coronary artery disease (CAD). Twenty patients with suspected CAD who underwent initial investigation with single-photon-emission computed tomography myocardial perfusion imaging (SPECT-MPI) were recruited and underwent prospectively-gated 320-detector CTA/CTP and invasive angiography. Two blinded cardiologists evaluated invasive angiography images quantitatively (QCA). A blinded nuclear physician analysed SPECT-MPI images for fixed and reversible perfusion defects. Two blinded cardiologists assessed CTA/CTP studies qualitatively. Vessels/territories with both >50 % stenosis on QCA and corresponding perfusion defect on SPECT-MPI were defined as ischaemic and formed the reference standard. All patients completed the CTA/CTP protocol with diagnostic image quality. Of 60 vessels/territories, 17 (28 %) were ischaemic according to QCA/SPECT-MPI criteria. Sensitivity, specificity, PPV, NPV and area under the ROC curve for CTA/CTP was 94 %, 98 %, 94 %, 98 % and 0.96 (P < 0.001) on a per-vessel/territory basis. Mean CTA/CTP radiation dose was 9.2 ± 7.4 mSv compared with 13.2 ± 2.2 mSv for SPECT-MPI (P < 0.001). Combined 320-detector CTA/CTP is accurate in identifying obstructive CAD causing perfusion abnormalities compared with combined QCA/SPECT-MPI, achieved with lower radiation dose than SPECT-MPI. (orig.)

  10. Diagnostic accuracy of combined coronary angiography and adenosine stress myocardial perfusion imaging using 320-detector computed tomography: pilot study

    International Nuclear Information System (INIS)

    Nasis, Arthur; Ko, Brian S.; Leung, Michael C.; Antonis, Paul R.; Wong, Dennis T.; Kyi, Leo; Cameron, James D.; Meredith, Ian T.; Seneviratne, Sujith K.; Nandurkar, Dee; Troupis, John M.

    2013-01-01

    To determine the diagnostic accuracy of combined 320-detector row computed tomography coronary angiography (CTA) and adenosine stress CT myocardial perfusion imaging (CTP) in detecting perfusion abnormalities caused by obstructive coronary artery disease (CAD). Twenty patients with suspected CAD who underwent initial investigation with single-photon-emission computed tomography myocardial perfusion imaging (SPECT-MPI) were recruited and underwent prospectively-gated 320-detector CTA/CTP and invasive angiography. Two blinded cardiologists evaluated invasive angiography images quantitatively (QCA). A blinded nuclear physician analysed SPECT-MPI images for fixed and reversible perfusion defects. Two blinded cardiologists assessed CTA/CTP studies qualitatively. Vessels/territories with both >50 % stenosis on QCA and corresponding perfusion defect on SPECT-MPI were defined as ischaemic and formed the reference standard. All patients completed the CTA/CTP protocol with diagnostic image quality. Of 60 vessels/territories, 17 (28 %) were ischaemic according to QCA/SPECT-MPI criteria. Sensitivity, specificity, PPV, NPV and area under the ROC curve for CTA/CTP was 94 %, 98 %, 94 %, 98 % and 0.96 (P < 0.001) on a per-vessel/territory basis. Mean CTA/CTP radiation dose was 9.2 ± 7.4 mSv compared with 13.2 ± 2.2 mSv for SPECT-MPI (P < 0.001). Combined 320-detector CTA/CTP is accurate in identifying obstructive CAD causing perfusion abnormalities compared with combined QCA/SPECT-MPI, achieved with lower radiation dose than SPECT-MPI. (orig.)

  11. Evolutionary Algorithms Application Analysis in Biometric Systems

    Directory of Open Access Journals (Sweden)

    N. Goranin

    2010-01-01

    Full Text Available Wide usage of biometric information for person identity verification purposes, terrorist acts prevention measures and authentication process simplification in computer systems has raised significant attention to the reliability and efficiency of biometric systems. Modern biometric systems still face many reliability- and efficiency-related issues, such as reference database search speed, errors in recognizing biometric information, or automating biometric feature extraction. Current scientific investigations show that the application of evolutionary algorithms may significantly improve biometric systems. In this article we provide a comprehensive review of the main scientific research done in the sphere of evolutionary algorithm application for biometric system parameter improvement.

  12. Langley's CSI evolutionary model: Phase O

    Science.gov (United States)

    Belvin, W. Keith; Elliott, Kenny B.; Horta, Lucas G.; Bailey, Jim P.; Bruner, Anne M.; Sulla, Jeffrey L.; Won, John; Ugoletti, Roberto M.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology to improve space science platform pointing is described. The evolutionary nature of the testbed will permit the study of global line-of-sight pointing in phases 0 and 1, whereas multipayload pointing systems will be studied beginning with phase 2. The design, capabilities, and typical dynamic behavior of the phase 0 version of the CSI evolutionary model (CEM) are documented for investigators both internal and external to NASA. The model description includes line-of-sight pointing measurement, testbed structure, actuators, sensors, and real-time computers, as well as finite element and state space models of major components.

  13. Genomes, Phylogeny, and Evolutionary Systems Biology

    Energy Technology Data Exchange (ETDEWEB)

    Medina, Monica

    2005-03-25

    With the completion of the human genome and the growing number of diverse genomes being sequenced, a new age of evolutionary research is currently taking shape. The myriad of technological breakthroughs in biology that are leading to the unification of broad scientific fields such as molecular biology, biochemistry, physics, mathematics and computer science are now known as systems biology. Here I present an overview, with an emphasis on eukaryotes, of how the postgenomics era is adopting comparative approaches that go beyond comparisons among model organisms to shape the nascent field of evolutionary systems biology.

  14. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has arisen for tools that can aid in their design. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use ... for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been ...
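
    The ranking step at the heart of NSGA-II is non-dominated sorting: the population is peeled into successive Pareto fronts. The following minimal sketch (minimization, independent of any controller-design specifics) illustrates it; the objective values are invented.

```python
# Minimal sketch of NSGA-II's core ranking step (non-dominated sorting,
# minimisation), independent of any controller-design specifics.
def dominates(a, b):
    """a dominates b if it is no worse in all objectives and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    fronts, remaining = [], set(range(len(points)))
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)}
        fronts.append(sorted(front))
        remaining -= front
    return fronts

# Objectives might be e.g. (settling time, overshoot) for candidate controllers
objs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (0.5, 6.0)]
print(non_dominated_sort(objs))   # [[0, 1, 3], [2]]
```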

  15. Proteomics in evolutionary ecology.

    Science.gov (United States)

    Baer, B; Millar, A H

    2016-03-01

    Evolutionary ecologists are traditionally gene-focused, as genes propagate phenotypic traits across generations and mutations and recombination in the DNA generate genetic diversity required for evolutionary processes. As a consequence, the inheritance of changed DNA provides a molecular explanation for the functional changes associated with natural selection. A direct focus on proteins on the other hand, the actual molecular agents responsible for the expression of a phenotypic trait, receives far less interest from ecologists and evolutionary biologists. This is partially due to the central dogma of molecular biology that appears to define proteins as the 'dead-end of molecular information flow' as well as technical limitations in identifying and studying proteins and their diversity in the field and in many of the more exotic genera often favored in ecological studies. Here we provide an overview of a newly forming field of research that we refer to as 'Evolutionary Proteomics'. We point out that the origins of cellular function are related to the properties of polypeptide and RNA and their interactions with the environment, rather than DNA descent, and that the critical role of horizontal gene transfer in evolution is more about coopting new proteins to impact cellular processes than it is about modifying gene function. Furthermore, post-transcriptional and post-translational processes generate a remarkable diversity of mature proteins from a single gene, and the properties of these mature proteins can also influence inheritance through genetic and perhaps epigenetic mechanisms. The influence of post-transcriptional diversification on evolutionary processes could provide a novel mechanistic underpinning for elements of rapid, directed evolutionary changes and adaptations as observed for a variety of evolutionary processes. Modern state-of the art technologies based on mass spectrometry are now available to identify and quantify peptides, proteins, protein

  16. Thermodynamic properties of alkyl 1H-indole carboxylate derivatives: A combined experimental and computational study

    International Nuclear Information System (INIS)

    Carvalho, Tânia M.T.; Amaral, Luísa M.P.F.; Morais, Victor M.F.; Ribeiro da Silva, Maria D.M.C.

    2016-01-01

    Highlights: • Combustion of methyl 1H-indole-3-carboxylate and ethyl 1H-indole-2-carboxylate by static bomb calorimetry. • The Knudsen mass-loss effusion technique was used to measure the vapour pressures of compounds at different temperatures. • Enthalpies of sublimation of methyl 1H-indole-3-carboxylate and ethyl 1H-indole-2-carboxylate. • Gas-phase enthalpies of formation of methyl 1H-indole-3-carboxylate and ethyl 1H-indole-2-carboxylate have been derived. • Gas-phase enthalpies of formation estimated from G3(MP2) calculations. - Abstract: The standard (p° = 0.1 MPa) molar enthalpies of formation, in the crystalline phase, of methyl 1H-indole-3-carboxylate and ethyl 1H-indole-2-carboxylate, at T = 298.15 K, were derived from measurements of the standard massic energies of combustion using a static bomb combustion calorimeter. The Knudsen effusion technique was used to measure the vapour pressures as a function of the temperature, which allowed determining the standard molar enthalpies of sublimation of these compounds. The standard (p° = 0.1 MPa) molar enthalpies of formation, in the gaseous phase, at T = 298.15 K, were calculated by combining, for each compound, the standard molar enthalpy of formation, in the crystalline phase, and the standard molar enthalpy of sublimation, yielding −(207.6 ± 3.6) kJ·mol⁻¹ and −(234.4 ± 2.4) kJ·mol⁻¹, for methyl 1H-indole-3-carboxylate and ethyl 1H-indole-2-carboxylate, respectively. Quantum chemical studies were also conducted, in order to complement the experimental study. The gas-phase enthalpies of formation were estimated from high-level ab initio molecular orbital calculations, at the G3(MP2) level, for the compounds studied experimentally, extending the study to methyl 1H-indole-2-carboxylate and ethyl 1H-indole-3-carboxylate. The results obtained were compared with the experimental data and were also analysed in terms of structural enthalpic group contributions.

  17. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Science.gov (United States)

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    Results suggest that the platform we have developed to combine crowdsourcing and machine learning to make sense of large volumes of aerial images can be used for disaster response.

  18. Protein thermal stability enhancement by designing salt bridges: a combined computational and experimental study.

    Directory of Open Access Journals (Sweden)

    Chi-Wen Lee

    Full Text Available Protein thermal stability is an important factor considered in medical and industrial applications. Many structural characteristics related to protein thermal stability have been elucidated, and increasing the number of salt bridges is considered one of the most efficient strategies for increasing protein thermal stability. However, the accurate simulation of salt bridges remains difficult. In this study, a novel method for salt-bridge design was proposed based on the statistical analysis of 10,556 surface salt bridges in 6,493 X-ray protein structures. These salt bridges were first categorized based on pairing residues, secondary structure locations, and Cα-Cα distances. Pairing preferences generalized from the statistical analysis were used to construct a salt-bridge pair index and utilized in a weighted electrostatic attraction model to find effective pairings for designing salt bridges. The model was also coupled with B-factor, weighted contact number, relative solvent accessibility, and conservation prescreening to determine the residues appropriate for the thermal adaptive design of salt bridges. According to our method, eight putative salt bridges were designed on a mesophilic β-glucosidase and 24 variants were constructed to verify the predictions. Six of the putative salt bridges led to increased enzyme thermal stability. Significant increases in melting temperature of 8.8, 4.8, 3.7, 1.3, 1.2, and 0.7°C were detected for the putative salt bridges N437K-D49, E96R-D28, E96K-D28, S440K-E70, T231K-D388, and Q277E-D282, respectively. Reversing the polarity of T231K-D388 to T231D-D388K resulted in a further increase in melting temperature of 3.6°C, which may be caused by the transformation of an intra-subunit electrostatic interaction into an inter-subunit one, depending on the local environment. The combination of the thermostable variants (N437K, E96R, T231D and D388K) generated a melting temperature increase of 15.7°C. Thus, this study
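
    The scoring idea behind a pair index combined with a weighted electrostatic attraction can be illustrated as below; the preference weights, screening length, and candidate list are invented for illustration and are not the fitted values from this study.

```python
# Hedged sketch of the scoring idea (weights and functional form are
# illustrative, not the paper's fitted values): rank candidate salt bridges
# by a pairing-preference weight times a distance-damped electrostatic term.
import math

pair_preference = {("LYS", "ASP"): 1.3, ("ARG", "GLU"): 1.5, ("LYS", "GLU"): 1.1}

def bridge_score(res_a: str, res_b: str, ca_ca_distance: float) -> float:
    """Higher score = more promising candidate salt bridge (toy model)."""
    w = pair_preference.get((res_a, res_b), 1.0)
    # Screened Coulomb-like attraction between unit opposite charges
    return w * math.exp(-ca_ca_distance / 4.0) / ca_ca_distance

candidates = [("LYS", "ASP", 7.5), ("ARG", "GLU", 9.0), ("LYS", "GLU", 6.0)]
for a, b, d in sorted(candidates, key=lambda c: -bridge_score(*c)):
    print(f"{a}-{b} at {d:.1f} A: {bridge_score(a, b, d):.4f}")
```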

  19. Concealed nuclear material identification via combined fast-neutron/γ-ray computed tomography (FNGCT): a Monte Carlo study

    Science.gov (United States)

    Licata, M.; Joyce, M. J.

    2018-02-01

    The potential of a combined and simultaneous fast-neutron/γ-ray computed tomography technique, assessed using Monte Carlo simulations, is described. The technique is applied on the basis of a hypothetical tomography system comprising an isotopic radiation source (americium-beryllium), which produces both fast neutrons and γ rays, and 13 organic scintillation detectors to detect them. Via a combination of γ-ray and fast-neutron tomography, the potential is demonstrated to discern nuclear materials, such as compounds comprising plutonium and uranium, from substances that are used widely for neutron moderation and shielding. This discrimination is achieved on the basis of the difference in the attenuation characteristics of these substances. Discrimination of a variety of nuclear material compounds from shielding/moderating substances (the latter comprising lead or polyethylene, for example) is shown to be challenging when using either γ-ray or neutron tomography in isolation of one another. Much-improved contrast is obtained for a combination of these tomographic modalities. This method has potential applications for in-situ, non-destructive assessments in nuclear security, safeguards, waste management and related requirements in the nuclear industry.
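
    A toy illustration of the attenuation-based discrimination principle: under the Beer-Lambert law each material is characterized by a two-channel transmission signature, one per modality. The attenuation coefficients below are illustrative values chosen for contrast, not data from the study.

        import math

        # Illustrative (not measured) linear attenuation coefficients, cm^-1,
        # for a fast-neutron beam and a gamma beam.
        MU = {                        # (mu_neutron, mu_gamma)
            "polyethylene": (0.60, 0.02),   # moderates neutrons, thin to gammas
            "lead":         (0.30, 5.0),    # nearly opaque to gammas
            "uranium":      (0.55, 8.0),    # attenuates both strongly
        }

        def transmission(material, thickness_cm):
            """Beer-Lambert transmission I/I0 for both modalities."""
            mu_n, mu_g = MU[material]
            return (math.exp(-mu_n * thickness_cm),
                    math.exp(-mu_g * thickness_cm))

        for m in MU:
            t_n, t_g = transmission(m, 2.0)
            # The two-channel signature (t_n, t_g) separates materials that a
            # single modality would confuse.
            print(f"{m:12s} T_neutron={t_n:.3f} T_gamma={t_g:.3f}")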

  20. Preoperative computed tomography for determining nodal status combined with histologic grading as a prognostic factor for patients with tongue carcinoma

    International Nuclear Information System (INIS)

    Ogura, Ichiro; Kurabayashi, Tohru; Amagasa, Teruo; Iwaki, Hiroshi; Sasaki, Takehito

    2001-01-01

    The purpose of this study was to evaluate the predictive value of preoperative neck computed tomography (CT) in combination with histologic grading as a prognostic factor for patients with tongue carcinoma. Fifty-five patients with squamous cell carcinoma of the tongue were examined by CT prior to radical neck dissection. The locoregional failure and survival rates of these patients were analyzed in relation to their clinical characteristics, histologic grading (World Health Organization, WHO) based on tongue biopsy, and imaging diagnoses prior to surgery. Logistic multivariate regression analysis showed that both histologic grading and the number of metastatic lymph nodes on CT were significant and independent prognostic factors in locoregional failure (p=0.009 and p=0.009, respectively). When the number of metastatic lymph nodes detected on preoperative neck CT was combined with the histologic grading for the evaluation, the five-year overall survival rates of the A group (0 nodes with any Grade, or 1 node with Grade I-II) and the B group (1 node with Grade III, or 2 or more nodes with any Grade) were 74.5% and 37.5%, respectively (p=0.001). The difference was more significant than for histologic grading alone or the number of metastatic lymph nodes seen on CT alone. The combination of preoperative neck CT with histologic grading of the primary tumor is useful as a prognostic indicator for patients with tongue carcinoma. (author)
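
    The combined CT/grading rule reported above can be written directly as a small classification function; the integer encoding of the WHO grades is an assumption made for illustration.

        def prognostic_group(n_ct_nodes, histologic_grade):
            """Combined CT/grading rule from the study: group A (better prognosis)
            is 0 metastatic nodes on CT with any grade, or 1 node with WHO grade
            I-II; group B is 1 node with grade III, or 2 or more nodes with any
            grade. Grades are encoded here as integers 1-3."""
            if n_ct_nodes == 0:
                return "A"
            if n_ct_nodes == 1:
                return "A" if histologic_grade in (1, 2) else "B"
            return "B"

        assert prognostic_group(0, 3) == "A"
        assert prognostic_group(1, 2) == "A"
        assert prognostic_group(1, 3) == "B"
        assert prognostic_group(2, 1) == "B"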

  1. Face Alignment Using Boosting and Evolutionary Search

    NARCIS (Netherlands)

    Zhang, Hua; Liu, Duanduan; Poel, Mannes; Nijholt, Antinus; Zha, H.; Taniguchi, R.-I.; Maybank, S.

    2010-01-01

    In this paper, we present a face alignment approach using granular features, boosting, and an evolutionary search algorithm. Active Appearance Models (AAM) integrate a shape-texture-combined morphable face model into an efficient fitting strategy, then Boosting Appearance Models (BAM) consider the

  2. A comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements.

    Science.gov (United States)

    Abdelgaied, A; Fisher, J; Jennings, L M

    2018-02-01

    A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National_Joint_Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured from experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94 for the framework. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily

  3. IMACS '91: Proceedings of the IMACS World Congress on Computation and Applied Mathematics (13th) Held in Dublin, Ireland on July 22-26, 1991. Volume 2. Computational Fluid Dynamics and Wave Propagation, Parallel Computing, Concurrent and Supercomputing, Computational Physics/Computational Chemistry and Evolutionary Systems

    Science.gov (United States)

    1991-01-01

    [Garbled OCR fragment from the proceedings; the recoverable content includes a remark on letting a small network grow during early training, a citation of Press, Flannery, Teukolsky and Vetterling, and an abstract on the modelling of ferroelectric liquid crystals from Tecnología Fotónica, ETSI Telecomunicación, Ciudad Universitaria, 28040 Madrid, Spain.]

  4. Archaeogenetics in evolutionary medicine.

    Science.gov (United States)

    Bouwman, Abigail; Rühli, Frank

    2016-09-01

    Archaeogenetics is the study of ancient DNA (aDNA) more than 70 years old. It is an important part of the wider study of many different areas of our past, including animal, plant and pathogen evolution and domestication events. Here, we address specifically the impact of research in archaeogenetics on the broader field of evolutionary medicine. Studies on ancient hominid genomes help to understand even modern health patterns. Human genetic microevolution, e.g. related to abilities of post-weaning milk consumption, and specifically genetic adaptation in disease susceptibility, e.g. towards malaria and other infectious diseases, are of the utmost importance among the contributions of archaeogenetics to the evolutionary understanding of human health and disease. With the increase in both the understanding of modern medical genetics and the ability to deep sequence ancient genetic information, the field of archaeogenetic evolutionary medicine is blossoming.

  5. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Regularized iterative integration combined with non-linear diffusion filtering for phase-contrast x-ray computed tomography.

    Science.gov (United States)

    Burger, Karin; Koehler, Thomas; Chabior, Michael; Allner, Sebastian; Marschner, Mathias; Fehringer, Andreas; Willner, Marian; Pfeiffer, Franz; Noël, Peter

    2014-12-29

    Phase-contrast x-ray computed tomography has a high potential to become clinically implemented because of its complementarity to conventional absorption contrast. In this study, we investigate noise-reducing but resolution-preserving analytical reconstruction methods to improve differential phase-contrast imaging. We apply the non-linear Perona-Malik filter to phase-contrast data before or after filtered-backprojection reconstruction. Secondly, the Hilbert kernel is replaced by regularized iterative integration followed by ramp-filtered backprojection, as used for absorption-contrast imaging. Combining the Perona-Malik filter with this integration algorithm successfully reveals relevant sample features, quantitatively confirmed by significantly increased structural similarity indices and contrast-to-noise ratios. With this concept, phase-contrast imaging can be performed at considerably lower dose.
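
    A minimal numpy sketch of the Perona-Malik anisotropic diffusion step applied to a 2-D image. The exponential conductance function, the parameter values, and the wrap-around boundary handling are illustrative simplifications, not the authors' implementation.

        import numpy as np

        def perona_malik(img, n_iter=20, kappa=0.1, lam=0.2):
            """Perona-Malik anisotropic diffusion: smooth where gradients are
            small, preserve strong edges. lam <= 0.25 keeps the explicit
            4-neighbour scheme stable; np.roll wraps at the borders (real
            implementations use reflective boundaries)."""
            u = img.astype(float).copy()
            g = lambda d: np.exp(-(d / kappa) ** 2)   # edge-stopping conductance
            for _ in range(n_iter):
                dn = np.roll(u, -1, axis=0) - u       # differences to the four
                ds = np.roll(u, 1, axis=0) - u        # nearest neighbours
                de = np.roll(u, -1, axis=1) - u
                dw = np.roll(u, 1, axis=1) - u
                u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
            return u

        noisy = np.random.default_rng(0).random((64, 64))
        smooth = perona_malik(noisy)
        print(noisy.std(), smooth.std())   # diffusion reduces the variance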

  7. Three-dimensional SPECT [single photon emission computed tomography] reconstruction of combined cone beam and parallel beam data

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Jianying Li; Huili Wang; Coleman, R.E.

    1992-01-01

    Single photon emission computed tomography (SPECT) using cone beam (CB) collimation exhibits increased sensitivity compared with acquisition geometries using parallel (P) hole collimation. However, CB collimation has a smaller field-of-view which may result in truncated projections and image artifacts. A primary objective of this work is to investigate maximum likelihood-expectation maximization (ML-EM) methods to reconstruct simultaneously acquired parallel and cone beam (P and CB) SPECT data. Simultaneous P and CB acquisition can be performed with commercially available triple camera systems by using two cone-beam collimators and a single parallel-hole collimator. The loss in overall sensitivity (relative to the use of three CB collimators) is about 15 to 20%. The authors have developed three methods to combine P and CB data using modified ML-EM algorithms. (author)
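
    A toy sketch of the core idea of reconstructing from simultaneously acquired projection sets: stacking the parallel-hole and cone-beam system matrices lets a standard ML-EM multiplicative update use both data sets at once. The random matrices and noiseless data are placeholders for a real SPECT geometry, not the authors' modified algorithms.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, n_p, n_cb = 16, 24, 24
        A_p = rng.random((n_p, n_pix))     # toy parallel-hole system matrix
        A_cb = rng.random((n_cb, n_pix))   # toy cone-beam system matrix
        A = np.vstack([A_p, A_cb])         # one forward model for both data sets

        x_true = rng.random(n_pix)
        y = A @ x_true                     # noiseless combined projections

        x = np.ones(n_pix)                 # uniform initial estimate
        sens = A.sum(axis=0)               # sensitivity image A^T 1
        for _ in range(500):
            # classic ML-EM update applied to the stacked system
            x *= (A.T @ (y / (A @ x))) / sens

        print(float(np.abs(x - x_true).max()))   # residual shrinks toward zero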

  8. Computer modelling of the combined effects of plant conditions and coal quality on burnout in utility furnaces

    Energy Technology Data Exchange (ETDEWEB)

    P. Stephenson [RWE npower Engineering, Swindon (United Kingdom)]

    2007-09-15

    The aim of this paper is to describe the latest steps in the development of a computer model to predict the combined effects of plant conditions and coal quality on burnout. The work was conducted as part of RWE's contribution to the recent ECSC project 'Development of a carbon-in-ash notification system (CARNO)'. A burnout predictor code has been developed and validated; it includes both coal and plant effects and includes a burnout model based closely on CBK8. The agreement between predicted C-in-ash and plant data is encouraging, but further improvements are still desirable. The predictions obtained from the burnout predictor show that the calculated sensitivities to changes in plant conditions can depend strongly on the state of the plant. 7 refs., 7 figs., 1 tab.

  9. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment... of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...

  10. Evolutionary thinking: "A conversation with Carter Phipps about the role of evolutionary thinking in modern culture".

    Science.gov (United States)

    Hunt, Tam

    2014-12-01

    Evolution as an idea has a lengthy history, even though the idea of evolution is generally associated with Darwin today. Rebecca Stott provides an engaging and thoughtful overview of this history of evolutionary thinking in her 2013 book, Darwin's Ghosts: The Secret History of Evolution. Since Darwin, the debate over evolution-both how it takes place and, in a long war of words with religiously-oriented thinkers, whether it takes place-has been sustained and heated. A growing share of this debate is now devoted to examining how evolutionary thinking affects areas outside of biology. How do our lives change when we recognize that all is in flux? What can we learn about life more generally if we study change instead of stasis? Carter Phipps' book, Evolutionaries: Unlocking the Spiritual and Cultural Potential of Science's Greatest Idea, delves deep into this relatively new development. Phipps generally takes as a given the validity of the Modern Synthesis of evolutionary biology. His story takes us into, as the subtitle suggests, the spiritual and cultural implications of evolutionary thinking. Can religion and evolution be reconciled? Can evolutionary thinking lead to a new type of spirituality? Is our culture already being changed in ways that we don't realize by evolutionary thinking? These are all important questions and Phipps' book is a great introduction to this discussion. Phipps is an author, journalist, and contributor to the emerging "integral" or "evolutionary" cultural movement that combines the insights of Integral Philosophy, evolutionary science, developmental psychology, and the social sciences. He has served as the Executive Editor of EnlightenNext magazine (no longer published) and more recently is the co-founder of the Institute for Cultural Evolution, a public policy think tank addressing the cultural roots of America's political challenges. What follows is an email interview with Phipps.

  11. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China)]; Xu, Guang-Hua [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); State Key Laboratory for Manufacturing Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)]

    2015-03-10

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing subjects' fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the high-frequency (beyond 25 Hz) SSVEP response, with the paradigm presented on an LED. The SNR (signal-to-noise ratio) of the high-frequency (beyond 40 Hz) response is very low, so it cannot be distinguished by traditional analysis methods. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and a fixed number of sifting iterations (10) are used to overcome the end effect and the stopping-criterion problem, and generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, so as to improve the ITR and the stability of the BCI system. Moreover, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally diminish subjects' fatigue and prevent safety hazards linked to photo-induced epileptic seizures, ensuring system efficiency and safety. This study tests three subjects in order to verify the feasibility of the proposed method.
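
    The combination-coding idea is easy to state in code: with n stimulation frequencies and codes of length n, n^n distinct targets can be addressed. A short sketch using the three frequencies from the study:

        from itertools import product

        def combination_codes(freqs):
            """All length-n sequences over n stimulation frequencies: n**n codes,
            each flashed segment by segment to identify one target."""
            return list(product(freqs, repeat=len(freqs)))

        codes = combination_codes([25.0, 33.33, 40.0])
        print(len(codes))             # 27 targets from only 3 high frequencies
        print(codes[0], codes[-1])    # (25.0, 25.0, 25.0) ... (40.0, 40.0, 40.0)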

  12. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    International Nuclear Information System (INIS)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing; Xu, Guang-Hua

    2015-01-01

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing subjects' fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the high-frequency (beyond 25 Hz) SSVEP response, with the paradigm presented on an LED. The SNR (signal-to-noise ratio) of the high-frequency (beyond 40 Hz) response is very low, so it cannot be distinguished by traditional analysis methods. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets from n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable-frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and a fixed number of sifting iterations (10) are used to overcome the end effect and the stopping-criterion problem, and generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, so as to improve the ITR and the stability of the BCI system. Moreover, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally diminish subjects' fatigue and prevent safety hazards linked to photo-induced epileptic seizures, ensuring system efficiency and safety. This study tests three subjects in order to verify the feasibility of the proposed method.

  13. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    Science.gov (United States)

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed to date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.
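
    A minimal single-objective simplification of the multifactorial evolutionary idea: each individual carries a skill factor naming the task it is evaluated on, and crossover across tasks (the knowledge-transfer channel) happens only with a random mating probability. The toy tasks and parameters below are illustrative; the paper's algorithm is multiobjective and considerably richer.

        import random

        # Two toy minimization tasks sharing one search space.
        tasks = [lambda x: sum(v * v for v in x),
                 lambda x: sum((v - 0.5) ** 2 for v in x)]
        DIM, POP, GENS, RMP = 5, 40, 2000, 0.3   # RMP: random mating probability

        def evaluate(ind):
            return tasks[ind["skill"]](ind["x"])

        pop = [{"x": [random.uniform(-1, 1) for _ in range(DIM)],
                "skill": i % len(tasks)} for i in range(POP)]
        for ind in pop:
            ind["f"] = evaluate(ind)

        for _ in range(GENS):
            a, b = random.sample(pop, 2)
            if a["skill"] == b["skill"] or random.random() < RMP:
                # assortative mating: cross within a task, or across tasks
                # with probability RMP (implicit knowledge transfer)
                x = [(u + v) / 2 + random.gauss(0, 0.05)
                     for u, v in zip(a["x"], b["x"])]
                child = {"x": x, "skill": random.choice([a["skill"], b["skill"]])}
            else:
                x = [v + random.gauss(0, 0.05) for v in a["x"]]
                child = {"x": x, "skill": a["skill"]}
            child["f"] = evaluate(child)
            # steady-state replacement of the worst individual on that task
            worst = max((p for p in pop if p["skill"] == child["skill"]),
                        key=lambda p: p["f"])
            if child["f"] < worst["f"]:
                pop[pop.index(worst)] = child

        for t in range(len(tasks)):
            print(f"task {t}: best {min(p['f'] for p in pop if p['skill'] == t):.4f}")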

  14. Ancient Biomolecules and Evolutionary Inference.

    Science.gov (United States)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando; Welker, Frido; Pedersen, Mikkel Winther; Allentoft, Morten E; de Barros Damgaard, Peter; Gutenbrunner, Petra; Dunne, Julie; Hammann, Simon; Roffet-Salque, Mélanie; Ilardo, Melissa; Moreno-Mayar, J Víctor; Wang, Yucheng; Sikora, Martin; Vinner, Lasse; Cox, Jürgen; Evershed, Richard P; Willerslev, Eske

    2018-04-25

    Over the last decade, studies of ancient biomolecules-particularly ancient DNA, proteins, and lipids-have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleotide and amino acid sequences, as well as lipid signatures, from progressively older samples, originating from geographic areas and depositional environments that, until recently, were regarded as hostile to long-term preservation of biomolecules. Sampling frequencies and the spatial and temporal scope of studies have also increased markedly, and with them the size and quality of the data sets generated. This progress has been made possible by continuous technical innovations in analytical methods, enhanced criteria for the selection of ancient samples, integrated experimental methods, and advanced computational approaches. Here, we discuss the history and current state of ancient biomolecule research, its applications to evolutionary inference, and future directions for this young and exciting field. Expected final online publication date for the Annual Review of Biochemistry Volume 87 is June 20, 2018. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  15. Accurate prediction of stability changes in protein mutants by combining machine learning with structure based computational mutagenesis.

    Science.gov (United States)

    Masso, Majid; Vaisman, Iosif I

    2008-09-15

    Accurate predictive models for the impact of single amino acid substitutions on protein stability provide insight into protein structure and function. Such models are also valuable for the design and engineering of new proteins. Previously described methods have utilized properties of protein sequence or structure to predict the free energy change of mutants due to thermal (ΔΔG) and denaturant (ΔΔG(H2O)) denaturations, as well as mutant thermal stability (ΔT(m)), through the application of either computational energy-based approaches or machine learning techniques. However, the accuracy associated with applying these methods separately is frequently far from optimal. We detail a computational mutagenesis technique based on a four-body, knowledge-based, statistical contact potential. For any mutation due to a single amino acid replacement in a protein, the method provides an empirical normalized measure of the ensuing environmental perturbation occurring at every residue position. A feature vector is generated for the mutant by considering perturbations at the mutated position and at its six nearest neighbors in the 3-dimensional (3D) protein structure. These predictors of stability change are evaluated by applying machine learning tools to large training sets of mutants derived from diverse proteins that have been experimentally studied and described. Predictive models based on our combined approach are either comparable to, or in many cases significantly outperform, previously published results. A web server with supporting documentation is available at http://proteins.gmu.edu/automute.

  16. The stress and stress intensity factors computation by BEM and FEM combination for nozzle junction under pressure and thermal loads

    International Nuclear Information System (INIS)

    Du, Q.; Cen, Z.; Zhu, H.

    1989-01-01

    This paper reports linear elastic fracture analysis based upon stress intensity factor evaluation, successfully applied to safety assessments of cracked structures. Nozzle junctions are usually subjected to high pressure and thermal loads simultaneously. Within the validity of linear elastic fracture analysis, K can be decomposed into K_P (caused by mechanical loads) and K_τ (caused by thermal loads). Under thermal transient loading, explicit analysis (say, by the FEM or BEM) of K, tracing an entire history for a range of crack depths, may be much more time consuming. Weight-function techniques provide an efficient means of transforming the problem into the stress computation of the uncracked structure and the generation of influence functions (for the given structure and crack size). In this paper, a combination of the BEM and FEM has been used for the analysis of the cracked nozzle structure by weight-function techniques. The influence functions are obtained by the coupled BEM-FEM, and the stresses of the uncracked structure are computed by finite element methods
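
    The weight-function technique reduces K to an integral of the uncracked-body stress against a geometry-dependent kernel, K = ∫ σ(x) m(x, a) dx. As a self-contained numerical check, the sketch below uses the known Green's function for a through crack of half-length a in an infinite plate, for which a uniform stress must reproduce K = σ√(πa); the substitution x = a·sin(t) removes the integrable tip singularity. This is a textbook geometry, not the nozzle-junction kernel used in the paper.

        import math
        from scipy.integrate import quad

        def K_weight_function(sigma, a):
            """K at the tip x = +a of a through crack (-a, a) in an infinite
            plate: K = (1/sqrt(pi*a)) * integral of sigma(x)*sqrt((a+x)/(a-x)).
            With x = a*sin(t) the integrand becomes sigma(x)*a*(1 + sin(t)),
            which is smooth over (-pi/2, pi/2)."""
            integrand = lambda t: sigma(a * math.sin(t)) * a * (1 + math.sin(t))
            val, _ = quad(integrand, -math.pi / 2, math.pi / 2)
            return val / math.sqrt(math.pi * a)

        sigma0, a = 100.0, 0.01                          # uniform stress, crack size
        print(K_weight_function(lambda x: sigma0, a))    # numerical result
        print(sigma0 * math.sqrt(math.pi * a))           # closed form sigma*sqrt(pi*a)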

  17. Predicting Transport of 3,5,6-Trichloro-2-Pyridinol Into Saliva Using a Combination Experimental and Computational Approach

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Jordan Ned; Carver, Zana A.; Weber, Thomas J.; Timchalk, Charles

    2017-04-11

    A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport with non-physiological (standard cell culture medium) and physiological (using surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy), a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma, resulting in different TCPy transport behaviors in the two experimental conditions. In the non-physiological transport experiment, TCPy reached equilibrium at equivalent concentrations in apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in apical and basolateral chambers reached equilibrium faster than at lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than under non-physiological conditions, and equilibrium was achieved at different concentrations in apical and basolateral chambers at a ratio (0.034) comparable to what was previously measured in rats dosed with TCPy (saliva:blood ratio: 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics and accurately simulated all transport experiments using different permeability coefficients for the two experimental conditions (1.4 vs 0.4 cm/hr for non-physiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic (PBPK) model and accurately predicted TCPy concentrations in saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predict chemical transport in saliva, potentially increasing the utility of salivary biomonitoring in the future.

  18. Predicting Transport of 3,5,6-Trichloro-2-Pyridinol Into Saliva Using a Combination Experimental and Computational Approach.

    Science.gov (United States)

    Smith, Jordan Ned; Carver, Zana A; Weber, Thomas J; Timchalk, Charles

    2017-06-01

    A combination experimental and computational approach was developed to predict chemical transport into saliva. A serous-acinar chemical transport assay was established to measure chemical transport with nonphysiological (standard cell culture medium) and physiological (using surrogate plasma and saliva medium) conditions using 3,5,6-trichloro-2-pyridinol (TCPy) a metabolite of the pesticide chlorpyrifos. High levels of TCPy protein binding were observed in cell culture medium and rat plasma resulting in different TCPy transport behaviors in the 2 experimental conditions. In the nonphysiological transport experiment, TCPy reached equilibrium at equivalent concentrations in apical and basolateral chambers. At higher TCPy doses, increased unbound TCPy was observed, and TCPy concentrations in apical and basolateral chambers reached equilibrium faster than lower doses, suggesting only unbound TCPy is able to cross the cellular monolayer. In the physiological experiment, TCPy transport was slower than nonphysiological conditions, and equilibrium was achieved at different concentrations in apical and basolateral chambers at a comparable ratio (0.034) to what was previously measured in rats dosed with TCPy (saliva:blood ratio: 0.049). A cellular transport computational model was developed based on TCPy protein binding kinetics and simulated all transport experiments reasonably well using different permeability coefficients for the 2 experimental conditions (1.14 vs 0.4 cm/h for nonphysiological and physiological experiments, respectively). The computational model was integrated into a physiologically based pharmacokinetic model and accurately predicted TCPy concentrations in saliva of rats dosed with TCPy. Overall, this study demonstrates an approach to predict chemical transport in saliva, potentially increasing the utility of salivary biomonitoring in the future. © The Author 2017. Published by Oxford University Press on behalf of the Society of Toxicology. All rights
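
    A minimal sketch of the transport picture described in both records above: only the unbound fraction crosses the monolayer, so the flux follows the free-concentration difference and the chambers equilibrate at a total-concentration ratio set by the binding. The unbound fractions, area, and volumes are hypothetical placeholders; only the 0.4 cm/h physiological permeability is taken from the abstract.

        # Two-chamber model: only unbound chemical crosses the monolayer.
        P = 0.4          # permeability, cm/h (physiological value from the study)
        AREA = 1.0       # monolayer area, cm^2 (hypothetical)
        V_AP, V_BL = 0.5, 1.5        # chamber volumes, mL (hypothetical)
        FU_AP, FU_BL = 0.9, 0.05     # unbound fractions (hypothetical: strong
                                     # protein binding on the basolateral side)

        c_ap, c_bl = 100.0, 0.0      # initial total concentrations, ug/mL
        dt, t_end, t = 0.01, 48.0, 0.0   # explicit Euler time stepping, hours
        while t < t_end:
            flux = P * AREA * (FU_AP * c_ap - FU_BL * c_bl)   # ug/h
            c_ap -= flux * dt / V_AP
            c_bl += flux * dt / V_BL
            t += dt

        # At steady state the *free* concentrations match, so the total
        # concentrations settle at the ratio FU_BL / FU_AP.
        print(round(c_ap / c_bl, 3), round(FU_BL / FU_AP, 3))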

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  20. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  2. EVOLUTIONARY FOUNDATIONS FOR MOLECULAR MEDICINE

    Science.gov (United States)

    Nesse, Randolph M.; Ganten, Detlev; Gregory, T. Ryan; Omenn, Gilbert S.

    2015-01-01

    Evolution has long provided a foundation for population genetics, but many major advances in evolutionary biology from the 20th century are only now being applied in molecular medicine. They include the distinction between proximate and evolutionary explanations, kin selection, evolutionary models for cooperation, and new strategies for tracing phylogenies and identifying signals of selection. Recent advances in genomics are further transforming evolutionary biology and creating yet more opportunities for progress at the interface of evolution with genetics, medicine, and public health. This article reviews 15 evolutionary principles and their applications in molecular medicine in hopes that readers will use them and others to speed the development of evolutionary molecular medicine. PMID:22544168

  3. Brain-computer interface game applications for combined neurofeedback and biofeedback treatment for children on the autism spectrum

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    2014-07-01

    Full Text Available Individuals with Autism Spectrum Disorder (ASD show deficits in social and communicative skills, including imitation, empathy, and shared attention, as well as restricted interests and repetitive patterns of behaviors. Evidence for and against the idea that dysfunctions in the mirror neuron system are involved in imitation and could be one underlying cause for ASD is discussed in this review. Neurofeedback interventions have reduced symptoms in children with ASD by self-regulation of brain rhythms. However, cortical deficiencies are not the only cause of these symptoms. Peripheral physiological activity, such as the heart rate, is closely linked to neurophysiological signals and associated with social engagement. Therefore, a combined approach targeting the interplay between brain, body and behavior could be more effective. Brain-computer interface applications for combined neurofeedback and biofeedback treatment for children with ASD are currently nonexistent. To facilitate their use, we have designed an innovative game that includes social interactions and provides neural- and body-based feedback that corresponds directly to the underlying significance of the trained signals as well as to the behavior that is reinforced.

  4. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on the identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. However, this study focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a method combining the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which is characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-based approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and within the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined template-set-centric approach proposed in this study.
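
    The template-matching step can be illustrated with a brute-force normalized cross-correlation search over a slice; the random image and the template cut from it are stand-ins for the study's contour-pattern templates.

        import numpy as np

        def ncc_match(image, template):
            """Brute-force normalized cross-correlation; returns the (row, col)
            of the best match and its score. Fine for small templates."""
            th, tw = template.shape
            t = template - template.mean()
            t_norm = np.sqrt((t * t).sum())
            best, best_pos = -2.0, (0, 0)
            for r in range(image.shape[0] - th + 1):
                for c in range(image.shape[1] - tw + 1):
                    w = image[r:r + th, c:c + tw]
                    wz = w - w.mean()
                    denom = np.sqrt((wz * wz).sum()) * t_norm
                    score = (wz * t).sum() / denom if denom > 0 else 0.0
                    if score > best:
                        best, best_pos = score, (r, c)
            return best_pos, best

        rng = np.random.default_rng(1)
        slice_ = rng.random((64, 64))
        landmark = slice_[20:28, 30:38]       # template cut from the same slice
        print(ncc_match(slice_, landmark))    # -> ((20, 30), 1.0)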

  5. Combination of brain-computer interface training and goal-directed physical therapy in chronic stroke: a case report.

    Science.gov (United States)

    Broetz, Doris; Braun, Christoph; Weber, Cornelia; Soekadar, Surjo R; Caria, Andrea; Birbaumer, Niels

    2010-09-01

    There is no accepted and efficient rehabilitation strategy to reduce focal impairments for patients with chronic stroke who lack residual movements. A 67-year-old hemiplegic patient with no active finger extension was trained with a brain-computer interface (BCI) combined with a specific daily-life-oriented physiotherapy. The BCI used electrical brain activity (EEG) and magnetic brain activity (MEG) to drive an orthosis and a robot affixed to the patient's affected upper extremity, which enabled him to move the paralyzed arm and hand driven by voluntary modulation of μ-rhythm activity. In addition, the patient practiced goal-directed physiotherapy training. Over 1 year, he completed 3 training blocks. Arm motor function, gait capacities (using the Fugl-Meyer Assessment, Wolf Motor Function Test, Modified Ashworth Scale, 10-m walk speed, and goal attainment score), and brain reorganization (functional MRI, MEG) were repeatedly assessed. The ability to perform hand and arm movements, as well as the speed and safety of gait, improved significantly (mean 46.6%). Improvement of motor function was associated with increased μ-oscillations in the ipsilesional motor cortex. This proof-of-principle study suggests that the combination of BCI training with goal-directed, active physical therapy may improve the motor abilities of chronic stroke patients despite apparent initial paralysis.

  6. Brain-computer interface game applications for combined neurofeedback and biofeedback treatment for children on the autism spectrum.

    Science.gov (United States)

    Friedrich, Elisabeth V C; Suttie, Neil; Sivanathan, Aparajithan; Lim, Theodore; Louchart, Sandy; Pineda, Jaime A

    2014-01-01

    Individuals with autism spectrum disorder (ASD) show deficits in social and communicative skills, including imitation, empathy, and shared attention, as well as restricted interests and repetitive patterns of behaviors. Evidence for and against the idea that dysfunctions in the mirror neuron system are involved in imitation and could be one underlying cause for ASD is discussed in this review. Neurofeedback interventions have reduced symptoms in children with ASD by self-regulation of brain rhythms. However, cortical deficiencies are not the only cause of these symptoms. Peripheral physiological activity, such as the heart rate and its variability, is closely linked to neurophysiological signals and associated with social engagement. Therefore, a combined approach targeting the interplay between brain, body, and behavior could be more effective. Brain-computer interface applications for combined neurofeedback and biofeedback treatment for children with ASD are currently nonexistent. To facilitate their use, we have designed an innovative game that includes social interactions and provides neural- and body-based feedback that corresponds directly to the underlying significance of the trained signals as well as to the behavior that is reinforced.

  7. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    Science.gov (United States)

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severely burnt patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case
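
    The Pareto machinery underlying step (1) rests on a dominance test over the two competing objectives, classification error and number of rules, both minimized. A minimal sketch with made-up candidate classifiers:

        def dominates(a, b):
            """a dominates b if it is no worse in every objective and strictly
            better in at least one (both objectives minimized here)."""
            return (all(x <= y for x, y in zip(a, b))
                    and any(x < y for x, y in zip(a, b)))

        def pareto_front(points):
            """Keep only the non-dominated candidates."""
            return [p for p in points
                    if not any(dominates(q, p) for q in points if q != p)]

        # (error rate, number of fuzzy rules) for candidate classifiers
        candidates = [(0.10, 9), (0.12, 5), (0.18, 3), (0.10, 12), (0.25, 3)]
        print(pareto_front(candidates))   # -> [(0.10, 9), (0.12, 5), (0.18, 3)]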

  8. Evolutionary trends in Heteroptera

    NARCIS (Netherlands)

    Cobben, R.H.

    1968-01-01

    1. This work, the first volume of a series dealing with evolutionary trends in Heteroptera, is concerned with the egg system of about 400 species. The data are presented systematically in chapters 1 and 2 with a critical review of the literature after each family.

    2. Chapter 3 evaluates facts

  9. Evolutionary mysteries in meiosis

    NARCIS (Netherlands)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E.; Wijnker, Erik; Haag, Christoph R.

    2016-01-01

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these

  10. Evolutionary perspectives on ageing.

    Science.gov (United States)

    Reichard, Martin

    2017-10-01

    From an evolutionary perspective, ageing is a decrease in fitness with chronological age - expressed by an increase in mortality risk and/or decline in reproductive success and mediated by deterioration of functional performance. While this makes ageing intuitively paradoxical - detrimental to individual fitness - evolutionary theory offers answers as to why ageing has evolved. In this review, I first briefly examine the classic evolutionary theories of ageing and their empirical tests, and highlight recent findings that have advanced our understanding of the evolution of ageing (condition-dependent survival, positive pleiotropy). I then provide an overview of recent theoretical extensions and modifications that accommodate those new discoveries. I discuss the role of indeterminate (asymptotic) growth for lifetime increases in fecundity and ageing trajectories. I outline alternative views that challenge a universal existence of senescence - namely the lack of a germ-soma distinction and the ability of tissue replacement and retrogression to younger developmental stages in modular organisms. I argue that rejuvenation at the organismal level is plausible, but includes a return to a simple developmental stage. This may exempt a particular genotype from somatic defects but, correspondingly, removes any information acquired during development. A resolution of the question of whether a rejuvenated individual is the same entity is central to the recognition of whether current evolutionary theories of ageing, with their extensions and modifications, can explain the patterns of ageing across the Tree of Life. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Editorial overview: Evolutionary psychology

    NARCIS (Netherlands)

    Gangestad, S.W.; Tybur, J.M.

    2016-01-01

    Functional approaches in psychology - which ask what behavior is good for - are almost as old as scientific psychology itself. Yet sophisticated, generative functional theories were not possible until developments in evolutionary biology in the mid-20th century. Arising in the last three decades,

  12. Biochemistry and evolutionary biology

    Indian Academy of Sciences (India)

    Biochemical information has been crucial for the development of evolutionary biology. On the one hand, the sequence information now appearing is producing a huge increase in the amount of data available for phylogenetic analysis; on the other hand, and perhaps more fundamentally, it allows understanding of the ...

  13. Evolutionary Biology Today

    Indian Academy of Sciences (India)

    Hindi and English. Part 1. Resonance, Vol. 7 ... they use. Of course, many evolutionary biologists do work with fossils or DNA, or both, but there are also large numbers of ... The first major division that I like to make is between studies focussed ...

  14. Learning: An Evolutionary Analysis

    Science.gov (United States)

    Swann, Joanna

    2009-01-01

    This paper draws on the philosophy of Karl Popper to present a descriptive evolutionary epistemology that offers philosophical solutions to the following related problems: "What happens when learning takes place?" and "What happens in human learning?" It provides a detailed analysis of how learning takes place without any direct transfer of…

  15. Complex systems, evolutionary planning?

    NARCIS (Netherlands)

    Bertolini, L.; de Roo, G.; Silva, E.A.

    2010-01-01

    Coping with uncertainty is a defining challenge for spatial planners. Accordingly, most spatial planning theories and methods are aimed at reducing uncertainty. However, the question is what should be done when this seems impossible? This chapter proposes an evolutionary interpretation of spatial

  16. Molluscan Evolutionary Development

    DEFF Research Database (Denmark)

    Wanninger, Andreas Wilhelm Georg; Koop, Damien; Moshel-Lynch, Sharon

    2008-01-01

    Brought together by Winston F. Ponder and David R. Lindberg, thirty-six experts on the evolution of the Mollusca provide an up-to-date review of its evolutionary history. The Mollusca are the second largest animal phylum and boast a fossil record of over 540 million years. They exhibit remarkable...

  17. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks; Diseno de una herramienta de computo para la espectrometria y dosimetria de neutrones por medio de redes neuronales evolutivas

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz R, J. M.; Vega C, H. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Ingenieria Electrica, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Martinez B, M. R. [Universidad Autonoma de Zacatecas, Unidad Academica de Estudios Nucleares, Av. Ramon Lopez Velarde No. 801, Col. Centro, Zacatecas (Mexico); Gallego, E. [Universidad Politecnica de Madrid, Departamento de Ingenieria Nuclear, Jose Gutierrez Abascal No. 2, E-28006 Madrid (Spain)], e-mail: morvymm@yahoo.com.mx

    2009-10-15

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique and highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometer, which continues to be one of the most commonly used spectrometers. This system has disadvantages such as the weight of its components, the low resolution of the spectrum, and a long and drawn-out procedure for spectrum reconstruction, which requires an expert user in system management and the use of a reconstruction code such as BUNKIE, SAND, etc. These codes are based on an iterative reconstruction algorithm whose greatest inconvenience is that the user must provide the system with an initial spectrum as close as possible to the spectrum to be obtained. Consequently, researchers have mentioned the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence have been reported, such as genetic algorithms, artificial neural networks and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted in such a way as to solve these disadvantages. Because they are emerging technologies, there are no tools for the analysis of results, so in this paper we first present the design of a computation tool that allows the analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive graphical user environment that is easy to operate. The speed of program operation is high, executing the analysis in a few seconds, so it may store and/or print the obtained information for

  18. Evolutionary approaches to autism: an overview and integration

    NARCIS (Netherlands)

    Ploeger, A.; Galis, F.

    2011-01-01

    Autism is a highly heritable neurodevelopmental disorder, which greatly reduces reproductive success. The combination of high heritability and low reproductive success raises an evolutionary question: why was autism not eliminated by natural selection? We review different perspectives on the

  19. Evolutionary heritage influences Amazon tree ecology

    Science.gov (United States)

    Coelho de Souza, Fernanda; Dexter, Kyle G.; Phillips, Oliver L.; Brienen, Roel J. W.; Chave, Jerome; Galbraith, David R.; Lopez Gonzalez, Gabriela; Monteagudo Mendoza, Abel; Pennington, R. Toby; Poorter, Lourens; Alexiades, Miguel; Álvarez-Dávila, Esteban; Andrade, Ana; Aragão, Luis E. O. C.; Araujo-Murakami, Alejandro; Arets, Eric J. M. M.; Aymard C, Gerardo A.; Baraloto, Christopher; Barroso, Jorcely G.; Bonal, Damien; Boot, Rene G. A.; Camargo, José L. C.; Comiskey, James A.; Valverde, Fernando Cornejo; de Camargo, Plínio B.; Di Fiore, Anthony; Erwin, Terry L.; Feldpausch, Ted R.; Ferreira, Leandro; Fyllas, Nikolaos M.; Gloor, Emanuel; Herault, Bruno; Herrera, Rafael; Higuchi, Niro; Honorio Coronado, Eurídice N.; Killeen, Timothy J.; Laurance, William F.; Laurance, Susan; Lloyd, Jon; Lovejoy, Thomas E.; Malhi, Yadvinder; Maracahipes, Leandro; Marimon, Beatriz S.; Marimon-Junior, Ben H.; Mendoza, Casimiro; Morandi, Paulo; Neill, David A.; Vargas, Percy Núñez; Oliveira, Edmar A.; Lenza, Eddie; Palacios, Walter A.; Peñuela-Mora, Maria C.; Pipoly, John J.; Pitman, Nigel C. A.; Prieto, Adriana; Quesada, Carlos A.; Ramirez-Angulo, Hirma; Rudas, Agustin; Ruokolainen, Kalle; Salomão, Rafael P.; Silveira, Marcos; ter Steege, Hans; Thomas-Caesar, Raquel; van der Hout, Peter; van der Heijden, Geertje M. F.; van der Meer, Peter J.; Vasquez, Rodolfo V.; Vieira, Simone A.; Vilanova, Emilio; Vos, Vincent A.; Wang, Ophelia; Young, Kenneth R.; Zagt, Roderick J.; Baker, Timothy R.

    2016-01-01

    Lineages tend to retain ecological characteristics of their ancestors through time. However, for some traits, selection during evolutionary history may have also played a role in determining trait values. To address the relative importance of these processes requires large-scale quantification of traits and evolutionary relationships among species. The Amazonian tree flora comprises a high diversity of angiosperm lineages and species with widely differing life-history characteristics, providing an excellent system to investigate the combined influences of evolutionary heritage and selection in determining trait variation. We used trait data related to the major axes of life-history variation among tropical trees (e.g. growth and mortality rates) from 577 inventory plots in closed-canopy forest, mapped onto a phylogenetic hypothesis spanning more than 300 genera including all major angiosperm clades to test for evolutionary constraints on traits. We found significant phylogenetic signal (PS) for all traits, consistent with evolutionarily related genera having more similar characteristics than expected by chance. Although there is also evidence for repeated evolution of pioneer and shade tolerant life-history strategies within independent lineages, the existence of significant PS allows clearer predictions of the links between evolutionary diversity, ecosystem function and the response of tropical forests to global change. PMID:27974517

  20. Evolutionary heritage influences Amazon tree ecology.

    Science.gov (United States)

    Coelho de Souza, Fernanda; Dexter, Kyle G; Phillips, Oliver L; Brienen, Roel J W; Chave, Jerome; Galbraith, David R; Lopez Gonzalez, Gabriela; Monteagudo Mendoza, Abel; Pennington, R Toby; Poorter, Lourens; Alexiades, Miguel; Álvarez-Dávila, Esteban; Andrade, Ana; Aragão, Luis E O C; Araujo-Murakami, Alejandro; Arets, Eric J M M; Aymard C, Gerardo A; Baraloto, Christopher; Barroso, Jorcely G; Bonal, Damien; Boot, Rene G A; Camargo, José L C; Comiskey, James A; Valverde, Fernando Cornejo; de Camargo, Plínio B; Di Fiore, Anthony; Elias, Fernando; Erwin, Terry L; Feldpausch, Ted R; Ferreira, Leandro; Fyllas, Nikolaos M; Gloor, Emanuel; Herault, Bruno; Herrera, Rafael; Higuchi, Niro; Honorio Coronado, Eurídice N; Killeen, Timothy J; Laurance, William F; Laurance, Susan; Lloyd, Jon; Lovejoy, Thomas E; Malhi, Yadvinder; Maracahipes, Leandro; Marimon, Beatriz S; Marimon-Junior, Ben H; Mendoza, Casimiro; Morandi, Paulo; Neill, David A; Vargas, Percy Núñez; Oliveira, Edmar A; Lenza, Eddie; Palacios, Walter A; Peñuela-Mora, Maria C; Pipoly, John J; Pitman, Nigel C A; Prieto, Adriana; Quesada, Carlos A; Ramirez-Angulo, Hirma; Rudas, Agustin; Ruokolainen, Kalle; Salomão, Rafael P; Silveira, Marcos; Stropp, Juliana; Ter Steege, Hans; Thomas-Caesar, Raquel; van der Hout, Peter; van der Heijden, Geertje M F; van der Meer, Peter J; Vasquez, Rodolfo V; Vieira, Simone A; Vilanova, Emilio; Vos, Vincent A; Wang, Ophelia; Young, Kenneth R; Zagt, Roderick J; Baker, Timothy R

    2016-12-14

    Lineages tend to retain ecological characteristics of their ancestors through time. However, for some traits, selection during evolutionary history may have also played a role in determining trait values. To address the relative importance of these processes requires large-scale quantification of traits and evolutionary relationships among species. The Amazonian tree flora comprises a high diversity of angiosperm lineages and species with widely differing life-history characteristics, providing an excellent system to investigate the combined influences of evolutionary heritage and selection in determining trait variation. We used trait data related to the major axes of life-history variation among tropical trees (e.g. growth and mortality rates) from 577 inventory plots in closed-canopy forest, mapped onto a phylogenetic hypothesis spanning more than 300 genera including all major angiosperm clades to test for evolutionary constraints on traits. We found significant phylogenetic signal (PS) for all traits, consistent with evolutionarily related genera having more similar characteristics than expected by chance. Although there is also evidence for repeated evolution of pioneer and shade tolerant life-history strategies within independent lineages, the existence of significant PS allows clearer predictions of the links between evolutionary diversity, ecosystem function and the response of tropical forests to global change. © 2016 The Authors.

  1. Context dependent DNA evolutionary models

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    This paper is about stochastic models for the evolution of DNA. For a set of aligned DNA sequences, connected in a phylogenetic tree, the models should be able to explain - in probabilistic terms - the differences seen in the sequences. From the estimates of the parameters in the model one can...... start to make biologically interpretations and conclusions concerning the evolutionary forces at work. In parallel with the increase in computing power, models have become more complex. Starting with Markov processes on a space with 4 states, and extended to Markov processes with 64 states, we are today...... studying models on spaces with 4n (or 64n) number of states with n well above one hundred, say. For such models it is no longer possible to calculate the transition probability analytically, and often Markov chain Monte Carlo is used in connection with likelihood analysis. This is also the approach taken...
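    As a rough illustration of the 4-state Markov models mentioned above, the sketch below computes nucleotide transition probabilities as P(t) = exp(Qt) for the Jukes-Cantor rate matrix; the choice of model, the rate mu and the time t are illustrative assumptions, not details taken from this record.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Jukes-Cantor rate matrix Q on the 4 nucleotide states (A, C, G, T):
    # all substitutions occur at the same (assumed) rate mu; rows sum to zero.
    mu = 0.1
    Q = mu * (np.ones((4, 4)) - 4.0 * np.eye(4))

    # Transition probabilities after evolutionary time t: P(t) = expm(Q t).
    t = 2.0
    P = expm(Q * t)

    print(np.round(P[0], 4))    # P(A -> A), P(A -> C), P(A -> G), P(A -> T)
    print(P.sum(axis=1))        # each row sums to 1: a valid stochastic matrix
    ```

    For larger state spaces (64 codon states, or the 4^n spaces mentioned above) this direct matrix exponential becomes infeasible, which is why the record points to Markov chain Monte Carlo for likelihood analysis.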

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. Historical change and evolutionary theory.

    Science.gov (United States)

    Masters, Roger D

    2007-09-01

    Despite advances in fields like genetics, evolutionary psychology, and human behavior and evolution--which generally focus on individual or small group behavior from a biological perspective--evolutionary biology has made little impact on studies of political change and social history. Theories of natural selection often seem inapplicable to human history because our social behavior is embedded in language (which makes possible the concepts of time and social identity on which what we call "history" depends). Peter Corning's Holistic Darwinism reconceptualizes evolutionary biology, making it possible to go beyond the barriers separating the social and natural sciences. Corning focuses on two primary processes: "synergy" (complex multivariate interactions at multiple levels between a species and its environment) and "cybernetics" (the information systems permitting communication between individuals and groups over time). Combining this frame of reference with inclusive fitness theory, it is possible to answer the most important (and puzzling) question in human history: How did a species that lived for millennia in hunter-gatherer bands form centralized states governing large populations of non-kin (including multi-ethnic empires as well as modern nation-states)? The fragility and contemporary ethnic violence in Kenya and the Congo should suffice as evidence that these issues need to be taken seriously. To explain the rise and fall of states as well as changes in human laws and customs--the core of historical research--it is essential to show how the provision of collective goods can overcome the challenge of self-interest and free-riding in some instances, yet fail to do so in others. To this end, it is now possible to consider how a state providing public goods can--under circumstances that often include effective leadership--contribute to enhanced inclusive fitness of virtually all its members. Because social behavior needs to adapt to ecology, but ecological ...

  4. MEGA5: Molecular Evolutionary Genetics Analysis Using Maximum Likelihood, Evolutionary Distance, and Maximum Parsimony Methods

    Science.gov (United States)

    Tamura, Koichiro; Peterson, Daniel; Peterson, Nicholas; Stecher, Glen; Nei, Masatoshi; Kumar, Sudhir

    2011-01-01

    Comparative analysis of molecular sequence data is essential for reconstructing the evolutionary histories of species and inferring the nature and extent of selective forces shaping the evolution of genes and species. Here, we announce the release of Molecular Evolutionary Genetics Analysis version 5 (MEGA5), which is a user-friendly software for mining online databases, building sequence alignments and phylogenetic trees, and using methods of evolutionary bioinformatics in basic biology, biomedicine, and evolution. The newest addition in MEGA5 is a collection of maximum likelihood (ML) analyses for inferring evolutionary trees, selecting best-fit substitution models (nucleotide or amino acid), inferring ancestral states and sequences (along with probabilities), and estimating evolutionary rates site-by-site. In computer simulation analyses, ML tree inference algorithms in MEGA5 compared favorably with other software packages in terms of computational efficiency and the accuracy of the estimates of phylogenetic trees, substitution parameters, and rate variation among sites. The MEGA user interface has now been enhanced to be activity driven to make it easier for the use of both beginners and experienced scientists. This version of MEGA is intended for the Windows platform, and it has been configured for effective use on Mac OS X and Linux desktops. It is available free of charge from http://www.megasoftware.net. PMID:21546353

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  6. Optimisation of the energy efficiency of bread-baking ovens using a combined experimental and computational approach

    International Nuclear Information System (INIS)

    Khatir, Zinedine; Paton, Joe; Thompson, Harvey; Kapur, Nik; Toropov, Vassili

    2013-01-01

    Highlights: ► A scientific framework for optimising oven operating conditions is presented. ► Experiments measuring local convective heat transfer coefficient are undertaken. ► An energy efficiency model is developed with experimentally calibrated CFD analysis. ► Designing ovens with optimum heat transfer coefficients reduces energy use. ► Results demonstrate a strong case to design and manufacture energy optimised ovens. - Abstract: Changing legislation and rising energy costs are bringing the need for efficient baking processes into much sharper focus. High-speed air impingement bread-baking ovens are complex systems using air flow to transfer heat to the product. In this paper, computational fluid dynamics (CFD) is combined with experimental analysis to develop a rigorous scientific framework for the rapid generation of forced convection oven designs. A design parameterisation of a three-dimensional generic oven model is carried out for a wide range of oven sizes and flow conditions to optimise desirable features such as temperature uniformity throughout the oven, energy efficiency and manufacturability. Coupled with the computational model, a series of experiments measuring the local convective heat transfer coefficient (h_c) is undertaken. The facility used for the heat transfer experiments is representative of a scaled-down production oven where the air temperature and velocity as well as important physical constraints such as nozzle dimensions and nozzle-to-surface distance can be varied. An efficient energy model is developed using a CFD analysis calibrated with experimentally determined inputs. Results from a range of oven designs are presented together with ensuing energy usage and savings

  7. Corrosion chemistry closing comments: opportunities in corrosion science facilitated by operando experimental characterization combined with multi-scale computational modelling.

    Science.gov (United States)

    Scully, John R

    2015-01-01

    Recent advances in characterization tools, computational capabilities, and theories have created opportunities for advancement in understanding of solid-fluid interfaces at the nanoscale in corroding metallic systems. The Faraday Discussion on Corrosion Chemistry in 2015 highlighted some of the current needs, gaps and opportunities in corrosion science. Themes were organized into several hierarchical categories that provide an organizational framework for corrosion. Opportunities to develop fundamental physical and chemical data which will enable further progress in thermodynamic and kinetic modelling of corrosion were discussed. These will enable new and better understanding of unit processes that govern corrosion at the nanoscale. Additional topics discussed included scales, films and oxides, fluid-surface and molecular-surface interactions, selected topics in corrosion science and engineering as well as corrosion control. Corrosion science and engineering topics included complex alloy dissolution, local corrosion, and modelling of specific corrosion processes that are made up of collections of temporally and spatially varying unit processes such as oxidation, ion transport, and competitive adsorption. Corrosion control and mitigation topics covered some new insights on coatings and inhibitors. Further advances in operando or in situ experimental characterization strategies at the nanoscale combined with computational modelling will enhance progress in the field, especially if coupling across length and time scales can be achieved incorporating the various phenomena encountered in corrosion. Readers are encouraged to not only to use this ad hoc organizational scheme to guide their immersion into the current opportunities in corrosion chemistry, but also to find value in the information presented in their own ways.

  8. Multi-step-prediction of chaotic time series based on co-evolutionary recurrent neural network

    International Nuclear Information System (INIS)

    Ma Qianli; Zheng Qilun; Peng Hong; Qin Jiangwei; Zhong Tanwei

    2008-01-01

    This paper proposes a co-evolutionary recurrent neural network (CERNN) for the multi-step-prediction of chaotic time series. It estimates the proper parameters of phase space reconstruction and optimizes the structure of recurrent neural networks by a co-evolutionary strategy. The search space is separated into two subspaces and the individuals are trained in a parallel computational procedure. The approach dynamically combines the embedding method with the capability of recurrent neural networks to incorporate past experience through internal recurrence. The effectiveness of CERNN is evaluated using three benchmark chaotic time series data sets: the Lorenz series, the Mackey-Glass series and the real-world sunspot series. The simulation results show that CERNN improves the performance of multi-step-prediction of chaotic time series
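    The record does not give the CERNN internals, but the phase-space reconstruction step it refers to is classically done by time-delay embedding. A minimal sketch, assuming an embedding dimension m and delay tau of the kind a co-evolutionary search would tune:

    ```python
    import numpy as np

    def delay_embed(x, m, tau):
        """Time-delay embedding: row j is [x[j], x[j+tau], ..., x[j+(m-1)*tau]]."""
        n = len(x) - (m - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

    # Stand-in signal; in the paper's setting this would be e.g. the Mackey-Glass series.
    x = np.sin(0.05 * np.arange(1000)) + 0.1 * np.random.randn(1000)
    X = delay_embed(x, m=4, tau=10)   # one (m, tau) candidate a co-evolutionary search would rate
    print(X.shape)                    # (970, 4): reconstructed phase-space vectors
    ```

    Each embedded row then serves as one input vector to the recurrent predictor; the co-evolutionary part of the paper searches over (m, tau) and the network structure jointly.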

  9. Flow velocity-driven differentiation of human mesenchymal stromal cells in silk fibroin scaffolds: A combined experimental and computational approach.

    Directory of Open Access Journals (Sweden)

    Jolanda Rita Vetsch

    Full Text Available Mechanical loading plays a major role in bone remodeling and fracture healing. Mimicking the concept of mechanical loading of bone has been widely studied in bone tissue engineering by perfusion cultures. Nevertheless, there is still debate regarding the in-vitro mechanical stimulation regime. This study aims at investigating the effect of two different flow rates (v_low = 0.001 m/s and v_high = 0.061 m/s) on the growth of mineralized tissue produced by human mesenchymal stromal cells cultured on 3-D silk fibroin scaffolds. The flow rates applied were chosen to mimic the mechanical environment during early fracture healing or during bone remodeling, respectively. Scaffolds cultured under static conditions served as a control. Time-lapsed micro-computed tomography showed that mineralized extracellular matrix formation was completely inhibited at v_low compared to v_high and the static group. Biochemical assays and histology confirmed these results and showed enhanced osteogenic differentiation at v_high whereas the amount of DNA was increased at v_low. The biological response at v_low might correspond to the early stage of fracture healing, where cell proliferation and matrix production is prominent. Visual mapping of shear stresses, simulated by computational fluid dynamics, to 3-D micro-computed tomography data revealed that shear stresses up to 0.39 mPa induced a higher DNA amount and shear stresses between 0.55 mPa and 24 mPa induced osteogenic differentiation. This study demonstrates the feasibility to drive cell behavior of human mesenchymal stromal cells by the flow velocity applied, in agreement with mechanical loading mimicking early fracture healing (v_low) or bone remodeling (v_high). These results can be used in the future to tightly control the behavior of human mesenchymal stromal cells towards proliferation or differentiation. Additionally, the combination of experiment and simulation presented is a strong tool to link biological responses to ...

  10. Evolutionary constrained optimization

    CERN Document Server

    Deb, Kalyanmoy

    2015-01-01

    This book makes available a self-contained collection of modern research addressing general constrained optimization problems using evolutionary algorithms. Broadly, the topics covered include constraint handling for single and multi-objective optimizations; penalty function based methodology; multi-objective based methodology; new constraint handling mechanisms; hybrid methodology; scaling issues in constrained optimization; design of scalable test problems; parameter adaptation in constrained optimization; handling of integer, discrete and mixed variables in addition to continuous variables; application of constraint handling techniques to real-world problems; and constrained optimization in dynamic environments. There is also a separate chapter on hybrid optimization, which is gaining a lot of popularity nowadays due to its capability of bridging the gap between evolutionary and classical optimization. The material in the book is useful to researchers, novices, and experts alike. The book will also be useful...

  11. Lengths of Orthologous Prokaryotic Proteins Are Affected by Evolutionary Factors

    Directory of Open Access Journals (Sweden)

    Tatiana Tatarinova

    2015-01-01

    Full Text Available Proteins of the same functional family (for example, kinases) may have significantly different lengths. It is an open question whether such variation in length is random or it appears as a response to some unknown evolutionary driving factors. The main purpose of this paper is to demonstrate existence of factors affecting prokaryotic gene lengths. We believe that the ranking of genomes according to lengths of their genes, followed by the calculation of coefficients of association between genome rank and genome property, is a reasonable approach in revealing such evolutionary driving factors. As we demonstrated earlier, our chosen approach, Bubble-sort, combines stability, accuracy, and computational efficiency as compared to other ranking methods. Application of Bubble Sort to the set of 1390 prokaryotic genomes confirmed that genes of Archaeal species are generally shorter than Bacterial ones. We observed that gene lengths are affected by various factors: within each domain, different phyla have preferences for short or long genes; thermophiles tend to have shorter genes than the soil-dwellers; halophiles tend to have longer genes. We also found that species with overrepresentation of cytosines and guanines in the third position of the codon (GC3 content) tend to have longer genes than species with low GC3 content.

  12. Lengths of Orthologous Prokaryotic Proteins Are Affected by Evolutionary Factors.

    Science.gov (United States)

    Tatarinova, Tatiana; Salih, Bilal; Dien Bard, Jennifer; Cohen, Irit; Bolshoy, Alexander

    2015-01-01

    Proteins of the same functional family (for example, kinases) may have significantly different lengths. It is an open question whether such variation in length is random or it appears as a response to some unknown evolutionary driving factors. The main purpose of this paper is to demonstrate existence of factors affecting prokaryotic gene lengths. We believe that the ranking of genomes according to lengths of their genes, followed by the calculation of coefficients of association between genome rank and genome property, is a reasonable approach in revealing such evolutionary driving factors. As we demonstrated earlier, our chosen approach, Bubble-sort, combines stability, accuracy, and computational efficiency as compared to other ranking methods. Application of Bubble Sort to the set of 1390 prokaryotic genomes confirmed that genes of Archaeal species are generally shorter than Bacterial ones. We observed that gene lengths are affected by various factors: within each domain, different phyla have preferences for short or long genes; thermophiles tend to have shorter genes than the soil-dwellers; halophiles tend to have longer genes. We also found that species with overrepresentation of cytosines and guanines in the third position of the codon (GC3 content) tend to have longer genes than species with low GC3 content.
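    As a loose illustration of associating a genome's rank by gene length with a genome property such as GC3 content, the toy sketch below uses Kendall's tau as the coefficient of association; the paper's Bubble-sort ranking procedure is its own specific method, so the numbers and the choice of coefficient here are assumptions for demonstration only.

    ```python
    import numpy as np
    from scipy.stats import kendalltau

    # Toy data: per-genome median gene length (bp) and GC3 content (fraction).
    median_gene_len = np.array([820, 945, 760, 1010, 890, 1120])
    gc3_content     = np.array([0.41, 0.55, 0.38, 0.62, 0.47, 0.68])

    # Rank genomes by gene length, then measure association with the property.
    rank_by_length = np.argsort(np.argsort(median_gene_len))
    tau, p = kendalltau(rank_by_length, gc3_content)
    print(f"Kendall tau = {tau:.2f}, p = {p:.3f}")
    ```

    A positive tau on real data would mirror the paper's finding that high-GC3 genomes tend to have longer genes.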

  13. Clinical application of low-dose CT combined with computer-aided detection in lung cancer screening

    International Nuclear Information System (INIS)

    Xu Zushan; Hou Hongjun; Xu Yan; Ma Daqing

    2010-01-01

    Objective: To investigate the clinical value of chest low-dose CT (LDCT) combined with a computer-aided detection (CAD) system for lung cancer screening in a high-risk population. Methods: Two hundred and nineteen healthy candidates underwent 64-slice LDCT scans. All images were reviewed in consensus by two radiologists with 15 years of thoracic CT diagnosis experience. Then the image data were analyzed with CAD alone. Finally, images were reviewed by two radiologists with 5 years of CT diagnosis experience, with and without the CT Viewer software. The sensitivity and false-positive rate of CAD for pulmonary nodule detection were calculated. SPSS 11.5 software and the Chi-square test were used for the statistics. Results: Of 219 candidates, 104 (47.5%) were detected with lung nodules. There were 366 true nodules confirmed by the senior radiologists. The CAD system detected 271 (74.0%) true nodules and 424 false-positive nodules. The false-positive rate was 1.94 per case. The two junior radiologists identified 292 (79.8%) and 286 (78.1%) nodules without CAD, and 336 (91.8%) and 333 (91.0%) nodules with CAD, respectively. There were significant differences for radiologists in identifying nodules with or without the CAD system (P<0.01). Conclusions: CAD is more sensitive than radiologists for identifying nodules in the central area or in the hilar region of the lung, while radiologists are more sensitive for peripheral and sub-pleural nodules, ground-glass opacity nodules, and nodules smaller than 4 mm. CAD cannot be used alone. The detection rate can be improved by combining radiologists and CAD in LDCT screening. (authors)

  14. Rapid estimation of the vertebral body volume: a combination of the Cavalieri principle and computed tomography images

    International Nuclear Information System (INIS)

    Odaci, Ersan; Sahin, Buenyamin; Sonmez, Osman Fikret; Kaplan, Sueleyman; Bas, Orhan; Bilgic, Sait; Bek, Yueksel; Erguer, Hayati

    2003-01-01

    Objective: The exact volume of the vertebral body is necessary for the evaluation, treatment and surgical management of the related vertebral body, allowing volume changes to be monitored in conditions such as infectious diseases of the vertebra and traumatic or non-traumatic fractures and deformities of the spine. Several studies have assessed vertebral body size based on different criteria of the spine using different techniques. However, we have not found any detailed study in the literature describing the combination of the Cavalieri principle and vertebral body volume estimation. Materials and methods: In the present study we describe a rapid, simple, accurate and practical technique for estimating the volume of the vertebral body. Two specimens including ten lumbar vertebrae were taken from cadavers and scanned in axial, sagittal and coronal section planes by a computed tomography (CT) machine. The consecutive sections, at 5 and 3 mm thicknesses, were used to estimate the total volume of the vertebral bodies by means of the Cavalieri principle. Furthermore, to evaluate inter-observer differences the volume estimations were carried out by three performers. Results: There were no significant differences between the performers' estimates and the real volumes of the vertebral bodies (P>0.05), nor between the performers' volume estimates (P>0.05). The section thickness and the section planes did not affect the accuracy of the estimates (P>0.05). A high correlation was seen between the estimates of the performers and the real volumes of the vertebral bodies (r=0.881). Conclusion: We conclude that the combination of CT scanning with the Cavalieri principle is a direct and accurate technique that can be safely applied to estimate the volume of the vertebral body, with a mean workload of 5 min 11 s per vertebra.
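    The Cavalieri estimator itself is simple: the total volume is the section thickness multiplied by the sum of the cross-sectional areas measured on consecutive slices. A minimal sketch with invented areas:

    ```python
    # Cavalieri estimator: volume = section thickness x sum of section areas.
    # Toy cross-sectional areas (cm^2) traced on consecutive 5 mm CT slices.
    section_areas_cm2 = [10.2, 11.0, 11.5, 11.3, 10.8, 9.9]
    thickness_cm = 0.5

    volume_cm3 = thickness_cm * sum(section_areas_cm2)
    print(f"Estimated vertebral body volume: {volume_cm3:.1f} cm^3")
    ```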

  15. Evolutionary games on graphs

    Science.gov (United States)

    Szabó, György; Fáth, Gábor

    2007-07-01

    Game theory is one of the key paradigms behind many scientific disciplines from biology to behavioral sciences to economics. In its evolutionary form and especially when the interacting agents are linked in a specific social network the underlying solution concepts and methods are very similar to those applied in non-equilibrium statistical physics. This review gives a tutorial-type overview of the field for physicists. The first four sections introduce the necessary background in classical and evolutionary game theory from the basic definitions to the most important results. The fifth section surveys the topological complications implied by non-mean-field-type social network structures in general. The next three sections discuss in detail the dynamic behavior of three prominent classes of models: the Prisoner's Dilemma, the Rock-Scissors-Paper game, and Competing Associations. The major theme of the review is in what sense and how the graph structure of interactions can modify and enrich the picture of long term behavioral patterns emerging in evolutionary games.
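    As a flavour of the models the review surveys, the sketch below runs a Nowak-May-style spatial Prisoner's Dilemma on a square lattice with synchronous best-neighbour imitation; the lattice size, temptation parameter and update rule are illustrative choices, not specifics from the review.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    L, b = 50, 1.6                       # lattice size; temptation to defect (weak PD)
    S = rng.integers(0, 2, (L, L))       # 1 = cooperate, 0 = defect
    SHIFTS = ((1, 0), (-1, 0), (0, 1), (0, -1))

    def payoffs(S):
        """Accumulated payoff vs the four nearest neighbours (R=1, P=S=0, T=b)."""
        total = np.zeros_like(S, dtype=float)
        for sh in SHIFTS:
            N = np.roll(S, sh, axis=(0, 1))
            total += np.where(S == 1, 1.0 * N, b * N)  # C earns 1 vs C; D earns b vs C
        return total

    for _ in range(100):                 # synchronous imitation of the best neighbour
        P = payoffs(S)
        best, best_pay = S.copy(), P.copy()
        for sh in SHIFTS:
            NP, NS = np.roll(P, sh, axis=(0, 1)), np.roll(S, sh, axis=(0, 1))
            better = NP > best_pay
            best[better], best_pay[better] = NS[better], NP[better]
        S = best

    print("fraction of cooperators:", S.mean())
    ```

    On the lattice, clusters of cooperators can survive defection, which is precisely the kind of graph-induced departure from mean-field behaviour the review analyses.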

  16. Evolutionary mysteries in meiosis.

    Science.gov (United States)

    Lenormand, Thomas; Engelstädter, Jan; Johnston, Susan E; Wijnker, Erik; Haag, Christoph R

    2016-10-19

    Meiosis is a key event of sexual life cycles in eukaryotes. Its mechanistic details have been uncovered in several model organisms, and most of its essential features have received various and often contradictory evolutionary interpretations. In this perspective, we present an overview of these often 'weird' features. We discuss the origin of meiosis (origin of ploidy reduction and recombination, two-step meiosis), its secondary modifications (in polyploids or asexuals, inverted meiosis), its importance in punctuating life cycles (meiotic arrests, epigenetic resetting, meiotic asymmetry, meiotic fairness) and features associated with recombination (disjunction constraints, heterochiasmy, crossover interference and hotspots). We present the various evolutionary scenarios and selective pressures that have been proposed to account for these features, and we highlight that their evolutionary significance often remains largely mysterious. Resolving these mysteries will likely provide decisive steps towards understanding why sex and recombination are found in the majority of eukaryotes. This article is part of the themed issue 'Weird sex: the underappreciated diversity of sexual reproduction'. © 2016 The Author(s).

  17. Asymmetric Evolutionary Games

    Science.gov (United States)

    McAvoy, Alex; Hauert, Christoph

    2015-01-01

    Evolutionary game theory is a powerful framework for studying evolution in populations of interacting individuals. A common assumption in evolutionary game theory is that interactions are symmetric, which means that the players are distinguished by only their strategies. In nature, however, the microscopic interactions between players are nearly always asymmetric due to environmental effects, differing baseline characteristics, and other possible sources of heterogeneity. To model these phenomena, we introduce into evolutionary game theory two broad classes of asymmetric interactions: ecological and genotypic. Ecological asymmetry results from variation in the environments of the players, while genotypic asymmetry is a consequence of the players having differing baseline genotypes. We develop a theory of these forms of asymmetry for games in structured populations and use the classical social dilemmas, the Prisoner’s Dilemma and the Snowdrift Game, for illustrations. Interestingly, asymmetric games reveal essential differences between models of genetic evolution based on reproduction and models of cultural evolution based on imitation that are not apparent in symmetric games. PMID:26308326

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. [Positron emission tomography combined with computed tomography in the initial evaluation and response assessment in primary central nervous system lymphoma].

    Science.gov (United States)

    Mercadal, Santiago; Cortés-Romera, Montserrat; Vélez, Patricia; Climent, Fina; Gámez, Cristina; González-Barca, Eva

    2015-06-08

    To evaluate the role of positron emission tomography combined with computed tomography (PET-CT) in the initial evaluation and response assessment in primary central nervous system lymphoma (PCNSL). Fourteen patients (8 males) with a median age of 59.5 years diagnosed with PCNSL were included. A brain PET-CT and magnetic resonance imaging (MRI) were performed in the initial evaluation. In 7 patients a PET-CT after treatment was performed. PET-CT showed 31 hypermetabolic foci at diagnosis and MRI showed 47 lesions, with a good grade of concordance between both (k = 0.61; P = .005). In the response assessment, correlation between both techniques was good, and PET-CT was helpful in the evaluation of residual MRI lesions. Overall survival at 2 years for negative vs. positive PET-CT at the end of treatment was 100 vs. 37.5%, respectively (P = .045). PET-CT can be useful in the initial evaluation of PCNSL, and especially in the assessment of response. Although PET-CT detects fewer small lesions than MRI, a good correlation between MRI and PET-CT was observed. It is effective in the evaluation of residual lesions. Prospective studies are needed to confirm its possible prognostic value. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
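    The concordance statistic quoted (k = 0.61) is a Cohen's kappa. A minimal sketch of its computation from a paired 2x2 agreement table; the counts below are invented for illustration, not the study's data:

    ```python
    import numpy as np

    # Toy 2x2 agreement table: rows = PET-CT (pos/neg), columns = MRI (pos/neg).
    table = np.array([[28.0, 3.0],
                      [5.0, 14.0]])

    n = table.sum()
    p_observed = np.trace(table) / n                             # raw agreement
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # chance agreement
    kappa = (p_observed - p_expected) / (1 - p_expected)
    print(f"Cohen's kappa = {kappa:.2f}")
    ```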

  5. Combined computed tomography and fluorodeoxyglucose positron emission tomography in the diagnosis of prosthetic valve endocarditis: a case series.

    Science.gov (United States)

    Bartoletti, Michele; Tumietto, Fabio; Fasulo, Giovanni; Giannella, Maddalena; Cristini, Francesco; Bonfiglioli, Rachele; Raumer, Luigi; Nanni, Cristina; Sanfilippo, Silvia; Di Eusanio, Marco; Scotton, Pier Giorgio; Graziosi, Maddalena; Rapezzi, Claudio; Fanti, Stefano; Viale, Pierluigi

    2014-01-13

    The diagnosis of prosthetic valve endocarditis is challenging. The gold standard for prosthetic valve endocarditis diagnosis is trans-esophageal echocardiography. However, trans-esophageal echocardiography may result in negative findings or yield images difficult to differentiate from thrombus in patients with prosthetic valve endocarditis. Combined computed tomography and fluorodeoxyglucose positron emission tomography is a potentially promising diagnostic tool for several infectious conditions and it has also been employed in patients with prosthetic valve endocarditis, but data are still scant. We reviewed the charts of 6 patients with prosthetic aortic valves evaluated for suspicion of prosthetic valve endocarditis, at two different hospitals, over a 3-year period. We found 3 early-onset PVE cases with blood cultures yielding Pseudomonas aeruginosa, Staphylococcus epidermidis and Staphylococcus lugdunensis, respectively, and 3 late-onset cases in the remaining 3 patients, with isolation in the blood of Streptococcus bovis, Candida albicans and P. aeruginosa, respectively. Initial trans-esophageal echocardiography was negative in all the patients, while fluorodeoxyglucose positron emission tomography showed images suspicious for prosthetic valve endocarditis. In 4 out of 6 patients valve replacement was done, with histology confirming the prosthetic valve endocarditis diagnosis. After an adequate course of antibiotic therapy fluorodeoxyglucose positron emission tomography showed resolution of prosthetic valve endocarditis in all the patients. Our experience confirms the potential role of fluorodeoxyglucose positron emission tomography in the diagnosis and follow-up of prosthetic valve endocarditis.

  6. The Medical Case for a Positron Emission Tomography and X-ray Computed Tomography Combined Service in Oman.

    Science.gov (United States)

    Al-Bulushi, Naima K; Bailey, Dale; Mariani, Giuliano

    2013-11-01

    The value of a positron emission tomography and X-ray computed tomography (PET/CT) combined service in terms of diagnostic accuracy, cost-effectiveness and impact on clinical decision-making is well-documented in the literature. Its role in the management of patients presenting with cancer is shifting from early staging and restaging to the early assessment of the treatment response. Currently, the application of PET/CT has extended to non-oncological specialties, mainly neurology, cardiology and rheumatology. A further emerging application for PET/CT is the imaging of infection/inflammation. This article illustrates some of the PET/CT applications in both oncological and non-oncological disorders. In view of the absence of this modality in Oman, this article aims to increase the awareness of the importance of these imaging modalities and their significant impact on diagnosis and management in both oncological and non-oncological specialties, for patients of all age groups as well as for decision-makers.

  7. A Combined Thermodynamics & Computational Method to Assess Lithium Composition in Anode and Cathode of Lithium Ion Batteries

    International Nuclear Information System (INIS)

    Zhang, Wenyu; Jiang, Lianlian; Van Durmen, Pauline; Saadat, Somaye; Yazami, Rachid

    2016-01-01

    With the aim of addressing the open question of accurately determining the lithium composition in the anode and cathode at a defined state of charge (SOC) of lithium ion batteries (LIB), we developed a method combining electrochemical thermodynamic measurements (ETM) and a computational data-fitting protocol. It is common knowledge that in a lithium ion battery the SOC of the anode and cathode differ from the SOC of the full-cell. Differences are in large part due to irreversible lithium losses within the cell and to electrode mass unbalance. This implies that the lithium composition range in the anode and in the cathode during a full charge and discharge cycle in a full-cell is different from the composition range achieved in lithium half-cells of the anode and cathode over their respective full SOC ranges. To the authors' knowledge there is no unequivocal and practical method to determine the actual lithium composition of electrodes in a LIB, hence their SOC. Yet, accurate lithium composition assessment is fundamental not only for understanding the physics of electrodes but also for optimizing cell performance, particularly energy density and cycle life.

  8. Acid-base properties of the N3 ruthenium(II) solar cell sensitizer: a combined experimental and computational analysis.

    Science.gov (United States)

    Pizzoli, Giuliano; Lobello, Maria Grazia; Carlotti, Benedetta; Elisei, Fausto; Nazeeruddin, Mohammad K; Vitillaro, Giuseppe; De Angelis, Filippo

    2012-10-14

    We report a combined spectro-photometric and computational investigation of the acid-base equilibria of the N3 solar cell sensitizer [Ru(dcbpyH2)2(NCS)2] (dcbpyH2 = 4,4'-dicarboxy-2,2'-bipyridine) in aqueous/ethanol solutions. The absorption spectra of N3 recorded at various pH values were analyzed by Singular Value Decomposition techniques, followed by Global Fitting procedures, allowing us to identify four separate acid-base equilibria and their corresponding ground-state pKa values. DFT/TDDFT calculations were performed for the N3 dye in solution, investigating the possible relevant species obtained by sequential deprotonation of the four dye carboxylic groups. TDDFT excited state calculations provided UV-vis absorption spectra which nicely agree with the experimental spectral shapes at various pH values. The calculated pKa values are also in good agreement with experimental data, within <1 pKa unit. Based on the calculated energy differences a tentative assignment of the N3 deprotonation pathway is reported.
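    A minimal sketch of the SVD step used in such spectral analyses: the number of dominant singular values of the wavelength-by-pH absorbance matrix indicates how many independent absorbing species are present. The synthetic two-species spectra below are assumptions for illustration only:

    ```python
    import numpy as np

    # Synthetic absorbance matrix: rows = wavelengths, columns = spectra at 4 pH
    # values, built from two Gaussian "species" bands plus a little noise.
    rng = np.random.default_rng(1)
    wl = np.linspace(350, 800, 200)
    A = (np.outer(np.exp(-((wl - 530) / 60) ** 2), [1.0, 0.8, 0.4, 0.1])
         + np.outer(np.exp(-((wl - 620) / 50) ** 2), [0.1, 0.3, 0.7, 1.0])
         + 0.01 * rng.standard_normal((200, 4)))

    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    print(np.round(s, 2))   # ~2 dominant singular values => two absorbing species
    ```

    Global fitting then refines the concentration profiles of those species against the pH titration to extract pKa values.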

  9. Computational Fluid Dynamics (CFD) Simulation of Hypersonic Turbine-Based Combined-Cycle (TBCC) Inlet Mode Transition

    Science.gov (United States)

    Slater, John W.; Saunders, John D.

    2010-01-01

    Methods of computational fluid dynamics were applied to simulate the aerodynamics within the turbine flowpath of a turbine-based combined-cycle propulsion system during inlet mode transition at Mach 4. Inlet mode transition involved the rotation of a splitter cowl to close the turbine flowpath to allow the full operation of a parallel dual-mode ramjet/scramjet flowpath. Steady-state simulations were performed at splitter cowl positions of 0deg, -2deg, -4deg, and -5.7deg, at which the turbine flowpath was closed halfway. The simulations satisfied one objective of providing a greater understanding of the flow during inlet mode transition. Comparisons of the simulation results with wind-tunnel test data addressed another objective of assessing the applicability of the simulation methods for simulating inlet mode transition. The simulations showed that inlet mode transition could occur in a stable manner and that accurate modeling of the interactions among the shock waves, boundary layers, and porous bleed regions was critical for evaluating the inlet static and total pressures, bleed flow rates, and bleed plenum pressures. The simulations compared well with some of the wind-tunnel data, but uncertainties in both the wind-tunnel data and simulations prevented a formal evaluation of the accuracy of the simulation methods.

  10. Passivity analysis of higher order evolutionary dynamics and population games

    KAUST Repository

    Mabrok, Mohamed

    2017-01-05

    Evolutionary dynamics describe how the population composition changes in response to the fitness levels, resulting in a closed-loop feedback system. Recent work established a connection between passivity theory and certain classes of population games, namely so-called “stable games”. In particular, it was shown that a combination of stable games and (an analogue of) passive evolutionary dynamics results in stable convergence to Nash equilibrium. This paper considers the converse question of necessary conditions for evolutionary dynamics to exhibit stable behaviors for all generalized stable games. Using methods from robust control analysis, we show that if an evolutionary dynamic does not satisfy a passivity property, then it is possible to construct a generalized stable game that results in instability. The results are illustrated on selected evolutionary dynamics with particular attention to replicator dynamics, which are also shown to be lossless, a special class of passive systems.
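    For readers unfamiliar with the replicator dynamics highlighted above, a minimal sketch follows (Euler integration on the Rock-Paper-Scissors game; the payoff matrix and step size are illustrative choices, not taken from the paper):

    ```python
    import numpy as np

    def replicator_step(x, A, dt=0.01):
        """One Euler step of the replicator dynamics x_i' = x_i (f_i - f_bar),
        with fitness vector f = A x for payoff matrix A."""
        f = A @ x
        return x + dt * x * (f - x @ f)

    # Rock-Paper-Scissors: a classic population game with a mixed equilibrium.
    A = np.array([[0.0, -1.0, 1.0],
                  [1.0, 0.0, -1.0],
                  [-1.0, 1.0, 0.0]])

    x = np.array([0.5, 0.3, 0.2])
    for _ in range(5000):
        x = replicator_step(x, A)
    print(x)   # the state orbits around the interior equilibrium (1/3, 1/3, 1/3)
    ```

    The neutrally stable orbits seen here reflect the lossless (rather than strictly passive) character of replicator dynamics noted in the abstract.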

  11. Harmonic elimination in diode-clamped multilevel inverter using evolutionary algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Barkati, Said [Laboratoire d'analyse des Signaux et Systemes (LASS), Universite de M'sila, BP. 166, rue Ichbilia, 28000 M'sila (Algeria); Baghli, Lotfi [Groupe de Recherche en Electrotechnique et Electronique de Nancy (GREEN), CNRS UMR 7030, Universite Henri Poincare Nancy 1, BP. 239, 54506 Vandoeuvre-les-Nancy (France); Berkouk, El Madjid; Boucherit, Mohamed-Seghir [Laboratoire de Commande des Processus (LCP), Ecole Nationale Polytechnique, BP. 182, 10 Avenue Hassen Badi, 16200 El Harrach, Alger (Algeria)

    2008-10-15

    This paper describes two evolutionary algorithms for the optimized harmonic stepped-waveform technique. Genetic algorithms and particle swarm optimization are applied to compute the switching angles in a three-phase seven-level inverter to produce the required fundamental voltage while, at the same time, specified harmonics are eliminated. Furthermore, these algorithms are also used to solve the starting point problem of the Newton-Raphson conventional method. This combination provides a very effective method for the harmonic elimination technique. This strategy is useful for different structures of seven-level inverters. The diode-clamped topology is considered in this study. (author)
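    A hedged sketch of the underlying selective-harmonic-elimination problem for a three-angle (seven-level) stepped waveform: find switching angles that hit a fundamental target while zeroing the 5th and 7th harmonics. Here scipy's differential evolution stands in for the paper's GA/PSO solvers, and the modulation index is an assumed value:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    m = 0.8  # modulation index: target fundamental as a fraction of the maximum

    def residual(theta):
        t = np.sort(theta)                 # enforce theta1 < theta2 < theta3
        f1 = np.cos(t).sum() - 3.0 * m     # fundamental must match the target
        h5 = np.cos(5 * t).sum()           # 5th harmonic driven to zero
        h7 = np.cos(7 * t).sum()           # 7th harmonic driven to zero
        return f1**2 + h5**2 + h7**2

    res = differential_evolution(residual, bounds=[(0.0, np.pi / 2)] * 3,
                                 seed=0, tol=1e-12)
    print(np.degrees(np.sort(res.x)), res.fun)   # switching angles (deg), residual
    ```

    A near-zero residual also provides a good starting point for the Newton-Raphson refinement the paper mentions.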

  12. The Evolutionary Origins of Hierarchy.

    Science.gov (United States)

    Mengistu, Henok; Huizinga, Joost; Mouret, Jean-Baptiste; Clune, Jeff

    2016-06-01

    Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.
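    A toy sketch of the connection-cost idea driving these results: if fitness is task performance minus a per-connection cost, selection favours sparser (and, per the paper, more modular and hierarchical) networks. The cost constant and candidate networks below are invented for illustration:

    ```python
    # Fitness = task performance minus a per-connection cost (assumed constant).
    COST = 0.001

    def fitness(net):
        return net["performance"] - COST * net["connections"]

    population = [
        {"name": "dense",        "performance": 0.96, "connections": 400},
        {"name": "sparse",       "performance": 0.95, "connections": 120},
        {"name": "hierarchical", "performance": 0.95, "connections": 90},
    ]

    # With a connection cost, sparser networks win even when raw task
    # performance is essentially tied.
    winner = max(population, key=fitness)
    print(winner["name"], round(fitness(winner), 3))   # hierarchical 0.86
    ```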

  13. The Evolutionary Origins of Hierarchy

    Science.gov (United States)

    Huizinga, Joost; Clune, Jeff

    2016-01-01

    Hierarchical organization—the recursive composition of sub-modules—is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force–the cost of connections–promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics. PMID:27280881

  14. The Evolutionary Origins of Hierarchy.

    Directory of Open Access Journals (Sweden)

    Henok Mengistu

    2016-06-01

    Full Text Available Hierarchical organization, the recursive composition of sub-modules, is ubiquitous in biological networks, including neural, metabolic, ecological, and genetic regulatory networks, and in human-made systems, such as large organizations and the Internet. To date, most research on hierarchy in networks has been limited to quantifying this property. However, an open, important question in evolutionary biology is why hierarchical organization evolves in the first place. It has recently been shown that modularity evolves because of the presence of a cost for network connections. Here we investigate whether such connection costs also tend to cause a hierarchical organization of such modules. In computational simulations, we find that networks without a connection cost do not evolve to be hierarchical, even when the task has a hierarchical structure. However, with a connection cost, networks evolve to be both modular and hierarchical, and these networks exhibit higher overall performance and evolvability (i.e. faster adaptation to new environments). Additional analyses confirm that hierarchy independently improves adaptability after controlling for modularity. Overall, our results suggest that the same force, the cost of connections, promotes the evolution of both hierarchy and modularity, and that these properties are important drivers of network performance and adaptability. In addition to shedding light on the emergence of hierarchy across the many domains in which it appears, these findings will also accelerate future research into evolving more complex, intelligent computational brains in the fields of artificial intelligence and robotics.

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  18. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  19. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month [Figure 1]. The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB [Figure 2; Figure 3 shows the volume of data moved between CMS sites in the last six months]. Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office [Figure 2: Number of events per month for 2012] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. Creating the computer player: an engaging and collaborative approach to introduce computational thinking by combining ‘unplugged’ activities with visual programming

    Directory of Open Access Journals (Sweden)

    Anna Gardeli

    2017-11-01

    Full Text Available Ongoing research is being conducted on appropriate course design, practices and teacher interventions for improving the efficiency of computer science and programming courses in K-12 education. The trend is towards a more constructivist problem-based learning approach. Computational thinking, which refers to formulating and solving problems in a form that can be efficiently processed by a computer, raises an important educational challenge. Our research aims to explore possible ways of enriching computer science teaching with a focus on development of computational thinking. We have prepared and evaluated a learning intervention for introducing computer programming to children between 10 and 14 years old; this involves students working in groups to program the behavior of the computer player of a well-known game. The programming process is split into two parts. First, students design a high-level version of their algorithm during an ‘unplugged’ pen & paper phase, and then they encode their solution as an executable program in a visual programming environment. Encouraging evaluation results have been achieved regarding the educational and motivational value of the proposed approach.

  3. Evolutionary inference via the Poisson Indel Process.

    Science.gov (United States)

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.
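
    To make the linearity claim concrete, the sketch below samples indel events as a Poisson process over the branches of a toy phylogeny; because a Poisson process on the whole tree restricts to independent processes on the individual branches, per-branch computations decouple. This is only an illustration of the general Poisson-process property the paper exploits, not the PIP inference algorithm itself; the tree, branch lengths, and rate are invented.

```python
# A minimal sketch (not the PIP inference algorithm) of the Poisson-process
# property the paper exploits: insertions on a phylogeny form a Poisson
# process over branch length, so per-branch computations decouple.
# Tree topology, branch lengths, and the rate below are invented.
import random

def sample_insertions(branches, rate, seed=42):
    """Sample per-branch insertion counts for a Poisson process on a tree.

    A Poisson process on the whole tree restricts to independent Poisson
    processes on each branch; this independence underlies the favorable
    computational profile of inference under PIP."""
    rng = random.Random(seed)
    events = {}
    for name, length in branches.items():
        # Count exponential waiting times that fit in the branch: the
        # resulting count is Poisson(rate * length) distributed.
        count, t = 0, rng.expovariate(rate)
        while t < length:
            count += 1
            t += rng.expovariate(rate)
        events[name] = count
    return events

branch_lengths = {"root->A": 0.4, "root->B": 0.7, "B->C": 0.2}
print(sample_insertions(branch_lengths, rate=1.5))
```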

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed and deployed at CERN, adding to the GlideInWMS factory based in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  5. Studies in evolutionary agroecology

    DEFF Research Database (Denmark)

    Wille, Wibke

    of population performance will increase in frequency. Yield, one of the fundamental agronomic variables, is not an individual, but a population characteristic. A farmer wants a high yield per hectare; he is not interested in the performance of individual plants. When individual selection and population...... of Evolutionary Agroecology that the highest yielding individuals do not necessarily perform best as a population. The investment of resources into strategies and structures increasing individual competitive ability carries a cost. If a whole population consists of individuals investing resources to compete...

  6. Rapid identification of pearl powder from Hyriopsis cumingii by Tri-step infrared spectroscopy combined with computer vision technology

    Science.gov (United States)

    Liu, Siqi; Wei, Wei; Bai, Zhiyi; Wang, Xichang; Li, Xiaohong; Wang, Chuanxian; Liu, Xia; Liu, Yuan; Xu, Changhua

    2018-01-01

    Pearl powder, an important raw material in cosmetics and Chinese patent medicines, is commonly uneven in quality and frequently adulterated with low-cost shell powder in the market. The aim of this study is to establish an adequate approach based on Tri-step infrared spectroscopy with enhanced resolution combined with chemometrics for qualitative identification of pearl powder originating from three different quality grades of pearls and for quantitative prediction of the proportion of shell powder adulterated into pearl powder. Additionally, computer vision technology (E-eyes) can investigate the color difference among different pearl powders and make it traceable to the visual color categories of the pearl quality trait. Though the different grades of pearl powder and adulterated pearl powder have almost identical IR spectra, the SD-IR peak intensity at about 861 cm⁻¹ (v2 band) exhibited regular enhancement with increasing quality grade of the pearls, while the 1082 cm⁻¹ (v1 band), 712 cm⁻¹ and 699 cm⁻¹ (v4 band) were just the reverse. In contrast, only the peak intensity at 862 cm⁻¹ was enhanced regularly with increasing concentration of shell powder. Thus, the bands in the ranges of (1550-1350 cm⁻¹, 730-680 cm⁻¹) and (830-880 cm⁻¹, 690-725 cm⁻¹) could serve as exclusive ranges to discriminate the three distinct pearl powders and to identify adulteration, respectively. For massive sample analysis, a qualitative classification model and a quantitative prediction model based on IR spectra were established successfully by principal component analysis (PCA) and partial least squares (PLS), respectively. The developed method demonstrated great potential for pearl powder quality control and authenticity identification in a direct, holistic manner.
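
    For readers who want to see the shape of such a chemometrics pipeline, here is a minimal scikit-learn sketch: PCA for qualitative grouping of spectra and PLS regression for predicting the adulterated shell-powder fraction. The spectra are synthetic stand-ins generated so that one band grows with adulteration level, loosely mimicking the reported 862 cm⁻¹ behaviour; nothing below uses the study's data or models.

```python
# Minimal scikit-learn sketch of the chemometrics pipeline above: PCA for
# qualitative grouping and PLS for predicting the adulterated shell-powder
# fraction. Spectra are synthetic stand-ins in which one band grows with
# adulteration, loosely mimicking the reported 862 cm^-1 behaviour.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_points = 60, 500
shell_fraction = rng.uniform(0, 0.5, n_samples)    # adulteration levels
marker_band = np.exp(-0.5 * ((np.arange(n_points) - 250) / 5.0) ** 2)

# Peak intensity scales with shell-powder content; noise mimics instrument
# variation. Real inputs would be the Tri-step IR measurements.
spectra = (shell_fraction[:, None] * marker_band[None, :]
           + 0.01 * rng.standard_normal((n_samples, n_points)))

scores = PCA(n_components=2).fit_transform(spectra)   # qualitative grouping
pls = PLSRegression(n_components=2).fit(spectra, shell_fraction)
print("PLS R^2 on training spectra:", round(pls.score(spectra, shell_fraction), 3))
```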

  7. Evolutionary primacy of sodium bioenergetics

    Directory of Open Access Journals (Sweden)

    Wolf Yuri I

    2008-04-01

    Full Text Available Abstract Background The F- and V-type ATPases are rotary molecular machines that couple translocation of protons or sodium ions across the membrane to the synthesis or hydrolysis of ATP. Both the F-type (found in most bacteria and in eukaryotic mitochondria and chloroplasts) and V-type (found in archaea, some bacteria, and eukaryotic vacuoles) ATPases can translocate either protons or sodium ions. The prevalent proton-dependent ATPases are generally viewed as the primary form of the enzyme whereas the sodium-translocating ATPases of some prokaryotes are usually construed as an exotic adaptation to survival in extreme environments. Results We combine structural and phylogenetic analyses to clarify the evolutionary relation between the proton- and sodium-translocating ATPases. A comparison of the structures of the membrane-embedded oligomeric proteolipid rings of sodium-dependent F- and V-ATPases reveals nearly identical sets of amino acids involved in sodium binding. We show that the sodium-dependent ATPases are scattered among proton-dependent ATPases in both the F- and the V-branches of the phylogenetic tree. Conclusion Barring convergent emergence of the same set of ligands in several lineages, these findings indicate that the use of sodium gradient for ATP synthesis is the ancestral modality of membrane bioenergetics. Thus, a primitive, sodium-impermeable but proton-permeable cell membrane that harboured a set of sodium-transporting enzymes appears to have been the evolutionary predecessor of the more structurally demanding proton-tight membranes. The use of proton as the coupling ion appears to be a later innovation that emerged on several independent occasions. Reviewers This article was reviewed by J. Peter Gogarten, Martijn A. Huynen, and Igor B. Zhulin. For the full reviews, please go to the Reviewers' comments section.

  8. Heterogeneous Compression of Large Collections of Evolutionary Trees.

    Science.gov (United States)

    Matthews, Suzanne J

    2015-01-01

    Compressing heterogeneous collections of trees is an open problem in computational phylogenetics. In a heterogeneous tree collection, each tree can contain a unique set of taxa. An ideal compression method would allow for the efficient archival of large tree collections and enable scientists to identify common evolutionary relationships over disparate analyses. In this paper, we extend TreeZip to compress heterogeneous collections of trees. TreeZip is the most efficient algorithm for compressing homogeneous tree collections. To the best of our knowledge, no other domain-based compression algorithm exists for large heterogeneous tree collections, nor one that enables their rapid analysis. Our experimental results indicate that TreeZip averages 89.03 percent (72.69 percent) space savings on unweighted (weighted) collections of trees when the level of heterogeneity in a collection is moderate. The organization of the TRZ file allows for efficient computations over heterogeneous data. For example, consensus trees can be computed in mere seconds. Lastly, combining the TreeZip compressed (TRZ) file with general-purpose compression yields average space savings of 97.34 percent (81.43 percent) on unweighted (weighted) collections of trees. Our results lead us to believe that TreeZip will prove invaluable in the efficient archival of tree collections, and will enable scientists to develop novel methods for relating heterogeneous collections of trees.
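
    The space-savings arithmetic and the "TRZ plus general-purpose compressor" combination quoted above can be illustrated in a few lines. TreeZip itself is an external tool and is not reimplemented here; the domain-compressed bytes below are a placeholder slice, so only the bookkeeping is real.

```python
# Bookkeeping sketch for the space-savings figures and the "TRZ plus
# general-purpose compressor" combination quoted above. TreeZip is an
# external tool and is not reimplemented; the domain-compressed bytes are
# a placeholder slice, so only the arithmetic is meaningful.
import gzip

def space_savings(original_bytes: int, compressed_bytes: int) -> float:
    """Space savings as a percentage, the metric used in the paper."""
    return 100.0 * (1 - compressed_bytes / original_bytes)

newick = b"((A,B),(C,D));\n" * 10_000      # stand-in tree collection
trz_like = newick[: len(newick) // 9]      # placeholder for TreeZip output
combined = gzip.compress(trz_like)         # general-purpose pass on top

print(f"domain only : {space_savings(len(newick), len(trz_like)):.2f}%")
print(f"domain+gzip : {space_savings(len(newick), len(combined)):.2f}%")
```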

  9. Resistance and relatedness on an evolutionary graph

    Science.gov (United States)

    Maciejewski, Wes

    2012-01-01

    When investigating evolution in structured populations, it is often convenient to consider the population as an evolutionary graph—individuals as nodes, and whom they may act with as edges. There has, in recent years, been a surge of interest in evolutionary graphs, especially in the study of the evolution of social behaviours. An inclusive fitness framework is best suited for this type of study. A central requirement for an inclusive fitness analysis is an expression for the genetic similarity between individuals residing on the graph. This has been a major hindrance for work in this area, as highly technical mathematics is often required. Here, I derive a result that links genetic relatedness between haploid individuals on an evolutionary graph to the resistance between vertices on a corresponding electrical network. An example that demonstrates the potential computational advantage of this result over contemporary approaches is provided. This result offers more, however, to the study of population genetics than strictly computationally efficient methods. By establishing a link between gene transfer and electric circuit theory, conceptualizations of the latter can enhance understanding of the former. PMID:21849384
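
    The electrical-network half of this correspondence is straightforward to compute. A standard route, sketched below, obtains the effective resistance between two vertices from the Moore-Penrose pseudoinverse of the graph Laplacian, R_ij = L+_ii + L+_jj - 2 L+_ij; the paper's mapping from resistance to relatedness is not reproduced here, and the graph is an arbitrary example.

```python
# The electrical-network half of the correspondence: effective resistance
# between vertices i and j from the Moore-Penrose pseudoinverse of the graph
# Laplacian, R_ij = L+_ii + L+_jj - 2*L+_ij. The paper's mapping from
# resistance to relatedness is not reproduced here; the graph is an example.
import numpy as np

adjacency = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 1],
                      [1, 1, 0, 1],
                      [0, 1, 1, 0]], dtype=float)

laplacian = np.diag(adjacency.sum(axis=1)) - adjacency
L_plus = np.linalg.pinv(laplacian)   # pseudoinverse handles the zero eigenvalue

def effective_resistance(i: int, j: int) -> float:
    return float(L_plus[i, i] + L_plus[j, j] - 2.0 * L_plus[i, j])

print("R(0,3) =", round(effective_resistance(0, 3), 4))
```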

  10. Core principles of evolutionary medicine

    Science.gov (United States)

    Grunspan, Daniel Z; Nesse, Randolph M; Barnes, M Elizabeth; Brownell, Sara E

    2018-01-01

    Abstract Background and objectives Evolutionary medicine is a rapidly growing field that uses the principles of evolutionary biology to better understand, prevent and treat disease, and that uses studies of disease to advance basic knowledge in evolutionary biology. Over-arching principles of evolutionary medicine have been described in publications, but our study is the first to systematically elicit core principles from a diverse panel of experts in evolutionary medicine. These principles should be useful to advance recent recommendations made by The Association of American Medical Colleges and the Howard Hughes Medical Institute to make evolutionary thinking a core competency for pre-medical education. Methodology The Delphi method was used to elicit and validate a list of core principles for evolutionary medicine. The study included four surveys administered in sequence to 56 expert panelists. The initial open-ended survey created a list of possible core principles; the three subsequent surveys winnowed the list and assessed the accuracy and importance of each principle. Results Fourteen core principles elicited at least 80% of the panelists to agree or strongly agree that they were important core principles for evolutionary medicine. These principles overlapped with concepts discussed in other articles on key concepts in evolutionary medicine. Conclusions and implications This set of core principles will be helpful for researchers and instructors in evolutionary medicine. We recommend that evolutionary medicine instructors use the list of core principles to construct learning goals. Evolutionary medicine is a young field, so this list of core principles will likely change as the field develops further. PMID:29493660

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  12. Intelligent computing systems emerging application areas

    CERN Document Server

    Virvou, Maria; Jain, Lakhmi

    2016-01-01

    This book at hand explores emerging scientific and technological areas in which Intelligent Computing Systems provide efficient solutions and, thus, may play a role in the years to come. It demonstrates how Intelligent Computing Systems make use of computational methodologies that mimic nature-inspired processes to address real world problems of high complexity for which exact mathematical solutions, based on physical and statistical modelling, are intractable. Common intelligent computational methodologies are presented including artificial neural networks, evolutionary computation, genetic algorithms, artificial immune systems, fuzzy logic, swarm intelligence, artificial life, virtual worlds and hybrid methodologies based on combinations of the previous. The book will be useful to researchers, practitioners and graduate students dealing with mathematically-intractable problems. It is intended for both the expert/researcher in the field of Intelligent Computing Systems, as well as for the general reader in t...

  13. Improved particle swarm optimization combined with chaos

    International Nuclear Information System (INIS)

    Liu Bo; Wang Ling; Jin Yihui; Tang Fang; Huang Dexian

    2005-01-01

    As a novel optimization technique, chaos has gained much attention and some applications during the past decade. For a given energy or cost function, by following chaotic ergodic orbits, a chaotic dynamic system may eventually reach the global optimum or its good approximation with high probability. To enhance the performance of particle swarm optimization (PSO), an evolutionary computation technique based on individual improvement combined with population-level cooperation and competition, a hybrid particle swarm optimization algorithm is proposed by incorporating chaos. Firstly, an adaptive inertia weight factor (AIWF) is introduced in PSO to efficiently balance the exploration and exploitation abilities. Secondly, PSO with AIWF and chaos are hybridized to form a chaotic PSO (CPSO), which combines the population-based evolutionary searching ability of PSO with chaotic searching behavior. Simulation results and comparisons with the standard PSO and several meta-heuristics show that the CPSO can effectively enhance the searching efficiency and greatly improve the searching quality.
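
    A minimal sketch of the CPSO recipe described above follows: standard PSO velocity updates with an inertia weight that adapts over the run, plus a chaotic local search around the global best driven by the logistic map. The linear weight decay, step sizes, and sphere test function are illustrative choices, not the paper's exact AIWF rule or benchmarks.

```python
# Hedged sketch of CPSO: PSO velocity updates with an adaptive inertia
# weight, plus a chaotic (logistic-map) local search around the global best.
# The linear weight decay, step sizes, and sphere test function are
# illustrative choices, not the paper's exact AIWF rule or benchmarks.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))   # toy cost function

rng = np.random.default_rng(1)
n_particles, dim, iters = 20, 5, 200
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([sphere(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for t in range(iters):
    w = 0.9 - 0.5 * t / iters                  # inertia weight, 0.9 -> 0.4
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + 2.0 * r1 * (pbest - pos) + 2.0 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([sphere(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

    # Chaotic local search: perturb gbest along a logistic-map sequence.
    z = 0.7
    for _ in range(10):
        z = 4.0 * z * (1.0 - z)                # logistic map in chaotic regime
        cand = gbest + 0.1 * (2.0 * z - 1.0)
        if sphere(cand) < sphere(gbest):
            gbest = cand

print("best value found:", sphere(gbest))
```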

  14. Complexity in Evolutionary Processes

    International Nuclear Information System (INIS)

    Schuster, P.

    2010-01-01

    Darwin's principle of evolution by natural selection is readily cast into a mathematical formalism. Molecular biology revealed the mechanism of mutation and provides the basis for a kinetic theory of evolution that models correct reproduction and mutation as parallel chemical reaction channels. A result of the kinetic theory is the existence of a phase transition in evolution occurring at a critical mutation rate, which represents a localization threshold for the population in sequence space. Occurrence and nature of such phase transitions depend critically on fitness landscapes. The fitness landscape, tantamount to a mapping from sequence or genotype space into phenotype space, is identified as the true source of complexity in evolution. Modeling evolution as a stochastic process is discussed, and neutrality with respect to selection is shown to provide a major challenge for understanding evolutionary processes (author)

  15. Computation of beam quality parameters for Mo/Mo, Mo/Rh, Rh/Rh, and W/Al target/filter combinations in mammography

    International Nuclear Information System (INIS)

    Kharrati, Hedi; Zarrad, Boubaker

    2003-01-01

    A computer program was implemented to predict mammography x-ray beam parameters in the range 20-40 kV for Mo/Mo, Mo/Rh, Rh/Rh, and W/Al target/filter combinations. The computation method used to simulate mammography x-ray spectra is based on the Boone et al. model. The beam quality parameters, such as the half-value layer (HVL), the homogeneity coefficient (HC), and the average photon energy, were computed by simulating the interaction of the spectrum photons with matter. The computation was validated by comparing its results with published data and with measured values obtained at the Netherlands Metrology Institute Van Swinden Laboratorium, the National Institute of Standards and Technology, and the International Atomic Energy Agency. The predicted values, with mean deviations of 3.3% for HVL, 3.7% for HC, and 1.5% for average photon energy, show acceptable agreement with published data and measurements for all target/filter combinations in the 23-40 kV range. The accuracy of this computation can be considered clinically acceptable and allows an appreciable estimation of the beam quality parameters
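
    As an illustration of how such beam-quality parameters fall out of a simulated spectrum, the sketch below attenuates a toy spectrum bin by bin and bisects for the filter thickness that halves the transmitted beam (the HVL), then derives the HC as the ratio of the first to the second HVL, alongside the fluence-weighted average photon energy. The spectrum and attenuation coefficients are invented placeholders, not the Boone et al. model used in the paper.

```python
# Illustration of how HVL, HC, and mean energy fall out of a simulated
# spectrum: attenuate each energy bin and bisect for the filter thickness
# that halves the transmitted beam. Spectrum and attenuation coefficients
# are invented placeholders, not the Boone et al. model used in the paper.
import numpy as np

energies = np.array([10.0, 15.0, 20.0, 25.0, 30.0])  # keV, toy bins
fluence  = np.array([0.5, 2.0, 3.0, 1.5, 0.5])       # toy photon fluence
mu       = np.array([7.0, 2.3, 1.0, 0.55, 0.36])     # toy mu(E), mm^-1

def transmitted_fraction(x_mm: float) -> float:
    """Energy-fluence-weighted transmission through x_mm of filter."""
    weights = fluence * energies
    return float((weights * np.exp(-mu * x_mm)).sum() / weights.sum())

def thickness_for(target: float, lo=0.0, hi=50.0, tol=1e-6) -> float:
    """Bisection for the thickness reaching a given transmitted fraction."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if transmitted_fraction(mid) > target else (lo, mid)
    return 0.5 * (lo + hi)

hvl1 = thickness_for(0.5)                 # first half-value layer
hvl2 = thickness_for(0.25) - hvl1         # second half-value layer
mean_energy = float((fluence * energies).sum() / fluence.sum())
print(f"HVL={hvl1:.3f} mm  HC={hvl1 / hvl2:.3f}  mean E={mean_energy:.2f} keV")
```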

  16. Evolutionary economics and industry location

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.

    2003-01-01

    This paper aims to provide the outlines of an evolutionary economic geography of industry location. We discuss two evolutionary explanations of industry location, that is, one that concentrates on spin-offs, and one that focuses attention on knowledge and agglomeration economies. We claim that both

  17. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    These discussions included, among others, the possible consequences of non-DNA-based inheritance—epigenetics and cultural evolution, niche construction, and developmental mechanisms on our understanding of the evolutionary process, speciation, complexity in biology, and constructing a formal evolutionary theory.

  18. Contemporary issues in evolutionary biology

    Indian Academy of Sciences (India)

    We are delighted to bring to the readers, a set of peer-reviewed papers on evolutionary biology, published as a special issue of the Journal of Genetics. These papers emanated from ruminations upon and discussions at the Foundations of Evolutionary Theory: the Ongoing Synthesis meeting at Coorg, India, in February ...

  19. Fixation Time for Evolutionary Graphs

    Science.gov (United States)

    Nie, Pu-Yan; Zhang, Pei-Ai

    Evolutionary graph theory (EGT) was proposed by Lieberman et al. in 2005. EGT has been successful in explaining biological evolution and some social phenomena. It is extremely important to consider the time of fixation for EGT in many practical problems, including evolutionary theory and the evolution of cooperation. This study characterizes the time to asymptotically reach fixation.

  20. Applications of evolutionary economic geography

    NARCIS (Netherlands)

    Boschma, R.A.; Frenken, K.; Puranam, Krishna Kishore; Ravi Kumar Jain B., xx

    2008-01-01

    This paper is written as the first chapter of an edited volume on evolutionary economics and economic geography (Frenken, K., editor, Applied Evolutionary Economics and Economic Geography, Cheltenham: Edward Elgar, expected publication date February 2007). The paper reviews empirical applications of

  1. Dynamic modelling of an adsorption storage tank using a hybrid approach combining computational fluid dynamics and process simulation

    Science.gov (United States)

    Mota, J.P.B.; Esteves, I.A.A.C.; Rostam-Abadi, M.

    2004-01-01

    A computational fluid dynamics (CFD) software package has been coupled with the dynamic process simulator of an adsorption storage tank for methane fuelled vehicles. The two solvers run as independent processes and handle non-overlapping portions of the computational domain. The codes exchange data on the boundary interface of the two domains to ensure continuity of the solution and of its gradient. A software interface was developed to dynamically suspend and activate each process as necessary, and be responsible for data exchange and process synchronization. This hybrid computational tool has been successfully employed to accurately simulate the discharge of a new tank design and evaluate its performance. The case study presented here shows that CFD and process simulation are highly complementary computational tools, and that there are clear benefits to be gained from a close integration of the two. © 2004 Elsevier Ltd. All rights reserved.

  2. 5 CFR 591.221 - How does OPM compute the consumer expenditure weights it uses to combine price indexes?

    Science.gov (United States)

    2010-01-01

    ... OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS ALLOWANCES AND DIFFERENTIALS Cost-of-Living Allowance and Post Differential-Nonforeign Areas Cost-Of-Living Allowances § 591.221 How does OPM compute...

  3. Evolutionary Explanations of Eating Disorders

    Directory of Open Access Journals (Sweden)

    Igor Kardum

    2008-12-01

    Full Text Available This article reviews several of the most important evolutionary mechanisms that underlie eating disorders. The first part clarifies the evolutionary foundations of mental disorders and the various mechanisms leading to their development. In the second part, the selective pressures and evolved adaptations behind the contemporary epidemic of obesity, as well as differences in dietary regimes and life-style between modern humans and their ancestors, are described. Concerning eating disorders, a number of current evolutionary explanations of anorexia nervosa are presented together with their main weaknesses. Evolutionary explanations of eating disorders based on the reproductive suppression hypothesis and its variants derived from kin selection theory and the model of parental manipulation are elaborated. The sexual competition hypothesis of eating disorders, the adapted-to-flee-famine hypothesis, and explanations based on the concept of social attention holding power and the need to belong are also explained. The importance of evolutionary theory in the modern conceptualization and research of eating disorders is emphasized.

  4. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D [ORNL; Disney, Adam [University of Tennessee (UT); Singh, Susheela [North Carolina State University (NCSU), Raleigh; Bruer, Grant [University of Tennessee (UT); Mitchell, John Parker [University of Tennessee (UT); Klibisz, Aleksander [University of Tennessee (UT); Plank, James [University of Tennessee (UT)

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pi's. We also focus on how to improve the EO by evaluating commonality in higher performing neural networks, and present the result of a study that evaluates the EO performed by Titan.
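
    The parallelization pattern at issue is generic: fitness evaluation dominates the cost of evolutionary optimization, so candidate evaluations are farmed out to worker processes each generation. The sketch below shows that pattern with Python multiprocessing and a placeholder fitness function standing in for a network simulation; it is not the DANNA/Titan codebase.

```python
# Generic version of the parallelization pattern discussed above: fitness
# evaluation dominates the cost of evolutionary optimization, so candidates
# are evaluated in worker processes each generation. The fitness function is
# a placeholder for a network simulation; this is not the DANNA/Titan code.
import random
from multiprocessing import Pool

def fitness(genome):
    # Placeholder standing in for simulating a network built from `genome`.
    return -sum((g - 0.5) ** 2 for g in genome)

def evolve(pop_size=32, dim=10, generations=50, seed=7):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(dim)] for _ in range(pop_size)]
    with Pool() as pool:
        for _ in range(generations):
            scores = pool.map(fitness, pop)          # parallel evaluation
            ranked = [g for _, g in sorted(zip(scores, pop), reverse=True)]
            parents = ranked[: pop_size // 2]
            # Mutation-only reproduction keeps the sketch short.
            pop = parents + [[g + rng.gauss(0, 0.05) for g in p]
                             for p in parents]
    return max(pop, key=fitness)

if __name__ == "__main__":
    print("best fitness:", fitness(evolve()))
```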

  5. Selecting the Best: Evolutionary Engineering of Chemical Production in Microbes

    DEFF Research Database (Denmark)

    Shepelin, Denis; Hansen, Anne Sofie Lærke; Lennen, Rebecca

    2018-01-01

    , we focus primarily on a more challenging problem-the use of evolutionary engineering for improving the production of chemicals in microbes directly. We describe recent developments in evolutionary engineering strategies, in general, and discuss, in detail, case studies where production of a chemical......Microbial cell factories have proven to be an economical means of production for many bulk, specialty, and fine chemical products. However, we still lack both a holistic understanding of organism physiology and the ability to predictively tune enzyme activities in vivo, thus slowing down rational...... engineering of industrially relevant strains. An alternative concept to rational engineering is to use evolution as the driving force to select for desired changes, an approach often described as evolutionary engineering. In evolutionary engineering, in vivo selections for a desired phenotype are combined...

  6. Probing Stereoselectivity in Ring-Opening Metathesis Polymerization Mediated by Cyclometalated Ruthenium-Based Catalysts: A Combined Experimental and Computational Study

    OpenAIRE

    Rosebrugh, L. E.; Ahmed, T. S.; Marx, V. M.; Hartung, J.; Liu, P.; López, J. G.; Houk, K. N.; Grubbs, R. H.

    2016-01-01

    The microstructures of polymers produced by ring-opening metathesis polymerization (ROMP) with cyclometalated Ru-carbene metathesis catalysts were investigated. A strong bias for a cis,syndiotactic microstructure with minimal head-to-tail bias was observed. In instances where trans errors were introduced, it was determined that these regions were also syndiotactic. Furthermore, hypothetical reaction intermediates and transition structures were analyzed computationally. Combined experimental a...

  7. A combined vector potential-scalar potential method for FE computation of 3D magnetic fields in electrical devices with iron cores

    Science.gov (United States)

    Wang, R.; Demerdash, N. A.

    1991-01-01

    A method of combined use of magnetic vector potential based finite-element (FE) formulations and magnetic scalar potential (MSP) based formulations for computation of three-dimensional magnetostatic fields is introduced. In this method, the curl-component of the magnetic field intensity is computed by a reduced magnetic vector potential. This field intensity forms the basis of a forcing function for a global magnetic scalar potential solution over the entire volume of the region. This method allows one to include iron portions sandwiched in between conductors within partitioned current-carrying subregions. The method is most suited for large-scale global-type 3-D magnetostatic field computations in electrical devices, and in particular rotating electric machinery.

  8. Chemical evolutionary games.

    Science.gov (United States)

    Aristotelous, Andreas C; Durrett, Richard

    2014-05-01

    Inspired by the use of hybrid cellular automata in modeling cancer, we introduce a generalization of evolutionary games in which cells produce and absorb chemicals, and the chemical concentrations dictate the death rates of cells and their fitnesses. Our long term aim is to understand how the details of the interactions in a system with n species and m chemicals translate into the qualitative behavior of the system. Here, we study two simple 2×2 games with two chemicals and revisit the two and three species versions of the one chemical colicin system studied earlier by Durrett and Levin (1997). We find that in the 2×2 examples, the behavior of our new spatial model can be predicted from that of the mean field differential equation using ideas of Durrett and Levin (1994). However, in the three species colicin model, the system with diffusion does not exhibit the coexistence that occurs in the lattice model, in which sites interact only with their nearest neighbors. Copyright © 2014 Elsevier Inc. All rights reserved.
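
    The mean-field comparison mentioned above can be reproduced in miniature by integrating replicator-style dynamics for a 2×2 game and reading off the long-run strategy frequencies. The payoff matrix below is an arbitrary example and the chemical production/absorption terms of the paper's model are omitted, so this is only the skeleton of such a prediction.

```python
# Skeleton of the mean-field prediction: integrate replicator dynamics for a
# 2x2 game and read off the long-run strategy frequencies. The payoff matrix
# is an arbitrary example and the chemical production/absorption terms of
# the paper's model are omitted.
import numpy as np
from scipy.integrate import solve_ivp

payoff = np.array([[1.0, 0.2],
                   [0.8, 0.6]])          # example 2x2 game

def replicator(t, x):
    f = payoff @ x                       # fitness of each strategy
    return x * (f - x @ f)               # growth relative to mean fitness

sol = solve_ivp(replicator, (0.0, 200.0), [0.9, 0.1], rtol=1e-8)
print("long-run frequencies:", np.round(sol.y[:, -1], 4))
```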

  9. Evolutionary and developmental modules.

    Science.gov (United States)

    Lacquaniti, Francesco; Ivanenko, Yuri P; d'Avella, Andrea; Zelik, Karl E; Zago, Myrka

    2013-01-01

    The identification of biological modules at the systems level often follows top-down decomposition of a task goal, or bottom-up decomposition of multidimensional data arrays into basic elements or patterns representing shared features. These approaches traditionally have been applied to mature, fully developed systems. Here we review some results from two other perspectives on modularity, namely the developmental and evolutionary perspective. There is growing evidence that modular units of development were highly preserved and recombined during evolution. We first consider a few examples of modules well identifiable from morphology. Next we consider the more difficult issue of identifying functional developmental modules. We dwell especially on modular control of locomotion to argue that the building blocks used to construct different locomotor behaviors are similar across several animal species, presumably related to ancestral neural networks of command. A recurrent theme from comparative studies is that the developmental addition of new premotor modules underlies the postnatal acquisition and refinement of several different motor behaviors in vertebrates.

  10. Combined magnetic vector-scalar potential finite element computation of 3D magnetic field and performance of modified Lundell alternators in Space Station applications. Ph.D. Thesis

    Science.gov (United States)

    Wang, Ren H.

    1991-01-01

    A method of combined use of magnetic vector potential (MVP) based finite element (FE) formulations and magnetic scalar potential (MSP) based FE formulations for computation of three-dimensional (3D) magnetostatic fields is developed. This combined MVP-MSP 3D-FE method reduces the number of unknowns by nearly a factor of 3 in comparison to global MVP based FE solutions. The method allows one to incorporate portions of iron cores sandwiched in between coils (conductors) in current-carrying regions. Thus, it greatly simplifies the geometries of current-carrying regions (in comparison with exclusively MSP based methods) in electric machinery applications. A unique feature of this approach is that the global MSP solution is single-valued in nature, that is, no branch cut is needed. This is another advantage over exclusively MSP based methods. A Newton-Raphson procedure with the concept of an adaptive relaxation factor was developed and successfully used in solving the 3D-FE problem with magnetic material anisotropy and nonlinearity. Accordingly, this combined MVP-MSP 3D-FE method is most suited for the solution of large-scale global-type magnetic field computations in rotating electric machinery with very complex magnetic circuit geometries, as well as nonlinear and anisotropic material properties.

  11. A Novel Handwritten Letter Recognizer Using Enhanced Evolutionary Neural Network

    Science.gov (United States)

    Mahmoudi, Fariborz; Mirzashaeri, Mohsen; Shahamatnia, Ehsan; Faridnia, Saed

    This paper introduces a novel design for handwritten letter recognition that employs a hybrid back-propagation neural network with an enhanced evolutionary algorithm. The input fed to the neural network is produced by a new approach that is invariant to translation, rotation, and scaling of input letters. The evolutionary algorithm is used for global search of the search space and the back-propagation algorithm is used for local search. The results have been computed by implementing this approach for recognizing 26 English capital letters in the handwriting of different people. The computational results show that the neural network reaches very satisfying results with relatively scarce input data, and the hybrid evolutionary back-propagation algorithm exhibits a promising improvement in convergence.
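
    The division of labour described above, evolution for global search and back-propagation for local search, can be sketched compactly. Below, an evolutionary outer loop maintains a population of weight vectors for a toy one-layer classifier, and each candidate is refined by a few gradient steps standing in for back-propagation; the data, network size, and all parameters are invented stand-ins for the letter recognizer.

```python
# Sketch of the hybrid division of labour: an evolutionary outer loop for
# global search over weight vectors, with a few gradient steps standing in
# for back-propagation as the local search. The one-layer "network", data,
# and all parameters are invented stand-ins for the letter recognizer.
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((100, 8))                     # toy "letter features"
y = (X @ rng.standard_normal(8) > 0).astype(float)    # toy binary labels

def loss(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    return float(np.mean((p - y) ** 2))

def local_refine(w, lr=0.5, steps=5):
    """Crude gradient descent standing in for back-propagation."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        grad = X.T @ ((p - y) * p * (1.0 - p)) * 2.0 / len(y)
        w = w - lr * grad
    return w

pop = [rng.standard_normal(8) for _ in range(30)]
for _ in range(40):                          # evolutionary global search
    pop = [local_refine(w) for w in pop]     # local refinement per candidate
    pop.sort(key=loss)
    elite = pop[:10]
    pop = elite + [w + 0.1 * rng.standard_normal(8)
                   for w in elite for _ in (0, 1)]

print("best loss:", round(loss(min(pop, key=loss)), 4))
```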

  12. Energy Consumption and Indoor Environment Predicted by a Combination of Computational Fluid Dynamics and Building Energy Performance Simulation

    DEFF Research Database (Denmark)

    Nielsen, Peter Vilhelm

    2003-01-01

    An interconnection between a building energy performance simulation program and a Computational Fluid Dynamics program (CFD) for room air distribution is introduced for improvement of the predictions of both the energy consumption and the indoor environment. The article describes a calculation...

  13. Combined experimental and computational studies of pyrazinamide and nicotinamide in the context of crystal engineering and thermodynamics

    DEFF Research Database (Denmark)

    Jarzembska, Katarzyna N.; Hoser, Anna Agnieszka; Kamiński, Radosław

    2014-01-01

    and computational analysis of the β form of pyrazinamide and α form of nicotinamide. Static electron density distribution is obtained through application of the Hansen and Coppens multipolar formalism, and further analyzed via Bader's quantum theory of atoms in molecules (QTAIM). Geometrical and electron density...

  14. Combination of inquiry learning model and computer simulation to improve mastery concept and the correlation with critical thinking skills (CTS)

    Science.gov (United States)

    Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar

    2016-02-01

    Among the purposes of physics learning at high school are mastering physics concepts and cultivating a scientific attitude (including a critical attitude), and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Based on preliminary studies, both competencies are poorly achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train CTS is the inquiry learning model aided by computer simulations. In this model, students were given the opportunity to be actively involved in the experiment and also received good explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations can significantly improve students' concept mastery compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium and 16.7% low. CTS contributes strongly to students' concept mastery, with a correlation coefficient of 0.697, and contributes moderately to the enhancement of concept mastery, with a correlation coefficient of 0.603.

  15. Mobile computing with special reference to readability task under the impact of vibration, colour combination and gender.

    Science.gov (United States)

    Mallick, Zulquernain; Siddiquee, Arshad Noor; Haleem, Abid

    2008-12-01

    The last 20 years have seen tremendous growth in the field of computing, with special reference to mobile computing. Ergonomic issues pertaining to this theme remain unexplored. With special reference to readability in mobile computing, an experimental study was conducted to examine the gender effect on human performance under the impact of vibration in a human computer interaction environment. Fourteen subjects (7 males and 7 females) participated in the study. Three independent variables, namely gender, level of vibration and screen text/background colour, were selected for the experimental investigation while the dependent variable was the number of characters read per minute. The data collected were analyzed statistically through an experimental design for repeated measures. Results indicated that gender as an organismic variable, the level of vibration and screen text/background colour revealed statistically significant differences. However, the second-order interaction was found to be statistically non-significant. These findings are discussed in light of previous studies on the topic.

  16. Evolutionary hotspots in the Mojave Desert

    Science.gov (United States)

    Vandergast, Amy G.; Inman, Richard D.; Barr, Kelly R.; Nussear, Kenneth E.; Esque, Todd C.; Hathaway, Stacie A.; Wood, Dustin A.; Medica, Philip A.; Breinholt, Jesse W.; Stephen, Catherine L.; Gottscho, Andrew D.; Marks, Sharyn B.; Jennings, W. Bryan; Fisher, Robert N.

    2013-01-01

    Genetic diversity within species provides the raw material for adaptation and evolution. Just as regions of high species diversity are conservation targets, identifying regions containing high genetic diversity and divergence within and among populations may be important to protect future evolutionary potential. When multiple co-distributed species show spatial overlap in high genetic diversity and divergence, these regions can be considered evolutionary hotspots. We mapped spatial population genetic structure for 17 animal species across the Mojave Desert, USA. We analyzed these jointly and located 10 regions of high genetic diversity, divergence or both among species. These were mainly concentrated along the western and southern boundaries where ecotones between mountain, grassland and desert habitat are prevalent, and along the Colorado River. We evaluated the extent to which these hotspots overlapped protected lands and utility-scale renewable energy development projects of the Bureau of Land Management. While 30–40% of the total hotspot area was categorized as protected, between 3% and 7% overlapped with proposed renewable energy project footprints, and up to 17% overlapped with project footprints combined with transmission corridors. Overlap of evolutionary hotspots with renewable energy development mainly occurred in 6 of the 10 identified hotspots. Resulting GIS-based maps can be incorporated into ongoing landscape planning efforts and highlight specific regions where further investigation of impacts to population persistence and genetic connectivity may be warranted.

  17. Spatial evolutionary epidemiology of spreading epidemics.

    Science.gov (United States)

    Lion, S; Gandon, S

    2016-10-26

    Most spatial models of host-parasite interactions either neglect the possibility of pathogen evolution or consider that this process is slow enough for epidemiological dynamics to reach an equilibrium on a fast timescale. Here, we propose a novel approach to jointly model the epidemiological and evolutionary dynamics of spatially structured host and pathogen populations. Starting from a multi-strain epidemiological model, we use a combination of spatial moment equations and quantitative genetics to analyse the dynamics of mean transmission and virulence in the population. A key insight of our approach is that, even in the absence of long-term evolutionary consequences, spatial structure can affect the short-term evolution of pathogens because of the build-up of spatial differentiation in mean virulence. We show that spatial differentiation is driven by a balance between epidemiological and genetic effects, and this quantity is related to the effect of kin competition discussed in previous studies of parasite evolution in spatially structured host populations. Our analysis can be used to understand and predict the transient evolutionary dynamics of pathogens and the emergence of spatial patterns of phenotypic variation. © 2016 The Author(s).

  18. Economic modeling using evolutionary algorithms : the effect of binary encoding of strategies

    NARCIS (Netherlands)

    Waltman, L.R.; Eck, van N.J.; Dekker, Rommert; Kaymak, U.

    2011-01-01

    We are concerned with evolutionary algorithms that are employed for economic modeling purposes. We focus in particular on evolutionary algorithms that use a binary encoding of strategies. These algorithms, commonly referred to as genetic algorithms, are popular in agent-based computational economics

  19. Evolutionary disarmament in interspecific competition.

    Science.gov (United States)

    Kisdi, E; Geritz, S A

    2001-12-22

    Competitive asymmetry, which is the advantage of having a larger body or stronger weaponry than a contestant, drives spectacular evolutionary arms races in intraspecific competition. Similar asymmetries are well documented in interspecific competition, yet they seldom lead to exaggerated traits. Here we demonstrate that two species with substantially different sizes may undergo parallel coevolution towards a smaller size under the same ecological conditions where a single species would exhibit an evolutionary arms race. We show that disarmament occurs for a wide range of parameters in an ecologically explicit model of competition for a single shared resource; disarmament also occurs in a simple Lotka-Volterra competition model. A key property of both models is the interplay between evolutionary dynamics and population density. The mechanism does not rely on very specific features of the model. Thus, evolutionary disarmament may be widespread and may help to explain the lack of interspecific arms races.

  20. Evolutionary genetics: the Drosophila model

    Indian Academy of Sciences (India)

    Unknown

    Evolutionary genetics straddles the two fundamental processes of life, ... of the genus Drosophila have been used extensively as model systems in experimental ... issue will prove interesting, informative and thought-provoking for both estab-.

  1. Evolutionary robotics – A review

    Indian Academy of Sciences (India)

    a need for a technique by which the robot is able to acquire new behaviours automatically .... Evolutionary robotics is a comparatively new field of robotics research, which seems to ..... Technical Report: PCIA-94-04, Institute of Psychology,.

  2. Combining discrete equations method and upwind downwind-controlled splitting for non-reacting and reacting two-fluid computations

    International Nuclear Information System (INIS)

    Tang, K.

    2012-01-01

    When numerically investigating multiphase phenomena during severe accidents in a reactor system, characteristic lengths of the multi-fluid zone (non-reactive and reactive) are found to be much smaller than the scale of the reactor containment, which makes direct modeling of the configuration hardly achievable. Alternatively, we propose to consider the physical multiphase mixture zone as an infinitely thin interface. The reactive Riemann solver is then inserted into the Reactive Discrete Equations Method (RDEM) to compute high speed combustion waves represented by discontinuous interfaces. An anti-diffusive approach is also coupled with RDEM to accurately simulate reactive interfaces. Increased robustness and efficiency when computing both multiphase interfaces and reacting flows are achieved thanks to an original upwind downwind-controlled splitting method (UDCS). UDCS is capable of accurately solving interfaces on multi-dimensional unstructured meshes, including reacting fronts for both deflagration and detonation configurations. (author)

  3. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    Science.gov (United States)

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    Science.gov (United States)

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
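
    The Lagrangian coupling strategy can be boiled down to three steps: obtain particle trajectories, convert them to light-exposure histories through a light field, and average a growth-kinetics model over those histories. The sketch below does this with random stand-in trajectories, a Beer-Lambert-type light profile, and a toy Monod-type response; all constants are invented, so it shows only the skeleton of the approach.

```python
# Sketch of the Lagrangian coupling: particle trajectories (random stand-ins
# here for CFD tracks) become light-exposure histories via a Beer-Lambert
# light field, and a growth-kinetics model is averaged over those histories.
# All constants are invented; this is the skeleton, not the paper's model.
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_steps = 200, 500
I0, k_att, gap_width = 500.0, 80.0, 0.01   # surface irradiance, 1/m, m

# Stand-in radial positions in the annular gap over time, 0 (lit wall) to 1.
paths = rng.random((n_cells, n_steps))

# Light-exposure history of every cell: irradiance decays with depth.
irradiance = I0 * np.exp(-k_att * gap_width * paths)

def specific_growth(I, mu_max=0.05, K_I=100.0):
    """Toy Monod-type light response (per hour)."""
    return mu_max * I / (K_I + I)

# Global biomass growth rate: average local kinetics over all histories.
mu_global = float(specific_growth(irradiance).mean())
print("estimated specific growth rate (1/h):", round(mu_global, 4))
```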

  5. Evolutionary Game Theory: A Renaissance

    Directory of Open Access Journals (Sweden)

    Jonathan Newton

    2018-05-01

    Full Text Available Economic agents are not always rational or farsighted and can make decisions according to simple behavioral rules that vary according to situation and can be studied using the tools of evolutionary game theory. Furthermore, such behavioral rules are themselves subject to evolutionary forces. Paying particular attention to the work of young researchers, this essay surveys the progress made over the last decade towards understanding these phenomena, and discusses open research topics of importance to economics and the broader social sciences.

  6. Freud: the first evolutionary psychologist?

    Science.gov (United States)

    LeCroy, D

    2000-04-01

    An evolutionary perspective on attachment theory and psychoanalytic theory brings these two fields together in interesting ways. Application of the evolutionary principle of parent-offspring conflict to attachment theory suggests that attachment styles represent context-sensitive, evolved (adaptive) behaviors. In addition, an emphasis on offspring counter-strategies to adult reproductive strategies leads to consideration of attachment styles as overt manifestations of psychodynamic mediating processes, including the defense mechanisms of repression and reaction formation.

  7. Betting on the fastest horse: Using computer simulation to design a combination HIV intervention for future projects in Maharashtra, India.

    Directory of Open Access Journals (Sweden)

    Kelly V Ruggles

    Full Text Available To inform the design of a combination intervention strategy targeting HIV-infected unhealthy alcohol users in Maharashtra, India, that could be tested in future randomized control trials. Using probabilistic compartmental simulation modeling we compared intervention strategies targeting HIV-infected unhealthy alcohol users on antiretroviral therapy (ART) in Maharashtra, India. We tested interventions targeting four behaviors (unhealthy alcohol consumption, risky sexual behavior, depression and antiretroviral adherence), in three formats (individual, group based, community) and two durations (shorter versus longer). A total of 5,386 possible intervention combinations were tested across the population for a 20-year time horizon and intervention bundles were narrowed down based on incremental cost-effectiveness analysis using a two-step probabilistic uncertainty analysis approach. Taking into account uncertainty in transmission variables and intervention cost and effectiveness values, we were able to reduce the number of possible intervention combinations to be used in a randomized control trial from over 5,000 to less than 5. The most robust intervention bundle identified was a combination of three interventions: long individual alcohol counseling; weekly Short Message Service (SMS) adherence counseling; and brief sex risk group counseling. In addition to guiding policy design, simulation modeling of HIV transmission can be used as a preparatory step to trial design, offering a method for intervention pre-selection at a reduced cost.

  8. Betting on the fastest horse: Using computer simulation to design a combination HIV intervention for future projects in Maharashtra, India.

    Science.gov (United States)

    Ruggles, Kelly V; Patel, Anik R; Schensul, Stephen; Schensul, Jean; Nucifora, Kimberly; Zhou, Qinlian; Bryant, Kendall; Braithwaite, R Scott

    2017-01-01

    To inform the design of a combination intervention strategy targeting HIV-infected unhealthy alcohol users in Maharashtra, India, that could be tested in future randomized control trials. Using probabilistic compartmental simulation modeling we compared intervention strategies targeting HIV-infected unhealthy alcohol users on antiretroviral therapy (ART) in Maharashtra, India. We tested interventions targeting four behaviors (unhealthy alcohol consumption, risky sexual behavior, depression and antiretroviral adherence), in three formats (individual, group based, community) and two durations (shorter versus longer). A total of 5,386 possible intervention combinations were tested across the population for a 20-year time horizon and intervention bundles were narrowed down based on incremental cost-effectiveness analysis using a two-step probabilistic uncertainty analysis approach. Taking into account uncertainty in transmission variables and intervention cost and effectiveness values, we were able to reduce the number of possible intervention combinations to be used in a randomized control trial from over 5,000 to less than 5. The most robust intervention bundle identified was a combination of three interventions: long individual alcohol counseling; weekly Short Message Service (SMS) adherence counseling; and brief sex risk group counseling. In addition to guiding policy design, simulation modeling of HIV transmission can be used as a preparatory step to trial design, offering a method for intervention pre-selection at a reduced cost.

  9. Can An Evolutionary Process Create English Text?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.

    2008-10-29

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
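
    The distinction drawn above, a fitness landscape rather than a fixed target phrase, is easy to demonstrate in miniature: score candidate strings by how English-like their character bigrams are relative to a reference corpus, and evolve with no pre-specified answer. The sketch below uses a tiny built-in corpus snippet and arbitrary mutation rates and population sizes; it is a toy, far simpler than the study's scheme.

```python
# A toy counterpart to the scheme described above: strings evolve against a
# fitness landscape (bigram resemblance to a reference corpus) with no
# pre-specified target phrase. Corpus snippet, mutation rate, and population
# sizes are arbitrary choices for illustration.
import random
import string

corpus = ("it was the best of times it was the worst of times "
          "it was the age of wisdom it was the age of foolishness")
good_bigrams = {corpus[i:i + 2] for i in range(len(corpus) - 1)}
alphabet = string.ascii_lowercase + " "

def fitness(text):
    """Fraction of the string's bigrams that occur in the corpus."""
    hits = sum(text[i:i + 2] in good_bigrams for i in range(len(text) - 1))
    return hits / (len(text) - 1)

rng = random.Random(0)
pop = ["".join(rng.choice(alphabet) for _ in range(40)) for _ in range(200)]
for _ in range(300):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:50]
    # Children are mutated copies of random parents; no target string exists.
    pop = parents + ["".join(c if rng.random() > 0.02 else rng.choice(alphabet)
                             for c in rng.choice(parents))
                     for _ in range(150)]

best = max(pop, key=fitness)
print(round(fitness(best), 3), repr(best))
```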

  10. Evolutionary Transgenomics: prospects and challenges

    Directory of Open Access Journals (Sweden)

    Raul eCorrea

    2015-10-01

    Full Text Available Abstract Many advances in our understanding of the genetic basis of species differences have arisen from transformation experiments, which allow us to study the effect of genes from one species (the donor) when placed in the genetic background of another species (the recipient). Such interspecies transformation experiments are usually focused on candidate genes – genes that, based on work in model systems, are suspected to be responsible for certain phenotypic differences between the donor and recipient species. We suggest that the high efficiency of transformation in a few plant species, most notably Arabidopsis thaliana, combined with the small size of typical plant genes and their cis-regulatory regions, allows implementation of a screening strategy that does not depend upon a priori candidate gene identification. This approach, transgenomics, entails moving many large genomic inserts of a donor species into the wild-type background of a recipient species and then screening for dominant phenotypic effects. As a proof of concept, we recently conducted a transgenomic screen that analyzed more than 1100 random, large genomic inserts of the Alabama gladecress Leavenworthia alabamica for dominant phenotypic effects in the A. thaliana background. This screen identified one insert that shortens fruit and decreases A. thaliana fertility. In this paper we discuss the principles of transgenomic screens and suggest methods to help minimize the frequencies of false positive and false negative results. We argue that, because transgenomics avoids committing in advance to candidate genes, it has the potential to help us identify truly novel genes or cryptic functions of known genes. Given the valuable knowledge that is likely to be gained, we believe the time is ripe for the plant evolutionary community to invest in transgenomic screens, at least in the mustard family Brassicaceae Burnett, where many species are amenable to efficient transformation.

  11. The combined use of computer-guided, minimally invasive, flapless corticotomy and clear aligners as a novel approach to moderate crowding: A case report.

    Science.gov (United States)

    Cassetta, Michele; Altieri, Federica; Pandolfi, Stefano; Giansanti, Matteo

    2017-03-01

    The aim of this case report was to describe an innovative orthodontic treatment method that combined surgical and orthodontic techniques. The novel method was used to achieve a positive result in a case of moderate crowding by employing a computer-guided piezocision procedure followed by the use of clear aligners. A 23-year-old woman had a malocclusion with moderate crowding. Her periodontal indices, oral health-related quality of life (OHRQoL), and treatment time were evaluated. The treatment included interproximal corticotomy cuts extending through the entire thickness of the cortical layer, without a full-thickness flap reflection. This was achieved with a three-dimensionally printed surgical guide using computer-aided design and computer-aided manufacturing. Orthodontic force was applied to the teeth immediately after surgery by using clear appliances for better control of tooth movement. The total treatment time was 8 months. The periodontal indices improved after crowding correction, but the oral health impact profile showed a slight deterioration of OHRQoL during the 3 days following surgery. At the 2-year retention follow-up, the stability of treatment was excellent. The reduction in surgical time and patient discomfort, increased periodontal safety and patient acceptability, and accurate control of orthodontic movement without the risk of losing anchorage may encourage the use of this combined technique in appropriate cases.

  12. Selecting the Best: Evolutionary Engineering of Chemical Production in Microbes.

    Science.gov (United States)

    Shepelin, Denis; Hansen, Anne Sofie Lærke; Lennen, Rebecca; Luo, Hao; Herrgård, Markus J

    2018-05-11

    Microbial cell factories have proven to be an economical means of production for many bulk, specialty, and fine chemical products. However, we still lack both a holistic understanding of organism physiology and the ability to predictively tune enzyme activities in vivo, thus slowing down rational engineering of industrially relevant strains. An alternative concept to rational engineering is to use evolution as the driving force to select for desired changes, an approach often described as evolutionary engineering. In evolutionary engineering, in vivo selections for a desired phenotype are combined with either generation of spontaneous mutations or some form of targeted or random mutagenesis. Evolutionary engineering has been used to successfully engineer easily selectable phenotypes, such as utilization of a suboptimal nutrient source or tolerance to inhibitory substrates or products. In this review, we focus primarily on a more challenging problem: the use of evolutionary engineering for improving the production of chemicals in microbes directly. We describe recent developments in evolutionary engineering strategies, in general, and discuss, in detail, case studies where production of a chemical has been successfully achieved through evolutionary engineering by coupling production to cellular growth.

  13. Rationalization of the Color Properties of Fluorescein in the Solid State: A Combined Computational and Experimental Study.

    Science.gov (United States)

    Arhangelskis, Mihails; Eddleston, Mark D; Reid, David G; Day, Graeme M; Bučar, Dejan-Krešimir; Morris, Andrew J; Jones, William

    2016-07-11

    Fluorescein is known to exist in three tautomeric forms defined as quinoid, zwitterionic, and lactoid. In the solid state, the quinoid and zwitterionic forms give rise to red and yellow materials, respectively. The lactoid form has not been crystallized pure, although its cocrystal and solvate forms exhibit colors ranging from yellow to green. An explanation for the observed colors of the crystals is found using a combination of UV/Vis spectroscopy and plane-wave DFT calculations. The role of cocrystal coformers in modifying crystal color is also established. Several new crystal structures are determined using a combination of X-ray and electron diffraction, solid-state NMR spectroscopy, and crystal structure prediction (CSP). The protocol presented herein may be used to predict color properties of materials prior to their synthesis. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  14. Betting on the fastest horse: Using computer simulation to design a combination HIV intervention for future projects in Maharashtra, India

    OpenAIRE

    Ruggles, Kelly V.; Patel, Anik R.; Schensul, Stephen; Schensul, Jean; Nucifora, Kimberly; Zhou, Qinlian; Bryant, Kendall; Braithwaite, R. Scott

    2017-01-01

    Objective To inform the design of a combination intervention strategy targeting HIV-infected unhealthy alcohol users in Maharashtra, India, that could be tested in future randomized control trials. Methods Using probabilistic compartmental simulation modeling we compared intervention strategies targeting HIV-infected unhealthy alcohol users on antiretroviral therapy (ART) in Maharashtra, India. We tested interventions targeting four behaviors (unhealthy alcohol consumption, risky sexual behav...

  15. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  16. More efficient evolutionary strategies for model calibration with watershed model for demonstration

    Science.gov (United States)

    Baggett, J. S.; Skahill, B. E.

    2008-12-01

    Evolutionary strategies allow automatic calibration of more complex models than traditional gradient-based approaches, but they are more computationally intensive. We present several efficiency enhancements for evolution strategies, many of which are not new, but when combined have been shown to dramatically decrease the number of model runs required for calibration of synthetic problems. To reduce the number of expensive model runs we employ a surrogate objective function for an adaptively determined fraction of the population at each generation (Kern et al., 2006). We demonstrate improvements to the adaptive ranking strategy that increase its efficiency while sacrificing little reliability and further reduce the number of model runs required in densely sampled parts of parameter space. Furthermore, we include a gradient individual in each generation that is usually not selected when the search is in a global phase or when the derivatives are poorly approximated, but when selected near a smooth local minimum can dramatically increase convergence speed (Tahk et al., 2007). Finally, the selection of the gradient individual is used to adapt the size of the population near local minima. We show, by incorporating these enhancements into the Covariance Matrix Adaptation Evolution Strategy (CMAES; Hansen, 2006), that their synergistic effect is greater than the sum of their individual parts. This hybrid evolutionary strategy exploits smooth structure when it is present but degrades to an ordinary evolutionary strategy, at worst, if smoothness is not present. Calibration of 2D-3D synthetic models with the modified CMAES requires approximately 10%-25% of the model runs of ordinary CMAES. Preliminary demonstration of this hybrid strategy will be shown for watershed model calibration problems. Hansen, N. (2006). The CMA Evolution Strategy: A Comparing Review. In J.A. Lozano, P. Larrañaga, I. Inza and E. Bengoetxea (Eds.), Towards a new evolutionary computation. Advances in estimation of distribution algorithms.
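
    The surrogate idea can be sketched as follows: only a fraction of each generation is scored with the expensive model, while the rest are ranked by a cheap surrogate built from an archive of true evaluations. This is a minimal (mu, lambda) illustration with a nearest-neighbour surrogate and a toy objective, not the authors' modified CMAES with adaptive ranking; all settings are illustrative.

```python
import numpy as np

# Minimal surrogate-assisted (mu, lambda) evolution strategy sketch.

def expensive_model(x):                  # stand-in for a watershed model run
    return float(np.sum((x - 0.3) ** 2))

def surrogate(x, X_arch, y_arch):        # value of the nearest archived point
    return y_arch[np.argmin(np.linalg.norm(X_arch - x, axis=1))]

rng = np.random.default_rng(0)
dim, mu, lam, true_frac = 5, 5, 25, 0.3
mean, sigma = rng.uniform(0, 1, dim), 0.3
X_arch = rng.uniform(0, 1, (lam, dim))
y_arch = np.array([expensive_model(x) for x in X_arch])

for _ in range(60):
    offspring = mean + sigma * rng.standard_normal((lam, dim))
    scores = np.array([surrogate(x, X_arch, y_arch) for x in offspring])
    for i in rng.choice(lam, max(1, int(true_frac * lam)), replace=False):
        scores[i] = expensive_model(offspring[i])   # expensive run, subset only
        X_arch = np.vstack([X_arch, offspring[i]])
        y_arch = np.append(y_arch, scores[i])
    mean = offspring[np.argsort(scores)[:mu]].mean(axis=0)
    sigma *= 0.98                                   # crude step-size decay

print(mean, expensive_model(mean))
```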

  17. A flowsheet model of a coal-fired MHD/steam combined electricity generating cycle, using the access computer model

    International Nuclear Information System (INIS)

    Davison, J.E.; Eldershaw, C.E.

    1992-01-01

    This document forms the final report on a study of a coal-fired magnetohydrodynamic (MHD)/steam electric power generation system carried out by British Coal Corporation for the Commission of the European Communities. The study objective was to provide mass and energy balances and overall plant efficiency predictions for MHD to assist the Commission in their evaluation of advanced power generation technologies. In early 1990 the British Coal Corporation completed a study for the Commission in which a computer flowsheet modelling package was used to predict the performance of a conceptual air blown MHD plant. Since that study was carried out, increasing emphasis has been placed on the possible need to reduce CO2 emissions to counter the so-called greenhouse effect. Air blown MHD could greatly reduce CO2 emissions per kWh by virtue of its high thermal efficiency. However, if even greater reductions in CO2 emissions were required, the CO2 produced by coal combustion may have to be disposed of, for example into the deep ocean or underground caverns. To achieve this at minimum cost a concentrated CO2 flue gas would be required. This could be achieved in an MHD plant by using a mixture of high purity oxygen and recycled CO2 flue gas in the combustor. To assess this plant concept the European Commission awarded British Coal a contract to produce performance predictions using the access computer program.

  18. Radiation dose reduction through combining positron emission tomography/computed tomography (PET/CT) and diagnostic CT in children and young adults with lymphoma

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Zhihua; Gates, Erica L.; Trout, Andrew T. [Cincinnati Children's Hospital Medical Center, Department of Radiology, Cincinnati, OH (United States)]; O'Brien, Maureen M. [Cincinnati Children's Hospital Medical Center, Division of Oncology, Cancer and Blood Disease Institute, Cincinnati, OH (United States)]

    2018-02-15

    Both [F-18]2-fluoro-2-deoxyglucose positron emission tomography/computed tomography ({sup 18}F-FDG PET/CT) and diagnostic CT are at times required for lymphoma staging. This means some body segments are exposed twice to X-rays for generation of CT data (diagnostic CT + localization CT). To describe a combined PET/diagnostic CT approach that modulates CT tube current along the z-axis, providing diagnostic CT of some body segments and localization CT of the remaining body segments, thereby reducing patient radiation dose. We retrospectively compared total patient radiation dose between combined PET/diagnostic CT and separately acquired PET/CT and diagnostic CT exams. When available, we calculated effective doses for both approaches in the same patient; otherwise, we used data from patients of similar size. To confirm image quality, we compared image noise (Hounsfield unit [HU] standard deviation) as measured in the liver on both combined and separately acquired diagnostic CT images. We used t-tests for dose comparisons and two one-sided tests for image-quality equivalence testing. Mean total effective dose for the CT component of the combined and separately acquired diagnostic CT exams were 6.20±2.69 and 8.17±2.61 mSv, respectively (P<0.0001). Average dose savings with the combined approach was 24.8±17.8% (2.60±2.51 mSv [range: 0.32-4.72 mSv]) of total CT effective dose. Image noise was not statistically significantly different between approaches (12.2±1.8 HU vs. 11.7±1.5 HU for the combined and separately acquired diagnostic CT images, respectively). A combined PET/diagnostic CT approach as described offers dose savings at similar image quality for children and young adults with lymphoma who have indications for both PET and diagnostic CT examinations. (orig.)
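
    The equivalence claim rests on the two one-sided tests (TOST) procedure, which can be sketched as follows. The noise samples and the plus-or-minus 2 HU equivalence margin are synthetic placeholders, not the study's data, and the sketch assumes SciPy 1.6 or later for the alternative keyword of ttest_ind.

```python
import numpy as np
from scipy import stats

# TOST equivalence test for image noise between two protocols (toy data).

rng = np.random.default_rng(42)
noise_combined = rng.normal(12.2, 1.8, 30)   # liver HU SD, combined protocol
noise_separate = rng.normal(11.7, 1.5, 30)   # liver HU SD, separate protocol
margin = 2.0                                 # equivalence bounds in HU

# H1 (lower): mean difference > -margin; H1 (upper): mean difference < +margin.
_, p_lower = stats.ttest_ind(noise_combined + margin, noise_separate,
                             alternative="greater")
_, p_upper = stats.ttest_ind(noise_combined - margin, noise_separate,
                             alternative="less")
p_tost = max(p_lower, p_upper)               # both one-sided tests must reject
print(f"TOST p = {p_tost:.4f}; equivalent at alpha=0.05: {p_tost < 0.05}")
```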

  19. Radiation dose reduction through combining positron emission tomography/computed tomography (PET/CT) and diagnostic CT in children and young adults with lymphoma

    International Nuclear Information System (INIS)

    Qi, Zhihua; Gates, Erica L.; Trout, Andrew T.; O'Brien, Maureen M.

    2018-01-01

    Both [F-18]2-fluoro-2-deoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) and diagnostic CT are at times required for lymphoma staging. This means some body segments are exposed twice to X-rays for generation of CT data (diagnostic CT + localization CT). To describe a combined PET/diagnostic CT approach that modulates CT tube current along the z-axis, providing diagnostic CT of some body segments and localization CT of the remaining body segments, thereby reducing patient radiation dose. We retrospectively compared total patient radiation dose between combined PET/diagnostic CT and separately acquired PET/CT and diagnostic CT exams. When available, we calculated effective doses for both approaches in the same patient; otherwise, we used data from patients of similar size. To confirm image quality, we compared image noise (Hounsfield unit [HU] standard deviation) as measured in the liver on both combined and separately acquired diagnostic CT images. We used t-tests for dose comparisons and two one-sided tests for image-quality equivalence testing. Mean total effective dose for the CT component of the combined and separately acquired diagnostic CT exams were 6.20±2.69 and 8.17±2.61 mSv, respectively (P<0.0001). Average dose savings with the combined approach was 24.8±17.8% (2.60±2.51 mSv [range: 0.32-4.72 mSv]) of total CT effective dose. Image noise was not statistically significantly different between approaches (12.2±1.8 HU vs. 11.7±1.5 HU for the combined and separately acquired diagnostic CT images, respectively). A combined PET/diagnostic CT approach as described offers dose savings at similar image quality for children and young adults with lymphoma who have indications for both PET and diagnostic CT examinations. (orig.)

  20. Intelligent Distributed Computing VI : Proceedings of the 6th International Symposium on Intelligent Distributed Computing

    CERN Document Server

    Badica, Costin; Malgeri, Michele; Unland, Rainer

    2013-01-01

    This book represents the combined peer-reviewed proceedings of the Sixth International Symposium on Intelligent Distributed Computing -- IDC 2012, of the International Workshop on Agents for Cloud -- A4C 2012 and of the Fourth International Workshop on Multi-Agent Systems Technology and Semantics -- MASTS 2012. All the events were held in Calabria, Italy during September 24-26, 2012. The 37 contributions published in this book address many topics related to theory and applications of intelligent distributed computing and multi-agent systems, including: adaptive and autonomous distributed systems, agent programming, ambient assisted living systems, business process modeling and verification, cloud computing, coalition formation, decision support systems, distributed optimization and constraint satisfaction, gesture recognition, intelligent energy management in WSNs, intelligent logistics, machine learning, mobile agents, parallel and distributed computational intelligence, parallel evolutionary computing, trus...

  1. Study on the evolutionary optimisation of the topology of network control systems

    Science.gov (United States)

    Zhou, Zude; Chen, Benyuan; Wang, Hong; Fan, Zhun

    2010-08-01

    Computer networks have become very popular in enterprise applications. However, optimisation of network designs that allows networks to be used more efficiently in industrial environments and enterprise applications remains an interesting research topic. This article mainly discusses the topology optimisation theory and methods of network control systems based on switched Ethernet in an industrial context. Factors that affect the real-time performance of the industrial control network are presented in detail, and optimisation criteria with their internal relations are analysed. After the definition of performance parameters, normalised indices for the evaluation of the topology optimisation are proposed. The topology optimisation problem is formulated as a multi-objective optimisation problem and an evolutionary algorithm is applied to solve it. Special communication characteristics of the industrial control network are considered in the optimisation process. With respect to the evolutionary algorithm design, an improved arena algorithm is proposed for the construction of the non-dominated set of the population. In addition, for the evaluation of individuals, the integrated use of the dominative relation method and the objective function combination method is described, which reduces the computational cost of the algorithm. Simulation tests show that the proposed algorithm outperforms other algorithms. The final solution greatly improves the following indices: traffic localisation, traffic balance and utilisation rate balance of switches. In addition, a new performance index with its estimation process is proposed.
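
    The core of non-dominated set construction can be sketched with plain pairwise dominance checks, in the arena spirit of keeping only candidates that no current champion beats; the paper's improved arena algorithm is a lower-cost refinement of this basic idea, and the objective values below are made up.

```python
# Minimal non-dominated set construction for a minimization problem.

def dominates(a, b):
    """a dominates b: no worse in all objectives, better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def non_dominated(population):
    front = []
    for candidate in population:
        if any(dominates(kept, candidate) for kept in front):
            continue                                  # beaten in the arena
        front = [kept for kept in front if not dominates(candidate, kept)]
        front.append(candidate)
    return front

# Objectives could be, e.g., (mean end-to-end delay, load imbalance).
points = [(2.0, 5.0), (3.0, 3.0), (4.0, 1.0), (3.5, 3.5), (2.0, 6.0)]
print(non_dominated(points))   # [(2.0, 5.0), (3.0, 3.0), (4.0, 1.0)]
```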

  2. Evolutionary image simplification for lung nodule classification with convolutional neural networks.

    Science.gov (United States)

    Lückehe, Daniel; von Voigt, Gabriele

    2018-05-29

    Understanding decisions of deep learning techniques is important. Especially in the medical field, the reasons for a decision in a classification task are as crucial as the pure classification results. In this article, we propose a new approach to compute relevant parts of a medical image. Knowing the relevant parts makes it easier to understand decisions. In our approach, a convolutional neural network is employed to learn structures of images of lung nodules. Then, an evolutionary algorithm is applied to compute a simplified version of an unknown image based on the learned structures by the convolutional neural network. In the simplified version, irrelevant parts are removed from the original image. In the results, we show simplified images which allow the observer to focus on the relevant parts. In these images, more than 50% of the pixels are simplified. The simplified pixels do not change the meaning of the images based on the learned structures by the convolutional neural network. An experimental analysis shows the potential of the approach. Besides the examples of simplified images, we analyze the run time development. Simplified images make it easier to focus on relevant parts and to find reasons for a decision. The combination of an evolutionary algorithm employing a learned convolutional neural network is well suited for the simplification task. From a research perspective, it is interesting which areas of the images are simplified and which parts are taken as relevant.
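
    A toy sketch of the simplification loop: an evolutionary algorithm grows a binary mask that greys out as many pixels as possible while a classifier's decision stays unchanged. The trained convolutional network is replaced here by a stand-in scoring function, and all sizes and rates are placeholders.

```python
import numpy as np

# Evolve a mask that simplifies pixels without changing the "CNN" decision.

rng = np.random.default_rng(0)
image = rng.random((16, 16))

def predict(img):                       # stand-in for a trained CNN
    return int(img[4:12, 4:12].mean() > img.mean())

def simplify(img, mask, fill=0.5):      # masked pixels become "irrelevant"
    out = img.copy()
    out[mask] = fill
    return out

label = predict(image)

def fitness(mask):
    if predict(simplify(image, mask)) != label:
        return -1                       # decision changed: invalid mask
    return int(mask.sum())              # reward number of simplified pixels

population = [rng.random((16, 16)) < 0.1 for _ in range(40)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:8]
    children = []
    for _ in range(32):
        child = parents[rng.integers(8)].copy()
        flip = rng.random((16, 16)) < 0.02      # bit-flip mutation
        children.append(child ^ flip)
    population = parents + children

best = max(population, key=fitness)
print(f"simplified pixels: {best.sum()} / {best.size}")
```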

  3. Three dimensional magnetic fields in extra high speed modified Lundell alternators computed by a combined vector-scalar magnetic potential finite element method

    Science.gov (United States)

    Demerdash, N. A.; Wang, R.; Secunde, R.

    1992-01-01

    A 3D finite element (FE) approach was developed and implemented for computation of global magnetic fields in a 14.3 kVA modified Lundell alternator. The essence of the new method is the combined use of magnetic vector and scalar potential formulations in 3D FEs. This approach makes it practical, using state of the art supercomputer resources, to globally analyze magnetic fields and operating performances of rotating machines which have truly 3D magnetic flux patterns. The 3D FE-computed fields and machine inductances as well as various machine performance simulations of the 14.3 kVA machine are presented in this paper and its two companion papers.

  4. An Efficient Evolutionary Based Method For Image Segmentation

    OpenAIRE

    Aslanzadeh, Roohollah; Qazanfari, Kazem; Rahmati, Mohammad

    2017-01-01

    The goal of this paper is to present a new efficient image segmentation method based on evolutionary computation, which is a model inspired by human behavior. Based on this model, a four-layer process for image segmentation is proposed using the split/merge approach. In the first layer, an image is split into numerous regions using the watershed algorithm. In the second layer, a co-evolutionary process is applied to form centers of final segments by merging similar primary regions. In the t...

  5. Combining X-ray computed tomography and visible near-infrared spectroscopy for prediction of soil structural properties

    DEFF Research Database (Denmark)

    Katuwal, Sheela; Hermansen, Cecilie; Knadel, Maria

    2018-01-01

    Soil structure is a key soil property affecting a soil's flow and transport behavior. X-ray computed tomography (CT) is increasingly used to quantify soil structure. However, the availability, cost, time, and skills required for processing are still limiting the number of soils studied. Visible near-infrared (vis-NIR) spectroscopy is a rapid, low-cost alternative. In this study, 127 agricultural fields within Denmark with a wide range of textural properties and organic C (OC) contents were studied. Macroporosity (>1.2 mm in diameter) and CTmatrix (the density of the field-moist soil matrix devoid of large macropores and stones) were determined from X-ray CT scans of undisturbed soil cores (19 by 20 cm). Both macroporosity and CTmatrix are soil structural properties that affect the degree of preferential transport. Bulk soils from the 127 sampling locations were scanned with a vis-NIR spectrometer (400–2500 nm). Macroporosity and CTmatrix were statistically predicted with partial least squares regression.

  6. Multidetector computed tomography findings of mesenteroaxial gastric volvulus combined with torsion of wandering spleen: A case report and literature review

    International Nuclear Information System (INIS)

    Youn, In Kyung; Ku, Young Mi; Lee, Su Lim

    2016-01-01

    Gastric volvulus, defined as an abnormal rotation of the stomach, may be idiopathic or secondary to abnormal fixation of intraperitoneal visceral ligaments. Wandering spleen is a movable spleen resulting from absence or underdevelopment of the splenic supporting ligaments that suspend the spleen in its normal position in the left part of the supramesocolic compartment of the abdomen. Wandering spleen increases the risk of splenic torsion. Both gastric volvulus and splenic torsion are potentially life-threatening if not urgently managed with surgery. Prompt and accurate diagnosis based on multidetector computed tomography (MDCT) is crucial to prevent unforeseen complications. Gastric volvulus with coexistent torsion of a wandering spleen is a very rare condition. Herein, we describe a case of gastric volvulus associated with wandering spleen and intestinal non-rotation in a 15-year-old girl, focusing on MDCT findings.

  7. Multidetector computed tomography findings of mesenteroaxial gastric volvulus combined with torsion of wandering spleen: A case report and literature review

    Energy Technology Data Exchange (ETDEWEB)

    Youn, In Kyung; Ku, Young Mi; Lee, Su Lim [Dept. of Radiology, Uijeongbu St. Mary's Hospital, College of Medicine, The Catholic University of Korea, Uijeongbu (Korea, Republic of)]

    2016-05-15

    Gastric volvulus, defined as an abnormal rotation of the stomach, may be idiopathic or secondary to abnormal fixation of intraperitoneal visceral ligaments. Wandering spleen is a movable spleen resulting from absence or underdevelopment of the splenic supporting ligaments that suspend the spleen in its normal position in the left part of the supramesocolic compartment of the abdomen. Wandering spleen increases the risk of splenic torsion. Both gastric volvulus and splenic torsion are potentially life-threatening if not urgently managed with surgery. Prompt and accurate diagnosis based on multidetector computed tomography (MDCT) is crucial to prevent unforeseen complications. Gastric volvulus with coexistent torsion of a wandering spleen is a very rare condition. Herein, we describe a case of gastric volvulus associated with wandering spleen and intestinal non-rotation in a 15-year-old girl, focusing on MDCT findings.

  8. Metal-Free Catalytic Asymmetric Fluorination of Keto Esters Using a Combination of Hydrogen Fluoride (HF) and Oxidant: Experiment and Computation

    KAUST Repository

    Pluta, Roman

    2018-02-09

    A chiral iodoarene organocatalyst for the catalytic asymmetric fluorination has been developed. The catalyst was used in the asymmetric fluorination of carbonyl compounds, providing the products with a quaternary stereocenter with high enantioselectivities. Chiral hypervalent iodine difluoride intermediates were generated in situ by treatment of the catalyst with an oxidant and hydrogen fluoride as fluoride source. As such, the α-fluorination of a carbonyl compound was achieved with a nucleophilic fluorine source. A combined computational and experimental approach provided insight into the reaction mechanism and the origin of enantioselectivity.

  9. Metal-Free Catalytic Asymmetric Fluorination of Keto Esters Using a Combination of Hydrogen Fluoride (HF) and Oxidant: Experiment and Computation

    KAUST Repository

    Pluta, Roman; Krach, Patricia E.; Cavallo, Luigi; Falivene, Laura; Rueping, Magnus

    2018-01-01

    A chiral iodoarene organocatalyst for the catalytic asymmetric fluorination has been developed. The catalyst was used in the asymmetric fluorination of carbonyl compounds, providing the products with a quaternary stereocenter with high enantioselectivities. Chiral hypervalent iodine difluoride intermediates were generated in situ by treatment of the catalyst with an oxidant and hydrogen fluoride as fluoride source. As such, the α-fluorination of a carbonyl compound was achieved with a nucleophilic fluorine source. A combined computational and experimental approach provided insight into the reaction mechanism and the origin of enantioselectivity.

  10. Differential Control of Heme Reactivity in Alpha and Beta Subunits of Hemoglobin: A Combined Raman Spectroscopic and Computational Study

    Science.gov (United States)

    2015-01-01

    The use of hybrid hemoglobin (Hb), with mesoheme substituted for protoheme, allows separate monitoring of the α or β hemes along the allosteric pathway. Using resonance Raman (rR) spectroscopy in silica gel, which greatly slows protein motions, we have observed that the Fe–histidine stretching frequency, νFeHis, which is a monitor of heme reactivity, evolves between frequencies characteristic of the R and T states, for both α and β chains, prior to the quaternary R–T and T–R shifts. Computation of νFeHis, using QM/MM and the conformational search program PELE, produced remarkable agreement with experiment. Analysis of the PELE structures showed that the νFeHis shifts resulted from heme distortion and, in the α chain, Fe–His bond tilting. These results support the tertiary two-state model of ligand binding (Henry et al., Biophys. Chem. 2002, 98, 149). Experimentally, the νFeHis evolution is faster for β than for α chains, and pump–probe rR spectroscopy in solution reveals an inflection in the νFeHis time course at 3 μs for β but not for α hemes, an interval previously shown to be the first step in the R–T transition. In the α chain νFeHis dropped sharply at 20 μs, the final step in the R–T transition. The time courses are fully consistent with recent computational mapping of the R–T transition via conjugate peak refinement by Karplus and co-workers (Fischer et al., Proc. Natl. Acad. Sci. U. S. A. 2011, 108, 5608). The effector molecule IHP was found to lower νFeHis selectively for α chains within the R state, and a binding site in the α1α2 cleft is suggested. PMID:24991732

  11. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Kersaudy, Pierric, E-mail: pierric.kersaudy@orange.com [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Sudret, Bruno [ETH Zürich, Chair of Risk, Safety and Uncertainty Quantification, Stefano-Franscini-Platz 5, 8093 Zürich (Switzerland); Varsier, Nadège [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Picon, Odile [ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée (France); Wiart, Joe [Orange Labs, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France); Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux (France)

    2015-04-01

    In numerical dosimetry, the recent advances in high performance computing led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can request several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. The leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performances of the LARS-Kriging-PC are compared to an ordinary Kriging model and to a classical sparse polynomial chaos expansion. The LARS-Kriging-PC appears to have better performances than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos depending on the studied case. This approach seems to be an optimal solution between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.
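
    A rough two-stage approximation of the LARS-Kriging-PC idea can be sketched with scikit-learn (assumed available): least-angle regression selects a sparse set of polynomial regressors, and a Gaussian process then models the residual around that trend. True universal Kriging estimates trend and covariance jointly, so this is only an approximation, and the test function, degree, and sparsity level are placeholders.

```python
import numpy as np
from itertools import combinations_with_replacement
from sklearn.linear_model import Lars
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def poly_features(X, degree=3):
    # Monomial basis up to the given total degree (constant term omitted;
    # the intercept is handled by the regressors themselves).
    cols = []
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), d):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (60, 3))          # design of experiments
y = X[:, 0] ** 2 + 0.5 * X[:, 1] * X[:, 2] + 0.05 * rng.standard_normal(60)

Phi = poly_features(X)
lars = Lars(n_nonzero_coefs=6).fit(Phi, y)          # sparse PC-style trend
gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6)
gp.fit(X, y - lars.predict(Phi))                    # Kriging of the residual

X_new = rng.uniform(-1, 1, (5, 3))
print(lars.predict(poly_features(X_new)) + gp.predict(X_new))
```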

  12. Is the bipyridyl thorium metallocene a low-valent thorium complex? A combined experimental and computational study

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Wenshan; Lukens, Wayne W.; Zi, Guofu; Maron, Laurent; Walter, Marc D.

    2012-01-12

    Bipyridyl thorium metallocenes [η5-1,2,4-(Me3C)3C5H2]2Th(bipy) (1) and [η5-1,3-(Me3C)2C5H3]2Th(bipy) (2) have been investigated by magnetic susceptibility and computational studies. The magnetic susceptibility data reveal that 1 and 2 are not diamagnetic, but they behave as temperature-independent paramagnets (TIPs). To rationalize this observation, density functional theory (DFT) and complete active space SCF (CASSCF) calculations have been undertaken, which indicated that Cp2Th(bipy) has indeed a Th(IV)(bipy2-) ground state (f0d0π*2, S = 0), but the open-shell singlet (f0d1π*1, S = 0) (almost degenerate with its triplet congener) is lying only 9.2 kcal/mol higher in energy. Complexes 1 and 2 react cleanly with Ph2CS to give [η5-1,2,4-(Me3C)3C5H2]2Th[(bipy)(SCPh2)] (3) and [η5-1,3-(Me3C)2C5H3]2Th[(bipy)(SCPh2)] (4), respectively, in quantitative conversions. Since no intermediates were observed experimentally, this reaction was also studied computationally. Coordination of Ph2CS to 2 in its S = 0 ground state is not possible, but Ph2CS can coordinate to 2 in its triplet state (S = 1), upon which a single electron transfer (SET) from the (bipy2-) fragment to Ph2CS followed by C-C coupling takes place.

  13. Turning the Page on Pen-and-Paper Questionnaires: Combining Ecological Momentary Assessment and Computer Adaptive Testing to Transform Psychological Assessment in the 21st Century.

    Science.gov (United States)

    Gibbons, Chris J

    2016-01-01

    The current paper describes new opportunities for patient-centred assessment methods which have come about by the increased adoption of affordable smart technologies in biopsychosocial research and medical care. In this commentary, we review modern assessment methods including item response theory (IRT), computer adaptive testing (CAT), and ecological momentary assessment (EMA) and explain how these methods may be combined to improve psychological assessment. We demonstrate both how a 'naïve' selection of a small group of items in an EMA can lead to unacceptably unreliable assessments and how IRT can provide detailed information on the amount of information each item gives, thus allowing short-form assessments to be selected with acceptable reliability. The combination of CAT and IRT can ensure assessments are precise, efficient, and well targeted to the individual, allowing EMAs to be both brief and accurate.
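
    The CAT step can be sketched with a two-parameter logistic (2PL) IRT model: administer the item with maximum Fisher information at the current ability estimate, then re-estimate ability by grid maximum likelihood. The item bank and the simulated respondent below are synthetic placeholders.

```python
import numpy as np

# Computer adaptive testing under a 2PL item response model (toy item bank).

rng = np.random.default_rng(7)
a = rng.uniform(0.8, 2.0, 50)                  # item discriminations
b = rng.normal(0.0, 1.0, 50)                   # item difficulties
true_theta = 0.8                               # simulated respondent ability

def p_correct(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

grid = np.linspace(-4, 4, 161)
theta_hat, asked, responses = 0.0, [], []
for _ in range(10):
    p = p_correct(theta_hat, a, b)
    info = a ** 2 * p * (1 - p)                # 2PL Fisher information
    info[asked] = -np.inf                      # never repeat an item
    item = int(np.argmax(info))
    asked.append(item)
    responses.append(rng.random() < p_correct(true_theta, a[item], b[item]))
    loglik = np.zeros_like(grid)               # grid MLE of ability
    for it, resp in zip(asked, responses):
        pg = p_correct(grid, a[it], b[it])
        loglik += np.log(pg if resp else 1 - pg)
    theta_hat = float(grid[np.argmax(loglik)])

print(f"estimated ability {theta_hat:.2f} after {len(asked)} items")
```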

  14. Minimally invasive computer-navigated total hip arthroplasty, following the concept of femur first and combined anteversion: design of a blinded randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Woerner Michael

    2011-08-01

    Full Text Available Abstract Background Impingement can be a serious complication after total hip arthroplasty (THA), and is one of the major causes of postoperative pain, dislocation, aseptic loosening, and implant breakage. Minimally invasive THA and computer-navigated surgery were introduced several years ago. We have developed a novel, computer-assisted operation method for THA following the concept of "femur first"/"combined anteversion", which incorporates various aspects of performing a functional optimization of the cup position, and comprehensively addresses range of motion (ROM) as well as cup containment and alignment parameters. Hence, the purpose of this study is to assess whether the artificial joint's ROM can be improved by this computer-assisted operation method. Second, the clinical and radiological outcome will be evaluated. Methods/Design A registered patient- and observer-blinded randomized controlled trial will be conducted. Patients between the ages of 50 and 75 admitted for primary unilateral THA will be included. Patients will be randomly allocated to either receive minimally invasive computer-navigated "femur first" THA or the conventional minimally invasive THA procedure. Self-reported functional status and health-related quality of life (questionnaires) will be assessed both preoperatively and postoperatively. Perioperative complications will be registered. Radiographic evaluation will take place up to 6 weeks postoperatively with a computed tomography (CT) scan. Component position will be evaluated by an independent external institute on a 3D reconstruction of the femur/pelvis using image-processing software. Postoperative ROM will be calculated by an algorithm which automatically determines bony and prosthetic impingements. Discussion In the past, computer navigation has improved the accuracy of component positioning. So far, there are only a few objective data quantifying the risks and benefits of computer navigated THA. Therefore, this

  15. Evolutionary foundations for cancer biology.

    Science.gov (United States)

    Aktipis, C Athena; Nesse, Randolph M

    2013-01-01

    New applications of evolutionary biology are transforming our understanding of cancer. The articles in this special issue provide many specific examples, such as microorganisms inducing cancers, the significance of within-tumor heterogeneity, and the possibility that lower dose chemotherapy may sometimes promote longer survival. Underlying these specific advances is a large-scale transformation, as cancer research incorporates evolutionary methods into its toolkit, and asks new evolutionary questions about why we are vulnerable to cancer. Evolution explains why cancer exists at all, how neoplasms grow, why cancer is remarkably rare, and why it occurs despite powerful cancer suppression mechanisms. Cancer exists because of somatic selection; mutations in somatic cells result in some dividing faster than others, in some cases generating neoplasms. Neoplasms grow, or do not, in complex cellular ecosystems. Cancer is relatively rare because of natural selection; our genomes were derived disproportionally from individuals with effective mechanisms for suppressing cancer. Cancer occurs nonetheless for the same six evolutionary reasons that explain why we remain vulnerable to other diseases. These four principles (cancers evolve by somatic selection, neoplasms grow in complex ecosystems, natural selection has shaped powerful cancer defenses, and the limitations of those defenses have evolutionary explanations) provide a foundation for understanding, preventing, and treating cancer.

  16. Evolutionary and molecular foundations of multiple contemporary functions of the nitroreductase superfamily.

    Science.gov (United States)

    Akiva, Eyal; Copp, Janine N; Tokuriki, Nobuhiko; Babbitt, Patricia C

    2017-11-07

    Insight regarding how diverse enzymatic functions and reactions have evolved from ancestral scaffolds is fundamental to understanding chemical and evolutionary biology, and for the exploitation of enzymes for biotechnology. We undertook an extensive computational analysis using a unique and comprehensive combination of tools that include large-scale phylogenetic reconstruction to determine the sequence, structural, and functional relationships of the functionally diverse flavin mononucleotide-dependent nitroreductase (NTR) superfamily (>24,000 sequences from all domains of life, 54 structures, and >10 enzymatic functions). Our results suggest an evolutionary model in which contemporary subgroups of the superfamily have diverged in a radial manner from a minimal flavin-binding scaffold. We identified the structural design principle for this divergence: Insertions at key positions in the minimal scaffold that, combined with the fixation of key residues, have led to functional specialization. These results will aid future efforts to delineate the emergence of functional diversity in enzyme superfamilies, provide clues for functional inference for superfamily members of unknown function, and facilitate rational redesign of the NTR scaffold. Copyright © 2017 the Author(s). Published by PNAS.

  17. Evolutionary biology and life histories

    Directory of Open Access Journals (Sweden)

    Brown, C. R.

    2004-06-01

    Full Text Available The demographic processes that drive the spread of populations through environments and in turn determine the abundance of organisms are the same demographic processes that drive the spread of genes through populations and in turn determine gene frequencies and fitness. Conceptually, marked similarities exist in the dynamic processes underlying population ecology and those underlying evolutionary biology. Central to an understanding of both disciplines is life history and its component demographic rates, such as survival, fecundity, and age of first breeding, and biologists from both fields have a vested interest in good analytical machinery for the estimation and analysis of these demographic rates. In the EURING conferences, we have been striving since the mid-1980s to promote a quantitative understanding of demographic rates through interdisciplinary collaboration between ecologists and statisticians. From the ecological side, the principal impetus has come from population biology, and in particular from wildlife biology, but the importance of good quantitative insights into demographic processes has long been recognized by a number of evolutionary biologists (e.g., Nichols & Kendall, 1995; Clobert, 1995; Cooch et al., 2002). In organizing this session, we have aimed to create a forum for those committed to gaining the best possible understanding of evolutionary processes through the application of modern quantitative methods for the collection and interpretation of data on marked animal populations. Here we present a short overview of the material presented in the session on evolutionary biology and life histories. In a plenary talk, Brown & Brown (2004) explored how mark–recapture methods have allowed a better understanding of the evolution of group-living and alternative reproductive tactics in colonial cliff swallows (Petrochelidon pyrrhonota). By estimating the number of transient birds passing through colonies of different sizes, they

  18. Nash evolutionary algorithms : Testing problem size in reconstruction problems in frame structures

    OpenAIRE

    Greiner, D.; Periaux, Jacques; Emperador, J.M.; Galván, B.; Winter, G.

    2016-01-01

    The use of evolutionary algorithms for solving real engineering problems has grown in recent years, especially where intense computation is required, as when computational engineering simulations are involved (use of the finite element method, boundary element method, etc.). The coupling of game-theory concepts with evolutionary algorithms has been a recent line of research which could enhance the efficiency of the optimum design procedure and th...

  19. Radiation dose reduction through combining positron emission tomography/computed tomography (PET/CT) and diagnostic CT in children and young adults with lymphoma.

    Science.gov (United States)

    Qi, Zhihua; Gates, Erica L; O'Brien, Maureen M; Trout, Andrew T

    2018-02-01

    Both [F-18]2-fluoro-2-deoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) and diagnostic CT are at times required for lymphoma staging. This means some body segments are exposed twice to X-rays for generation of CT data (diagnostic CT + localization CT). To describe a combined PET/diagnostic CT approach that modulates CT tube current along the z-axis, providing diagnostic CT of some body segments and localization CT of the remaining body segments, thereby reducing patient radiation dose. We retrospectively compared total patient radiation dose between combined PET/diagnostic CT and separately acquired PET/CT and diagnostic CT exams. When available, we calculated effective doses for both approaches in the same patient; otherwise, we used data from patients of similar size. To confirm image quality, we compared image noise (Hounsfield unit [HU] standard deviation) as measured in the liver on both combined and separately acquired diagnostic CT images. We used t-tests for dose comparisons and two one-sided tests for image-quality equivalence testing. Mean total effective dose for the CT component of the combined and separately acquired diagnostic CT exams were 6.20±2.69 and 8.17±2.61 mSv, respectively (P<0.0001). Average dose savings with the combined approach was 24.8±17.8% (2.60±2.51 mSv [range: 0.32-4.72 mSv]) of total CT effective dose. Image noise was not statistically significantly different between approaches (12.2±1.8 HU vs. 11.7±1.5 HU for the combined and separately acquired diagnostic CT images, respectively). A combined PET/diagnostic CT approach as described offers dose savings at similar image quality for children and young adults with lymphoma who have indications for both PET and diagnostic CT examinations.

  20. Cloud Computing Security Model with Combination of Data Encryption Standard Algorithm (DES) and Least Significant Bit (LSB)

    Science.gov (United States)

    Basri, M.; Mawengkang, H.; Zamzami, E. M.

    2018-03-01

    Limited storage capacity is one reason to switch to cloud storage. Confidentiality and security of data stored in the cloud are very important. One way to maintain the confidentiality and security of such data is to use cryptographic techniques. Data Encryption Standard (DES) is one of the block cipher algorithms used as a standard symmetric encryption algorithm. DES produces 8 cipher blocks that are combined into one ciphertext, but the ciphertext is weak against brute-force attacks. Therefore, the 8 cipher blocks are instead embedded into 8 random images using the Least Significant Bit (LSB) algorithm, and the images are later combined to reconstruct the DES ciphertext.
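
    A sketch of the DES-plus-LSB combination, assuming the PyCryptodome package is available: encrypt with DES, then hide the ciphertext bits in the least significant bits of a cover image. For brevity the ciphertext goes into a single synthetic image rather than being split across eight as in the scheme described; the key, mode, and cover image are placeholders.

```python
import numpy as np
from Crypto.Cipher import DES                 # PyCryptodome, assumed installed
from Crypto.Util.Padding import pad, unpad

key = b"8bytekey"                             # DES uses an 8-byte key
ciphertext = DES.new(key, DES.MODE_ECB).encrypt(
    pad(b"cloud data to protect", DES.block_size))

# Embed ciphertext bits into the least significant bits of a cover image.
bits = np.unpackbits(np.frombuffer(ciphertext, dtype=np.uint8))
cover = np.random.default_rng(0).integers(0, 256, size=1024, dtype=np.uint8)
assert bits.size <= cover.size
stego = cover.copy()
stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits   # overwrite the LSBs

# Extraction: read the LSBs back, reassemble bytes, and decrypt.
recovered = np.packbits(stego[:bits.size] & 1).tobytes()
assert recovered == ciphertext
print(unpad(DES.new(key, DES.MODE_ECB).decrypt(recovered), DES.block_size))
```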

  1. Charge Transport in 4 nm Molecular Wires with Interrupted Conjugation: Combined Experimental and Computational Evidence for Thermally Assisted Polaron Tunneling.

    Science.gov (United States)

    Taherinia, Davood; Smith, Christopher E; Ghosh, Soumen; Odoh, Samuel O; Balhorn, Luke; Gagliardi, Laura; Cramer, Christopher J; Frisbie, C Daniel

    2016-04-26

    We report the synthesis, transport measurements, and electronic structure of conjugation-broken oligophenyleneimine (CB-OPI 6) molecular wires with lengths of ∼4 nm. The wires were grown from Au surfaces using stepwise aryl imine condensation reactions between 1,4-diaminobenzene and terephthalaldehyde (1,4-benzenedicarbaldehyde). Saturated spacers (conjugation breakers) were introduced into the molecular backbone by replacing the aromatic diamine with trans-1,4-diaminocyclohexane at specific steps during the growth processes. FT-IR and ellipsometry were used to follow the imination reactions on Au surfaces. Surface coverages (∼4 molecules/nm(2)) and electronic structures of the wires were determined by cyclic voltammetry and UV-vis spectroscopy, respectively. The current-voltage (I-V) characteristics of the wires were acquired using conducting probe atomic force microscopy (CP-AFM) in which an Au-coated AFM probe was brought into contact with the wires to form metal-molecule-metal junctions with contact areas of ∼50 nm(2). The low bias resistance increased with the number of saturated spacers, but was not sensitive to the position of the spacer within the wire. Temperature dependent measurements of resistance were consistent with a localized charge (polaron) hopping mechanism in all of the wires. Activation energies were in the range of 0.18-0.26 eV (4.2-6.0 kcal/mol) with the highest belonging to the fully conjugated OPI 6 wire and the lowest to the CB3,5-OPI 6 wire (the wire with two saturated spacers). For the two other wires with a single conjugation breaker, CB3-OPI 6 and CB5-OPI 6, activation energies of 0.20 eV (4.6 kcal/mol) and 0.21 eV (4.8 kcal/mol) were found, respectively. Computational studies using density functional theory confirmed the polaronic nature of charge carriers but predicted that the semiclassical activation energy of hopping should be higher for CB-OPI molecular wires than for the OPI 6 wire. To reconcile the experimental and

  2. Evolutionary engineering for industrial microbiology.

    Science.gov (United States)

    Vanee, Niti; Fisher, Adam B; Fong, Stephen S

    2012-01-01

    Superficially, evolutionary engineering is a paradoxical field that balances competing interests. In natural settings, evolution iteratively selects and enriches subpopulations that are best adapted to a particular ecological niche using random processes such as genetic mutation. In engineering, desired approaches utilize rational prospective design to address targeted problems. When considering the details of evolutionary and engineering processes, more commonality can be found. Engineering relies on detailed knowledge of the problem parameters and design properties in order to predict design outcomes that would constitute an optimized solution. When detailed knowledge of a system is lacking, engineers often employ algorithmic search strategies to identify empirical solutions. Evolution epitomizes this iterative optimization by continuously diversifying design options from a parental design, and then selecting the progeny designs that represent satisfactory solutions. In this chapter, the technique of applying the natural principles of evolution to engineer microbes for industrial applications is discussed to highlight the challenges and principles of evolutionary engineering.

  3. IDEA: Interactive Display for Evolutionary Analyses.

    Science.gov (United States)

    Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C

    2008-12-08

    The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  4. IDEA: Interactive Display for Evolutionary Analyses

    Directory of Open Access Journals (Sweden)

    Carlton Jane M

    2008-12-01

    Full Text Available Abstract Background The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. Results We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. Conclusion IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.

  5. Evolutionary Optimization of Centrifugal Nozzles for Organic Vapours

    Science.gov (United States)

    Persico, Giacomo

    2017-03-01

    This paper discusses the shape optimization of non-conventional centrifugal turbine nozzles for Organic Rankine Cycle applications. The optimal aerodynamic design is supported by the use of a non-intrusive, gradient-free technique specifically developed for shape optimization of turbomachinery profiles. The method is constructed as a combination of a geometrical parametrization technique based on B-splines, a high-fidelity and experimentally validated Computational Fluid Dynamic solver, and a surrogate-based evolutionary algorithm. The non-ideal gas behaviour characterizing the flow of organic fluids in the cascades of interest is introduced via a look-up-table approach, which is rigorously applied throughout the whole optimization process. Two transonic centrifugal nozzles are considered, featuring very different loading and radial extension. The application of a systematic and automatic design method to such a non-conventional configuration highlights the character of centrifugal cascades; the blades require a specific and non-trivial definition of the shape, especially in the rear part, to avoid the onset of shock waves. It is shown that the optimization acts in a similar way for the two cascades, identifying an optimal curvature of the blade that both provides a relevant increase of cascade performance and a reduction of downstream gradients.
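
    The B-spline parametrization at the heart of such shape optimizers can be sketched as follows, assuming SciPy is available. The design vector is the control-point ordinates of a cubic B-spline camber line; the CFD solver is replaced by a stand-in cost that penalizes curvature in the rear part of the profile, and a crude (1+1) evolution loop stands in for the surrogate-based evolutionary algorithm.

```python
import numpy as np
from scipy.interpolate import BSpline

k = 3                                          # cubic B-spline
n_ctrl = 7                                     # number of control points
knots = np.concatenate(([0.0] * k, np.linspace(0, 1, n_ctrl - k + 1),
                        [1.0] * k))            # clamped knot vector
s = np.linspace(0, 1, 200)                     # chordwise samples

def cost(y_ctrl):                              # stand-in for the CFD solver
    y = BSpline(knots, y_ctrl, k)(s)
    curvature = np.gradient(np.gradient(y, s), s)
    return float(np.abs(curvature[120:]).sum())   # penalize rear 40% of chord

rng = np.random.default_rng(3)
best = rng.normal(0.05, 0.02, n_ctrl)
for _ in range(500):                           # crude (1+1) evolution loop
    trial = best + 0.005 * rng.standard_normal(n_ctrl)
    if cost(trial) < cost(best):               # greedy acceptance
        best = trial
print(cost(best))
```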

  6. Evolutionary neural networks: a new alternative for neutron spectrometry

    International Nuclear Information System (INIS)

    Ortiz R, J. M.; Martinez B, M. R.; Vega C, H. R.; Galleo, E.

    2009-10-01

    One device used to perform neutron spectroscopy is the Bonner sphere spectrometer. This system has some disadvantages, one of which is the need for spectrum reconstruction using a code based on an iterative reconstruction algorithm, whose main inconvenience is the requirement of an initial spectrum as close as possible to the spectrum being measured. To avoid this inconvenience, several reconstruction procedures have been reported, combined with various types of experimental methods, based on artificial intelligence technologies such as genetic algorithms, artificial neural networks, and hybrid systems of artificial neural networks evolved using genetic algorithms. This paper analyzes the intersection of neural networks and evolutionary algorithms applied to neutron spectroscopy and dosimetry. Because this is an emerging technology, there are no tools for analyzing the obtained results; this paper therefore presents a computing tool to analyze the neutron spectra and the equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. The tool offers a graphical user environment that is friendly and easy to operate. (author)
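
    The hybrid idea mentioned above, a genetic algorithm evolving the weights of a small neural network, can be sketched in toy form. The Bonner-sphere unfolding task is replaced here by a synthetic mapping from seven "sphere counts" to a three-bin "spectrum"; the network size, rates, and data are placeholders.

```python
import numpy as np

# Genetic algorithm evolving the weights of a tiny feed-forward network.

rng = np.random.default_rng(5)
X = rng.random((100, 7))                       # simulated sphere count rates
Y = X @ rng.random((7, 3))                     # synthetic target spectra

def forward(w, X):
    W1, b1 = w[:56].reshape(7, 8), w[56:64]    # 7 inputs -> 8 hidden units
    W2, b2 = w[64:88].reshape(8, 3), w[88:91]  # 8 hidden -> 3 spectrum bins
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):
    return -float(np.mean((forward(w, X) - Y) ** 2))   # negative MSE

n_w = 91                                       # total weights and biases
pop = [rng.normal(0, 0.5, n_w) for _ in range(60)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    children = []
    for _ in range(50):
        p1, p2 = elite[rng.integers(10)], elite[rng.integers(10)]
        mask = rng.random(n_w) < 0.5                   # uniform crossover
        children.append(np.where(mask, p1, p2)
                        + 0.02 * rng.standard_normal(n_w))   # mutation
    pop = elite + children
print(-fitness(pop[0]))                        # final training MSE
```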

  7. Smoking-related interstitial fibrosis combined with pulmonary emphysema: computed tomography-pathologic correlative study using lobectomy specimens.

    Science.gov (United States)

    Otani, Hideji; Tanaka, Tomonori; Murata, Kiyoshi; Fukuoka, Junya; Nitta, Norihisa; Nagatani, Yukihiro; Sonoda, Akinaga; Takahashi, Masashi

    2016-01-01

    To evaluate the incidence and pathologic correlation of thin-section computed tomography (TSCT) findings in smoking-related interstitial fibrosis (SRIF) with pulmonary emphysema. Our study included 172 consecutive patients who underwent TSCT and subsequent lobectomy. TSCT findings including clustered cysts with visible walls (CCVW) and ground-glass attenuation with/without reticulation (GGAR) were evaluated and compared in nonsmokers and smokers and among lung locations. TSCT findings, especially CCVW, were also compared with histological findings using lobectomy specimens. The incidence of CCVW and GGAR was significantly higher in smokers than in nonsmokers (34.1% and 40.7%, respectively, vs 2.0% and 12.2%). CCVW and GGAR were frequently found in the lower and peripheral zones. Histologically, CCVW corresponded more often with SRIF with emphysema than usual interstitial pneumonia (UIP, 63.3% vs 30%). CCVW of irregular size and shape were seen in 19 of 20 SRIF with emphysema and in seven of nine UIP-manifested areas with similar round cysts. A less-involved subpleural parenchyma was observed more frequently in SRIF with emphysema. SRIF with emphysema is a more frequent pathological finding than UIP in patients with CCVW on TSCT. The irregular size and shape of CCVW and a less-involved subpleural parenchyma may be a clue suggesting the presence of SRIF with emphysema.

  8. Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach

    Energy Technology Data Exchange (ETDEWEB)

    Messer, Bronson [ORNL; Sewell, Christopher [Los Alamos National Laboratory (LANL); Heitmann, Katrin [ORNL; Finkel, Dr. Hal J [Argonne National Laboratory (ANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Zagaris, George [Lawrence Livermore National Laboratory (LLNL); Pope, Adrian [Los Alamos National Laboratory (LANL); Habib, Salman [ORNL; Parete-Koon, Suzanne T [ORNL

    2015-01-01

    Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.

  9. Development of the protocol for purification of artemisinin based on combination of commercial and computationally designed adsorbents.

    Science.gov (United States)

    Piletska, Elena V; Karim, Kal; Cutler, Malcolm; Piletsky, Sergey A

    2013-01-01

    A polymeric adsorbent for extraction of the antimalarial drug artemisinin from Artemisia annua L. was computationally designed. This polymer demonstrated a high capacity for artemisinin (120 mg g⁻¹) and quantitative recovery (87%), and was found to be an effective material for purification of artemisinin from a complex plant matrix. Artemisinin quantification was conducted using an optimised HPLC-MS protocol characterised by high precision and linearity in the concentration range between 0.05 and 2 μg mL⁻¹. Optimisation of the purification protocol also involved screening of commercial adsorbents for the removal of waxes and other interfering natural compounds, which inhibit the crystallisation of artemisinin. As a result of a two-step purification protocol, crystals of artemisinin were obtained, and their purity was evaluated as 75%. By performing the second stage of purification twice, the purity of artemisinin can be further improved to 99%. The developed protocol produced high-purity artemisinin using only a few purification steps, making it suitable for a large-scale industrial manufacturing process. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. The structural flexibility of the human copper chaperone Atox1: Insights from combined pulsed EPR studies and computations.

    Science.gov (United States)

    Levy, Ariel R; Turgeman, Meital; Gevorkyan-Aiapetov, Lada; Ruthstein, Sharon

    2017-08-01

    Metallochaperones are responsible for shuttling metal ions to target proteins. Thus, a metallochaperone's structure must be sufficiently flexible both to hold onto its ion while traversing the cytoplasm and to transfer the ion to or from a partner protein. Here, we sought to shed light on the structure of Atox1, a metallochaperone involved in the human copper regulation system. Atox1 shuttles copper ions from the main copper transporter, Ctr1, to the ATP7b transporter in the Golgi apparatus. Conventional biophysical tools such as X-ray or NMR cannot always target the various conformational states of metallochaperones, owing to a requirement for crystallography or low sensitivity and resolution. Electron paramagnetic resonance (EPR) spectroscopy has recently emerged as a powerful tool for resolving biological reactions and mechanisms in solution. When coupled with computational methods, EPR with site-directed spin labeling and nanoscale distance measurements can provide structural information on a protein or protein complex in solution. We use these methods to show that Atox1 can accommodate at least four different conformations in the apo state (unbound to copper), and two different conformations in the holo state (bound to copper). We also demonstrate that the structure of Atox1 in the holo form is more compact than in the apo form. Our data provide insight regarding the structural mechanisms through which Atox1 can fulfill its dual role of copper binding and transfer. © 2017 The Protein Society.

  11. Evolutionary Aesthetics and Print Advertising

    Directory of Open Access Journals (Sweden)

    Kamil Luczaj

    2015-06-01

    Full Text Available The article analyzes the extent to which predictions based on the theory of evolutionary aesthetics are utilized by the advertising industry. The purpose of a comprehensive content analysis of print advertising is to determine whether the items indicated by evolutionists, such as animals, flowers, certain types of landscapes, beautiful humans, and some colors, are part of real advertising strategies. The article shows that many evolutionary hypotheses (although not all of them) are supported by empirical data. Along with these hypotheses, some inferences from Bourdieu's cultural capital theory were tested. It turned out that advertising uses both biological schemata and cultural patterns to make an image more likable.

  12. The evolutionary psychology of hunger.

    Science.gov (United States)

    Al-Shawaf, Laith

    2016-10-01

    An evolutionary psychological perspective suggests that emotions can be understood as coordinating mechanisms whose job is to regulate various psychological and physiological programs in the service of solving an adaptive problem. This paper suggests that it may also be fruitful to approach hunger from this coordinating mechanism perspective. To this end, I put forward an evolutionary task analysis of hunger, generating novel a priori hypotheses about the coordinating effects of hunger on psychological processes such as perception, attention, categorization, and memory. This approach appears empirically fruitful in that it yields a bounty of testable new hypotheses. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Diversity-Guided Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Ursem, Rasmus Kjær

    2002-01-01

    Population diversity is undoubtedly a key issue in the performance of evolutionary algorithms. A common hypothesis is that high diversity is important to avoid premature convergence and to escape local optima. Various diversity measures have been used to analyze algorithms, but so far few algorithms have used a measure to guide the search. The diversity-guided evolutionary algorithm (DGEA) uses the well-known distance-to-average-point measure to alternate between phases of exploration (mutation) and phases of exploitation (recombination and selection). The DGEA showed remarkable results...
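
    The distance-to-average-point measure and the mode-switching idea are simple to state in code. The Python sketch below follows the description in the abstract; the thresholds, the directed "away from centroid" mutation step, and the test function are illustrative assumptions rather than the exact settings of the DGEA paper.

      import numpy as np

      def diversity(pop, lower, upper):
          # distance-to-average-point: mean distance to the population
          # centroid, normalised by the search-space diagonal
          diag = np.linalg.norm(upper - lower)
          return np.mean(np.linalg.norm(pop - pop.mean(axis=0), axis=1)) / diag

      rng = np.random.default_rng(2)
      lower, upper = np.full(10, -5.0), np.full(10, 5.0)
      pop = rng.uniform(lower, upper, (50, 10))
      f = lambda x: np.sum(x**2, axis=1)        # sphere test function
      d_low, d_high = 1e-4, 0.1                 # illustrative thresholds
      mode = "exploit"

      for gen in range(300):
          div = diversity(pop, lower, upper)
          if div < d_low:
              mode = "explore"                  # diversity too low
          elif div > d_high:
              mode = "exploit"                  # diversity restored
          if mode == "exploit":                 # recombination + selection
              parents = pop[np.argsort(f(pop))[:25]]
              pairs = parents[rng.integers(0, 25, (50, 2))]
              pop = pairs.mean(axis=1)          # arithmetic crossover
          else:                                 # mutate away from centroid
              away = pop - pop.mean(axis=0)
              pop = np.clip(pop + 0.5 * rng.random((50, 1)) * away,
                            lower, upper)

      print("best:", f(pop).min(), "diversity:", diversity(pop, lower, upper))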

  14. An object-oriented computational model for combined cycle cogeneration analysis; Um modelo computacional para analise de ciclos combinados para projetos de sistemas de cogeracao

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Alexandre M. da; Balestieri, Jose A.P.; Magalhaes Filho, Paulo [UNESP, Guaratingueta, SP (Brazil). Escola de Engenharia. Dept. de Energia]. E-mails: amarcial@uol.com.br; perella@feg.unesp.br; pfilho@feg.unesp.br

    2000-07-01

    This paper presents the use of computational resources in a simulation procedure to predict the performance of combined cycle cogeneration systems, with energetic analysis used in the modeling. The thermal demand of a consuming process is used as the main input and, associated with the performance characteristics of each component of the system, the influence of system parameters such as thermal efficiency and global efficiency is evaluated. The computational language is Visual Basic for Applications associated with a spreadsheet. Two combined cycle cogeneration schemes are pre-defined: one is composed of a gas turbine, a heat recovery steam generator, and a back-pressure steam turbine with one extraction, both connected to the process plant at different pressure levels; the other scheme differs in having a two-extraction condensing steam turbine instead of the back-pressure one. Illustrative graphics are generated to allow comparison of the appraised systems. The simulation strategy carefully links the information of the various components according to the flow diagrams. (author)
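
    The energetic bookkeeping behind such a simulation reduces to first-law balances over the components. A toy Python sketch under assumed operating figures (not the paper's data) shows how thermal and global efficiencies are obtained:

      # toy energy balance for a gas turbine + HRSG + back-pressure steam
      # turbine cogeneration scheme; all numbers are illustrative
      fuel_input = 100.0        # MW, fuel energy input (LHV basis)
      gt_power = 33.0           # MW, gas-turbine electric output
      st_power = 12.0           # MW, steam-turbine electric output
      process_heat = 38.0       # MW, heat delivered to the process plant

      thermal_eff = (gt_power + st_power) / fuel_input
      global_eff = (gt_power + st_power + process_heat) / fuel_input
      print(f"thermal: {thermal_eff:.1%}, global: {global_eff:.1%}")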

  15. libNeuroML and PyLEMS: using Python to combine procedural and declarative modeling approaches in computational neuroscience.

    Science.gov (United States)

    Vella, Michael; Cannon, Robert C; Crook, Sharon; Davison, Andrew P; Ganapathy, Gautham; Robinson, Hugh P C; Silver, R Angus; Gleeson, Padraig

    2014-01-01

    NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
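
    A minimal sketch of this workflow, based on the published libNeuroML examples, is shown below: build a document, add a cell model and a small network, and serialise it to NeuroML XML. Exact class and attribute names may differ between libNeuroML versions, so treat this as an assumption-laden illustration rather than canonical usage.

      from neuroml import NeuroMLDocument, IzhikevichCell, Network, Population
      from neuroml.writers import NeuroMLWriter

      # build a document with one cell model and a small population
      doc = NeuroMLDocument(id="demo_doc")
      izh = IzhikevichCell(id="izh0", v0="-70mV", thresh="30mV",
                           a="0.02", b="0.2", c="-65.0", d="6")
      doc.izhikevich_cells.append(izh)

      net = Network(id="demo_net")
      net.populations.append(Population(id="pop0", component="izh0", size=5))
      doc.networks.append(net)

      # serialise the Python object model to NeuroML XML
      NeuroMLWriter.write(doc, "demo.nml")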

  16. libNeuroML and PyLEMS: using Python to combine imperative and declarative modelling approaches in computational neuroscience

    Directory of Open Access Journals (Sweden)

    Michael eVella

    2014-04-01

    Full Text Available NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two APIs (Application Programming Interfaces) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.

  17. Fast method to compute scattering by a buried object under a randomly rough surface: PILE combined with FB-SA.

    Science.gov (United States)

    Bourlier, Christophe; Kubické, Gildas; Déchamps, Nicolas

    2008-04-01

    A fast, exact numerical method based on the method of moments (MM) is developed to calculate the scattering from an object below a randomly rough surface. Déchamps et al. [J. Opt. Soc. Am. A 23, 359 (2006)] have recently developed the PILE (propagation-inside-layer expansion) method for a stack of two one-dimensional rough interfaces separating homogeneous media. From the inversion of the impedance matrix by blocks (in which two impedance matrices of the interfaces and two coupling matrices are involved), this method allows one to calculate separately and exactly the multiple-scattering contributions inside the layer, in which the inverses of the impedance matrices of each interface are involved. Our purpose here is to apply this method to an object below a rough surface. In addition, to invert a matrix of large size, the forward-backward spectral acceleration (FB-SA) approach of complexity O(N) (N is the number of unknowns on the interface) proposed by Chou and Johnson [Radio Sci. 33, 1277 (1998)] is applied. The new method, PILE combined with FB-SA, is tested on perfectly conducting circular and elliptic cylinders located below a dielectric rough interface obeying a Gaussian process with Gaussian and exponential height autocorrelation functions.
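
    The algebraic core of PILE is a Neumann-series expansion of a block-matrix inverse: each power of a characteristic matrix adds one more multiple-scattering round trip between the two scatterers. The Python toy below illustrates that structure on random, well-conditioned blocks; it is a sketch of the idea only, not the method's actual impedance matrices or the FB-SA acceleration.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 40
      # toy block system [[A, B], [C, D]] [x1, x2]^T = [b1, 0]^T:
      # A, D stand in for the self-impedance matrices of the rough surface
      # and the buried object; B, C are the coupling blocks
      A = np.eye(n) + 0.1 * rng.normal(size=(n, n))
      D = np.eye(n) + 0.1 * rng.normal(size=(n, n))
      B = 0.05 * rng.normal(size=(n, n))
      C = 0.05 * rng.normal(size=(n, n))
      b1 = rng.normal(size=n)

      # characteristic matrix: one "round trip" of the field
      # (surface -> object -> surface)
      Mc = np.linalg.solve(A, B @ np.linalg.solve(D, C))

      # PILE-style expansion: x1 = sum_p Mc^p A^{-1} b1, each order p
      # adding one more multiple-scattering interaction inside the layer
      x1 = np.zeros(n)
      term = np.linalg.solve(A, b1)
      for p in range(20):
          x1 += term
          term = Mc @ term

      # reference: direct solve of the full coupled system
      full = np.block([[A, B], [C, D]])
      x_ref = np.linalg.solve(full, np.concatenate([b1, np.zeros(n)]))[:n]
      print("max error vs direct solve:", np.abs(x1 - x_ref).max())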

  18. Combined positron emission tomography/computed tomography (PET/CT) for clinical oncology: technical aspects and acquisition protocols

    International Nuclear Information System (INIS)

    Beyer, T.

    2004-01-01

    Combined PET/CT imaging is a non-invasive means of reviewing both the anatomy and the molecular pathways of a patient during a quasi-simultaneous examination. Since the introduction of the prototype PET/CT in 1998, a rapid development of this imaging technology has been witnessed. The incorporation of fast PET detector technology into PET/CT designs and the routine use of the CT transmission images for attenuation correction of the PET allow anato-metabolic whole-body examinations to be completed in less than 30 min. Thus, PET/CT imaging offers a logistical advantage to both the patient and the clinicians, since the two complementary exams - whenever clinically indicated - can be performed almost at the same time and a single integrated report can be created. Nevertheless, a number of pitfalls, primarily from the use of CT-based attenuation correction, have been identified and are being addressed through optimized acquisition protocols. It is fair to say that PET/CT has been integrated into the diagnostic imaging arena, and in many cases has led to a close collaboration between different, yet complementary, diagnostic and therapeutic medical disciplines. (orig.)

  19. Preoperative differentiation between T1a and ≥T1b gallbladder cancer: combined interpretation of high-resolution ultrasound and multidetector-row computed tomography

    International Nuclear Information System (INIS)

    Joo, Ijin; Baek, Jee Hyun; Kim, Jung Hoon; Han, Joon Koo; Choi, Byung Ihn; Lee, Jae Young; Park, Hee Sun

    2014-01-01

    To determine the diagnostic value of combined interpretation of high-resolution ultrasound (HRUS) and multidetector-row computed tomography (MDCT) for preoperative differentiation between T1a and ≥T1b gallbladder (GB) cancer. Eighty-seven patients with pathologically confirmed GB cancers (T1a, n = 15; ≥T1b, n = 72), who preoperatively underwent both HRUS and MDCT, were included in this retrospective study. Two reviewers independently determined the T-stages of the GB cancers on HRUS and MDCT using a five-point confidence scale (5, definitely T1a; 1, definitely ≥T1b). For individual modality interpretation, the lesions with scores ≥4 were classified as T1a, and, for combined modality interpretation, the lesions with all scores ≥4 in both modalities were classified as T1a. The McNemar test was used to compare diagnostic performance. The diagnostic accuracy of differentiation between T1a and ≥T1b GB cancer was higher using combined interpretation (90.8 % and 88.5 % for reviewers 1 and 2, respectively) than individual interpretation of HRUS (82.8 % and 83.9 %) or MDCT (74.7 % and 82.8 %) (P < 0.05, reviewer 1). Combined interpretations demonstrated 100 % specificity for both reviewers, which was significantly higher than individual interpretations (P < 0.05, both reviewers). Combined HRUS and MDCT interpretation may improve the diagnostic accuracy and specificity for differentiating between T1a and ≥T1b GB cancers. • Differentiating between T1a and ≥T1b gallbladder cancer can help surgical planning. (orig.)
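
    The conjunctive reading rule used for the combined interpretation is easy to make explicit; the short Python sketch below restates it (the function name and layout are ours, but the rule itself is taken from the study):

      def combined_t_stage(hrus_score, mdct_score, threshold=4):
          # combined HRUS + MDCT rule: call a lesion T1a only when BOTH
          # modalities give a confidence score >= 4 on the five-point
          # scale; otherwise call it >= T1b
          if hrus_score >= threshold and mdct_score >= threshold:
              return "T1a"
          return ">=T1b"

      # the conjunctive rule trades sensitivity for specificity: a single
      # low score in either modality pushes the call to >=T1b
      print(combined_t_stage(5, 4))   # -> T1a
      print(combined_t_stage(5, 3))   # -> >=T1b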

  20. Using Combined X-ray Computed Tomography and Acoustic Resonance to Understand Supercritical CO2 Behavior in Fractured Sandstone

    Science.gov (United States)

    Kneafsey, T. J.; Nakagawa, S.

    2015-12-01

    Distribution of supercritical (sc) CO2 has a large impact on its flow behavior as well as on the properties of seismic waves used for monitoring. Simultaneous imaging of scCO2 distribution in a rock core using X-ray computed tomography (CT) and measurement of seismic waves in the laboratory can help us understand how the distribution evolves as scCO2 invades the rock, and the resulting seismic signatures. To this end, we performed a series of laboratory scCO2 core-flood experiments on intact and fractured anisotropic Carbon Tan sandstone samples. In these experiments, we monitored changes in the CO2 saturation distribution and sonic-frequency acoustic resonances (yielding both seismic velocity and attenuation) over the course of the floods. A short-core resonant bar test system (Split-Hopkinson Resonant Bar Apparatus) custom fit into a long X-ray-transparent pressure vessel was used for the seismic measurements, and a modified General Electric medical CT scanner was used to acquire X-ray CT data from which scCO2 saturation distributions were determined. The experiments focused on the impact of single fractures on the scCO2 distribution and the seismic properties. For this reason, we examined several cases including (1) an intact core, (2) a closely mated fracture along the core axis, (3) a sheared fracture along the core axis (both vertical and horizontal, to examine the buoyancy effect), and (4) a sheared fracture perpendicular to the core axis. For the intact and closely mated fractured cores, Young's modulus declined with increasing CO2 saturation, and attenuation increased up to about 15% CO2 saturation, after which attenuation declined. For cores having wide axial fractures, the Young's modulus was lower than for the intact and closely mated cases but did not change much with CO2 pore saturation. Much lower CO2 pore saturations were achieved in these cases; attenuation, however, increased more rapidly than for the intact sample. For the core ...

  1. Haemodynamic imaging of thoracic stent-grafts by computational fluid dynamics (CFD): presentation of a patient-specific method combining magnetic resonance imaging and numerical simulations.

    Science.gov (United States)

    Midulla, Marco; Moreno, Ramiro; Baali, Adil; Chau, Ming; Negre-Salvayre, Anne; Nicoud, Franck; Pruvo, Jean-Pierre; Haulon, Stephan; Rousseau, Hervé

    2012-10-01

    In the last decade, there has been increasing interest in finding imaging techniques able to provide functional vascular imaging of the thoracic aorta. The purpose of this paper is to present an imaging method combining magnetic resonance imaging (MRI) and computational fluid dynamics (CFD) to obtain a patient-specific haemodynamic analysis of patients treated by thoracic endovascular aortic repair (TEVAR). MRI was used to obtain boundary conditions. MR angiography (MRA) was followed by cardiac-gated cine sequences which covered the whole thoracic aorta. Phase contrast imaging provided the inlet and outlet flow profiles. A CFD mesh generator was used to model the arterial morphology, and wall movements were imposed according to the cine imaging. CFD runs were processed using the finite volume (FV) method, assuming blood to be a homogeneous Newtonian fluid. Twenty patients (14 men; mean age 62.2 years) with different aortic lesions were evaluated. Four-dimensional maps of velocity and wall shear stress were obtained, depicting different patterns of flow (laminar, turbulent, stenosis-like) and local alterations of parietal stress in-stent and along the native aorta. A computational method using a combined approach with MRI appears feasible and seems promising for providing detailed functional analysis of the thoracic aorta after stent-graft implantation. • Functional vascular imaging of the thoracic aorta offers new diagnostic opportunities • CFD can model vascular haemodynamics for clinical aortic problems • Combining CFD with MRI offers a patient-specific method of aortic analysis • Haemodynamic analysis of stent-grafts could improve clinical management and follow-up.

  2. Simulation study on the operating characteristics of the heat pipe for combined evaporative cooling of computer room air-conditioning system

    International Nuclear Information System (INIS)

    Han, Zongwei; Zhang, Yanqing; Meng, Xin; Liu, Qiankun; Li, Weiliang; Han, Yu; Zhang, Yanhong

    2016-01-01

    In order to improve the energy efficiency of air conditioning systems in computer rooms, this paper proposes a new concept integrating an evaporative cooling air-conditioning system with heat pipes. Based on a computer room in Shenyang, China, a mathematical model was built to perform transient simulations of the new system. The annual dynamic performance of the new system was then compared with a typical conventional computer room air-conditioning system. The results showed that the new integrated air-conditioning system had better energy efficiency, i.e. a 31.31% reduction in energy consumption and a 29.49% increase in COP (coefficient of performance), due to the adoption of the evaporative condenser and the separate-type heat pipe technology. Further study also revealed that the incorporated heat pipes enabled a 36.88% decrease in the operating duration of the vapor compressor and a 53.86% reduction in the number of compressor activations, which could lead to a longer compressor lifespan. The new integrated evaporative cooling air-conditioning system was also tested in different climate regions. The energy saving of the new system was greatly affected by climate: it had the best effect in cold and dry regions like Shenyang, with up to 31.31% energy saving, while in warm and humid climate regions like Guangzhou an energy saving of up to 13.66% could still be achieved. - Highlights: • A novel combined air-conditioning system for computer rooms is constructed. • The performance of the new system and a conventional system is simulated and compared. • The applicability of the system in different climate regions is investigated.

  3. Evolutionary Psychology and Intelligence Research

    Science.gov (United States)

    Kanazawa, Satoshi

    2010-01-01

    This article seeks to unify two subfields of psychology that have hitherto stood separately: evolutionary psychology and intelligence research/differential psychology. I suggest that general intelligence may simultaneously be an evolved adaptation and an individual-difference variable. Tooby and Cosmides's (1990a) notion of random quantitative…

  4. Darwinian foundations for evolutionary economics

    NARCIS (Netherlands)

    Stoelhorst, J.W.

    2008-01-01

    This paper engages with the methodological debate on the contribution of Darwinism to Veblen's (1898) evolutionary research program for economics. I argue that ontological continuity, generalized Darwinism, and multi-level selection are necessary building blocks for an explanatory framework that can

  5. Ernst Mayr and Evolutionary Biology

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 10; Issue 7. Polemics and Synthesis: Ernst Mayr and Evolutionary Biology. Renee M Borges. General Article Volume 10 Issue 7 July 2005 pp 21-33. Fulltext. Click here to view fulltext PDF. Permanent link:

  6. Evolutionary Biology Research in India

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 5; Issue 10. Evolutionary Biology Research in India. Information and Announcements Volume 5 Issue 10 October 2000 pp 102-104. Fulltext. Click here to view fulltext PDF. Permanent link: https://www.ias.ac.in/article/fulltext/reso/005/10/0102-0104 ...

  7. Realism, Relativism, and Evolutionary Psychology

    NARCIS (Netherlands)

    Derksen, M.

    Against recent attempts to forge a reconciliation between constructionism and realism, I contend that, in psychology at least, stirring up conflict is a more fruitful strategy. To illustrate this thesis, I confront a school of psychology with strong realist leanings, evolutionary psychology, with

  8. Ancient Biomolecules and Evolutionary Inference

    DEFF Research Database (Denmark)

    Cappellini, Enrico; Prohaska, Ana; Racimo, Fernando

    2018-01-01

    Over the last decade, studies of ancient biomolecules-particularly ancient DNA, proteins, and lipids-have revolutionized our understanding of evolutionary history. Though initially fraught with many challenges, the field now stands on firm foundations. Researchers now successfully retrieve nucleo...

  9. Evolutionary trends in directional hearing

    DEFF Research Database (Denmark)

    Carr, Catherine E; Christensen-Dalsgaard, Jakob

    2016-01-01

    Tympanic hearing is a true evolutionary novelty that arose in parallel within early tetrapods. We propose that in these tetrapods, selection for sound localization in air acted upon pre-existing directionally sensitive brainstem circuits, similar to those in fishes. Auditory circuits in birds...

  10. Evolutionary dynamics of mammalian karyotypes

    Directory of Open Access Journals (Sweden)

    Carlo Alberto Redi

    2012-12-01

    Full Text Available This special volume of Cytogenetic and Genome Research (edited by Roscoe Stanyon, University of Florence, and Alexander Graphodatsky, Siberian Division of the Russian Academy of Sciences) is dedicated to the fascinating, long search for the forces behind the evolutionary dynamics of mammalian karyotypes, revealed after the hypotonic miracle of the 1950s...

  11. Haldane and modern evolutionary genetics

    Indian Academy of Sciences (India)

    Brian Charlesworth

    2017-11-24

    Nov 24, 2017 ... q(t) of an allele at a locus among the gametes produced at time t, to its ... the importance of disease as an evolutionary factor, which is now a ... VII. Selection intensity as a function of mortality rate. Proc. Camb. Philos. Soc.

  12. phyloXML: XML for evolutionary biology and comparative genomics.

    Science.gov (United States)

    Han, Mira V; Zmasek, Christian M

    2009-10-27

    Evolutionary trees are central to a wide range of biological studies. In many of these studies, tree nodes and branches need to be associated (or annotated) with various attributes. For example, in studies concerned with organismal relationships, tree nodes are associated with taxonomic names, whereas tree branches have lengths and oftentimes support values. Gene trees used in comparative genomics or phylogenomics are usually annotated with taxonomic information, genome-related data, such as gene names and functional annotations, as well as events such as gene duplications, speciations, or exon shufflings, combined with information related to the evolutionary tree itself. The data standards currently used for evolutionary trees have limited capacities to incorporate such annotations of different data types. We developed an XML language, named phyloXML, for describing evolutionary trees, as well as various associated data items. PhyloXML provides elements for commonly used items, such as branch lengths, support values, taxonomic names, and gene names and identifiers. By using "property" elements, phyloXML can be adapted to novel and unforeseen use cases. We also developed various software tools for reading, writing, conversion, and visualization of phyloXML formatted data. PhyloXML is an XML language defined by a complete schema in XSD that allows storing and exchanging the structures of evolutionary trees as well as associated data. More information about phyloXML itself, the XSD schema, as well as tools implementing and supporting phyloXML, is available at http://www.phyloxml.org.
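
    To make the format concrete, here is a minimal phyloXML document with two named leaves, branch lengths, and a bootstrap support value, parsed with Biopython's Bio.Phylo. Using Biopython here is our assumption (it is one of several third-party libraries with phyloXML support), not a dependency of the phyloXML project itself.

      from io import StringIO
      from Bio import Phylo   # Bio.Phylo reads and writes phyloXML

      # a minimal phyloXML document: one rooted tree, two named leaves
      # with branch lengths, and a support value on the internal clade
      xml = """<phyloxml xmlns="http://www.phyloxml.org">
       <phylogeny rooted="true">
        <clade>
         <confidence type="bootstrap">95</confidence>
         <clade><name>taxon_A</name><branch_length>0.12</branch_length></clade>
         <clade><name>taxon_B</name><branch_length>0.34</branch_length></clade>
        </clade>
       </phylogeny>
      </phyloxml>"""

      tree = Phylo.read(StringIO(xml), "phyloxml")
      Phylo.draw_ascii(tree)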

  13. Atomic-Scale Design of Iron Fischer-Tropsch Catalysts; A Combined Computational Chemistry, Experimental, and Microkinetic Modeling Approach

    Energy Technology Data Exchange (ETDEWEB)

    Manos Mavrikakis; James Dumesic; Rahul Nabar; Calvin Bartholonew; Hu Zou; Uchenna Paul

    2008-09-29

    This work focuses on (1) searching/summarizing published Fischer-Tropsch synthesis (FTS) mechanistic and kinetic studies of FTS reactions on iron catalysts; (2) preparation and characterization of unsupported iron catalysts with/without potassium/platinum promoters; (3) measurement of H₂ and CO adsorption/dissociation kinetics on iron catalysts using transient methods; (4) analysis of the transient rate data to calculate kinetic parameters of early elementary steps in FTS; (5) construction of a microkinetic model of FTS on iron; and (6) validation of the model from collection of steady-state rate data for FTS on iron catalysts. Three unsupported iron catalysts and three alumina-supported iron catalysts were prepared by non-aqueous-evaporative deposition (NED) or aqueous impregnation (AI) and characterized by chemisorption, BET, temperature-programmed reduction (TPR), extent-of-reduction, XRD, and TEM methods. These catalysts, covering a wide range of dispersions and metal loadings, are well-reduced and relatively thermally stable up to 500-600 °C in H₂ and thus ideal for kinetic and mechanistic studies. Kinetic parameters for CO adsorption, CO dissociation, and surface carbon hydrogenation on these catalysts were determined from temperature-programmed desorption (TPD) of CO, temperature-programmed surface hydrogenation (TPSR), temperature-programmed hydrogenation (TPH), and isothermal transient hydrogenation (ITH). A microkinetic model was constructed for the early steps in FTS on polycrystalline iron from the kinetic parameters of elementary steps determined experimentally in this work and from literature values. Steady-state rate data were collected in a Berty reactor and used for validation of the microkinetic model. These rate data were fitted to 'smart' Langmuir-Hinshelwood rate expressions derived from a sequence of elementary steps, using a combination of fitted steady-state parameters and parameters specified from the transient ...
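
    As a hedged illustration of the kind of Langmuir-Hinshelwood rate expression such a study fits, the Python snippet below evaluates a generic LH form with molecular CO adsorption and dissociative H₂ adsorption. The functional form and all parameter values are illustrative assumptions, not the paper's fitted expression.

      import numpy as np

      def lh_rate(p_co, p_h2, k, K_co, K_h2):
          # generic Langmuir-Hinshelwood rate: surface reaction between
          # adsorbed CO and adsorbed H (dissociative H2 adsorption),
          # normalised by the squared site balance
          theta = 1.0 + K_co * p_co + np.sqrt(K_h2 * p_h2)
          return k * K_co * p_co * np.sqrt(K_h2 * p_h2) / theta**2

      # illustrative parameters and conditions (bar, arbitrary rate units)
      print(lh_rate(p_co=5.0, p_h2=10.0, k=1.0, K_co=0.4, K_h2=0.1))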

  14. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  15. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
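
    One of the key combinations such reviews discuss, evolutionary computing wrapped around a support vector machine, can be sketched compactly as GA-based feature selection. The Python example below uses synthetic stand-in features and scikit-learn; all parameters and the fitness definition are illustrative assumptions.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      # synthetic stand-in for windowed EMG features (RMS, zero crossings, ...)
      X, y = make_classification(n_samples=200, n_features=20,
                                 n_informative=6, random_state=0)

      def fitness(mask):
          # cross-validated SVM accuracy on the selected feature subset
          if not mask.any():
              return 0.0
          return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

      rng = np.random.default_rng(0)
      pop = rng.random((20, X.shape[1])) < 0.5           # binary genomes
      for gen in range(15):
          scores = np.array([fitness(m) for m in pop])
          parents = pop[np.argsort(scores)[-10:]]         # keep best half
          pa, pb = [parents[rng.integers(0, 10, 20)] for _ in range(2)]
          cross = rng.random(pa.shape) < 0.5              # uniform crossover
          child = np.where(cross, pa, pb)
          pop = child ^ (rng.random(child.shape) < 0.02)  # bit-flip mutation

      best = pop[np.argmax([fitness(m) for m in pop])]
      print("selected features:", np.flatnonzero(best))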

  16. Accurate calculation of mutational effects on the thermodynamics of inhibitor binding to p38α MAP kinase: a combined computational and experimental study.

    Science.gov (United States)

    Zhu, Shun; Travis, Sue M; Elcock, Adrian H

    2013-07-09

    A major current challenge for drug design efforts focused on protein kinases is the development of drug resistance caused by spontaneous mutations in the kinase catalytic domain. The ubiquity of this problem means that it would be advantageous to develop fast, effective computational methods that could be used to determine the effects of potential resistance-causing mutations before they arise in a clinical setting. With this long-term goal in mind, we have conducted a combined experimental and computational study of the thermodynamic effects of active-site mutations on a well-characterized and high-affinity interaction between a protein kinase and a small-molecule inhibitor. Specifically, we developed a fluorescence-based assay to measure the binding free energy of the small-molecule inhibitor, SB203580, to the p38α MAP kinase and used it to measure the inhibitor's affinity for five different kinase mutants involving two residues (Val38 and Ala51) that contact the inhibitor in the crystal structure of the inhibitor-kinase complex. We then conducted long, explicit-solvent thermodynamic integration (TI) simulations in an attempt to reproduce the experimental relative binding affinities of the inhibitor for the five mutants; in total, a combined simulation time of 18.5 μs was obtained. Two widely used force fields - OPLS-AA/L and Amber ff99SB-ILDN - were tested in the TI simulations. Both force fields produced excellent agreement with experiment for three of the five mutants; simulations performed with the OPLS-AA/L force field, however, produced qualitatively incorrect results for the constructs that contained an A51V mutation. Interestingly, the discrepancies with the OPLS-AA/L force field could be rectified by the imposition of position restraints on the atoms of the protein backbone and the inhibitor without destroying the agreement for other mutations; the ability to reproduce experiment depended, however, upon the strength of the restraints' force constant.
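
    The quantity estimated by thermodynamic integration is the integral of the ensemble-averaged dU/dλ over the coupling parameter λ. A minimal Python sketch with made-up window data (not the study's results) shows the quadrature step:

      import numpy as np

      # thermodynamic integration: ΔG = ∫₀¹ <∂U/∂λ>_λ dλ, estimated from
      # the ensemble-averaged dU/dλ at a set of discrete λ windows
      # (values below are invented for illustration, in kcal/mol)
      lam = np.array([0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0])
      dudl = np.array([12.3, 9.8, 6.1, 1.4, -3.9, -7.2, -9.5])

      dG = np.trapz(dudl, lam)       # trapezoidal quadrature over λ
      print(f"dG = {dG:.2f} kcal/mol")

      # the relative binding free energy of a mutant vs wild type then
      # follows from a thermodynamic cycle: ddG = dG_complex - dG_apo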

  17. The State of Software for Evolutionary Biology.

    Science.gov (United States)

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  18. Towards a mechanistic foundation of evolutionary theory.

    Science.gov (United States)

    Doebeli, Michael; Ispolatov, Yaroslav; Simon, Burt

    2017-02-15

    Most evolutionary thinking is based on the notion of fitness and related ideas such as fitness landscapes and evolutionary optima. Nevertheless, it is often unclear what fitness actually is, and its meaning often depends on the context. Here we argue that fitness should not be a basal ingredient in verbal or mathematical descriptions of evolution. Instead, we propose that evolutionary birth-death processes, in which individuals give birth and die at ever-changing rates, should be the basis of evolutionary theory, because such processes capture the fundamental events that generate evolutionary dynamics. In evolutionary birth-death processes, fitness is at best a derived quantity, and owing to the potential complexity of such processes, there is no guarantee that there is a simple scalar, such as fitness, that would describe long-term evolutionary outcomes. We discuss how evolutionary birth-death processes can provide useful perspectives on a number of central issues in evolution.

  19. Applied evolutionary economics and economic geography

    NARCIS (Netherlands)

    Frenken, K.

    2007-01-01

    "Applied Evolutionary Economics and Economic Geography" aims to further advance empirical methodologies in evolutionary economics, with a special emphasis on geography and firm location. It does so by bringing together a select group of leading scholars including economists, geographers and

  20. Evolutionary biology of bacterial and fungal pathogens

    National Research Council Canada - National Science Library

    Baquero, F

    2008-01-01

    ... and Evolutionary Dynamics of Pathogens * 21 Keith A. Crandall and Marcos Pérez-Losada II. Evolutionary Genetics of Microbial Pathogens 4. Environmental and Social Influences on Infectious Disea...